WorldWideScience

Sample records for isotopic analysis software

  1. Spectral analysis software improves confidence in plant and soil water stable isotope analyses performed by isotope ratio infrared spectroscopy (IRIS).

    Science.gov (United States)

    West, A G; Goldsmith, G R; Matimati, I; Dawson, T E

    2011-08-30

    Previous studies have demonstrated the potential for large errors to occur when analyzing waters containing organic contaminants using isotope ratio infrared spectroscopy (IRIS). In an attempt to address this problem, IRIS manufacturers now provide post-processing spectral analysis software capable of identifying samples with the types of spectral interference that compromise their stable isotope analysis. Here we report two independent tests of this post-processing spectral analysis software on two IRIS systems, OA-ICOS (Los Gatos Research Inc.) and WS-CRDS (Picarro Inc.). Following a similar methodology to a previous study, we cryogenically extracted plant leaf water and soil water and measured the δ(2)H and δ(18)O values of identical samples by isotope ratio mass spectrometry (IRMS) and IRIS. As an additional test, we analyzed plant stem waters and tap waters by IRMS and IRIS in an independent laboratory. For all tests we assumed that the IRMS value represented the "true" value against which we could compare the stable isotope results from the IRIS methods. Samples showing significant deviations from the IRMS value (>2σ) were considered to be contaminated and representative of spectral interference in the IRIS measurement. Over the two studies, 83% of plant species were considered contaminated on OA-ICOS and 58% on WS-CRDS. Post-analysis, spectra were analyzed using the manufacturer's spectral analysis software in order to see if the software correctly identified contaminated samples. In our tests the software performed well, identifying all the samples with major errors. However, some false negatives indicate that user evaluation and testing of the software are necessary. Repeat sampling of plants showed considerable variation in the discrepancies between IRIS and IRMS. As such, we recommend that spectral analysis of IRIS data be incorporated into standard post-processing routines. Furthermore, we suggest that the results from spectral analysis be
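
    As an illustration of the flagging rule above, the following minimal Python sketch compares paired IRMS and IRIS values and flags deviations beyond 2σ; the δ(18)O arrays and the assumed IRMS replicate precision are hypothetical, and only the >2σ criterion comes from the abstract.

      import numpy as np

      # Hypothetical paired measurements (per mil, V-SMOW); values are illustrative only.
      d18O_irms = np.array([-5.2, -4.8, -6.1, -3.9, -5.5])
      d18O_iris = np.array([-5.1, -4.9, -8.4, -4.0, -6.9])

      # 2-sigma acceptance band from assumed IRMS replicate precision (0.2 per mil, 1 sigma).
      sigma_irms = 0.2
      deviation = d18O_iris - d18O_irms
      contaminated = np.abs(deviation) > 2 * sigma_irms

      for dev, flag in zip(deviation, contaminated):
          print(f"deviation {dev:+.2f} per mil -> {'suspect spectral interference' if flag else 'ok'}")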

  2. Cooperation on Improved Isotopic Identification and Analysis Software for Portable, Electrically Cooled High-Resolution Gamma Spectrometry Systems Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Dreyer, Jonathan G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Wang, Tzu-Fang [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Vo, Duc T. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Funk, Pierre F. [Inst. for Radiological Protection and Nuclear Safety (IRSN), Fontenay-aux-Roses (France); Weber, Anne-Laure [Inst. for Radiological Protection and Nuclear Safety (IRSN), Fontenay-aux-Roses (France)

    2017-07-20

    Under a 2006 agreement between the Department of Energy (DOE) of the United States of America and the Institut de Radioprotection et de Sûreté Nucléaire (IRSN) of France, the National Nuclear Security Administration (NNSA) within DOE and IRSN initiated a collaboration to improve isotopic identification and analysis of nuclear material [i.e., plutonium (Pu) and uranium (U)]. The specific aim of the collaborative project was to develop new versions of two types of isotopic identification and analysis software: (1) the fixed-energy response-function analysis for multiple energies (FRAM) codes and (2) multi-group analysis (MGA) codes. The project is entitled Action Sheet 4 – Cooperation on Improved Isotopic Identification and Analysis Software for Portable, Electrically Cooled, High-Resolution Gamma Spectrometry Systems (Action Sheet 4). FRAM and MGA/U235HI are software codes used to analyze isotopic ratios of U and Pu. FRAM is an application that uses parameter sets for the analysis of U or Pu. MGA and U235HI are two separate applications that analyze Pu or U, respectively. They have traditionally been used by safeguards practitioners to analyze gamma spectra acquired with high-resolution gamma spectrometry (HRGS) systems that are cooled by liquid nitrogen. However, it was discovered that these analysis programs were not as accurate when used on spectra acquired with a newer generation of more portable, electrically cooled HRGS (ECHRGS) systems. In response to this need, DOE/NNSA and IRSN collaborated to update the FRAM and U235HI codes to improve their performance with newer ECHRGS systems. Lawrence Livermore National Laboratory (LLNL) and Los Alamos National Laboratory (LANL) performed this work for DOE/NNSA.

  3. Uranium Isotopic Analysis with the FRAM Isotopic Analysis Code

    International Nuclear Information System (INIS)

    Vo, D.T.; Sampson, T.E.

    1999-01-01

    FRAM is the acronym for Fixed-Energy Response-Function Analysis with Multiple Efficiencies. This software was developed at Los Alamos National Laboratory, originally for plutonium isotopic analysis. Later, it was adapted for uranium isotopic analysis in addition to plutonium. It is a code based on self-calibration, using several gamma-ray peaks to determine the isotopic ratios. A versatile parameter database structure governs all facets of the data analysis. User editing of the parameter sets allows great flexibility in handling data with different isotopic distributions, interfering isotopes, and different acquisition parameters such as energy calibration and detector type.
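
    The self-calibration idea rests on a standard gamma-spectrometry relation (a sketch in generic notation, not FRAM's internal formulation): for peaks of isotopes i and k with net count rates C, decay constants λ, branching ratios B, and full-energy efficiency ε,

        \frac{N_i}{N_k} = \frac{C(E_i)}{C(E_k)} \cdot \frac{\lambda_k B_k \varepsilon(E_k)}{\lambda_i B_i \varepsilon(E_i)}

    so isotopic ratios follow from measured peak areas once the relative efficiency curve ε(E) is determined from the spectrum itself.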

  4. HAZARD ANALYSIS SOFTWARE

    International Nuclear Information System (INIS)

    Sommer, S; Tinh Tran, T.

    2008-01-01

    Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at the Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process, which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements was developed and compiled into a requirements traceability matrix from which the software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive, with a number of suggestions for improvement that are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures. The software has significantly improved the efficiency and standardization of the hazard analysis process.

  5. Isotope dilution analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fudge, A.

    1978-12-15

    The following aspects of isotope dilution analysis are covered in this report: fundamental aspects of the technique; elements of interest in the nuclear field; choice and standardization of the spike nuclide; pre-treatment to achieve isotopic exchange and chemical separation; sensitivity; selectivity; and accuracy.
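
    The core relation of the technique in its simplest two-isotope form (a sketch in standard notation, not taken from the report): if R denotes the spike-to-reference isotope ratio, with values R_s in the spike, R_x in the sample and R_m in the blend, and n_s and n_x are the amounts of the reference isotope contributed by spike and sample, then

        R_m = \frac{n_x R_x + n_s R_s}{n_x + n_s}
        \quad\Longrightarrow\quad
        n_x = n_s \, \frac{R_s - R_m}{R_m - R_x}

    so a single measured ratio on the blend, together with a known spike addition, yields the amount of analyte.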

  6. Software safety hazard analysis

    International Nuclear Information System (INIS)

    Lawrence, J.D.

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper.

  7. Actinide isotopic analysis systems

    International Nuclear Information System (INIS)

    Koenig, Z.M.; Ruhter, W.D.; Gunnink, R.

    1990-01-01

    This manual provides instructions and procedures for using the Lawrence Livermore National Laboratory's two-detector actinide isotope analysis system to measure plutonium samples with other possible actinides (including uranium, americium, and neptunium) by gamma-ray spectrometry. The computer program that controls the system and analyzes the gamma-ray spectral data is driven by a menu of one-, two-, or three-letter options chosen by the operator. Provided in this manual are descriptions of these options and their functions, plus detailed instructions (operator dialog) for choosing among the options. Also provided are general instructions for calibrating the actinide isotopic analysis system and for monitoring its performance. The inventory measurement of a sample's total plutonium and other actinides content is determined by two nondestructive measurements. One is a calorimetry measurement of the sample's heat or power output, and the other is a gamma-ray spectrometry measurement of its relative isotopic abundances. The isotopic measurements needed to interpret the observed calorimetric power measurement are the relative abundances of various plutonium and uranium isotopes and americium-241. The actinide analysis system carries out these measurements. 8 figs
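
    A minimal Python sketch of how the two nondestructive measurements combine; the specific-power constants are nominal published values (W/g), while the mass fractions and the measured power are invented for illustration.

      # Illustrative calorimetric assay: total Pu = measured power / effective specific power.
      SPECIFIC_POWER = {            # watts per gram of isotope (nominal published values)
          "Pu238": 0.5673, "Pu239": 0.001929, "Pu240": 0.007083,
          "Pu241": 0.003412, "Pu242": 0.0001159, "Am241": 0.1142,
      }

      # Relative abundances from the gamma-ray isotopic measurement (hypothetical;
      # Pu isotopes as mass fractions of total Pu, Am241 relative to total Pu).
      fractions = {"Pu238": 0.0001, "Pu239": 0.938, "Pu240": 0.058,
                   "Pu241": 0.0035, "Pu242": 0.0004, "Am241": 0.0012}

      p_eff = sum(fractions[iso] * SPECIFIC_POWER[iso] for iso in fractions)  # W per g Pu
      measured_power = 0.150        # watts, from the calorimeter (hypothetical)
      print(f"Peff = {p_eff * 1000:.3f} mW/g; total Pu = {measured_power / p_eff:.1f} g")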

  8. Stable isotope analysis

    International Nuclear Information System (INIS)

    Tibari, Elghali; Taous, Fouad; Marah, Hamid

    2014-01-01

    This report presents results of stable isotope analyses carried out at the CNESTEN DASTE in Rabat (Morocco) on behalf of Senegal. These analyses cover 127 samples. Oxygen-18 and deuterium analyses of water were performed by infrared laser spectroscopy using an LGR DLT-100 with autosampler. The results are expressed as δ values (‰) relative to V-SMOW, with uncertainties of ±0.3‰ for oxygen-18 and ±1‰ for deuterium.
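
    The δ notation used here is the standard definition: for an isotope ratio R (e.g., 18O/16O or 2H/1H),

        \delta\,(\text{‰}) = \left( \frac{R_{\text{sample}}}{R_{\text{V-SMOW}}} - 1 \right) \times 1000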

  9. Achievements in testing of the MGA and FRAM isotopic software codes under the DOE/NNSA-IRSN cooperation of gamma-ray isotopic measurement systems

    International Nuclear Information System (INIS)

    Vo, Duc; Wang, Tzu-Fang; Funk, Pierre; Weber, Anne-Laure; Pepin, Nicolas; Karcher, Anna

    2009-01-01

    DOE/NNSA and IRSN collaborated on a study of gamma-ray instruments and analysis methods used to perform isotopic measurements of special nuclear materials. The two agencies agreed to collaborate on the project in response to inconsistencies that were found in the various versions of software and hardware used to determine the isotopic abundances of uranium and plutonium. IRSN used software developed internally to test the MGA and FRAM isotopic analysis codes for criteria used to stop data acquisition. The stop-criterion test revealed several unusual behaviors in both the MGA and FRAM software codes.

  10. Automated Software Vulnerability Analysis

    Science.gov (United States)

    Sezer, Emre C.; Kil, Chongkyung; Ning, Peng

    Despite decades of research, software continues to have vulnerabilities. Successful exploitations of these vulnerabilities by attackers cost businesses and individuals millions of dollars. Unfortunately, most effective defensive measures, such as patching and intrusion prevention systems, require an intimate knowledge of the vulnerabilities. Many systems for detecting attacks have been proposed. However, the analysis of the exploited vulnerabilities is left to security experts and programmers. Both the human effort involved and the slow analysis process are unfavorable for timely defensive measures to be deployed. The problem is exacerbated by zero-day attacks.

  11. Basic methods of isotope analysis

    International Nuclear Information System (INIS)

    Ochkin, A.V.; Rozenkevich, M.B.

    2000-01-01

    The bases of the most widely applied methods of isotope analysis are briefly presented. The possibilities and analytical characteristics of mass-spectrometric, spectral, radiochemical and special methods of isotope analysis, including the application of magnetic resonance, chromatography and refractometry, are considered. (In Russian)

  12. Software quality testing process analysis

    OpenAIRE

    Mera Paz, Julián

    2016-01-01

    Introduction: This article is the result of reading, review, and analysis of books, journals, and articles well known for their scientific and research quality, which address the software quality testing process. The author, based on his work experience in software development companies, teaching and other areas, has compiled and selected information to argue for and substantiate the importance of the software quality testing process. Methodology: the existing literature on the software qualit...

  13. Software FMEA analysis for safety-related application software

    International Nuclear Information System (INIS)

    Park, Gee-Yong; Kim, Dong Hoon; Lee, Dong Young

    2014-01-01

    Highlights: • We develop a modified FMEA analysis suited for applying to software architecture. • A template for failure modes on a specific software language is established. • A detailed-level software FMEA analysis on nuclear safety software is presented. - Abstract: A method of software safety analysis is described in this paper for safety-related application software. The target software system is a software code installed at an Automatic Test and Interface Processor (ATIP) in a digital reactor protection system (DRPS). For the ATIP software safety analysis, at first, an overall safety or hazard analysis is performed over the software architecture and modules, and then a detailed safety analysis based on the software FMEA (Failure Modes and Effects Analysis) method is applied to the ATIP program. For an efficient analysis, the software FMEA analysis is carried out based on the so-called failure-mode template extracted from the function blocks used in the function block diagram (FBD) for the ATIP software. The software safety analysis by the software FMEA analysis, being applied to the ATIP software code, which has been integrated and passed through a very rigorous system test procedure, is proven to be able to provide very valuable results (i.e., software defects) that could not be identified during various system tests.
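
    A minimal Python sketch of the failure-mode-template idea, keyed to FBD block types; the block types, failure modes and field names below are illustrative and are not the paper's actual template.

      # Template: each FBD block type maps to its generic failure modes.
      FAILURE_MODE_TEMPLATE = {
          "AND":  ["output stuck TRUE", "output stuck FALSE", "input not updated"],
          "TON":  ["timer never expires", "timer expires early", "preset corrupted"],
          "MOVE": ["stale value propagated", "wrong source operand"],
      }

      def fmea_rows(block_instance, block_type):
          """Expand one FBD block instance into FMEA worksheet rows."""
          for mode in FAILURE_MODE_TEMPLATE.get(block_type, ["unspecified failure"]):
              yield {"block": block_instance, "type": block_type,
                     "failure_mode": mode, "local_effect": "TBD", "system_effect": "TBD"}

      for row in fmea_rows("trip_logic_01", "AND"):
          print(row)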

  14. Isotope analysis in petroleum exploration

    International Nuclear Information System (INIS)

    Rodrigues, R.

    1982-01-01

    The study of isotopic analysis in petroleum exploration performed at the Petrobras Research Center is presented. The results for petroleum recovery in some Brazilian basins and shelves are commented on. (L.H.L.L.) (In Portuguese)

  15. On-Orbit Software Analysis

    Science.gov (United States)

    Moran, Susanne I.

    2004-01-01

    The On-Orbit Software Analysis Research Infusion Project was done by Intrinsyx Technologies Corporation (Intrinsyx) at the National Aeronautics and Space Administration (NASA) Ames Research Center (ARC). The Project was a joint collaborative effort between NASA Codes IC and SL, Kestrel Technology (Kestrel), and Intrinsyx. The primary objectives of the Project were: Discovery and verification of software program properties and dependencies, Detection and isolation of software defects across different versions of software, and Compilation of historical data and technical expertise for future applications

  16. Software architecture analysis tool : software architecture metrics collection

    NARCIS (Netherlands)

    Muskens, J.; Chaudron, M.R.V.; Westgeest, R.

    2002-01-01

    The Software Engineering discipline lacks the ability to evaluate software architectures. Here we describe a tool for software architecture analysis that is based on metrics. Metrics can be used to detect possible problems and bottlenecks in software architectures. Even though metrics do not give a

  17. Software development for teleroentgenogram analysis

    Science.gov (United States)

    Goshkoderov, A. A.; Khlebnikov, N. A.; Obabkov, I. N.; Serkov, K. V.; Gajniyarov, I. M.; Aliev, A. A.

    2017-09-01

    A framework for the analysis and calculation of teleroentgenograms was developed. Software development was carried out in the Department of Children's Dentistry and Orthodontics at Ural State Medical University. The software calculates the teleroentgenogram by an original method developed in this department, and it also allows users to design their own methods for calculating teleroentgenograms. It is planned to incorporate machine learning (neural networks) into the software; this will make the process of calculating teleroentgenograms easier because methodological points will be placed automatically.

  18. Software Design for Smile Analysis

    Directory of Open Access Journals (Sweden)

    A. Sarkhosh

    2010-12-01

    Introduction: Esthetics and attractiveness of the smile is one of the major demands in contemporary orthodontic treatment. In order to improve a smile design, it is necessary to record a “posed smile” as an intentional, non-pressure, static, natural and reproducible smile. The record then should be analyzed to determine its characteristics. In this study, we intended to design and introduce software to analyze the smile rapidly and precisely in order to produce an attractive smile for the patients. Materials and Methods: For this purpose, a practical study was performed to design the multimedia software “Smile Analysis”, which can receive patients’ photographs and videographs. After giving records to the software, the operator should mark the points and lines which are displayed on the system’s guide and also define the correct scale for each image. Thirty-three variables are measured by the software and displayed on the report page. Reliability of measurements in both image and video was significantly high (0.7-1). Results: In order to evaluate intra-operator and inter-operator reliability, five cases were selected randomly. Statistical analysis showed that calculations performed in the smile analysis software were both valid and highly reliable (for both video and photo). Conclusion: The results obtained from smile analysis could be used in diagnosis, treatment planning and evaluation of the treatment progress.

  19. Molecular Isotopic Distribution Analysis (MIDAs) with adjustable mass accuracy.

    Science.gov (United States)

    Alves, Gelio; Ogurtsov, Aleksey Y; Yu, Yi-Kuo

    2014-01-01

    In this paper, we present Molecular Isotopic Distribution Analysis (MIDAs), a new software tool designed to compute molecular isotopic distributions with adjustable accuracies. MIDAs offers two algorithms, one polynomial-based and one Fourier-transform-based, both of which compute molecular isotopic distributions accurately and efficiently. The polynomial-based algorithm contains a few novel aspects, whereas the Fourier-transform-based algorithm consists mainly of improvements to other existing Fourier-transform-based algorithms. We have benchmarked the performance of the two algorithms implemented in MIDAs against that of eight software packages (BRAIN, Emass, Mercury, Mercury5, NeutronCluster, Qmass, JFC, IC) using a consensus set of benchmark molecules. Under the proposed evaluation criteria, MIDAs's algorithms, JFC, and Emass compute the coarse-grained (low-resolution) isotopic distributions with comparable accuracy and are more accurate than the other software packages. For fine-grained isotopic distributions, we compared IC, MIDAs's polynomial algorithm, and MIDAs's Fourier transform algorithm. Among the three, IC and MIDAs's polynomial algorithm compute isotopic distributions that better resemble their corresponding exact fine-grained (high-resolution) isotopic distributions. MIDAs can be accessed freely through a user-friendly web-interface at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/midas/index.html.
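
    The polynomial-based approach can be sketched in a few lines of Python: each element's isotope abundance vector is treated as a polynomial in nominal mass, and repeated convolution (polynomial multiplication) yields the coarse-grained molecular distribution. The abundances are standard natural values; the example molecule (glucose) and the code are illustrative, not MIDAs itself.

      import numpy as np

      ISOTOPES = {                           # abundances ordered by increasing nominal mass
          "C": [0.9893, 0.0107],             # 12C, 13C
          "H": [0.999885, 0.000115],         # 1H, 2H
          "O": [0.99757, 0.00038, 0.00205],  # 16O, 17O, 18O
      }

      def element_distribution(element, count):
          dist = np.array([1.0])
          single = np.array(ISOTOPES[element])
          for _ in range(count):
              dist = np.convolve(dist, single)   # polynomial multiplication
          return dist

      formula = {"C": 6, "H": 12, "O": 6}        # glucose, as an example
      mol = np.array([1.0])
      for elem, n in formula.items():
          mol = np.convolve(mol, element_distribution(elem, n))

      for k, p in enumerate(mol[:5]):
          print(f"M+{k}: {p:.5f}")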

  20. Application of Software Safety Analysis Methods

    International Nuclear Information System (INIS)

    Park, G. Y.; Hur, S.; Cheon, S. W.; Kim, D. H.; Lee, D. Y.; Kwon, K. C.; Lee, S. J.; Koo, Y. H.

    2009-01-01

    A fully digitalized reactor protection system, which is called the IDiPS-RPS, was developed through the KNICS project. The IDiPS-RPS has four redundant and separated channels. Each channel is mainly composed of a group of bistable processors which redundantly compare process variables with their corresponding setpoints and a group of coincidence processors that generate a final trip signal when a trip condition is satisfied. Each channel also contains a test processor called the ATIP and a display and command processor called the COM. All the functions were implemented in software. During the development of the safety software, various software safety analysis methods were applied, in parallel with the verification and validation (V and V) activities, along the software development life cycle. The software safety analysis methods employed were the software hazard and operability (Software HAZOP) study, the software fault tree analysis (Software FTA), and the software failure modes and effects analysis (Software FMEA).

  1. Calcium Isotope Analysis by Mass Spectrometry

    Science.gov (United States)

    Boulyga, S.; Richter, S.

    2010-12-01

    The variations in the isotopic composition of calcium caused by fractionation in heterogeneous systems and by nuclear reactions can provide insight into numerous biological, geological, and cosmic processes, and therefore isotopic analysis finds a wide spectrum of applications in cosmo- and geochemistry and in paleoclimatic, nutritional, and biomedical studies. The measurement of calcium isotopic abundances in natural samples has challenged analysts for more than three decades. Practically all Ca isotopes suffer from significant isobaric interferences, and low-abundance isotopes can be particularly affected by neighboring major isotopes. The extent of natural variations of stable isotopes appears to be relatively limited, and highly precise techniques are required to resolve isotopic effects. Isotope fractionation during sample preparation and measurement, as well as instrumental mass bias, can significantly exceed the small isotope abundance variations in samples and has to be investigated. Not surprisingly, the TIMS procedure developed by Russell et al. (Russell et al., 1978, Geochim Cosmochim Acta 42: 1075-1090) for Ca isotope measurements was considered revolutionary for isotopic measurements in general, and that approach is used nowadays (with small modifications) for practically all isotopic systems and with different mass spectrometric techniques. Nevertheless, despite several decades of calcium research and the corresponding development of mass spectrometers, the available precision and accuracy are still not always sufficient to achieve the challenging goals. This presentation discusses figures of merit of the presently used analytical methods and instrumentation, and attempts to critically assess their limitations. Additionally, the availability of Ca isotope reference materials will be discussed.
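
    For reference, the Russell et al. (1978) approach is the exponential mass-fractionation law: a measured ratio of isotopes with masses m_i and m_j is corrected as

        R_{ij}^{\text{corr}} = R_{ij}^{\text{meas}} \left( \frac{m_i}{m_j} \right)^{\beta}

    where β is fixed by normalizing a second, invariant isotope ratio to its accepted value.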

  2. MAUS: MICE Analysis User Software

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The Muon Ionization Cooling Experiment (MICE) has developed the MICE Analysis User Software (MAUS) to simulate and analyse experimental data. It serves as the primary codebase for the experiment, providing for online data quality checks and offline batch simulation and reconstruction. The code is structured in a Map-Reduce framework to allow parallelization whether on a personal machine or in the control room. Various software engineering practices from industry are also used to ensure correct and maintainable physics code, which include unit, functional and integration tests, continuous integration and load testing, code reviews, and distributed version control systems. Lastly, there are various small design decisions like using JSON as the data structure, using SWIG to allow developers to write components in either Python or C++, or using the SCons python-based build system that may be of interest to other experiments.
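
    A minimal Python sketch of the Map-Reduce pattern described above, with JSON documents standing in for spills of data; the function names and document fields are invented for illustration and are not the MAUS API.

      import json
      from functools import reduce

      # Each "spill" is a JSON document; mappers transform one spill at a time
      # (parallelizable), and a reducer aggregates across spills.
      spills = [json.dumps({"spill": i, "tracker_hits": 3 * i % 7}) for i in range(10)]

      def map_reconstruct(spill_json):           # illustrative mapper
          doc = json.loads(spill_json)
          doc["reconstructed"] = doc["tracker_hits"] > 0
          return doc

      def reduce_count(summary, doc):            # illustrative reducer
          summary["spills"] += 1
          summary["with_hits"] += int(doc["reconstructed"])
          return summary

      mapped = map(map_reconstruct, spills)
      print(reduce(reduce_count, mapped, {"spills": 0, "with_hits": 0}))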

  3. Development of a code for the isotopic analysis of Uranium

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. H.; Kang, M. Y.; Kim, Jinhyeong; Choi, H. D. [Seoul National Univ., Seoul (Korea, Republic of)

    2013-05-15

    To strengthen the national nuclear nonproliferation regime through the establishment of a nuclear forensic system, techniques for nuclear material analysis and the categorization of important domestic nuclear materials are being developed. MGAU and FRAM are commercial software packages for the isotopic analysis of uranium by γ-spectroscopy, but the diversity of detection geometries and effects such as self-attenuation and coincidence summing call for an analysis tool under continual improvement and modification. Hence, the development of another code for HPGe γ- and x-ray spectrum analysis has been started in this study. Analysis of the 87-101 keV region of the uranium spectrum is attempted, based on isotopic responses similar to those developed in MGAU. Development of the code starts from the fitting of this region.
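
    A minimal Python sketch of the basic building block for analyzing such a region: least-squares fitting of a Gaussian peak on a linear background. The synthetic data, peak position and initial guesses are illustrative, not a real uranium spectrum.

      import numpy as np
      from scipy.optimize import curve_fit

      def peak_model(e, area, centroid, sigma, b0, b1):
          gauss = area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((e - centroid) / sigma) ** 2)
          return gauss + b0 + b1 * e

      energy = np.linspace(87.0, 101.0, 140)                      # keV
      rng = np.random.default_rng(0)
      truth = peak_model(energy, 5000, 93.35, 0.45, 200, -1.0)    # synthetic "true" spectrum
      counts = rng.poisson(np.clip(truth, 0, None)).astype(float)

      p0 = [4000, 93.0, 0.5, 150, 0.0]                            # initial guesses
      popt, pcov = curve_fit(peak_model, energy, counts, p0=p0, sigma=np.sqrt(counts + 1))
      print(f"fitted area = {popt[0]:.0f} +/- {np.sqrt(pcov[0, 0]):.0f} counts")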

  4. Gamma-ray isotopic analysis development at Los Alamos

    Energy Technology Data Exchange (ETDEWEB)

    Thomas E. Sampson

    1999-11-01

    This report describes the development history and characteristics of software developed in the Safeguards Science and Technology group at Los Alamos for gamma-ray isotopic analysis. This software analyzes the gamma-ray spectrum from measurements performed on actinide samples (principally plutonium and uranium) of arbitrary size, geometry, and physical and chemical composition. The results are obtained without calibration using only fundamental tabulated nuclear constants. Characteristics of the current software versions are discussed in some detail and many examples of implemented measurement systems are shown.

  5. Software Performs Complex Design Analysis

    Science.gov (United States)

    2008-01-01

    Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help gain greater understanding of the structural response of components to loads, stresses and strains, and the prediction of failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: 1) inadequate shape parameterization algorithms, and 2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to utilize its revolutionary, one-of-a-kind arbitrary shape deformation (ASD) capability, a major advancement in solving these two aforementioned problems, to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing the CFD designers to freely create their own shape parameters, therefore eliminating the restriction of only being able to use the computer-aided design (CAD) parameters. The problem of inadequate algorithms for CFD grid modification is solved by the fact that the new software performs a smooth volumetric deformation. This eliminates the extremely costly process of having to remesh the grid for every shape change desired. The program can perform a design change in a markedly reduced amount of time, a process that would traditionally involve the designer returning to the CAD model to reshape and then remesh the shapes, something that has been known to take hours, days, even weeks or months, depending upon the size of the model.

  6. Intercomparison of gamma ray analysis software packages

    International Nuclear Information System (INIS)

    1998-04-01

    The IAEA undertook an intercomparison exercise to review available software for gamma-ray spectrum analysis. This document describes the methods used in the intercomparison exercise, characterizes the software packages reviewed, and presents the results obtained. Only direct results are given, without any recommendation for a particular software package or method for gamma-ray spectrum analysis.

  7. Infusing Reliability Techniques into Software Safety Analysis

    Science.gov (United States)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  8. Measurement system analysis (MSA) of the isotopic ratio for uranium isotope enrichment process control

    Energy Technology Data Exchange (ETDEWEB)

    Medeiros, Josue C. de; Barbosa, Rodrigo A.; Carnaval, Joao Paulo R., E-mail: josue@inb.gov.br, E-mail: rodrigobarbosa@inb.gov.br, E-mail: joaocarnaval@inb.gov.br [Industrias Nucleares do Brasil (INB), Rezende, RJ (Brazil)

    2013-07-01

    Currently, one of the stages in nuclear fuel cycle development is the process of uranium isotope enrichment, which will provide the amount of low enriched uranium for nuclear fuel production to supply 100% of Angra 1 and 20% of Angra 2 demands. Determination of the isotopic ratio n(235U)/n(238U) in uranium hexafluoride (UF6 - used as process gas) is essential for control of the enrichment process of isotopic separation by gaseous centrifugation cascades. The uranium hexafluoride process is performed by continuous feeding of the gas into a separation unit that uses the centrifugal force principle, establishing a density gradient in a gas containing components of different molecular weights. The elemental separation effect occurs in a single ultracentrifuge, which results in a partial separation of the feed into two fractions: an enriched one (product) and a depleted one (waste) in the desired isotope (235UF6). Industrias Nucleares do Brasil (INB) has used quadrupole mass spectrometry (QMS) by electron impact (EI) to perform isotopic ratio n(235U)/n(238U) analysis in the process. Decisions on adjustments and changes to the input variables are based on the results of these analyses. A study of stability, bias and linearity determination has been performed in order to evaluate the applied method and the variations and systematic errors in the measurement system. The software used for the analyses above was Minitab 15. (author)
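
    A minimal Python sketch of the bias and linearity portion of such an MSA study: reference standards of known ratio are measured repeatedly, bias is the mean deviation from the reference value, and linearity is the trend of bias across the reference range. All numbers below are invented.

      import numpy as np

      reference = np.repeat([0.00712, 0.03000, 0.04500], 5)   # known n(235U)/n(238U) standards
      rng = np.random.default_rng(1)
      measured = reference * (1 + rng.normal(0.002, 0.003, reference.size))  # hypothetical QMS data

      bias = measured - reference
      slope, intercept = np.polyfit(reference, bias, 1)        # linearity: bias vs reference
      print(f"mean bias = {bias.mean():.2e}")
      print(f"linearity: bias = {slope:.4f} * reference + {intercept:.2e}")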

  9. Fault tree analysis of KNICS RPS software

    International Nuclear Information System (INIS)

    Park, Gee Yong; Kwon, Kee Choon; Koh, Kwang Yong; Jee, Eun Kyoung; Seong, Poong Hyun; Lee, Dae Hyung

    2008-01-01

    This paper describes the application of a software Fault Tree Analysis (FTA) as one of the analysis techniques for a Software Safety Analysis (SSA) at the design phase and its analysis results for the safety-critical software of a digital reactor protection system, which is called the KNICS RPS, being developed in the KNICS (Korea Nuclear Instrumentation and Control Systems) project. The software modules in the design description were represented by Function Blocks (FBs), and the software FTA was performed based on the well-defined fault tree templates for the FBs. The SSA, which is part of the verification and validation (V and V) activities, was activated at each phase of the software lifecycle for the KNICS RPS. At the design phase, the software HAZOP (Hazard and Operability) and the software FTA were employed in the SSA in such a way that the software HAZOP was performed first and then the software FTA was applied. The software FTA was applied to some critical modules selected from the software HAZOP analysis.

  10. Dependability Analysis Methods For Configurable Software

    International Nuclear Information System (INIS)

    Dahll, Gustav; Pulkkinen, Urho

    1996-01-01

    Configurable software systems are systems which are built up from standard software components, in the same way as a hardware system is built up from standard hardware components. Such systems are often used in the control of NPPs, also in safety related applications. A reliability analysis of such systems is therefore necessary. This report discusses what configurable software is and what is particular about the reliability assessment of such software. Two techniques very commonly used in traditional reliability analysis, viz. failure mode, effects and criticality analysis (FMECA) and fault tree analysis, are investigated. A real example is used to illustrate the discussed methods. Various aspects relevant to the assessment of software reliability in such systems are discussed. Finally some models for quantitative software reliability assessment applicable to configurable software systems are described. (author)

  11. Reliability analysis of software based safety functions

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1993-05-01

    The methods applicable to the reliability analysis of software based safety functions are described in the report. Although the safety functions also include other components, the main emphasis in the report is on the reliability analysis of software. The checklist-type qualitative reliability analysis methods, such as failure mode and effects analysis (FMEA), are described, as well as software fault tree analysis. Safety analysis based on Petri nets is discussed. The most essential concepts and models of quantitative software reliability analysis are described. The most common software metrics and their combined use with software reliability models are discussed. The application of software reliability models in PSA is evaluated; it is observed that recent software reliability models do not directly produce the estimates needed in PSA. As a result of the study some recommendations and conclusions are drawn: the need for formal methods in the analysis and development of software based systems, the applicability of qualitative reliability engineering methods in connection with PSA, and the need to make more precise the requirements for software based systems and their analyses in the regulatory guides. (orig.). (46 refs., 13 figs., 1 tab.)
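
    As one concrete example of the quantitative models this literature draws on, a Python sketch of the Goel-Okumoto NHPP reliability growth model follows; the parameters are illustrative, not fitted to real failure data, and the model choice is a common example rather than one the report prescribes.

      import numpy as np

      a, b = 120.0, 0.015      # expected total failures; detection rate per test-hour (illustrative)

      def mu(t):               # expected cumulative failures by time t
          return a * (1.0 - np.exp(-b * t))

      def intensity(t):        # instantaneous failure intensity
          return a * b * np.exp(-b * t)

      for t in (0.0, 100.0, 400.0):
          print(f"t={t:5.0f} h: E[failures]={mu(t):6.1f}, intensity={intensity(t):.3f}/h")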

  12. Software architecture analysis of usability

    NARCIS (Netherlands)

    Folmer, Eelke

    2005-01-01

    One of the qualities that has received increased attention in recent decades is usability. A software product with poor usability is likely to fail in a highly competitive market; therefore software developing organizations are paying more and more attention to ensuring the usability of their

  13. Nuclear Fuel Depletion Analysis Using Matlab Software

    Science.gov (United States)

    Faghihi, F.; Nematollahi, M. R.

    Coupled first-order IVPs are frequently used in many parts of engineering and the sciences. In this article, we present a code comprising three computer programs, used in conjunction with the Matlab software, to solve and plot the solutions of first-order coupled stiff or non-stiff IVPs. Some engineering and scientific problems related to IVPs are given, and fuel depletion (production of the 239Pu isotope) in a Pressurized Water Reactor (PWR) is computed by the present code.
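
    A sketch of the kind of coupled IVP involved, written in Python rather than Matlab: a simplified 239Pu production chain under constant one-group flux (238U capture to 239Np to 239Pu, with the short-lived 239U step folded in). The cross sections and flux are rough illustrative values.

      import numpy as np
      from scipy.integrate import solve_ivp

      phi = 3e13                              # neutron flux, n/cm^2/s (illustrative)
      sig_c_238 = 2.7e-24                     # 238U capture cross section, cm^2
      sig_a_239 = 1.0e-21                     # 239Pu absorption cross section, cm^2
      lam_np = np.log(2) / (2.36 * 86400)     # 239Np decay constant (2.36 d half-life)

      def rhs(t, n):
          n238, n_np, n_pu = n
          return [-sig_c_238 * phi * n238,
                  sig_c_238 * phi * n238 - lam_np * n_np,
                  lam_np * n_np - sig_a_239 * phi * n_pu]

      sol = solve_ivp(rhs, (0, 3.15e7), [1.0, 0.0, 0.0], method="LSODA", rtol=1e-8)
      print(f"239Pu per initial 238U atom after ~1 year: {sol.y[2, -1]:.3e}")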

  14. Development of Emittance Analysis Software for Ion Beam Characterization

    International Nuclear Information System (INIS)

    Padilla, M.J.; Liu, Yuan

    2007-01-01

    Transverse beam emittance is a crucial property of charged particle beams that describes their angular and spatial spread. It is a figure of merit frequently used to determine the quality of ion beams, the compatibility of an ion beam with a given beam transport system, and the ability to suppress neighboring isotopes at on-line mass separator facilities. Generally, a high-quality beam is characterized by a small emittance. In order to determine and improve the quality of ion beams used at the Holifield Radioactive Ion Beam Facility (HRIBF) for nuclear physics and nuclear astrophysics research, the emittances of the ion beams are measured at the off-line Ion Source Test Facilities. In this project, emittance analysis software was developed to perform various data processing tasks for noise reduction, to evaluate root-mean-square emittance, Twiss parameters, and area emittance of different beam fractions. The software also provides 2D and 3D graphical views of the emittance data, beam profiles, emittance contours, and RMS. Noise exclusion is essential for accurate determination of beam emittance values. A Self-Consistent, Unbiased Elliptical Exclusion (SCUBEEx) method is employed. Numerical data analysis techniques such as interpolation and nonlinear fitting are also incorporated into the software. The software will provide a simplified, fast tool for comprehensive emittance analysis. The main functions of the software package have been completed. In preliminary tests with experimental emittance data, the analysis results using the software were shown to be accurate.
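
    The core quantities named above follow directly from second moments of the (x, x') distribution; a Python sketch with a synthetic beam, using the standard rms-emittance and Twiss definitions (centroid assumed at the origin):

      import numpy as np

      rng = np.random.default_rng(2)
      x = rng.normal(0.0, 1.5, 10000)              # mm
      xp = 0.3 * x + rng.normal(0.0, 0.8, 10000)   # mrad, correlated with x

      x2, xp2, xxp = np.mean(x**2), np.mean(xp**2), np.mean(x * xp)
      eps_rms = np.sqrt(x2 * xp2 - xxp**2)         # rms emittance, mm mrad

      beta, gamma, alpha = x2 / eps_rms, xp2 / eps_rms, -xxp / eps_rms   # Twiss parameters
      print(f"eps_rms={eps_rms:.3f} mm mrad, beta={beta:.2f}, alpha={alpha:.2f}, gamma={gamma:.2f}")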

  15. DEVELOPMENT OF EMITTANCE ANALYSIS SOFTWARE FOR ION BEAM CHARACTERIZATION

    Energy Technology Data Exchange (ETDEWEB)

    Padilla, M. J.; Liu, Y.

    2007-01-01

    Transverse beam emittance is a crucial property of charged particle beams that describes their angular and spatial spread. It is a figure of merit frequently used to determine the quality of ion beams, the compatibility of an ion beam with a given beam transport system, and the ability to suppress neighboring isotopes at on-line mass separator facilities. Generally a high quality beam is characterized by a small emittance. In order to determine and improve the quality of ion beams used at the Holifield Radioactive Ion Beam Facility (HRIBF) for nuclear physics and nuclear astrophysics research, the emittances of the ion beams are measured at the off-line Ion Source Test Facilities. In this project, emittance analysis software was developed to perform various data processing tasks for noise reduction, to evaluate root-mean-square emittance, Twiss parameters, and area emittance of different beam fractions. The software also provides 2D and 3D graphical views of the emittance data, beam profiles, emittance contours, and RMS. Noise exclusion is essential for accurate determination of beam emittance values. A Self-Consistent, Unbiased Elliptical Exclusion (SCUBEEx) method is employed. Numerical data analysis techniques such as interpolation and nonlinear fitting are also incorporated into the software. The software will provide a simplified, fast tool for comprehensive emittance analysis. The main functions of the software package have been completed. In preliminary tests with experimental emittance data, the analysis results using the software were shown to be accurate.

  16. Analysis of open source GIS software

    OpenAIRE

    Božnis, Andrius

    2006-01-01

    GIS is one of the most promising spheres of information technology. GIS combines digital image analysis and database systems, which makes it widely applicable and demanding of very high skills. There is a lot of commercial GIS software which is well advertised and whose functionality is fairly well known, while open source software is often forgotten. In this diploma work, an analysis is made of the open source GIS software available on the Internet, in the scope of different projects interr...

  17. Real-time software for multi-isotopic source term estimation

    International Nuclear Information System (INIS)

    Goloubenkov, A.; Borodin, R.; Sohier, A.

    1996-01-01

    Consideration is given to the development of software for one of the crucial components of RODOS - assessment of the source rate (SR) from indirect measurements. Four components of the software are described in the paper. The first component is a GRID system, which allows stochastic meteorological and radioactivity fields to be prepared using measured data. The second part is a model of atmospheric transport which can be adapted to emulate practically any gamma dose/spectrum detectors. The third is a method which allows space-time and quantitative discrepancies between measured and modelled data to be taken into account simultaneously. It is based on a preference scheme selected by an expert. The last component is a special optimization method for calculation of the multi-isotopic SR and its uncertainties. Results of a validation of the software using tracer experiment data and a Chernobyl source estimation for the main dose-forming isotopes are included in the paper.

  18. Gamma-Ray Spectrum Analysis Software GDA

    International Nuclear Information System (INIS)

    Wanabongse, P.

    1998-01-01

    The developmental work on computer software for gamma-ray spectrum analysis has been completed as a software package, version 1.02, named GDA, which is an acronym for Gamma-spectrum Deconvolution and Analysis. The software package consists of three 3.5-inch diskettes for setup and a user's manual. GDA software can be installed for use on a personal computer with the Windows 95 or Windows NT 4.0 operating system. The computer may be of the 80486 CPU type with 8 megabytes of memory.

  19. Software architecture analysis of usability

    NARCIS (Netherlands)

    Folmer, E; van Gurp, J; Bosch, J; Bastide, R; Palanque, P; Roth, J

    2005-01-01

    Studies of software engineering projects show that a large number of usability-related change requests are made after deployment. Fixing usability problems during the later stages of development often proves to be costly, since many of the necessary changes require changes to the system that

  20. Software ASPRO-NUC. Gamma-ray spectrometry, routine NAA, isotope identification and data management

    International Nuclear Information System (INIS)

    Kolotov, V.P.; Atrashkevich, V.V.

    1995-01-01

    The software ASPRO-NUC is based on new improved algorithms suggested and tested in the laboratory and intended for routine analysis. The package consists of the program ASPRO for gamma-ray spectra processing (peak search, deconvolution of multiplets by the method of moments, computation of correction coefficients for the geometry and material of the radioactive source), a program for isotope identification and a program for NAA by means of relative standardization. All output information is loaded into a database (Paradox v.3.5 format) to support queries, creation of reports, planning of routine analyses, estimation of expenses, support of a network of analytical surveys, etc. The ASPRO-NUC package also includes a vast nuclear database containing evaluated decay and activation data (reactor, generator of fast neutrons, Cf-252 source). The database environment allows for easy integration of a gamma spectrometer into a flexible information shell and the creation of a logical system for information management. (author) 15 refs.; 2 figs.; 2 tabs

  1. Development of a fatigue analysis software system

    International Nuclear Information System (INIS)

    Choi, B. I.; Lee, H. J.; Han, S. W.; Kim, J. Y.; Hwang, K. H.; Kang, J. Y.

    2001-01-01

    A general-purpose fatigue analysis software package to predict the fatigue lives of mechanical components and structures was developed. This software has some characteristic features, including functions for searching weak regions on the free surface in order to reduce computing time significantly, a database of fatigue properties for various materials, and an expert system which can assist users in obtaining proper results. The software can be used in an environment consisting of commercial finite element packages. Using the developed software, fatigue analyses for an SAE keyhole specimen and an automobile knuckle were carried out. It was observed that the results agreed well with those from commercial packages.

  2. Numerical methods in software and analysis

    CERN Document Server

    Rice, John R

    1992-01-01

    Numerical Methods, Software, and Analysis, Second Edition introduces science and engineering students to the methods, tools, and ideas of numerical computation. Introductory courses in numerical methods face a fundamental problem: there is too little time to learn too much. This text solves that problem by using high-quality mathematical software. In fact, the objective of the text is to present scientific problem solving using standard mathematical software. This book discusses numerous programs and software packages focusing on the IMSL library (including the PROTRAN system) and ACM Algorithm

  3. Software Process Improvement Using Force Field Analysis ...

    African Journals Online (AJOL)

    An improvement plan is then drawn and implemented. This paper studied the state of Nigerian software development organizations based on selected attributes. Force field analysis is used to partition the factors obtained into driving and restraining forces. An attempt was made to improve the software development process ...

  4. Analysis of barium by isotope mass spectrometry

    International Nuclear Information System (INIS)

    Long Kaiming; Jia Baoting; Liu Xuemei

    2004-01-01

    The isotopic abundance ratios for barium at the sub-microgram level are analyzed by thermal surface ionization mass spectrometry (TIMS). Rhenium strips used for sample preparation are first treated to eliminate possible barium background interference. During the preparation of barium samples, phosphoric acid is added as an emitting and stabilizing reagent. The addition of phosphoric acid increases the collection efficiency and the ion current strength and stability for barium. A relative standard deviation of 0.02% for the isotopic abundance ratio of 137Ba to 138Ba is achieved when the 138Ba ion current is (1-3) x 10^-12 A. The experimental results also demonstrate that the isotope fractionation effect is negligibly small in the isotopic analysis of barium.

  5. Software safety analysis practice in installation phase

    Energy Technology Data Exchange (ETDEWEB)

    Huang, H. W.; Chen, M. H.; Shyu, S. S., E-mail: hwhwang@iner.gov.t [Institute of Nuclear Energy Research, No. 1000 Wenhua Road, Chiaan Village, Longtan Township, 32546 Taoyuan County, Taiwan (China)

    2010-10-15

    This work performed a software safety analysis in the installation phase of the Lungmen nuclear power plant in Taiwan, under the cooperation of the Institute of Nuclear Energy Research and TPC. The US Nuclear Regulatory Commission requires licensees to perform software safety analysis and software verification and validation in each phase of the software development life cycle with Branch Technical Position 7-14. In this work, 37 safety grade digital instrumentation and control systems were analyzed by failure mode and effects analysis, which is suggested by IEEE Standard 7-4.3.2-2003. During the installation phase, skew tests for the safety grade network and point-to-point tests were performed. The failure mode and effects analysis showed that all the single failure modes can be resolved by redundant means. Most of the common mode failures can be resolved by operator manual actions. (Author)

  6. Software safety analysis practice in installation phase

    International Nuclear Information System (INIS)

    Huang, H. W.; Chen, M. H.; Shyu, S. S.

    2010-10-01

    This work performed a software safety analysis in the installation phase of the Lungmen nuclear power plant in Taiwan, under the cooperation of the Institute of Nuclear Energy Research and TPC. The US Nuclear Regulatory Commission requires licensees to perform software safety analysis and software verification and validation in each phase of the software development life cycle with Branch Technical Position 7-14. In this work, 37 safety grade digital instrumentation and control systems were analyzed by failure mode and effects analysis, which is suggested by IEEE Standard 7-4.3.2-2003. During the installation phase, skew tests for the safety grade network and point-to-point tests were performed. The failure mode and effects analysis showed that all the single failure modes can be resolved by redundant means. Most of the common mode failures can be resolved by operator manual actions. (Author)

  7. Computer-assisted qualitative data analysis software.

    Science.gov (United States)

    Cope, Diane G

    2014-05-01

    Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.

  8. Software safety analysis application in installation phase

    International Nuclear Information System (INIS)

    Huang, H. W.; Yih, S.; Wang, L. H.; Liao, B. C.; Lin, J. M.; Kao, T. M.

    2010-01-01

    This work performed a software safety analysis (SSA) in the installation phase of the Lungmen nuclear power plant (LMNPP) in Taiwan, under the cooperation of INER and TPC. The US Nuclear Regulatory Commission (USNRC) requires licensees to perform software safety analysis (SSA) and software verification and validation (SV and V) in each phase of the software development life cycle with Branch Technical Position (BTP) 7-14. In this work, 37 safety grade digital instrumentation and control (I and C) systems were analyzed by Failure Mode and Effects Analysis (FMEA), which is suggested by IEEE Standard 7-4.3.2-2003. During the installation phase, skew tests for the safety grade network and point-to-point tests were performed. The FMEA showed that all the single failure modes can be resolved by redundant means. Most of the common mode failures can be resolved by operator manual actions. (authors)

  9. Software criticality analysis of COTS/SOUP

    International Nuclear Information System (INIS)

    Bishop, Peter; Bloomfield, Robin; Clement, Tim; Guerra, Sofia

    2003-01-01

    This paper describes the Software Criticality Analysis (SCA) approach that was developed to support the justification of using commercial off-the-shelf software (COTS) in a safety-related system. The primary objective of SCA is to assess the importance to safety of the software components within the COTS and to show there is segregation between software components with different safety importance. The approach taken was a combination of Hazops based on design documents and on a detailed analysis of the actual code (100 kloc). Considerable effort was spent on validation and ensuring the conservative nature of the results. The results from reverse engineering from the code showed that results based only on architecture and design documents would have been misleading

  10. Software criticality analysis of COTS/SOUP

    Energy Technology Data Exchange (ETDEWEB)

    Bishop, Peter; Bloomfield, Robin; Clement, Tim; Guerra, Sofia

    2003-09-01

    This paper describes the Software Criticality Analysis (SCA) approach that was developed to support the justification of using commercial off-the-shelf software (COTS) in a safety-related system. The primary objective of SCA is to assess the importance to safety of the software components within the COTS and to show there is segregation between software components with different safety importance. The approach taken was a combination of Hazops based on design documents and on a detailed analysis of the actual code (100 kloc). Considerable effort was spent on validation and ensuring the conservative nature of the results. The results from reverse engineering from the code showed that results based only on architecture and design documents would have been misleading.

  11. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed software to help manage large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; core photo and image analyses; waveforms and NMR; and external file documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board. The product includes emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software that features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In

  12. The CSSIAR v.1.00 Software: A new tool based on SIAR to assess soil redistribution using Compound Specific Stable Isotopes

    Directory of Open Access Journals (Sweden)

    de los Santos-Villalobos Sergio

    2017-01-01

    Soil erosion is one of the biggest challenges for food production around the world. Many techniques have been used to evaluate and mitigate soil degradation. Nowadays, isotopic techniques are becoming a powerful tool to assess soil apportionment. One of the innovative techniques used is Compound Specific Stable Isotope (CSSI) analysis, which has been used to track down sediments and specify their sources by the isotopic signature of δ13C in specific fatty acids. The application of this technique to soil apportionment has recently been developed; however, there is a lack of user-friendly software for data processing and interpretation. The aim of this article is to introduce a new open source tool for working with data sets generated by the use of the CSSI technique to assess soil apportionment, called the CSSIAR v.1.00 Software.

  13. The CSSIAR v.1.00 Software: A new tool based on SIAR to assess soil redistribution using Compound Specific Stable Isotopes

    Science.gov (United States)

    Sergio, de los Santos-Villalobos; Claudio, Bravo-Linares; dos Anjos Roberto, Meigikos; Renan, Cardoso; Max, Gibbs; Andrew, Swales; Lionel, Mabit; Gerd, Dercon

    Soil erosion is one of the biggest challenges for food production around the world. Many techniques have been used to evaluate and mitigate soil degradation. Nowadays, isotopic techniques are becoming a powerful tool to assess soil apportionment. One of the innovative techniques used is Compound Specific Stable Isotope (CSSI) analysis, which has been used to track down sediments and specify their sources by the isotopic signature of δ13C in specific fatty acids. The application of this technique to soil apportionment has been developed only recently; however, there is a lack of user-friendly software for data processing and interpretation. The aim of this article is to introduce a new open source tool for working with data sets generated by the use of the CSSI technique to assess soil apportionment, called the CSSIARv1.00 Software.

  14. Acoustic Emission Analysis Applet (AEAA) Software

    Science.gov (United States)

    Nichols, Charles T.; Roth, Don J.

    2013-01-01

    NASA Glenn Research and NASA White Sands Test Facility have developed software supporting an automated pressure vessel structural health monitoring (SHM) system based on acoustic emissions (AE). The software, referred to as the Acoustic Emission Analysis Applet (AEAA), provides analysts with a tool that can interrogate data collected on Digital Wave Corp. and Physical Acoustics Corp. software using a wide spectrum of powerful filters and charts. This software can be made to work with any data once the data format is known. The applet will compute basic AE statistics, as well as statistics as a function of time and pressure. AEAA adds value beyond the analysis provided by the respective vendors' analysis software, and can handle data sets of unlimited size. A wide variety of government and commercial applications could benefit from this technology, notably requalification and usage tests for compressed gas and hydrogen-fueled vehicles. Future enhancements will add features similar to a "check engine" light on a vehicle. Once installed, the system will ultimately be used to alert International Space Station crewmembers to critical structural instabilities, but will otherwise have little impact on missions. Diagnostic information could then be transmitted to experienced technicians on the ground in a timely manner to determine whether pressure vessels have been impacted, are structurally unsound, or can be safely used to complete the mission.

  15. Principles of isotopic analysis by mass spectrometry

    International Nuclear Information System (INIS)

    Herrmann, M.

    1980-01-01

    The use of magnetic sector field mass spectrometers in isotopic analysis, especially for nitrogen gas, is outlined. Two measuring methods are pointed out: the scanning mode for significantly enriched samples and the double-collector method for samples near the natural abundance of ¹⁵N. The calculation formulas are derived and advice is given for corrections. (author)
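
    As a rough illustration of the double-collector arithmetic (a minimal sketch under an assumed convention, not the formulas derived in the paper): for N₂ gas with a statistical distribution of isotopologues, the ratio R of the m/z 29 to m/z 28 ion currents equals 2x/(1 - x), where x is the ¹⁵N atom fraction, so x = R/(2 + R).

        def n15_atom_fraction(i28: float, i29: float) -> float:
            """15N atom fraction from a double-collector N2 measurement.

            Assumes a statistical isotopologue distribution, so that the
            29/28 current ratio R = 2x/(1 - x), and hence x = R/(2 + R).
            """
            r = i29 / i28
            return r / (2.0 + r)

        # Example: natural abundance (~0.366 atom% 15N) corresponds to R ~ 0.00735
        print(f"{100 * n15_atom_fraction(1.0, 0.00735):.3f} atom% 15N")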

  16. Isotopic neutron sources for neutron activation analysis

    International Nuclear Information System (INIS)

    Hoste, J.

    1988-06-01

    This User's Manual is an attempt to provide, for teaching and training purposes, a series of well-thought-out demonstration experiments in neutron activation analysis based on the utilization of an isotopic neutron source. In some cases, these ideas can be applied to solve practical analytical problems. 19 refs, figs and tabs

  17. Introducing a New Software for Geodetic Analysis

    Science.gov (United States)

    Hjelle, Geir Arne; Dähnn, Michael; Fausk, Ingrid; Kirkvik, Ann-Silje; Mysen, Eirik

    2017-04-01

    At the Norwegian Mapping Authority, we are currently developing Where, a new software package for geodetic analysis. Where is built on our experience with the Geosat software and will be able to analyse and combine data from VLBI, SLR, GNSS and DORIS. The software is mainly written in Python, which has proved very fruitful: the code is quick to write and the architecture is easily extendable and maintainable, while at the same time taking advantage of well-tested code like the SOFA and IERS libraries. This presentation will show some of the current capabilities of Where, including benchmarks against other software packages, and outline our plans for further progress. In addition, we will report on some investigations we have done experimenting with alternative weighting strategies for VLBI.

  18. Software for Graph Analysis and Visualization

    Directory of Open Access Journals (Sweden)

    M. I. Kolomeychenko

    2014-01-01

    Full Text Available This paper describes software for graph storage, analysis and visualization. The article presents a comparative analysis of existing software for the analysis and visualization of graphs, describes the overall architecture of the application, and explains the basic principles of construction and operation of its main modules. Furthermore, a description of the developed graph storage, oriented to the storage and processing of large-scale graphs, is presented. The developed community-detection algorithm and the implemented graph auto-layout algorithms constitute the main functionality of the product. The main advantage of the developed software is high-speed processing of large networks (up to millions of nodes and links). Moreover, the proposed graph storage architecture is unique and has no analogues. The developed approaches and algorithms are optimized for operating on big graphs and offer high performance.

  19. Power and performance software analysis and optimization

    CERN Document Server

    Kukunas, Jim

    2015-01-01

    Power and Performance: Software Analysis and Optimization is a guide to solving performance problems in modern Linux systems. Power-efficient chips are no help if the software those chips run on is inefficient. Starting with the necessary architectural background as a foundation, the book demonstrates the proper usage of performance analysis tools in order to pinpoint the cause of performance problems, and includes best practices for handling common performance issues those tools identify. Provides expert perspective from a key member of Intel's optimization team on how processors and memory

  20. PIV/HPIV Film Analysis Software Package

    Science.gov (United States)

    Blackshire, James L.

    1997-01-01

    A PIV/HPIV film analysis software system was developed that calculates the 2-dimensional spatial autocorrelations of subregions of Particle Image Velocimetry (PIV) or Holographic Particle Image Velocimetry (HPIV) film recordings. The software controls three hardware subsystems including (1) a Kodak Megaplus 1.4 camera and EPIX 4MEG framegrabber subsystem, (2) an IEEE/Unidex 11 precision motion control subsystem, and (3) an Alacron I860 array processor subsystem. The software runs on an IBM PC/AT host computer running either the Microsoft Windows 3.1 or Windows 95 operating system. It is capable of processing five PIV or HPIV displacement vectors per second, and is completely automated with the exception of user input to a configuration file prior to analysis execution for update of various system parameters.

  1. Software for computerised analysis of cardiotocographic traces.

    Science.gov (United States)

    Romano, M; Bifulco, P; Ruffo, M; Improta, G; Clemente, F; Cesarelli, M

    2016-02-01

    Despite the widespread use of cardiotocography in foetal monitoring, the evaluation of foetal status suffers from considerable inter- and intra-observer variability. In order to overcome the main limitations of visual cardiotocographic assessment, computerised methods to analyse cardiotocographic recordings have recently been developed. In this study, new software for automated analysis of foetal heart rate is presented. It provides an automatic procedure for measuring the most relevant parameters derivable from cardiotocographic traces. Simulated and real cardiotocographic traces were analysed to test software reliability. In artificial traces, we simulated a set number of events (accelerations, decelerations and contractions) to be recognised. In the case of real signals, results of the computerised analysis were instead compared with the visual assessment performed by 18 expert clinicians, and three performance indexes were computed to gain information about the performance of the proposed software. The software showed preliminary performance that we judged satisfactory, in that the results completely matched the requirements, as proved by tests on artificial signals in which all simulated events were detected by the software. Performance indexes computed in comparison with the obstetricians' evaluations are, on the contrary, less satisfactory: they yielded a sensitivity of 93%, a positive predictive value of 82% and an accuracy of 77%. This most probably arises from the high variability of trace annotation carried out by clinicians. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
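
    The three indexes quoted above have standard definitions in terms of true and false detections; the abstract does not publish the underlying counts, so the sketch below only illustrates the arithmetic behind figures of that magnitude, under the assumption (common in event-detection studies with no defined true negatives) that accuracy is taken as TP / (TP + FP + FN).

        def performance_indexes(tp: int, fp: int, fn: int):
            """Event-detection indexes; accuracy without true negatives is
            assumed here to be TP / (TP + FP + FN)."""
            sensitivity = tp / (tp + fn)    # share of true events detected
            ppv = tp / (tp + fp)            # share of detections that are real
            accuracy = tp / (tp + fp + fn)
            return sensitivity, ppv, accuracy

        # Hypothetical counts chosen only to reproduce indexes of similar size
        print(performance_indexes(tp=93, fp=20, fn=7))  # ~ (0.93, 0.82, 0.78)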

  2. The Need to Support and Maintain Legacy Software: Ensuring Ongoing Support for the Isotopics Codes

    International Nuclear Information System (INIS)

    Weber, A.-L.; Funk, P.; McGinnis, B.; Vo, D.; Wang, T.-F.; Peerani, P.; Zsigrai, J.; )

    2015-01-01

    For about four decades, gamma evaluation codes for plutonium and uranium isotope abundance measurements have been a key component of international, regional and domestic safeguards inspections. However, the development of these codes still relies upon a very limited number of experts. This has led the safeguards authorities to express concerns and to request continuity of knowledge and maintenance capability for the codes. The presentation describes initiatives undertaken in the past ten years to ensure ongoing support for the isotopic codes. As a follow-up to the 2005 international workshop, the IAEA issued a roadmap for future developments of gamma codes, followed by a request for support in this field to several MSSPs (namely JNT A 01684). The international working group on gamma spectrometry techniques for U and Pu isotopics (IWG-GST) was launched by the European, French and US MSSPs in 2007 to respond to the needs expressed by the IAEA and other national or international inspectorates. Its activities started with the organization in 2008 of a workshop on gamma spectrometry analysis codes for U and Pu isotopics. The working group is currently developing an international database of reference spectra that will be made available to the community of users and developers. In parallel, IRSN contributes to JNT A 01684 by advising the IAEA on establishing a procedure for validating a new version of the isotopics codes against the previous version. The most recent initiative, proposed by the IAEA, consists of organizing an inter-comparison exercise to assess the performance of U and Pu isotopics and mass assay techniques based on medium resolution gamma spectrometry (MRGS). All these initiatives have contributed to the continuity of knowledge and maintenance of the gamma isotopic codes, but further efforts are needed to ensure their long-term sustainability. (author)

  3. GRACAT, Software for grounding and collision analysis

    DEFF Research Database (Denmark)

    Friis-Hansen, Peter; Simonsen, Bo Cerup

    2002-01-01

    and grounding accidents. The software consists of three basic analysis modules and one risk mitigation module: 1) frequency, 2) damage, and 3) consequence. These modules can be used individually or in series and the analyses can be performed in deterministic or probabilistic mode. Finally, in the mitigation...

  4. Residence time distribution software analysis. User's manual

    International Nuclear Information System (INIS)

    1996-01-01

    Radiotracer applications cover a wide range of industrial activities in chemical and metallurgical processes, water treatment, mineral processing, environmental protection and civil engineering. Experiment design, data acquisition, treatment and interpretation are the basic elements of tracer methodology. The application of radiotracers to determine the impulse response, i.e. the residence time distribution (RTD), as well as the technical conditions for conducting experiments in industry and in the environment, creates a need for data processing using special software. Important progress has been made in recent years in the preparation of software programs for data treatment and interpretation. The software package developed for industrial process analysis and diagnosis by stimulus-response methods contains all the methods for processing data from radiotracer experiments

  5. Improving Software Systems By Flow Control Analysis

    Directory of Open Access Journals (Sweden)

    Piotr Poznanski

    2012-01-01

    Full Text Available Using agile methods during the implementation of a system that meets mission-critical requirements can be a real challenge. Changing a system built of dozens or even hundreds of specialized devices with embedded software requires the cooperation of a large group of engineers. This article presents a solution that supports the parallel work of groups of system analysts and software developers. Applying formal rules to requirements written in natural language enables formal analysis of artifacts that form a bridge between software and system requirements. The formalism and textual form of the requirements allowed the automatic generation of a message flow graph for the (sub)system, called the "big-picture-model". Flow graph analysis helped to avoid a large number of defects whose repair cost, in extreme cases, could undermine the legitimacy of agile methods in projects of this scale. Retrospectively, a reduction of technical debt was observed. Continuous analysis of the "big picture model" improves control of the quality parameters of the software architecture. The article also tries to explain why a commercial platform based on the UML modeling language may not be sufficient in projects of this complexity.

  6. Intraprocedural dataflow analysis for software product lines

    DEFF Research Database (Denmark)

    Brabrand, Claus; Ribeiro, Márcio; Tolêdo, Társis

    2013-01-01

    Software product lines (SPLs) developed using annotative approaches such as conditional compilation come with an inherent risk of constructing erroneous products. For this reason, it is essential to be able to analyze such SPLs. However, as dataflow analysis techniques are not able to deal with SP...... and memory characteristics on five qualitatively different SPLs. On our benchmarks, the combined analysis strategy is up to almost eight times faster than the brute-force approach....

  7. Automating risk analysis of software design models.

    Science.gov (United States)

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling, two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  8. Isotope dilution analysis of environmental samples

    International Nuclear Information System (INIS)

    Tolgyessy, J.; Lesny, J.; Korenova, Z.; Klas, J.; Klehr, E.H.

    1986-01-01

    Isotope dilution analysis has been used for the determination of several trace elements - especially metals - in a variety of environmental samples, including aerosols, water, soils, biological materials and geological materials. Variations of the basic concept include classical IDA, substoichiometric IDA, and more recently, sub-superequivalence IDA. Each variation has its advantages and limitations. A periodic chart has been used to identify those elements which have been measured in environmental samples using one or more of these methods. (author)

  9. Hydrogen isotope analysis by quadrupole mass spectrometry

    International Nuclear Information System (INIS)

    Ellefson, R.E.; Moddeman, W.E.; Dylla, H.F.

    1981-03-01

    The analysis of the isotopes of hydrogen (H, D, T) and helium (³He, ⁴He) and selected impurities using a quadrupole mass spectrometer (QMS) has been investigated as a method of measuring the purity of tritium gas for injection into the Tokamak Fusion Test Reactor (TFTR). A QMS was used at low resolution, m/Δm, ³He, and ⁴He in HT/D₂

  10. Radiation and environmental data analysis computer (REDAC) hardware, software and analysis procedures

    International Nuclear Information System (INIS)

    Hendricks, T.J.

    1985-01-01

    The REDAC was conceived originally as a tape verifier for the Radiation and Environmental Data Acquisition Recorder (REDAR). From that simple beginning in 1971, the REDAC has evolved into a family of systems used for complete analysis of data obtained by the REDAR and other acquisition systems. Portable or mobile REDACs are deployed to support checkout and analysis tasks in the field. Laboratory systems are additionally used for software development, physics investigations, data base management and graphics. System configurations range from man-portable systems to a large laboratory-based system which supports time-shared analysis and development tasks. Custom operating software allows the analyst to process data either interactively or by batch procedures. Analysis packages are provided for numerous necessary functions. All these analysis procedures can be performed even on the smallest man-portable REDAC. Examples of the multi-isotope stripping and radiation isopleth mapping are presented. Techniques utilized for these operations are also presented

  11. Development of neutron activation analysis software

    International Nuclear Information System (INIS)

    Wang Liyu

    1987-10-01

    Software for quantitative neutron activation analysis was developed to run under the MS/DOS operating system. The IBM/SPAN programmes include: spectra file transfer from and to a Canberra Series 35 multichannel analyzer, spectrum evaluation routines, calibration subprogrammes, and quantitative analysis. The programmes for spectrum analysis include a fitting routine for the separation of multiple lines, which reproduces the peak shape with a combination of Gaussian and exponential terms. The programmes were tested on an IBM/AT-compatible computer. The programmes and their sources are available cost-free for IAEA Technical Cooperation projects. 7 refs, 3 figs

  12. Impact analysis of a hydrogen isotopes container

    International Nuclear Information System (INIS)

    Lee, M. S.; Hwang, C. S.; Jeong, H. S.

    2003-01-01

    The container used for radioactive materials containing hydrogen isotopes is evaluated with a view to hypothetical accidents. Computational analysis is a cost-effective tool to minimize testing and streamline the regulatory procedures, and it supports experimental programs to qualify the container for the safe transport of radioactive materials. The numerical analysis of a 9 m free drop onto a flat, unyielding, horizontal surface has been performed using the explicit finite element computer program ABAQUS. In particular, free-drop simulations for the 30° tilted condition are precisely estimated

  13. IUE Data Analysis Software for Personal Computers

    Science.gov (United States)

    Thompson, R.; Caplinger, J.; Taylor, L.; Lawton , P.

    1996-01-01

    This report summarizes the work performed for the program titled, "IUE Data Analysis Software for Personal Computers" awarded under Astrophysics Data Program NRA 92-OSSA-15. The work performed was completed over a 2-year period starting in April 1994. As a result of the project, 450 IDL routines and eight database tables are now available for distribution for Power Macintosh computers and Personal Computers running Windows 3.1.

  14. Application of Metric-based Software Reliability Analysis to Example Software

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Smidts, Carol

    2008-07-01

    The software reliability of the TELLERFAST ATM software is analyzed by using two metric-based software reliability analysis methods, a state transition diagram-based method and a test coverage-based method. The procedures for software reliability analysis using the two methods, and the analysis results, are provided in this report. It is found that the two methods are complementary, and further research on combining the two methods, so as to exploit this complementary effect in software reliability analysis, is recommended

  15. Measurement of isotope abundance variations in nature by gravimetric spiking isotope dilution analysis (GS-IDA).

    Science.gov (United States)

    Chew, Gina; Walczyk, Thomas

    2013-04-02

    Subtle variations in the isotopic composition of elements carry unique information about physical and chemical processes in nature and are now exploited widely in diverse areas of research. Reliable measurement of natural isotope abundance variations is among the biggest challenges in inorganic mass spectrometry, as such measurements are highly sensitive to methodological bias. For decades, double spiking of the sample with a mix of two stable isotopes has been considered the reference technique for measuring such variations, both by multicollector-inductively coupled plasma mass spectrometry (MC-ICPMS) and multicollector-thermal ionization mass spectrometry (MC-TIMS). However, this technique can only be applied to elements having at least four stable isotopes. Here we present a novel approach that requires measurement of three isotope signals only and which is more robust than the conventional double spiking technique. This became possible by gravimetric mixing of the sample with an isotopic spike in different proportions and by applying principles of isotope dilution for data analysis (GS-IDA). The potential and principal use of the technique is demonstrated for Mg in human urine using MC-TIMS for isotopic analysis. Mg is an element inaccessible to double spiking methods as it consists of three stable isotopes only, and it shows great potential for metabolically induced isotope effects waiting to be explored.
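
    For orientation only, here is the single-blend isotope dilution relation that underlies approaches like GS-IDA (the gravimetric multi-proportion scheme of the paper adds a regression over several blends, which is not reproduced here). If a blend of sample and spike is measured for the ratio of isotope a to isotope b, the analyte amount follows from the isotope abundances of sample and spike; all numbers in the example are made up.

        def ida_amount(n_spike: float,
                       a_spike: float, b_spike: float,
                       a_sample: float, b_sample: float,
                       r_blend: float) -> float:
            """Moles of analyte from classic isotope dilution.

            a_* and b_* are atom fractions of isotopes a and b in spike
            and sample; r_blend is the measured a/b ratio of the blend.
            Derived by solving
              r_blend = (n_x*a_sample + n_s*a_spike)
                        / (n_x*b_sample + n_s*b_spike)
            for n_x.
            """
            return n_spike * (a_spike - r_blend * b_spike) / (r_blend * b_sample - a_sample)

        # Hypothetical abundances; the call recovers n_x = 2.0 mol exactly
        print(ida_amount(1.0, 0.95, 0.03, 0.10, 0.79, 1.15 / 1.61))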

  16. Digital PIV (DPIV) Software Analysis System

    Science.gov (United States)

    Blackshire, James L.

    1997-01-01

    A software package was developed to provide a Digital PIV (DPIV) capability for NASA LaRC. The system provides automated image capture, test correlation, and autocorrelation analysis capabilities for the Kodak Megaplus 1.4 digital camera system for PIV measurements. The package includes three separate programs that, when used together with the PIV data validation algorithm, constitute a complete DPIV analysis capability. The programs run on an IBM PC/AT host computer running either Microsoft Windows 3.1 or Windows 95, using a 'quickwin' format that allows a simple user interface and output capabilities in the Windows environment.

  17. Stable isotope analysis in primatology: a critical review.

    Science.gov (United States)

    Sandberg, Paul A; Loudon, James E; Sponheimer, Matt

    2012-11-01

    Stable isotope analysis has become an important tool in ecology over the last 25 years. A wealth of ecological information is stored in animal tissues in the relative abundances of the stable isotopes of several elements, particularly carbon and nitrogen, because these isotopes navigate through ecological processes in predictable ways. Stable carbon and nitrogen isotopes have been measured in most primate taxonomic groups and have yielded information about dietary content, dietary variability, and habitat use. Stable isotopes have recently proven useful for addressing more fine-grained questions about niche dynamics and anthropogenic effects on feeding ecology. Here, we discuss stable carbon and nitrogen isotope systematics and critically review the published stable carbon and nitrogen isotope data for modern primates with a focus on the problems and prospects for future stable isotope applications in primatology. © 2012 Wiley Periodicals, Inc.

  18. Specdata: Automated Analysis Software for Broadband Spectra

    Science.gov (United States)

    Oliveira, Jasmine N.; Martin-Drumel, Marie-Aline; McCarthy, Michael C.

    2017-06-01

    With the advancement of chirped-pulse techniques, broadband rotational spectra with a few tens to several hundred GHz of spectral coverage are now routinely recorded. When studying multi-component mixtures that might result, for example, from the use of an electrical discharge, lines of new chemical species are often obscured by those of known compounds, and analysis can be laborious. To address this issue, we have developed SPECdata, an open source, interactive tool designed to simplify and greatly accelerate spectral analysis and discovery. Our software tool combines both automated and manual components that free the user from computation, while giving them considerable flexibility to assign, manipulate, interpret and export their analysis. The automated, and key, component of the new software is a database query system that rapidly assigns transitions of known species in an experimental spectrum. For each experiment, the software identifies spectral features and subsequently assigns them to known molecules within an in-house database (Pickett .cat files, list of frequencies...), or those catalogued in Splatalogue (using automatic on-line queries). With suggested assignments, control is then handed over to the user, who can choose to accept, decline or add additional species. Data visualization, statistical information, and interactive widgets assist the user in making decisions about their data. SPECdata has several other useful features intended to improve the user experience. Exporting a full report of the analysis, or a peak file in which assigned lines are removed, is among several options. A user may also save their progress to continue at another time. Additional features of SPECdata help the user to maintain and expand their database for future use. A user-friendly interface allows one to search, upload, edit or update catalog or experiment entries.

  19. Software development processes and analysis software: a mismatch and a novel framework

    International Nuclear Information System (INIS)

    Kelly, D.; Harauz, J.

    2011-01-01

    This paper discusses the salient characteristics of analysis software and the impact of those characteristics on its development. From this discussion, it can be seen that mainstream software development processes, usually characterized as Plan Driven or Agile, are built upon assumptions that are mismatched to the development and maintenance of analysis software. We propose a novel software development framework that would match the process normally observed in the development of analysis software. In the discussion of this framework, we suggest areas of research and directions for future work. (author)

  20. CMS Computing Software and Analysis Challenge 2006

    Energy Technology Data Exchange (ETDEWEB)

    De Filippis, N. [Dipartimento interateneo di Fisica M. Merlin and INFN Bari, Via Amendola 173, 70126 Bari (Italy)

    2007-10-15

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with its data handling model. With this purpose the Computing, Software and Analysis Challenge 2006, namely CSA06, started on the 15th of September. It was a 50 million event exercise that included all the steps of the analysis chain, such as prompt reconstruction, data streaming, iterative execution of calibration and alignment, data distribution to regional sites, and end-user analysis. Grid tools provided by the LCG project were also exercised to gain access to the data and the resources, by providing a user-friendly interface to the physicists submitting the production and analysis jobs. An overview of the status and results of CSA06 is presented in this work.

  1. CMS Computing Software and Analysis Challenge 2006

    International Nuclear Information System (INIS)

    De Filippis, N.

    2007-01-01

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with its data handling model. With this purpose the Computing, Software and Analysis Challenge 2006, namely CSA06, started on the 15th of September. It was a 50 million event exercise that included all the steps of the analysis chain, such as prompt reconstruction, data streaming, iterative execution of calibration and alignment, data distribution to regional sites, and end-user analysis. Grid tools provided by the LCG project were also exercised to gain access to the data and the resources, by providing a user-friendly interface to the physicists submitting the production and analysis jobs. An overview of the status and results of CSA06 is presented in this work

  2. Applications of stable isotope analysis in mammalian ecology.

    Science.gov (United States)

    Walter, W David; Kurle, Carolyn M; Hopkins, John B

    2014-01-01

    In this editorial, we provide a brief introduction and summarize the 10 research articles included in this Special Issue on Applications of stable isotope analysis in mammalian ecology. The first three articles report correction and discrimination factors that can be used to more accurately estimate the diets of extinct and extant mammals using stable isotope analysis. The remaining seven applied research articles use stable isotope analysis to address a variety of wildlife conservation and management questions from the oceans to the mountains.

  3. Static analysis of software the abstract interpretation

    CERN Document Server

    Boulanger, Jean-Louis

    2013-01-01

    The existing literature currently available to students and researchers is very general, covering only the formal techniques of static analysis. This book presents real examples of the formal techniques called "abstract interpretation" currently being used in various industrial fields: railway, aeronautics, space, automotive, etc. The purpose of this book is to present students and researchers, in a single book, with the wealth of experience of people who are intrinsically involved in the realization and evaluation of software-based safety critical systems. As the authors are people curr

  4. A new paradigm for the development of analysis software

    International Nuclear Information System (INIS)

    Kelly, D.; Harauz, J.

    2012-01-01

    For the CANDU industry, analysis software is an important tool for scientists and engineers to examine issues related to safety, operation, and design. However, the software quality assurance approach currently used for these tools assumes the software is the delivered product. In this paper, we present a model that shifts the emphasis from software being the end-product to software being support for the end-product, the science. We describe a novel software development paradigm that supports this shift and provides the groundwork for re-examining the quality assurance practices used for analysis software. (author)

  5. Modularity analysis of automotive control software

    OpenAIRE

    Dajsuren, Y.; Brand, van den, M.G.J.; Serebrenik, A.

    2013-01-01

    A design language and tool like MATLAB/Simulink is used for the graphical modelling and simulation of automotive control software. As the functionality based on electronics and software systems increases in motor vehicles, it is becoming increasingly important for system/software architects and control engineers in the automotive industry to ensure the quality of the highly complex MATLAB/Simulink control software. For automotive software, modularity is recognized as being a crucial quality a...

  6. Carbon isotope analysis in apple nectar beverages

    Directory of Open Access Journals (Sweden)

    Ricardo Figueira

    2013-03-01

    Full Text Available The aims of this study were to use isotope analysis to quantify the carbon from the C3 photosynthetic cycle in commercial apple nectars and to determine the legal limit for identifying beverages that do not conform to the safety standards established by the Brazilian Ministry of Agriculture, Livestock and Food Supply. These beverages (apple nectars) were produced in the laboratory according to the Brazilian legislation. Adulterated nectars were also produced, with an amount of pulp juice below the permitted threshold limit value. The δ13C values of the apple nectars and their fractions (pulp and purified sugar) were measured to quantify the C3 source percentage. In order to demonstrate the existence of adulteration, the values found were compared to the limit values established by Brazilian law. All commercial apple nectars analyzed were within the legal limits, which made it possible to identify the nectars that were in conformity with Brazilian law. The isotopic methodology developed proved efficient for quantifying the carbon of C3 origin in commercial apple nectars.
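
    The C3 share in such studies is conventionally obtained from a two-end-member δ13C mixing model. A minimal sketch with assumed end-member values (the article derives its actual limits from measured pulp and sugar fractions, not from these constants):

        # Assumed end-member values, roughly typical of the literature:
        # C3 plants (e.g. apple) near -28 permil, C4 (e.g. cane sugar) near -12 permil.
        DELTA_C3 = -28.0
        DELTA_C4 = -12.0

        def percent_c3(delta_sample: float) -> float:
            """Share of C3-derived carbon from a two-source mixing model."""
            frac = (delta_sample - DELTA_C4) / (DELTA_C3 - DELTA_C4)
            return 100.0 * min(max(frac, 0.0), 1.0)   # clamp to [0, 100] %

        print(percent_c3(-24.8))  # -> 80.0 (% of carbon from the C3 source)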

  7. Development of Image Analysis Software of MAXI

    Science.gov (United States)

    Eguchi, S.; Ueda, Y.; Hiroi, K.; Isobe, N.; Sugizaki, M.; Suzuki, M.; Tomida, H.; Maxi Team

    2010-12-01

    Monitor of All-sky X-ray Image (MAXI) is an X-ray all-sky monitor, attached to the Japanese experiment module Kibo on the International Space Station. The main scientific goals of the MAXI mission include the discovery of X-ray novae followed by prompt alerts to the community (Negoro et al., in this conference), and production of X-ray all-sky maps and new source catalogs with unprecedented sensitivities. To extract the best capabilities of the MAXI mission, we are working on the development of detailed image analysis tools. We utilize maximum likelihood fitting to a projected sky image, where we take account of the complicated detector responses, such as the background and point spread functions (PSFs). The modeling of PSFs, which strongly depend on the orbit and attitude of MAXI, is a key element in the image analysis. In this paper, we present the status of our software development.

  8. STAR: Software Toolkit for Analysis Research

    International Nuclear Information System (INIS)

    Doak, J.E.; Prommel, J.M.; Whiteson, R.; Hoffbauer, B.L.; Thomas, T.R.; Helman, P.

    1993-01-01

    Analyzing vast quantities of data from diverse information sources is an increasingly important element of nonproliferation and arms control analysis. Much of the work in this area has used human analysts to assimilate, integrate, and interpret complex information gathered from various sources. With the advent of fast computers, we now have the capability to automate this process, thereby shifting this burden away from humans. In addition, there now exist huge data storage capabilities, which have made it possible to formulate large integrated databases comprising many terabytes of information spanning a variety of subjects. We are currently designing a Software Toolkit for Analysis Research (STAR) to address these issues. The goal of STAR is to produce a research tool that facilitates the development and interchange of algorithms for locating phenomena of interest to nonproliferation and arms control experts. One major component deals with the preparation of information. The ability to manage and effectively transform raw data into a meaningful form is a prerequisite for analysis by any methodology. The relevant information to be analyzed can be unstructured text, structured data, signals, or images. Text can be numerical and/or character, stored in raw data files, databases, streams of bytes, or compressed into bits, in formats ranging from fixed, to character-delimited, to a count followed by content. The data can be analyzed in real-time or batch mode. Once the data are preprocessed, different analysis techniques can be applied. Some are built using expert knowledge. Others are trained using data collected over a period of time. Currently, we are considering three classes of analyzers for use in our software toolkit: (1) traditional machine learning techniques, (2) purely statistical systems, and (3) expert systems

  9. The ESA's Space Trajectory Analysis software suite

    Science.gov (United States)

    Ortega, Guillermo

    The European Space Agency (ESA) initiated in 2005 an internal activity to develop an open source software suite involving university science departments and research institutions all over the world. This project is called "Space Trajectory Analysis", or STA. This article describes the birth of STA and its present configuration. One of STA's aims is to promote the exchange of technical ideas and to raise knowledge and competence in the areas of applied mathematics, space engineering, and informatics at university level. Conceived as a research and education tool to support the analysis phase of a space mission, STA is able to visualize a wide range of space trajectories. These include, among others, ascent, re-entry, descent and landing trajectories, orbits around planets and moons, interplanetary trajectories, and rendezvous trajectories. The article explains that the STA project is an original idea of the Technical Directorate of ESA. It was born in August 2005 to provide a framework for astrodynamics research at university level. As research and education software applicable to academia, a number of universities support this development by joining ESA in leading it. The partnership between ESA and the universities is expressed in the STA Steering Board. Together with ESA, each university has a chair on the board, whose tasks are to develop, control, promote, maintain, and expand the software suite. The article describes that STA provides calculations in the fields of spacecraft tracking, attitude analysis, coverage and visibility analysis, orbit determination, position and velocity of solar system bodies, etc. STA implements the concept of a "space scenario" composed of solar system bodies, spacecraft, ground stations, pads, etc. It is able to propagate the orbit of a spacecraft where orbital propagators are included. STA is able to compute communication links between objects of a scenario (coverage, line of sight), and to represent the trajectory computations and

  10. Effective Results Analysis for the Similar Software Products’ Orthogonality

    Directory of Open Access Journals (Sweden)

    Ion Ivan

    2009-10-01

    Full Text Available The concept of similar software is defined. Conditions for archiving the software components are established. The orthogonality evaluation is carried out, and the correlation between the orthogonality and the complexity of the homogeneous software components is analyzed. Groups of similar software products, belonging to the orthogonality intervals, are then built. The results of the analysis are presented in graphical form. Aspects of the functioning of the software product allocated for the orthogonality are detailed.

  11. Modularity analysis of automotive control software

    NARCIS (Netherlands)

    Dajsuren, Y.; Brand, van den M.G.J.; Serebrenik, A.

    2013-01-01

    A design language and tool like MATLAB/Simulink is used for the graphical modelling and simulation of automotive control software. As the functionality based on electronics and software systems increases in motor vehicles, it is becoming increasingly important for system/software architects and

  12. Basic methods of isotope analysis; Osnovnye metody analiza izotopov

    Energy Technology Data Exchange (ETDEWEB)

    Ochkin, A V; Rozenkevich, M B

    2000-07-01

    The bases of the most widely applied methods of isotope analysis are briefly presented. The possibilities and analytical characteristics of the mass-spectrometric, spectral, radiochemical and special methods of isotope analysis, including the application of magnetic resonance, chromatography and refractometry, are considered.

  13. Elementary study on γ analysis software for low level measurement

    International Nuclear Information System (INIS)

    Ruan Guanglin; Huang Xianguo; Xing Shixiong

    2001-01-01

    The difficulty of using fashionable γ analysis software in low-level measurement is discussed. The ROI report file of the ORTEC operating system has been chosen as the interface file for writing γ analysis software for low-level measurement. The author gives a software flowchart and an application example, and discusses existing problems

  14. Visual querying and analysis of large software repositories

    NARCIS (Netherlands)

    Voinea, Lucian; Telea, Alexandru

    We present a software framework for mining software repositories. Our extensible framework enables the integration of data extraction from repositories with data analysis and interactive visualization. We demonstrate the applicability of the framework by presenting several case studies performed on

  15. Isotope analysis of closely adjacent minerals

    International Nuclear Information System (INIS)

    Smith, M.P.

    1990-01-01

    This patent describes a method of determining an indicator of at least one of hydrocarbon formation, migration, and accumulation during mineral development. It comprises: searching for a class of minerals in a mineral specimen comprising more than one class of minerals; identifying in the mineral specimen a target sample of the thus searched for class; directing thermally pyrolyzing laser beam radiation onto surface mineral substance of the target sample in the mineral specimen releasing surface mineral substance pyrolysate gases therefrom; and determining isotope composition essentially of the surface mineral substance from analyzing the pyrolysate gases released from the thus pyrolyzed target sample, the isotope composition including isotope(s) selected from the group consisting of carbon, hydrogen, and oxygen isotopes; determining an indicator of at least one of hydrocarbon formation, migration, and accumulation during mineral development of the target mineral from thus determined isotope composition of surface mineral substance pyrolysate

  16. Image processing and analysis software development

    International Nuclear Information System (INIS)

    Shahnaz, R.

    1999-01-01

    The work presented in this project is aimed at developing software, 'IMAGE GALLERY', to investigate various image processing and analysis techniques. The work was divided into two parts, namely image processing techniques and pattern recognition, the latter further comprising character and face recognition. Various image enhancement techniques, including negative imaging, contrast stretching, dynamic range compression, neon, diffuse, emboss, etc., have been studied. Segmentation techniques, including point detection, line detection and edge detection, have been studied, and some of the smoothing and sharpening filters have been investigated. All these imaging techniques have been implemented in a window-based computer program written in Visual Basic. Neural network techniques based on the perceptron model have been applied for face and character recognition. (author)

  17. Software applications for flux balance analysis.

    Science.gov (United States)

    Lakshmanan, Meiyappan; Koh, Geoffrey; Chung, Bevan K S; Lee, Dong-Yup

    2014-01-01

    Flux balance analysis (FBA) is a widely used computational method for characterizing and engineering intrinsic cellular metabolism. The increasing number of its successful applications and growing popularity are possibly attributable to the availability of specific software tools for FBA. Each tool has its unique features and limitations with respect to operational environment, user-interface and supported analysis algorithms. Presented herein is an in-depth evaluation of currently available FBA applications, focusing mainly on usability, functionality, graphical representation and inter-operability. Overall, most of the applications are able to perform basic features of model creation and FBA simulation. COBRA toolbox, OptFlux and FASIMU are versatile to support advanced in silico algorithms to identify environmental and genetic targets for strain design. SurreyFBA, WEbcoli, Acorn, FAME, GEMSiRV and MetaFluxNet are the distinct tools which provide the user friendly interfaces in model handling. In terms of software architecture, FBA-SimVis and OptFlux have the flexible environments as they enable the plug-in/add-on feature to aid prospective functional extensions. Notably, an increasing trend towards the implementation of more tailored e-services such as central model repository and assistance to collaborative efforts was observed among the web-based applications with the help of advanced web-technologies. Furthermore, most recent applications such as the Model SEED, FAME, MetaFlux and MicrobesFlux have even included several routines to facilitate the reconstruction of genome-scale metabolic models. Finally, a brief discussion on the future directions of FBA applications was made for the benefit of potential tool developers.
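
    As context for what these tools automate, FBA reduces to a linear program: maximize an objective flux subject to the steady-state constraint Sv = 0 and flux bounds. A minimal, self-contained sketch on a made-up toy network (not tied to any of the tools above):

        import numpy as np
        from scipy.optimize import linprog

        # Toy network: uptake -> A, A -> B, B -> biomass; fluxes v = (v1, v2, v3)
        S = np.array([[1, -1,  0],    # metabolite A balance
                      [0,  1, -1]])   # metabolite B balance
        bounds = [(0, 10), (0, None), (0, None)]   # uptake capped at 10 units

        # FBA: maximize the biomass flux v3 under S v = 0
        # (linprog minimizes, so we minimize -v3)
        res = linprog(c=[0, 0, -1], A_eq=S, b_eq=np.zeros(2), bounds=bounds)
        print(res.x)   # -> [10. 10. 10.]: optimal biomass flux equals the uptake cap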

  18. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    Science.gov (United States)

    Sun, Z. J.; Wells, D.; Segebade, C.; Green, J.

    2011-06-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time and manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum, and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it possible to use it anywhere the internet is accessible. By switching the underlying nuclide library and the related formulas, the new software can easily be expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.

  19. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    International Nuclear Information System (INIS)

    Sun, Z. J.; Wells, D.; Green, J.; Segebade, C.

    2011-01-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time and manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum, and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it possible to use it anywhere the internet is accessible. By switching the underlying nuclide library and the related formulas, the new software can easily be expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.
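
    Concentration calculation in activation analysis is commonly done relative to a co-irradiated standard. The sketch below shows that comparator-style arithmetic under simplifying assumptions (identical irradiation and counting geometry, a single decay correction); it is an illustration of the general method, not the program's actual algorithm.

        import math

        def concentration_comparator(area_sample: float, mass_sample: float,
                                     t_sample: float,
                                     area_std: float, mass_std: float,
                                     t_std: float,
                                     conc_std: float, half_life: float) -> float:
            """Comparator method: element concentration in the sample from
            decay-corrected specific peak areas (t_* are decay times,
            in the same units as half_life)."""
            lam = math.log(2.0) / half_life
            spec_sample = area_sample / (mass_sample * math.exp(-lam * t_sample))
            spec_std = area_std / (mass_std * math.exp(-lam * t_std))
            return conc_std * spec_sample / spec_std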

  20. System and method for high precision isotope ratio destructive analysis

    Science.gov (United States)

    Bushaw, Bruce A; Anheier, Norman C; Phillips, Jon R

    2013-07-02

    A system and process are disclosed that provide high accuracy and high precision destructive analysis measurements for isotope ratio determination of relative isotope abundance distributions in liquids, solids, and particulate samples. The invention utilizes a collinear probe beam to interrogate a laser ablated plume. This invention provides enhanced single-shot detection sensitivity approaching the femtogram range, and isotope ratios that can be determined at approximately 1% or better precision and accuracy (relative standard deviation).

  1. Unit of stable isotopic N15 analysis

    International Nuclear Information System (INIS)

    Cabrera de Bisbal, Evelin; Paredes U, Maria

    1997-01-01

    The continuous and growing demand for crops and livestock among the domestic population forces the search for technical solutions in agriculture. One solution achievable in the near future is the escalation of agricultural production on lands already under cultivation, either by intensifying cultivation and/or by increasing unit yields. In intensive cropping systems, the crops extract substantial quantities of nutrients, which are replaced by means of the application of fertilizers. Due to the lack of resources and the increase in the prices of commercial inputs, it has been necessary to pay attention to the analysis and improvement of low-input cropping systems and to the effective use of resources. All this has led to a system-focused concept of plant nutrition, which integrates the sources of nutrients for plants and the production factors of crops within a productive cropping system, to improve the fertility of soils, agricultural productivity and profitability. This system includes both the greatest efficiency of chemical fertilizers and the maximum benefit from alternative sources of nutrients, such as organic fertilizers, citrate-phosphate rocks and biological nitrogen fixation. By means of field experiments under different environmental conditions (soils and climate), the best combination of fertilizer practices (dose, placement, timing and source) for selected cropping systems can be determined. Experimentation with fertilizers labelled with stable and radioactive isotopes provides a direct and rapid method of obtaining conclusive answers to the questions of where, when and how they should be applied. Fertilizers labelled with ¹⁵N have been used to follow the application of labelled fertilizer to the crops, and to determine the proportion of the crop's nutrient element derived from the fertilizer. The isotopic techniques offer a fast and reliable means of obtaining information about the distribution of
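
    The quantity mentioned last, the proportion of the crop's nitrogen derived from the fertilizer (%Ndff), is conventionally computed from ¹⁵N atom% excess values. A minimal sketch of that standard formula, with the natural-abundance background (~0.366 atom%) as an assumed constant:

        NATURAL_15N_ATOM_PCT = 0.366   # assumed natural-abundance background

        def percent_ndff(plant_atom_pct: float, fertilizer_atom_pct: float) -> float:
            """%N derived from fertilizer, from 15N atom% excess values."""
            excess_plant = plant_atom_pct - NATURAL_15N_ATOM_PCT
            excess_fert = fertilizer_atom_pct - NATURAL_15N_ATOM_PCT
            return 100.0 * excess_plant / excess_fert

        # Example: plant at 0.866 atom%, fertilizer at 5.366 atom% -> 10 %Ndff
        print(percent_ndff(0.866, 5.366))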

  2. LANDSAFE: LANDING SITE RISK ANALYSIS SOFTWARE FRAMEWORK

    Directory of Open Access Journals (Sweden)

    R. Schmidt

    2012-08-01

    Full Text Available The European Space Agency (ESA) is planning a Lunar Lander mission in the 2018 timeframe that will demonstrate precise soft landing at the polar regions of the Moon. To ensure a safe and successful landing, a careful risk analysis has to be carried out. This comprises identifying favorable target areas and evaluating the surface conditions in these areas. Features like craters, boulders, steep slopes, rough surfaces and shadow areas have to be identified in order to assess the risk associated with a landing site in terms of a successful touchdown and subsequent surface operation of the lander. In addition, global illumination conditions at the landing site have to be simulated and analyzed. The Landing Site Risk Analysis software framework (LandSAfe) is a system for the analysis, selection and certification of safe landing sites on the lunar surface. LandSAfe generates several data products including high resolution digital terrain models (DTMs), hazard maps, illumination maps, temperature maps and surface reflectance maps, which assist the user in evaluating potential landing site candidates. This paper presents the LandSAfe system and describes the methods and products of the different modules. For one candidate landing site on the rim of Shackleton crater at the south pole of the Moon a high resolution DTM is showcased.

  3. Software patterns, knowledge maps, and domain analysis

    CERN Document Server

    Fayad, Mohamed E; Hegde, Srikanth GK; Basia, Anshu; Vakil, Ashka

    2014-01-01

    Preface. Acknowledgments. Authors. Introduction. An Overview of Knowledge Maps. Introduction: Key Concepts - Software Stable Models, Knowledge Maps, Pattern Language, Goals, Capabilities (Enduring Business Themes + Business Objects). The Motivation. The Problem. The Objectives. Overview of Software Stability Concepts. Overview of Knowledge Maps. Pattern Languages versus Knowledge Maps: A Brief Comparison. The Solution: Knowledge Maps Methodology or Concurrent Software Development Model. Why Knowledge Maps? Research Methodology Undertaken. Research Verification and Validation. The Stratification of This Book. Summary

  4. ATTA - A new method of ultrasensitive isotope trace analysis

    International Nuclear Information System (INIS)

    Bailey, K.; Chen, C.Y.; Du, X.; Li, Y.M.; Lu, Z.-T.; O'Connor, T.P.; Young, L.

    2000-01-01

    A new method of ultrasensitive isotope trace analysis has been developed. This method, based on the technique of laser manipulation of neutral atoms, has been used to count individual ⁸⁵Kr and ⁸¹Kr atoms present in a natural krypton gas sample with isotopic abundances in the range of 10⁻¹¹ and 10⁻¹³, respectively. This method is free of contamination from other isotopes and elements and can be applied to various different isotope tracers for a wide range of applications. The demonstrated detection efficiency is 1×10⁻⁷. System improvements could increase the efficiency by many orders of magnitude

  5. The SAVI Vulnerability Analysis Software Package

    International Nuclear Information System (INIS)

    Mc Aniff, R.J.; Paulus, W.K.; Key, B.; Simpkins, B.

    1987-01-01

    SAVI (Systematic Analysis of Vulnerability to Intrusion) is a new PC-based software package for modeling Physical Protection Systems (PPS). SAVI utilizes a path analysis approach based on the Adversary Sequence Diagram (ASD) methodology. A highly interactive interface allows the user to accurately model complex facilities, maintain a library of these models on disk, and calculate the most vulnerable paths through any facility. Recommendations are provided to help the user choose facility upgrades which should reduce identified path vulnerabilities. Pop-up windows throughout SAVI are used for the input and display of information. A menu at the top of the screen presents all options to the user. These options are further explained on a message line directly below the menu. A diagram on the screen graphically represents the current protection system model. All input is checked for errors, and data are presented in a logical and clear manner. Print utilities provide the user with hard copies of all information and calculated results

  6. Isotopic Abundance and Chemical Purity Analysis of Stable Isotope Deuterium Labeled Sudan I

    Directory of Open Access Journals (Sweden)

    CAI Yin-ping;LEI Wen;ZHENG Bo;DU Xiao-ning

    2014-02-01

    Full Text Available It is important to analyze the isotopic abundance and chemical purity of Sudan I-D5, which is the internal standard for isotope dilution mass spectrometry. The isotopic abundance of Sudan I-D5 was detected by the “mass cluster” classification method and LC-MS. The repeatability and reproducibility experiments were carried out using different mass spectrometers and different operators. The RSD was less than 0.1%, so the repeatability and reproducibility were satisfactory. The accuracy and precision of the isotopic abundance analysis method were good according to the results of the F test and t test. High performance liquid chromatography (HPLC) was used for detecting the chemical purity of Sudan I-D5 as an external standard method.

  7. Isotopic analysis of plutonium by computer controlled mass spectrometry

    International Nuclear Information System (INIS)

    1974-01-01

    Isotopic analysis of plutonium chemically purified by ion exchange is achieved using a thermal ionization mass spectrometer. Data acquisition from and control of the instrument is done automatically with a dedicated system computer in real time with subsequent automatic data reduction and reporting. Separation of isotopes is achieved by varying the ion accelerating high voltage with accurate computer control

  8. Social network analysis in software process improvement

    DEFF Research Database (Denmark)

    Nielsen, Peter Axel; Tjørnehøj, Gitte

    2010-01-01

    Software process improvement in small organisations is often problematic, and communication and knowledge sharing are more informal. To improve software processes we need to understand how small organisations communicate and share knowledge. In this article we have studied the company SmallSoft through action research...

  9. Effective Results Analysis for the Similar Software Products’ Orthogonality

    OpenAIRE

    Ion Ivan; Daniel Milodin

    2009-01-01

    The concept of similar software is defined, and conditions for archiving the software components are established. The orthogonality evaluation is carried out, and the correlation between the orthogonality and the complexity of the homogenous software components is analyzed. Groups of similar software products are built, belonging to the orthogonality intervals. The results of the analysis are presented in graphical form. There are detailed aspects of the functioning o...

  10. Determination Of Uranium Isotopes Composition Using A Micro Computer With An IEEE-488 Interface For Mass Spectrometer Analysis

    International Nuclear Information System (INIS)

    Prajitno; Taftazani, Agus; Yusuf

    1996-01-01

    A mass spectrometry method can be used for qualitative or quantitative analysis. For qualitative analysis, identification of unknown materials by a mass spectrometer requires definite assignment of mass numbers to peaks on the chart. In quantitative analysis, a mass spectrometer is used to determine the isotope composition of material in the sample. The analysis system of the mass spectrometer at PPNY-BATAN is based on comparison of the ion current intensities entering the collector, and it has been used to analyse isotope composition. Calculation of the isotope composition has previously been done manually. To increase performance and avoid manual data processing, a microcomputer and an IEEE-488 interface have been installed and a software package has been written, so that determination of the isotope composition of material in the sample is faster and more efficient. The accuracy of analysis using this program on the standard sample U3O8 NBS 010 is between 93.87% and 99.98%
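
A minimal sketch of the kind of calculation the interfaced software automates is given below: converting per-isotope ion-current intensities into atom percentages, assuming equal detection efficiency for all isotopes. The current values are hypothetical, not taken from the record.

```python
# Sketch: uranium isotope composition from mass-spectrometer ion currents.
# Assumes equal detection efficiency per isotope; readings are hypothetical.

def atom_fractions(currents):
    """Convert per-isotope ion currents (arbitrary units) to atom percent."""
    total = sum(currents.values())
    return {iso: 100.0 * i / total for iso, i in currents.items()}

# Hypothetical ion currents read out through the IEEE-488 interface
readings = {"U-234": 0.05, "U-235": 6.45, "U-238": 93.50}
for isotope, percent in atom_fractions(readings).items():
    print(f"{isotope}: {percent:.3f} atom %")
```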

  11. Potku – New analysis software for heavy ion elastic recoil detection analysis

    International Nuclear Information System (INIS)

    Arstila, K.; Julin, J.; Laitinen, M.I.; Aalto, J.; Konu, T.; Kärkkäinen, S.; Rahkonen, S.; Raunio, M.; Itkonen, J.; Santanen, J.-P.; Tuovinen, T.; Sajavaara, T.

    2014-01-01

    Time-of-flight elastic recoil detection (ToF-ERD) analysis software has been developed. The software combines a Python-language graphical front-end with a C code computing back-end in a user-friendly way. The software uses a list of coincident time-of-flight–energy (ToF–E) events as an input. The ToF calibration can be determined with a simple graphical procedure. The graphical interface allows the user to select different elements and isotopes from a ToF–E histogram and to convert the selections to individual elemental energy and depth profiles. The resulting sample composition can be presented as relative or absolute concentrations by integrating the depth profiles over user-defined ranges. Beam induced composition changes can be studied by displaying the event-based data in fractions relative to the substrate reference data. Optional angular input data allows for kinematic correction of the depth profiles. This open source software is distributed under the GPL license for Linux, Mac, and Windows environments
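
The central conversion in any ToF-ERD analysis, from a calibrated flight time to a recoil energy, can be written compactly. The sketch below is a generic illustration with a hypothetical linear channel-to-time calibration and an assumed flight-path length; it is not Potku's actual implementation.

```python
# Sketch of the ToF-to-energy step in ToF-ERD analysis (non-relativistic).
# Calibration constants and flight path below are hypothetical.
AMU_KG = 1.66053906660e-27   # kg per atomic mass unit
EV_J = 1.602176634e-19       # joules per electronvolt

def tof_to_energy_mev(channel, mass_amu, flight_path_m=0.623,
                      slope_s=2.5e-10, offset_s=5.0e-9):
    """Recoil energy (MeV) from a ToF channel, assuming t = slope*channel + offset."""
    t = slope_s * channel + offset_s          # flight time in seconds
    v = flight_path_m / t                     # recoil velocity in m/s
    e_joule = 0.5 * mass_amu * AMU_KG * v**2  # kinetic energy
    return e_joule / EV_J / 1e6

print(f"{tof_to_energy_mev(channel=400, mass_amu=28):.2f} MeV")  # e.g. a Si recoil
```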

  12. Potku – New analysis software for heavy ion elastic recoil detection analysis

    Energy Technology Data Exchange (ETDEWEB)

    Arstila, K., E-mail: kai.arstila@jyu.fi [Department of Physics, University of Jyväskylä, P.O. Box 35, FI-40014, Jyväskylä (Finland); Julin, J.; Laitinen, M.I. [Department of Physics, University of Jyväskylä, P.O. Box 35, FI-40014, Jyväskylä (Finland); Aalto, J.; Konu, T.; Kärkkäinen, S.; Rahkonen, S.; Raunio, M.; Itkonen, J.; Santanen, J.-P.; Tuovinen, T. [Department of Mathematical Information Technology, University of Jyväskylä, P.O. Box 35, FI-40014, Jyväskylä (Finland); Sajavaara, T. [Department of Physics, University of Jyväskylä, P.O. Box 35, FI-40014, Jyväskylä (Finland)

    2014-07-15

    Time-of-flight elastic recoil detection (ToF-ERD) analysis software has been developed. The software combines a Python-language graphical front-end with a C code computing back-end in a user-friendly way. The software uses a list of coincident time-of-flight–energy (ToF–E) events as an input. The ToF calibration can be determined with a simple graphical procedure. The graphical interface allows the user to select different elements and isotopes from a ToF–E histogram and to convert the selections to individual elemental energy and depth profiles. The resulting sample composition can be presented as relative or absolute concentrations by integrating the depth profiles over user-defined ranges. Beam induced composition changes can be studied by displaying the event-based data in fractions relative to the substrate reference data. Optional angular input data allows for kinematic correction of the depth profiles. This open source software is distributed under the GPL license for Linux, Mac, and Windows environments.

  13. A method of uranium isotopes concentration analysis

    International Nuclear Information System (INIS)

    Lin Yuangen; Jiang Meng; Wu Changli; Duan Zhanyuan; Guo Chunying

    2010-01-01

    A basic method of uranium isotope concentration analysis is described in this paper. An iteration method is used to calculate the relative efficiency curve by analyzing the characteristic γ energy spectrum of 235U, 232U and the daughter nuclides of 238U; the relative activities can then be calculated, and finally the uranium isotope concentrations can be worked out. The result is validated by experiment. (authors)
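
As a hedged sketch of the final step described, the fragment below converts two gamma-line peak areas into a 235U/238U atom ratio via relative activities and half-lives. The peak areas, relative efficiencies and emission probabilities are placeholders, not values from the paper.

```python
# Sketch: from gamma peak areas to a U-235/U-238 atom ratio.
# Peak areas and relative efficiencies are hypothetical; half-lives in years.
HALF_LIFE_Y = {"U-235": 7.04e8, "U-238": 4.468e9}

def relative_activity(net_counts, rel_efficiency, gamma_yield):
    """Activity on an arbitrary common scale, corrected for the relative
    efficiency curve and the gamma emission probability."""
    return net_counts / (rel_efficiency * gamma_yield)

def atom_ratio(act_235, act_238):
    """N(235)/N(238): N = A * T_half / ln 2, and the common factors cancel."""
    return (act_235 * HALF_LIFE_Y["U-235"]) / (act_238 * HALF_LIFE_Y["U-238"])

a235 = relative_activity(12800, rel_efficiency=0.92, gamma_yield=0.572)    # 185.7 keV line
a238 = relative_activity(5400, rel_efficiency=1.00, gamma_yield=0.00836)   # 1001 keV (234mPa)
print(f"N(U-235)/N(U-238) ≈ {atom_ratio(a235, a238):.5f}")
```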

  14. AVNG System Software-Attribute Verification System with Information Barriers for Mass Isotopic Measurements

    International Nuclear Information System (INIS)

    Elmont, T.H.; Langner, Diana C.; MacArthur, D.W.; Mayo, D.R.; Smith, M.K.; Modenov, A.

    2005-01-01

    This report describes the software development for the plutonium attribute verification system - AVNG. A brief synopsis of the technical solution for the measurement system is presented. The main tasks for the software development that is underway are formulated. The development tasks are shown in software structural flowcharts, a measurement system state diagram, and a description of the software. The current status of the AVNG software development is elucidated.

  15. COMPARATIVE ANALYSIS OF SOFTWARE DEVELOPMENT MODELS

    OpenAIRE

    Sandeep Kaur*

    2017-01-01

    No geek is unfamiliar with the concept of the software development life cycle (SDLC). This research deals with the various SDLC models, covering the waterfall, spiral, iterative, agile, V-shaped, and prototype models. In the modern era, all software systems are fallible, as they cannot be guaranteed with certainty. This research therefore compares all aspects of the various models, with their pros and cons, so that it is easier to choose a particular model at the time of need

  16. CAX a software for automated spectrum analysis

    International Nuclear Information System (INIS)

    Zahn, Guilherme S.; Genezini, Frederico A.

    2017-01-01

    In this work, the scripting capabilities of Genie-2000 were used to develop a software tool that automatically analyses all spectrum files in either Ortec's CHN or Canberra's MCA or CNF formats in a folder, generating two output files: a print-ready text file (.DAT) and a Comma-Separated Values (.CSV) file which can be easily imported in any major spreadsheet software. This software, named CAX ('Convert and Analyse for eXcel'), uses Genie-2000's functions to import spectrum files into Genie's native CNF format and analyze the converted spectra. The software can also, if requested, import energy and FWHM calibrations from a stored calibrated spectrum. The print-ready output file (.DAT) is generated by Genie-2000 using a customized script, and the CSV file is generated by a custom-built DAT2CSV software which generates a CSV file that complies with the Brazilian standards, with commas as the decimal indicator and semicolons as field separators. This software is already used in the daily routines in IPEN's Neutron Activation Laboratory, greatly reducing the time required for sample analyses, as well as reducing the possibility of transcription errors. (author)
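
The locale-specific CSV output described here (comma decimal marker, semicolon field separator) is the easy-to-get-wrong part; below is a generic sketch of how a DAT2CSV-style conversion could produce it. The column names and peak values are invented, and this is not the actual DAT2CSV code.

```python
# Sketch of a DAT2CSV-style conversion: ';' field separators and ','
# decimal markers, per the Brazilian convention described. Hypothetical layout.
import csv

def to_brazilian_csv(rows, path):
    """Write (energy, area, uncertainty) rows as a ';'-separated CSV."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f, delimiter=";")
        writer.writerow(["Energia (keV)", "Area", "Incerteza (%)"])
        for energy, area, unc in rows:
            writer.writerow([f"{energy:.2f}".replace(".", ","),
                             str(area),
                             f"{unc:.2f}".replace(".", ",")])

# Hypothetical peak list from an analyzed spectrum
to_brazilian_csv([(1173.24, 15231, 1.12), (1332.50, 13877, 1.08)], "espectro.csv")
```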

  17. CAX a software for automated spectrum analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zahn, Guilherme S.; Genezini, Frederico A., E-mail: gzahn@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (CRPq/IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Centro do Reator de Pesquisas

    2017-11-01

    In this work, the scripting capabilities of Genie-2000 were used to develop a software tool that automatically analyses all spectrum files in either Ortec's CHN or Canberra's MCA or CNF formats in a folder, generating two output files: a print-ready text file (.DAT) and a Comma-Separated Values (.CSV) file which can be easily imported in any major spreadsheet software. This software, named CAX ('Convert and Analyse for eXcel'), uses Genie-2000's functions to import spectrum files into Genie's native CNF format and analyze the converted spectra. The software can also, if requested, import energy and FWHM calibrations from a stored calibrated spectrum. The print-ready output file (.DAT) is generated by Genie-2000 using a customized script, and the CSV file is generated by a custom-built DAT2CSV software which generates a CSV file that complies with the Brazilian standards, with commas as the decimal indicator and semicolons as field separators. This software is already used in the daily routines in IPEN's Neutron Activation Laboratory, greatly reducing the time required for sample analyses, as well as reducing the possibility of transcription errors. (author)

  18. Development of a New VLBI Data Analysis Software

    Science.gov (United States)

    Bolotin, Sergei; Gipson, John M.; MacMillan, Daniel S.

    2010-01-01

    We present an overview of a new VLBI analysis software under development at NASA GSFC. The new software will replace CALC/SOLVE and many related utility programs. It will have the capabilities of the current system as well as incorporate new models and data analysis techniques. In this paper we give a conceptual overview of the new software. We formulate the main goals of the software. The software should be flexible and modular to implement models and estimation techniques that currently exist or will appear in future. On the other hand it should be reliable and possess production quality for processing standard VLBI sessions. Also, it needs to be capable of processing observations from a fully deployed network of VLBI2010 stations in a reasonable time. We describe the software development process and outline the software architecture.

  19. Isotopic analysis of radioactive waste packages (an inexpensive approach)

    International Nuclear Information System (INIS)

    Padula, D.A.; Richmond, J.S.

    1983-01-01

    A computer printout of the isotopic analysis for all radioactive waste packages containing resins, or other aqueous filter media is now required at the disposal sites at Barnwell, South Carolina, and Beatty, Nevada. Richland, Washington requires an isotopic analysis for all radioactive waste packages. The NRC (Nuclear Regulatory Commission), through 10 CFR 61, will require shippers of radioactive waste to classify and label for disposal all radioactive waste forms. These forms include resins, filters, sludges, and dry active waste (trash). The waste classification is to be based upon 10 CFR 61 (Section 1-7). The isotopes upon which waste classification is to be based are tabulated. 7 references, 8 tables

  20. Continuous software quality analysis for the ATLAS experiment

    CERN Document Server

    Washbrook, Andrew; The ATLAS collaboration

    2017-01-01

    The software for the ATLAS experiment on the Large Hadron Collider at CERN has evolved over many years to meet the demands of Monte Carlo simulation, particle detector reconstruction and data analysis. At present over 3.8 million lines of C++ code (and close to 6 million total lines of code) are maintained by an active worldwide developer community. In order to run the experiment software efficiently at hundreds of computing centres it is essential to maintain a high level of software quality standards. Methods are proposed to improve software quality practices by incorporating checks into the new ATLAS software build infrastructure.

  1. Selective laser ionization for mass-spectral isotopic analysis

    International Nuclear Information System (INIS)

    Miller, C.M.; Nogar, N.S.; Downey, S.W.

    1983-01-01

    Resonant enhancement of the ionization process can provide a high degree of elemental selectivity, thus eliminating or drastically reducing the interference problem. In addition, extension of this method to isotopically selective ionization has the potential for greatly increasing the range of isotope ratios that can be determined experimentally. This gain can be realized by reducing or eliminating the tailing of the signal from the high-abundance isotope into that of the low-abundance isotope, augmenting the dispersion of the mass spectrometer. We briefly discuss the hardware and techniques used in both our pulsed and cw RIMS experiments. Results are presented for both cw ionization experiments on Lu/Yb mixtures, and spectroscopic studies of multicolor RIMS of Tc. Lastly, we discuss practical limits of cw RIMS analysis in terms of detection limits and measurable isotope ratios

  2. Software Piracy in Research: A Moral Analysis.

    Science.gov (United States)

    Santillanes, Gary; Felder, Ryan Marshall

    2015-08-01

    Researchers in virtually every discipline rely on sophisticated proprietary software for their work. However, some researchers are unable to afford the licenses and instead procure the software illegally. We discuss the prohibition of software piracy by intellectual property laws, and argue that the moral basis for the copyright law offers the possibility of cases where software piracy may be morally justified. The ethics codes that scientific institutions abide by are informed by a rule-consequentialist logic: by preserving personal rights to authored works, people able to do so will be incentivized to create. By showing that the law has this rule-consequentialist grounding, we suggest that scientists who blindly adopt their institutional ethics codes will commit themselves to accepting that software piracy could be morally justified, in some cases. We hope that this conclusion will spark debate over important tensions between ethics codes, copyright law, and the underlying moral basis for these regulations. We conclude by offering practical solutions (other than piracy) for researchers.

  3. A proposal for performing software safety hazard analysis

    International Nuclear Information System (INIS)

    Lawrence, J.D.; Gallagher, J.M.

    1997-01-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper. The method concentrates on finding hazards during the early stages of the software life cycle, using an extension of HAZOP

  4. INFOS: spectrum fitting software for NMR analysis

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Albert A., E-mail: alsi@nmr.phys.chem.ethz.ch [ETH Zürich, Physical Chemistry (Switzerland)

    2017-02-15

    Software for fitting of NMR spectra in MATLAB is presented. Spectra are fitted in the frequency domain, using Fourier transformed lineshapes, which are derived using the experimental acquisition and processing parameters. This yields more accurate fits compared to common fitting methods that use Lorentzian or Gaussian functions. Furthermore, a very time-efficient algorithm for calculating and fitting spectra has been developed. The software also performs initial peak picking, followed by subsequent fitting and refinement of the peak list, by iteratively adding and removing peaks to improve the overall fit. Estimation of error on fitting parameters is performed using a Monte-Carlo approach. Many fitting options allow the software to be flexible enough for a wide array of applications, while still being straightforward to set up with minimal user input.
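
The Monte-Carlo error estimate mentioned at the end follows a generic recipe: fit once, estimate the noise level from the residuals, then refit many synthetic spectra and take the spread of the fitted parameters. The sketch below illustrates that recipe with a single Lorentzian stand-in model; it is not the INFOS MATLAB implementation, which fits Fourier-transformed lineshapes.

```python
# Generic Monte-Carlo error estimate for spectral fit parameters,
# shown with a Lorentzian stand-in model (not INFOS's FT lineshapes).
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(x, amp, x0, width):
    return amp * width**2 / ((x - x0)**2 + width**2)

rng = np.random.default_rng(0)
x = np.linspace(-10, 10, 400)
y = lorentzian(x, 1.0, 0.5, 1.2) + rng.normal(0, 0.02, x.size)

popt, _ = curve_fit(lorentzian, x, y, p0=[0.8, 0.0, 1.0])
noise = np.std(y - lorentzian(x, *popt))   # noise level from the residuals

# Refit noise-perturbed synthetic spectra; the parameter spread is the error bar
samples = [curve_fit(lorentzian, x,
                     lorentzian(x, *popt) + rng.normal(0, noise, x.size),
                     p0=popt)[0]
           for _ in range(200)]
print("amp, x0, width:", popt, "+/-", np.std(samples, axis=0))
```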

  5. Isotopic analysis using optical spectroscopy (1963)

    International Nuclear Information System (INIS)

    Gerstenkorn, S.

    1963-01-01

    The isotopic displacement in the atomic lines of certain elements (H, He, Li, Ne, Sr, Hg, Pb, U, Pu) is used for the isotopic assay of these elements. The Fabry-Perot photo-electric interference spectrometer is shown to be particularly well suited to this sort of problem: in each case we give on the one hand the essential results obtained with this apparatus, and on the other hand the results previously obtained with conventional apparatus (grating, photographic plate). Together these results give an idea of the possibilities of optical spectroscopy: in the best case, the precision which may be expected is of the order of 1 to 2 per cent for isotopes whose concentration is about 1 per cent. (author) [fr

  6. Isotope analysis in the transmission electron microscope.

    Science.gov (United States)

    Susi, Toma; Hofer, Christoph; Argentero, Giacomo; Leuthner, Gregor T; Pennycook, Timothy J; Mangler, Clemens; Meyer, Jannik C; Kotakoski, Jani

    2016-10-10

    The Ångström-sized probe of the scanning transmission electron microscope can visualize and collect spectra from single atoms. This can unambiguously resolve the chemical structure of materials, but not their isotopic composition. Here we differentiate between two isotopes of the same element by quantifying how likely the energetic imaging electrons are to eject atoms. First, we measure the displacement probability in graphene grown from either 12C or 13C and describe the process using a quantum mechanical model of lattice vibrations coupled with density functional theory simulations. We then test our spatial resolution in a mixed sample by ejecting individual atoms from nanoscale areas spanning an interface region that is far from atomically sharp, mapping the isotope concentration with a precision better than 20%. Although we use a scanning instrument, our method may be applicable to any atomic resolution transmission electron microscope and to other low-dimensional materials.

  7. Isotopic analysis of bullet lead samples

    International Nuclear Information System (INIS)

    Sankar Das, M.; Venkatasubramanian, V.S.; Sreenivas, K.

    1976-01-01

    The possibility of using the isotopic composition of lead for the identification of bullet lead is investigated. Lead from several spent bullets was converted to lead sulphide and analysed for the isotopic abundances using an MS-7 mass spectrometer. The variation in Pb-204 was too small to permit differentiation, while the range of variation of Pb-206 and Pb-207, and the better precision in their analyses, permitted differentiating samples from one another. The correlation among the samples examined has been pointed out. The method is complementary to characterisation of bullet leads by trace element composition. The possibility of using isotopically enriched lead for tagging bullet lead is pointed out. (author)

  8. Isotope analysis of lithium by thermionic mass spectrometry

    International Nuclear Information System (INIS)

    Kakazu, M.H.; Sarkis, J.E.S.

    1991-04-01

    An analytical mass spectrometric method for the isotope analysis of lithium has been studied. The analyses were carried out using a single-focusing thermionic mass spectrometer (Varian Mat TH5) with a 90° magnetic sector field and 21.4 cm deflection radius, equipped with a dual Re-filament thermal ionization ion source. The effect of different lithium chemical forms, such as carbonate, chloride, nitrate and sulfate, upon the isotopic ratios 6Li/7Li has been studied. Isotopic fractionation of lithium was studied as a function of the time of analysis. The results obtained with lithium carbonate yielded a precision of ±0.1% and an accuracy of ±0.6%, whereas the other chemical forms yielded precisions of ±0.5% and accuracies of ±2%. A fractionation correction factor, K=1.005, considered constant, was obtained for different samples of the lithium carbonate isotopic standard CBNM IRM 016. (author)

  9. Isotopic abundance in atom trap trace analysis

    Science.gov (United States)

    Lu, Zheng-Tian; Hu, Shiu-Ming; Jiang, Wei; Mueller, Peter

    2014-03-18

    A method and system for detecting ratios and amounts of isotopes of noble gases. The method and system are constructed to be able to measure noble gas isotopes in water and ice, which helps reveal the geological age of the samples and understand their movements. The method and system use a combination of a cooled discharge source, a beam collimator, a beam slower and a magneto-optic trap with a laser to apply resonance-frequency energy to the noble gas to be quenched and detected.

  10. New Isotope Analysis Method: Atom Trap Mass Spectrometry

    International Nuclear Information System (INIS)

    Ko, Kwang Hoon; Park, Hyun Min; Han, Jae Min; Kim, Taek Soo; Cha, Yong Ho; Lim, Gwon; Jeong, Do Young

    2011-01-01

    Trace isotope analysis has played an important role in science, archaeological dating, geology, biology and the nuclear industry. Some fission products such as Sr-90, Cs-135 and Kr-85 can be released to the environment when a nuclear accident occurs or a reprocessing plant operates. Thus, the analysis of artificially produced radioactive isotopes has been of interest to the nuclear industry, but these isotopes are difficult to detect due to natural abundances of less than 10^-10. In general, radiochemical methods have been applied to detect ultra-trace radioisotopes, but they have the disadvantages of long measurement times for long-lived radioisotopes and toxic chemical processing for the purification. The Accelerator Mass Spectrometer has high isotope selectivity, but the system is huge and its selectivity is affected by isobars. Laser-based methods such as RIMS (Resonance Ionization Mass Spectrometry) have the advantage of being free of isobar effects, but the system size is still large for a highly isotope-selective system. Recently, ATTA (Atom Trap Trace Analysis) has been successfully applied to detect the ultra-trace isotopes Kr-81 and Kr-85. ATTA is an isobar-effect-free detection method with high isotope selectivity, and the system size is small. However, it requires a steady atomic beam source during detection and does not allow simultaneous detection of several isotopes. In this presentation, we introduce a new isotope detection method, Atom Trap Mass Spectrometry (ATMS), which couples an atom trap with a mass spectrometer. We expect that it can overcome the disadvantages of ATTA while retaining the advantages of both ATTA and mass spectrometry. The basic concept and the system design will be presented. In addition, the experimental status of ATMS will also be presented

  11. Determination of marble provenance: limits of isotopic analysis

    International Nuclear Information System (INIS)

    Germann, K.; Holzmann, G.; Winkler, F.J.

    1980-01-01

    Provenance determination of Thessalian stelae marbles using C/O isotopic analysis proved to be misleading, as the isotopic composition even in very small quarrying areas is heterogeneous and isotopic coincidence of marbles from very distant sources occurs. Therefore additional geological features must be taken into consideration and preference should be given to combinations of both petrographical and geochemical properties. Geological field work to establish the range of possible marble sources and the variability within these sources is one of the prerequisites of provenance studies. (author)

  12. Ion sources for solids isotopic analysis

    International Nuclear Information System (INIS)

    Tyrrell, A.C.

    Of the dozen or so methods of producing ions from solid samples only the surface or thermal ionisation method has found general application for precise measurement of isotopic ratios. The author discusses the principal variables affecting the performance of the thermal source; sample preparation, loading onto the filament, sample pre-treatment, filament material. (Auth.)

  13. Ion sources for solids isotopic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Tyrrell, A. C. [Ministry of Defence, Foulness (UK). Atomic Weapons Research Establishment

    1978-12-15

    Of the dozen or so methods of producing ions from solid samples only the surface or thermal ionisation method has found general application for precise measurement of isotopic ratios. The author discusses the principal variables affecting the performance of the thermal source; sample preparation, loading onto the filament, sample pre-treatment, filament material.

  14. Soil Carbon: Compositional and Isotopic Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Moran, James J.; Alexander, M. L.; Laskin, Alexander

    2016-11-01

    This is a short chapter to be included in the next edition of the Encyclopedia of Soil Science. The work here describes techniques being developed at PNNL for investigating organic carbon in soils. Techniques discussed include: laser ablation isotope ratio mass spectrometry, laser ablation aerosol mass spectrometry, and nanospray desorption electrospray ionization mass spectrometry.

  15. Research and Development on Food Nutrition Statistical Analysis Software System

    OpenAIRE

    Du Li; Ke Yun

    2013-01-01

    Designing and developing a set of food nutrition component statistical analysis software can realize the automation of nutrition calculation, improve the nutrition professional's working efficiency and support the computerization of nutrition education and outreach. In the software development process, the software engineering method and database technology are used to calculate the human daily nutritional intake and the intelligent system is used to evaluate the user's hea...

  16. Cross-instrument Analysis Correlation Software

    Energy Technology Data Exchange (ETDEWEB)

    2017-06-28

    This program has been designed to assist with the tracking of a sample from one analytical instrument to another, such as SEM, microscopes, micro X-ray diffraction and other instruments where particular positions/locations on the sample are examined, photographed, etc. The software is designed for easy entry of the positions of fiducials and locations of interest, such that in a future session in the same or a different instrument the positions of interest can be re-found by using the known fiducial locations in the current and reference sessions to transform the point into the current session's coordinate system. The software is dialog-box driven, guiding the user through the necessary data entry and program choices. Information is stored in a series of text-based extensible markup language (XML) files.
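
The coordinate re-mapping the record describes can be modeled, in the simplest case, as a 2-D similarity transform (rotation, uniform scale, translation) fitted from two fiducials. The sketch below shows that idea with invented coordinates; the actual software may use a different transform or more fiducials.

```python
# Sketch: re-finding a point of interest across instrument sessions from two
# fiducials, modeling the mapping as a 2-D similarity transform written with
# complex numbers. Coordinates are hypothetical.

def fit_similarity(f_ref, f_cur):
    """f_ref, f_cur: two (x, y) fiducials in reference and current sessions.
    Returns (a, b) such that w = a*z + b maps reference to current coords."""
    z1, z2 = (complex(*p) for p in f_ref)
    w1, w2 = (complex(*p) for p in f_cur)
    a = (w2 - w1) / (z2 - z1)
    return a, w1 - a * z1

def transform(point, a, b):
    w = a * complex(*point) + b
    return (w.real, w.imag)

# Fiducials as measured in the reference (e.g. SEM) and current (e.g. micro-XRD) sessions
a, b = fit_similarity([(0.0, 0.0), (10.0, 0.0)], [(2.0, 1.0), (1.99, 11.0)])
print(transform((5.0, 5.0), a, b))   # point of interest in current-session coords
```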

  17. Oxygen isotope analysis of plant water without extraction procedure

    International Nuclear Information System (INIS)

    Gan, K.S.; Wong, S.C.; Farquhar, G.D.; Yong, J.W.H.

    2001-01-01

    Isotopic analyses of plant water (mainly xylem, phloem and leaf water) are gaining importance as the isotopic signals reflect plant-environment interactions, affect the oxygen isotopic composition of atmospheric O2 and CO2 and are eventually incorporated into plant organic matter. Conventionally, such isotopic measurements require a time-consuming process of isolating the plant water by azeotropic distillation or vacuum extraction, which would not complement the speed of isotope analysis provided by continuous-flow IRMS (Isotope-Ratio Mass Spectrometry), especially when large data sets are needed for statistical calculations in biological studies. Further, a substantial amount of plant material is needed for water extraction, and leaf samples would invariably include unenriched water from the fine veins. To measure sub-microlitre amounts of leaf mesophyll water, a new approach is undertaken where a small disc of fresh leaf is cut using a specially designed leaf punch and pyrolysed directly in an IRMS. By comparing with results from pyrolysis of the dry matter of the same leaf, the 18O content of leaf water can be determined without extraction from fresh leaves. This method is validated using a range of cellulose-water mixtures to simulate the constituents of fresh leaf. Cotton leaf water δ18O obtained from both methods of fresh leaf pyrolysis and azeotropic distillation will be compared. The pyrolysis technique provides a robust approach to measure the isotopic content of water or any volatile present in a homogeneous solution or solid hydrous substance
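
One way to read the "determined by difference" step is as a two-component isotope mass balance between leaf water and dry matter; the sketch below writes that balance out. This is an assumed formulation for illustration, not necessarily the authors' exact calculation, and all numbers are hypothetical.

```python
# Assumed two-component mass balance: the fresh-leaf pyrolysis signal mixes
# oxygen from leaf water and from dry matter, so the water signal is
# recovered by difference. All values hypothetical.

def leaf_water_d18o(delta_fresh, delta_dry, water_o_fraction):
    """delta18O of leaf water, given fresh-leaf and dry-matter pyrolysis
    values and the fraction of pyrolysis oxygen contributed by water."""
    return (delta_fresh - (1.0 - water_o_fraction) * delta_dry) / water_o_fraction

print(f"{leaf_water_d18o(delta_fresh=14.0, delta_dry=27.0, water_o_fraction=0.7):.2f} per mil")
```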

  18. A study of software safety analysis system for safety-critical software

    International Nuclear Information System (INIS)

    Chang, H. S.; Shin, H. K.; Chang, Y. W.; Jung, J. C.; Kim, J. H.; Han, H. H.; Son, H. S.

    2004-01-01

    The core factors and requirements for safety-critical software are traced, and the methodology adopted in each stage of the software life cycle is presented. In the concept phase, a Failure Modes and Effects Analysis (FMEA) for the system has been performed. The feasibility evaluation of the selected safety parameter was performed, and a Preliminary Hazards Analysis list was prepared using the HAZOP (Hazard and Operability) technique. A check list for management control has been produced via a walk-through technique. Based on the evaluation of the check list, activities to be performed in the requirement phase have been determined. In the design phase, hazard analysis has been performed to check the safety capability of the system with regard to the safety software algorithm, using Fault Tree Analysis (FTA). In the test phase, the test items based on FMEA have been checked for fitness, guided by an accident scenario. The pressurizer low-pressure trip algorithm has been selected as a sample to which the FTA method of software safety analysis is applied. By applying a CASE tool, the requirements traceability of the safety-critical system has been enhanced during all software life cycle phases

  19. Water-hydrogen isotope exchange process analysis

    International Nuclear Information System (INIS)

    Fedorchenko, O.; Alekseev, I.; Uborsky, V.

    2008-01-01

    The use of a numerical method is needed to find a solution to the equation system describing a general case of heterogeneous isotope exchange between gaseous hydrogen and liquid water in a column. A computer model of the column merely outputting the isotope compositions in the flows leaving the column is, like the experimental column itself, a 'black box' to a certain extent: the solution is not transparent and occasionally not fully comprehended. The approximate analytical solution was derived from the ZXY-diagram (McCabe-Thiele diagram), which illustrates the solution of the renewed computer model called 'EVIO-4.2'. Several 'unusual' results and dependences have been analyzed and explained. (authors)

  20. Change Impact Analysis of Crosscutting in Software Architectural Design

    NARCIS (Netherlands)

    van den Berg, Klaas

    2006-01-01

    Software architectures should be amenable to changes in user requirements and implementation technology. The analysis of the impact of these changes can be based on traceability of architectural design elements. Design elements have dependencies with other software artifacts but also evolve in time.

  1. Integrated analysis software for bulk power system stability

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, T; Nagao, T; Takahashi, K [Central Research Inst. of Electric Power Industry, Tokyo (Japan)

    1994-12-31

    This paper presents three software packages for bulk power network analysis developed by the Central Research Institute of Electric Power Industry (CRIEPI), together with the user support system which arranges, easily and with high reliability, the tremendous amount of data necessary for these packages. (author) 3 refs., 7 figs., 2 tabs.

  2. NAC, Neutron Activation Analysis and Isotope Inventory

    International Nuclear Information System (INIS)

    1995-01-01

    1 - Description of program or function: NAC was designed to predict the neutron-induced gamma-ray radioactivity for a wide variety of composite materials. The NAC output includes the input data, a list of all reactions for each constituent element, and the end-of-irradiation disintegration rates for each reaction. NAC also compiles a product isotope inventory containing the isotope name, the disintegration rate, the gamma-ray source strength and the absorbed dose rate at 1 meter from an unshielded point source. The induced activity is calculated as a function of irradiation and decay times; the effect of cyclic irradiation can also be calculated. 2 - Method of solution: The standard neutron activation and decay equations are programmed. A data library is supplied which contains target element names, atomic densities, reaction indices, individual reactions and reaction parameters, and product isotopes and gamma energy yields. 3 - Restrictions on the complexity of the problem: Each composite material may consist of up to 20 different elements and up to 20 different decay times may be included. Both limits may be increased by the user by increasing the appropriate items in the dimension statement
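
The "standard neutron activation and decay equations" the summary cites reduce, for a single reaction and one irradiation/decay cycle, to A = N·σ·φ·(1 − e^(−λt_irr))·e^(−λt_dec). The sketch below evaluates this relation with hypothetical cross-section, flux and timing values; it is an illustration of the physics, not NAC itself.

```python
# Sketch of the standard activation/decay relation:
# A = N * sigma * phi * (1 - exp(-lambda*t_irr)) * exp(-lambda*t_dec).
# All input values below are hypothetical.
import math

def induced_activity(n_atoms, sigma_cm2, flux_n_cm2_s, half_life_s, t_irr_s, t_dec_s):
    """Disintegration rate (Bq) of one activation product after decay time."""
    lam = math.log(2) / half_life_s
    saturation = 1.0 - math.exp(-lam * t_irr_s)   # build-up during irradiation
    decay = math.exp(-lam * t_dec_s)              # decay after shutdown
    return n_atoms * sigma_cm2 * flux_n_cm2_s * saturation * decay

# 1e20 target atoms, 1 barn cross-section, 1e13 n/cm2/s flux,
# 15 h half-life product, 2 h irradiation, 1 h decay
print(f"{induced_activity(1e20, 1e-24, 1e13, 15*3600, 2*3600, 3600):.3e} Bq")
```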

  3. Continuous Software Quality analysis for the ATLAS experiment

    CERN Document Server

    Washbrook, Andrew; The ATLAS collaboration

    2017-01-01

    The regular application of software quality tools in large collaborative projects is required to reduce code defects to an acceptable level. If left unchecked, the accumulation of defects invariably results in performance degradation at scale and problems with the long-term maintainability of the code. Although software quality tools are effective for identification, there remains a non-trivial sociological challenge to resolve defects in a timely manner. This is an ongoing concern for the ATLAS software which has evolved over many years to meet the demands of Monte Carlo simulation, detector reconstruction and data analysis. At present over 3.8 million lines of C++ code (and close to 6 million total lines of code) are maintained by a community of hundreds of developers worldwide. It is therefore preferable to address code defects before they are introduced into a widely used software release. Recent wholesale changes to the ATLAS software infrastructure have provided an ideal opportunity to apply software quali...

  4. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Science.gov (United States)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual

  5. Isotopic analysis of boron by thermal ionization mass spectrometry

    International Nuclear Information System (INIS)

    Kakazu, M.H.; Sarkis, J.E.S.; Souza, I.M.S.

    1991-07-01

    This paper presents a methodology for isotopic analysis of boron by the thermal ionization mass spectrometry technique, through measurement of the ion intensity of Na2BO2+ in H3BO3, elemental boron and B4C. The samples were loaded on single tantalum filaments by different methods. In the case of H3BO3, neutralization with NaOH was used; for B4C, alkaline fusion with Na2CO3; and for elemental boron, dissolution with a 1:1 nitric-sulfuric acid mixture followed by neutralization with NaOH. The isotopic ratio measurements were obtained using a Faraday cup detector, with an external precision of ±0.4% and an accuracy of ±0.1% relative to the H3BO3 isotopic standard NBS 951. The effect of isotopic fractionation was studied as a function of time during the analyses and of the different chemical forms of deposition. (author)

  6. Isotopic analysis of uranium by thermoionic mass spectrometry

    International Nuclear Information System (INIS)

    Moraes, N.M.P. de.

    1979-01-01

    Uranium isotopic ratio measurements by thermionic mass spectrometry are presented. Emphasis is given to the investigation of the parameters that directly affect the precision and accuracy of the results. Optimized procedures, namely chemical processing, sample loading on the filaments, vaporization, ionization and measurement of ionic currents, are established. Adequate statistical analysis of the data for the calculation of the internal and external variances and the mean standard deviation is presented. These procedures are applied to natural and NBS isotopic standard uranium samples. The results obtained agree with the certified values within specified limits. 235U/238U isotopic ratio values determined for NBS-U500 and a series of standard samples with variable isotopic composition are used to calculate the mass discrimination factor [pt
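
The mass discrimination factor determined from NBS-U500 is applied as a multiplicative correction to measured ratios. The sketch below shows a linear-law version of that correction; the measured values are hypothetical, and only the certified 235U/238U ratio of NBS U-500 (approximately 0.9997) is taken from the standard's documentation.

```python
# Sketch of a linear-law mass-discrimination correction anchored on the
# NBS U-500 standard. Measured ratios below are hypothetical.
CERTIFIED_U500 = 0.9997   # certified 235U/238U atom ratio of NBS U-500

def discrimination_per_amu(measured_standard, certified=CERTIFIED_U500, delta_mass=3.0):
    """Fractionation per atomic mass unit from a standard run."""
    return (certified / measured_standard - 1.0) / delta_mass

def correct(measured_ratio, eps_per_amu, delta_mass=3.0):
    """Apply the linear correction to a sample's 235U/238U ratio."""
    return measured_ratio * (1.0 + eps_per_amu * delta_mass)

eps = discrimination_per_amu(measured_standard=0.9952)    # hypothetical standard run
print(f"corrected ratio: {correct(0.007206, eps):.6f}")   # hypothetical sample
```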

  7. Study of medical isotope production facility stack emissions and noble gas isotopic signature using automatic gamma-spectra analysis platform

    Science.gov (United States)

    Zhang, Weihua; Hoffmann, Emmy; Ungar, Kurt; Dolinar, George; Miley, Harry; Mekarski, Pawel; Schrom, Brian; Hoffman, Ian; Lawrie, Ryan; Loosz, Tom

    2013-04-01

    The nuclear industry emissions of the four CTBT (Comprehensive Nuclear-Test-Ban Treaty) relevant radioxenon isotopes are unavoidably detected by the IMS along with possible treaty violations. Another civil source of radioxenon emissions which contributes to the global background is radiopharmaceutical production companies. To better understand the source terms of these background emissions, a joint project between HC, ANSTO, PNNL and CRL was formed to install real-time detection systems to support 135Xe, 133Xe, 131mXe and 133mXe measurements at the ANSTO and CRL 99Mo production facility stacks as well as the CANDU (CANada Deuterium Uranium) primary coolant monitoring system at CRL. At each site, high resolution gamma spectra were collected every 15 minutes using an HPGe detector to continuously monitor a bypass feed from the stack or CANDU primary coolant system as it passed through a sampling cell. HC also conducted atmospheric monitoring for radioxenon at approximately 200 km distant from CRL. A program was written to transfer each spectrum into a text file format suitable for the automatic gamma-spectra analysis platform and then email the file to a server. Once the email was received by the server, it was automatically analysed with the gamma-spectrum software UniSampo/Shaman to perform radionuclide identification and activity calculation for a large number of gamma-spectra in a short period of time (less than 10 seconds per spectrum). The results of nuclide activity together with other spectrum parameters were saved into the Linssi database. This database contains a large amount of radionuclide information which is a valuable resource for the analysis of radionuclide distribution within the noble gas fission product emissions. The results could be useful to identify the specific mechanisms of the activity release. The isotopic signatures of the various radioxenon species can be determined as a function of release time. Comparison of 133mXe and 133Xe activity

  8. GWAMA: software for genome-wide association meta-analysis

    Directory of Open Access Journals (Sweden)

    Mägi Reedik

    2010-05-01

    Full Text Available Abstract Background Despite the recent success of genome-wide association studies in identifying novel loci contributing effects to complex human traits, such as type 2 diabetes and obesity, much of the genetic component of variation in these phenotypes remains unexplained. One way of improving power to detect further novel loci is through meta-analysis of studies from the same population, increasing the sample size over any individual study. Although statistical analysis software packages incorporate routines for meta-analysis, they are ill equipped to meet the challenges of the scale and complexity of data generated in genome-wide association studies. Results We have developed flexible, open-source software for the meta-analysis of genome-wide association studies. The software incorporates a variety of error-trapping facilities, and provides a range of meta-analysis summary statistics. The software is distributed with scripts that allow simple formatting of files containing the results of each association study and generate graphical summaries of genome-wide meta-analysis results. Conclusions The GWAMA (Genome-Wide Association Meta-Analysis) software has been developed to perform meta-analysis of summary statistics generated from genome-wide association studies of dichotomous phenotypes or quantitative traits. Software with source files, documentation and example data files are freely available online at http://www.well.ox.ac.uk/GWAMA.
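
At the core of what such a tool automates for every variant is a fixed-effects, inverse-variance meta-analysis of per-study effect estimates. The sketch below shows that core statistic; the effect sizes and standard errors are invented, and this is not GWAMA's source code.

```python
# Sketch of fixed-effects, inverse-variance meta-analysis for one variant.
# Per-study betas and standard errors below are hypothetical.
import math

def inverse_variance_meta(betas, ses):
    """Pooled effect, its standard error, and the Z score."""
    weights = [1.0 / se**2 for se in ses]
    beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return beta, se, beta / se

beta, se, z = inverse_variance_meta(betas=[0.12, 0.08, 0.15], ses=[0.04, 0.05, 0.06])
print(f"pooled beta = {beta:.4f}, SE = {se:.4f}, Z = {z:.2f}")
```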

  9. Long-term preservation of analysis software environment

    International Nuclear Information System (INIS)

    Toppe Larsen, Dag; Blomer, Jakob; Buncic, Predrag; Charalampidis, Ioannis; Haratyunyan, Artem

    2012-01-01

    Long-term preservation of scientific data represents a challenge to experiments, especially regarding the analysis software. Preserving data is not enough; the full software and hardware environment is needed. Virtual machines (VMs) make it possible to preserve hardware “in software”. A complete infrastructure package has been developed for easy deployment and management of VMs, based on CERN virtual machine (CernVM). Further, a HTTP-based file system, CernVM file system (CVMFS), is used for the distribution of the software. It is possible to process data with any given software version, and a matching, regenerated VM version. A point-and-click web user interface is being developed for setting up the complete processing chain, including VM and software versions, number and type of processing nodes, and the particular type of analysis and data. This paradigm also allows for distributed cloud-computing on private and public clouds, for both legacy and contemporary experiments.

  10. Performance evaluation of spectral deconvolution analysis tool (SDAT) software used for nuclear explosion radionuclide measurements

    International Nuclear Information System (INIS)

    Foltz Biegalski, K.M.; Biegalski, S.R.; Haas, D.A.

    2008-01-01

    The Spectral Deconvolution Analysis Tool (SDAT) software was developed to improve counting statistics and detection limits for nuclear explosion radionuclide measurements. SDAT utilizes spectral deconvolution spectroscopy techniques and can analyze both β-γ coincidence spectra for radioxenon isotopes and high-resolution HPGe spectra from aerosol monitors. Spectral deconvolution spectroscopy is an analysis method that utilizes the entire signal deposited in a gamma-ray detector rather than the small portion of the signal that is present in one gamma-ray peak. This method shows promise to improve detection limits over classical gamma-ray spectroscopy analytical techniques; however, this hypothesis has not been tested. To address this issue, we performed three tests to compare the detection ability and variance of SDAT results to those of commercial off-the-shelf (COTS) software which utilizes a standard peak search algorithm. (author)
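
Full-spectrum deconvolution of the kind described can be posed as a non-negative unmixing problem: model the measured spectrum as a non-negative combination of per-nuclide library responses and solve by non-negative least squares. The sketch below illustrates the idea with synthetic Gaussian stand-ins for the library shapes; it is a generic illustration, not the SDAT algorithm itself.

```python
# Sketch of full-spectrum deconvolution by non-negative least squares.
# Library responses here are synthetic Gaussians, not real detector responses.
import numpy as np
from scipy.optimize import nnls

channels = np.arange(256)

def peak(center, width=4.0):
    return np.exp(-0.5 * ((channels - center) / width) ** 2)

# Two hypothetical library response spectra
library = np.column_stack([peak(80), peak(160) + 0.3 * peak(40)])

# Synthetic measurement: a known mixture plus noise
rng = np.random.default_rng(1)
measured = library @ np.array([120.0, 45.0]) + rng.normal(0, 1.0, channels.size)

intensities, residual = nnls(library, measured)
print("fitted intensities:", intensities)   # should recover ~[120, 45]
```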

  11. Romanian wines characterization with CF-IRMS (Continuous Flow Isotope Ratio Mass Spectrometry) isotopic analysis

    International Nuclear Information System (INIS)

    Costinel, Diana; Ionete, Roxana Elena; Vremera, Raluca; Stanciu, Vasile

    2007-01-01

    Wine growing has been known for centuries in Romania. The country has been favored by its geographical position in south-eastern Europe, by its proximity to the Black Sea, as well as by the specificity of the local soil and climate. Alongside France, Italy, Spain and Germany, countries in this area like Romania could also be called 'a vine homeland' in Europe. High quality wines produced in this region were an object of trade ever since ancient times. Under current EU research projects, it is necessary to develop new methods of evidencing wine adulteration and safety. The use of mass spectrometry (MS) to determine the ratios of stable isotopes in bio-molecules now provides the means to prove the botanical and geographical origin of a wide variety of foodstuffs - and therefore, to authenticate and eliminate fraud. Isotope analysis has been officially adopted by the EU as a means of controlling adulteration of wine. Adulteration of wine can happen in many ways, e.g. addition of non-grape ethanol, addition of non-grape sugar, water or other unauthorized substances, undeclared mixing of wines from different wards, geographical areas or countries, mislabelling of variety and age. The present paper emphasizes the isotopic analysis of D/H, 18O/16O and 13C/12C in wines, using a new-generation isotope ratio MS, the Finnigan Delta V Plus, coupled with three flexible continuous-flow preparation devices (GasBench II, TC Elemental Analyser and GC-C/TC). Authentication of wines is therefore an important problem to which isotopic analysis has made a significant contribution. (authors)

  12. High-Throughput Method for Strontium Isotope Analysis by Multi-Collector-Inductively Coupled Plasma-Mass Spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Wall, Andrew J. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Capo, Rosemary C. [Univ. of Pittsburgh, PA (United States); Stewart, Brian W. [Univ. of Pittsburgh, PA (United States); Phan, Thai T. [Univ. of Pittsburgh, PA (United States); Jain, Jinesh C. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Hakala, Alexandra [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Guthrie, George D. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States)

    2016-09-22

    This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method as well as the best practices for optimizing Sr isotope analysis by MC-ICP-MS are presented. Lastly, this report offers tools for data handling and data reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, which help avoid issues of data glut associated with high sample throughput rapid analysis.

  13. High-Throughput Method for Strontium Isotope Analysis by Multi-Collector-Inductively Coupled Plasma-Mass Spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Hakala, Jacqueline Alexandra [National Energy Technology Lab. (NETL), Morgantown, WV (United States)

    2016-11-22

    This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method as well as the best practices for optimizing Sr isotope analysis by MC-ICP-MS are presented. Lastly, this report offers tools for data handling and data reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, which help avoid issues of data glut associated with high sample throughput rapid analysis.

  14. Development of design and analysis software for advanced nuclear system

    International Nuclear Information System (INIS)

    Wu Yican; Hu Liqin; Long Pengcheng; Luo Yuetong; Li Yazhou; Zeng Qin; Lu Lei; Zhang Junjun; Zou Jun; Xu Dezheng; Bai Yunqing; Zhou Tao; Chen Hongli; Peng Lei; Song Yong; Huang Qunying

    2010-01-01

    A series of professional codes, which are necessary software tools and data libraries for advanced nuclear system design and analysis, was developed by the FDS Team, including codes for automatic modeling, physics and engineering calculation, virtual simulation and visualization, system engineering and safety analysis, and the related database management. The development of this software series was proposed as an exercise in the development of nuclear informatics. This paper introduces the main functions and key techniques of the software series, as well as some tests and practical applications. (authors)

  15. SIMS analysis of isotopic impurities in ion implants

    International Nuclear Information System (INIS)

    Sykes, D.E.; Blunt, R.T.

    1986-01-01

    The n-type dopant species Si and Se used for ion implantation in GaAs are multi-isotopic, with the most abundant isotope not chosen because of potential interferences with residual gases. SIMS analysis of a range of 29Si implants produced by several designs of ion implanter all showed significant 28Si impurity, with a different depth distribution from that of the deliberately implanted 29Si isotope. This effect was observed to varying degrees with all fifteen implanters examined and in every 29Si implant analysed to date; 29Si+, 29Si++ and 30Si implants all show the same effect. In the case of Se implantation, poor mass resolution results in the implantation of all isotopes with the same implant distribution (i.e. energy), whilst implants carried out with good mass resolution show the implantation of all isotopes, with the characteristic lower depth distribution of the impurity isotopes as found in the Si implants. This effect has also been observed in p-type implants into GaAs (Mg) and for Ga implanted in Si. A tentative explanation of the effect is proposed. (author)

  16. Potential of isotope analysis (C, Cl) to identify dechlorination mechanisms

    Science.gov (United States)

    Cretnik, Stefan; Thoreson, Kristen; Bernstein, Anat; Ebert, Karin; Buchner, Daniel; Laskov, Christine; Haderlein, Stefan; Shouakar-Stash, Orfan; Kliegman, Sarah; McNeill, Kristopher; Elsner, Martin

    2013-04-01

    Chloroethenes are commonly used in industrial applications, and detected as carcinogenic contaminants in the environment. Their dehalogenation is of environmental importance in remediation processes. However, a frequently encountered problem is the accumulation of toxic degradation products such as cis-dichloroethylene (cis-DCE) at contaminated sites. Several studies have addressed the reductive dehalogenation reactions using biotic and abiotic model systems, but a crucial question in this context has remained open: Do environmental transformations occur by the same mechanism as in their corresponding in vitro model systems? The presented study shows the potential to close this research gap using the latest developments in compound-specific chlorine isotope analysis, which make it possible to routinely measure chlorine isotope fractionation of chloroethenes in environmental samples and complex reaction mixtures.1,2 In particular, such chlorine isotope analysis enables the measurement of isotope fractionation for two elements (i.e., C and Cl) in chloroethenes. When isotope values of both elements are plotted against each other, different slopes reflect different underlying mechanisms and are remarkably insensitive towards masking. Our results suggest that different microbial strains (G. lovleyi strain SZ, D. hafniense Y51) and the isolated cofactor cobalamin employ similar mechanisms of reductive dechlorination of TCE. In contrast, evidence for a different mechanism was obtained with cobaloxime, cautioning its use as a model for biodegradation. The study shows the potential of the dual isotope approach as a tool to directly compare transformation mechanisms of environmental scenarios, biotic transformations, and their putative chemical lab-scale systems. Furthermore, it serves as an essential reference when using the dual isotope approach to assess the fate of chlorinated compounds in the environment.
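
The mechanism-diagnostic quantity in such a dual-isotope plot is the slope of δ13C against δ37Cl over the course of degradation. The sketch below computes it by ordinary least squares from invented data; published studies typically use regression schemes that account for errors in both variables.

```python
# Sketch: slope of the dual C-Cl isotope plot (often called Lambda).
# The delta values below are hypothetical degradation time-series data.
import numpy as np

delta_37cl = np.array([0.0, 1.1, 2.0, 3.2, 4.1])    # per mil
delta_13c = np.array([0.0, 4.9, 9.3, 14.8, 18.7])   # per mil

slope, intercept = np.polyfit(delta_37cl, delta_13c, 1)
print(f"Lambda (dC/dCl) = {slope:.2f}")   # different mechanisms give different slopes
```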

  17. Change impact analysis for software product lines

    Directory of Open Access Journals (Sweden)

    Jihen Maâzoun

    2016-10-01

    Full Text Available A software product line (SPL) represents a family of products in a given application domain. Each SPL is constructed to provide for the derivation of new products by covering a wide range of features in its domain. Nevertheless, over time, some domain features may become obsolete with the apparition of new features, while others may become refined. Accordingly, the SPL must be maintained to account for the domain evolution. Such evolution requires a means for managing the impact of changes on the SPL models, including the feature model and design. This paper presents an automated method that analyzes feature model evolution, traces its impact on the SPL design, and offers a set of recommendations to ensure the consistency of both models. The proposed method defines a set of new metrics adapted to SPL evolution to identify the effort needed to maintain the SPL models consistently and with a quality as good as that of the original models. The method and its tool are illustrated through an example of an SPL in the Text Editing domain. In addition, they are experimentally evaluated in terms of both the quality of the maintained SPL models and the precision of the impact change management.

  18. Dispersion analysis of biotoxins using HPAC software

    International Nuclear Information System (INIS)

    Wu, A.; Nurthen, N.; Horstman, A.; Watson, R.; Phillips, M.

    2009-01-01

    Biotoxins are emerging threat agents produced by living organisms: bacteria, plants, or animals. They are generally classified as cyanotoxins, hemotoxins, necrotoxins, neurotoxins, and cytotoxins. The potential application of classical biotoxins as weapons of terror has been recognized because of their extreme potency and lethality; their ease of production, transport, and misuse; and the need for prolonged intensive care among affected persons. Recently, emerging biotoxins such as ricin and T-2 mycotoxin have been used clandestinely by terrorist groups or in military combat operations. It is thus highly desirable to have a modeling system that simulates the dispersion of biotoxins in a terrorist attack scenario in order to provide prompt technical support and casualty estimates to first responders and military rescuers. The Hazard Prediction and Assessment Capability (HPAC) automated software system provides the means to accurately predict the effects of hazardous material released into the atmosphere and its impact on civilian and military populations. The system uses integrated source terms, high-resolution weather forecasts, and atmospheric transport and dispersion analyses to model hazard areas produced by military or terrorist incidents and industrial accidents. We have successfully incorporated the physical, chemical, epidemiological, and biological characteristics of a variety of biotoxins into the HPAC system and have conducted numerous analyses for our emergency responders. The health effects caused by these hazards are closely reflected in HPAC output results. (author)
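
    HPAC itself is proprietary, so as a hedged illustration of the kind of atmospheric transport and dispersion calculation described, here is a toy steady-state Gaussian plume model; the dispersion coefficients are illustrative placeholders for a single stability class, not HPAC's actual parameterization.

```python
import numpy as np

def gaussian_plume(Q, u, x, y, z, H=0.0):
    """Steady-state Gaussian plume concentration (g/m^3).
    Q: release rate (g/s), u: wind speed (m/s), H: release height (m),
    x/y/z: downwind, crosswind, and vertical coordinates (m)."""
    # Illustrative power-law dispersion coefficients (not HPAC's).
    sigma_y = 0.08 * x / np.sqrt(1 + 0.0001 * x)
    sigma_z = 0.06 * x / np.sqrt(1 + 0.0015 * x)
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    # Vertical term includes a ground-reflection image source.
    vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2)) +
                np.exp(-(z + H)**2 / (2 * sigma_z**2)))
    return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Ground-level centerline concentration 500 m downwind of a 1 g/s release:
print(gaussian_plume(Q=1.0, u=3.0, x=500.0, y=0.0, z=0.0))
```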

  19. OST: analysis tool for real time software by simulation of material and software environments

    International Nuclear Information System (INIS)

    Boulc'h; Le Meur; Lapassat; Salichon; Segalard

    1988-07-01

    The use of microprocessor systems for control in nuclear installations demands a high level of operational safety, both for the installation itself and for the protection of the environment. For the safety analysis of these installations, the Institute of Protection and Nuclear Safety (IPSN) requires tools that permit checks throughout the entire life of the software. The simulation and test tool (OST) that has been created is implemented entirely in software. It runs on VAX computers and is easily portable to other machines. [fr]

  20. Advanced concepts for gamma-ray isotopic analysis and instrumentation

    International Nuclear Information System (INIS)

    Buckley, W.M.; Carlson, J.B.

    1994-07-01

    The Safeguards Technology Program at the Lawrence Livermore National Laboratory is developing actinide isotopic analysis technologies in response to needs that address issues of flexibility of analysis, robustness of analysis, ease-of-use, automation, and portability. Recent developments, such as the Intelligent Actinide Analysis System (IAAS), begin to address these issues. We are continuing to develop enhancements to this and other instruments that improve ease-of-use, automation, and portability. Requests to analyze samples with unusual isotopics, contamination, or containers have made us aware of the need for more flexible and robust analysis. We have modified the MGA program to extend its plutonium isotopic analysis capability to samples with greater ²⁴¹Am content or uranium isotopics. We are investigating methods for dealing with tantalum or lead contamination and with contamination by high-energy gamma emitters, such as ²³³U, as well as ways to allow the program to use additional information about the sample to further extend the domain of analyzable samples. These unusual analyses will come from the domain of samples that need to be measured because of complex reconfiguration or environmental cleanup.

  1. Advances in isotopic analysis for food authenticity testing

    DEFF Research Database (Denmark)

    Laursen, Kristian Holst; Bontempo, L.; Camin, Federica

    2016-01-01

    Stable isotope analysis has been used for food authenticity testing for more than 30 years and is today being utilized on a routine basis for a wide variety of food commodities. During the past decade, major analytical method developments have been made and the fundamental understanding… authenticity testing is currently developing even further. In this chapter, we aim to provide an overview of the latest developments in stable isotope analysis for food authenticity testing. As several review articles and book chapters have recently addressed this topic, we will primarily focus on relevant… literature from the past 5 years. We will focus on well-established methods for food authenticity testing using stable isotopes but will also include recent methodological developments, new applications, and current and future challenges…

  2. Power Analysis Software for Educational Researchers

    Science.gov (United States)

    Peng, Chao-Ying Joanne; Long, Haiying; Abaci, Serdar

    2012-01-01

    Given the importance of statistical power analysis in quantitative research and the repeated emphasis on it by American Educational Research Association/American Psychological Association journals, the authors examined the reporting practice of power analysis by the quantitative studies published in 12 education/psychology journals between 2005…

  3. JEM-X science analysis software

    DEFF Research Database (Denmark)

    Westergaard, Niels Jørgen Stenfeldt; Kretschmar, P.; Oxborrow, Carol Anne

    2003-01-01

    The science analysis of the data from JEM-X on INTEGRAL is performed through a number of levels including corrections, good time selection, imaging and source finding, spectrum and light-curve extraction. These levels consist of individual executables and the running of the complete analysis...

  4. Multicomponent isotopic separation and recirculation analysis

    International Nuclear Information System (INIS)

    Misra, B.; Maroni, V.A.

    1976-01-01

    A digital computer program for the design of multicomponent distillation columns has been developed, based on an exact method of solution of the governing equations. Although this program was developed for the enrichment of spent fuels from presently conceived tokamak-type fusion power reactors by cryogenic distillation, it can be used for the design of any multicomponent distillation column, provided, of course, the necessary thermodynamic and phase equilibrium data are available. To prove the versatility of the program, parametric investigations of the effect of design and operating variables on the composition of the product streams were carried out for the case of separating hydrogen isotopes. The program is very efficient, so a number of parametric investigations can be carried out with limited resources. It does, however, require a fairly large amount of computer storage.

  5. Analysis of Variance: What Is Your Statistical Software Actually Doing?

    Science.gov (United States)

    Li, Jian; Lomax, Richard G.

    2011-01-01

    Users assume statistical software packages produce accurate results. In this article, the authors systematically examined Statistical Package for the Social Sciences (SPSS) and Statistical Analysis System (SAS) for 3 analysis of variance (ANOVA) designs: mixed-effects ANOVA, fixed-effects analysis of covariance (ANCOVA), and nested ANOVA. For each…

  6. Results of Am isotopic ratio analysis in irradiated MOX fuels

    Energy Technology Data Exchange (ETDEWEB)

    Koyama, Shin-ichi; Osaka, Masahiko; Mitsugashira, Toshiaki; Konno, Koichi [Power Reactor and Nuclear Fuel Development Corp., Oarai, Ibaraki (Japan). Oarai Engineering Center; Kajitani, Mikio

    1997-04-01

    For the analysis of a small quantity of americium, it is necessary to separate it from curium, which has similar chemical properties. As a chemical separation method for americium and curium, the oxidation of americium with pentavalent bismuth and subsequent co-precipitation of trivalent curium with BiPO₄ were applied to analyze americium in irradiated MOX fuels which contained about 30 wt% plutonium and 0.9 wt% ²⁴¹Am before irradiation and were irradiated up to 26.2 GWd/t in the experimental fast reactor Joyo. The purpose of this study is to measure the isotopic ratios of americium and to evaluate their change with irradiation. The following results were obtained: (1) the isotopic ratios of americium (²⁴¹Am, ²⁴²ᵐAm and ²⁴³Am) can be analyzed in the MOX fuels by isolating americium; the ratios of ²⁴²ᵐAm and ²⁴³Am increase up to 0.62 at% and 0.82 at% at maximum burnup, respectively; (2) the results of the isotopic analysis indicate that the content of ²⁴¹Am decreases, whereas ²⁴²ᵐAm and ²⁴³Am increase linearly with increasing burnup. (author)

  7. Hg stable isotope analysis by the double-spike method.

    Science.gov (United States)

    Mead, Chris; Johnson, Thomas M

    2010-06-01

    Recent publications suggest great potential for analysis of Hg stable isotope abundances to elucidate the sources and/or chemical processes that control the environmental impact of mercury. We have developed a new MC-ICP-MS method for the analysis of mercury isotope ratios using the double-spike approach, in which a solution containing enriched ¹⁹⁶Hg and ²⁰⁴Hg is mixed with samples and provides a means to correct for instrumental mass bias and for most isotopic fractionation that may occur during sample preparation and introduction into the instrument. Large amounts of isotopic fractionation induced by sample preparation and introduction into the instrument (e.g., by batch reactors) are corrected for. This may greatly enhance various Hg pre-concentration methods by correcting for minor fractionation that may occur during preparation and by removing the need to demonstrate 100% recovery. Current precision, when ratios are normalized to the daily average, is 0.06‰, 0.06‰, 0.05‰, and 0.05‰ (2σ) for ²⁰²Hg/¹⁹⁸Hg, ²⁰¹Hg/¹⁹⁸Hg, ²⁰⁰Hg/¹⁹⁸Hg, and ¹⁹⁹Hg/¹⁹⁸Hg, respectively. This is slightly better than previously published methods. Additionally, this precision was attained despite the presence of large amounts of other Hg isotopes (e.g., 5.0 atom percent ¹⁹⁸Hg) in the spike solution; substantially better precision could be achieved if purer ¹⁹⁶Hg were used.
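
    The full double-spike inversion is beyond a short sketch, but the exponential-law mass-bias correction that underlies it can be illustrated briefly; all ratio values below are hypothetical.

```python
import math

# Atomic masses of the Hg isotopes involved (u), rounded.
m198, m201, m202 = 197.9668, 200.9703, 201.9706

# Hypothetical measured and true values of a reference ratio, used to
# derive the mass-bias factor beta under the exponential law.
R_meas_ref, R_true_ref = 3.050, 2.9630   # 202Hg/198Hg (illustrative)
beta = math.log(R_meas_ref / R_true_ref) / math.log(m202 / m198)

def correct(R_meas, m_num, m_den):
    """Exponential-law correction: R_true = R_meas * (m_num/m_den)**(-beta)."""
    return R_meas * (m_num / m_den) ** (-beta)

# Corrected 201Hg/198Hg from an illustrative measured value:
print(correct(1.230, m201, m198))
```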

  8. Adapted wavelet analysis from theory to software

    CERN Document Server

    Wickerhauser, Mladen Victor

    1994-01-01

    This detail-oriented text is intended for engineers and applied mathematicians who must write computer programs to perform wavelet and related analysis on real data. It contains an overview of mathematical prerequisites and proceeds to describe hands-on programming techniques to implement special programs for signal analysis and other applications. From the table of contents: - Mathematical Preliminaries - Programming Techniques - The Discrete Fourier Transform - Local Trigonometric Transforms - Quadrature Filters - The Discrete Wavelet Transform - Wavelet Packets - The Best Basis Algorithm - Multidimensional Library Trees - Time-Frequency Analysis - Some Applications - Solutions to Some of the Exercises - List of Symbols - Quadrature Filter Coefficients
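
    The book develops its own implementations; purely as an assumed-equivalent illustration in Python, the sketch below runs a multilevel discrete wavelet transform and soft-thresholds the detail coefficients (basic denoising) with the PyWavelets package.

```python
import numpy as np
import pywt  # PyWavelets

# A noisy test signal standing in for "real data".
t = np.linspace(0, 1, 1024)
signal = np.sin(2 * np.pi * 5 * t) + 0.3 * np.random.randn(t.size)

# Multilevel DWT with a Daubechies-4 filter, then soft-threshold the
# detail coefficients and reconstruct.
coeffs = pywt.wavedec(signal, "db4", level=5)
coeffs = [coeffs[0]] + [pywt.threshold(c, value=0.5, mode="soft")
                        for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")
print(denoised.shape)
```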

  9. Development of Cell Analysis Software for Cultivated Corneal Endothelial Cells.

    Science.gov (United States)

    Okumura, Naoki; Ishida, Naoya; Kakutani, Kazuya; Hongo, Akane; Hiwa, Satoru; Hiroyasu, Tomoyuki; Koizumi, Noriko

    2017-11-01

    To develop analysis software for cultured human corneal endothelial cells (HCECs). Software was designed to recognize cell borders and to provide parameters such as cell density, coefficient of variation, and polygonality of cultured HCECs based on phase contrast images. Cultured HCECs with high or low cell density were incubated with Ca-free and Mg-free phosphate-buffered saline for 10 minutes to reveal the cell borders and were then analyzed with the software (n = 50). Phase contrast images showed that cell borders were not distinctly outlined, but these borders became more distinct after the phosphate-buffered saline treatment and were recognized by the cell analysis software. The cell density value provided by the software was similar to that obtained by manual cell counting by an experienced researcher. Morphometric parameters, such as the coefficient of variation and polygonality, were also produced by the software, and these values were significantly correlated with cell density (Pearson correlation coefficients -0.62 and 0.63, respectively). The software described here provides morphometric information from phase contrast images, and it enables objective and noninvasive quality assessment for tissue engineering therapy of the corneal endothelium.
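
    The paper's software is not public; assuming a segmentation is already available as a binary mask in which each cell is a connected region, a minimal sketch of two of the reported morphometrics (cell density and coefficient of variation of cell area) might look like this.

```python
import numpy as np
from skimage import measure

def morphometrics(binary_cells, field_area_mm2):
    """Cell density (cells/mm^2) and CV of cell area (%) from a binary
    mask; segmentation is assumed to have been done already."""
    labels = measure.label(binary_cells)
    areas = np.array([r.area for r in measure.regionprops(labels)])
    density = len(areas) / field_area_mm2
    cv = 100.0 * areas.std() / areas.mean()
    return density, cv

# Tiny synthetic example: three "cells" in a 0.01 mm^2 field.
mask = np.zeros((64, 64), dtype=bool)
mask[5:15, 5:15] = mask[30:42, 10:20] = mask[40:55, 40:58] = True
print(morphometrics(mask, field_area_mm2=0.01))
```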

  10. Krypton isotope analysis using near-resonant stimulated Raman spectroscopy

    International Nuclear Information System (INIS)

    Whitehead, C.A.; Cannon, B.D.; Wacker, J.F.

    1994-12-01

    A method for measuring low relative abundances of ⁸⁵Kr in samples of air of one liter or less has been under development at Pacific Northwest Laboratory. The goal of the Krypton Isotope Laser Analysis (KILA) method is to measure ratios of 10⁻¹⁰ or less of ⁸⁵Kr to the more abundant stable krypton. Mass spectrometry and beta counting are the main competing technologies used in rare-gas trace analysis and are limited in application by such factors as sample size, counting times, and selectivity. The use of high-resolution lasers to probe hyperfine levels to determine isotopic abundance has received much attention recently. In this study, we report our progress in identifying and implementing techniques for trace ⁸⁵Kr analysis on small gas samples in a static cell, as well as limitations on the sensitivity and selectivity of the technique. High-resolution pulsed and cw lasers are employed in a laser-induced fluorescence technique that preserves the original sample. This technique is based on resonant isotopic depletion spectroscopy (RIDS), in which one isotope is optically depleted while the population of a less abundant isotope is preserved. The KILA method consists of three steps. In the first step, the 1s₅ metastable level of krypton is populated via radiative cascade following two-photon excitation of the 2p₆ energy level. Next, using RIDS, the stable krypton isotopes are optically depleted to the ground state through the 1s₄ level, with the bulk of the ⁸⁵Kr population being preserved. Finally, the remaining metastable population is probed to determine the ⁸⁵Kr concentration. The experimental requirements for each of these steps are outlined below.

  11. Automated Freedom from Interference Analysis for Automotive Software

    OpenAIRE

    Leitner-Fischer , Florian; Leue , Stefan; Liu , Sirui

    2016-01-01

    Freedom from Interference for automotive software systems developed according to the ISO 26262 standard means that a fault in a less safety-critical software component will not lead to a fault in a more safety-critical component. It is an important concern in the realm of functional safety for automotive systems. We present an automated method for the analysis of concurrency-related interferences based on the QuantUM approach and tool that we have previously developed.

  12. Comparison of gas chromatography/isotope ratio mass spectrometry and liquid chromatography/isotope ratio mass spectrometry for carbon stable-isotope analysis of carbohydrates

    NARCIS (Netherlands)

    Moerdijk-Poortvliet, Tanja C. W.; Schierbeek, Henk; Houtekamer, Marco; van Engeland, Tom; Derrien, Delphine; Stal, Lucas J.; Boschker, Henricus T. S.

    2015-01-01

    We compared gas chromatography/isotope ratio mass spectrometry (GC/IRMS) and liquid chromatography/isotope ratio mass spectrometry (LC/IRMS) for the measurement of δ¹³C values in carbohydrates. Contrary to GC/IRMS, no derivatisation is needed for LC/IRMS analysis of carbohydrates. Hence, although

  13. Comparison of gas chromatography/isotope ratio mass spectrometry and liquid chromatography/isotope ratio mass spectrometry for carbon stable-isotope analysis of carbohydrates

    NARCIS (Netherlands)

    Moerdijk-Poortvliet, T.C.W.; Schierbeek, H.; Houtekamer, M.; van Engeland, T.; Derrien, D.; Stal, L.J.; Boschker, H.T.S.

    2015-01-01

    We compared gas chromatography/isotope ratio mass spectrometry (GC/IRMS) and liquid chromatography/isotope ratio mass spectrometry (LC/IRMS) for the measurement of δ¹³C values in carbohydrates. Contrary to GC/IRMS, no derivatisation is needed for LC/IRMS analysis of carbohydrates. Hence, although

  14. Comparison of gas chromatography/isotope ratio mass spectrometry and liquid chromatography/isotope ratio mass spectrometry for carbon stable-isotope analysis of carbohydrates

    NARCIS (Netherlands)

    Moerdijk-Poortvliet, T.C.W.; Schierbeek, H.; Houtekamer, M.; van Engeland, T.; Derrien, D.; Stal, L.J.; Boschker, H.T.S.

    2015-01-01

    Rationale: We compared gas chromatography/isotope ratio mass spectrometry (GC/IRMS) and liquid chromatography/isotope ratio mass spectrometry (LC/IRMS) for the measurement of δ¹³C values in carbohydrates. Contrary to GC/IRMS, no derivatisation is needed for LC/IRMS analysis of carbohydrates. Hence,

  15. ISODEP, A Fuel Depletion Analysis Code for Predicting Isotopic ...

    African Journals Online (AJOL)

    The trend of the results was found to be consistent with those obtained by analytical and other numerical methods. Discovery and Innovation Vol. 13 no. 3/4 December (2001) pp. 184-195. KEY WORDS: depletion analysis, code, research reactor, simultaneous equations, decay of nuclides, radionuclides, isotope.
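
    ISODEP's equations are not reproduced in the record; as a generic illustration of the simultaneous decay and depletion equations such a code solves, here is a two-member decay chain integrated numerically (decay constants and inventories are hypothetical, and a real depletion code adds flux-dependent transmutation terms).

```python
from scipy.integrate import solve_ivp

# Illustrative decay chain A -> B -> (stable), as coupled ODEs.
lam_A, lam_B = 1e-2, 5e-3   # decay constants (1/s), hypothetical

def rhs(t, n):
    nA, nB = n
    return [-lam_A * nA,            # A decays away
            lam_A * nA - lam_B * nB]  # B is produced by A, then decays

sol = solve_ivp(rhs, (0.0, 1000.0), [1e20, 0.0])
print(sol.y[:, -1])  # nuclide inventories at t = 1000 s
```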

  16. Hydrology of Bishop Creek, California: An Isotopic Analysis

    Science.gov (United States)

    Michael L. Space; John W. Hess; Stanley D. Smith

    1989-01-01

    Five power generation plants along an eleven-kilometer stretch divert Bishop Creek water for hydroelectric power. Stream diversion may be adversely affecting the riparian vegetation. Stable isotopic analysis is employed to determine surface-water/ground-water interactions along the creek. Surface water originates primarily from three headwater lakes. Discharge into...
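
    A common way to quantify such surface-water/ground-water interactions is a two-end-member isotope mixing model; the sketch below uses invented δ¹⁸O values purely for illustration.

```python
def lake_fraction(d18O_sample, d18O_lake, d18O_ground):
    """Two-end-member mixing: fraction of stream water derived from
    headwater-lake surface water, from delta-18O values (per mil).
    All numbers used below are hypothetical."""
    return (d18O_sample - d18O_ground) / (d18O_lake - d18O_ground)

print(lake_fraction(-15.2, -14.0, -16.5))  # ~0.52 of the flow from lakes
```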

  17. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report describes software safety analysis techniques and engineering guidelines for developing safety-critical software, in order to identify the state of the art in this field and to give the software safety engineer a trail map between the codes-and-standards layer and the design-methodology-and-documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures (CMFs), high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity (D-in-D&D) analysis guidelines.

  18. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    International Nuclear Information System (INIS)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report describes software safety analysis techniques and engineering guidelines for developing safety-critical software, in order to identify the state of the art in this field and to give the software safety engineer a trail map between the codes-and-standards layer and the design-methodology-and-documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures (CMFs), high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity (D-in-D&D) analysis guidelines.
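
    As a minimal illustration of the arithmetic behind the software FTA surveyed above, the sketch below evaluates a small fault tree of independent basic events with hypothetical probabilities.

```python
# Minimal fault-tree evaluation for independent basic events.
def p_and(*ps):          # AND gate: all inputs must fail
    out = 1.0
    for p in ps:
        out *= p
    return out

def p_or(*ps):           # OR gate: any input failing suffices
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

# Hypothetical top event: (sensor fault AND voter fault) OR software CMF.
p_top = p_or(p_and(1e-3, 2e-3), 1e-5)
print(f"top event probability = {p_top:.3e}")
```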

  19. Discrimination of ginseng cultivation regions using light stable isotope analysis.

    Science.gov (United States)

    Kim, Kiwook; Song, Joo-Hyun; Heo, Sang-Cheol; Lee, Jin-Hee; Jung, In-Woo; Min, Ji-Sook

    2015-10-01

    Korean ginseng is considered a precious health food in Asia. Today, ginseng farms are frequently compromised by pervasive theft. Thus, studies of the characteristics of ginseng according to growth region are required in order to deter ginseng thieves and prevent theft. In this study, 6 regions were selected on the basis of Korean regional criteria (si, gun, gu), and two ginseng farms were randomly selected from each of the 6 regions. Then 4-6 samples of ginseng were acquired from each farm. The stable isotopic compositions of H, O, C, and N of the collected ginseng samples were analyzed. Differences in the hydrogen isotope ratios could be used to distinguish regions, and differences in the nitrogen isotope ratios yielded characteristic information regarding the farms from which the samples were obtained. Thus, stable isotope values could be used to differentiate samples according to regional differences. Therefore, stable isotope analysis serves as a powerful tool for discriminating the regional origin of Korean ginseng samples from across Korea. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  20. Development of output user interface software to support analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wahanani, Nursinta Adi, E-mail: sintaadi@batan.go.id; Natsir, Khairina, E-mail: sintaadi@batan.go.id; Hartini, Entin, E-mail: sintaadi@batan.go.id [Center for Development of Nuclear Informatics - National Nuclear Energy Agency, PUSPIPTEK, Serpong, Tangerang, Banten (Indonesia)

    2014-09-30

    Data processing software packages such as VSOP and MCNPX are scientifically proven and complete, but their results are huge and complex text files. In the analysis process, users need additional processing, such as Microsoft Excel, to present informative results. This research developed user interface software for the output of VSOP and MCNPX. The VSOP program output is used to support neutronic analysis and the MCNPX program output is used to support burn-up analysis. The software was developed using iterative development methods, which allow for the revision and addition of features according to user needs. Processing time with this software is 500 times faster than with conventional methods using Microsoft Excel. Python is used as the programming language because it is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac, and Amiga, among others. Values that support neutronic analysis are k-eff, burn-up, and the masses of ²³⁹Pu and ²⁴¹Pu. Burn-up analysis uses the mass inventory values of the actinides (thorium, plutonium, neptunium, and uranium). Values are visualized in graphical form to support analysis.

  1. Development of output user interface software to support analysis

    International Nuclear Information System (INIS)

    Wahanani, Nursinta Adi; Natsir, Khairina; Hartini, Entin

    2014-01-01

    Data processing software packages such as VSOP and MCNPX are scientifically proven and complete, but their results are huge and complex text files. In the analysis process, users need additional processing, such as Microsoft Excel, to present informative results. This research developed user interface software for the output of VSOP and MCNPX. The VSOP program output is used to support neutronic analysis and the MCNPX program output is used to support burn-up analysis. The software was developed using iterative development methods, which allow for the revision and addition of features according to user needs. Processing time with this software is 500 times faster than with conventional methods using Microsoft Excel. Python is used as the programming language because it is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac, and Amiga, among others. Values that support neutronic analysis are k-eff, burn-up, and the masses of ²³⁹Pu and ²⁴¹Pu. Burn-up analysis uses the mass inventory values of the actinides (thorium, plutonium, neptunium, and uranium). Values are visualized in graphical form to support analysis.
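
    The record does not show the parsing code itself; a hedged sketch of the kind of output post-processing described, here pulling k-eff values out of an MCNPX-style text file, could look like the following. The regular expression is illustrative only, since the exact output line format varies between code versions.

```python
import re

# Illustrative pattern for lines of the form "... keff = 1.00012 ...".
PATTERN = re.compile(r"keff\s*=\s*([0-9.]+)", re.IGNORECASE)

def extract_keff(path):
    """Yield every k-eff value found in an output file."""
    with open(path) as fh:
        for line in fh:
            match = PATTERN.search(line)
            if match:
                yield float(match.group(1))

# Hypothetical usage:
# for k in extract_keff("mcnpx_output.txt"):
#     print(k)
```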

  2. LANDSAFE: LANDING SITE RISK ANALYSIS SOFTWARE FRAMEWORK

    OpenAIRE

    Schmidt, Ralph; Bostelmann, Jonas; Cornet, Yves; Heipke, Christian; Philippe, Christian; Poncelet, Nadia; de Rosa, Diego; Vandeloise, Yannick

    2012-01-01

    The European Space Agency (ESA) is planning a Lunar Lander mission in the 2018 timeframe that will demonstrate precise soft landing at the polar regions of the Moon. To ensure a safe and successful landing, a careful risk analysis has to be carried out. This comprises identifying favorable target areas and evaluating the surface conditions in these areas. Features like craters, boulders, steep slopes, rough surfaces, and shadow areas have to be identified in order to assess the risk assoc...

  3. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Huang, H.; Tu, W.; Shih, C.; Chen, C.; Yang, W.; Yih, S.; Kuo, C.; Chen, M.

    2006-01-01

    Full text: Following the massive adoption of digital Instrumentation and Control (I and C) systems in nuclear power plants (NPPs), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting an appropriate digital I and C system, and then to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be incorporated in a complementary fashion, the resulting SSA combination is more acceptable. Consequently, if proper evaluation criteria are available, the analyst can choose an appropriate combination of techniques to perform the analysis on the basis of the available resources. This research evaluated the software safety analysis techniques applicable nowadays, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis, and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts choose the best SSA combination and arrange their own software safety plans. With this proposed method, analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive, owing to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for understanding the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio. However, their disadvantages are in completeness and complexity.

  4. Software development for the simulation and design of the cryogenic distillation cascade used for hydrogen isotope separation

    Energy Technology Data Exchange (ETDEWEB)

    Draghia, Mirela Mihaela, E-mail: mirela.draghia@istech-ro.com; Pasca, Gheorghe; Porcariu, Florina

    2016-11-01

    Highlights: • Software for the design and simulation of a cryogenic distillation cascade. • The simulation provides the distribution of all the molecular species involved along each cryogenic distillation column, as well as the temperature profile along the columns. • Useful information relevant to the ITER Isotope Separation System. - Abstract: The hydrogen isotope separation system (ISS) based on cryogenic distillation is one of the key systems of the fuel cycle of a fusion reactor. As in the ITER ISS, cryogenic distillation is one of the main systems of a Water Detritiation Facility for a CANDU reactor. Developments on CANDU water detritiation systems have shown that a cascade of four cryogenic distillation columns is required in order to achieve the required decontamination factor for the heavy water and a tritium enrichment of up to 99.9%. This paper presents the results of the design and simulation activities carried out in support of the development of the Cernavoda Tritium Removal Facility (CTRF). Besides the main features of the software developed “in house”, an introduction to the main issues of a CANDU tritium removal facility relevant to the ITER ISS is provided as well. Based on the input data (e.g., the flow rates, the composition of the gas supplied to the cryogenic distillation cascade, the pressure drop along the column, and the liquid inventory), the simulation provides the distribution of all the molecular species involved along each cryogenic distillation column, as well as the temperature profile along the columns. The approach for the static and dynamic simulation of the cryogenic distillation process is based on a theoretical-plates model, and the calculations are performed incrementally, plate by plate.

  5. Software development for the simulation and design of the cryogenic distillation cascade used for hydrogen isotope separation

    International Nuclear Information System (INIS)

    Draghia, Mirela Mihaela; Pasca, Gheorghe; Porcariu, Florina

    2016-01-01

    Highlights: • Software for the design and simulation of a cryogenic distillation cascade. • The simulation provides the distribution of all the molecular species involved along each cryogenic distillation column, as well as the temperature profile along the columns. • Useful information relevant to the ITER Isotope Separation System. - Abstract: The hydrogen isotope separation system (ISS) based on cryogenic distillation is one of the key systems of the fuel cycle of a fusion reactor. As in the ITER ISS, cryogenic distillation is one of the main systems of a Water Detritiation Facility for a CANDU reactor. Developments on CANDU water detritiation systems have shown that a cascade of four cryogenic distillation columns is required in order to achieve the required decontamination factor for the heavy water and a tritium enrichment of up to 99.9%. This paper presents the results of the design and simulation activities carried out in support of the development of the Cernavoda Tritium Removal Facility (CTRF). Besides the main features of the software developed “in house”, an introduction to the main issues of a CANDU tritium removal facility relevant to the ITER ISS is provided as well. Based on the input data (e.g., the flow rates, the composition of the gas supplied to the cryogenic distillation cascade, the pressure drop along the column, and the liquid inventory), the simulation provides the distribution of all the molecular species involved along each cryogenic distillation column, as well as the temperature profile along the columns. The approach for the static and dynamic simulation of the cryogenic distillation process is based on a theoretical-plates model, and the calculations are performed incrementally, plate by plate.
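
    As a toy version of the plate-by-plate calculation mentioned at the end of the abstract, the sketch below steps an ideal column at total reflux with an assumed relative volatility; the numbers are illustrative, not CTRF design values.

```python
# Plate-by-plate sketch of isotopic separation at total reflux: each
# theoretical plate multiplies the light-to-heavy mole ratio by the
# relative volatility alpha. Alpha and the feed composition are
# hypothetical placeholders.
alpha = 1.05          # relative volatility, light vs. heavy species
x_feed = 0.01         # mole fraction of the heavier (tritiated) species

ratio = (1 - x_feed) / x_feed     # light/heavy mole ratio at the feed
profile = []
for plate in range(60):
    ratio *= alpha                         # one ideal plate of separation
    profile.append(1.0 / (1.0 + ratio))    # heavy-species mole fraction

print(f"heavy fraction after 60 plates: {profile[-1]:.2e}")
```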

  6. Ion Mobility Mass Spectrometry Direct Isotope Abundance Analysis

    International Nuclear Information System (INIS)

    Manard, Manuel J.; Weeks, Stephan; Kyle, Kevin

    2010-01-01

    The nuclear forensics community is currently engaged in the analysis of illicit nuclear or radioactive material for the purposes of non-proliferation and attribution. One technique commonly employed for gathering nuclear forensics information is isotope analysis. At present, the state-of-the-art methodology for obtaining isotopic distributions is thermal ionization mass spectrometry (TIMS). Although TIMS is highly accurate at determining isotope distributions, the technique requires an elementally pure sample to perform the measurement. The required radiochemical separations give rise to sample preparation times that can be in excess of one to two weeks. Clearly, the nuclear forensics community needs instrumentation and methods that can expedite its decision-making process in the event of a radiological release or nuclear detonation. Accordingly, we are developing instrumentation that couples a high-resolution ion mobility (IM) drift cell to the front end of a mass spectrometer (MS). The IM cell provides a means of separating ions based upon their collision cross-section and mass-to-charge ratio (m/z). Two analytes with the same m/z but different collision cross-sections (shapes) exit the cell at different times, essentially enabling the cell to function in a manner similar to a gas chromatography (GC) column. Thus, molecular and atomic isobaric interferences can be effectively removed from the ion beam. The mobility-selected chemical species can then be introduced to the MS for high-resolution mass analysis to generate isotopic distributions of the target analytes. The outcome would be an IM/MS system capable of accurately measuring isotopic distributions while concurrently eliminating isobaric interferences and laboratory radiochemical sample preparation. The overall objective of this project is to develop instrumentation and methods to produce near real-time isotope distributions with a modular mass spectrometric system that performs the required gas-phase chemistry and
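
    The link between an ion's collision cross-section and its drift time is commonly described by the Mason-Schamp relation; below is a sketch that evaluates it for hypothetical instrument parameters (pressure, field, drift length).

```python
import math

kB = 1.380649e-23          # Boltzmann constant (J/K)
e = 1.602176634e-19        # elementary charge (C)
u = 1.66053906660e-27      # atomic mass unit (kg)

def drift_time(ccs_m2, m_ion_u, m_gas_u, T=298.0, P=400.0,
               E=1.0e4, L=1.0, z=1):
    """Drift time (s) from the Mason-Schamp mobility.
    ccs_m2: collision cross-section (m^2); P in Pa; E in V/m; L in m."""
    mu = (m_ion_u * m_gas_u) / (m_ion_u + m_gas_u) * u  # reduced mass (kg)
    N = P / (kB * T)                                    # gas number density
    K = (3 * z * e / (16 * N)) * math.sqrt(
        2 * math.pi / (mu * kB * T)) / ccs_m2           # mobility (m^2/V/s)
    return L / (K * E)

# A 200 A^2 ion of mass 500 u drifting through helium:
print(drift_time(200e-20, 500.0, 4.0))   # on the order of milliseconds
```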

  7. How to do Meta-Analysis using HLM software

    OpenAIRE

    Petscher, Yaacov

    2013-01-01

    This is a step-by-step presentation of how to run a meta-analysis using HLM software. Because it is a variance-known model, it is not run through the GUI but in batch mode. These slides show how to prepare the data and run the analysis.
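
    For readers without HLM, the same variance-known model can be sketched in a few lines: inverse-variance fixed-effect pooling plus a DerSimonian-Laird estimate of the between-study variance. The effect sizes and variances below are invented for illustration.

```python
import numpy as np

d = np.array([0.30, 0.12, 0.45, 0.26])   # study effect sizes
v = np.array([0.04, 0.02, 0.09, 0.03])   # known sampling variances

w = 1.0 / v
d_fixed = np.sum(w * d) / np.sum(w)               # fixed-effect pooled estimate
Q = np.sum(w * (d - d_fixed) ** 2)                # heterogeneity statistic
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - (len(d) - 1)) / c)           # between-study variance (DL)

w_star = 1.0 / (v + tau2)
d_random = np.sum(w_star * d) / np.sum(w_star)    # random-effects estimate
print(f"fixed = {d_fixed:.3f}, tau^2 = {tau2:.3f}, random = {d_random:.3f}")
```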

  8. A relational approach to support software architecture analysis

    NARCIS (Netherlands)

    Feijs, L.M.G.; Krikhaar, R.L.; van Ommering, R.C.

    1998-01-01

    This paper reports on our experience with a relational approach to support the analysis of existing software architectures. The analysis options provide for visualization and view calculation. The approach has been applied for reverse engineering. It is also possible to check concrete designs

  9. Development of data acquisition and analysis software for multichannel detectors

    International Nuclear Information System (INIS)

    Chung, Y.

    1988-06-01

    This report describes the development of data acquisition and analysis software for Apple Macintosh computers that is capable of controlling two multichannel detectors. With the help of outstanding graphics capabilities, an easy-to-use user interface, and several other built-in convenience features, this application has enhanced the productivity and efficiency of data analysis. 2 refs., 6 figs

  10. Radio-science performance analysis software

    Science.gov (United States)

    Morabito, D. D.; Asmar, S. W.

    1995-02-01

    The Radio Science Systems Group (RSSG) provides various support functions for several flight project radio-science teams. Among these support functions are uplink and sequence planning, real-time operations monitoring and support, data validation, archiving and distribution functions, and data processing and analysis. This article describes the support functions that encompass radio-science data performance analysis. The primary tool used by the RSSG to fulfill this support function is the STBLTY program set. STBLTY is used to reconstruct observable frequencies and calculate model frequencies, frequency residuals, frequency stability in terms of Allan deviation, reconstructed phase, frequency and phase power spectral density, and frequency drift rates. In the case of one-way data, using an ultrastable oscillator (USO) as a frequency reference, the program set computes the spacecraft transmitted frequency and maintains a database containing the in-flight history of the USO measurements. The program set also produces graphical displays. Some examples and discussions on operating the program set on Galileo and Ulysses data will be presented.
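
    As an illustration of the frequency-stability measure that STBLTY reports, here is a minimal non-overlapping Allan deviation computed from synthetic fractional-frequency data.

```python
import numpy as np

def allan_deviation(freq, m):
    """Non-overlapping Allan deviation of fractional-frequency data
    `freq` at averaging factor m (tau = m * sample interval)."""
    n = len(freq) // m
    means = freq[: n * m].reshape(n, m).mean(axis=1)    # tau-averages
    return np.sqrt(0.5 * np.mean(np.diff(means) ** 2))

# White-noise test data: ADEV should fall roughly as 1/sqrt(tau).
y = 1e-12 * np.random.randn(100_000)
for m in (1, 10, 100):
    print(m, allan_deviation(y, m))
```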

  11. Confirmatory Factor Analysis Alternative: Free, Accessible CBID Software.

    Science.gov (United States)

    Bott, Marjorie; Karanevich, Alex G; Garrard, Lili; Price, Larry R; Mudaranthakam, Dinesh Pal; Gajewski, Byron

    2018-02-01

    New software that performs Classical and Bayesian Instrument Development (CBID) is reported that seamlessly integrates expert (content validity) and participant data (construct validity) to produce entire reliability estimates with smaller sample requirements. The free CBID software can be accessed through a website and used by clinical investigators in new instrument development. Demonstrations are presented of the three approaches using the CBID software: (a) traditional confirmatory factor analysis (CFA), (b) Bayesian CFA using flat uninformative prior, and (c) Bayesian CFA using content expert data (informative prior). Outcomes of usability testing demonstrate the need to make the user-friendly, free CBID software available to interdisciplinary researchers. CBID has the potential to be a new and expeditious method for instrument development, adding to our current measurement toolbox. This allows for the development of new instruments for measuring determinants of health in smaller diverse populations or populations of rare diseases.

  12. A 'Toolbox' Equivalent Process for Safety Analysis Software

    International Nuclear Information System (INIS)

    O'Kula, K.R.; Eng, Tony

    2004-01-01

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 (Quality Assurance for Safety-Related Software) identified a number of quality assurance issues in the use of software in Department of Energy (DOE) facilities for analyzing hazards and for designing and operating controls that prevent or mitigate potential accidents. The development and maintenance of a collection, or 'toolbox', of multiple-site-use, standard-solution, Software Quality Assurance (SQA)-compliant safety software is one of the major improvements identified in the associated DOE Implementation Plan (IP). The DOE safety analysis toolbox will contain a set of appropriately quality-assured, configuration-controlled safety analysis codes recognized for DOE-broad safety basis applications. Currently, six widely applied safety analysis computer codes have been designated for toolbox consideration. While the toolbox concept considerably reduces SQA burdens among DOE users of these codes, many users of unique, single-purpose, or single-site software may still have sufficient technical justification to continue using their computer code of choice, but are thwarted by the multiple-site condition on toolbox candidate software. The process discussed here provides a roadmap for an equivalency argument, i.e., establishing satisfactory SQA credentials for single-site software that can be deemed 'toolbox-equivalent'. The process is based on the model established to meet IP Commitment 4.2.1.2: establish SQA criteria for the safety analysis 'toolbox' codes. The implementing criteria that establish the set of prescriptive SQA requirements are based on implementation plans/procedures from the Savannah River Site, also incorporating aspects of those from the Waste Isolation Pilot Plant (SNL component) and the Yucca Mountain Project. The major requirements are met with evidence of a software quality assurance plan, software requirements and design documentation, user's instructions, test report, a

  13. IFDOTMETER: A New Software Application for Automated Immunofluorescence Analysis.

    Science.gov (United States)

    Rodríguez-Arribas, Mario; Pizarro-Estrella, Elisa; Gómez-Sánchez, Rubén; Yakhine-Diop, S M S; Gragera-Hidalgo, Antonio; Cristo, Alejandro; Bravo-San Pedro, Jose M; González-Polo, Rosa A; Fuentes, José M

    2016-04-01

    Most laboratories interested in autophagy use different imaging software for managing and analyzing heterogeneous parameters in immunofluorescence experiments (e.g., LC3-puncta quantification and determination of the number and size of lysosomes). One solution would be software that works on a user's laptop or workstation that can access all image settings and provide quick and easy-to-use analysis of data. Thus, we have designed and implemented an application called IFDOTMETER, which can run on all major operating systems because it has been programmed using JAVA (Sun Microsystems). Briefly, IFDOTMETER software has been created to quantify a variety of biological hallmarks, including mitochondrial morphology and nuclear condensation. The program interface is intuitive and user-friendly, making it useful for users not familiar with computer handling. By setting previously defined parameters, the software can automatically analyze a large number of images without the supervision of the researcher. Once analysis is complete, the results are stored in a spreadsheet. Using software for high-throughput cell image analysis offers researchers the possibility of performing comprehensive and precise analysis of a high number of images in an automated manner, making this routine task easier. © 2015 Society for Laboratory Automation and Screening.

  14. FORECAST: Regulatory effects cost analysis software annual

    International Nuclear Information System (INIS)

    Lopez, B.; Sciacca, F.W.

    1991-11-01

    Over the past several years the NRC has developed a generic cost methodology for the quantification of cost/economic impacts associated with a wide range of new or revised regulatory requirements. This methodology has been developed to aid the NRC in preparing Regulatory Impact Analyses (RIAs). These generic costing methods can be useful in quantifying impacts both to industry and to the NRC. The FORECAST program was developed to facilitate the use of the generic costing methodology. This PC program integrates the major cost considerations that may be required because of a regulatory change. FORECAST automates much of the calculation typically needed in an RIA and thus reduces the time and labor required to perform these analyses. More importantly, its integrated and consistent treatment of the different cost elements should help assure comprehensiveness, uniformity, and accuracy in the preparation of needed cost estimates.

  15. Equipment Obsolescence Analysis and Management Software

    Energy Technology Data Exchange (ETDEWEB)

    Redmond, J.; Carret, L.; Shaon, S.; Schultz, C.

    2015-07-01

    The procurement engineering resources at Nuclear Power Plants (NPPs) are experiencing an increasing backlog of procurement items, primarily due to the inability to order the original replacement parts. The level of effort and time required to prepare procurement packages is increasing, since the number of obsolete parts is increasing exponentially. Procurement packages for obsolete components and parts are much more complex and take more time to prepare because of the need to perform equivalency evaluations, develop testing requirements and test acceptance criteria, carry out commercial-grade dedication or equipment qualification, and make increasing efforts to verify that no fraudulent or counterfeit parts are procured. This problem will be further compounded when NPPs pursue license renewal and approval for plant-life extension. Advance planning and advance knowledge of equipment obsolescence are required to allow sufficient time to properly procure replacement parts for obsolete items. The uncertain supply-chain capability due to obsolescence is a real problem and poses a risk to reliable plant operations due to the potential lack of available spare parts and replacement components to support outages and unplanned component failures. Advance notification of obsolescence is increasingly important to ensure that adequate time and planning are scheduled to procure the proper replacement parts. A thorough analysis of Original Equipment Manufacturer (OEM) availability and inventory, as well as an analysis of failure rates and usage rates, is required to predict critical part needs and allow for early identification of obsolescence issues, so that a planned and controlled strategy to qualify replacement equipment can be implemented. (Author)

  16. The software analysis project for the Office of Human Resources

    Science.gov (United States)

    Tureman, Robert L., Jr.

    1994-01-01

    There were two major sections of the project for the Office of Human Resources (OHR). The first section was a planning study to analyze software use, with the goal of recommending software purchases and determining whether the need exists for a file server. The second section was analysis and distribution planning for a retirement planning computer program entitled VISION, provided by NASA Headquarters. The software planning study was developed to help OHR analyze the current administrative desktop computing environment and make decisions regarding software acquisition and implementation. There were three major areas addressed by the study: the current environment, new software requirements, and strategies regarding the implementation of a server in the Office. To gather data on the current environment, employees were surveyed and an inventory of computers was produced. The surveys were compiled and analyzed by the ASEE fellow, with interpretation help from OHR staff. The new software requirements represented a compilation and analysis of the surveyed requests of OHR personnel. Finally, the information on the use of a server represents research done by the ASEE fellow and analysis of survey data to determine software requirements for a server. This included selection of a methodology to estimate the number of copies of each software program required, given current use and estimated growth. The report presents the results of the computing survey, a description of the current computing environment, recommendations for changes in the computing environment, current software needs, management advantages of using a server, and management considerations in the implementation of a server. In addition, detailed specifications were presented for the hardware and software recommendations to offer a complete picture to OHR management. The retirement planning computer program available to NASA employees will aid in long-range retirement planning. The intended audience is the NASA civil

  17. Isotope analysis reveals foraging area dichotomy for atlantic leatherback turtles.

    Directory of Open Access Journals (Sweden)

    Stéphane Caut

    Full Text Available BACKGROUND: The leatherback turtle (Dermochelys coriacea) has undergone a dramatic decline over the last 25 years, and this is believed to be primarily the result of mortality associated with fisheries bycatch followed by egg and nesting female harvest. Atlantic leatherback turtles undertake long migrations across ocean basins from subtropical and tropical nesting beaches to productive frontal areas. Migration between two nesting seasons can last 2 or 3 years, a time period termed the remigration interval (RI). Recent satellite transmitter data revealed that Atlantic leatherbacks follow two major dispersion patterns after nesting season, through the North Gulf Stream area or more eastward across the North Equatorial Current. However, information on the whole RI is lacking, precluding the accurate identification of feeding areas where conservation measures may need to be applied. METHODOLOGY/PRINCIPAL FINDINGS: Using stable isotopes as dietary tracers we determined the characteristics of feeding grounds of leatherback females nesting in French Guiana. During migration, 3-year RI females differed from 2-year RI females in their isotope values, implying differences in their choice of feeding habitats (offshore vs. more coastal) and foraging latitude (North Atlantic vs. West African coasts, respectively). Egg-yolk and blood isotope values are correlated in nesting females, indicating that egg analysis is a useful tool for assessing isotope values in these turtles, including adults when not available. CONCLUSIONS/SIGNIFICANCE: Our results complement previous data on turtle movements during the first year following the nesting season, integrating the diet consumed during the year before nesting. We suggest that the French Guiana leatherback population segregates into two distinct isotopic groupings, and highlight the urgent need to determine the feeding habitats of the turtle in the Atlantic in order to protect this species from incidental take by

  18. Development of interactive software for fuel management analysis

    International Nuclear Information System (INIS)

    Graves, H.W. Jr.

    1986-01-01

    Electronic computation plays a central part in engineering analysis of all types. Utilization of microcomputers for calculations that were formerly carried out on large mainframe computers presents a unique opportunity to develop software that not only takes advantage of the lower cost of using these machines, but also increases the efficiency of the engineers performing these calculations. This paper reviews the use of electronic computers in engineering analysis, discusses the potential for microcomputer utilization in this area, and describes a series of steps to be followed in software development that can yield significant gains in engineering design efficiency

  19. A software package for biomedical image processing and analysis

    International Nuclear Information System (INIS)

    Goncalves, J.G.M.; Mealha, O.

    1988-01-01

    The decreasing cost of computing power and the introduction of low-cost imaging boards justify the increasing number of applications of digital image processing techniques in the area of biomedicine. There is, however, a large software gap to be filled between the application and the equipment. The requirements to bridge this gap are twofold: good knowledge of the hardware provided and its interface to the host computer, and expertise in digital image processing and analysis techniques. A software package incorporating these two requirements was developed using the C programming language, in order to create a user-friendly image processing programming environment. The software package can be considered in two different ways: as a data structure adapted to image processing and analysis, which acts as the backbone and the standard of communication for all the software; and as a set of routines implementing the basic algorithms used in image processing and analysis. Hardware dependency is restricted to a single module upon which all hardware calls are based. The data structure that was built has four main features: it is hierarchical, open, object oriented, and has object-dependent dimensions. Considering the vast amount of memory needed by imaging applications and the memory available in small imaging systems, an effective image memory management scheme was implemented. This software package has been used for more than one and a half years by users with different applications. It proved to be an excellent tool for helping people adapt to the system, and for standardizing and exchanging software, while preserving the flexibility that allows for users' specific implementations. The philosophy of the software package is discussed and the data structure that was built is described in detail

  20. Applications of the BEam Cross section Analysis Software (BECAS)

    DEFF Research Database (Denmark)

    Blasques, José Pedro Albergaria Amaral; Bitsche, Robert; Fedorov, Vladimir

    2013-01-01

    A newly developed framework is presented for the structural design and analysis of long slender beam-like structures, e.g., wind turbine blades. The framework is based on the BEam Cross section Analysis Software – BECAS – a finite element based cross section analysis tool. BECAS is used for the generation of beam finite element models which correctly account for effects stemming from material anisotropy and inhomogeneity in cross sections of arbitrary geometry. This type of modelling approach allows for an accurate yet computationally inexpensive representation of a general class of three...

  1. Development of Software for Measurement and Analysis of Solar Radiation

    International Nuclear Information System (INIS)

    Mohamad Idris Taib; Abul Adli Anuar; Noor Ezati Shuib

    2015-01-01

    This software is under development using LabVIEW for use with a StellarNet spectrometer system with USB communication to a computer. LabVIEW has capabilities in hardware interfacing, graphical user interfacing, and mathematical calculation, including array manipulation and processing. The software reads data from the StellarNet spectrometer in real time, which is then processed for analysis. Several measurements of solar radiation and analyses have been done. Solar radiation involves mainly infrared, visible light, and ultraviolet. With solar radiation spectrum data, information on weather and the suitability of plants can be gathered and analyzed. Furthermore, optimization of the utilization of solar radiation, and the corresponding safety precautions, can be planned. Using this software, more research and development in the utilization and safety of solar radiation can be explored. (author)
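
    The LabVIEW code is not shown in the record; as an assumed-equivalent sketch of the band analysis described, the following integrates a spectrum over ultraviolet, visible, and infrared wavelength ranges with the trapezoidal rule. The spectrum itself is synthetic.

```python
import numpy as np

# Synthetic spectrum standing in for spectrometer output:
# wavelengths (nm) and spectral irradiance (W/m^2/nm).
wavelength_nm = np.linspace(280, 1100, 821)
irradiance = np.interp(wavelength_nm, [280, 500, 1100], [0.2, 1.4, 0.6])

bands = {"UV": (280, 400), "visible": (400, 700), "IR": (700, 1100)}
for name, (lo, hi) in bands.items():
    sel = (wavelength_nm >= lo) & (wavelength_nm <= hi)
    power = np.trapz(irradiance[sel], wavelength_nm[sel])   # W/m^2 in band
    print(f"{name}: {power:.1f} W/m^2")
```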

  2. Evaluation of peak-fitting software for gamma spectrum analysis

    International Nuclear Information System (INIS)

    Zahn, Guilherme S.; Genezini, Frederico A.; Moralles, Mauricio

    2009-01-01

    In all applications of gamma-ray spectroscopy, one of the most important and delicate parts of the data analysis is the fitting of the gamma-ray spectra, where information such as the number of counts, the position of the centroid, and the width is associated with each peak of each spectrum. There is a wide choice of computer programs that perform this type of analysis, and the ones most commonly used in routine work are those that automatically locate and fit the peaks. This fit can be made in several different ways: the most common are to fit a Gaussian function to each peak or simply to integrate the area under the peak, but some programs go much further and include several small corrections to the simple Gaussian peak function in order to compensate for secondary effects. In this work, several gamma-ray spectroscopy programs are compared on the task of finding and fitting the gamma-ray peaks in spectra taken with standard sources of ¹³⁷Cs, ⁶⁰Co, ¹³³Ba and ¹⁵²Eu. The results show that all of the automatic programs can be properly used for finding and fitting peaks, with the exception of GammaVision; it was also possible to verify that the automatic peak-fitting programs performed as well as, and sometimes even better than, a manual peak-fitting program. (author)
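
    As a minimal example of the Gaussian peak fitting being compared, the sketch below fits a Gaussian plus flat background to a synthetic peak region with scipy; it stands in for, and is far simpler than, the packages evaluated in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, area, centroid, sigma, bkg):
    """Gaussian peak on a flat background, in spectrum channels."""
    return area / (sigma * np.sqrt(2 * np.pi)) * np.exp(
        -0.5 * ((x - centroid) / sigma) ** 2) + bkg

# Synthetic peak region standing in for a 137Cs full-energy peak.
ch = np.arange(640, 700, dtype=float)
counts = gaussian(ch, 5e4, 662.0, 2.5, 0.0) + np.random.poisson(40, ch.size)

popt, pcov = curve_fit(gaussian, ch, counts, p0=[4e4, 660.0, 3.0, 30.0])
area, centroid, sigma, bkg = popt
print(f"centroid = {centroid:.2f} ch, net area = {area:.0f} counts")
```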

  3. Isotope analysis (δ13C) of pulpy whole apple juice

    Directory of Open Access Journals (Sweden)

    Ricardo Figueira

    2011-09-01

    The objectives of this study were to develop a method of isotope analysis to quantify the carbon of the C3 photosynthetic cycle in pulpy whole apple juice and to establish legal limits based on Brazilian legislation in order to identify beverages that do not conform to the requirements of the Ministry of Agriculture, Livestock and Food Supply (MAPA). This beverage was produced in a laboratory according to the Brazilian law. Pulpy juices adulterated by the addition of sugarcane were also produced. The isotope analyses measured the relative isotope enrichment of the juices, their pulpy fractions (internal standard) and purified sugar. From those results, the quantity of C3 source was estimated by means of the isotope dilution equation. To determine the existence of adulteration in commercial juices, it was necessary to establish a legal limit according to the Brazilian law. Three brands of commercial juices were analyzed; one was classified as adulterated. The legal limit made it possible to clearly identify the juice that was not in conformity with the Brazilian law. The methodology developed proved efficient for quantifying the carbon of C3 origin in commercial pulpy apple juices.
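
    The isotope dilution step reduces to a two-end-member carbon mass balance. A minimal sketch, with illustrative end-member δ13C values (C3 plants such as apple near -27‰, C4 plants such as sugarcane near -12‰) rather than the paper's calibrated ones:

```python
def c3_fraction(delta_sample, delta_c3=-27.0, delta_c4=-12.0):
    """Two-end-member carbon isotope mass balance.

    delta_sample: measured d13C of the juice sugar (permil vs. VPDB).
    Returns the estimated fraction of carbon from the C3 (apple) source.
    """
    return (delta_sample - delta_c4) / (delta_c3 - delta_c4)

# Sugar added from sugarcane (C4) pulls the juice toward the C4 end-member:
print(f"{c3_fraction(-22.5):.1%} of the sugar carbon is of C3 (apple) origin")
```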

  4. UTOOLS: microcomputer software for spatial analysis and landscape visualization.

    Science.gov (United States)

    Alan A. Ager; Robert J. McGaughey

    1997-01-01

    UTOOLS is a collection of programs designed to integrate various spatial data in a way that allows versatile spatial analysis and visualization. The programs were designed for watershed-scale assessments in which a wide array of resource data must be integrated, analyzed, and interpreted. UTOOLS software combines raster, attribute, and vector data into "spatial...

  5. Using Business Analysis Software in a Business Intelligence Course

    Science.gov (United States)

    Elizondo, Juan; Parzinger, Monica J.; Welch, Orion J.

    2011-01-01

    This paper presents an example of a project used in an undergraduate business intelligence class which integrates concepts from statistics, marketing, and information systems disciplines. SAS Enterprise Miner software is used as the foundation for predictive analysis and data mining. The course culminates with a competition and the project is used…

  6. Application of software technology to automatic test data analysis

    Science.gov (United States)

    Stagner, J. R.

    1991-01-01

    The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.

  7. WinDAM C earthen embankment internal erosion analysis software

    Science.gov (United States)

    Two primary causes of dam failure are overtopping and internal erosion. For the purpose of evaluating dam safety for existing and proposed earthen embankment dams, the Windows Dam Analysis Modules C (WinDAM C) software will simulate either internal erosion or erosion resulting from overtopping.

  8. ANALYSIS OF CONTEMPORARY SOFTWARE BEING USED FOR FORWARDING SERVICES

    Directory of Open Access Journals (Sweden)

    Naumov, V.

    2013-01-01

    The role of information technologies in forwarding services is specified. The typical structure of the logistic sites providing the search for requests of freight owners and carriers is described. An analysis of the software used by transportation companies was conducted, and promising directions for improving the freight forwarding process are identified.

  9. A Knowledge-based Environment for Software Process Performance Analysis

    Directory of Open Access Journals (Sweden)

    Natália Chaves Lessa Schots

    2015-08-01

    Background: Process performance analysis is a key step for implementing continuous improvement in software organizations. However, the knowledge needed to execute such analysis is not trivial, and the person responsible for executing it must be provided with appropriate support. Aim: This paper presents a knowledge-based environment, named SPEAKER, proposed for supporting software organizations during the execution of process performance analysis. SPEAKER comprises a body of knowledge and a set of activities and tasks for software process performance analysis, along with supporting tools for executing these activities and tasks. Method: We conducted an informal literature review and a systematic mapping study, which provided the basic requirements for the proposed environment. We then implemented the SPEAKER environment, integrating supporting tools for the execution of the activities and tasks of performance analysis with the knowledge necessary to execute them, in order to accommodate the variability presented by the characteristics of these activities. Results: In this paper, we describe each SPEAKER module and the individual evaluations of these modules, and also present an example of use showing how the environment can guide the user through a specific performance analysis activity. Conclusion: Although we have only conducted individual evaluations of SPEAKER's modules, the example of use indicates the feasibility of the proposed environment. Therefore, the environment as a whole will be further evaluated to verify whether it attains its goal of assisting non-specialists in the execution of process performance analysis.

  10. Quantitative mass-spectrometric analysis of hydrogen helium isotope mixtures

    International Nuclear Information System (INIS)

    Langer, U.

    1998-12-01

    This work deals with the mass-spectrometric method for the quantitative analysis of hydrogen-helium isotope mixtures, with special attention to fusion plasma diagnostics. The aim was to use low-resolution mass spectrometry, a standard measuring method which is well established in science and industry. This task is solved by means of vector mass spectrometry, in which a mass spectrum is measured repeatedly, but with stepwise variation of the parameter settings of a quadrupole mass spectrometer. In this way, interfering mass spectra can be decomposed and, moreover, it is possible to analyze otherwise underdetermined mass spectra of complex hydrogen-helium isotope mixtures. Experimental investigations are presented which show that several different parameters are suitable for this method. With an optimal choice of the parameter settings, hydrogen-helium isotope mixtures can be analyzed with an accuracy of 1-3%. In practice, a low sensitivity for small helium concentrations has to be noted. To cope with this, a method for selective hydrogen pressure reduction has been developed. Experimental investigations and calculations show that small helium amounts (about 1%) in a hydrogen atmosphere can be analyzed with an accuracy of 3-10%. Finally, this work deals with the effects of the measuring and calibration errors on the resulting error in spectrum decomposition. This aspect has been investigated both for general mass-spectrometric gas analysis and for the analysis of hydrogen-helium mixtures by means of vector mass spectrometry. (author)
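
    The decomposition at the heart of vector mass spectrometry is a stacked linear system: recording the same peaks at several parameter settings makes species that overlap at a single setting (such as D2 and 4He at m/z 4) separable by least squares. A toy sketch with invented sensitivity coefficients (real ones come from calibration):

```python
import numpy as np

# Rows: (instrument setting, m/z) pairs; columns: H2, D2, 4He.
# The sensitivity values are illustrative only.
A = np.array([
    [1.00, 0.00, 0.00],   # setting 1, m/z 2
    [0.00, 0.95, 0.80],   # setting 1, m/z 4 (D2 and 4He overlap)
    [0.90, 0.00, 0.00],   # setting 2, m/z 2
    [0.00, 0.40, 0.85],   # setting 2, m/z 4 (different relative response)
])
true_conc = np.array([0.70, 0.29, 0.01])
signal = A @ true_conc + 1e-3 * np.random.default_rng(1).normal(size=4)

# Least-squares decomposition of the stacked spectra
conc, *_ = np.linalg.lstsq(A, signal, rcond=None)
print("recovered concentrations:", conc.round(4))
```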

  11. Synchronized analysis of testbeam data with the Judith software

    CERN Document Server

    McGoldrick, Garrin; Gorišek, Andrej

    2014-01-01

    The Judith software performs pixel detector analysis tasks utilizing two different data streams such as those produced by the reference and tested devices typically found in a testbeam. This software addresses and fixes problems arising from the desynchronization of the two simultaneously triggered data streams by detecting missed triggers in either of the streams. The software can perform all tasks required to generate particle tracks using multiple detector planes: it can align the planes, cluster hits and generate tracks from these clusters. This information can then be used to measure the properties of a particle detector with very fine spatial resolution. It was tested at DESY in the Kartel telescope, a silicon tracking detector, with ATLAS Diamond Beam Monitor modules as a device under test.

  12. Carbon isotopic analysis of atmospheric methane by isotope-ratio-monitoring gas chromatography-mass spectrometry

    Science.gov (United States)

    Merritt, Dawn A.; Hayes, J. M.; Des Marais, David J.

    1995-01-01

    Less than 15 min are required for the determination of δ13C (vs. PDB) with a precision of 0.2 ppt (1σ, single measurement) in 5-mL samples of air containing CH4 at natural levels (1.7 ppm). An analytical system is described that includes a sample-introduction unit incorporating a preparative gas chromatograph (GC) column for separation of CH4 from N2, O2, and Ar. The 15-min procedure includes time for operation of that system, high-resolution chromatographic separation of the CH4, on-line combustion and purification of the products, and isotopic calibration. Analyses of standards demonstrate that systematic errors are absent and that there is no dependence of observed δ values on sample size. For samples containing 100 ppm or more CH4, preconcentration is not required and the analysis time is less than 5 min. The system utilizes a commercially available, high-sensitivity isotope-ratio mass spectrometer. For optimal conditions of sample handling and combustion, performance of the system is within a factor of 2 of the shot-noise limit. The potential therefore exists for analysis of samples as small as 15 pmol CH4 with a standard deviation of less than 1 ppt.
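
    The reported quantity is the conventional delta value, i.e. the per-mil deviation of the measured 13C/12C ratio from the PDB standard. A minimal sketch (the sample ratio below is an illustrative number typical of atmospheric methane):

```python
R_VPDB = 0.0112372  # 13C/12C ratio of the VPDB standard

def delta13C(ratio_sample, ratio_standard=R_VPDB):
    """Delta value in per mil relative to the standard."""
    return (ratio_sample / ratio_standard - 1.0) * 1000.0

print(f"{delta13C(0.0107059):.1f} permil")  # about -47, typical of atmospheric CH4
```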

  13. One-Click Data Analysis Software for Science Operations

    Science.gov (United States)

    Navarro, Vicente

    2015-12-01

    One of the important activities of the ESA Science Operations Centre is to provide Data Analysis Software (DAS) to enable users and scientists to process data further to higher levels. During operations and post-operations, DAS is fully maintained and updated for new OS and library releases. Nonetheless, once a mission goes into the "legacy" phase, there are very limited funds and long-term preservation becomes more and more difficult. Building on Virtual Machine (VM), cloud computing and Software as a Service (SaaS) technologies, this project has aimed at providing long-term preservation of Data Analysis Software for the following missions: PIA for ISO (1995), SAS for XMM-Newton (1999), HIPE for Herschel (2009) and EXIA for EXOSAT (1983). The following goals have guided the architecture: support for all operations, post-operations and archive/legacy phases; support for local (user's computer) and cloud environments (ESAC-Cloud, Amazon AWS); support for expert users, requiring full capabilities; and provision of a simple web-based interface. This talk describes the architecture, challenges, results and lessons learnt in this project.

  14. Application of econometric and ecology analysis methods in physics software

    Science.gov (United States)

    Han, Min Cheol; Hoff, Gabriela; Kim, Chan Hyeong; Kim, Sung Hun; Grazia Pia, Maria; Ronchieri, Elisabetta; Saracco, Paolo

    2017-10-01

    Some data analysis methods typically used in econometric studies and in ecology have been evaluated and applied in physics software environments. They concern the evolution of observables through objective identification of change points and trends, and measurements of inequality, diversity and evenness across a data set. Within each analysis area, various statistical tests and measures have been examined. This conference paper gives a brief overview of some of these methods.

  15. Comparison of two three-dimensional cephalometric analysis computer software.

    Science.gov (United States)

    Sawchuk, Dena; Alhadlaq, Adel; Alkhadra, Thamer; Carlyle, Terry D; Kusnoto, Budi; El-Bialy, Tarek

    2014-10-01

    Three-dimensional cephalometric analyses are attracting increasing attention in orthodontics. The aim of this study was to compare two software packages for evaluating three-dimensional cephalometric analyses of orthodontic treatment outcomes. Twenty cone beam computed tomography images were obtained using the i-CAT(®) imaging system from patients' records as part of their regular orthodontic records. The images were analyzed using InVivoDental5.0 (Anatomage Inc.) and 3DCeph™ (University of Illinois at Chicago, Chicago, IL, USA) software. Data from before and after orthodontic treatment were analyzed using a t-test. Reliability testing using the intraclass correlation coefficient gave stronger results for InVivoDental5.0 (0.83-0.98) than for 3DCeph™ (0.51-0.90). A paired t-test comparison of the two packages showed no statistically significant difference in the measurements made with them, in either linear or angular measurements. InVivoDental5.0 measurements are more reproducible and the software is more user friendly than 3DCeph™, which is more time-consuming in performing three-dimensional analysis.

  16. Cavity Ring-down Spectroscopy for Carbon Isotope Analysis with 2 μm Diode Laser

    International Nuclear Information System (INIS)

    Hiromoto, K.; Tomita, H.; Watanabe, K.; Kawarabayashi, J.; Iguchi, T.

    2009-01-01

    We have built a prototype instrument based on cavity ring-down spectroscopy (CRDS) with a 2 μm diode laser for carbon isotope analysis of CO2 in air. The carbon isotope ratio was determined to be (1.085 ± 0.012) × 10^-2, in good agreement, within uncertainty, with the isotope ratio measured by a magnetic sector-type mass spectrometer. We have thus demonstrated carbon isotope analysis based on CRDS with a 2 μm tunable diode laser.
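
    In CRDS the measured quantity is the ring-down time constant: an absorber shortens it, and the absorption coefficient follows from the difference of decay rates with and without the absorber, alpha = (1/c)(1/tau - 1/tau0). The isotope ratio then comes from comparing absorptions on 12CO2 and 13CO2 lines. A sketch of the decay-fitting step on synthetic data:

```python
import numpy as np
from scipy.optimize import curve_fit

C_LIGHT = 2.9979e8  # speed of light, m/s

def ringdown(t, amp, tau, offset):
    return amp * np.exp(-t / tau) + offset

def fit_tau(t, signal):
    p0 = [signal.max() - signal.min(), t[len(t) // 3], signal.min()]
    popt, _ = curve_fit(ringdown, t, signal, p0=p0)
    return popt[1]

# Synthetic decays: empty-cavity ring-down (tau0) and absorber-shortened (tau)
t = np.linspace(0.0, 200e-6, 500)
rng = np.random.default_rng(2)
tau0 = fit_tau(t, ringdown(t, 1.0, 40e-6, 0.0) + 1e-3 * rng.normal(size=t.size))
tau = fit_tau(t, ringdown(t, 1.0, 32e-6, 0.0) + 1e-3 * rng.normal(size=t.size))

alpha = (1.0 / C_LIGHT) * (1.0 / tau - 1.0 / tau0)  # absorption coefficient, m^-1
print(f"tau0={tau0 * 1e6:.1f} us  tau={tau * 1e6:.1f} us  alpha={alpha:.3e} m^-1")
```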

  17. Decision Engines for Software Analysis Using Satisfiability Modulo Theories Solvers

    Science.gov (United States)

    Bjorner, Nikolaj

    2010-01-01

    The area of software analysis, testing and verification is now undergoing a revolution thanks to the use of automated and scalable support for logical methods. A well-recognized premise is that at the core of software analysis engines is invariably a component using logical formulas for describing states and transformations between system states. The process of using this information for discovering and checking program properties (including such important properties as safety and security) amounts to automatic theorem proving. In particular, theorem provers that directly support common software constructs offer a compelling basis. Such provers are commonly called satisfiability modulo theories (SMT) solvers. Z3 is a state-of-the-art SMT solver developed at Microsoft Research. It can be used to check the satisfiability of logical formulas over one or more theories such as arithmetic, bit-vectors, lists, records and arrays. The talk describes some of the technology behind modern SMT solvers, including Z3. Z3 is currently mainly targeted at solving problems that arise in software analysis and verification. It has been applied in various contexts, such as systems for dynamic symbolic simulation (Pex, SAGE, Vigilante), program verification and extended static checking (Spec#/Boogie, VCC, HAVOC), software model checking (Yogi, SLAM), model-based design (FORMULA), security protocol code (F7), and program run-time analysis and invariant generation (VS3). We will describe how it integrates support for a variety of theories that arise naturally in the context of these applications. There are several promising new avenues, and the talk will touch on some of these and on the challenges related to SMT solvers.

  18. Hanford Isotope Project strategic business analysis Cesium-137 (Cs-137)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-10-01

    The purpose of this business analysis is to address the beneficial reuse of Cesium-137 (Cs-137) in order to utilize a valuable national asset and possibly save millions of tax dollars. Food irradiation is the front-runner application, along with other uses. This business analysis supports the objectives of the Department of Energy National Isotope Strategy distributed in August 1994, which describes the DOE plans for the production and distribution of isotope products and services as part of the Department's mission as stated in that document: 'The Department of Energy will also continue to produce and distribute other radioisotopes and enriched stable isotopes for medical diagnostics and therapeutics, industrial, agricultural, and other useful applications on a businesslike basis. This is consistent with the goals and objectives of the National Performance Review. The Department will endeavor to look at opportunities for the private sector to co-fund or invest in new ventures. Also, the Department will seek to divest from ventures that can more profitably or reliably be operated by the private sector.'

  19. Hanford Isotope Project strategic business analysis Cesium-137 (Cs-137)

    International Nuclear Information System (INIS)

    1995-10-01

    The purpose of this business analysis is to address the beneficial reuse of Cesium-137 (Cs-137) in order to utilize a valuable national asset and possibly save millions of tax dollars. Food irradiation is the front-runner application, along with other uses. This business analysis supports the objectives of the Department of Energy National Isotope Strategy distributed in August 1994, which describes the DOE plans for the production and distribution of isotope products and services as part of the Department's mission as stated in that document: ''The Department of Energy will also continue to produce and distribute other radioisotopes and enriched stable isotopes for medical diagnostics and therapeutics, industrial, agricultural, and other useful applications on a businesslike basis. This is consistent with the goals and objectives of the National Performance Review. The Department will endeavor to look at opportunities for the private sector to co-fund or invest in new ventures. Also, the Department will seek to divest from ventures that can more profitably or reliably be operated by the private sector.''

  20. Isotopic analysis of uranium hexafluoride highly enriched in U-235

    International Nuclear Information System (INIS)

    Chaussy, L.; Boyer, R.

    1968-01-01

    Isotopic analysis of uranium in the form of the hexafluoride by mass spectrometry gives raw results which are not very accurate. Using a linear interpolation method applied to two standards, it is possible to correct for this inaccuracy as long as the isotopic concentrations are less than about 10 per cent in U-235. Above this level, the interpolation formula overestimates the results, especially if the enrichment of the analyzed samples is higher than 1.3 with respect to the standards. A formula is proposed for correcting the interpolation equation and for extending its field of application to high values of the enrichment (≅2) and of the concentration. It is shown that, with this correction, the accuracy of the results depends practically only on that of the standards, taking into account the dispersion in the measurements. (authors)
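
    The two-standard interpolation being corrected amounts to a linear map from measured to certified ratios. A sketch of that baseline step (the ratio values are hypothetical, and the paper's correction term for high enrichments is not reproduced here):

```python
def two_standard_correction(ratio_measured, std1, std2):
    """Linear interpolation between two isotopic standards.

    std1, std2: (measured, certified) ratio pairs bracketing the sample.
    """
    (m1, t1), (m2, t2) = std1, std2
    slope = (t2 - t1) / (m2 - m1)
    return t1 + slope * (ratio_measured - m1)

# Hypothetical measured vs. certified 235U/238U ratios for two standards
print(two_standard_correction(0.0300, (0.0205, 0.0202), (0.0410, 0.0404)))
```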

  1. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). The tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify the relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. To assess its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), an area of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
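
    Knickpoint detection of this kind ultimately reduces to flagging anomalies in a stream-gradient index along longitudinal profiles extracted from the DEM. A simplified sketch using Hack's (1973) SL index on a synthetic profile (the 2x-median threshold is an illustrative choice, not the tool's actual rule):

```python
import numpy as np

def sl_index(dist_from_source, elevation):
    """Hack (1973) stream-length gradient index along a longitudinal profile.

    Inputs are ordered from headwater to mouth; segments whose SL greatly
    exceeds the profile median are knickpoint candidates.
    """
    d = np.asarray(dist_from_source, dtype=float)
    z = np.asarray(elevation, dtype=float)
    gradient = -np.diff(z) / np.diff(d)       # channel slope per segment
    midpoint = 0.5 * (d[:-1] + d[1:])         # distance to segment midpoint
    return gradient * midpoint

# Synthetic concave profile with one over-steepened reach (a knickzone)
d = np.linspace(100.0, 10000.0, 200)
z = 800.0 * np.exp(-d / 4000.0)
z[120:126] -= np.linspace(0.0, 25.0, 6)       # local steepening

sl = sl_index(d, z)
candidates = np.where(sl > 2.0 * np.median(sl))[0]
print("candidate knickpoint segments:", candidates)
```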

  2. Stable isotope analysis of Dacryoconarid carbonate microfossils: a new tool for Devonian oxygen and carbon isotope stratigraphy.

    Science.gov (United States)

    Frappier, Amy Benoit; Lindemann, Richard H; Frappier, Brian R

    2015-04-30

    Dacryoconarids are extinct marine zooplankton known from abundant, globally distributed calcite microfossils in the Devonian, but their shell stable isotope composition has not been previously explored. Devonian stable isotope stratigraphy is currently limited to less common invertebrates or bulk rock analyses of uncertain provenance. As with Cenozoic planktonic foraminifera, isotopic analysis of dacryoconarid shells could facilitate higher-resolution, geographically widespread stable isotope records of paleoenvironmental change, including marine hypoxia events, climate changes, and biocrises. We explored the use of dacryoconarid isotope stratigraphy as a viable method for interpreting paleoenvironments. We applied an established method for determining stable isotope ratios (δ13C, δ18O values) of small carbonate microfossils to very well-preserved dacryoconarid shells. We analyzed individual calcite shells representing five common genera using a Kiel carbonate device coupled to a MAT 253 isotope ratio mass spectrometer. Calcite shell δ13C and δ18O values were compared by taxonomic group, rock unit, and locality. Single dacryoconarid calcite shells are suitable for stable isotope analysis using a Kiel-IRMS setup. The dacryoconarid shell δ13C values (-4.7 to 2.3‰) and δ18O values (-10.3 to -4.8‰) were consistent across taxa, independent of shell size or part, but varied systematically through time. Lower fossil δ18O values were associated with warmer water temperatures, and more variable δ13C values were associated with major bioevents. Dacryoconarid δ13C and δ18O values differed from bulk rock carbonate values. Individual dacryoconarid microfossil δ13C and δ18O values are highly sensitive to paleoenvironmental changes, thus providing a promising avenue for stable isotope chemostratigraphy to better resolve regional to global paleoceanographic changes throughout the upper Silurian to the upper Devonian. Our results

  3. STARS software tool for analysis of reliability and safety

    International Nuclear Information System (INIS)

    Poucet, A.; Guagnini, E.

    1989-01-01

    This paper reports on the STARS (Software Tool for the Analysis of Reliability and Safety) project, which aims at developing an integrated set of computer-aided reliability analysis tools for the various tasks involved in systems safety and reliability analysis, including hazard identification, qualitative analysis, and logic model construction and evaluation. Expert system technology offers the most promising perspective for developing such a tool; combined with graphics and analysis capabilities, it can provide a natural, engineering-oriented environment for computer-assisted reliability and safety modelling and analysis. For hazard identification and fault tree construction, a frame/rule based expert system is used, in which the deductive (goal driven) reasoning and the heuristics applied during manual fault tree construction are modelled. Expert systems can explain their reasoning, so that the analyst can become aware of why and how results are being obtained. Hence, the learning aspect involved in manual reliability and safety analysis can be maintained and improved.

  4. Pressurizer pump reliability analysis high flux isotope reactor

    International Nuclear Information System (INIS)

    Merryman, L.; Christie, B.

    1993-01-01

    During a prolonged outage from November 1986 to May 1990, numerous changes were made at the High Flux Isotope Reactor (HFIR). Some of these changes involved the pressurizer pumps. An analysis was performed to calculate the impact of these changes on the pressurizer system availability. The analysis showed that the availability of the pressurizer system dropped from essentially 100% to approximately 96%. The primary reason for the decrease is that, with the present pressurizer pump configuration, off-site power grid disturbances sometimes result in a reactor trip. Changes are being made to the present pressurizer pump configuration to regain some of the lost availability.

  5. Optimization and Validation of the Developed Uranium Isotopic Analysis Code

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. H.; Kang, M. Y.; Kim, Jinhyeong; Choi, H. D. [Seoul National Univ., Seoul (Korea, Republic of)

    2014-10-15

    γ-ray spectroscopy is a representative non-destructive assay for nuclear material that is less time-consuming and less expensive than destructive analysis methods. Destructive techniques are more precise than NDA techniques; however, correction algorithms can improve the performance of γ-spectroscopy. For this reason, an analysis code for uranium isotopic analysis has been developed by the Applied Nuclear Physics Group at Seoul National University. Overlapping γ- and x-ray peaks in the 89-101 keV Xα region are fitted with Gaussian and Lorentzian peak functions plus tail and background functions. In this study, optimizations of the full-energy peak efficiency calibration and of the fitting parameters for peak tail and background are performed and validated with 24-hour acquisitions of CRM uranium samples. The analysis performance is improved for HEU samples, but further optimization of the fitting parameters is required for LEU sample analysis. In the future, optimization of the fitting parameters for various types of uranium samples will be studied, and 234U isotopic analysis algorithms and correction algorithms (for coincidence and self-attenuation effects) will be developed.

  6. STAMPS: development and verification of swallowing kinematic analysis software.

    Science.gov (United States)

    Lee, Woo Hyung; Chun, Changmook; Seo, Han Gil; Lee, Seung Hak; Oh, Byung-Mo

    2017-10-17

    Swallowing impairment is a common complication in various geriatric and neurodegenerative diseases. Swallowing kinematic analysis is essential to quantitatively evaluate the swallowing motion of the oropharyngeal structures. This study aims to develop a novel swallowing kinematic analysis software package, called the spatio-temporal analyzer for motion and physiologic study (STAMPS), and to verify its validity and reliability. STAMPS was developed in MATLAB, one of the most popular platforms for biomedical analysis. The software was constructed to acquire, process, and analyze swallowing motion data. The target swallowing structures include bony structures (hyoid bone, mandible, maxilla, and cervical vertebral bodies), cartilages (epiglottis and arytenoid), soft tissues (larynx and upper esophageal sphincter), and the food bolus. Numerous functions are available for the spatiotemporal parameters of the swallowing structures. Testing for validity and reliability was performed in 10 dysphagia patients with diverse etiologies and using an instrumental swallowing model designed to mimic the motion of the hyoid bone and the epiglottis. The intra- and inter-rater reliability tests showed excellent agreement for displacement and moderate to excellent agreement for velocity. The Pearson correlation coefficients between the measured and instrumental reference values were nearly 1.00 (P < 0.001). The software is expected to be useful for researchers who are interested in swallowing motion analysis.

  7. Effectiveness of an Automatic Tracking Software in Underwater Motion Analysis

    Directory of Open Access Journals (Sweden)

    Fabrício A. Magalhaes

    2013-12-01

    Tracking of markers placed on anatomical landmarks is a common practice in sports science to perform the kinematic analysis that interests both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of a software package developed for automatic tracking of underwater movements (DVP), based on the Kanade-Lucas-Tomasi feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 markers' positions) were manually tracked to determine the markers' center coordinates. Then, the videos were automatically tracked using DVP and a commercially available software package (COM). Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and to correct the position of the cursor whenever the distance between the calculated marker coordinate and the reference one was higher than 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, manual interventions were 10.4% lower for DVP (7.4%) than for COM (17.8%). Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing by type of marker rather than type of exercise, with 9.9% fewer manual interventions for DVP than for COM. In conclusion, based on these results, the automatic tracking software presented here can be used as a valid and useful tool for underwater motion analysis.
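
    DVP itself is not distributed with the abstract, but the Kanade-Lucas-Tomasi step it builds on is available in OpenCV. A sketch of pyramidal KLT tracking of two manually seeded markers (the file name and seed coordinates are hypothetical):

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("swimming_trial.avi")      # hypothetical recording
ok, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Initial marker positions in pixels, e.g. clicked by the operator
points = np.array([[[320.0, 240.0]], [[402.0, 251.0]]], dtype=np.float32)
trajectory = [points.reshape(-1, 2).copy()]

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    points, status, err = cv2.calcOpticalFlowPyrLK(
        prev_gray, gray, points, None, winSize=(21, 21), maxLevel=3)
    # A real tool would prompt the operator here when the residual is large,
    # mirroring the 4-pixel manual-intervention criterion in the abstract.
    trajectory.append(points.reshape(-1, 2).copy())
    prev_gray = gray

print("tracked", len(trajectory), "frames")
```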

  8. Model Based Analysis and Test Generation for Flight Software

    Science.gov (United States)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  9. Hardware and software constructs for a vibration analysis network

    International Nuclear Information System (INIS)

    Cook, S.A.; Crowe, R.D.; Toffer, H.

    1985-01-01

    Vibration level monitoring and analysis has been initiated at N Reactor, the dual purpose reactor operated at Hanford, Washington by UNC Nuclear Industries (UNC) for the Department of Energy (DOE). The machinery to be monitored was located in several buildings scattered over the plant site, necessitating an approach using satellite stations to collect, monitor and temporarily store data. The satellite stations are, in turn, linked to a centralized processing computer for further analysis. The advantages of a networked data analysis system are discussed in this paper along with the hardware and software required to implement such a system

  10. Calcium Isotope Analysis with "Peak Cut" Method on Column Chemistry

    Science.gov (United States)

    Zhu, H.; Zhang, Z.; Liu, F.; Li, X.

    2017-12-01

    To eliminate isobaric interferences from elemental and molecular isobars (e.g., 40K+, 48Ti+, 88Sr2+, 24Mg16O+, 27Al16O+) on Ca isotopes during mass determination, samples should be purified through ion-exchange column chemistry before analysis. However, large Ca isotopic fractionation has been observed during column chemistry (Russell and Papanastassiou, 1978; Zhu et al., 2016). Full recovery during column chemistry is therefore greatly needed, as poor recovery would otherwise introduce uncertainties (Zhu et al., 2016). At the same time, matrix effects can be enhanced by full recovery, as other elements might overlap with the Ca cut during column chemistry. Matrix effects and full recovery are difficult to balance, and both need to be considered for high-precision analysis of stable Ca isotopes. Here, we investigate the influence of poor recovery on δ44/40Ca using TIMS with the double spike technique. The δ44/40Ca values of IAPSO seawater, ML3B-G and BHVO-2 in different Ca subcuts (e.g., 0-20, 20-40, 40-60, 60-80, 80-100%) with 20% Ca recovery on column chemistry display limited variation after correction by the 42Ca-43Ca double spike technique with the exponential law. Notably, the δ44/40Ca of each Ca subcut is consistent, within error, with the δ44/40Ca of the Ca cut with full recovery. Our results indicate that the 42Ca-43Ca double spike technique properly corrects both the Ca isotopic fractionation that occurs during column chemistry and that which occurs during thermal ionization mass spectrometry (TIMS) determination, because both follow the exponential law well. Therefore, we propose the "peak cut" method for Ca column chemistry for samples with complex matrix effects. Briefly, for samples with low Ca contents, we can add the double spike before column chemistry, collect only the middle of the Ca eluate, and discard the two ends of the Ca eluate that might overlap with other elements (e.g., K, Sr). This method would
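
    The exponential law referred to above relates a measured ratio to the true one through the isotope masses, R_meas = R_true x (m_i/m_j)^beta. A minimal sketch of that relation (a full double-spike reduction solves for the spike proportion and beta simultaneously by iteration; the ratio values here are illustrative):

```python
import numpy as np

# Atomic masses of Ca isotopes (u)
m = {40: 39.962591, 42: 41.958618, 43: 42.958767, 44: 43.955482}

def exp_law_beta(r_meas, r_true, mi, mj):
    """Fractionation exponent from a ratio whose true value is known."""
    return np.log(r_meas / r_true) / np.log(mi / mj)

def exp_law_correct(r_meas, mi, mj, beta):
    """Exponential-law correction: R_true = R_meas / (mi/mj)**beta."""
    return r_meas / (mi / mj) ** beta

beta = exp_law_beta(r_meas=4.8031, r_true=4.7935, mi=m[42], mj=m[43])
print("corrected 44Ca/40Ca:", exp_law_correct(2.1000, m[44], m[40], beta))
```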

  11. Calibration Analysis Software for the ATLAS Pixel Detector

    CERN Document Server

    The ATLAS collaboration

    2016-01-01

    The calibration of the ATLAS Pixel detector at the LHC fulfils two main purposes: to tune the front-end configuration parameters to establish the best operational settings, and to measure the tuning performance through a subset of scans. An analysis framework has been set up to take actions on the detector given the outcome of a calibration scan (e.g. to create a mask for disabling noisy pixels). The software framework that controls all aspects of the Pixel detector scans and analyses is called the Calibration Console. The introduction of a new layer, equipped with new FE-I4 front-end chips, required an update of the Console architecture; it now handles scans, and scan analyses, applied together to chips with different characteristics. An overview of the newly developed calibration analysis software is presented, together with some preliminary results.

  12. Using Statistical Analysis Software to Advance Nitro Plasticizer Wettability

    Energy Technology Data Exchange (ETDEWEB)

    Shear, Trevor Allan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-29

    Statistical analysis in science is an extremely powerful tool that is often underutilized. Additionally, it is frequently the case that data is misinterpreted or not used to its fullest extent. Utilizing the advanced software JMP®, many aspects of experimental design and data analysis can be evaluated and improved. This overview will detail the features of JMP® and how they were used to advance a project, resulting in time and cost savings, as well as the collection of scientifically sound data. The project analyzed in this report addresses the inability of a nitro plasticizer to coat a gold coated quartz crystal sensor used in a quartz crystal microbalance. Through the use of the JMP® software, the wettability of the nitro plasticizer was increased by over 200% using an atmospheric plasma pen, ensuring good sample preparation and reliable results.

  13. Spectrum analysis on quality requirements consideration in software design documents.

    Science.gov (United States)

    Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji

    2013-12-01

    Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as source codes and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called "spectrum analysis for quality requirements" is applied not only to requirements specifications but also to design documents. The technique enables us to derive the spectrum of a document, and quality requirements considerations in the document are numerically represented in the spectrum. We can thus objectively identify whether the considerations of quality requirements in a requirements document are adapted to its design document. To validate the method, we applied it to commercial software systems with the help of a supporting tool, and we confirmed that the method worked well.
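
    As a loose illustration of the idea (not the authors' formulation), a document's "spectrum" can be modeled as the normalized frequency of quality-attribute terms, and a requirements document compared with its design document through the similarity of their spectra:

```python
import re
from collections import Counter
from math import sqrt

# Hypothetical term list; a real spectrum would use a vetted quality taxonomy.
QUALITY_TERMS = ["security", "performance", "usability", "reliability",
                 "availability", "maintainability", "portability"]

def spectrum(text):
    """Relative frequency of each quality term in the document."""
    words = Counter(re.findall(r"[a-z]+", text.lower()))
    total = sum(words[t] for t in QUALITY_TERMS) or 1
    return [words[t] / total for t in QUALITY_TERMS]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

req_doc = "The system shall ensure security and reliability of all transfers."
design_doc = "The design isolates the security layer; caching aids performance."
print("spectrum match:", round(cosine(spectrum(req_doc), spectrum(design_doc)), 3))
```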

  14. Characterization of phenols biodegradation by compound specific stable isotope analysis

    Science.gov (United States)

    Wei, Xi; Gilevska, Tetyana; Wenzig, Felix; Hans, Richnow; Vogt, Carsten

    2015-04-01

    -cresol degradation and 2.2±0.3‰ for m-cresol degradation, respectively. The carbon isotope fractionation patterns of phenol degradation differed more profoundly. Oxygen-dependent monooxygenation of phenol by A. calcoaceticus as the initial reaction yielded εC values of -1.5±0.02‰. In contrast, anaerobic degradation initiated by ATP-dependent carboxylation, performed by Thauera aromatica DSM 6984, produced no detectable fractionation (εC 0±0.1‰). D. cetonica showed a slight inverse carbon isotope fractionation (εC 0.4±0.1‰). In conclusion, a validated method for compound-specific stable isotope analysis was developed for phenolic compounds, and the first data set of carbon enrichment factors for the biodegradation of phenol and cresols with different activation mechanisms has been obtained in the present study. Carbon isotope fractionation analysis is a potentially powerful tool for monitoring the degradation of phenolic compounds in the environment.

  15. Evaluation of Kilauea Eruptions By Using Stable Isotope Analysis

    Science.gov (United States)

    Rahimi, K. E.; Bursik, M. I.

    2016-12-01

    Kilauea, on the island of Hawaii, is a large volcanic edifice with numerous named vents scattered across its surface. Halema`uma`u crater sits within Kilauea caldera, above the magma reservoir that is the main source of the lava feeding most vents on Kilauea volcano. Halema`uma`u crater produces basaltic explosive activity ranging from weak emission to sub-Plinian. Changes in eruption style are thought to be due either to the interplay between external water and magma (phreatomagmatic/phreatic eruptions) or to segregation of gas from magma at shallow depths (magmatic eruptions). Since there are three different eruption mechanisms (phreatomagmatic, phreatic, and magmatic), each eruption carries its own isotopic signature. The aim of this study is to evaluate the eruption mechanism by using stable isotope analysis. Studying D/H and δ18O ratios within fluid inclusions and volcanic glass will provide evidence of what drove an eruption. The results would determine the source of the water that drove an eruption by correlating the measured values with water sources (groundwater, rainwater, and magmatic water), since each water source has diagnostic D/H and δ18O values. These results will clarify the roles of volatiles in eruptions. The broader application of this research is that these methods could help volcanologists forecast and predict volcanic activity by monitoring changes in volatile concentrations within deposits.

  16. Software for a measuring facility for activation analysis

    International Nuclear Information System (INIS)

    De Keyser, A.; De Roost, E.

    1985-01-01

    A software package has been developed for an APPLE PC. The programs are intended to control an automated measuring station for photon activation analysis at GELINA, the linear accelerator of C.B.N.M. at Geel (Belgium). They allow the user to set up a measuring scheme, to execute it under computer control, to accumulate and store 2K spectra using a built-in ADC, and to output the results as listings, plots or evaluated reports.

  17. Phenomenology and Qualitative Data Analysis Software (QDAS): A Careful Reconciliation

    OpenAIRE

    Brian Kelleher Sohn

    2017-01-01

    An oft-cited phenomenological methodologist, Max VAN MANEN (2014), claims that qualitative data analysis software (QDAS) is not an appropriate tool for phenomenological research. Yet phenomenologists rarely describe how phenomenology is to be done: pencil, paper, computer? DAVIDSON and DI GREGORIO (2011) urge QDAS contrarians such as VAN MANEN to get over their methodological loyalties and join the digital world, claiming that all qualitative researchers, whatever their methodology, perform p...

  18. SIMA: Python software for analysis of dynamic fluorescence imaging data

    Directory of Open Access Journals (Sweden)

    Patrick Kaifosh

    2014-09-01

    Fluorescence imaging is a powerful method for monitoring dynamic signals in the nervous system. However, analysis of dynamic fluorescence imaging data remains burdensome, in part due to the shortage of available software tools. To address this need, we have developed SIMA, an open source Python package that facilitates common analysis tasks related to fluorescence imaging. Functionality of this package includes correction of motion artifacts occurring during in vivo imaging with laser-scanning microscopy, segmentation of imaged fields into regions of interest (ROIs), and extraction of signals from the segmented ROIs. We have also developed a graphical user interface (GUI) for manual editing of the automatically segmented ROIs and automated registration of ROIs across multiple imaging datasets. This software has been designed with flexibility in mind to allow for future extension with different analysis methods and potential integration with other packages. Software, documentation, and source code for the SIMA package and ROI Buddy GUI are freely available at http://www.losonczylab.org/sima/.

  19. Nuclear analysis software. Pt. 1: Spectrum transfer and reformatting (SPEDAC)

    International Nuclear Information System (INIS)

    1991-01-01

    GANAAS (Gamma, Activity, and Neutron Activation Analysis System) is one of a family of software packages developed under the auspices of the International Atomic Energy Agency. Primarily, the package was intended to support the IAEA Technical Assistance and Cooperation projects in developing countries. However, it is open domain software that can be copied and used by anybody, except for commercial purposes. All the nuclear analysis software provided by the IAEA has the same design philosophy and a similar structure. The intention was to provide the user with maximum flexibility, together with a simple and logical organization that requires minimum digging through the manuals. GANAAS is a modular system. It consists of several programmes that can be installed on the hard disk as they are needed. Obviously, some parts of the system are required in all cases; those are installed at the beginning, without consulting the operator. GANAAS offers the opportunity to expand and improve the system. Gamma spectrum evaluation programmes using different fitting algorithms can be added to GANAAS, under the condition that the format of their input and output files corresponds to the rules of GANAAS. The same applies to the quantitative analysis parts of the programme.

  20. Freud: a software suite for high-throughput simulation analysis

    Science.gov (United States)

    Harper, Eric; Spellings, Matthew; Anderson, Joshua; Glotzer, Sharon

    Computer simulation is an indispensable tool for the study of a wide variety of systems. As simulations scale to fill petascale and exascale supercomputing clusters, so too does the size of the data produced, as well as the difficulty in analyzing these data. We present Freud, an analysis software suite for efficient analysis of simulation data. Freud makes no assumptions about the system being analyzed, allowing general analysis methods to be applied to nearly any type of simulation. Freud includes standard analysis methods such as the radial distribution function, as well as new methods including the potential of mean force and torque and local crystal environment analysis. Freud combines a Python interface with fast, parallel C++ analysis routines to run efficiently on laptops, workstations, and supercomputing clusters. Data analysis on clusters reduces data transfer requirements, a prohibitive cost for petascale computing. Used in conjunction with simulation software, Freud allows for smart simulations that adapt to the current state of the system, enabling the study of phenomena such as nucleation and growth, intelligent investigation of phases and phase transitions, and determination of effective pair potentials.
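
    For orientation, the radial distribution function mentioned above can be written in a few lines of NumPy; Freud's own implementation is optimized, parallel C++, so this brute-force version is only a reference sketch:

```python
import numpy as np

def radial_distribution(points, box_length, r_max, bins=100):
    """g(r) for particles in a cubic periodic box (brute force)."""
    n = len(points)
    d = points[:, None, :] - points[None, :, :]
    d -= box_length * np.round(d / box_length)       # minimum-image convention
    r = np.sqrt((d ** 2).sum(-1))[np.triu_indices(n, k=1)]
    hist, edges = np.histogram(r, bins=bins, range=(0.0, r_max))
    rho = n / box_length ** 3
    shells = 4.0 / 3.0 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
    ideal = rho * shells * n / 2.0                   # expected pair counts
    return 0.5 * (edges[1:] + edges[:-1]), hist / ideal

rng = np.random.default_rng(3)
pts = rng.uniform(0.0, 10.0, size=(500, 3))
r, g = radial_distribution(pts, box_length=10.0, r_max=4.0)
print(g[-5:])  # tends to 1 for an ideal gas (noisy at small r, where pairs are few)
```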

  1. Analysis and separation of boron isotopes

    Energy Technology Data Exchange (ETDEWEB)

    Perie, M. [Commissariat a l'Energie Atomique, Saclay (France), Centre d'Etudes Nucleaires]

    1966-11-01

    The nuclear applications of boron-10 justify the study of a method for measuring its isotopic abundance, as well as the determination of very small traces of boron in different materials. A systematic study of the thermionic emission of BO2Na2+ has been carried out. In the presence of a slight excess of alkalis, the thermionic emission is considerably reduced. On the other hand, the addition to borax of a sodium hydroxide-glycerol (or mannitol) mixture makes it possible to obtain an intense and stable beam. These results have made it possible to establish a working method for the analysis of traces of boron by isotope dilution. Furthermore, the demand for boron-10 in the nuclear industry justifies the study of procedures for the separation of boron isotopes. A considerable isotope effect has been exhibited in the chemical exchange reaction between methyl borate and a borate salt in solution. In the case of exchange between methyl borate and sodium borate, the elementary separation factor is α = (11B/10B)vap / (11B/10B)liq = 1.033. The high value of this elementary effect has been multiplied in a distillation column in which the problem of regeneration of the reactive has been resolved. An alternative procedure, replacing the alkali borate by a borate of a volatile base, for example diethylamine, has also been studied (α = 1.025 in a hydro-methanolic medium containing 2.2 per cent water). (author)

  2. IMMAN: free software for information theory-based chemometric analysis.

    Science.gov (United States)

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis, denominated IMMAN (acronym for Information theory-based CheMoMetrics ANalysis), are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches each. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon entropy is discussed, and Jeffreys' information measure is introduced for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty, are incorporated into the IMMAN software (http://mobiosd-hub.com/imman-soft/), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing-value processing, dataset partitioning, and browsing. Moreover, single-parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, and comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA

  3. An ion beam analysis software based on ImageJ

    International Nuclear Information System (INIS)

    Udalagama, C.; Chen, X.; Bettiol, A.A.; Watt, F.

    2013-01-01

    The suite of techniques (RBS, STIM, ERDS, PIXE, IL, IF, …) available in ion beam analysis yields a variety of rich information. Typically, after the initial challenge of acquiring data, we are faced with the task of extracting relevant information or presenting the data in a format with the greatest impact. This process sometimes requires developing new software tools. When faced with such situations, the usual practice at the Centre for Ion Beam Applications (CIBA) in Singapore has been to use our computational expertise to develop ad hoc software tools as and when we need them. It then became apparent that the whole ion beam community could benefit from such tools; specifically, from a common software toolset that can be developed and maintained by everyone, with freedom to use and allowance to modify. In addition to the benefits of ready-made tools and sharing the onus of development, this also opens up the possibility for collaborators to access and analyse ion beam data without having to depend on an ion beam lab, which has the virtue of making ion beam techniques accessible to a broader scientific community. We have identified ImageJ as an appropriate software base on which to develop such a common toolset. In addition to being in the public domain and set up for collaborative tool development, ImageJ is accompanied by hundreds of modules (plugins) that allow great breadth in analysis. The present work is the first step towards integrating ion beam analysis into ImageJ. Some of the features of the current version of the ImageJ 'ion beam' plugin are: (1) reading list mode or event-by-event files, (2) energy gates/sorts, (3) sort stacks, (4) colour function, (5) real-time map updating, (6) real-time colour updating and (7) median and average map creation

  4. An ion beam analysis software based on ImageJ

    Energy Technology Data Exchange (ETDEWEB)

    Udalagama, C., E-mail: chammika@nus.edu.sg [Centre for Ion Beam Applications (CIBA), Department of Physics, National University of Singapore, 2 Science Drive 3, Singapore 117 542 (Singapore); Chen, X.; Bettiol, A.A.; Watt, F. [Centre for Ion Beam Applications (CIBA), Department of Physics, National University of Singapore, 2 Science Drive 3, Singapore 117 542 (Singapore)

    2013-07-01

    The suite of techniques (RBS, STIM, ERDS, PIXE, IL, IF, …) available in ion beam analysis yields a variety of rich information. Typically, after the initial challenge of acquiring data, we are faced with the task of extracting relevant information or presenting the data in a format with the greatest impact. This process sometimes requires developing new software tools. When faced with such situations, the usual practice at the Centre for Ion Beam Applications (CIBA) in Singapore has been to use our computational expertise to develop ad hoc software tools as and when we need them. It then became apparent that the whole ion beam community could benefit from such tools; specifically, from a common software toolset that can be developed and maintained by everyone, with freedom to use and allowance to modify. In addition to the benefits of ready-made tools and sharing the onus of development, this also opens up the possibility for collaborators to access and analyse ion beam data without having to depend on an ion beam lab, which has the virtue of making ion beam techniques accessible to a broader scientific community. We have identified ImageJ as an appropriate software base on which to develop such a common toolset. In addition to being in the public domain and set up for collaborative tool development, ImageJ is accompanied by hundreds of modules (plugins) that allow great breadth in analysis. The present work is the first step towards integrating ion beam analysis into ImageJ. Some of the features of the current version of the ImageJ 'ion beam' plugin are: (1) reading list mode or event-by-event files, (2) energy gates/sorts, (3) sort stacks, (4) colour function, (5) real-time map updating, (6) real-time colour updating and (7) median and average map creation.
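
    Features (1) and (2) - reading event-by-event data and applying energy gates to build maps - have a compact NumPy analogue, sketched here on synthetic events (the field names and gate window are invented for illustration):

```python
import numpy as np

# Synthetic event-by-event (list mode) data: (x, y, energy) per event
rng = np.random.default_rng(4)
events = np.empty(100000, dtype=[("x", "f4"), ("y", "f4"), ("energy", "f4")])
events["x"], events["y"] = rng.uniform(0.0, 256.0, (2, events.size))
events["energy"] = rng.normal(1500.0, 200.0, events.size)

# Energy gate: keep events inside a window, then bin them into a 2D map
lo, hi = 1400.0, 1600.0
gated = events[(events["energy"] >= lo) & (events["energy"] <= hi)]
elemental_map, _, _ = np.histogram2d(gated["x"], gated["y"],
                                     bins=256, range=[[0, 256], [0, 256]])
print(f"{gated.size} of {events.size} events fall inside the gate")
```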

  5. Analysis of signal acquisition in GPS receiver software

    Directory of Open Access Journals (Sweden)

    Vlada S. Sokolović

    2011-01-01

    This paper presents a critical analysis of the signal processing flow carried out in GPS receiver software, which serves as a basis for a critical comparison of different signal processing architectures within the GPS receiver. Increased flexibility and reduced commercial costs of GPS devices, including mobile devices, can be achieved by using software defined radio (SDR) technology. The SDR approach can be realized by replacing certain hardware components of a GPS receiver. Signal processing in the SDR is implemented using a programmable DSP (digital signal processor) or FPGA (Field Programmable Gate Array) circuit, which allows simple changes of the digital signal processing algorithms and of the receiver parameters. The starting point of the research is the signal generated on the satellite, whose structure is shown in the paper. Based on the GPS signal structure, a receiver is realized whose task is to extract the appropriate signal from the spectrum and detect it. From the collected navigation data, the receiver calculates the position of the end user. The signal coming from the satellite may be at the L1 or L2 carrier frequency. Since the Standard Positioning Service (SPS) is used for civil applications, all the tests shown in this work were performed on the L1 signal. The signal arriving at the receiver is generated with spread spectrum technology and is situated below the noise level. Such signals often interfere with signals from the environment, which makes proper detection and signal processing difficult for the receiver. Therefore, signal processing technology is continually being improved, aiming at more accurate and faster signal processing. All tests were carried out on a signal acquired from the satellite using the SE4110 input circuit, used for filtering, amplification and signal selection. The samples of the received signal were forwarded to a computer for data post-processing, i.e.
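
    Acquisition, the first stage such a software receiver implements, is commonly done as a parallel code-phase search: for each candidate Doppler frequency the carrier is wiped off and a circular correlation with the C/A code replica is computed via FFT. A self-contained sketch on synthetic data (a random ±1 sequence stands in for a real C/A code):

```python
import numpy as np

def acquire(signal, ca_code, fs, doppler_bins):
    """Parallel code-phase search over Doppler bins.

    signal  : ~1 ms of complex baseband samples
    ca_code : +/-1 code replica resampled to the same length
    Returns (doppler_hz, code_phase, peak) of the strongest correlation.
    """
    n = len(signal)
    t = np.arange(n) / fs
    code_fft = np.conj(np.fft.fft(ca_code))
    best = (0.0, 0, 0.0)
    for fd in doppler_bins:
        wiped = signal * np.exp(-2j * np.pi * fd * t)   # carrier/Doppler wipe-off
        corr = np.abs(np.fft.ifft(np.fft.fft(wiped) * code_fft))
        k = int(np.argmax(corr))
        if corr[k] > best[2]:
            best = (float(fd), k, float(corr[k]))
    return best

# Self-test: random code, known code phase (417 samples) and Doppler (2500 Hz)
fs, n = 2.048e6, 2048
code = np.sign(np.random.default_rng(5).standard_normal(n))
t = np.arange(n) / fs
sig = np.roll(code, 417) * np.exp(2j * np.pi * 2500.0 * t)
print(acquire(sig, code, fs, doppler_bins=np.arange(-5000, 5001, 500)))
```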

  6. Open source software and crowdsourcing for energy analysis

    International Nuclear Information System (INIS)

    Bazilian, Morgan; Rice, Andrew; Rotich, Juliana; Howells, Mark; DeCarolis, Joseph; Macmillan, Stuart; Brooks, Cameron; Bauer, Florian; Liebreich, Michael

    2012-01-01

    Informed energy decision making requires effective software, high-quality input data, and a suitably trained user community. Developing these resources can be expensive and time consuming. Even when data and tools are intended for public re-use they often come with technical, legal, economic and social barriers that make them difficult to adopt, adapt and combine for use in new contexts. We focus on the promise of open, publicly accessible software and data as well as crowdsourcing techniques to develop robust energy analysis tools that can deliver crucial, policy-relevant insight, particularly in developing countries, where planning resources are highly constrained—and the need to adapt these resources and methods to the local context is high. We survey existing research, which argues that these techniques can produce high-quality results, and also explore the potential role that linked, open data can play in both supporting the modelling process and in enhancing public engagement with energy issues. - Highlights: ► We focus on the promise of open, publicly accessible software and data. ► These emerging techniques can produce high-quality results for energy analysis. ► Developing economies require new techniques for energy planning.

  7. PuMA: the Porous Microstructure Analysis software

    Science.gov (United States)

    Ferguson, Joseph C.; Panerai, Francesco; Borner, Arnaud; Mansour, Nagi N.

    2018-01-01

    The Porous Microstructure Analysis (PuMA) software has been developed to compute effective material properties and perform material response simulations on digitized microstructures of porous media. PuMA is able to import digital three-dimensional images obtained from X-ray microtomography or to generate artificial microstructures. PuMA also provides a module for interactive 3D visualization. Version 2.1 includes modules to compute porosity, volume fractions, and surface area. Two finite-difference Laplace solvers have been implemented to compute the continuum tortuosity factor, effective thermal conductivity, and effective electrical conductivity. A random walk method has been developed to compute tortuosity factors from the continuum to rarefied regimes. Representative elementary volume analysis can be performed on each property. The software also includes a time-dependent, particle-based model for the oxidation of fibrous materials. PuMA was developed for Linux operating systems and is available as NASA software under a US & Foreign release.
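
    The simplest of the listed computations is easy to state: the porosity of a segmented voxel image is just the void-voxel fraction. The sketch below illustrates that on a synthetic grid; it is not PuMA's API, and the fibre geometry is invented for illustration.

        # Sketch: porosity and solid volume fraction of a segmented voxel grid.
        # Convention here: 0 = void, 1 = solid.
        import numpy as np

        grid = np.zeros((64, 64, 64), dtype=np.uint8)
        for y, z in [(16, 16), (32, 40), (48, 24)]:      # a few square "fibres" along x
            grid[:, y - 2:y + 2, z - 2:z + 2] = 1

        porosity = float(np.mean(grid == 0))
        print(f"porosity = {porosity:.3f}, solid volume fraction = {1.0 - porosity:.3f}")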

  8. Redox substoichiometry in isotope dilution analysis Pt. 4

    International Nuclear Information System (INIS)

    Kambara, T.; Yoshioka, H.; Ugai, Y.

    1980-01-01

    The oxidation reaction of antimony(III) with potassium dichromate has been investigated radiometrically. The quantitative oxidation of antimony(III) was found to be undisturbed even in the presence of large amounts of tin(IV). On the basis of these results, redox substoichiometric isotope dilution analysis using potassium dichromate as the oxidizing agent was proposed for the determination of antimony in metallic tin. An antimony content of 1.22±0.05 μg in metallic tin (10 mg) was determined without separation of the matrix element. (author)
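
    For context, the working relation behind this and the following substoichiometric isotope dilution record (a standard textbook relation, not quoted from the abstracts) is worth recalling: equal substoichiometric portions are isolated from the pure tracer and from the tracer-plus-sample mixture, and the unknown mass follows from the ratio of their activities,

        m_x = m_y \left( \frac{a_1}{a_2} - 1 \right)

    where m_y is the mass of the element added with the radiotracer and a_1, a_2 are the activities of the equal portions separated before and after dilution.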

  9. Redox substoichiometry in isotope dilution analysis Pt. 2

    International Nuclear Information System (INIS)

    Kambara, T.; Suzuki, J.; Yoshioka, H.; Nakajima, N.

    1978-01-01

    Isotope dilution analysis using the redox substoichiometric principle has been applied to the determination of the antimony content in metallic zinc. As the substoichiometric reaction, the oxidation of trivalent to pentavalent antimony with potassium permanganate was used, followed by separation of these species by BPHA extraction of the trivalent antimony. Determination of antimony contents below 0.5 μg was found to be possible with good accuracy, without separation of zinc ions. The antimony content in a metallic zinc sample was determined to be 19.7±0.8 ppm, in good agreement with the results obtained by other analytical methods. (author)

  10. Applications of stable isotope analysis to atmospheric trace gas budgets

    Directory of Open Access Journals (Sweden)

    Brenninkmeijer C. A.M.

    2009-02-01

    Full Text Available Stable isotope analysis has become established as a useful method for tracing the budgets of atmospheric trace gases and even atmospheric oxygen. Several new developments are briefly discussed in a systematic way to give a practical guide to the scope of recent work. The emphasis is on applications rather than instrumental developments, and processes and reactions receive less attention than applications for resolving trace gas budgets. Several new developments are promising, and applications hitherto not considered possible may allow new uses.
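
    For reference (standard convention, not quoted from this record), such budgets are expressed in delta notation, the per mil deviation of a sample's isotope ratio R (e.g. ¹³C/¹²C) from that of a standard:

        \delta = \left( \frac{R_\mathrm{sample}}{R_\mathrm{standard}} - 1 \right) \times 10^{3}\ \text{‰}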

  11. Uses of software in digital image analysis: a forensic report

    Science.gov (United States)

    Sharma, Mukesh; Jha, Shailendra

    2010-02-01

    Forensic image analysis requires expertise to interpret the content of an image, or the image itself, in legal matters. Major sub-disciplines of forensic image analysis with law enforcement applications include photogrammetry, photographic comparison, content analysis and image authentication. It has wide applications in forensic science, ranging from documenting crime scenes to enhancing faint or indistinct patterns such as partial fingerprints. The process of forensic image analysis can involve several different tasks, regardless of the type of image analysis performed. In this paper the authors explain these tasks, grouped into three categories: image compression, image enhancement and restoration, and measurement extraction, illustrated with examples such as signature comparison, counterfeit currency comparison and footwear sole impressions, using the software Canvas and Corel Draw.

  12. A software platform for the analysis of dermatology images

    Science.gov (United States)

    Vlassi, Maria; Mavraganis, Vlasios; Asvestas, Panteleimon

    2017-11-01

    The purpose of this paper is to present a software platform developed in the Python programming environment that can be used for the processing and analysis of dermatology images. The platform provides the capability of reading a file that contains a dermatology image, and supports image formats such as Windows bitmaps, JPEG, JPEG2000, portable network graphics and TIFF. Furthermore, it provides suitable tools for selecting, either manually or automatically, a region of interest (ROI) on the image. The automated selection of a ROI includes filtering for smoothing the image, followed by thresholding. The proposed software platform has a friendly and clear graphical user interface and could be a useful second-opinion tool for a dermatologist. Furthermore, it could be used to classify images from other anatomical parts, such as breast or lung, after proper re-training of the classification algorithms.
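
    A minimal sketch of the automated ROI step described above (smoothing followed by thresholding), using scikit-image; the Gaussian width, the use of Otsu's threshold and the test image are assumptions, not necessarily the platform's choices.

        # Sketch: automatic ROI selection by Gaussian smoothing + Otsu thresholding.
        from skimage import data, filters

        image = data.camera().astype(float)           # stand-in for a dermatology image
        smoothed = filters.gaussian(image, sigma=2)   # suppress noise before thresholding
        threshold = filters.threshold_otsu(smoothed)
        roi_mask = smoothed < threshold               # dark, lesion-like region as the ROI
        print(f"ROI covers {100 * roi_mask.mean():.1f}% of the image")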

  13. New analysis software for Viking Lander meteorological data

    Directory of Open Access Journals (Sweden)

    O. Kemppinen

    2013-02-01

    Full Text Available We have developed a set of tools that enable us to process Viking Lander meteorological data beyond what has been previously publicly available. Besides providing data for new periods of time, the existing data periods have been augmented by enhancing the data resolution significantly. This was accomplished by first transferring the original Prime computer version of the data analysis software to a standard Linux platform, and then by modifying the software to be able to process the data despite irregularities in the original raw data and reverse engineering various parameter files. In addition to this, the processing pipeline has been streamlined, making processing the data faster and easier. As a case example of new data, freshly processed Viking Lander 1 and 2 temperature records are described and briefly analyzed in ways that have not been previously possible due to the lack of data.

  14. OVERVIEW OF THE SAPHIRE PROBABILISTIC RISK ANALYSIS SOFTWARE

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L.; Wood, Ted; Knudsen, James; Ma, Zhegang

    2016-10-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer (PC) running the Microsoft Windows operating system. SAPHIRE Version 8 is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). INL's primary role in this project is that of software developer and tester. However, INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users, who constitute a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. In this paper, we provide an overview of the current technical capabilities found in SAPHIRE Version 8, including the user interface and enhanced solving algorithms.

  15. 76 FR 60939 - Metal Fatigue Analysis Performed by Computer Software

    Science.gov (United States)

    2011-09-30

    ... Software AGENCY: Nuclear Regulatory Commission. ACTION: Regulatory issue summary; request for comment... computer software package, WESTEMS™, to demonstrate compliance with Section III, "Rules for... Software Addressees All holders of, and applicants for, a power reactor operating license or construction...

  16. A software architectural framework specification for neutron activation analysis

    International Nuclear Information System (INIS)

    Preston, J.A.; Grant, C.N.

    2013-01-01

    Neutron Activation Analysis (NAA) is a sensitive multi-element nuclear analytical technique that has been routinely applied by research reactor (RR) facilities to environmental, nutritional, health-related, geological and geochemical studies. As RR facilities face calls to increase their research output and impact with existing or reduced budgets, automation of NAA offers a possible solution. However, automation has many challenges, not the least of which is a lack of system architecture standards to establish acceptable mechanisms for the various hardware/software and software/software interactions among data acquisition systems, specialised hardware such as sample changers and sample loaders, and data processing modules. This lack of standardization often results in automation hardware and software being incompatible with the existing system components of a facility looking to automate its NAA operations. This limits the availability of automation to a few RR facilities with adequate budgets or in-house engineering resources. What is needed is a modern open system architecture for NAA that provides the required set of functionalities. This paper describes such an 'architectural framework' (OpenNAA), and portions of a reference implementation. As an example of the benefits, calculations indicate that applying this architecture to the compilation and QA steps associated with the analysis of 35 elements in 140 samples, with 14 SRMs, can reduce the time required by over 80%. The adoption of open standards in the nuclear industry has been very successful over the years in promoting interchangeability and maximising the lifetime and output of nuclear measurement systems. OpenNAA will provide similar benefits within the NAA application space, safeguarding user investments in their current systems, while providing a solid path for development into the future. (author)

  17. BIM Software Capability and Interoperability Analysis : An analytical approach toward structural usage of BIM software (S-BIM)

    OpenAIRE

    A. Taher, Ali

    2016-01-01

    This study focused on the structural analysis of BIM models. Different commercial software packages (Autodesk products and Rhinoceros) are presented through the modelling and analysis of different structures with varying complexity, section properties, geometry and material. Besides the commercial software, different architectural and structural analysis tools are evaluated (Dynamo, Grasshopper, add-on tools, direct link, indirect link via IFC). BIM and Structural BIM (S-BIM)

  18. UPVapor: Cofrentes nuclear power plant production results analysis software

    International Nuclear Information System (INIS)

    Curiel, M.; Palomo, M. J.; Baraza, A.; Vaquer, J.

    2010-10-01

    UPVapor software version 02 has been developed for the Data Analysis Department of the Cofrentes nuclear power plant (Spain). It is a graphical analysis environment in which users have available all the plant variables registered in the process computer system (SIEC). To this end, UPVapor offers many advanced graphic tools for ease of work, as well as a friendly environment that is easy to use and highly configurable. Plant variables are classified in the same way as in the SIEC computer, and their values are retrieved from it through the Iberdrola network. UPVapor can generate two different types of graphics: evolution graphs and X-Y graphs. The former analyse the evolution of up to twenty plant variables over a user-defined time period, based on historic plant files. Many tools are available: cursors, graphic configuration, moving averages, visualization of invalid data and more. Moreover, a particular analysis configuration can be saved as a preselection, giving the possibility of loading a preselection directly and quickly monitoring a group of preselected plant variables. In X-Y graphs, it is possible to analyse one variable against another over a defined time. Optionally, users can filter previous data depending on a certain range of a variable, with the possibility of programming up to five filters. Like the other graph type, the X-Y graph has many configuration, saving and printing options. With UPVapor, data analysts can save valuable time during daily work and, as it is easy to use, it permits other users to perform their own analyses without asking the analysts to develop them. Besides, it can be used from any work centre with access to the network framework. (Author)

  19. UPVapor: Cofrentes nuclear power plant production results analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Curiel, M. [Logistica y Acondicionamientos Industriales SAU, Sorolla Center, local 10, Av. de las Cortes Valencianas No. 58, 46015 Valencia (Spain); Palomo, M. J. [ISIRYM, Universidad Politecnica de Valencia, Camino de Vera s/n, Valencia (Spain); Baraza, A. [Iberdrola Generacion S. A., Central Nuclear Cofrentes, Carretera Almansa Requena s/n, 04662 Cofrentes, Valencia (Spain); Vaquer, J., E-mail: m.curiel@lainsa.co [TITANIA Servicios Tecnologicos SL, Sorolla Center, local 10, Av. de las Cortes Valencianas No. 58, 46015 Valencia (Spain)

    2010-10-15

    UPVapor software version 02 has been developed for the Data Analysis Department of the Cofrentes nuclear power plant (Spain). It is a graphical analysis environment in which users have available all the plant variables registered in the process computer system (SIEC). To this end, UPVapor offers many advanced graphic tools for ease of work, as well as a friendly environment that is easy to use and highly configurable. Plant variables are classified in the same way as in the SIEC computer, and their values are retrieved from it through the Iberdrola network. UPVapor can generate two different types of graphics: evolution graphs and X-Y graphs. The former analyse the evolution of up to twenty plant variables over a user-defined time period, based on historic plant files. Many tools are available: cursors, graphic configuration, moving averages, visualization of invalid data and more. Moreover, a particular analysis configuration can be saved as a preselection, giving the possibility of loading a preselection directly and quickly monitoring a group of preselected plant variables. In X-Y graphs, it is possible to analyse one variable against another over a defined time. Optionally, users can filter previous data depending on a certain range of a variable, with the possibility of programming up to five filters. Like the other graph type, the X-Y graph has many configuration, saving and printing options. With UPVapor, data analysts can save valuable time during daily work and, as it is easy to use, it permits other users to perform their own analyses without asking the analysts to develop them. Besides, it can be used from any work centre with access to the network framework. (Author)

  20. International Atomic Energy Agency intercomparison of ion beam analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Barradas, N.P. [Instituto Tecnologico e Nuclear, Estrada Nacional No. 10, Apartado 21, 2686-953 Sacavem (Portugal); Centro de Fisica Nuclear da Universidade de Lisboa, Avenida do Professor Gama Pinto 2, 1649-003 Lisboa (Portugal)], E-mail: nunoni@itn.pt; Arstila, K. [K.U. Leuven, Instituut voor Kern-en Stralingsfysica, Celestijnenlaan 200D, B-3001 Leuven (Belgium); Battistig, G. [MFA Research Institute for Technical Physics and Materials Science, P.O. Box 49, H-1525 Budapest (Hungary); Bianconi, M. [CNR-IMM-Sezione di Bologna, Via P. Gobetti, 101, I-40129 Bologna (Italy); Dytlewski, N. [International Atomic Energy Agency, Wagramer Strasse 5, P.O. Box 100, A-1400 Vienna (Austria); Jeynes, C. [Surrey Ion Beam Centre, University of Surrey, Guildford, Surrey GU2 7XH (United Kingdom); Kotai, E. [KFKI Research Institute for Particle and Nuclear Physics, P.O. Box 49, H-1525 Budapest (Hungary); Lulli, G. [CNR-IMM-Sezione di Bologna, Via P. Gobetti, 101, I-40129 Bologna (Italy); Mayer, M. [Max-Planck-Institut fuer Plasmaphysik, EURATOM Association, Boltzmannstrasse 2, D-85748 Garching (Germany); Rauhala, E. [Accelerator Laboratory, Department of Physics, University of Helsinki, P.O. Box 43, FIN-00014 Helsinki (Finland); Szilagyi, E. [KFKI Research Institute for Particle and Nuclear Physics, P.O. Box 49, H-1525 Budapest (Hungary); Thompson, M. [Department of MS and E/Bard Hall 328, Cornell University, Ithaca, NY 14853 (United States)

    2007-09-15

    Ion beam analysis (IBA) includes a group of techniques for the determination of elemental concentration depth profiles of thin film materials. Often the final results rely on simulations, fits and calculations made by dedicated codes written for specific techniques. Here we evaluate numerical codes dedicated to the analysis of Rutherford backscattering spectrometry, non-Rutherford elastic backscattering spectrometry, elastic recoil detection analysis and non-resonant nuclear reaction analysis data. Several software packages have been presented and made available to the community. New codes regularly appear, and old codes continue to be used and occasionally updated and expanded. However, those codes have to date not been validated, or even compared to each other. Consequently, IBA practitioners use codes whose correctness and accuracy have never been verified beyond the authors' own efforts. In this work, we present the results of an IBA software intercomparison exercise in which seven different packages participated. These were DEPTH, GISA, DataFurnace (NDF), RBX, RUMP, SIMNRA (all analytical codes) and MCERD (a Monte Carlo code). In a first step, a series of simulations were defined, testing different capabilities of the codes for fixed conditions. In a second step, a set of real experimental data was analysed. The main conclusion is that the codes perform well within the limits of their design, and that the largest differences in the results obtained are due to differences in the fundamental databases used (stopping power and scattering cross section). In particular, spectra can be calculated including Rutherford cross sections with screening, energy resolution convolutions including energy straggling, and pileup effects, with agreement between the codes available at the 0.1% level. This same agreement is also available for the non-RBS techniques. This agreement is not limited to calculation of spectra from particular structures with predetermined
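
    For orientation, the unscreened Rutherford cross section that all of these codes start from can be evaluated directly. The sketch below uses the centre-of-mass formula with e² = 1.44 MeV·fm; the screening corrections mentioned in the abstract are deliberately omitted.

        # Sketch: unscreened Rutherford differential cross section (centre-of-mass frame),
        # dsigma/dOmega = (Z1 * Z2 * e^2 / (4 * E))^2 / sin^4(theta / 2).
        import numpy as np

        def rutherford_mb_per_sr(z1, z2, energy_mev, theta_deg):
            theta = np.radians(theta_deg)
            a = z1 * z2 * 1.44 / (4.0 * energy_mev)        # distance scale in fm
            return 10.0 * a**2 / np.sin(theta / 2.0)**4    # 1 fm^2/sr = 10 mb/sr

        # Example: 2 MeV 4He ions on Si at a 160 degree scattering angle.
        print(f"{rutherford_mb_per_sr(2, 14, 2.0, 160.0):.0f} mb/sr")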

  1. Sample preparation techniques of biological material for isotope analysis

    International Nuclear Information System (INIS)

    Axmann, H.; Sebastianelli, A.; Arrillaga, J.L.

    1990-01-01

    Sample preparation is an essential step in all isotope-aided experiments, but it is often not given enough attention. The methods of sample preparation are very important for obtaining reliable and precise analytical data and for the further interpretation of results. The size of a sample required for chemical analysis is usually very small (10–1500 mg). On the other hand, the amount of plant material harvested from plots in a field experiment is often bulky (several kilograms), and the entire sample is too large for processing. In addition, while approaching maturity, many crops show not only differences in physical consistency but also non-uniformity in ¹⁵N content among plant parts, requiring plant fractionation or separation into parts (vegetative and reproductive), e.g. shoots and spikes in the case of small grain cereals, shoots and pods in the case of grain legumes, and tops and roots or beets (including crown) in the case of sugar beet. In any case, the ultimate goal of these procedures is to obtain a representative subsample of material harvested from greenhouse or field experiments for chemical analysis. Before harvesting an isotope-aided experiment, the method of sampling has to be selected. It should be based on the type of information required in relation to the objectives of the research and the availability of resources (staff, sample preparation equipment, analytical facilities, chemicals and supplies, etc.). 10 refs, 3 figs, 3 tabs

  2. Research and Development of Statistical Analysis Software System of Maize Seedling Experiment

    OpenAIRE

    Hui Cao

    2014-01-01

    In this study, software engineering methods were used to develop a software system for the statistics and analysis of maize seedling experiments. During development, a B/S (browser/server) software architecture was used and a set of statistical indicators for maize seedling evaluation was established. The experimental results indicated that this software system could perform quality statistics and analysis for maize seedlings very well. The development of this software system explored a...

  3. A Parallel Software Pipeline for DMET Microarray Genotyping Data Analysis

    Directory of Open Access Journals (Sweden)

    Giuseppe Agapito

    2018-06-01

    Full Text Available Personalized medicine is an aspect of P4 medicine (predictive, preventive, personalized and participatory) based precisely on the customization of all medical characteristics of each subject. In personalized medicine, the development of medical treatments and drugs is tailored to the individual characteristics and needs of each subject, according to the study of diseases at different scales from genotype to phenotype. To make the goal of personalized medicine concrete, it is necessary to employ high-throughput methodologies such as Next Generation Sequencing (NGS), Genome-Wide Association Studies (GWAS), Mass Spectrometry or Microarrays, which are able to investigate a single disease from a broad perspective. A side effect of high-throughput methodologies is the massive amount of data produced for each single experiment, which poses several challenges (e.g., high execution time and required memory) to bioinformatics software. Thus a main requirement of modern bioinformatics software is the use of good software engineering methods and efficient programming techniques able to face those challenges, including the use of parallel programming and efficient, compact data structures. This paper presents the design and experimental evaluation of a comprehensive software pipeline, named microPipe, for the preprocessing, annotation and analysis of microarray-based Single Nucleotide Polymorphism (SNP) genotyping data. A use case in pharmacogenomics is presented. The main advantages of using microPipe are: the reduction of errors that may happen when trying to make data compatible among different tools; the possibility of analyzing huge datasets in parallel; and the easy annotation and integration of data. microPipe is available under a Creative Commons license and is freely downloadable for academic and not-for-profit institutions.
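
    The parallel analysis that motivates microPipe can be caricatured with Python's standard process pool: each sample's preprocessing is independent, so samples are farmed out across cores. The function body is a placeholder, not microPipe's actual preprocessing.

        # Sketch: embarrassingly parallel per-sample preprocessing with a process pool.
        from multiprocessing import Pool

        def preprocess(sample_id: int) -> tuple:
            # Placeholder for real work (normalization, SNP calling, annotation, ...).
            return sample_id, sample_id % 3

        if __name__ == "__main__":
            with Pool(processes=4) as pool:
                results = pool.map(preprocess, range(100))   # one task per sample
            print(len(results), "samples preprocessed")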

  4. Comparison of two software versions for assessment of body-composition analysis by DXA

    DEFF Research Database (Denmark)

    Vozarova, B; Wang, J; Weyer, C

    2001-01-01

    To compare two software versions provided by Lunar Co. for assessment of body composition analysis by DXA.

  5. Existing and emerging technologies for measuring stable isotope labelled retinol in biological samples: isotope dilution analysis of body retinol stores.

    Science.gov (United States)

    Preston, Tom

    2014-01-01

    This paper discusses some of the recent improvements in instrumentation used for stable isotope tracer measurements in the context of measuring retinol stores, in vivo. Tracer costs, together with concerns that larger tracer doses may perturb the parameter under study, demand that ever more sensitive mass spectrometric techniques are developed. GCMS is the most widely used technique. It has high sensitivity in terms of sample amount and uses high resolution GC, yet its ability to detect low isotope ratios is limited by background noise. LCMSMS may become more accessible for tracer studies. Its ability to measure low level stable isotope tracers may prove superior to GCMS, but it is isotope ratio MS (IRMS) that has been designed specifically for low level stable isotope analysis through accurate analysis of tracer:tracee ratios (the tracee being the unlabelled species). Compound-specific isotope analysis, where GC is interfaced to IRMS, is gaining popularity. Here, individual 13C-labelled compounds are separated by GC, combusted to CO2 and transferred on-line for ratiometric analysis by IRMS at the ppm level. However, commercially-available 13C-labelled retinol tracers are 2–4 times more expensive than deuterated tracers. For 2H-labelled compounds, GC-pyrolysis-IRMS has now become more generally available as an operating mode on the same IRMS instrument. Here, individual compounds are separated by GC and pyrolysed to H2 at high temperature for analysis by IRMS. It is predicted that GC-pyrolysis-IRMS will facilitate low level tracer procedures to measure body retinol stores, as has been accomplished in the case of fatty acids and amino acids. Sample size requirements for GC-P-IRMS may exceed those of GCMS, but this paper discusses sample preparation procedures and predicts improvements, particularly in the efficiency of sample introduction.

  6. Graph based communication analysis for hardware/software codesign

    DEFF Research Database (Denmark)

    Knudsen, Peter Voigt; Madsen, Jan

    1999-01-01

    In this paper we present a coarse grain CDFG (Control/Data Flow Graph) model suitable for hardware/software partitioning of single processes and demonstrate how it is necessary to perform various transformations on the graph structure before partitioning in order to achieve a structure that allows for accurate estimation of communication overhead between nodes mapped to different processors. In particular, we demonstrate how various transformations of control structures can lead to a more accurate communication analysis and more efficient implementations. The purpose of the transformations is to obtain...
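
    As a hedged sketch of the estimation problem being described (not the authors' actual cost model): once each CDFG node is mapped to a processor, the first-order communication overhead is the summed weight of the data edges that cross the partition.

        # Sketch: first-order communication overhead of a partitioned task graph.
        # Edge weights model transferred data; the mapping assigns nodes to processors.
        edges = {("a", "b"): 128, ("b", "c"): 64, ("a", "d"): 32, ("d", "c"): 256}
        mapping = {"a": "hw", "b": "sw", "c": "sw", "d": "hw"}

        overhead = sum(cost for (src, dst), cost in edges.items()
                       if mapping[src] != mapping[dst])
        print("cross-partition traffic:", overhead, "bytes")   # 128 + 256 = 384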

  7. Development of RCM analysis software for Korean nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Ho; Choi, Kwang Hee; Jeong, Hyeong Jong [Korea Electric Power Research Institute, Taejon (Korea, Republic of)

    1999-12-31

    A software package called the KEPCO RCM workstation (KRCM) has been developed to optimize the maintenance strategies of Korean nuclear power plants. The program modules of the KRCM were designed in a manner that combines EPRI methodologies and KEPRI analysis techniques. The KRCM is being applied to three pilot systems: the chemical and volume control system, the main steam system, and the compressed air system of Yonggwang Units 1 and 2. In addition, the KRCM can be utilized as a tool to meet part of the requirements of the maintenance rule (MR) imposed by the U.S. NRC. 3 refs., 4 figs. (Author)

  8. Development of RCM analysis software for Korean nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Ho; Choi, Kwang Hee; Jeong, Hyeong Jong [Korea Electric Power Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    A software package called the KEPCO RCM workstation (KRCM) has been developed to optimize the maintenance strategies of Korean nuclear power plants. The program modules of the KRCM were designed in a manner that combines EPRI methodologies and KEPRI analysis techniques. The KRCM is being applied to three pilot systems: the chemical and volume control system, the main steam system, and the compressed air system of Yonggwang Units 1 and 2. In addition, the KRCM can be utilized as a tool to meet part of the requirements of the maintenance rule (MR) imposed by the U.S. NRC. 3 refs., 4 figs. (Author)

  9. Gamma camera image processing and graphical analysis mutual software system

    International Nuclear Information System (INIS)

    Wang Zhiqian; Chen Yongming; Ding Ailian; Ling Zhiye; Jin Yongjie

    1992-01-01

    The GCCS gamma camera image processing and graphical analysis system is a special interactive software system. It is mainly used to analyse various patient data acquired from a gamma camera. The system runs on an IBM PC, PC/XT or PC/AT. It consists of several parts: system management, data management, device management, program packages and user programs. The system provides two kinds of user interface: command menus and command characters. The system is easy to modify and extend because it is highly modularized. The user programs include almost all the clinical protocols in use today

  10. Discriminant Analysis of the Effects of Software Cost Drivers on ...

    African Journals Online (AJOL)

    The paper investigates the effect of software cost drivers on the project schedule estimation of software development projects in Nigeria. Specifically, the paper determines the extent to which software cost variables affect software project time schedules in our environment. Such studies are lacking in the recent ...

  11. Tool Support for Parametric Analysis of Large Software Simulation Systems

    Science.gov (United States)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
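
    The n-factor combinatorial idea mentioned above can be sketched independently of the tool: rather than running the full cross product of parameter values, choose test cases so that every combination of values of any n parameters appears at least once. The greedy pairwise (n = 2) sketch below only illustrates the coverage criterion; it is not the tool's algorithm and makes no claim of minimality.

        # Sketch: greedy pairwise (n = 2) combinatorial test-case selection.
        from itertools import combinations, product

        params = {"mass": [1.0, 2.0], "thrust": ["low", "high"], "mode": ["A", "B", "C"]}
        names = list(params)

        def pairs(case):
            """All (parameter, value) pairs jointly exercised by one test case."""
            return {frozenset([(names[i], case[i]), (names[j], case[j])])
                    for i, j in combinations(range(len(names)), 2)}

        uncovered = set().union(*(pairs(c) for c in product(*params.values())))
        suite = []
        while uncovered:
            best = max(product(*params.values()), key=lambda c: len(pairs(c) & uncovered))
            suite.append(dict(zip(names, best)))
            uncovered -= pairs(best)
        print(f"{len(suite)} cases cover all pairs; the full product has 12")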

  12. Development of a software for INAA analysis automation

    International Nuclear Information System (INIS)

    Zahn, Guilherme S.; Genezini, Frederico A.; Figueiredo, Ana Maria G.; Ticianelli, Regina B.

    2013-01-01

    In this work, software has been developed to automate the post-counting tasks in comparative INAA, aiming to be more flexible than the available options, integrating with some of the routines currently in use in the IPEN Activation Analysis Laboratory and allowing the user to choose between a fully automatic analysis or an Excel-oriented one. The software makes use of the Genie 2000 data importing and analysis routines and stores each 'energy-counts-uncertainty' table as a separate ASCII file that can be used later if required by the analyst. Moreover, it generates an Excel-compatible CSV (comma-separated values) file with only the relevant results from the analyses for each sample or comparator, as well as the results of the concentration calculations and the results obtained with four different statistical tools (unweighted average, weighted average, normalized residuals and the Rajeval technique), allowing the analyst to double-check the results. Finally, a 'summary' CSV file is also produced, with the final concentration results obtained for each element in each sample. (author)
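
    Two of the statistical tools named above have compact textbook forms. The sketch below shows an inverse-variance weighted average with its internal uncertainty, plus simple normalized residuals for flagging discrepant replicates; the laboratory's exact implementation (and the Rajeval technique) may differ in detail.

        # Sketch: weighted average of replicate results and simple normalized residuals.
        import numpy as np

        values = np.array([10.2, 9.8, 10.5, 9.9])    # e.g. concentrations in mg/kg
        sigmas = np.array([0.3, 0.2, 0.4, 0.25])     # one-sigma uncertainties

        weights = 1.0 / sigmas**2
        mean = np.sum(weights * values) / np.sum(weights)
        mean_sigma = 1.0 / np.sqrt(np.sum(weights))
        residuals = (values - mean) / sigmas         # |r| > ~2 suggests a discrepant value

        print(f"weighted mean = {mean:.2f} +/- {mean_sigma:.2f}")
        print("normalized residuals:", np.round(residuals, 2))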

  13. Models for composing software : an analysis of software composition and objects

    NARCIS (Netherlands)

    Bergmans, Lodewijk

    1999-01-01

    In this report, we investigate component-based software construction with a focus on composition. In particular we try to analyze the requirements and issues for components and software composition. As a means to understand this research area, we introduce a canonical model for representing

  14. PROTEINCHALLENGE: Crowd sourcing in proteomics analysis and software development

    DEFF Research Database (Denmark)

    Martin, Sarah F.; Falkenberg, Heiner; Dyrlund, Thomas Franck

    2013-01-01

    In large-scale proteomics studies there is a temptation, after months of experimental work, to plug resulting data into a convenient—if poorly implemented—set of tools, which may neither do the data justice nor help answer the scientific question. In this paper we have captured key concerns, including arguments for community-wide open source software development and “big data” compatible solutions for the future. However, there is clearly a real need for robust tools, standard operating procedures and general acceptance of best practices. Thus we submit to the proteomics community a call for a community-wide open set of proteomics analysis challenges—PROTEINCHALLENGE—that directly target and compare data analysis workflows. For the meantime, we have laid out ten top tips for data processing. With these at hand, a first large-scale proteomics analysis hopefully becomes less daunting to navigate.

  15. Automatic measurement system for light element isotope analysis

    International Nuclear Information System (INIS)

    Satake, Hiroshi; Ikegami, Kouichi.

    1990-01-01

    The automatic measurement system for light element isotope analysis was developed by installing a specially designed inlet system controlled by a computer. The microcomputer system contains specific interface boards for the inlet system and the mass spectrometer, a Micromass 602 E. All the components of the inlet and computer system installed are easily available in Japan. A maximum of ten samples can be measured automatically. About 160 minutes are required for 10 measurements of δ¹⁸O values of CO₂; thus, nearly four samples can be measured per hour using this system, compared with about three samples per hour by manual operation. The automated analysis system clearly has an advantage over the conventional method. This paper describes the details of the automated system, such as the apparatus used, the control procedure and the corrections applied for reliable measurement. (author)

  16. Hazard Analysis of Software Requirements Specification for Process Module of FPGA-based Controllers in NPP

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Sejin; Kim, Eui-Sub; Yoo, Junbeom [Konkuk University, Seoul (Korea, Republic of); Keum, Jong Yong; Lee, Jang-Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    Software in the PLCs and FPGAs used to develop I&C systems should also be analyzed for hazards and risks before being used. NUREG/CR-6430 proposes a method for performing software hazard analysis. It suggests analysis techniques for software-affected hazards and notes that software hazard analysis should be performed over the phases of the software life cycle, such as requirements analysis, design, detailed design and implementation. It also provides guide phrases for applying software hazard analysis. HAZOP (hazard and operability analysis) is one of the analysis techniques introduced in NUREG/CR-6430 and is well suited to the use of guide phrases; HAZOP is sometimes used to analyze the safety of software. The analysis method of NUREG/CR-6430 has previously been used for PLC software development at Korean nuclear power plants; in those studies, appropriate guide phrases and analysis processes were selected for efficient application, and NUREG/CR-6430 was identified as providing applicable methods for software hazard analysis. We perform software hazard analysis of an FPGA software requirements specification with two approaches: NUREG/CR-6430, and HAZOP using general guide words (GW). We also perform a comparative analysis of the two. The NUREG/CR-6430 approach has several pros and cons compared with HAZOP using general guide words, and it is readily applicable to analyzing the software requirements specification of FPGAs.

  17. Don't Blame the Software: Using Qualitative Data Analysis Software Successfully in Doctoral Research

    Directory of Open Access Journals (Sweden)

    Michelle Salmona

    2016-07-01

    Full Text Available In this article, we explore the learning experiences of doctoral candidates as they use qualitative data analysis software (QDAS). Of particular interest is the process of adopting technology during the development of research methodology. Using an action research approach, data was gathered over five years from advanced doctoral research candidates and supervisors. The technology acceptance model (TAM) was then applied as a theoretical analytic lens for better understanding how students interact with new technology. Findings relate to two significant barriers which doctoral students confront: 1. aligning perceptions of ease of use and usefulness is essential in overcoming resistance to technological change; 2. transparency into the research process through technology promotes insights into methodological challenges. Transitioning through both barriers requires a competent foundation in qualitative research. The study acknowledges the importance of higher degree research, curriculum reform and doctoral supervision in post-graduate research training together with their interconnected relationships in support of high-quality inquiry. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1603117

  18. eXtended CASA Line Analysis Software Suite (XCLASS)

    Science.gov (United States)

    Möller, T.; Endres, C.; Schilke, P.

    2017-02-01

    The eXtended CASA Line Analysis Software Suite (XCLASS) is a toolbox for the Common Astronomy Software Applications package (CASA) containing new functions for modeling interferometric and single dish data. Among the tools is the myXCLASS program, which calculates synthetic spectra by solving the radiative transfer equation for an isothermal object in one dimension, with finite source size and dust attenuation taken into account as well. Molecular data required by the myXCLASS program are taken from an embedded SQLite3 database containing entries from the Cologne Database for Molecular Spectroscopy (CDMS) and JPL using the Virtual Atomic and Molecular Data Center (VAMDC) portal. Additionally, the toolbox provides an interface for the model optimizer package Modeling and Analysis Generic Interface for eXternal numerical codes (MAGIX), which helps to find the best description of observational data using myXCLASS (or another external model program), that is, finding the parameter set that most closely reproduces the data. http://www.astro.uni-koeln.de/projects/schilke/myXCLASSInterface A copy of the code is available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/598/A7
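
    For orientation, the one-dimensional isothermal solution this kind of modeling rests on has a familiar closed form (stated generically, not as XCLASS's exact implementation): for source function S_ν, optical depth τ_ν and background intensity I_bg, the emergent intensity is

        I_\nu = I_{\nu,\mathrm{bg}}\, e^{-\tau_\nu} + S_\nu \left( 1 - e^{-\tau_\nu} \right)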

  19. System and software safety analysis for the ERA control computer

    International Nuclear Information System (INIS)

    Beerthuizen, P.G.; Kruidhof, W.

    2001-01-01

    The European Robotic Arm (ERA) is a seven-degrees-of-freedom relocatable anthropomorphic robotic manipulator system, to be used in manned space operation on the International Space Station, supporting the assembly and external servicing of the Russian segment. The safety design concept and implementation of the ERA are described, in particular with respect to the central computer's software design. A top-down analysis and specification process is used to flow down the safety aspects of the ERA system to the subsystems, which are produced by a consortium of companies in many countries. The user requirements documents and the critical function list are the key documents in this process. Bottom-up analysis (FMECA) and testing, at both subsystem and system level, are the basis for safety verification. A number of examples show the use of the approach and methods used.

  20. Compositional Solution Space Quantification for Probabilistic Software Analysis

    Science.gov (United States)

    Borges, Mateus; Pasareanu, Corina S.; Filieri, Antonio; d'Amorim, Marcelo; Visser, Willem

    2014-01-01

    Probabilistic software analysis aims at quantifying how likely a target event is to occur during program execution. Current approaches rely on symbolic execution to identify the conditions under which the target event is reached and try to quantify the fraction of the input domain satisfying these conditions. Precise quantification is usually limited to linear constraints, while only approximate solutions can be provided in general through statistical approaches. However, statistical approaches may fail to converge to an acceptable accuracy within a reasonable time. We present a compositional statistical approach for the efficient quantification of solution spaces for arbitrarily complex constraints over bounded floating-point domains. The approach leverages interval constraint propagation to improve the accuracy of the estimation by focusing the sampling on the regions of the input domain containing the sought solutions. Preliminary experiments show significant improvement over previous approaches in both accuracy and analysis time.
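
    Stripped of the compositional and interval-propagation machinery, the underlying quantification step is hit-or-miss Monte Carlo over a bounded input box, as in this sketch (the constraint is invented for illustration):

        # Sketch: hit-or-miss Monte Carlo estimate of the fraction of a bounded
        # input box that satisfies a path condition for the target event.
        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000
        x = rng.uniform(-10.0, 10.0, n)              # bounded floating-point domain for x
        y = rng.uniform(-10.0, 10.0, n)              # bounded floating-point domain for y

        hits = (x * x + y * y < 25.0) & (x > y)      # illustrative target-event condition
        p = hits.mean()
        stderr = np.sqrt(p * (1.0 - p) / n)          # binomial standard error
        print(f"P(target event) ~= {p:.4f} +/- {stderr:.4f}")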

  1. HistFitter software framework for statistical data analysis

    CERN Document Server

    Baak, M.; Côte, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fitted to data and interpreted with statistical tests. A key innovation of HistFitter is its design, which is rooted in core analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its very fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with mu...

  2. Gamma-ray spectral analysis software designed for extreme ease of use or unattended operation

    International Nuclear Information System (INIS)

    Buckley, W.M.; Carlson, J.B.; Romine, W.A.

    1993-07-01

    We are developing isotopic analysis software in the Safeguards Technology Program that advances usability in two complementary directions. The first direction is towards graphical user interfaces (GUIs) for very easy-to-use applications. The second is towards a minimal user interface, but with additional features for unattended or fully automatic applications. We are developing a GUI-based spectral viewing engine that currently runs in the MS-Windows environment. We intend to use this core application to provide the common user interface for our data analysis, and subsequently data acquisition and instrument control, applications. We are also investigating sets of cases where the MGA methodology produces results of reduced accuracy, incorrect uncertainty estimates, or incorrect results. We try to determine the root cause of each problem and extend or replace portions of the methodology so that MGA will function over a wider domain of analysis without requiring intervention and analysis by a spectroscopist. This effort is necessary for applications where such intervention is inconvenient or impractical.

  3. An Analysis of Related Software Cycles Among Organizations, People and the Software Industry

    National Research Council Canada - National Science Library

    Moore, Robert; Adams, Brady

    2008-01-01

    .... This thesis intends to explore the moderating factors of these three distinct and disjointed cycles and propose courses of action towards mitigating various issues and problems inherent in the software upgrade process...

  4. Using CASE Software to Teach Undergraduates Systems Analysis and Design.

    Science.gov (United States)

    Wilcox, Russell E.

    1988-01-01

    Describes the design and delivery of a college course for information system students utilizing a Computer-Aided Software Engineering program. Discusses class assignments, cooperative learning, student attitudes, and the advantages of using this software in the course. (CW)

  5. Redox substoichiometric isotope dilution analysis of metallic arsenic for antimony

    International Nuclear Information System (INIS)

    Kambara, Tomihisa; Yoshioka, Hiroe; Suzuki, Junsuke; Shibata, Yasue.

    1979-01-01

    In 1 M HCl solution, Sb(III) reacts with N-benzoyl-N-phenylhydroxylamine (BPHA) to form a complex extractable into chloroform, while the extraction of Sb(V) is negligible. Redox substoichiometric isotope dilution analysis based on this reaction was applied to the determination of antimony in metallic arsenic. After dissolution of the metallic arsenic, Sb(V) was separated from As(V) by tribenzylamine extraction from 8 M HCl solution, and the extracted Sb(V) was stripped into 0.5 M NaOH solution. Thereafter, all the Sb(V) was completely reduced to Sb(III) by bubbling SO₂ gas through 3 M HCl solution. As the substoichiometric reaction, the oxidation of Sb(III) to Sb(V) by a substoichiometric amount of potassium dichromate was used, followed by separation of these species by BPHA extraction of Sb(III). The substoichiometric oxidation of Sb(III) was found to be quantitative over the HCl concentration range from 0.8 to 1.2 M. The amount of antimony was determined by isotope dilution analysis using the method of carrier amount variation. By the present method the determination of antimony in amounts as small as 0.36 μg was accomplished with good accuracy (relative error: 5.6%), and the method was also successfully applied to the determination of antimony in arsenic samples containing known amounts of Sb(III) and in metallic arsenic. The present method gives reliable results with good accuracy and precision. (author)

  6. HTSSIP: An R package for analysis of high throughput sequencing data from nucleic acid stable isotope probing (SIP) experiments.

    Directory of Open Access Journals (Sweden)

    Nicholas D Youngblut

    Full Text Available Combining high throughput sequencing with stable isotope probing (HTS-SIP) is a powerful method for mapping in situ metabolic processes to thousands of microbial taxa. However, accurately mapping metabolic processes to taxa is complex and challenging. Multiple HTS-SIP data analysis methods have been developed, including high-resolution stable isotope probing (HR-SIP), multi-window high-resolution stable isotope probing (MW-HR-SIP), quantitative stable isotope probing (qSIP), and ΔBD. Currently, there is no publicly available software designed specifically for analyzing HTS-SIP data. To address this shortfall, we have developed the HTSSIP R package, an open-source, cross-platform toolset for conducting HTS-SIP analyses in a straightforward and easily reproducible manner. The HTSSIP package, along with full documentation and examples, is available from CRAN at https://cran.r-project.org/web/packages/HTSSIP/index.html and Github at https://github.com/buckleylab/HTSSIP.

  7. Software for 3D diagnostic image reconstruction and analysis

    International Nuclear Information System (INIS)

    Taton, G.; Rokita, E.; Sierzega, M.; Klek, S.; Kulig, J.; Urbanik, A.

    2005-01-01

    Recent advances in computer technologies have opened new frontiers in medical diagnostics. Interesting possibilities are the use of three-dimensional (3D) imaging and the combination of images from different modalities. Software prepared in our laboratories devoted to 3D image reconstruction and analysis from computed tomography and ultrasonography is presented. In developing our software it was assumed that it should be applicable in standard medical practice, i.e. it should work effectively with a PC. An additional feature is the possibility of combining 3D images from different modalities. The reconstruction and data processing can be conducted using a standard PC, so low investment costs result in the introduction of advanced and useful diagnostic possibilities. The program was tested on a PC using DICOM data from computed tomography and TIFF files obtained from a 3D ultrasound system. The results of the anthropomorphic phantom and patient data were taken into consideration. A new approach was used to achieve spatial correlation of two independently obtained 3D images. The method relies on the use of four pairs of markers within the regions under consideration. The user selects the markers manually and the computer calculates the transformations necessary for coupling the images. The main software feature is the possibility of 3D image reconstruction from a series of two-dimensional (2D) images. The reconstructed 3D image can be: (1) viewed with the most popular methods of 3D image viewing, (2) filtered and processed to improve image quality, (3) analyzed quantitatively (geometrical measurements), and (4) coupled with another, independently acquired 3D image. The reconstructed and processed 3D image can be stored at every stage of image processing. The overall software performance was good considering the relatively low costs of the hardware used and the huge data sets processed. The program can be freely used and tested (source code and program available at
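
    The marker-based coupling step described above amounts to a rigid landmark registration. One standard least-squares solution from four point pairs is the Kabsch/Procrustes construction sketched below; the paper does not state which transformation model it uses, so treat this as one common choice.

        # Sketch: rigid registration (rotation R, translation t) from 4 marker pairs,
        # via the Kabsch/Procrustes SVD solution minimizing ||R p + t - q||^2.
        import numpy as np

        p = np.array([[0, 0, 0], [50, 0, 0], [0, 40, 0], [0, 0, 30]], dtype=float)
        angle = np.radians(20.0)
        r_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                           [np.sin(angle),  np.cos(angle), 0.0],
                           [0.0, 0.0, 1.0]])
        q = p @ r_true.T + np.array([5.0, -3.0, 12.0])   # the same markers in the second scan

        pc, qc = p - p.mean(axis=0), q - q.mean(axis=0)  # centre both marker sets
        u, _, vt = np.linalg.svd(pc.T @ qc)
        d = np.sign(np.linalg.det(vt.T @ u.T))           # guard against reflections
        r_est = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
        t_est = q.mean(axis=0) - r_est @ p.mean(axis=0)
        print(np.allclose(r_est, r_true), np.round(t_est, 3))   # True [ 5. -3. 12.]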

  8. ATLAS tile calorimeter cesium calibration control and analysis software

    International Nuclear Information System (INIS)

    Solovyanov, O; Solodkov, A; Starchenko, E; Karyukhin, A; Isaev, A; Shalanda, N

    2008-01-01

    An online control system to calibrate and monitor the ATLAS barrel hadronic calorimeter (TileCal) with a movable radioactive source, driven by liquid flow, is described. To read out and control the system, online software has been developed using ATLAS TDAQ components such as DVS (Diagnostic and Verification System) to verify the hardware before running, IS (Information Server) for data and status exchange between networked computers, and other components such as DDC (DCS to DAQ Connection) to connect to the PVSS-based slow control systems of the Tile Calorimeter, high voltage and low voltage. A system of scripting facilities, based on the Python language, is used to handle all the calibration and monitoring processes from the hardware perspective through to final data storage, including various abnormal situations. A Qt-based graphical user interface to display the status of the calibration system during the cesium source scan is described. The software for analysis of the detector response, using online data, is discussed. Performance of the system and first experience from the ATLAS pit are presented.

  9. Phenomenology and Qualitative Data Analysis Software (QDAS: A Careful Reconciliation

    Directory of Open Access Journals (Sweden)

    Brian Kelleher Sohn

    2017-01-01

    Full Text Available An oft-cited phenomenological methodologist, Max VAN MANEN (2014), claims that qualitative data analysis software (QDAS) is not an appropriate tool for phenomenological research. Yet phenomenologists rarely describe how phenomenology is to be done: pencil, paper, computer? DAVIDSON and DI GREGORIO (2011) urge QDAS contrarians such as VAN MANEN to get over their methodological loyalties and join the digital world, claiming that all qualitative researchers, whatever their methodology, perform processes aided by QDAS: disaggregation and recontextualization of texts. Other phenomenologists exemplify DAVIDSON and DI GREGORIO's observation that arguments against QDAS often identify problems more closely related to the researchers than QDAS. But the concerns about technology of McLUHAN (2003 [1964]), HEIDEGGER (2008 [1977]), and FLUSSER (2013) cannot be ignored. In this conceptual article I answer the questions of phenomenologists and the call of QDAS methodologists to describe how I used QDAS to carry out a phenomenological study in order to guide others who choose to reconcile the use of software to assist their research. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1701142

  10. ATLAS tile calorimeter cesium calibration control and analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Solovyanov, O; Solodkov, A; Starchenko, E; Karyukhin, A; Isaev, A; Shalanda, N [Institute for High Energy Physics, Protvino 142281 (Russian Federation)], E-mail: Oleg.Solovyanov@ihep.ru

    2008-07-01

    An online control system to calibrate and monitor the ATLAS barrel hadronic calorimeter (TileCal) with a movable radioactive source, driven by liquid flow, is described. To read out and control the system, online software has been developed using ATLAS TDAQ components such as DVS (Diagnostic and Verification System) to verify the hardware before running, IS (Information Server) for data and status exchange between networked computers, and other components such as DDC (DCS to DAQ Connection) to connect to the PVSS-based slow control systems of the Tile Calorimeter, high voltage and low voltage. A system of scripting facilities, based on the Python language, is used to handle all the calibration and monitoring processes from the hardware perspective through to final data storage, including various abnormal situations. A Qt-based graphical user interface to display the status of the calibration system during the cesium source scan is described. The software for analysis of the detector response, using online data, is discussed. Performance of the system and first experience from the ATLAS pit are presented.

  11. Experimental software for modeling and interpreting educational data analysis processes

    Directory of Open Access Journals (Sweden)

    Natalya V. Zorina

    2017-12-01

    Full Text Available Problems, tasks and processes of educational data mining are considered in this article. The objective is to create a fundamentally new information system for the university using the results of educational data analysis. One of the functions of such a system is knowledge extraction from the data accumulated during its operation. The creation of a national system of this type is an iterative and time-consuming process requiring preliminary studies and incremental prototyping of modules. The novelty of such systems means there is no established development methodology to follow, so a number of experiments were carried out in order to collect data, choose appropriate methods for the study and interpret them. As a result of the experiments, the authors identified the data sources available for analysis in the information environment of the home university. The data were taken from semester performance records obtained from the information system of the training department of the Institute of IT, MTU MIREA, from the results of the independent work of students, and from specially designed Google forms. To automate the collection of information and the analysis of educational data, an experimental software package was created. As the development methodology for the experimental software complex, a decision was made to use the methodologies of rational-empirical complexes (REX) and single-experimentation program technologies (TPEI). The program implementation of the complex is described in detail, conclusions are given about the availability of the data sources used, and conclusions are drawn about the prospects for further development.

  12. Visual data mining and analysis of software repositories

    NARCIS (Netherlands)

    Voinea, S.L.; Telea, A.C.

    2007-01-01

    In this article we describe an ongoing effort to integrate information visualization techniques into the process of configuration management for software systems. Our focus is to help software engineers manage the evolution of large and complex software systems by offering them effective and

  13. Metabolic Flux Analysis in Isotope Labeling Experiments Using the Adjoint Approach.

    Science.gov (United States)

    Mottelet, Stephane; Gaullier, Gil; Sadaka, Georges

    2017-01-01

    Comprehension of metabolic pathways is considerably enhanced by metabolic flux analysis in isotope labeling experiments (MFA-ILE). The balance equations are given by hundreds of algebraic (stationary MFA) or ordinary differential equations (nonstationary MFA), and reducing the number of operations is therefore a crucial part of reducing the computation cost. The main bottleneck for deterministic algorithms is the computation of derivatives, particularly for nonstationary MFA. In this article, we explain how the overall identification process may be sped up by using the adjoint approach to compute the gradient of the residual sum of squares. The proposed approach shows significant improvements in terms of complexity and computation time when compared with the usual (direct) approach. Numerical results are obtained for the central metabolic pathways of Escherichia coli and are validated against reference software in the stationary case. The methods and algorithms described in this paper are included in the sysmetab software package distributed under an Open Source license at http://forge.scilab.org/index.php/p/sysmetab/.
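
    The speed-up argument can be stated compactly. Writing the fit criterion as the residual sum of squares S(θ) = ||y − h(x)||², with the state x fixed by the balance equations g(x, θ) = 0, a sketch of the two routes to the gradient, in generic notation (ours, not the paper's):

        \text{direct:}\qquad \frac{\partial g}{\partial x}\,\frac{\partial x}{\partial \theta_j} = -\frac{\partial g}{\partial \theta_j}\ \ (\text{one linear solve per flux parameter }\theta_j),\qquad \nabla_{\theta} S = -2\left(\frac{\partial x}{\partial \theta}\right)^{\top}\!\left(\frac{\partial h}{\partial x}\right)^{\top} r

        \text{adjoint:}\qquad \left(\frac{\partial g}{\partial x}\right)^{\top}\!\lambda = \left(\frac{\partial h}{\partial x}\right)^{\top} r\ \ (\text{one linear solve in total}),\qquad \frac{\partial S}{\partial \theta_j} = 2\,\lambda^{\top}\frac{\partial g}{\partial \theta_j}

    with r = y − h(x). The adjoint route replaces one linear solve per free flux with a single transposed solve, which is the source of the reported gains, especially in the nonstationary (ODE) case.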

  14. Multivariate Stable Isotope Analysis to Determine Linkages between Benzocaine Seizures

    Science.gov (United States)

    Kemp, H. F.; Meier-Augenstein, W.; Collins, M.; Salouros, H.; Cunningham, A.; Harrison, M.

    2012-04-01

    In July 2010, a woman was jailed for nine years in the UK after the prosecution successfully argued that attempting to import a cutting agent was proof of involvement in a conspiracy to supply cocaine. That landmark ruling provided law enforcement agencies with much greater scope to tackle those involved in this aspect of the drug trade, specifically targeting those importing the likes of benzocaine or lidocaine. Huge quantities of these compounds are imported into the UK, and between May and August 2010, four shipments of benzocaine amounting to more than 4 tons were seized as part of Operation Kitley, a joint initiative between the UK Border Agency and the Serious Organised Crime Agency (SOCA). By diluting cocaine, traffickers can make it go a lot further for very little cost, leading to huge profits. In recent years, dealers have moved away from inert substances, like sugar and baby milk powder, in favour of active pharmaceutical ingredients (APIs), including anaesthetics like benzocaine and lidocaine. Both mimic the numbing effect of cocaine and resemble it closely in colour, texture and some chemical behaviours, making it easier to conceal the fact that the drug has been diluted. API cutting agents have helped traffickers maintain steady supplies in the face of successful interdiction and even expand the market in the UK, particularly among young people aged from their mid-teens to early twenties. From importation to street level, the purity of the drug can be reduced by a factor of up to 80, and street-level cocaine can have a cocaine content as low as 1%. In view of the increasing use of benzocaine as a cutting agent for cocaine, a study was carried out to investigate whether 2H, 13C, 15N and 18O stable isotope signatures could be used in conjunction with multivariate chemometric data analysis to determine potential linkage between benzocaine exhibits seized from different locations or individuals to assist with investigation and prosecution of drug
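
    As an illustration of the multivariate step described above, a minimal sketch (hypothetical isotope data, numpy only; not the study's actual chemometric pipeline) that projects four-variable signatures (δ2H, δ13C, δ15N, δ18O) onto principal components, where exhibits from a common source are expected to cluster:

    import numpy as np

    # Hypothetical isotope signatures: rows = seized exhibits, columns =
    # (d2H, d13C, d15N, d18O) in per mil against reference standards.
    X = np.array([
        [-75.2, -29.1, 2.1, 23.4],   # exhibit A1
        [-74.8, -29.3, 2.0, 23.1],   # exhibit A2 (suspected same batch as A1)
        [-51.0, -27.5, 5.8, 18.9],   # exhibit B1
        [-50.5, -27.7, 5.6, 19.2],   # exhibit B2
    ])

    # Principal component analysis via SVD of the autoscaled data matrix.
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    scores = U * s                    # exhibit coordinates in PC space
    explained = s**2 / np.sum(s**2)   # variance fraction per component

    print(scores[:, :2])   # exhibits from one source should plot close together
    print(explained.round(3))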

  15. Automated software analysis of nuclear core discharge data

    International Nuclear Information System (INIS)

    Larson, T.W.; Halbig, J.K.; Howell, J.A.; Eccleston, G.W.; Klosterbuer, S.F.

    1993-03-01

    Monitoring the fueling process of an on-load nuclear reactor is a full-time job for nuclear safeguarding agencies. Nuclear core discharge monitors (CDMs) can provide continuous, unattended recording of the reactor's fueling activity for later, qualitative review by a safeguards inspector. A quantitative analysis of the collected data could prove to be a great asset to inspectors, because more information can be extracted from the data and the analysis time can be reduced considerably. This paper presents a prototype of an automated software analysis system capable of identifying when fuel bundle pushes occurred and of monitoring the power level of the reactor. Neural network models were developed for calculating the region on the reactor face from which the fuel was discharged and for predicting the burnup. These models were created and tested using actual data collected from a CDM system at an on-load reactor facility. Collectively, these automated quantitative analysis programs could help safeguarding agencies gain a better perspective on the complete picture of the fueling activity of an on-load nuclear reactor. This type of system can provide a cost-effective solution for automated monitoring of on-load reactors, significantly reducing time and effort.

  16. Integrated Software Environment for Pressurized Thermal Shock Analysis

    Directory of Open Access Journals (Sweden)

    Dino Araneo

    2011-01-01

    The present paper describes the main features, and an application to a real Nuclear Power Plant (NPP), of an Integrated Software Environment (in the following referred to as "platform") developed at the University of Pisa (UNIPI) to perform Pressurized Thermal Shock (PTS) analysis. The platform is written in Java for portability, and it implements all the steps foreseen in the methodology developed at UNIPI for the deterministic analysis of PTS scenarios. The methodology starts with the thermal hydraulic analysis of the NPP with a system code (such as Relap5-3D or Cathare2) during a selected transient scenario. The results so obtained are then processed to provide boundary conditions for the next step, that is, a CFD calculation. Once the system pressure and the RPV wall temperature are known, the stresses inside the RPV wall can be calculated by means of a Finite Element (FE) code. The last step of the methodology is the Fracture Mechanics (FM) analysis, using weight functions, aimed at evaluating the stress intensity factor (KI) at the crack tip, to be compared with the critical stress intensity factor KIc. The platform automates all the steps foreseen in the methodology once the user specifies a number of boundary conditions at the beginning of the simulation.
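
    The fracture mechanics step rests on the weight function representation of the stress intensity factor, which in standard form (generic, not the platform's internal notation) reads

        K_I = \int_0^a \sigma(x)\,m(x,a)\,\mathrm{d}x

    where σ(x) is the through-wall stress profile delivered by the FE step, a the crack depth, and m(x, a) the weight function of the cracked geometry; the resulting KI is then compared with the fracture toughness KIc.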

  17. Cost Analysis of Poor Quality Using a Software Simulation

    Directory of Open Access Journals (Sweden)

    Jana Fabianová

    2017-02-01

    The issues of quality, the cost of poor quality and the factors affecting quality are crucial to maintaining competitiveness in business. The use of software applications and computer simulation enables more effective quality management. Simulation tools make it possible to incorporate the variability of many variables in experiments and to evaluate their joint impact on the final output. The article presents a case study focused on the possibility of using Monte Carlo computer simulation in the field of quality management. Two approaches for determining the cost of poor quality are introduced here. One takes a retrospective point of view, in which the costs of poor quality in the production process are calculated from historical data. The second approach uses the probabilistic characteristics of the input variables by means of simulation, and provides a prospective view of the costs of poor quality. Simulation output in the form of tornado and sensitivity charts complements the risk analysis.
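
    A minimal sketch of the prospective approach described above (hypothetical cost model and distributions, numpy only): input variables are sampled from probability distributions, the cost of poor quality is computed per trial, and a crude sensitivity ranking (the quantity a tornado chart visualizes) is read off the correlations:

    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    # Hypothetical input distributions for one production period.
    units       = rng.normal(10_000, 500, n)            # units produced
    defect_rate = rng.triangular(0.01, 0.02, 0.05, n)   # fraction defective
    rework_cost = rng.uniform(8.0, 12.0, n)             # cost to rework one unit
    scrap_share = rng.uniform(0.1, 0.3, n)              # defects that cannot be reworked
    scrap_cost  = rng.normal(25.0, 2.0, n)              # cost of one scrapped unit

    defects = units * defect_rate
    copq = defects * ((1 - scrap_share) * rework_cost + scrap_share * scrap_cost)

    print(f"mean cost of poor quality: {copq.mean():,.0f}")
    print(f"5th-95th percentile: {np.percentile(copq, [5, 95])}")

    # Sensitivity ranking: correlation of each input with the output.
    for name, x in [("defect_rate", defect_rate), ("units", units),
                    ("rework_cost", rework_cost), ("scrap_share", scrap_share),
                    ("scrap_cost", scrap_cost)]:
        print(name, round(np.corrcoef(x, copq)[0, 1], 3))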

  18. Knowledge-based requirements analysis for automating software development

    Science.gov (United States)

    Markosian, Lawrence Z.

    1988-01-01

    We present a new software development paradigm that automates the derivation of implementations from requirements. In this paradigm, informally stated requirements are expressed in a domain-specific requirements specification language. This language is machine-understandable, and requirements expressed in it are captured in a knowledge base. Once the requirements are captured, more detailed specifications and eventually implementations are derived by the system using transformational synthesis. A key characteristic of the process is that the required human intervention is in the form of providing problem- and domain-specific engineering knowledge, not in writing detailed implementations. We describe a prototype system that applies the paradigm in the realm of communication engineering: the prototype automatically generates implementations of buffers following analysis of the requirements on each buffer.

  19. Nonlinear analysis of reinforced concrete structures using software package abaqus

    Directory of Open Access Journals (Sweden)

    Marković Nemanja

    2014-01-01

    Reinforced concrete (RC) is characterized by strong inhomogeneity, resulting from the material characteristics of concrete, and by quasi-brittle behavior during failure. These and other phenomena require the introduction of material nonlinearity in the modeling of reinforced concrete structures. This paper presents the modeling of reinforced concrete in the software package ABAQUS. A brief theoretical overview is given of methods such as Concrete Damaged Plasticity (CDP), Concrete Smeared Cracking (CSC), Cap Plasticity (CP) and the Drucker-Prager model (DPM). We performed a nonlinear analysis of a two-storey reinforced concrete frame, applying the CDP method to model the material nonlinearity of concrete. We analyzed damage zones, crack propagation and the load-deflection response.

  20. The Database and Data Analysis Software of Radiation Monitoring System

    International Nuclear Information System (INIS)

    Wang Weizhen; Li Jianmin; Wang Xiaobing; Hua Zhengdong; Xu Xunjiang

    2009-01-01

    The Shanghai Synchrotron Radiation Facility (SSRF) is a third-generation light source being built in China, comprising a 150 MeV injector, a 3.5 GeV booster, a 3.5 GeV storage ring and a number of beamline stations. The data are fetched by the monitoring computer from collecting modules at the front end and saved in a MySQL database on the managing computer. The data analysis software is written in Python, a scripting language, to query, summarize and plot the data of a given monitoring channel over a given period and export them to an external file. In addition, warning events can be queried separately. The website for historical and real-time data query and plotting is written in PHP. (authors)
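
    A minimal sketch of the kind of query-and-plot task the analysis software performs (hypothetical schema; sqlite3 is used here as a stand-in for the MySQL back end, and the table is assumed to be populated):

    import sqlite3
    import matplotlib.pyplot as plt

    # Hypothetical table: readings(channel TEXT, ts TEXT, dose_rate REAL)
    conn = sqlite3.connect("radmon.db")
    rows = conn.execute(
        "SELECT ts, dose_rate FROM readings "
        "WHERE channel = ? AND ts BETWEEN ? AND ? ORDER BY ts",
        ("beamline-03", "2009-01-01", "2009-01-31"),
    ).fetchall()
    conn.close()

    times, values = zip(*rows)
    plt.plot(range(len(values)), values)
    plt.xlabel("sample index (Jan 2009)")
    plt.ylabel("dose rate")
    plt.title("Channel beamline-03")
    plt.savefig("beamline-03.png")   # export to an external file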

  1. Image analysis software for following progression of peripheral neuropathy

    Science.gov (United States)

    Epplin-Zapf, Thomas; Miller, Clayton; Larkin, Sean; Hermesmeyer, Eduardo; Macy, Jenny; Pellegrini, Marco; Luccarelli, Saverio; Staurenghi, Giovanni; Holmes, Timothy

    2009-02-01

    A relationship has been reported by several research groups [1 - 4] between the density and shapes of nerve fibers in the cornea and the existence and severity of peripheral neuropathy. Peripheral neuropathy is a complication of several prevalent diseases or conditions, including diabetes, HIV, prolonged alcohol overconsumption and aging. A common clinical technique for confirming the condition is intramuscular electromyography (EMG), which is invasive, so a noninvasive technique like the one proposed here carries important potential advantages for the physician and patient. A software program that automatically detects the nerve fibers, counts them and measures their shapes is being developed and tested. Tests were carried out on a database of subjects with levels of severity of diabetic neuropathy as determined by EMG testing. Results from this testing, which include a linear regression analysis, are shown.

  2. PROTEINCHALLENGE: Crowd sourcing in proteomics analysis and software development

    DEFF Research Database (Denmark)

    Martin, Sarah F.; Falkenberg, Heiner; Dyrlund, Thomas Franck

    2013-01-01

    In large-scale proteomics studies there is a temptation, after months of experimental work, to plug resulting data into a convenient, if poorly implemented, set of tools, which may neither do the data justice nor help answer the scientific question. In this paper we have captured key concerns, with the aim of setting a community-driven gold standard for data handling, reporting and sharing, including arguments for community-wide open source software development and "big data" compatible solutions for the future. For the meantime, we have laid out ten top tips for data processing. With these at hand, a first large-scale proteomics analysis hopefully becomes less daunting to navigate. This article is part of a Special Issue entitled: New Horizons and Applications for Proteomics [EuPA 2012].

  3. Development of Spectrometer Software for Electromagnetic Radiation Measurement and Analysis

    International Nuclear Information System (INIS)

    Mohd Idris Taib; Noor Ezati Shuib; Wan Saffiey Wan Abdullah

    2013-01-01

    This software is under development in LabVIEW for use with the StellarNet spectrometer system. The StellarNet spectrometer is supplied with the SpectraWiz operating software, which can measure spectral data for real-time spectroscopy. The LabVIEW software accesses real-time data through the SpectraWiz dynamic link library, which serves as the hardware interface. The software acquires the amplitude of every electromagnetic wavelength at periodic intervals. In addition to hardware interfacing, the user-interface capabilities of the software include plotting of spectral data in various modes, including scope, absorbance, transmission and irradiance modes. This software can be used for research and development in the application, utilization and safety of electromagnetic radiation, especially solar, laser and ultraviolet radiation. The off-line capabilities of this software are almost unlimited, owing to the mathematical and signal processing functions available in the LabVIEW add-on library. (author)

  4. Stable isotope analysis of precipitation samples obtained via crowdsourcing reveals the spatiotemporal evolution of Superstorm Sandy.

    Directory of Open Access Journals (Sweden)

    Stephen P Good

    Extra-tropical cyclones, such as 2012 Superstorm Sandy, pose a significant climatic threat to the northeastern United States, yet prediction of hydrologic and thermodynamic processes within such systems is complicated by their interaction with mid-latitude water patterns as they move poleward. Fortunately, the evolution of these systems is also recorded in the stable isotope ratios of storm-associated precipitation and water vapor, and isotopic analysis provides constraints on difficult-to-observe cyclone dynamics. During Superstorm Sandy, a unique crowdsourced approach enabled 685 precipitation samples to be obtained for oxygen and hydrogen isotopic analysis, constituting the largest isotopic sampling of a synoptic-scale system to date. Isotopically, these waters span an enormous range of values (> 21‰ for δ(18)O, > 160‰ for δ(2)H) and exhibit strong spatiotemporal structure. Low isotope ratios occurred predominantly in the west and south quadrants of the storm, indicating robust isotopic distillation that tracked the intensity of the storm's warm core. Elevated values of deuterium-excess (> 25‰) were found primarily in the New England region after Sandy made landfall. Isotope mass balance calculations and Lagrangian back-trajectory analysis suggest that these samples reflect the moistening of dry continental air entrained from a mid-latitude trough. These results demonstrate the power of rapid-response isotope monitoring to elucidate the structure and dynamics of water cycling within synoptic-scale systems and improve our understanding of storm evolution, hydroclimatological impacts, and paleo-storm proxies.

  5. Stable isotope analysis of precipitation samples obtained via crowdsourcing reveals the spatiotemporal evolution of Superstorm Sandy.

    Science.gov (United States)

    Good, Stephen P; Mallia, Derek V; Lin, John C; Bowen, Gabriel J

    2014-01-01

    Extra-tropical cyclones, such as 2012 Superstorm Sandy, pose a significant climatic threat to the northeastern United States, yet prediction of hydrologic and thermodynamic processes within such systems is complicated by their interaction with mid-latitude water patterns as they move poleward. Fortunately, the evolution of these systems is also recorded in the stable isotope ratios of storm-associated precipitation and water vapor, and isotopic analysis provides constraints on difficult-to-observe cyclone dynamics. During Superstorm Sandy, a unique crowdsourced approach enabled 685 precipitation samples to be obtained for oxygen and hydrogen isotopic analysis, constituting the largest isotopic sampling of a synoptic-scale system to date. Isotopically, these waters span an enormous range of values (> 21‰ for δ(18)O, > 160‰ for δ(2)H) and exhibit strong spatiotemporal structure. Low isotope ratios occurred predominantly in the west and south quadrants of the storm, indicating robust isotopic distillation that tracked the intensity of the storm's warm core. Elevated values of deuterium-excess (> 25‰) were found primarily in the New England region after Sandy made landfall. Isotope mass balance calculations and Lagrangian back-trajectory analysis suggest that these samples reflect the moistening of dry continental air entrained from a mid-latitude trough. These results demonstrate the power of rapid-response isotope monitoring to elucidate the structure and dynamics of water cycling within synoptic-scale systems and improve our understanding of storm evolution, hydroclimatological impacts, and paleo-storm proxies.
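
    For reference, the deuterium-excess quoted in both records is the standard Dansgaard quantity

        d = \delta^{2}\mathrm{H} - 8\,\delta^{18}\mathrm{O}

    (in ‰); values well above the global average of roughly 10‰, such as the > 25‰ observed over New England, are the signature of vapor acquired under low-humidity kinetic evaporation, consistent with the entrained dry continental air described above.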

  6. Fault tree synthesis for software design analysis of PLC based safety-critical systems

    International Nuclear Information System (INIS)

    Koo, S. R.; Cho, C. H.; Seong, P. H.

    2006-01-01

    As software verification and validation should be performed for the development of PLC-based safety-critical systems, software safety analysis is also considered in line with the entire software life cycle. In this paper, we propose a technique for software safety analysis in the design phase. Among the various software hazard analysis techniques, fault tree analysis is the most widely used for the safety analysis of nuclear power plant systems. Fault tree analysis also has the most intuitive notation and makes both qualitative and quantitative analyses possible. To analyze the design phase more effectively, we propose a technique of fault tree synthesis, along with a universal fault tree template for the architecture modules of nuclear software. Consequently, we can analyze the safety of software on the basis of fault tree synthesis. (authors)
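
    To make the quantitative side concrete, a minimal sketch of fault tree evaluation (the generic textbook recursion over AND/OR gates with independent basic events; not the paper's synthesis technique or template):

    # Fault tree nodes: ("basic", p), ("and", [children]), ("or", [children]).
    # Assumes independent basic events; OR uses 1 - prod(1 - p_i).

    def probability(node):
        kind = node[0]
        if kind == "basic":
            return node[1]
        child_ps = [probability(c) for c in node[1]]
        if kind == "and":
            p = 1.0
            for q in child_ps:
                p *= q
            return p
        if kind == "or":
            p = 1.0
            for q in child_ps:
                p *= (1.0 - q)
            return 1.0 - p
        raise ValueError(f"unknown gate: {kind}")

    # Hypothetical tree: the hazard occurs if the primary logic fails AND
    # (the watchdog fails OR the output module fails).
    tree = ("and", [
        ("basic", 1e-3),                       # primary logic failure
        ("or", [("basic", 1e-2),               # watchdog failure
                ("basic", 5e-3)]),             # output module failure
    ])
    print(probability(tree))   # ~1.495e-05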

  7. Hanford isotope project strategic business analysis yttrium-90 (Y-90)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-10-01

    The purpose of this analysis is to address the short-term direction for the Hanford yttrium-90 (Y-90) project. Hanford is the sole DOE producer of Y-90, and is the largest repository for its source in this country. The production of Y-90 is part of the DOE Isotope Production and Distribution (IP and D) mission. The Y-90 is "milked" from strontium-90 (Sr-90), a byproduct of the previous Hanford missions. The use of Sr-90 to produce Y-90 could help reduce the amount of waste material processed and the related costs incurred by the clean-up mission, while providing medical and economic benefits. The cost of producing Y-90 is being subsidized by DOE-IP and D due to its use for research, and resultant low production level. It is possible that the sales of Y-90 could produce full cost recovery within two to three years, at two curies per week. Preliminary projections place the demand at between 20,000 and 50,000 curies per year within the next ten years, assuming FDA approval of one or more of the current therapies now in clinical trials. This level of production would incentivize private firms to commercialize the operation, and allow the government to recover some of its sunk costs. There are a number of potential barriers to the success of the Y-90 project, outside the control of the Hanford Site. The key issues include: efficacy, Food and Drug Administration (FDA) approval and medical community acceptance. There are at least three other sources for Y-90 available to the US users, but they appear to have limited resources to produce the isotope. Several companies have communicated interest in entering into agreements with Hanford for the processing and distribution of Y-90, including some of the major pharmaceutical firms in this country.

  8. Hanford isotope project strategic business analysis yttrium-90 (Y-90)

    International Nuclear Information System (INIS)

    1995-10-01

    The purpose of this analysis is to address the short-term direction for the Hanford yttrium-90 (Y-90) project. Hanford is the sole DOE producer of Y-90, and is the largest repository for its source in this country. The production of Y-90 is part of the DOE Isotope Production and Distribution (IP and D) mission. The Y-90 is ''milked'' from strontium-90 (Sr-90), a byproduct of the previous Hanford missions. The use of Sr-90 to produce Y-90 could help reduce the amount of waste material processed and the related costs incurred by the clean-up mission, while providing medical and economic benefits. The cost of producing Y-90 is being subsidized by DOE-IP and D due to its use for research, and resultant low production level. It is possible that the sales of Y-90 could produce full cost recovery within two to three years, at two curies per week. Preliminary projections place the demand at between 20,000 and 50,000 curies per year within the next ten years, assuming FDA approval of one or more of the current therapies now in clinical trials. This level of production would incentivize private firms to commercialize the operation, and allow the government to recover some of its sunk costs. There are a number of potential barriers to the success of the Y-90 project, outside the control of the Hanford Site. The key issues include: efficacy, Food and Drug Administration (FDA) approval and medical community acceptance. There are at least three other sources for Y-90 available to the US users, but they appear to have limited resources to produce the isotope. Several companies have communicated interest in entering into agreements with Hanford for the processing and distribution of Y-90, including some of the major pharmaceutical firms in this country

  9. Analysis of Software Development Methodologies to Build Safety Software Applications for the SATEX-II: A Mexican Experimental Satellite

    Science.gov (United States)

    Aguilar Cisneros, Jorge; Vargas Martinez, Hector; Pedroza Melendez, Alejandro; Alonso Arevalo, Miguel

    2013-09-01

    Mexico is a country where experience in building software for satellite applications is just beginning. This is a delicate situation, because in the near future we will need to develop software for the SATEX-II (Mexican Experimental Satellite). SATEX-II is a project of SOMECyTA (the Mexican Society of Aerospace Science and Technology). We have experience applying software development methodologies, like TSP (Team Software Process) and SCRUM, in other areas. We analyzed these methodologies and concluded that they can be applied to develop the software for the SATEX-II, supported by the ESA PSS-05-0 Standard, in particular ESA PSS-05-11. Our analysis focused on the main characteristics of each methodology and on how these methodologies could be used together with the ESA PSS-05-0 Standard. Our outcomes, in general, may be used by teams who need to build small satellites; in particular, they will be used when we build the on-board software applications for the SATEX-II.

  10. Biometrics from the carbon isotope ratio analysis of amino acids in human hair.

    Science.gov (United States)

    Jackson, Glen P; An, Yan; Konstantynova, Kateryna I; Rashaid, Ayat H B

    2015-01-01

    This study compares and contrasts the ability to classify individuals into different grouping factors through either bulk isotope ratio analysis or amino-acid-specific isotope ratio analysis of human hair. Using LC-IRMS, we measured the isotope ratios of 14 amino acids in hair proteins independently, and leucine/isoleucine as a co-eluting pair, to provide 15 variables for classification. Multivariate analysis confirmed that the essential amino acids and non-essential amino acids were mostly independent variables in the classification rules, thereby enabling the separation of dietary factors of isotope intake from intrinsic or phenotypic factors of isotope fractionation. Multivariate analysis revealed at least two potential sources of non-dietary factors influencing the carbon isotope ratio values of the amino acids in human hair: body mass index (BMI) and age. These results provide evidence that compound-specific isotope ratio analysis has the potential to go beyond the region-of-origin or geospatial movements of individuals (obtainable through bulk isotope measurements) to the provision of physical and characteristic traits of the individuals, such as age and BMI. Further development and refinement, for example to genetic, metabolic, disease and hormonal factors, could ultimately be of great assistance in forensic and clinical casework. Copyright © 2014 Forensic Science Society. Published by Elsevier Ireland Ltd. All rights reserved.

  11. Software hazard analysis for nuclear digital protection system by Colored Petri Net

    International Nuclear Information System (INIS)

    Bai, Tao; Chen, Wei-Hua; Liu, Zhen; Gao, Feng

    2017-01-01

    Highlights: • A dynamic hazard analysis method is proposed for safety-critical software. • The mechanism relies on Colored Petri Net. • Complex interactions between software and hardware are captured properly. • Common failure modes in software are identified effectively.

    Abstract: The software safety of a nuclear digital protection system is critical for the safety of nuclear power plants, as any software defect may result in severe damage. In order to ensure the safety and reliability of safety-critical digital system products and their applications, software hazard analysis is required to be performed during the software development life cycle. A dynamic software hazard modeling and analysis method based on Colored Petri Net is proposed and applied to the safety-critical control software of a nuclear digital protection system in this paper. The analysis results show that the proposed method can explain the complex interactions between software and hardware and identify potential common cause failures in software properly and effectively. Moreover, the method can find the dominant software-induced hazards to safety control actions, which aids in increasing software quality.
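
    To make the modeling idea concrete, a minimal sketch of a Petri net executor (an ordinary place/transition net for brevity; Colored Petri Nets additionally attach typed data to tokens), written purely as illustration rather than as the paper's tool:

    # Minimal place/transition net: a transition fires when every input
    # place holds at least one token; firing moves tokens downstream.

    class PetriNet:
        def __init__(self, marking):
            self.marking = dict(marking)          # place -> token count
            self.transitions = {}                 # name -> (inputs, outputs)

        def add_transition(self, name, inputs, outputs):
            self.transitions[name] = (inputs, outputs)

        def enabled(self, name):
            inputs, _ = self.transitions[name]
            return all(self.marking.get(p, 0) >= 1 for p in inputs)

        def fire(self, name):
            if not self.enabled(name):
                raise RuntimeError(f"{name} is not enabled")
            inputs, outputs = self.transitions[name]
            for p in inputs:
                self.marking[p] -= 1
            for p in outputs:
                self.marking[p] = self.marking.get(p, 0) + 1

    # Hypothetical fragment of a protection channel: a sensor reading and a
    # healthy processor are both needed to issue a trip signal.
    net = PetriNet({"sensor_reading": 1, "processor_ok": 1})
    net.add_transition("issue_trip", ["sensor_reading", "processor_ok"], ["trip_signal"])
    if net.enabled("issue_trip"):
        net.fire("issue_trip")
    print(net.marking)   # {'sensor_reading': 0, 'processor_ok': 0, 'trip_signal': 1}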

  12. Economic Consequence Analysis of Disasters: The ECAT Software Tool

    Energy Technology Data Exchange (ETDEWEB)

    Rose, Adam; Prager, Fynn; Chen, Zhenhua; Chatterjee, Samrat; Wei, Dan; Heatwole, Nathaniel; Warren, Eric

    2017-04-15

    This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. This software tool is intended for use by various decision makers and analysts to obtain estimates rapidly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
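
    A minimal sketch of the reduced-form step described above (a synthetic stand-in for CGE output; numpy only): many simulation runs are summarized by a single regression of consequence on threat characteristics, which a user can then evaluate instantly:

    import numpy as np

    rng = np.random.default_rng(7)
    n_sim = 500

    # Hypothetical threat characteristics for each CGE simulation run.
    severity   = rng.uniform(1, 10, n_sim)     # physical intensity index
    duration   = rng.uniform(1, 90, n_sim)     # days of disruption
    resilience = rng.uniform(0, 1, n_sim)      # fraction of losses avoided

    # Synthetic stand-in for CGE model output: GDP loss in $billions.
    gdp_loss = (0.8 * severity + 0.05 * duration) * (1 - 0.6 * resilience) \
               + rng.normal(0, 0.5, n_sim)

    # Estimate the reduced-form equation by ordinary least squares.
    X = np.column_stack([np.ones(n_sim), severity, duration, resilience])
    beta, *_ = np.linalg.lstsq(X, gdp_loss, rcond=None)
    print("reduced-form coefficients:", beta.round(3))

    # Rapid-turnaround use: plug in one new scenario, no CGE run needed.
    scenario = np.array([1.0, 7.5, 30.0, 0.4])
    print("estimated GDP loss ($B):", float(scenario @ beta))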

  13. Optical spectroscopy versus mass spectrometry: The race for fieldable isotopic analysis

    International Nuclear Information System (INIS)

    Barshick, C.M.; Young, J.P.; Shaw, R.W.

    1995-01-01

    Several techniques have been developed to provide on-site isotopic analyses, including decay-counting and mass spectrometry, as well as methods that rely on the accessibility of optical transitions for isotopic selectivity (e.g., laser-induced fluorescence and optogalvanic spectroscopy). The authors have been investigating both mass spectrometry and optogalvanic spectroscopy for several years. Although others have considered these techniques for isotopic analysis, the authors have focussed on the use of a dc glow discharge for atomization and ionization, and a demountable discharge cell for rapid sample exchange. The authors' goal is a fieldable instrument that provides useful uranium isotope ratio information

  14. Optical spectroscopy of laser-produced plasmas for standoff isotopic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Harilal, Sivanandan S.; Brumfield, Brian E.; LaHaye, Nicole L.; Hartig, Kyle C.; Phillips, Mark C.

    2018-04-20

    This review article covers the present status of isotope detection through emission, absorption, and fluorescence spectroscopy of atoms and molecules in a laser-produced plasma formed from a solid sample. A description of the physics behind isotope shifts in atoms and molecules is presented, followed by the physics behind solid sampling of laser ablation plumes, optical methods for isotope measurements, the suitable physical conditions of laser-produced plasma plumes for isotopic analysis, and the current status. Finally, concluding remarks will be made on the existing gaps between previous works in the literature and suggestions for future work.

  15. High burn-up plutonium isotopic compositions recommended for use in shielding analysis

    International Nuclear Information System (INIS)

    Zimmerman, M.G.

    1977-06-01

    Isotopic compositions for plutonium generated and recycled in LWRs were estimated for use in shielding calculations. The values were obtained by averaging isotopic values from many sources in the literature. These isotopic values should provide the basis for a reasonable prediction of exposure rates from the range of LWR fuel expected in the future. The isotopic compositions given are meant to be used for shielding calculations, and the values are not necessarily applicable to other forms of analysis, such as inventory assessment or criticality safety. 11 tables, 2 figs

  16. Software use cases to elicit the software requirements analysis within the ASTRI project

    Science.gov (United States)

    Conforti, Vito; Antolini, Elisa; Bonnoli, Giacomo; Bruno, Pietro; Bulgarelli, Andrea; Capalbi, Milvia; Fioretti, Valentina; Fugazza, Dino; Gardiol, Daniele; Grillo, Alessandro; Leto, Giuseppe; Lombardi, Saverio; Lucarelli, Fabrizio; Maccarone, Maria Concetta; Malaguti, Giuseppe; Pareschi, Giovanni; Russo, Federico; Sangiorgi, Pierluca; Schwarz, Joseph; Scuderi, Salvatore; Tanci, Claudio; Tosti, Gino; Trifoglio, Massimo; Vercellone, Stefano; Zanmar Sanchez, Ricardo

    2016-07-01

    The Italian National Institute for Astrophysics (INAF) is leading the Astrofisica con Specchi a Tecnologia Replicante Italiana (ASTRI) project, whose main purpose is the realization of small size telescopes (SST) for the Cherenkov Telescope Array (CTA). The first goal of the ASTRI project has been the development and operation of an innovative end-to-end telescope prototype using a dual-mirror optical configuration (SST-2M) equipped with a camera based on silicon photo-multipliers and very fast read-out electronics. The ASTRI SST-2M prototype has been installed in Italy at the INAF "M.G. Fracastoro" Astronomical Station located at Serra La Nave, on Mount Etna, Sicily. This prototype will be used to test several mechanical, optical, control hardware and software solutions which will be used in the ASTRI mini-array, comprising nine telescopes proposed to be placed at the CTA southern site. The ASTRI mini-array is a collaborative and international effort led by INAF and carried out by Italy, Brazil and South Africa. We present here the use cases, through UML (Unified Modeling Language) diagrams and text details, that describe the functional requirements of the software that will manage the ASTRI SST-2M prototype, and the lessons learned thanks to these activities. We intend to adopt the same approach for the Mini Array Software System that will manage the ASTRI mini-array operations. Use cases are of importance for the whole software life cycle; in particular they provide valuable support to the validation and verification activities. Following the iterative development approach, which breaks down the software development into smaller chunks, we have analysed the requirements, developed, and then tested the code in repeated cycles. The use case technique allowed us to formalize the problem through user stories that describe how the user procedurally interacts with the software system. Through the use cases we improved the communication among team members, fostered

  17. New developments on COSI6, the simulation software for fuel cycle analysis

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Maryan; Boucher, Lionel [CEA, DEN, DER, SPRC, LECy, Centre de Cadarache, Batiment 230, Saint-Paul-lez-Durance, 13108 (France)

    2009-06-15

    COSI is a code simulating a pool of nuclear electricity generating plants together with its associated fuel cycle facilities. The code has been designed to study various short-, medium- and long-term options for the introduction of various types of nuclear reactors and for the use of the associated nuclear materials. COSI calculates the mass and the isotopic composition of all the materials, in each part of the nuclear park, at any time. Following the complete renewal of the code in 2006, new developments have been implemented to improve the physical models and user convenience and to enlarge the scope of utilisation. COSI can now be coupled with CESAR 5 (JEFF 2.2: 200 fission products available), allowing waste package calculations: high level waste (glasses), intermediate level waste (compacted waste), liquid and gaseous waste coming from processing, and reactor high level waste. These packages are managed in new kinds of facilities: finished warehouse, interim storage and waste disposal. The quantities calculable for waste packages are: mass (initial heavy metal), isotopic content, package mass, package volume, number of packages, activity, radiotoxicity, decay heat and necessary area. Developments on COSI are still ongoing. The reactivity equivalence function (based on the Baker formula) is now available: for the thorium cycle, compositions for [Th+Pu] or [Th+U] can be calculated taking into account the impact of all nuclides; for ADS, compositions for [MA+Pu] can be calculated in the same way. The physical model is also being developed: nuclear data (branching ratios, fission energies, fission yields) will differ between thermal reactors and fast reactors. The first step towards implementation of a proliferation resistance methodology is the evaluation of the physical data needed for proliferation evaluation: heating rate from Pu in material (Watt/kg), weight fraction of even isotopes
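
    The isotopic bookkeeping that COSI delegates to CESAR rests on the standard depletion balance; in generic one-group form (notation ours, not COSI's internal formulation):

        \frac{\mathrm{d}N_i}{\mathrm{d}t} = -\left(\lambda_i + \sigma_i\,\phi\right)N_i + \sum_{j\neq i}\left(b_{j\to i}\,\lambda_j + \sigma_{j\to i}\,\phi\right)N_j

    where N_i is the inventory of nuclide i, λ the decay constants, σ the one-group reaction cross sections, φ the neutron flux (zero during cooling and storage), and b_{j→i} the branching ratios; integrating this system along each reactor and back-end stage yields the masses and isotopic compositions mentioned above.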

  18. Software requirements definition Shipping Cask Analysis System (SCANS)

    International Nuclear Information System (INIS)

    Johnson, G.L.; Serbin, R.

    1985-01-01

    The US Nuclear Regulatory Commission (NRC) staff reviews the technical adequacy of applications for certification of designs of shipping casks for spent nuclear fuel. In order to confirm an acceptable design, the NRC staff may perform independent calculations. The current NRC procedure for confirming cask design analyses is laborious and tedious. Most of the work is currently done by hand or through the use of a remote computer network. The time required to certify a cask can be long. The review process may vary somewhat with the engineer doing the reviewing. Similarly, the documentation on the results of the review can also vary with the reviewer. To increase the efficiency of this certification process, LLNL was requested to design and write an integrated set of user-oriented, interactive computer programs for a personal microcomputer. The system is known as the NRC Shipping Cask Analysis System (SCANS). The computer codes and the software system supporting these codes are being developed and maintained for the NRC by LLNL. The objective of this system is generally to lessen the time and effort needed to review an application. Additionally, an objective of the system is to assure standardized methods and documentation of the confirmatory analyses used in the review of these cask designs. A software system should be designed based on NRC-defined requirements contained in a requirements document. The requirements document is a statement of a project's wants and needs as the users and implementers jointly understand them. The requirements document states the desired end products (i.e. WHAT's) of the project, not HOW the project provides them. This document describes the wants and needs for the SCANS system. 1 fig., 3 tabs

  19. Study of gamma ray analysis software's. Application to activation analysis of geological samples

    International Nuclear Information System (INIS)

    Silva, Luiz Roberto Nogueira da

    1998-01-01

    A comparative evaluation of the gamma-ray analysis software VISPECT, in relation to two commercial gamma-ray analysis software packages, OMNIGAM (EG and G Ortec) and SAMPO 90 (Canberra), was performed. For this evaluation, artificial gamma-ray spectra were created, presenting peaks of different intensities located at four different regions of the spectrum. Multiplet peaks with equal and different intensities, but with different channel separations, were also created. The results obtained showed a good performance of VISPECT in detecting and analysing single and multiplet peaks of different intensities in the gamma-ray spectrum. Neutron activation analysis of the geological reference material GS-N (IWG-GIT) and of the granite G-94, used in a Proficiency Testing Trial of Analytical Geochemistry Laboratories, was also performed, in order to evaluate the VISPECT software in the analysis of real samples. The results obtained using VISPECT were as good as or better than the ones obtained using the other programs. (author)
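
    The single-peak detection task these packages share can be illustrated with a minimal sketch (synthetic spectrum, scipy only; not VISPECT's actual algorithm):

    import numpy as np
    from scipy.signal import find_peaks

    # Synthetic 1024-channel spectrum: Gaussian peaks of different
    # intensities on a decaying continuum, plus Poisson counting noise.
    rng = np.random.default_rng(1)
    ch = np.arange(1024)
    truth = 200 * np.exp(-ch / 400.0)
    for centroid, area, sigma in [(150, 5000, 3.0), (520, 800, 3.0), (523, 600, 3.0)]:
        truth += area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-((ch - centroid) ** 2) / (2 * sigma ** 2))
    spectrum = rng.poisson(truth).astype(float)

    # Peak search: the prominence criterion separates real peaks from
    # noise fluctuations. The pair at channels 520/523 is a multiplet and
    # needs deconvolution by peak fitting rather than simple searching.
    peaks, props = find_peaks(spectrum, prominence=50)
    print(peaks, props["prominences"].round(1))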

  20. Lead isotopic compositions of environmental certified reference materials for an inter-laboratory comparison of lead isotope analysis

    International Nuclear Information System (INIS)

    Aung, Nyein Nyein; Uryu, Tsutomu; Yoshinaga, Jun

    2004-01-01

    Lead isotope ratios, viz. 207Pb/206Pb and 208Pb/206Pb, of the commercially available certified reference materials (CRMs) issued in Japan are presented, with the objective of providing a data set useful for the quality assurance of analytical procedures, instrumental performance and method validation in laboratories involved in environmental lead isotope ratio analysis. The analytical method used in the present study was inductively coupled plasma quadrupole mass spectrometry (ICP-QMS), preceded by acid digestion and with/without chemical separation of lead from the matrix. The precision of the measurements, in terms of the relative standard deviation (RSD) of triplicate analyses, was 0.19% and 0.14% for 207Pb/206Pb and 208Pb/206Pb, respectively. The trueness of the lead isotope ratio measurements of the present study was tested with a few CRMs which have been analyzed by other analytical methods and reported in the literature. The lead isotopic ratios of 18 environmental matrix CRMs (including 6 CRMs analyzed for our method validation) are presented and the distribution of their ratios is briefly discussed. (author)

  1. Potential application of gas chromatography to the analysis of hydrogen isotopes

    International Nuclear Information System (INIS)

    Warner, D.K.; Sprague, R.E.; Bohl, D.R.

    1976-01-01

    Gas chromatography is used at Mound Laboratory for the analysis of hydrogen isotopic impurities in gas mixtures. This instrumentation was used to study the applicability of the gas chromatography technique to the determination of the major components of hydrogen isotopic gas mixtures. The results of this study, including chromatograms and precision data, are presented

  2. Mobility and diet in Neolithic, Bronze Age and Iron Age Germany : evidence from multiple isotope analysis

    NARCIS (Netherlands)

    Oelze, Viktoria Martha

    2012-01-01

    Prehistoric human diet can be reconstructed by the analysis of carbon (C), nitrogen (N) and sulphur (S) stable isotopes in bone, whereas ancient mobility and provenance can be studied using the isotopes of strontium (Sr) and oxygen (O) in tooth enamel, and of sulphur in bone. Although thirty years

  3. The Software Therapist: Usability Problem Diagnosis Through Latent Semantic Analysis

    National Research Council Canada - National Science Library

    Sparks, Randall; Hartson, Rex

    2006-01-01

    The work we report on here addresses the problem of low return on investment in software usability engineering and offers support for usability practitioners in identifying, understanding, documenting...
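
    For readers unfamiliar with the technique named in the title, a minimal sketch of latent semantic analysis applied to usability problem descriptions (hypothetical corpus; scikit-learn): TF-IDF vectors are reduced by truncated SVD so that reports sharing latent topics score as similar even with little word overlap:

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.metrics.pairwise import cosine_similarity

    # Hypothetical usability problem reports.
    reports = [
        "user cannot find the save button in the toolbar",
        "save icon is hard to locate on the tool bar",
        "error message after login is confusing",
        "login failure text does not explain the problem",
    ]

    # Latent semantic analysis: TF-IDF term-document matrix, then rank
    # reduction by truncated SVD to expose shared latent topics.
    tfidf = TfidfVectorizer().fit_transform(reports)
    lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

    # Reports 0/1 and 2/3 should pair up despite limited word overlap.
    print(cosine_similarity(lsa).round(2))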

  4. HistFitter software framework for statistical data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Baak, M. [CERN, Geneva (Switzerland); Besjes, G.J. [Radboud University Nijmegen, Nijmegen (Netherlands); Nikhef, Amsterdam (Netherlands); Cote, D. [University of Texas, Arlington (United States); Koutsman, A. [TRIUMF, Vancouver (Canada); Lorenz, J. [Ludwig-Maximilians-Universitaet Muenchen, Munich (Germany); Excellence Cluster Universe, Garching (Germany); Short, D. [University of Oxford, Oxford (United Kingdom)

    2015-04-15

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface. (orig.)

  5. Development and applications of Kramers-Kronig PEELS analysis software

    International Nuclear Information System (INIS)

    Fan, X. D.; Peng, J.L.; Bursill, L.A.

    1997-01-01

    A Kramers-Kronig analysis program was developed as a custom function for the GATAN parallel electron energy loss spectroscopy (PEELS) software package EL/P. When used with a JEOL 4000EX high-resolution transmission electron microscope, this program allows the dielectric functions of materials to be measured with an energy resolution of approximately 1.4 eV. The imaginary part of the dielectric function is particularly useful, since it allows the magnitude of the band gap to be determined for relatively wide-gap materials. More importantly, changes in the gap may be monitored at high spatial resolution when used in conjunction with the HRTEM images. The principles of the method are described and applications are presented for Type-1a gem quality diamond, before and after neutron irradiation. The former shows a band gap of about 5.8 eV, as expected, whereas for the latter the gap appears to be effectively collapsed. The core-loss spectra confirm that Type-1a diamond has pure sp3 tetrahedral bonding, whereas the neutron-irradiated diamond has mixed sp2/sp3 bonding. Analysis of the low-loss spectra for the neutron-irradiated specimen yielded a density of 1.6 g/cm3, approximately half that of diamond. 10 refs., 2 figs
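
    The transform underlying this analysis is the standard Kramers-Kronig relation used in EELS, which recovers the real part of the inverse dielectric function from the measured loss function:

        \operatorname{Re}\!\left[\frac{1}{\varepsilon(E)}\right] = 1 - \frac{2}{\pi}\,\mathcal{P}\!\int_{0}^{\infty} \operatorname{Im}\!\left[\frac{-1}{\varepsilon(E')}\right]\frac{E'\,\mathrm{d}E'}{E'^{2}-E^{2}}

    where P denotes the Cauchy principal value; with Im[−1/ε] taken from the single-scattering distribution, the real and imaginary parts of ε follow algebraically.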

  6. HistFitter software framework for statistical data analysis

    International Nuclear Information System (INIS)

    Baak, M.; Besjes, G.J.; Cote, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface. (orig.)

  7. An analysis software of tritium distribution in food and environmental water in China

    International Nuclear Information System (INIS)

    Li Wenhong; Xu Cuihua; Ren Tianshan; Deng Guilong

    2006-01-01

    Objective: The purpose of developing this software for the analysis of the distribution of tritium in food and environmental water is to collect tritium monitoring data, to analyze the data automatically, statistically and graphically, and to facilitate study and sharing of the data. Methods: Based on the data obtained previously, the analysis software was written using VC++.NET as the development tool. The software first transfers data from EXCEL into a database. It has a data-append function, so operators can incorporate new monitoring data easily. Results: After the monitoring data, saved as EXCEL files by the original researchers, were turned into a database, they became easily accessible. The software provides a tool for distribution analysis of tritium. Conclusion: This software is a first attempt at analyzing data on tritium levels in food and environmental water in China. Data retrieval, searching and analysis become easy and direct with the software. (authors)
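
    The spreadsheet-to-database step described above can be sketched as follows (Python with pandas and sqlite3 as illustrative stand-ins for the VC++.NET tool; the workbook schema is hypothetical):

    import sqlite3
    import pandas as pd

    # Hypothetical workbook: columns Region, Year, SampleType, Tritium_BqL.
    df = pd.read_excel("tritium_monitoring.xlsx")

    # Load the spreadsheet into a queryable database table (append mode
    # mirrors the data-append function mentioned above).
    conn = sqlite3.connect("tritium.db")
    df.to_sql("measurements", conn, if_exists="append", index=False)

    # Distribution analysis: summary statistics per region and sample type.
    summary = pd.read_sql(
        "SELECT Region, SampleType, COUNT(*) AS n, AVG(Tritium_BqL) AS mean_BqL "
        "FROM measurements GROUP BY Region, SampleType",
        conn,
    )
    print(summary)
    conn.close()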

  8. Software selection based on analysis and forecasting methods, practised in 1C

    Science.gov (United States)

    Vazhdaev, A. N.; Chernysheva, T. Y.; Lisacheva, E. I.

    2015-09-01

    The research focuses on the built-in mechanisms of the "1C: Enterprise 8" platform for data analysis and forecasting. It is important to evaluate and select proper software to develop effective strategies for customer relationship management in terms of sales, as well as for the implementation and further maintenance of software. The research data allow the creation of new forecast models to plan further software distribution.

  9. Spacelab user implementation assessment study. (Software requirements analysis). Volume 2: Technical report

    Science.gov (United States)

    1976-01-01

    The engineering analyses and evaluation studies conducted for the Software Requirements Analysis are discussed. Included are the development of the study data base, synthesis of implementation approaches for software required by both mandatory onboard computer services and command/control functions, and identification and implementation of software for ground processing activities.

  10. IsoDesign: a software for optimizing the design of 13C-metabolic flux analysis experiments.

    Science.gov (United States)

    Millard, Pierre; Sokol, Serguei; Letisse, Fabien; Portais, Jean-Charles

    2014-01-01

    The growing demand for 13C-metabolic flux analysis (13C-MFA) in the field of metabolic engineering and systems biology is driving the need to rationalize expensive and time-consuming 13C-labeling experiments. Experimental design is a key step in improving both the number of fluxes that can be calculated from a set of isotopic data and the precision of flux values. We present IsoDesign, a software that enables these parameters to be maximized by optimizing the isotopic composition of the label input. It can be applied to 13C-MFA investigations using a broad panel of analytical tools (MS, MS/MS, 1H NMR, 13C NMR, etc.) individually or in combination. It includes a visualization module to intuitively select the optimal label input depending on the biological question to be addressed. Applications of IsoDesign are described, with an example of the entire 13C-MFA workflow from the experimental design to the flux map, including important practical considerations. IsoDesign makes the experimental design of 13C-MFA experiments more accessible to a wider biological community. IsoDesign is distributed under an open source license at http://metasys.insa-toulouse.fr/software/isodes/ © 2013 Wiley Periodicals, Inc.

  11. Empirical analysis of change metrics for software fault prediction

    NARCIS (Netherlands)

    Choudhary, Garvit Rajesh; Kumar, Sandeep; Kumar, Kuldeep; Mishra, Alok; Catal, Cagatay

    2018-01-01

    A quality assurance activity, known as software fault prediction, can reduce development costs and improve software quality. The objective of this study is to investigate change metrics in conjunction with code metrics to improve the performance of fault prediction models. Experimental studies are

  12. An Analysis of Open Source Security Software Products Downloads

    Science.gov (United States)

    Barta, Brian J.

    2014-01-01

    Despite the continued demand for open source security software, a gap in the identification of success factors related to the success of open source security software persists. There are no studies that accurately assess the extent of this persistent gap, particularly with respect to the strength of the relationships of open source software…

  13. Multi-criteria decision analysis methods and software

    CERN Document Server

    Ishizaka, Alessio

    2013-01-01

    This book presents an introduction to MCDA followed by more detailed chapters about each of the leading methods used in this field. Comparison of methods and software is also featured to enable readers to choose the most appropriate method needed in their research. Worked examples as well as the software featured in the book are available on an accompanying website.

  14. A pattern framework for software quality assessment and tradeoff analysis

    NARCIS (Netherlands)

    Folmer, Eelke; Boscht, Jan

    The earliest design decisions often have a significant impact on software quality and are the most costly to revoke. One of the challenges in architecture design is to reduce the frequency of retrofit problems in software designs; not being able to improve the quality of a system cost effectively, a

  15. Dissolution of barite for the analysis of strontium isotopes and other chemical and isotopic variations using aqueous sodium carbonate

    Science.gov (United States)

    Breit, G.N.; Simmons, E.C.; Goldhaber, M.B.

    1985-01-01

    A simple procedure for preparing barite samples for chemical and isotopic analysis is described. Sulfate ion in barite, in the presence of high concentrations of aqueous sodium carbonate, is replaced by carbonate. This replacement forms insoluble carbonates with the cations commonly found in barite: Ba, Sr, Ca and Pb. Sulfate is released into the solution by the carbonate replacement and is separated by filtration. The aqueous sulfate can then be reprecipitated for analysis of the sulfur and oxygen isotopes. The cations in the carbonate phase can be dissolved by acidifying the solid residue. Sr can be separated from the solution for Sr isotope analysis by ion-exchange chromatography. The sodium carbonate used contains amounts of Sr which will affect almost all barite 87Sr/86Sr ratios by less than 0.00001 at 1.95σ of the mean. The procedure is preferred over other techniques used for preparing barite samples for the determination of 87Sr/86Sr ratios because it is simple, rapid and enables simultaneous determination of many compositional parameters on the same material. © 1985.

  16. Analysis method for beta-gamma coincidence spectra from radio-xenon isotopes

    International Nuclear Information System (INIS)

    Yang Wenjing; Yin Jingpeng; Huang Xiongliang; Cheng Zhiwei; Shen Maoquan; Zhang Yang

    2012-01-01

    Radio-xenon isotope monitoring is one important method for the verification of the CTBT; it includes the measurement methods of HPGe γ spectrometry and β-γ coincidence counting. This article describes the analysis flowchart and method for three-dimensional β-γ coincidence spectra from β-γ systems, analyses in detail the principles and methods for defining the regions of interest of coincidence spectra and for subtracting the interference, and finally gives the formulae for the radioactivity of the xenon isotopes and the minimum detectable concentrations. Studying the principles of three-dimensional β-γ coincidence spectra supplies the foundation for designing the software of β-γ coincidence systems. (authors)
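
    The minimum detectable concentration mentioned above is conventionally built on Currie's detection limit; a hedged sketch of the standard relations (textbook form, not necessarily this paper's exact formulae):

        L_D = 2.71 + 4.65\,\sqrt{B}, \qquad \mathrm{MDC} = \frac{L_D}{\varepsilon_\beta\,\varepsilon_\gamma\,I\,t\,F_d}

    where B is the background count in the β-γ region of interest, ε_β and ε_γ the detection efficiencies, I the absolute intensity of the coincident emission, t the counting live time, and F_d a factor correcting for decay during sampling, processing and counting.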

  17. Swallowing quantitative analysis software [Software para análise quantitativa da deglutição]

    Directory of Open Access Journals (Sweden)

    André Augusto Spadotto

    2008-02-01

    OBJECTIVE: The present paper is aimed at introducing software to allow a detailed analysis of the swallowing dynamics. MATERIALS AND METHODS: The sample included ten (six male and four female) stroke patients, with a mean age of 57.6 years. Swallowing videofluoroscopy was performed and the images were digitized for posterior analysis of the pharyngeal transit time with the aid of a chronometer and of the software. RESULTS: Differences were observed in the average pharyngeal swallowing transit time between measurements with the chronometer and with the software. CONCLUSION: This software is a useful tool for the analysis of parameters such as swallowing time and speed, allowing a better understanding of the swallowing dynamics, both in the clinical approach to patients with oropharyngeal dysphagia and for scientific research purposes.

  18. Disposal criticality analysis methodology's principal isotope burnup credit

    International Nuclear Information System (INIS)

    Doering, T.W.; Thomas, D.A.

    2001-01-01

    This paper presents the burnup credit aspects of the United States Department of Energy Yucca Mountain Project's methodology for performing criticality analyses for commercial light-water-reactor fuel. The disposal burnup credit methodology uses a 'principal isotope' model, which takes credit for the reduced reactivity associated with the build-up of the principal actinides and fission products in irradiated fuel. Burnup credit is important to the disposal criticality analysis methodology and to the design of commercial fuel waste packages. The burnup credit methodology developed for disposal of irradiated commercial nuclear fuel can also be applied to its storage and transportation. For all applications, a series of loading curves is developed using a best-estimate methodology and, depending on the application, an additional administrative safety margin may be applied. The burnup credit methodology better represents the 'true' reactivity of the irradiated fuel configuration, and hence the real safety margin, than evaluations using the 'fresh fuel' assumption do. (author)

  19. Paradigms in isotope dilution mass spectrometry for elemental speciation analysis

    International Nuclear Information System (INIS)

    Meija, Juris; Mester, Zoltan

    2008-01-01

    Isotope dilution mass spectrometry currently stands out as the method providing results with unchallenged precision and accuracy in elemental speciation. However, the recent history of isotope dilution mass spectrometry has shown that the extent to which this primary ratio measurement method can deliver accurate results is still a subject of active research. In this review, we summarize the fundamental prerequisites behind isotope dilution mass spectrometry and discuss their practical limits of validity and their effects on the accuracy of the obtained results. This review is not to be viewed as a critique of isotope dilution; rather, its purpose is to highlight the lesser-studied aspects that will secure and strengthen the current primacy of the results obtained from this method.
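
    For readers unfamiliar with the principle, the core isotope dilution relation is standard textbook material rather than something taken from this record. For two isotopes a and b of the analyte, with ratio R = n_a/n_b, blending a sample x with a known amount of isotopically enriched spike y gives, in one common form:

        \[
        n_b^{x} \;=\; n_b^{y}\,\frac{R_y - R_b}{R_b - R_x},
        \qquad R \equiv \frac{n_a}{n_b},
        \]

    where R_x, R_y and R_b are the measured a/b ratios of the sample, the spike and the blend, and n_b^y is the known amount of isotope b added with the spike. Converting from isotope amount to total element amount then uses the isotopic abundances; subscript and normalization conventions vary between texts.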

  20. RNAstructure: software for RNA secondary structure prediction and analysis.

    Science.gov (United States)

    Reuter, Jessica S; Mathews, David H

    2010-03-15

    To understand an RNA sequence's mechanism of action, the structure must be known. Furthermore, target RNA structure is an important consideration in the design of small interfering RNAs and antisense DNA oligonucleotides. RNA secondary structure prediction, using thermodynamics, can be used to develop hypotheses about the structure of an RNA sequence. RNAstructure is a software package for RNA secondary structure prediction and analysis. It uses thermodynamics and utilizes the most recent set of nearest neighbor parameters from the Turner group. It includes methods for secondary structure prediction (using several algorithms), prediction of base pair probabilities, bimolecular structure prediction, and prediction of a structure common to two sequences. This contribution describes new extensions to the package, including a library of C++ classes for incorporation into other programs, a user-friendly graphical user interface written in JAVA, and new Unix-style text interfaces. The original graphical user interface for Microsoft Windows is still maintained. The extensions to RNAstructure serve to make RNA secondary structure prediction user-friendly. The package is available for download from the Mathews lab homepage at http://rna.urmc.rochester.edu/RNAstructure.html.

  1. Analysis on Influential Functions in the Weighted Software Network

    Directory of Open Access Journals (Sweden)

    Haitao He

    2018-01-01

    Full Text Available Identifying influential nodes is important for software in terms of understanding its design patterns and controlling the development and maintenance process. However, there have so far been no efficient methods for discovering them. Based on the invoking-dependency relationships between the nodes, this paper proposes a novel approach to defining node importance for mining the influential software nodes. First, from information on multiple executions, we construct a weighted software network (WSN) to represent the software's execution dependency structure. Second, considering the invocation counts and out-degrees of software nodes, we improve on the PageRank method and put forward a targeted algorithm, FunctionRank, to evaluate node importance (NI) in the weighted software network; a node with a larger NI value has higher influence. Finally, by comparing the NI of nodes, we can obtain the most influential nodes in the software network. The experimental results show that the proposed approach performs well in identifying the influential nodes.
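
    FunctionRank itself is not specified in this record; as a rough, hedged sketch of the general idea, the following implements a plain weighted PageRank over an invocation graph, with edge weights standing in for invocation counts. The damping factor, tolerance and toy graph are illustrative assumptions, not values from the paper.

        def weighted_pagerank(edges, damping=0.85, tol=1e-9, max_iter=200):
            """edges: dict mapping (caller, callee) -> invocation count.
            Returns a dict of node scores. A generic weighted PageRank,
            not the FunctionRank algorithm from the cited paper."""
            nodes = {n for e in edges for n in e}
            out_weight = {n: 0.0 for n in nodes}
            for (u, v), w in edges.items():
                out_weight[u] += w
            rank = {n: 1.0 / len(nodes) for n in nodes}
            for _ in range(max_iter):
                new = {n: (1.0 - damping) / len(nodes) for n in nodes}
                for (u, v), w in edges.items():
                    new[v] += damping * rank[u] * w / out_weight[u]
                # nodes with no outgoing calls spread their rank uniformly
                dangling = damping * sum(rank[n] for n in nodes
                                         if out_weight[n] == 0.0)
                for n in nodes:
                    new[n] += dangling / len(nodes)
                if sum(abs(new[n] - rank[n]) for n in nodes) < tol:
                    return new
                rank = new
            return rank

        calls = {("main", "parse"): 10, ("main", "run"): 3, ("run", "parse"): 7}
        print(weighted_pagerank(calls))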

  2. Experimental analysis of specification language impact on NPP software diversity

    International Nuclear Information System (INIS)

    Yoo, Chang Sik; Seong, Poong Hyun

    1998-01-01

    When redundancy and diversity are applied in an NPP digital computer system, diversification of the system software may be critical to the dependability of the entire system. As a means of enhancing software diversity, specification language diversity is suggested in this study. We set up a simple hypothesis on the impact of the specification language on common errors, and performed an experiment based on an NPP protection system application. The experimental results showed that this hypothesis could be justified and that specification language diversity is effective in overcoming the software common-mode failure problem.

  3. Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages

    Science.gov (United States)

    Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro

    2017-01-01

    Computer Assisted Qualitative Data Analysis Software (CAQDAS) are tools that help researchers to develop qualitative research projects. These software packages help the users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…

  4. Isotope dilution analysis for urinary fentanyl and its main metabolite, norfentanyl, in patients by isotopic fractionation using capillary gas chromatography

    Energy Technology Data Exchange (ETDEWEB)

    Sera, Shoji; Goromaru, Tsuyoshi [Fukuyama Univ., Hiroshima (Japan). Faculty of Pharmacy and Pharmaceutical Sciences; Sameshima, Teruko; Kawasaki, Koichi; Oda, Toshiyuki

    1998-07-01

    Isotope dilution analysis was applied to determine urinary excretion of fentanyl (FT) and its main metabolite, norfentanyl (Nor-FT), by isotopic fractionation using a capillary gas chromatograph equipped with a surface ionization detector (SID). Urinary FT was determined quantitatively in the range of 0.4-40 ng/ml using deuterium labeled FT (FT-{sup 2}H{sub 19}), as an internal standard. We also performed isotope dilution analysis of Nor-FT in urine. N-Alkylation was necessary to sensitively detect Nor-FT with SID. Methyl derivative was selected from 3 kinds of N-alkyl derivatives to increase sensitivity and peak resolution, and to prevent interference with urinary compound. Nor-FT concentration was quantitatively determined in the range of 10-400 ng/ml using deuterium labeled Nor-FT (Nor-FT-{sup 2}H{sub 10}). No endogenous compounds or concomitant drugs interfered with the detection of FT and Nor-FT in the urine of patients. The present method will be useful for pharmacokinetic studies and the evaluation of drug interactions in FT metabolism. (author)

  5. Isotope dilution analysis for urinary fentanyl and its main metabolite, norfentanyl, in patients by isotopic fractionation using capillary gas chromatography

    International Nuclear Information System (INIS)

    Sera, Shoji; Goromaru, Tsuyoshi; Sameshima, Teruko; Kawasaki, Koichi; Oda, Toshiyuki

    1998-01-01

    Isotope dilution analysis was applied to determine urinary excretion of fentanyl (FT) and its main metabolite, norfentanyl (Nor-FT), by isotopic fractionation using a capillary gas chromatograph equipped with a surface ionization detector (SID). Urinary FT was determined quantitatively in the range of 0.4-40 ng/ml using deuterium labeled FT (FT- 2 H 19 ), as an internal standard. We also performed isotope dilution analysis of Nor-FT in urine. N-Alkylation was necessary to sensitively detect Nor-FT with SID. Methyl derivative was selected from 3 kinds of N-alkyl derivatives to increase sensitivity and peak resolution, and to prevent interference with urinary compound. Nor-FT concentration was quantitatively determined in the range of 10-400 ng/ml using deuterium labeled Nor-FT (Nor-FT- 2 H 10 ). No endogenous compounds or concomitant drugs interfered with the detection of FT and Nor-FT in the urine of patients. The present method will be useful for pharmacokinetic studies and the evaluation of drug interactions in FT metabolism. (author)

  6. Development of the software dead time methodology for the 4πβ-γ software coincidence system analysis program

    International Nuclear Information System (INIS)

    Toledo, Fabio de; Brancaccio, Franco; Dias, Mauro da Silva

    2009-01-01

    The Laboratorio de Metrologia Nuclear (LMN, Nuclear Metrology Laboratory) at IPEN-CNEN/SP, Sao Paulo, Brazil, developed a new Software Coincidence System (SCS) for 4πβ-γ radioisotope standardization. SCS is composed of the data acquisition hardware, which records the coincidence data, and the coincidence data analysis program, which calculates the radioactive activity of the target sample. Owing to the intrinsic signal-sampling characteristics of the hardware, a single saturated pulse can produce multiple undesired recordings; pulse pileup also leads to bad recordings. Because the beta counting rates are much greater than the gamma rates, owing to the high beta detection efficiency of the 4π geometry, multiple pulse recordings significantly inflate the beta counts and, consequently, the calculated activity value. To minimize the effect of such bad recordings, a software dead-time value was introduced in the coincidence analysis program, under development at LMN, to discard multiple recordings caused by pulse pileup or saturation. This work presents the methodology developed to determine the optimal software dead-time value for attaining more accurate results, and discusses the results, pointing out possibilities for software improvement. (author)
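
    The record does not give the filtering rule itself. As a minimal sketch, assuming a non-extending (non-paralyzable) dead time, a software dead time is commonly applied to an event list by discarding any event that falls within a fixed window after the last accepted event; the timestamps and window below are invented.

        def apply_software_dead_time(timestamps, dead_time):
            """Keep only events separated from the last accepted event by
            more than dead_time (non-extending dead time). timestamps must
            be sorted in ascending order; units are arbitrary but
            consistent. Illustrative sketch, not the LMN implementation."""
            accepted = []
            last = None
            for t in timestamps:
                if last is None or (t - last) > dead_time:
                    accepted.append(t)
                    last = t
            return accepted

        # A saturated pulse recorded three times within 2 us collapses to one event
        events_us = [0.0, 0.4, 1.1, 10.0, 10.3, 25.0]
        print(apply_software_dead_time(events_us, dead_time=2.0))  # [0.0, 10.0, 25.0]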

  7. Software safety analysis on the model specified by NuSCR and SMV input language at requirements phase of software development life cycle using SMV

    International Nuclear Information System (INIS)

    Koh, Kwang Yong; Seong, Poong Hyun

    2005-01-01

    A safety-critical software process is composed of a development process, a verification and validation (V and V) process, and a safety analysis process. The safety analysis process has often been treated as an add-on and is not found in a conventional software process, but software safety analysis (SSA) is required if software is applied to a safety system, and the SSA shall be performed independently for the safety software throughout the software development life cycle (SDLC). Of all the phases in software development, requirements engineering is generally considered to play the most critical role in determining the overall software quality; NASA data demonstrate that nearly 75% of failures found in operational software were caused by errors in the requirements. The verification process in the requirements phase checks the correctness of the software requirements specification, and the safety analysis process analyzes the safety-related properties in detail. In this paper, a method for safety analysis at the requirements phase of the software development life cycle using the symbolic model verifier (SMV) is proposed. Hazards are discovered by hazard analysis and, in order to use SMV for the safety analysis, the safety-related properties are expressed in computation tree logic (CTL)
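
    As a hedged illustration of what such a CTL property can look like (the predicates are hypothetical, not taken from the record), a typical safety requirement for a protection function might be written as

        \[
        \mathbf{AG}\left( \mathit{PressureHigh} \rightarrow \mathbf{AF}\ \mathit{TripSignal} \right)
        \]

    read as: on every execution path, globally, whenever the pressure-high condition holds, a trip signal eventually follows on all paths. A model checker such as SMV verifies formulas of this kind exhaustively against the model.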

  8. BEANS - a software package for distributed Big Data analysis

    Science.gov (United States)

    Hypki, Arkadiusz

    2018-03-01

    The BEANS software is a new web-based tool, easy to install and maintain, for storing and analysing massive amounts of data in a distributed way. It provides a clear interface for querying, filtering, aggregating, and plotting data from an arbitrary number of datasets. Its main purpose is to simplify the process of storing, examining and finding new relations in huge datasets. The software answers a growing need of the astronomical community for a versatile tool to store, analyse and compare complex astrophysical numerical simulations with observations (e.g. simulations of the Galaxy or star clusters with the Gaia archive). However, the software was built in a general form and is ready for use in any other research field. It can also be used as a building block for other open source software.

  9. User-friendly software for SANS data reduction and analysis

    International Nuclear Information System (INIS)

    Biemann, P.; Haese-Seiller, M.; Staron, P.

    1999-01-01

    At the Geesthacht Neutron Facility (GeNF), new software is being developed for the reduction of two-dimensional small-angle neutron scattering (SANS) data. The main motivation for this work was to create software for users of our SANS facilities that is easy to use; another was to provide users with software they can also run at their home institutes. The software is therefore implemented on a personal computer running WINDOWS. The program reads raw data from an area detector in binary or ASCII format and produces ASCII files containing the scattering curve. The cross section can be averaged over the whole area of the detector or over user-defined sectors only. Scripts can be created for processing large numbers of files. (author)
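
    The reduction step described, collapsing a 2D detector image into a 1D scattering curve averaged over the full detector or over sectors, is a standard operation. The sketch below is a generic illustration under assumed geometry (beam centre in pixel coordinates, small-angle expression for q, all lengths in metres) and is not the GeNF code.

        import numpy as np

        def sector_average(image, center, pixel_size, distance, wavelength,
                           n_bins=100, sector=None):
            """Azimuthally average a 2D SANS detector image into I(q).
            sector: optional (phi_min, phi_max) in radians restricting the
            average to a wedge. Generic sketch with assumed geometry."""
            ny, nx = image.shape
            y, x = np.indices((ny, nx))
            dx = (x - center[0]) * pixel_size
            dy = (y - center[1]) * pixel_size
            r = np.hypot(dx, dy)
            theta = 0.5 * np.arctan(r / distance)         # scattering half-angle
            q = 4.0 * np.pi / wavelength * np.sin(theta)  # momentum transfer
            mask = np.ones_like(image, dtype=bool)
            if sector is not None:
                phi = np.arctan2(dy, dx)
                mask &= (phi >= sector[0]) & (phi <= sector[1])
            bins = np.linspace(0.0, q[mask].max(), n_bins + 1)
            idx = np.clip(np.digitize(q[mask], bins) - 1, 0, n_bins - 1)
            counts = np.bincount(idx, weights=image[mask], minlength=n_bins)
            npix = np.bincount(idx, minlength=n_bins)
            with np.errstate(invalid="ignore"):
                intensity = counts / npix                 # NaN where bin is empty
            return 0.5 * (bins[:-1] + bins[1:]), intensity

        # Toy usage: 128x128 detector, 1 cm pixels, 4 m distance, 6 A neutrons
        img = np.random.poisson(5.0, (128, 128)).astype(float)
        q_vals, i_q = sector_average(img, center=(64, 64), pixel_size=0.01,
                                     distance=4.0, wavelength=6.0e-10)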

  10. Multi-channel software defined radio experimental evaluation and analysis

    CSIR Research Space (South Africa)

    Van der Merwe, JR

    2014-09-01

    Full Text Available Multi-channel software-defined radios (SDRs) can be utilised as inexpensive prototyping platforms for transceiver arrays. The application for multi-channel prototyping is discussed and measured results of coherent channels for both receiver...

  11. On the interference of Kr during carbon isotope analysis of methane using continuous-flow combustion–isotope ratio mass spectrometry

    NARCIS (Netherlands)

    Schmitt, J.; Seth, B.; Bock, M; van der Veen, C.; Möller, L.; Sapart, C.J.; Prokopiou, M.; Sowers, T.; Röckmann, T.; Fischer, H

    2013-01-01

    Stable carbon isotope analysis of methane (δ13C of CH4) on atmospheric samples is one key method to constrain the current and past atmospheric CH4 budget. A frequently applied measurement technique is gas chromatography (GC) isotope ratio mass spectrometry (IRMS) coupled to a

  12. First stable isotope analysis of Asiatic wild ass tail hair from the Mongolian Gobi.

    Science.gov (United States)

    Horacek, Micha; Sturm, Martina Burnik; Kaczensky, Petra

    Stable isotope analysis has become a powerful tool for studying feeding ecology, water use and movement patterns in contemporary, historic and ancient species. Certain hairs and teeth grow continuously and, when sampled longitudinally, can provide temporally explicit information on dietary regime and movement pattern. In an initial trial, we analysed a tail sample of an Asiatic wild ass (Equus hemionus) from the Mongolian Gobi. We found seasonal variations in the H, C and N isotope patterns, likely the result of temporal variations in available feeds, water supply and possibly physiological status. Stable isotope analysis thus shows promise for studying the comparative ecology of the three autochthonous equid species in the Mongolian Gobi.

  13. An online database for plant image analysis software tools

    OpenAIRE

    Lobet, Guillaume; Draye, Xavier; Périlleux, Claire

    2013-01-01

    Background: Recent years have seen an increase in methods for plant phenotyping using image analyses. These methods require new software solutions for data extraction and treatment. These solutions are instrumental in supporting various research pipelines, ranging from the localisation of cellular compounds to the quantification of tree canopies. However, due to the variety of existing tools and the lack of central repository, it is challenging for researchers to identify the software that is...

  14. Prototype Software for Automated Structural Analysis of Systems

    DEFF Research Database (Denmark)

    Jørgensen, A.; Izadi-Zamanabadi, Roozbeh; Kristensen, M.

    2004-01-01

    In this paper we present a prototype software tool that is developed to analyse the structural model of automated systems in order to identify redundant information that is hence utilized for Fault detection and Isolation (FDI) purposes. The dedicated algorithms in this software tool use a tri-partite graph that represents the structural model of the system. A component-based approach has been used to address issues such as system complexity and reconfigurability possibilities.

  15. Prototype Software for Automated Structural Analysis of Systems

    DEFF Research Database (Denmark)

    Jørgensen, A.; Izadi-Zamanabadi, Roozbeh; Kristensen, M.

    2004-01-01

    In this paper we present a prototype software tool that is developed to analyse the structural model of automated systems in order to identify redundant information that is hence utilized for Fault detection and Isolation (FDI) purposes. The dedicated algorithms in this software tool use a tri-partite graph that represents the structural model of the system. A component-based approach has been used to address issues such as system complexity and reconfigurability possibilities.

  16. Network-based analysis of software change propagation.

    Science.gov (United States)

    Wang, Rongcun; Huang, Rubing; Qu, Binbin

    2014-01-01

    Object-oriented software systems frequently evolve to meet new change requirements. Understanding the characteristics of changes helps testers and system designers improve the quality of software, and identifying important modules becomes a key issue in the process of evolution. In this context, a novel network-based approach is proposed to comprehensively investigate change distributions and the correlation between centrality measures and the scope of change propagation. First, software dependency networks are constructed at the class level. Then, the number of co-changes among classes is mined from software repositories. From the dependency relationships and the number of co-changes among classes, the scope of change propagation is calculated. Spearman rank correlation is used to analyze the correlation between centrality measures and the scope of change propagation. Three case studies on the Java open source software projects Findbugs, Hibernate, and Spring are conducted to investigate the characteristics of change propagation. Experimental results show that (i) the change distribution is very uneven; and (ii) PageRank, Degree, and CIRank are significantly correlated to the scope of change propagation. In particular, CIRank shows a higher correlation coefficient, which suggests it can be a more useful indicator for measuring the scope of change propagation of classes in an object-oriented software system.
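
    As a hedged sketch of the correlation step described (the dependency graph and the per-class propagation scores, which the paper mines from repositories, are toy stand-ins here, and CIRank is not reimplemented), standard centrality measures can be correlated with a propagation score using networkx and scipy:

        import networkx as nx
        from scipy.stats import spearmanr

        # Toy class-level dependency graph and per-class change-propagation
        # scores; all values are invented for illustration.
        g = nx.DiGraph([("A", "B"), ("B", "C"), ("A", "C"), ("C", "D")])
        propagation_scope = {"A": 5.0, "B": 3.0, "C": 4.0, "D": 1.0}

        nodes = list(g.nodes)
        for name, centrality in [("degree", nx.degree_centrality(g)),
                                 ("pagerank", nx.pagerank(g))]:
            rho, p = spearmanr([centrality[n] for n in nodes],
                               [propagation_scope[n] for n in nodes])
            print(f"{name}: rho={rho:.2f} (p={p:.2f})")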

  17. Growth history of cultured pearl oysters based on stable oxygen isotope analysis

    Science.gov (United States)

    Nakashima, R.; Furuta, N.; Suzuki, A.; Kawahata, H.; Shikazono, N.

    2007-12-01

    We investigated the oxygen isotopic ratio in shells of the pearl oyster Pinctada martensii cultivated in embayments in Mie Prefecture, central Japan, to evaluate the biomineralization of shell structures of the species and its pearls in response to environmental change. Microsamples for oxygen isotope analysis were collected from the surfaces of shells (outer, middle, and inner shell layers) and pearls. Water temperature variations were estimated from the oxygen isotope values of the carbonate. Oxygen isotope profiles of the prismatic calcite of the outer shell layer reflected seasonal variations of water temperature, whereas those of nacreous aragonites of the middle and inner shell layers and pearls recorded temperatures from April to November, June to September, and July to September, respectively. Lower temperatures in autumn and winter might slow the growth of nacreous aragonites. The oxygen isotope values are controlled by both variations of water temperature and shell structures; the prismatic calcite of this species is useful for reconstructing seasonal changes of calcification temperature.
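
    The record converts shell δ18O to calcification temperature but does not state which calibration was used. As a hedged illustration, one classic Epstein-type carbonate-water palaeotemperature equation has the quadratic form coded below; the coefficients differ slightly between published calibrations, and the cited study may have used another.

        def carbonate_temperature_c(delta_carbonate, delta_water):
            """Epstein-type palaeotemperature relation (one common
            calibration, assumed here for illustration):
            T(degC) = 16.5 - 4.3*(dc - dw) + 0.14*(dc - dw)**2,
            with dc the carbonate d18O (PDB) and dw the water d18O (SMOW)."""
            d = delta_carbonate - delta_water
            return 16.5 - 4.3 * d + 0.14 * d ** 2

        print(carbonate_temperature_c(-1.0, 0.0))  # ~21 degC for these toy values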

  18. Development of isotope dilution gamma-ray spectrometry for plutonium analysis

    Energy Technology Data Exchange (ETDEWEB)

    Li, T.K.; Parker, J.L. (Los Alamos National Lab., NM (United States)); Kuno, Y.; Sato, S.; Kurosawa, A.; Akiyama, T. (Power Reactor and Nuclear Fuel Development Corp., Tokai, Ibaraki (Japan))

    1991-01-01

    We are studying the feasibility of determining the plutonium concentration and isotopic distribution of highly radioactive, spent-fuel dissolver solutions by employing high-resolution gamma-ray spectrometry. The study involves gamma-ray plutonium isotopic analysis for both dissolver and spiked dissolver solution samples, after the plutonium is eluted through an ion-exchange column and absorbed in a small resin bead bag. The spike is well-characterized, dry plutonium containing approximately 98% 239Pu. Using the measured isotopic information, the concentration of elemental plutonium in the dissolver solution can be determined. Both the plutonium concentration and the isotopic composition of the dissolver solution obtained from this study agree well with values obtained by traditional isotope dilution mass spectrometry (IDMS). Because it is rapid, easy to operate and maintain, and costs less, this new technique could be an alternative to IDMS for input accountability and verification measurements in reprocessing plants. 7 refs., 4 figs., 4 tabs.

  19. High precision analysis of trace lithium isotope by thermal ionization mass spectrometry

    International Nuclear Information System (INIS)

    Tang Lei; Liu Xuemei; Long Kaiming; Liu Zhao; Yang Tianli

    2010-01-01

    A high-precision method for the analysis of nanogram amounts of lithium by thermal ionization mass spectrometry is developed. Through double-filament measurement, a phosphoric acid ion enhancer, and a sample pre-baking technique, the precision of trace lithium analysis is improved. For a 100 ng lithium isotope standard sample, the relative standard deviation is better than 0.086%; for a 10 ng lithium isotope standard sample, the relative standard deviation is better than 0.90%. (authors)

  20. High Resolution Gamma Ray Analysis of Medical Isotopes

    Science.gov (United States)

    Chillery, Thomas

    2015-10-01

    Compton-suppressed high-purity germanium detectors at the University of Massachusetts Lowell have been used to study medical radioisotopes produced at the Brookhaven Linac Isotope Producer (BLIP), in particular isotopes such as Pt-191 that are used for cancer therapy in patients. The ability to precisely analyze the concentrations of such radioisotopes is essential both for production facilities such as Brookhaven and for the hospitals across the U.S. that use them. Without accurate knowledge of the quantities and strengths of these isotopes, doctors could administer incorrect dosages to patients, leading to undesired results. Samples have been produced at Brookhaven and shipped to UML, where the advanced electronics and data acquisition capabilities have been used to extract peak areas in the gamma decay spectra. Levels of Pt isotopes in diluted samples have been quantified, and reaction cross-sections deduced from the irradiation parameters. These provide cross-checks with published work as well as a rigorous quantitative framework, using high-quality, state-of-the-art detection apparatus of the kind in use in the experimental nuclear physics community.

  1. High precision isotopic ratio analysis of volatile metal chelates

    International Nuclear Information System (INIS)

    Hachey, D.L.; Blais, J.C.; Klein, P.D.

    1980-01-01

    High-precision isotope ratio measurements have been made for a series of volatile alkaline earth and transition metal chelates using conventional GC/MS instrumentation. Electron ionization was used for the alkaline earth chelates, whereas isobutane chemical ionization was used for the transition metal studies. Natural isotopic abundances were determined for a series of Mg, Ca, Cr, Fe, Ni, Cu, Cd, and Zn chelates. Absolute accuracy ranged between 0.01 and 1.19 at.%. Absolute precision ranged between ±0.01 and 0.27 at.% (RSD ±0.07-10.26%) for elements that contained as many as eight natural isotopes. Calibration curves were prepared using natural-abundance metals and their enriched 50Cr, 60Ni, and 65Cu isotopes covering the range 0.1-1010.7 at.% excess. A separate multiple-isotope calibration curve was similarly prepared using enriched 60Ni (0.02-2.15 at.% excess) and 62Ni (0.23-18.5 at.% excess). The samples were analyzed by GC/CI/MS. Human plasma, containing enriched 26Mg and 44Ca, was analyzed by EI/MS. 1 figure, 5 tables

  2. Inequalities in Open Source Software Development: Analysis of Contributor's Commits in Apache Software Foundation Projects.

    Science.gov (United States)

    Chełkowski, Tadeusz; Gloor, Peter; Jemielniak, Dariusz

    2016-01-01

    While researchers are becoming increasingly interested in studying the OSS phenomenon, there is still only a small number of studies analyzing larger samples of projects and investigating the structure of activities among OSS developers. The significant amount of information gathered in publicly available open-source software repositories and mailing-list archives offers an opportunity to analyze project structures and participant involvement. In this article, using commit data from 263 Apache project repositories (nearly all of them), we show that although OSS development is often described as collaborative, it in fact predominantly relies on radically solitary input and individual, non-collaborative contributions. We also show, in the first published study of this magnitude, that the engagement of contributors follows a power-law distribution.

  3. Accident Damage Analysis Module (ADAM) – Technical Guidance, Software tool for Consequence Analysis calculations

    OpenAIRE

    FABBRI LUCIANO; BINDA MASSIMO; BRUINEN DE BRUIN YURI

    2017-01-01

    This report provides a technical description of the modelling and assumptions of the Accident Damage Analysis Module (ADAM) software application, which has recently been developed by the Joint Research Centre (JRC) of the European Commission (EC) to assess the physical effects of an industrial accident resulting from an unintended release of a dangerous substance.

  4. Evaluation of Distribution Analysis Software for DER Applications

    Energy Technology Data Exchange (ETDEWEB)

    Staunton, RH

    2003-01-23

    unstoppable. In response, energy providers will be forced to both fully acknowledge the trend and plan for accommodating DER [3]. With bureaucratic barriers [4], lack of time/resources, tariffs, etc. still seen in certain regions of the country, changes still need to be made. Given continued technical advances in DER, the time is fast approaching when the industry, nation-wide, must not only accept DER freely but also provide or review in-depth technical assessments of how DER should be integrated into and managed throughout the distribution system. Characterization studies are needed to fully understand how both the utility system and DER devices themselves will respond to all reasonable events (e.g., grid disturbances, faults, rapid growth, diverse and multiple DER systems, large reactive loads). Some of this work has already begun as it relates to operation and control of DER [5] and microturbine performance characterization [6,7]. One of the most urgently needed tools that can provide these types of analyses is a distribution network analysis program in combination with models for various DER. Together, they can be used for (1) analyzing DER placement in distribution networks and (2) helping to ensure that adequate transmission reliability is maintained. Surveys of the market show products that represent a partial match to these needs; specifically, software that has been developed to plan electrical distribution systems and analyze reliability (in a near total absence of DER). The first part of this study (Sections 2 and 3 of the report) looks at a number of these software programs and provides both summary descriptions and comparisons. The second part of this study (Section 4 of the report) considers the suitability of these analysis tools for DER studies. It considers steady state modeling and assessment work performed by ORNL using one commercially available tool on feeder data provided by a southern utility. Appendix A provides a technical report on the results of

  5. UNiquant, a Program for Quantitative Proteomics Analysis Using Stable Isotope Labeling

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Xin; Tolmachev, Aleksey V.; Shen, Yulei; Liu, Miao; Huang, Lin; Zhang, Zhixin; Anderson, Gordon A.; Smith, Richard D.; Chan, Wing C.; Hinrichs, Steven; Fu, Kai; Ding, Shi-Jian

    2011-03-04

    We present UNiquant, a new software program for analyzing stable isotope labeling (SIL) based quantitative proteomics data. UNiquant surpassed the performance of two other platforms, MaxQuant and Mascot Distiller, using complex proteome mixtures having either known or unknown heavy/light ratios. UNiquant is compatible with a broad spectrum of search engines and SIL methods, providing outstanding peptide pair identification and accurate measurement of the relative peptide/protein abundance.
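
    UNiquant's internal algorithm is not described in this record. As a toy, hedged sketch of the general SIL quantification step that such tools automate, a protein-level heavy/light ratio can be summarized from peptide pair intensities, for example as a median:

        from statistics import median

        def protein_ratios(peptide_pairs):
            """peptide_pairs: list of (protein, heavy_intensity, light_intensity).
            Returns protein -> median heavy/light ratio. A toy sketch of
            SIL-based rollup, not UNiquant's actual algorithm."""
            by_protein = {}
            for protein, heavy, light in peptide_pairs:
                if light > 0:                  # skip pairs with no light signal
                    by_protein.setdefault(protein, []).append(heavy / light)
            return {p: median(r) for p, r in by_protein.items()}

        pairs = [("P1", 2.0e6, 1.0e6), ("P1", 1.8e6, 1.0e6), ("P2", 5.0e5, 1.0e6)]
        print(protein_ratios(pairs))  # {'P1': 1.9, 'P2': 0.5}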

  6. Greek marbles: determination of provenance by isotopic analysis.

    Science.gov (United States)

    Craig, H; Craig, V

    1972-04-28

    A study has been made of carbon-13 and oxygen-18 variations in Greek marbles from the ancient quarry localities of Naxos, Paros, Mount Hymettus, and Mount Pentelikon. Parian, Hymettian, and Pentelic marbles can be clearly distinguished by the isotopic relationships; Naxian marbles fall into two groups characterized by different oxygen-18/oxygen-16 ratios. Ten archeological samples were also analyzed; the isotopic data indicate that the "Theseion" is made of Pentelic marble and a block in the Treasury of Siphnos at Delphi is probably Parian marble.

  7. Software and package applicating for network meta-analysis: A usage-based comparative study.

    Science.gov (United States)

    Xu, Chang; Niu, Yuming; Wu, Junyi; Gu, Huiyun; Zhang, Chao

    2017-12-21

    To compare and analyze the characteristics and functions of software applications for network meta-analysis (NMA), PubMed, EMbase, The Cochrane Library, the official websites of Bayesian inference Using Gibbs Sampling (BUGS), Stata and R, and Google were searched to collect software and packages for performing NMA; software and packages published up to March 2016 were included. After collecting the software, the packages, and their user guides, we used them to calculate a typical example. All characteristics, functions, and computed results were compared and analyzed. Ten types of software were included, comprising programming and non-programming software, developed mainly on the basis of Bayesian or frequentist theory. Most types of software are easy to operate, easy to master, accurate in calculation, or excellent at graphing; however, no single software combined accurate calculation with superior graphing, which could be achieved only by combining two or more types of software. This study suggests that users should choose the appropriate software according to their programming background, operational habits, and financial means, and then consider combining BUGS and R (or Stata) to perform the NMA. © 2017 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.

  8. NuFTA: A CASE Tool for Automatic Software Fault Tree Analysis

    International Nuclear Information System (INIS)

    Yun, Sang Hyun; Lee, Dong Ah; Yoo, Jun Beom

    2010-01-01

    Software fault tree analysis (SFTA) is widely used for analyzing software that requires high reliability. In SFTA, experts predict failures of the system through HAZOP (Hazard and Operability study) or FMEA (Failure Mode and Effects Analysis) and draw software fault trees for those failures. The quality and cost of a software fault tree therefore depend on the knowledge and experience of the experts. This paper proposes a CASE tool, NuFTA, to assist experts in safety analysis. NuFTA automatically generates software fault trees from a NuSCR formal requirements specification. NuSCR is a formal specification language used for specifying the software requirements of the KNICS RPS (Reactor Protection System) in Korea. We used previously proposed SFTA templates to generate the software fault trees automatically. NuFTA also generates logical formulae summarizing each failure's cause, and we plan to make use of these formulae through formal verification techniques.

  9. U and Pb isotope analysis of uraninite and galena by ion microprobe

    Energy Technology Data Exchange (ETDEWEB)

    Evins, L.Z.; Sunde, T.; Schoeberg, H. [Swedish Museum of Natural History, Stockholm (Sweden). Laboratory for Isotope Geology; Fayek, M. [Univ. of Tennessee, Knoxville, TN (United States). Dept. of Geological Sciences

    2001-10-01

    Accurate isotopic analysis of minerals by ion microprobe, or SIMS (Secondary Ion Mass Spectrometry), usually requires a standard to correct for the instrumental mass bias effects that occur during analysis. We have calibrated two uraninite crystals and one galena crystal to be used as ion probe standards. As part of this study we describe the analytical procedures and problems encountered while trying to establish fractionation factors for U and Pb isotopes measured in galena and uraninite. Only the intra-element isotopic mass fractionation is considered, not the inter-element fractionation. Galena and uraninite were analysed by TIMS (Thermal Ionisation Mass Spectrometry) prior to SIMS. One uraninite crystal (P88) comes from Sweden and is ca 900 Ma old; the other comes from Maine, USA (LAMNH-30222), and is ca 350 Ma old. The galena sample comes from the Paleoproterozoic ore district Bergslagen in Sweden. SIMS analyses were performed at two different laboratories: the NORDSIM facility in Stockholm, which has a high-resolution Cameca IMS 1270 ion microprobe, and the Oak Ridge National Laboratory (ORNL) in Tennessee, which has a Cameca IMS 4f ion microprobe. The results show that during the analysis of galena, Pb isotopes fractionate in favour of the lighter isotope by as much as 0.5%/amu. A Pb isotope fractionation factor for uraninite was more difficult to calculate, probably due to the formation of hydride interferences encountered during analysis with the Cameca IMS 1270 ion microprobe. However, drying the sample in vacuum prior to analysis, and using high-energy filtering and a cold trap during analysis, can minimise these hydride interferences. A large fractionation of U isotopes of ca 1.4%/amu in favour of the lighter isotope was calculated for uraninite.

  10. U and Pb isotope analysis of uraninite and galena by ion microprobe

    International Nuclear Information System (INIS)

    Evins, L.Z.; Sunde, T.; Schoeberg, H.; Fayek, M.

    2001-10-01

    Accurate isotopic analysis of minerals by ion microprobe, or SIMS (Secondary Ion Mass Spectrometry), usually requires a standard to correct for the instrumental mass bias effects that occur during analysis. We have calibrated two uraninite crystals and one galena crystal to be used as ion probe standards. As part of this study we describe the analytical procedures and problems encountered while trying to establish fractionation factors for U and Pb isotopes measured in galena and uraninite. Only the intra-element isotopic mass fractionation is considered, not the inter-element fractionation. Galena and uraninite were analysed by TIMS (Thermal Ionisation Mass Spectrometry) prior to SIMS. One uraninite crystal (P88) comes from Sweden and is ca 900 Ma old; the other comes from Maine, USA (LAMNH-30222), and is ca 350 Ma old. The galena sample comes from the Paleoproterozoic ore district Bergslagen in Sweden. SIMS analyses were performed at two different laboratories: the NORDSIM facility in Stockholm, which has a high-resolution Cameca IMS 1270 ion microprobe, and the Oak Ridge National Laboratory (ORNL) in Tennessee, which has a Cameca IMS 4f ion microprobe. The results show that during the analysis of galena, Pb isotopes fractionate in favour of the lighter isotope by as much as 0.5%/amu. A Pb isotope fractionation factor for uraninite was more difficult to calculate, probably due to the formation of hydride interferences encountered during analysis with the Cameca IMS 1270 ion microprobe. However, drying the sample in vacuum prior to analysis, and using high-energy filtering and a cold trap during analysis, can minimise these hydride interferences. A large fractionation of U isotopes of ca 1.4%/amu in favour of the lighter isotope was calculated for uraninite.

  11. Analyzing the State of Static Analysis : A Large-Scale Evaluation in Open Source Software

    NARCIS (Netherlands)

    Beller, M.; Bholanath, R.; McIntosh, S.; Zaidman, A.E.

    2016-01-01

    The use of automatic static analysis has been a software engineering best practice for decades. However, we still do not know a lot about its use in real-world software projects: How prevalent is the use of Automated Static Analysis Tools (ASATs) such as FindBugs and JSHint? How do developers use

  12. A Method for Software Requirement Volatility Analysis Using QFD

    Directory of Open Access Journals (Sweden)

    Yunarso Anang

    2016-10-01

    Full Text Available Changes to software requirements are inevitable during the development life cycle. Rather than avoiding this circumstance, it is easier to accept it and find a way to anticipate those changes. This paper proposes a method for analyzing the volatility of requirements by using the Quality Function Deployment (QFD) method and an introduced degree of volatility. Customer requirements are deployed to software functions and subsequently to architectural design elements. Then, after determining the potential for change of the design elements, the degree of volatility of the software requirements is calculated. In the paper, the method is described using a flow diagram, illustrated with a simple example, and evaluated using a case study.
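
    The record gives no formulas. As a loose, hedged sketch of the general scheme it describes, a change-potential score for design elements can be propagated back to requirements through the two QFD relationship matrices; all matrices and weights below are invented for illustration, and this is a stand-in for, not a reproduction of, the paper's degree of volatility.

        import numpy as np

        # QFD matrix 1: requirements (rows) -> software functions (columns)
        req_to_func = np.array([[9, 3, 0],
                                [1, 9, 3]], dtype=float)
        # QFD matrix 2: functions (rows) -> design elements (columns)
        func_to_elem = np.array([[9, 1],
                                 [3, 9],
                                 [0, 3]], dtype=float)
        # Estimated change potential of each design element (0..1)
        elem_change_potential = np.array([0.8, 0.2])

        # Propagate element change potential back to requirements, then
        # normalize so the scores sum to one.
        raw = req_to_func @ func_to_elem @ elem_change_potential
        degree_of_volatility = raw / raw.sum()
        print(degree_of_volatility)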

  13. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Directory of Open Access Journals (Sweden)

    Jyri Pakarinen

    2010-01-01

    Full Text Available Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.

  14. Progress in Addressing DNFSB Recommendation 2002-1 Issues: Improving Accident Analysis Software Applications

    International Nuclear Information System (INIS)

    VINCENT, ANDREW

    2005-01-01

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 ("Quality Assurance for Safety-Related Software") identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls to prevent or mitigate potential accidents. Over the last year, DOE has begun several processes and programs as part of the Implementation Plan commitments, and in particular, has made significant progress in addressing several sets of issues particularly important in the application of software for performing hazard and accident analysis. The work discussed here demonstrates that through these actions, Software Quality Assurance (SQA) guidance and software tools are available that can be used to improve resulting safety analysis. Specifically, five of the primary actions corresponding to the commitments made in the Implementation Plan to Recommendation 2002-1 are identified and discussed in this paper. Included are the web-based DOE SQA Knowledge Portal and the Central Registry, guidance and gap analysis reports, electronic bulletin board and discussion forum, and a DOE safety software guide. These SQA products can benefit DOE safety contractors in the development of hazard and accident analysis by precluding inappropriate software applications and utilizing best practices when incorporating software results to safety basis documentation. The improvement actions discussed here mark a beginning to establishing stronger, standard-compliant programs, practices, and processes in SQA among safety software users, managers, and reviewers throughout the DOE Complex. Additional effort is needed, however, particularly in: (1) processes to add new software applications to the DOE Safety Software Toolbox; (2) improving the effectiveness of software issue communication; and (3) promoting a safety software quality assurance culture.

  15. An effective technique for the software requirements analysis of NPP safety-critical systems, based on software inspection, requirements traceability, and formal specification

    International Nuclear Information System (INIS)

    Koo, Seo Ryong; Seong, Poong Hyun; Yoo, Junbeom; Cha, Sung Deok; Yoo, Yeong Jae

    2005-01-01

    A thorough requirements analysis is indispensable for developing and implementing safety-critical software systems such as nuclear power plant (NPP) software systems because a single error in the requirements can generate serious software faults. However, it is very difficult to completely analyze system requirements. In this paper, an effective technique for the software requirements analysis is suggested. For requirements verification and validation (V and V) tasks, our technique uses software inspection, requirement traceability, and formal specification with structural decomposition. Software inspection and requirements traceability analysis are widely considered the most effective software V and V methods. Although formal methods are also considered an effective V and V activity, they are difficult to use properly in the nuclear fields as well as in other fields because of their mathematical nature. In this work, we propose an integrated environment (IE) approach for requirements, which is an integrated approach that enables easy inspection by combining requirement traceability and effective use of a formal method. The paper also introduces computer-aided tools for supporting IE approach for requirements. Called the nuclear software inspection support and requirements traceability (NuSISRT), the tool incorporates software inspection, requirement traceability, and formal specification capabilities. We designed the NuSISRT to partially automate software inspection and analysis of requirement traceability. In addition, for the formal specification and analysis, we used the formal requirements specification and analysis tool for nuclear engineering (NuSRS)

  16. iMS2Flux – a high–throughput processing tool for stable isotope labeled mass spectrometric data used for metabolic flux analysis

    Directory of Open Access Journals (Sweden)

    Poskar C Hart

    2012-11-01

    Full Text Available Background: Metabolic flux analysis has become an established method in systems biology and functional genomics. The most common approach for determining intracellular metabolic fluxes is to utilize mass spectrometry in combination with stable isotope labeling experiments. However, before the mass spectrometric data can be used, they have to be corrected for biases caused by naturally occurring stable isotopes, by the analytical technique(s) employed, or by the biological sample itself. Finally, the MS data and the labeling information they contain have to be assembled into a data format usable by flux analysis software (of which several dedicated packages exist). Currently the processing of mass spectrometric data is time-consuming and error-prone, requiring peak-by-peak cut-and-paste analysis and manual curation. In order to facilitate high-throughput metabolic flux analysis, the automation of multiple steps in the analytical workflow is necessary. Results: Here we describe iMS2Flux, software developed to automate, standardize and connect the data flow between mass spectrometric measurements and flux analysis programs. This tool streamlines the transfer of data from extraction via correction tools to 13C-Flux software by processing MS data from stable isotope labeling experiments. It allows the correction of large and heterogeneous MS datasets for the presence of naturally occurring stable isotopes, initial biomass and several mass spectrometry effects. Before and after data correction, several checks can be performed to ensure accurate data. The corrected data may be returned in a variety of formats, including those used by metabolic flux analysis software such as 13CFLUX, OpenFLUX and 13CFLUX2. Conclusion: iMS2Flux is a versatile, easy-to-use tool for the automated processing of mass spectrometric data containing isotope labeling information. It represents the core framework for a standardized workflow and data processing. Due to its flexibility
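
    The natural-abundance correction that the record mentions is, in its simplest form, a linear inversion: the measured mass-isotopomer distribution is the convolution of the tracer labeling distribution with natural-abundance contributions from the unlabeled atoms. The sketch below handles only natural 13C in a fragment with n carbons; this is an assumed simplification, since tools such as iMS2Flux also correct for other elements and instrument effects.

        import numpy as np
        from math import comb

        P13C = 0.0107  # approximate natural 13C abundance

        def correction_matrix(n_carbons):
            """M[i, j] = probability that a molecule with j tracer-labeled
            carbons is measured at mass shift i, given natural 13C in the
            remaining unlabeled carbons. Carbon-only toy model."""
            n = n_carbons
            m = np.zeros((n + 1, n + 1))
            for j in range(n + 1):          # tracer-derived 13C labels
                for k in range(n - j + 1):  # extra natural 13C among n-j atoms
                    m[j + k, j] = (comb(n - j, k) * P13C**k
                                   * (1 - P13C)**(n - j - k))
            return m

        def correct_mid(measured):
            """Remove the natural-abundance contribution from a measured
            mass-isotopomer distribution (least squares, clipped to be
            non-negative, then renormalized)."""
            m = correction_matrix(len(measured) - 1)
            x, *_ = np.linalg.lstsq(m, np.asarray(measured, float), rcond=None)
            x = np.clip(x, 0.0, None)
            return x / x.sum()

        print(correct_mid([0.90, 0.08, 0.02]))  # toy 2-carbon fragment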

  17. Direct U isotope analysis in μm-sized particles by LA-MC-ICPMS

    International Nuclear Information System (INIS)

    Kappel, S.; Boulyga, S.F.; Prohaska, T.

    2009-01-01

    Full text: The knowledge of the isotopic composition of individual μm-sized hot particles is of great interest especially for strengthened nuclear safeguards in order to identify undeclared nuclear activities. We present the potential of a 'Nu Plasma HR' MC-ICPMS coupled to a New Wave 'UP 193' laser ablation (LA) system for the direct analysis of U isotope abundance ratios in individual μm-sized particles. The ability to determine 234U/238U and 235U/238U isotope ratios was successfully demonstrated in the NUSIMEP-6 interlaboratory comparison, which was organized by the IRMM (Geel, Belgium). (author)

  18. Isotope ratio analysis by a combination of element analyzer and mass spectrometer

    International Nuclear Information System (INIS)

    Pichlmayer, F.

    1987-06-01

    The use of stable isotope ratios of carbon, nitrogen and sulfur as an analytical tool in many fields of research is of growing interest. A method has therefore been developed that consists essentially of coupling an elemental analyzer with an isotope mass spectrometer, which enables the preparation of carbon dioxide, nitrogen and sulfur dioxide gas from any solid or liquid sample in a fast and easy way. Results of carbon isotope measurements in food analysis are presented, whereby it is possible to check the origin and treatment of sugar, oils, fats, mineral waters, spirituous liquors, etc., and to detect adulterations as well. Applications in the field of environmental research are also given. (Author)

  19. Use of azeotropic distillation for isotopic analysis of deuterium in soil water and saturate saline solution

    International Nuclear Information System (INIS)

    Santos, Antonio Vieira dos.

    1995-05-01

    The azeotropic distillation technique was adapted to extract soil water and saturated saline solutions, which are similar to sea water, for the isotopic determination of deuterium (D). A soil test was used to determine the precision and the nature of the methodology for extracting soil water for stable isotope analysis, comparing azeotropic distillation with the traditional methodology of heating under vacuum. The methodology has proved very useful for several kinds of soils and saturated saline solutions. The apparatus does not have a memory effect, and the chemical reagents do not affect the isotopic composition of the soil water. (author). 43 refs., 10 figs., 12 tabs

  20. Analysis of Stable Isotope Contents of Surface and Underground ...

    African Journals Online (AJOL)

    Sam Eshun

    (2H/1H) ratios relative to a standard called Standard Mean Ocean Water (SMOW) ..... Hence, higher forest cover has greater influence on heavy isotope ... Accra Plains is influenced by the Atlantic Ocean where the relative humidity is higher ...

  1. Generator Coordinate Method Analysis of Xe and Ba Isotopes

    Science.gov (United States)

    Higashiyama, Koji; Yoshinaga, Naotaka; Teruya, Eri

    Nuclear structure of Xe and Ba isotopes is studied in terms of the quantum-number projected generator coordinate method (GCM). The GCM reproduces well the energy levels of high-spin states as well as low-lying states. The structure of the low-lying states is analyzed through the GCM wave functions.

  2. Using recurrence plot analysis for software execution interpretation and fault detection

    Science.gov (United States)

    Mosdorf, M.

    2015-09-01

    This paper shows a method targeted at software execution interpretation and fault detection using recurrence plot analysis. In the proposed approach, recurrence plot analysis is applied to a software execution trace that contains the executed assembly instructions. The results of this analysis are subjected to further processing with the Principal Component Analysis (PCA) method, which reduces the number of coefficients used for software execution classification. The method was used for the analysis of five algorithms: Bubble Sort, Quick Sort, Median Filter, FIR, SHA-1. Results show that some of the collected traces could be easily assigned to particular algorithms (logs from the Bubble Sort and FIR algorithms) while others are more difficult to distinguish.
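
    The record does not detail the embedding or thresholding used. The following is a minimal sketch, assuming a scalar encoding of the executed instructions and a fixed distance threshold, of how a recurrence plot and one simple recurrence-quantification feature can be computed:

        import numpy as np

        def recurrence_matrix(trace, epsilon):
            """Binary recurrence plot of a 1-D trace: R[i, j] = 1 when the
            states at steps i and j are within epsilon of each other."""
            x = np.asarray(trace, dtype=float)
            return (np.abs(x[:, None] - x[None, :]) <= epsilon).astype(int)

        def recurrence_rate(r):
            """Fraction of recurrent points, one simple RQA feature that
            could feed a PCA-based classifier."""
            return r.mean()

        # Toy "trace": opcode IDs of executed instructions (invented numbers)
        trace = [3, 7, 3, 7, 3, 9, 1, 9, 1]
        r = recurrence_matrix(trace, epsilon=0.5)
        print(recurrence_rate(r))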

  3. Compound-Specific Isotope Analysis of Diesel Fuels in a Forensic Investigation

    Science.gov (United States)

    Muhammad, Syahidah; Frew, Russell; Hayman, Alan

    2015-02-01

    Compound-specific isotope analysis (CSIA) offers great potential as a tool to provide chemical evidence in a forensic investigation. Many attempts to trace environmental oil spills have been successful where isotopic values were particularly distinct. However, difficulties arise when a large data set is analyzed and the isotopic differences between samples are subtle. In the present study, discrimination of diesel oils involved in a diesel theft case was carried out to infer the relatedness of the samples to potential source samples. This discriminatory analysis used a suite of hydrocarbon diagnostic indices, the alkanes, to generate carbon and hydrogen isotopic data for the compounds, which were then processed using multivariate statistical analyses to infer the relatedness of the data set. The results from this analysis were put into context by comparing the data with the δ13C and δ2H values of alkanes in commercial diesel samples obtained from various locations in the South Island of New Zealand. Based on the isotopic character of the alkanes, the diesel fuels involved in the diesel theft case were distinguishable. This manuscript shows that CSIA, when used in tandem with multivariate statistical analysis, provides a defensible means to differentiate and source-apportion qualitatively similar oils at the molecular level. This approach was able to overcome the confounding challenges posed by the near single-point source of origin, i.e. the very subtle differences in isotopic values between the samples.
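
    As a hedged sketch of the kind of multivariate treatment described (the δ13C/δ2H matrix below is fabricated for illustration, and the paper's exact statistical pipeline is not reproduced), per-alkane isotope values can be standardized and projected with PCA before grouping samples:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        # Rows: diesel samples; columns: d13C then d2H for three alkanes
        # (all values invented for illustration).
        data = np.array([[-29.1, -28.7, -29.0, -120.0, -118.0, -121.0],
                         [-29.2, -28.8, -29.1, -119.0, -117.5, -120.5],
                         [-27.5, -27.0, -27.3, -105.0, -103.0, -106.0]])

        scores = PCA(n_components=2).fit_transform(
            StandardScaler().fit_transform(data))
        print(scores)  # samples 0 and 1 plot together, sample 2 apart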

  4. Compound-Specific Isotope Analysis of Diesel Fuels in a Forensic Investigation

    Directory of Open Access Journals (Sweden)

    Syahidah Akmal Muhammad

    2015-02-01

    Full Text Available Compound-specific isotope analysis (CSIA) offers great potential as a tool to provide chemical evidence in a forensic investigation. Many attempts to trace environmental oil spills have been successful where isotopic values were particularly distinct. However, difficulties arise when a large data set is analyzed and the isotopic differences between samples are subtle. In the present study, discrimination of diesel oils involved in a diesel theft case was carried out to infer the relatedness of the samples to potential source samples. This discriminatory analysis used a suite of hydrocarbon diagnostic indices, the alkanes, to generate carbon and hydrogen isotopic data for the compounds, which were then processed using multivariate statistical analyses to infer the relatedness of the data set. The results from this analysis were put into context by comparing the data with the δ13C and δ2H values of alkanes in commercial diesel samples obtained from various locations in the South Island of New Zealand. Based on the isotopic character of the alkanes, the diesel fuels involved in the diesel theft case were distinguishable. This manuscript shows that CSIA, when used in tandem with multivariate statistical analysis, provides a defensible means to differentiate and source-apportion qualitatively similar oils at the molecular level. This approach was able to overcome the confounding challenges posed by the near single-point source of origin, i.e. the very subtle differences in isotopic values between the samples.

  5. Compound-specific isotope analysis of diesel fuels in a forensic investigation.

    Science.gov (United States)

    Muhammad, Syahidah A; Frew, Russell D; Hayman, Alan R

    2015-01-01

    Compound-specific isotope analysis (CSIA) offers great potential as a tool to provide chemical evidence in a forensic investigation. Many attempts to trace environmental oil spills have been successful where isotopic values were particularly distinct. However, difficulties arise when a large data set is analyzed and the isotopic differences between samples are subtle. In the present study, discrimination of diesel oils involved in a diesel theft case was carried out to infer the relatedness of the samples to potential source samples. This discriminatory analysis used a suite of hydrocarbon diagnostic indices, the alkanes, to generate carbon and hydrogen isotopic data for the compounds, which were then processed using multivariate statistical analyses to infer the relatedness of the data set. The results from this analysis were put into context by comparing the data with the δ(13)C and δ(2)H of alkanes in commercial diesel samples obtained from various locations in the South Island of New Zealand. Based on the isotopic character of the alkanes, the diesel fuels involved in the diesel theft case were distinguishable. This manuscript shows that CSIA, when used in tandem with multivariate statistical analysis, provides a defensible means to differentiate and source-apportion qualitatively similar oils at the molecular level. This approach was able to overcome the confounding challenges posed by the near single-point source of origin, i.e., the very subtle differences in isotopic values between the samples.

  6. GROMOS++ Software for the Analysis of Biomolecular Simulation Trajectories

    NARCIS (Netherlands)

    Eichenberger, A.P.; Allison, J.R.; Dolenc, J.; Geerke, D.P.; Horta, B.A.C.; Meier, K; Oostenbrink, B.C.; Schmid, N.; Steiner, D; Wang, D.; van Gunsteren, W.F.

    2011-01-01

    GROMOS++ is a set of C++ programs for pre- and postprocessing of molecular dynamics simulation trajectories and as such is part of the GROningen MOlecular Simulation software for (bio)molecular simulation. It contains more than 70 programs that can be used to prepare data for the production of

  7. Algebraic software analysis and embedded simulation of a driving robot

    NARCIS (Netherlands)

    Merkx, L.L.F.; Duringhof, H.M.; Cuijpers, P.J.L.

    2007-01-01

    At TNO Automotive the Generic Driving Actuator (GDA) is developed. The GDA is a device capable of driving a vehicle fully automatically using the same interface as a human driver does. In this paper, the design of the GDA is discussed. The software and hardware of the GDA and its effect on vehicle

  8. Program spectra analysis in embedded software : A case study

    NARCIS (Netherlands)

    Abreu, R.; Zoeteweij, P.; Van Gemund, A.J.C.

    2006-01-01

    Because of constraints imposed by the market, embedded software in consumer electronics is almost inevitably shipped with faults and the goal is just to reduce the inherent unreliability to an acceptable level before a product has to be released. Automatic fault diagnosis is a valuable tool to

  9. Image analysis software versus direct anthropometry for breast measurements.

    Science.gov (United States)

    Quieregatto, Paulo Rogério; Hochman, Bernardo; Furtado, Fabianne; Machado, Aline Fernanda Perez; Sabino Neto, Miguel; Ferreira, Lydia Masako

    2014-10-01

    To compare breast measurements performed using the software packages ImageTool®, AutoCAD® and Adobe Photoshop® with direct anthropometric measurements. Points were marked on the breasts and arms of 40 volunteer women aged between 18 and 60 years. When connecting the points, seven linear segments and one angular measurement on each half of the body, and one medial segment common to both body halves were defined. The volunteers were photographed in a standardized manner. Photogrammetric measurements were performed by three independent observers using the three software packages and compared to direct anthropometric measurements made with calipers and a protractor. Measurements obtained with AutoCAD® were the most reproducible and those made with ImageTool® were the most similar to direct anthropometry, while measurements with Adobe Photoshop® showed the largest differences. Except for angular measurements, significant differences were found between measurements of line segments made using the three software packages and those obtained by direct anthropometry. AutoCAD® provided the highest precision and intermediate accuracy; ImageTool® had the highest accuracy and lowest precision; and Adobe Photoshop® showed intermediate precision and the worst accuracy among the three software packages.

  10. Software Graphical User Interface For Analysis Of Images

    Science.gov (United States)

    Leonard, Desiree M.; Nolf, Scott R.; Avis, Elizabeth L.; Stacy, Kathryn

    1992-01-01

    CAMTOOL software provides graphical interface between Sun Microsystems workstation and Eikonix Model 1412 digitizing camera system. Camera scans and digitizes images, halftones, reflectives, transmissives, rigid or flexible flat material, or three-dimensional objects. Users digitize images and select from three destinations: work-station display screen, magnetic-tape drive, or hard disk. Written in C.

  11. Control and analysis software for a laser scanning microdensitometer

    Indian Academy of Sciences (India)

    A PC-based control software and data acquisition system is developed for an existing laser scanning microdensitometer. Figure 1 shows a schematic diagram of the microdensitometer and the data acquisition system. The software is written in a programming language with very strong library functions that also supports direct input/output programming.

  12. Prospects for Evidence -Based Software Assurance: Models and Analysis

    Science.gov (United States)

    2015-09-01


  13. Software engineering article types: An analysis of the literature

    NARCIS (Netherlands)

    Montesi, M.; Lago, P.

    2008-01-01

    The software engineering (SE) community has recently recognized that the field lacks well-established research paradigms and clear guidance on how to write good research reports. With no comprehensive guide to the different article types in the field, article writing and reviewing heavily depends on

  14. Software package for analysis of completely randomized block design

    African Journals Online (AJOL)

    This study designs and develops a statistical software package, OYSP1.0, which conveniently accommodates and analyzes the large mass of data emanating from experimental designs, in particular the completely randomized block design. Visual Basic programming is used in the design. The statistical package OYSP 1.0 ...

  15. Applications of stable isotope analysis in foodstuffs surveillance and environmental research

    International Nuclear Information System (INIS)

    Pichlmayer, F.; Blochberger, F.

    1991-12-01

    The instrumental coupling of Elemental Analysis and Mass Spectrometry, constituting a convenient tool for isotope ratio measurements of the bioelements in solid or liquid samples, is now well established. Advantages of this technique compared with the hitherto usual wet-chemistry sample preparation are speed of analysis, easy operation and low sample consumption. The performance of the system is described and some applications are given. Detection of foodstuff adulteration is mainly based on the natural carbon isotope differences between C3 and C4 plants. In the field of environmental research, the existing small isotopic variations of carbon, nitrogen and sulfur in nature, which depend on substance origin and history, are used as an intrinsic signature of the sample considered. Examples of source apportionment or exclusion with the help of this natural isotopic tracer method are dealt with. (authors)
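
    The delta notation used throughout these records has a one-line definition; as a worked example (with a made-up sample ratio and the commonly cited 13C/12C value of the VPDB reference), a C3-plant sugar comes out near -26‰:

        # delta value (per mil) from measured and reference isotope ratios:
        # d13C = (R_sample / R_standard - 1) * 1000
        R_VPDB = 0.0111802      # 13C/12C of the VPDB reference (commonly cited value)
        R_sample = 0.0108880    # hypothetical measured ratio of a C3-plant sugar

        delta_13C = (R_sample / R_VPDB - 1.0) * 1000.0
        print(f"d13C = {delta_13C:.1f} per mil")   # about -26, typical of C3 plants; C4 plants sit near -12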

  16. Risk Analysis and Decision-Making Software Package (1997 Version) User Manual

    Energy Technology Data Exchange (ETDEWEB)

    Chung, F.T.H.

    1999-02-11

    This manual provides instructions for using the U.S. Department of Energy's (DOE) risk analysis and decision making software (1997 version) developed at BDM Petroleum Technologies by BDM-Oklahoma, Inc. for DOE, under contract No. DE-AC22-94PC91008. This software provides petroleum producers with a simple, handy tool for exploration and production risk analysis and decision-making. It collects useful risk analysis tools in one package so that users do not have to use several programs separately. The software is simple to use, but still provides many functions. The 1997 version of the software package includes the following tools: (1) Investment risk (Gambler's ruin) analysis; (2) Monte Carlo simulation; (3) Best fit for distribution functions; (4) Sample and rank correlation; (5) Enhanced oil recovery method screening; and (6) Artificial neural network. This software package is subject to change. Suggestions and comments from users are welcome and will be considered for future modifications and enhancements of the software. Please check the opening screen of the software for the current contact information. In the future, more tools will be added to this software package. This manual includes instructions on how to use the software but does not attempt to fully explain the theory and algorithms used to create it.
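
    The gambler's-ruin style of investment risk analysis listed above lends itself to a compact Monte Carlo sketch; the capital, well cost, success probability and payoff below are hypothetical and do not come from the manual:

        import random

        def ruin_probability(capital, well_cost, p_success, payoff, wells, trials=100_000):
            """Estimate the chance of exhausting capital during a drilling program."""
            ruined = 0
            for _ in range(trials):
                c = capital
                for _ in range(wells):
                    c -= well_cost                 # pay to drill
                    if random.random() < p_success:
                        c += payoff                # a discovery pays off
                    if c < 0:
                        ruined += 1
                        break
            return ruined / trials

        # Hypothetical program: $10M capital, $2M per well, 25% success, $9M payoff, 15 wells.
        print(ruin_probability(10.0, 2.0, 0.25, 9.0, 15))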

  17. Practicality for Software Hazard Analysis for Nuclear Safety I and C System

    International Nuclear Information System (INIS)

    Kim, Yong-Ho; Moon, Kwon-Ki; Chang, Young-Woo; Jeong, Soo-Hyun

    2016-01-01

    We are using the concept of system safety in engineering. It is difficult to make any system perfectly safe, and a complete system may not easily be achieved. The standard definition of a system from MIL-STD-882E is: “The organization of hardware, software, material, facilities, personnel, data, and services needed to perform a designated function within a stated environment with specified results.” From the perspective of the system safety engineer and the hazard analysis process, software is considered a subsystem. Regarding hazard analysis, methods for identifying software failures and determining their effects remain, to date, a research problem. Since the success of software development is based on rigorous testing of hardware and software, it is necessary to check the balance between software testing and hardware testing in terms of efficiency. Lessons learned and experience from similar systems are important for the work of hazard analysis. No major hazard has been reported for the software developed and verified in Korean NPPs. In addition to hazard analysis, software development and verification and validation were thoroughly performed. It is reasonable that test implementation, including the development of test cases, stress and abnormal conditions, error recovery situations, and high-risk hazardous situations, plays a key role in detecting and preventing software faults.

  18. Practicality for Software Hazard Analysis for Nuclear Safety I and C System

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yong-Ho; Moon, Kwon-Ki; Chang, Young-Woo; Jeong, Soo-Hyun [KEPCO Engineering and Construction Co., Deajeon (Korea, Republic of)

    2016-10-15

    We are using the concept of system safety in engineering. It is difficult to make any system perfectly safe, and a complete system may not easily be achieved. The standard definition of a system from MIL-STD-882E is: “The organization of hardware, software, material, facilities, personnel, data, and services needed to perform a designated function within a stated environment with specified results.” From the perspective of the system safety engineer and the hazard analysis process, software is considered a subsystem. Regarding hazard analysis, methods for identifying software failures and determining their effects remain, to date, a research problem. Since the success of software development is based on rigorous testing of hardware and software, it is necessary to check the balance between software testing and hardware testing in terms of efficiency. Lessons learned and experience from similar systems are important for the work of hazard analysis. No major hazard has been reported for the software developed and verified in Korean NPPs. In addition to hazard analysis, software development and verification and validation were thoroughly performed. It is reasonable that test implementation, including the development of test cases, stress and abnormal conditions, error recovery situations, and high-risk hazardous situations, plays a key role in detecting and preventing software faults.

  19. A tool to include gamma analysis software into a quality assurance program.

    Science.gov (United States)

    Agnew, Christina E; McGarry, Conor K

    2016-03-01

    To provide a tool to enable gamma analysis software algorithms to be included in a quality assurance (QA) program. Four image sets were created, comprising two geometric images to independently test the distance to agreement (DTA) and dose difference (DD) elements of the gamma algorithm, a clinical step-and-shoot IMRT field, and a clinical VMAT arc. The images were analysed using global and local gamma analysis with two in-house and eight commercially available software packages, encompassing 15 software versions. The effect of image resolution on gamma pass rates was also investigated. All but one software package accurately calculated the gamma passing rate for the geometric images. Variation in global gamma passing rates of 1% at 3%/3 mm and over 2% at 1%/1 mm was measured between software packages and software versions with analysis of appropriately sampled images. This study provides a suite of test images and the gamma pass rates achieved for a selection of commercially available software. This image suite will enable validation of gamma analysis software within a QA program and provide a frame of reference by which to compare results reported in the literature from various manufacturers and software versions.
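
    For reference, the quantity these packages compute combines the DD and DTA criteria into a single index: a point passes when the minimum of sqrt((ΔD/DD)^2 + (Δd/DTA)^2) over the comparison distribution is at most 1. The one-dimensional global-gamma sketch below illustrates the idea only; it is not any vendor's implementation, and the profiles are synthetic.

        import numpy as np

        def gamma_pass_rate(ref, eval_, spacing_mm, dd_pct=3.0, dta_mm=3.0):
            """1-D global gamma: fraction of evaluated points with gamma <= 1."""
            dd_crit = dd_pct / 100.0 * ref.max()    # global dose-difference criterion
            x = np.arange(len(ref)) * spacing_mm    # positions in mm
            gammas = []
            for xe, de in zip(x, eval_):
                g2 = ((ref - de) / dd_crit) ** 2 + ((x - xe) / dta_mm) ** 2
                gammas.append(np.sqrt(g2.min()))    # search the whole reference profile
            return np.mean(np.array(gammas) <= 1.0)

        ref = np.exp(-np.linspace(-3, 3, 121) ** 2)   # synthetic reference profile
        ev = 1.02 * ref                               # evaluated profile with a +2% dose error
        print(f"pass rate: {gamma_pass_rate(ref, ev, spacing_mm=0.5):.1%}")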

  20. Comparison of gas chromatography/isotope ratio mass spectrometry and liquid chromatography/isotope ratio mass spectrometry for carbon stable-isotope analysis of carbohydrates.

    Science.gov (United States)

    Moerdijk-Poortvliet, Tanja C W; Schierbeek, Henk; Houtekamer, Marco; van Engeland, Tom; Derrien, Delphine; Stal, Lucas J; Boschker, Henricus T S

    2015-07-15

    We compared gas chromatography/isotope ratio mass spectrometry (GC/IRMS) and liquid chromatography/isotope ratio mass spectrometry (LC/IRMS) for the measurement of δ(13)C values in carbohydrates. Contrary to GC/IRMS, no derivatisation is needed for LC/IRMS analysis of carbohydrates. Hence, although LC/IRMS is expected to be more accurate and precise, no direct comparison has been reported. GC/IRMS with the aldonitrile penta-acetate (ANPA) derivatisation method was compared with LC/IRMS without derivatisation. A large number of glucose standards and a variety of natural samples were analysed for five neutral carbohydrates at natural abundance as well as at (13)C-enriched levels. Gas chromatography/chemical ionisation mass spectrometry (GC/CIMS) was applied to check for incomplete derivatisation of the carbohydrate, which would impair the accuracy of the GC/IRMS method. The LC/IRMS technique provided excellent precision (±0.08‰ and ±3.1‰ at natural abundance and enrichment levels, respectively) for the glucose standards and this technique proved to be superior to GC/IRMS (±0.62‰ and ±19.8‰ at natural abundance and enrichment levels, respectively). For GC/IRMS measurements the derivatisation correction and the conversion of carbohydrates into CO2 had a considerable effect on the measured δ(13)C values. However, we did not find any significant differences in the accuracy of the two techniques over the full range of natural δ(13)C abundances and (13)C-labelled glucose. The difference in the performance of GC/IRMS and LC/IRMS diminished when the δ(13)C values were measured in natural samples, because the chromatographic performance and background correction became critical factors, particularly for LC/IRMS. The derivatisation of carbohydrates for the GC/IRMS method was complete. Although both LC/IRMS and GC/IRMS are reliable techniques for compound-specific stable carbon isotope analysis of carbohydrates (provided that derivatisation is complete and the
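
    The derivatisation correction mentioned above is a carbon mass balance: the derivative's measured δ(13)C is an abundance-weighted mean of analyte and reagent carbon, so the analyte value is recovered by subtracting the reagent contribution. A sketch for glucose ANPA, assuming the usual stoichiometry of 6 sugar carbons plus 10 acetyl carbons (the δ values below are invented):

        def correct_d13c(d13c_derivative, d13c_reagent, n_analyte=6, n_reagent=10):
            """Mass-balance correction of a derivative's d13C back to the analyte:

            (n_analyte + n_reagent) * d_derivative = n_analyte * d_analyte + n_reagent * d_reagent
            """
            n_total = n_analyte + n_reagent
            return (n_total * d13c_derivative - n_reagent * d13c_reagent) / n_analyte

        # Hypothetical: derivative measured at -32.0 per mil, acetyl carbon at -35.0 per mil.
        print(correct_d13c(-32.0, -35.0))   # d13C of the underlying glucose, -27.0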

  1. Design and validation of Segment - freely available software for cardiovascular image analysis

    International Nuclear Information System (INIS)

    Heiberg, Einar; Sjögren, Jane; Ugander, Martin; Carlsson, Marcus; Engblom, Henrik; Arheden, Håkan

    2010-01-01

    Commercially available software for cardiovascular image analysis often has limited functionality and frequently lacks the careful validation that is required for clinical studies. We have already implemented a cardiovascular image analysis software package and released it as freeware for the research community. However, it was distributed as a stand-alone application and other researchers could not extend it by writing their own custom image analysis algorithms. We believe that the work required to make a clinically applicable prototype can be reduced by making the software extensible, so that researchers can develop their own modules or improvements. Such an initiative might then serve as a bridge between image analysis research and cardiovascular research. The aim of this article is therefore to present the design and validation of a cardiovascular image analysis software package (Segment) and to announce its release in a source code format. Segment can be used for image analysis in magnetic resonance imaging (MRI), computed tomography (CT), single photon emission computed tomography (SPECT) and positron emission tomography (PET). Some of its main features include loading of DICOM images from all major scanner vendors, simultaneous display of multiple image stacks and plane intersections, automated segmentation of the left ventricle, quantification of MRI flow, tools for manual and general object segmentation, quantitative regional wall motion analysis, myocardial viability analysis and image fusion tools. Here we present an overview of the validation results and validation procedures for the functionality of the software. We describe a technique to ensure continued accuracy and validity of the software by implementing and using a test script that tests the functionality of the software and validates the output. The software has been made freely available for research purposes in a source code format on the project home page (http://segment.heiberg.se). Segment

  2. Portable microcomputer for the analysis of plutonium gamma-ray spectra. Volume II. Software description and listings

    International Nuclear Information System (INIS)

    Ruhter, W.D.

    1984-05-01

    A portable microcomputer has been developed and programmed for the International Atomic Energy Agency (IAEA) to perform in-field analysis of plutonium gamma-ray spectra. The unit includes a 16-bit LSI-11/2 microprocessor, 32-K words of memory, a 20-character display for user prompting, a numeric keyboard for user responses, and a 20-character thermal printer for hard-copy output of results. The unit weighs 11 kg and has dimensions of 33.5 x 30.5 x 23.0 cm. This compactness allows the unit to be stored under an airline seat. Only the positions of the 148-keV 241Pu and 208-keV 237U peaks are required for the spectral analysis that gives plutonium isotopic ratios and weight percent abundances. Volume I of this report provides a detailed description of the data analysis methodology, operation instructions, hardware, and maintenance and troubleshooting. Volume II describes the software and provides software listings
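
    Although the analysis methodology itself is in Volume I, the core of any gamma-based isotopic ratio measurement is the conversion of net peak count rates into an atom ratio through decay constants, branching ratios and detection efficiencies. A generic sketch (the numerical inputs are placeholders, not the instrument's calibration data):

        import math

        def atom_ratio(c1, c2, br1, br2, eff1, eff2, t_half1_y, t_half2_y):
            """N1/N2 from net peak count rates, using C = lambda * N * BR * eff."""
            lam1 = math.log(2) / t_half1_y
            lam2 = math.log(2) / t_half2_y
            return (c1 / c2) * (br2 * eff2 * lam2) / (br1 * eff1 * lam1)

        # Placeholder inputs for a 241Pu/239Pu-type ratio; real analyses take branching
        # ratios from nuclear data tables and efficiencies from a calibration curve.
        print(atom_ratio(c1=1500.0, c2=900.0, br1=1.9e-4, br2=5.0e-6,
                         eff1=0.012, eff2=0.010, t_half1_y=14.35, t_half2_y=24110.0))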

  3. An isotopic analysis process with optical emission spectrometry on a laser-produced plasma

    International Nuclear Information System (INIS)

    Mauchien, P.; Pietsch, W.; Petit, A.; Briand, A.

    1994-01-01

    The sample to be analyzed is irradiated with a laser beam to produce a plasma at the sample surface; the spectrum of the light emitted by the plasma is analyzed and the isotopic composition of the sample is derived from the spectrometry. The process is preferentially applied to uranium and plutonium; it is rapid, simpler and cheaper than previous methods, and may be applied to 'in-situ' isotopic analysis in the nuclear industry. 2 figs

  4. Individual economical value of plutonium isotopes and analysis of the reprocessing of irradiated fuel

    International Nuclear Information System (INIS)

    Gomes, I.C.; Rubini, L.A.; Barroso, D.E.G.

    1983-01-01

    An economic analysis of plutonium recycling in a PWR reactor, without any modification, is performed, assuming an open market for the plutonium. The individual values of the plutonium isotopes are determined by solving a system of four equations in which the unknowns are the Pu-239, Pu-240, Pu-241 and Pu-242 values. The equations are obtained by equating the cost of the plutonium fuel cycle for four different isotopic mixtures to the cost of the uranium fuel cycle. (E.G.) [pt
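
    The four-equation approach described here is a linear system: each row weights the per-isotope inventory of one plutonium mixture, and the right-hand side is the equivalent uranium-cycle cost. A sketch with invented compositions and costs (not the paper's data):

        import numpy as np

        # Rows: four fuel mixtures; columns: kg of Pu-239, Pu-240, Pu-241, Pu-242
        # per reload (hypothetical compositions).
        A = np.array([
            [620.0, 210.0, 110.0, 35.0],
            [580.0, 240.0, 120.0, 45.0],
            [540.0, 260.0, 140.0, 60.0],
            [500.0, 280.0, 150.0, 80.0],
        ])
        # Equivalent uranium-cycle cost of each mixture (hypothetical, k$).
        c = np.array([9100.0, 8700.0, 8400.0, 8000.0])

        values = np.linalg.solve(A, c)   # unit value of each isotope
        for iso, v in zip(["Pu-239", "Pu-240", "Pu-241", "Pu-242"], values):
            print(f"{iso}: {v:9.2f} k$/kg")   # fissile isotopes come out far more valuable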

  5. User-driven integrated software lives: ``Paleomag'' paleomagnetics analysis on the Macintosh

    Science.gov (United States)

    Jones, Craig H.

    2002-12-01

    "PaleoMag," a paleomagnetics analysis package originally developed for the Macintosh operating system in 1988, allows examination of demagnetization of individual samples and analysis of directional data from collections of samples. Prior to recent reinvigorated development of the software for both Macintosh and Windows, it was widely used despite not running properly on machines and operating systems sold after 1995. This somewhat surprising situation demonstrates that there is a continued need for integrated analysis software within the earth sciences, in addition to well-developed scripting and batch-mode software. One distinct advantage of software like PaleoMag is in the ability to combine quality control with analysis within a unique graphical environment. Because such demands are frequent within the earth sciences, means of nurturing the development of similar software should be found.

  6. Forensic analysis of explosives using isotope ratio mass spectrometry (IRMS)--discrimination of ammonium nitrate sources.

    Science.gov (United States)

    Benson, Sarah J; Lennard, Christopher J; Maynard, Philip; Hill, David M; Andrew, Anita S; Roux, Claude

    2009-06-01

    An evaluation was undertaken to determine if isotope ratio mass spectrometry (IRMS) could assist in the investigation of complex forensic cases by providing a level of discrimination not achievable utilising traditional forensic techniques. The focus of the research was on ammonium nitrate (AN), a common oxidiser used in improvised explosive mixtures. The potential value of IRMS to attribute Australian AN samples to the manufacturing source was demonstrated through the development of a preliminary AN classification scheme based on nitrogen isotopes. Although the discrimination utilising nitrogen isotopes alone was limited and only relevant to samples from the three Australian manufacturers during the evaluated time period, the classification scheme has potential as an investigative aid. Combining oxygen and hydrogen stable isotope values permitted the differentiation of AN prills from three different Australian manufacturers. Samples from five different overseas sources could be differentiated utilising a combination of the nitrogen, oxygen and hydrogen isotope values. Limited differentiation between Australian and overseas prills was achieved for the samples analysed. The comparison of nitrogen isotope values from intact AN prill samples with those from post-blast AN prill residues highlighted that the nitrogen isotopic composition of the prills was not maintained post-blast; hence, limiting the technique to analysis of un-reacted explosive material.

  7. Utility of stable isotope analysis in studying foraging ecology of herbivores: Examples from moose and caribou

    Science.gov (United States)

    Ben-David, Merav; Shochat, Einav; Adams, Layne G.

    2001-01-01

    Recently, researchers emphasized that patterns of stable isotope ratios observed at the individual level are a result of an interaction between ecological, physiological, and biochemical processes. Isotopic models for herbivores provide additional complications because those mammals consume foods that have high variability in nitrogen concentrations. In addition, distribution of amino acids in plants may differ greatly from that required by a herbivore. At northern latitudes, where the growing season of vegetation is short, isotope ratios in herbivore tissues are expected to differ between seasons. Summer ratios likely reflect diet composition, whereas winter ratios would reflect diet and nutrient recycling by the animals. We tested this hypothesis using data collected from blood samples of caribou (Rangifer tarandus) and moose (Alces alces) in Denali National Park and Preserve, Alaska, USA. Stable isotope ratios of moose and caribou were significantly different from each other in late summer-autumn and winter. Also, late summer-autumn and winter ratios differed significantly between seasons in both species. Nonetheless, we were unable to evaluate whether differences in seasonal isotopic ratios were a result of diet selection or a response to nutrient recycling. We believe that additional studies on plant isotopic ratios as related to ecological factors in conjunction with investigations of diet selection by the herbivores will enhance our understanding of those interactions. Also, controlled studies investigating the relation between diet and physiological responses in herbivores will increase the utility of isotopic analysis in studying foraging ecology of herbivores.

  8. In Situ Carbon Isotope Analysis by Laser Ablation MC-ICP-MS.

    Science.gov (United States)

    Chen, Wei; Lu, Jue; Jiang, Shao-Yong; Zhao, Kui-Dong; Duan, Deng-Fei

    2017-12-19

    Carbon isotopes have been widely used in tracing a wide variety of geological and environmental processes. The carbon isotope composition of bulk rocks and minerals was conventionally analyzed by isotope ratio mass spectrometry (IRMS), and, more recently, secondary ionization mass spectrometry (SIMS) has been widely used to determine the carbon isotope composition of carbon-bearing solid materials with good spatial resolution. Here, we present a new method that couples a RESOlution S155 193 nm laser ablation system with a Nu Plasma II MC-ICP-MS, with the aim of measuring carbon isotopes in situ in carbonate minerals (i.e., calcite and aragonite). Under routine operating conditions for δ13C analysis, instrumental bias generally drifts by 0.8‰-2.0‰ in a typical analytical session of 2-3 h. Using a magmatic calcite as the standard, the carbon isotopic composition was determined for a suite of calcite samples with δ13C values in the range of -6.94‰ to 1.48‰. The obtained δ13C data are comparable to IRMS values, and the combined standard uncertainty for magmatic calcite is small enough that laser ablation MC-ICP-MS can serve as an appropriate method to analyze carbon isotopes of carbonate minerals in situ.
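
    To handle the 0.8‰-2.0‰ per-session drift reported above, laser-ablation sessions typically bracket unknowns between analyses of the standard and interpolate the instrumental bias in time. A generic standard-sample bracketing sketch (the sequence and numbers are hypothetical, not the paper's protocol):

        import numpy as np

        d13c_std_true = -5.0                          # accepted d13C of the bracketing standard
        std_times = np.array([0.0, 30.0, 60.0])       # minutes into the session
        std_measured = np.array([-4.1, -3.8, -3.4])   # drifting raw values of the standard

        def correct(sample_time, sample_raw):
            """Subtract the linearly interpolated instrumental bias at analysis time."""
            bias = np.interp(sample_time, std_times, std_measured - d13c_std_true)
            return sample_raw - bias

        print(correct(15.0, -6.2))   # an unknown measured midway between two standards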

  9. Decision making model design for antivirus software selection using Factor Analysis and Analytical Hierarchy Process

    OpenAIRE

    Nurhayati Ai; Gautama Aditya; Naseer Muchammad

    2018-01-01

    Virus spread increased significantly through the internet in 2017. One protection method is the use of antivirus software. The wide variety of antivirus software on the market tends to create confusion among consumers, and selecting the right antivirus according to their needs has become difficult. This is the reason we conducted our research. We formulate a decision-making model for antivirus software consumers. The model is constructed using factor analysis and the AHP method. First we spread que...
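
    The AHP step named in this abstract reduces to computing the principal eigenvector of a pairwise-comparison matrix and checking its consistency ratio; a minimal sketch with an invented three-criterion comparison (say, detection rate, price and system load):

        import numpy as np

        # Pairwise comparisons on the Saaty 1-9 scale (hypothetical judgements).
        M = np.array([
            [1.0, 3.0, 5.0],
            [1/3, 1.0, 2.0],
            [1/5, 1/2, 1.0],
        ])

        eigvals, eigvecs = np.linalg.eig(M)
        k = np.argmax(eigvals.real)
        weights = np.abs(eigvecs[:, k].real)
        weights /= weights.sum()              # criterion priorities

        n = M.shape[0]
        CI = (eigvals[k].real - n) / (n - 1)  # consistency index
        CR = CI / 0.58                        # 0.58 = random index for n = 3
        print(weights, f"CR = {CR:.3f} (conventionally acceptable if < 0.1)")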

  10. Food certification based on isotopic analysis, according to the European standards

    International Nuclear Information System (INIS)

    Costinel, Diana; Ionete, Roxana Elena; Vremera, Raluca; Stanciu, Vasile; Iordache, Andreea

    2007-01-01

    Full text: Under current EU research projects, several public research institutions, universities and private companies are collaborating to develop new methods for evidencing food adulteration and consequently assessing food safety. The use of mass spectrometry (MS) to determine the ratio of stable isotopes in bio-molecules now provides the means to prove the natural origin of a wide variety of foodstuffs - and therefore, to identify fraud and consequently to reject improper products or certify food quality. Isotope analysis has been officially adopted by the EU as a means of controlling adulteration of some foodstuffs. A network of research organizations developed the use of isotopic analysis to support training and technology transfer to encourage uptake of the technique. Proficiency-testing schemes were also developed to ensure the correct use of isotopic techniques in national testing laboratories. In addition, ensuring food quality and safety is a requirement that must be fulfilled for integration into the EU. The present paper emphasizes the isotopic analysis of D/H, 18O/16O and 13C/12C in food (honey, juice, wines) using a new-generation isotope ratio MS, a Finnigan Delta V Plus, coupled to three flexible continuous-flow preparation devices (GasBench II, TC Elemental Analyser and GC-C/TC). (authors)

  11. Development of a new method for hydrogen isotope analysis of trace hydrocarbons in natural gas samples

    Directory of Open Access Journals (Sweden)

    Xibin Wang

    2016-12-01

    Full Text Available A new method has been developed for the analysis of the hydrogen isotopic composition of trace hydrocarbons in natural gas samples, using solid phase microextraction (SPME) combined with gas chromatography-isotope ratio mass spectrometry (GC/IRMS). In this study, the SPME technique was introduced to achieve the enrichment of trace hydrocarbons of low abundance and coupled to GC/IRMS for hydrogen isotopic analysis. The main parameters, including the equilibration time, extraction temperature, and the fiber type, were systematically optimized. The results not only demonstrated a high extraction yield but also showed that no hydrogen isotopic fractionation occurred during the extraction process when the SPME device was fitted with a polydimethylsiloxane/divinylbenzene/carbon molecular sieve (PDMS/DVB/CAR) fiber. The applications of the SPME-GC/IRMS method were evaluated using natural gas samples collected from different sedimentary basins; the standard deviation (SD) was better than 4‰ for replicate measurements, and hydrogen isotope values from C1 to C9 could be obtained with satisfactory repeatability. The SPME-GC/IRMS method fitted with the PDMS/DVB/CAR fiber is well suited for the preconcentration of trace hydrocarbons and provides reliable hydrogen isotopic analysis of trace hydrocarbons in natural gas samples.

  12. Usability study of clinical exome analysis software: top lessons learned and recommendations.

    Science.gov (United States)

    Shyr, Casper; Kushniruk, Andre; Wasserman, Wyeth W

    2014-10-01

    New DNA sequencing technologies have revolutionized the search for genetic disruptions. Targeted sequencing of all protein-coding regions of the genome, called exome analysis, is actively used in research-oriented genetics clinics, with the transition to exomes as a standard procedure underway. This transition is challenging; identification of potentially causal mutation(s) amongst ∼10^6 variants requires specialized computation in combination with expert assessment. This study analyzes the usability of user interfaces for clinical exome analysis software. There are two study objectives: (1) to ascertain the key features of successful user interfaces for clinical exome analysis software based on the perspective of expert clinical geneticists, and (2) to assess user-system interactions in order to reveal strengths and weaknesses of existing software, inform future design, and accelerate the clinical uptake of exome analysis. Surveys, interviews, and cognitive task analysis were performed for the assessment of two next-generation exome sequence analysis software packages. The subjects included ten clinical geneticists who interacted with the software packages using the "think aloud" method. Subjects' interactions with the software were recorded in their clinical office within an urban research and teaching hospital. All major user interface events (from the user interactions with the packages) were time-stamped and annotated with coding categories to identify usability issues in order to characterize desired features and deficiencies in the user experience. We detected 193 usability issues, the majority of which concern interface layout and navigation, and the resolution of reports. Our study highlights gaps in specific software features typical within exome analysis. The clinicians perform best when the flow of the system is structured into well-defined yet customizable layers for incorporation within the clinical workflow. The results highlight opportunities to

  13. AMIDE: A Free Software Tool for Multimodality Medical Image Analysis

    Directory of Open Access Journals (Sweden)

    Andreas Markus Loening

    2003-07-01

    Full Text Available AMIDE's a Medical Image Data Examiner (AMIDE) has been developed as a user-friendly, open-source software tool for displaying and analyzing multimodality volumetric medical images. Central to the package's ability to simultaneously display multiple data sets (e.g., PET, CT, MRI) and regions of interest is the on-demand data reslicing implemented within the program. Data sets can be freely shifted, rotated, viewed, and analyzed, with the program automatically handling interpolation as needed from the original data. Validation has been performed by comparing the output of AMIDE with that of several existing software packages. AMIDE runs on UNIX, Macintosh OS X, and Microsoft Windows platforms, and it is freely available with source code under the terms of the GNU General Public License.

  14. Shell-model-based deformation analysis of light cadmium isotopes

    Science.gov (United States)

    Schmidt, T.; Heyde, K. L. G.; Blazhev, A.; Jolie, J.

    2017-07-01

    Large-scale shell-model calculations for the even-even cadmium isotopes 98Cd-108Cd have been performed with the antoine code in the π(2p1/2; 1g9/2) ν(2d5/2; 3s1/2; 2d3/2; 1g7/2; 1h11/2) model space without further truncation. Known experimental energy levels and B(E2) values could be well reproduced. Taking these calculations as a starting ground, we analyze the deformation parameters predicted for the Cd isotopes as a function of neutron number N and spin J using the methods of model-independent invariants introduced by Kumar [Phys. Rev. Lett. 28, 249 (1972), 10.1103/PhysRevLett.28.249] and Cline [Annu. Rev. Nucl. Part. Sci. 36, 683 (1986), 10.1146/annurev.ns.36.120186.003343].

  15. Analysis and application of heavy isotopes in the environment

    Science.gov (United States)

    Steier, Peter; Dellinger, Franz; Forstner, Oliver; Golser, Robin; Knie, Klaus; Kutschera, Walter; Priller, Alfred; Quinto, Francesca; Srncik, Michaela; Terrasi, Filippo; Vockenhuber, Christof; Wallner, Anton; Wallner, Gabriele; Wild, Eva Maria

    2010-04-01

    A growing number of AMS laboratories are pursuing applications of actinides. We discuss the basic requirements of the AMS technique for heavy (i.e., above ∼150 amu) isotopes, present the setup at the Vienna Environmental Research Accelerator (VERA), which is especially well suited for the isotope 236U, and give a comparison with other AMS facilities. Special emphasis will be put on elaborating the effective detection limits for environmental samples with respect to other mass spectrometric methods. At VERA, we have carried out measurements for radiation protection and environmental monitoring (236U, 239,240,241,242,244Pu), astrophysics (182Hf, 236U, 244Pu, 247Cm), nuclear physics, and a search for long-lived super-heavy elements (Z > 100). We are pursuing the environmental distribution of 236U as a basis for geological applications of natural 236U.

  16. Analysis and application of heavy isotopes in the environment

    International Nuclear Information System (INIS)

    Steier, Peter; Dellinger, Franz; Forstner, Oliver; Golser, Robin; Knie, Klaus; Kutschera, Walter; Priller, Alfred; Quinto, Francesca; Srncik, Michaela; Terrasi, Filippo; Vockenhuber, Christof; Wallner, Anton; Wallner, Gabriele; Wild, Eva Maria

    2010-01-01

    A growing number of AMS laboratories are pursuing applications of actinides. We discuss the basic requirements of the AMS technique for heavy (i.e., above ∼150 amu) isotopes, present the setup at the Vienna Environmental Research Accelerator (VERA), which is especially well suited for the isotope 236U, and give a comparison with other AMS facilities. Special emphasis will be put on elaborating the effective detection limits for environmental samples with respect to other mass spectrometric methods. At VERA, we have carried out measurements for radiation protection and environmental monitoring (236U, 239,240,241,242,244Pu), astrophysics (182Hf, 236U, 244Pu, 247Cm), nuclear physics, and a search for long-lived super-heavy elements (Z > 100). We are pursuing the environmental distribution of 236U as a basis for geological applications of natural 236U.

  17. Nonlinear analysis of reinforced concrete structures using software package abaqus

    OpenAIRE

    Marković Nemanja; Stojić Dragoslav; Cvetković Radovan

    2014-01-01

    Reinforced concrete (RC) is characterized by strong inhomogeneity resulting from the material characteristics of concrete and by quasi-brittle behavior during failure. These and other phenomena require the introduction of material nonlinearity in the modeling of reinforced concrete structures. This paper presents the modeling of reinforced concrete in the software package ABAQUS. A brief theoretical overview is presented of methods such as Concrete Damage Plasticity (CDP) and Smeared Concrete Cr...

  18. Comparison of two three-dimensional cephalometric analysis computer software

    OpenAIRE

    Sawchuk, Dena; Alhadlaq, Adel; Alkhadra, Thamer; Carlyle, Terry D; Kusnoto, Budi; El-Bialy, Tarek

    2014-01-01

    Background: Three-dimensional cephalometric analyses are attracting increasing attention in orthodontics. The aim of this study was to compare two software packages for evaluating three-dimensional cephalometric analyses of orthodontic treatment outcomes. Materials and Methods: Twenty cone beam computed tomography images were obtained using the i-CAT® imaging system from patients' records as part of their regular orthodontic records. The images were analyzed using InVivoDental5.0 (Anatomage Inc.) and 3DCeph™ (Unive...

  19. Security Analysis of a Software Defined Wide Area Network Solution

    OpenAIRE

    Rajendran, Ashok

    2016-01-01

    Enterprise wide area network (WAN) is a private network that connects the computers and other devices across an organisation's branch locations and data centers, forming the backbone of enterprise communication. Currently, multiprotocol label switching (MPLS) is commonly used to provide this service. As a recent alternative to MPLS, software-defined wide area networking (SD-WAN) solutions are being introduced as an IP-based cloud-networking service for enterprises. SD-WAN virtualizes the n...

  20. Software for muscle fibre type classification and analysis

    Czech Academy of Sciences Publication Activity Database

    Karen, Petr; Števanec, M.; Smerdu, V.; Cvetko, E.; Kubínová, Lucie; Eržen, I.

    2009-01-01

    Roč. 53, č. 2 (2009), s. 87-95 ISSN 1121-760X R&D Projects: GA MŠk(CZ) LC06063; GA MŠk(CZ) MEB090910 Institutional research plan: CEZ:AV0Z50110509 Keywords : muscle fiber types * myosin heavy chain isoforms * image processing Subject RIV: JC - Computer Hardware ; Software Impact factor: 0.886, year: 2009

  1. The analysis of software system in SOPHY SPECT

    International Nuclear Information System (INIS)

    Xu Chikang

    1993-01-01

    The FORTH software system of the Single Photon Emission Computed Tomography (SPECT) system made by the French SOPHA MEDICAL Corp. is analysed. On the basis of a brief introduction to the construction principles and programming methods of the FORTH language, the whole structure and layout of the Sophy system are described. With the help of some figures, the modular structure, the allocation of the hard disk and internal storage, as well as the running procedure of the system are introduced in detail.

  2. Isotopic analysis of Bothrops atrox in Amazonian forest

    Science.gov (United States)

    Martinez, M. G.; Silva, A. M.; Chalkidis, H.; de Oliveira Júnior, R. C.; Camargo, P. B.

    2012-12-01

    Snake envenomation is considered a public health problem, especially in populations from rural areas of tropical and subtropical countries. In Brazil, of some 26,000 snakebites, 90% are caused by the genus Bothrops, with Bothrops atrox the predominant species throughout the Brazilian Amazon. Research shows that stable isotopes can be used to characterize the isotopic composition of animal tissues, which depends mainly on the food and water ingested and the gases inhaled. For this study, samples were taken from Bothrops atrox (B. atrox) in the forest using pitfall traps with drift fences. The analyses were performed by mass spectrometry, with analytical errors of 0.3‰ for carbon and 0.5‰ for nitrogen. The results for the forest animals are significantly different from those for vivarium animals. The mean carbon-13 and nitrogen-15 values for the tissues and venoms of the forest snakes are δ13C = -24.68‰ and δ15N = 14.22‰, while the mean values for the tissues and venoms of the vivarium snakes (Instituto Butantan) are δ13C = -20.47‰ and δ15N = 8.36‰; the values differ significantly because of the animals' different food sources. Based on all the isotopic δ13C and δ15N results, we can suggest that as the diet of the snake changes (wild versus captivity), changes occur in relation to diet and environment, since the means of the isotopic data are quite distinct, and that these changes may also cause metabolic changes in the animal's body, modulated by the different turnover periods of each tissue analyzed.

  3. Computer automated mass spectrometer for isotope analysis on gas samples

    International Nuclear Information System (INIS)

    Pamula, A.; Kaucsar, M.; Fatu, C.; Ursu, D.; Vonica, D.; Bendea, D.; Muntean, F.

    1998-01-01

    A low-resolution, high-precision instrument was designed and realized in the mass spectrometry laboratory of the Institute of Isotopic and Molecular Technology, Cluj-Napoca. The paper presents the vacuum system, the sample inlet system, the ion source, the magnetic analyzer and the ion collector. The instrument is almost completely automated. The analog-to-digital conversion circuits, the local control microcomputer, the automation systems and the performance checking are described. (authors)

  4. Assessing connectivity of estuarine fishes based on stable isotope ratio analysis

    Science.gov (United States)

    Herzka, Sharon Z.

    2005-07-01

    Assessing connectivity is fundamental to understanding the population dynamics of fishes. I propose that isotopic analyses can greatly contribute to studies of connectivity in estuarine fishes due to the high diversity of isotopic signatures found among estuarine habitats and the fact that variations in isotopic composition at the base of a food web are reflected in the tissues of consumers. Isotopic analysis can be used for identifying nursery habitats and estimating their contribution to adult populations. If movement to a new habitat is accompanied by a shift to foods of distinct isotopic composition, recent immigrants and residents can be distinguished based on their isotopic ratios. Movement patterns thus can be reconstructed based on information obtained from individuals. A key consideration is the rate of isotopic turnover, which determines the length of time that an immigrant to a given habitat will be distinguishable from a longtime resident. A literature survey indicated that few studies have measured turnover rates in fishes and that these have focused on larvae and juveniles. These studies reveal that biomass gain is the primary process driving turnover rates, while metabolic turnover is either minimal or undetectable. Using a simple dilution model and biomass-specific growth rates, I estimated that young fishes with fast growth rates will reflect the isotopic composition of a new diet within days or weeks. Older or slower-growing individuals may take years or never fully equilibrate. Future studies should evaluate the factors that influence turnover rates in fishes during various stages of the life cycle and in different tissues, as well as explore the potential for combining stable isotope and otolith microstructure analyses to examine the relationship between demographic parameters, movement and connectivity.
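
    The simple dilution model referred to above treats isotopic change as growth alone diluting the pre-switch tissue pool, with no metabolic replacement; under that assumption the tissue value approaches the new diet as the inverse of relative biomass gain. A sketch (all values hypothetical):

        def delta_tissue(d_initial, d_new_diet, relative_mass):
            """Growth-only dilution model.

            relative_mass = W_t / W_0, biomass relative to the moment of the diet switch;
            metabolic turnover is assumed negligible.
            """
            return d_new_diet + (d_initial - d_new_diet) / relative_mass

        # A juvenile fish that doubles its mass after immigrating to a new habitat:
        print(delta_tissue(d_initial=-18.0, d_new_diet=-12.0, relative_mass=2.0))  # -15.0 per mil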

  5. The conflict between cheetahs and humans on Namibian farmland elucidated by stable isotope diet analysis.

    Directory of Open Access Journals (Sweden)

    Christian C Voigt

    Full Text Available Large areas of Namibia are covered by farmland, which is also used by game and predator species. Because it can cause conflicts with farmers when predators, such as cheetahs (Acinonyx jubatus, hunt livestock, we assessed whether livestock constitutes a significant part of the cheetah diet by analysing the stable isotope composition of blood and tissue samples of cheetahs and their potential prey species. According to isotopic similarities, we defined three isotopic categories of potential prey: members of a C4 food web with high δ15N values (gemsbok, cattle, springhare and guinea fowl and those with low δ15N values (hartebeest, warthog, and members of a C3 food web, namely browsers (eland, kudu, springbok, steenbok and scrub hare. We quantified the trophic discrimination of heavy isotopes in cheetah muscle in 9 captive individuals and measured an enrichment for 15N (3.2‰ but not for 13C in relation to food. We captured 53 free-ranging cheetahs of which 23 were members of groups. Cheetahs of the same group were isotopically distinct from members of other groups, indicating that group members shared their prey. Solitary males (n = 21 and males in a bachelor groups (n = 11 fed mostly on hartebeest and warthogs, followed by browsers in case of solitary males, and by grazers with high δ15N values in case of bachelor groups. Female cheetahs (n = 9 predominantly fed on browsers and used also hartebeest and warthogs. Mixing models suggested that the isotopic prey category that included cattle was only important, if at all, for males living in bachelor groups. Stable isotope analysis of fur, muscle, red blood cells and blood plasma in 9 free-ranging cheetahs identified most individuals as isotopic specialists, focussing on isotopically distinct prey categories as their food.

  6. The Conflict between Cheetahs and Humans on Namibian Farmland Elucidated by Stable Isotope Diet Analysis

    Science.gov (United States)

    Voigt, Christian C.; Thalwitzer, Susanne; Melzheimer, Jörg; Blanc, Anne-Sophie; Jago, Mark; Wachter, Bettina

    2014-01-01

    Large areas of Namibia are covered by farmland, which is also used by game and predator species. Because it can cause conflicts with farmers when predators, such as cheetahs (Acinonyx jubatus), hunt livestock, we assessed whether livestock constitutes a significant part of the cheetah diet by analysing the stable isotope composition of blood and tissue samples of cheetahs and their potential prey species. According to isotopic similarities, we defined three isotopic categories of potential prey: members of a C4 food web with high δ15N values (gemsbok, cattle, springhare and guinea fowl) and those with low δ15N values (hartebeest, warthog), and members of a C3 food web, namely browsers (eland, kudu, springbok, steenbok and scrub hare). We quantified the trophic discrimination of heavy isotopes in cheetah muscle in 9 captive individuals and measured an enrichment for 15N (3.2‰) but not for 13C in relation to food. We captured 53 free-ranging cheetahs, of which 23 were members of groups. Cheetahs of the same group were isotopically distinct from members of other groups, indicating that group members shared their prey. Solitary males (n = 21) and males in bachelor groups (n = 11) fed mostly on hartebeest and warthogs, followed by browsers in the case of solitary males, and by grazers with high δ15N values in the case of bachelor groups. Female cheetahs (n = 9) predominantly fed on browsers and also used hartebeest and warthogs. Mixing models suggested that the isotopic prey category that included cattle was only important, if at all, for males living in bachelor groups. Stable isotope analysis of fur, muscle, red blood cells and blood plasma in 9 free-ranging cheetahs identified most individuals as isotopic specialists, focussing on isotopically distinct prey categories as their food. PMID:25162403
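
    In their simplest linear form, the mixing models mentioned in both versions of this record solve for source proportions that sum to one and reproduce the predator's discrimination-corrected isotope values; published analyses typically use Bayesian tools instead, and all numbers below are invented:

        import numpy as np

        # Source means (d13C, d15N) for three isotopic prey categories (hypothetical).
        sources = np.array([
            [-13.0, 12.0],   # C4 food web, high d15N
            [-13.5,  7.0],   # C4 food web, low d15N
            [-21.0,  6.0],   # C3 browsers
        ])
        cheetah = np.array([-14.75, 12.5])       # measured tissue values (hypothetical)
        discrimination = np.array([0.0, 3.2])    # enrichment measured in captive cheetahs

        # Two isotope balances plus the constraint that proportions sum to 1.
        A = np.vstack([sources.T, np.ones(3)])
        b = np.append(cheetah - discrimination, 1.0)
        p = np.linalg.solve(A, b)
        print(p)   # dietary proportion of each prey category, here [0.5, 0.3, 0.2]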

  7. Storm runoff analysis using environmental isotopes and major ions

    International Nuclear Information System (INIS)

    Fritz, P.; Cherry, J.A.; Sklash, M.; Weyer, K.U.

    1976-01-01

    At a given locality, the oxygen-18 content of rainwater varies from storm to storm but within broad seasonal trends. Very frequently, especially during heavy summer storms, the stable isotope composition of rainwater differs from that of the groundwater in the area. This isotopic difference can be used to differentiate between 'prestorm' and 'rain' components in storm runoff. This approach to the use of natural 18O was applied in four hydrogeologically very different basins in Canada. Their surface areas range from less than 2 km2 to more than 700 km2. Before, during and after the storm events, samples of stream water, groundwater and rain were analysed for 18O and in some cases for deuterium, major ions and electrical conductance. The 18O hydrograph separations show that groundwater was a major component of the runoff in each of the basins, and usually exceeded 50% of the total water discharged. Even at peak stream flow, most of the discharge was subsurface water. The identification of geographic sources rather than time sources appears possible if isotope techniques are used in conjunction with chemical analyses, hydrological data - such as flow measurements - and visual observations. (author)
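
    The two-component separation behind these results follows from a mass balance on water and 18O: Q_total·δ_stream = Q_gw·δ_gw + Q_rain·δ_rain with Q_total = Q_gw + Q_rain. A worked sketch with invented δ18O values:

        def groundwater_fraction(d_stream, d_rain, d_groundwater):
            """Two-component isotope hydrograph separation."""
            return (d_stream - d_rain) / (d_groundwater - d_rain)

        # Summer storm with isotopically light groundwater and heavier rain (hypothetical).
        f_gw = groundwater_fraction(d_stream=-9.0, d_rain=-4.0, d_groundwater=-11.0)
        print(f"groundwater fraction of storm runoff: {f_gw:.0%}")   # about 71%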

  8. Software design specification and analysis(NuFDS) approach for the safety critical software based on porgrammable logic controller(PLC)

    International Nuclear Information System (INIS)

    Koo, Seo Ryong; Seong, Poong Hyun; Jung, Jin Yong; Choi, Seong Soo

    2004-01-01

    This paper introduces a software design specification and analysis technique for safety-critical systems based on Programmable Logic Controllers (PLCs). During the software development phases, the design phase plays an important role in connecting the requirements phase and the implementation phase, as a process of translating problem requirements into software structures. In this work, the Nuclear FBD-style Design Specification and analysis (NuFDS) approach is proposed. The NuFDS approach for nuclear Instrumentation and Control (I and C) software is suggested in a straightforward manner. It consists of four major specifications: database, software architecture, system behavior, and PLC hardware configuration. Additionally, correctness, completeness, consistency, and traceability check techniques are suggested for formal design analysis in the NuFDS approach. For tool support, we are developing the NuSDS tool, which is based on the NuFDS approach and intended especially for software design specification in nuclear fields.

  9. 13C metabolic flux analysis: optimal design of isotopic labeling experiments.

    Science.gov (United States)

    Antoniewicz, Maciek R

    2013-12-01

    Measuring fluxes by 13C metabolic flux analysis (13C-MFA) has become a key activity in chemical and pharmaceutical biotechnology. Optimal design of isotopic labeling experiments is of central importance to 13C-MFA as it determines the precision with which fluxes can be estimated. Traditional methods for selecting isotopic tracers and labeling measurements did not fully utilize the power of 13C-MFA. Recently, new approaches were developed for optimal design of isotopic labeling experiments based on parallel labeling experiments and algorithms for rational selection of tracers. In addition, advanced isotopic labeling measurements were developed based on tandem mass spectrometry. Combined, these approaches can dramatically improve the quality of 13C-MFA results with important applications in metabolic engineering and biotechnology.

  10. Tracking transformation processes of organic micropollutants in aquatic environments using multi-element isotope fractionation analysis

    International Nuclear Information System (INIS)

    Hofstetter, Thomas B.; Bolotin, Jakov; Skarpeli-Liati, Marita; Wijker, Reto; Kurt, Zohre; Nishino, Shirley F.; Spain, Jim C.

    2011-01-01

    The quantitative description of enzymatic or abiotic transformations of man-made organic micropollutants in rivers, lakes, and groundwaters is one of the major challenges associated with the risk assessment of water resource contamination. Compound-specific isotope analysis enables one to identify (bio)degradation pathways based on changes in the contaminants' stable isotope ratios even if multiple reactive and non-reactive processes cause concentrations to decrease. Here, we investigated how the magnitude and variability of isotope fractionation in some priority pollutants is determined by the kinetics and mechanisms of important enzymatic and abiotic redox reactions. For nitroaromatic compounds and substituted anilines, we illustrate that competing transformation pathways can be assessed via trends of N and C isotope signatures.
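
    For a single irreversible reaction, the isotope fractionation these authors exploit is commonly described by the Rayleigh equation, δ ≈ δ0 + ε·ln(f), linking the shift in a contaminant's isotope ratio to its remaining fraction f. A worked sketch (the enrichment factor and δ values are invented, not the paper's data):

        import math

        def delta_remaining(delta0_permil, epsilon_permil, f_remaining):
            """Approximate Rayleigh model: delta = delta0 + epsilon * ln(f)."""
            return delta0_permil + epsilon_permil * math.log(f_remaining)

        # Hypothetical nitroaromatic plume: initial d15N of +1.0 per mil, enrichment
        # factor of -10 per mil; after 70% degradation (f = 0.3):
        print(delta_remaining(1.0, -10.0, 0.3))   # about +13 per mil, a diagnostic shift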

  11. Diode laser based resonance ionization mass spectrometry for spectroscopy and trace analysis of uranium isotopes

    International Nuclear Information System (INIS)

    Hakimi, Amin

    2013-01-01

    In this doctoral thesis, the upgrade and optimization of a diode laser system for high-resolution resonance ionization mass spectrometry is described. A frequency-control system based on a double-interferometric approach, allowing absolute stabilization down to 1 MHz as well as frequency detunings of several GHz within a second for up to three lasers in parallel, was optimized. This laser system was used for spectroscopic studies on uranium isotopes, yielding precise and unambiguous level energies, total angular momenta, hyperfine constants and isotope shifts. Furthermore, an efficient excitation scheme which can be operated with commercial diode lasers was developed. The performance of the complete laser mass spectrometer was optimized and characterized for the ultra-trace analysis of the uranium isotope 236U, which serves as a neutron flux dosimeter and tracer for radioactive anthropogenic contaminations in the environment. Using synthetic samples, an isotope selectivity of 236U/238U = 4.5(1.5) x 10^-9 was demonstrated.

  12. Optical spectroscopy of laser-produced plasmas for standoff isotopic analysis

    Science.gov (United States)

    Harilal, S. S.; Brumfield, B. E.; LaHaye, N. L.; Hartig, K. C.; Phillips, M. C.

    2018-06-01

    Rapid, in-field, and non-contact isotopic analysis of solid materials is extremely important to a large number of applications, such as nuclear nonproliferation monitoring and forensics, geochemistry, archaeology, and biochemistry. Presently, isotopic measurements for these and many other fields are performed in laboratory settings. Rapid, in-field, and non-contact isotopic analysis of solid material is possible with optical spectroscopy tools when combined with laser ablation. Laser ablation generates a transient vapor of any solid material when a powerful laser interacts with a sample of interest. Analysis of atoms, ions, and molecules in a laser-produced plasma using optical spectroscopy tools can provide isotopic information with the advantages of real-time analysis, standoff capability, and no sample preparation requirement. Both emission and absorption spectroscopy methods can be used for isotopic analysis of solid materials. However, applying optical spectroscopy to the measurement of isotope ratios from solid materials presents numerous challenges. Isotope shifts arise primarily due to variation in nuclear charge distribution caused by different numbers of neutrons, but the small proportional nuclear mass differences between nuclei of various isotopes lead to correspondingly small differences in optical transition wavelengths. Along with this, various line broadening mechanisms in laser-produced plasmas and instrumental broadening generated by the detection system are technical challenges frequently encountered with emission-based optical diagnostics. These challenges can be overcome by measuring the isotope shifts associated with the vibronic emission bands from molecules or by using the techniques of laser-based absorption/fluorescence spectroscopy to marginalize the effect of instrumental broadening. Absorption and fluorescence spectroscopy probe the ground state atoms existing in the plasma when it is cooler, which inherently provides narrower
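
    The Doppler contribution to the broadening problem described above is easy to estimate: the FWHM is Δν = ν0·sqrt(8·kB·T·ln2/(m·c^2)), which for a heavy atom in a cooling plasma is of the same GHz order as many uranium isotope shifts, illustrating why emission measurements struggle. A quick calculation (the line and temperature are representative choices, not values from this record):

        import math

        def doppler_fwhm_ghz(wavelength_nm, temp_K, mass_amu):
            """Doppler FWHM of a spectral line: dnu = nu0 * sqrt(8 kB T ln2 / (m c^2))."""
            k_B = 1.380649e-23      # J/K
            amu = 1.66053907e-27    # kg
            c = 2.99792458e8        # m/s
            nu0 = c / (wavelength_nm * 1e-9)
            return nu0 * math.sqrt(8 * k_B * temp_K * math.log(2)
                                   / (mass_amu * amu * c ** 2)) / 1e9

        # A uranium line near 424 nm in a plasma cooled to ~3000 K: roughly 1.8 GHz.
        print(f"{doppler_fwhm_ghz(424.4, 3000.0, 238.0):.2f} GHz")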

  13. Comparative Performance Analysis of Machine Learning Techniques for Software Bug Detection

    OpenAIRE

    Saiqa Aleem; Luiz Fernando Capretz; Faheem Ahmed

    2015-01-01

    Machine learning techniques can be used to analyse data from different perspectives and enable developers to retrieve useful information. Machine learning techniques are proven to be useful in terms of software bug prediction. In this paper, a comparative performance analysis of different machine learning techniques is explored for software bug prediction on publicly available data sets. Results showed most of the mac ...
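
    For readers unfamiliar with such benchmarks, the sketch below shows the general shape of a comparative bug-prediction experiment using scikit-learn. The data are synthetic stand-ins, not the public data sets used in the paper, and the model list is an assumption.

```python
# Minimal comparative bug-prediction benchmark in the spirit of the
# paper, on synthetic stand-in data for module metrics and defect labels.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=500, n_features=10, random_state=0)

models = {
    "NaiveBayes": GaussianNB(),
    "DecisionTree": DecisionTreeClassifier(random_state=0),
    "LogisticRegression": LogisticRegression(max_iter=1000),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()  # 5-fold CV accuracy
    print(f"{name}: mean CV accuracy = {acc:.3f}")
```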

  14. Choosing your weapons: on sentiment analysis tools for software engineering research

    NARCIS (Netherlands)

    Jongeling, R.M.; Datta, S.; Serebrenik, A.; Koschke, R.; Krinke, J.; Robillard, M.

    2015-01-01

    Recent years have seen increasing attention to social aspects of software engineering, including studies of emotions and sentiments experienced and expressed by software developers. Most of these studies reuse existing sentiment analysis tools such as SentiStrength and NLTK. However, these
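
    As a taste of the kind of off-the-shelf tool reuse the paper examines, the hedged sketch below scores invented developer comments with NLTK's VADER analyzer; SentiStrength, the other tool named above, has a separate non-Python interface and is not shown.

```python
# Scoring developer comments with NLTK's VADER sentiment analyzer.
# The comments are invented examples.
import nltk
nltk.download("vader_lexicon", quiet=True)  # one-time lexicon fetch
from nltk.sentiment.vader import SentimentIntensityAnalyzer

sia = SentimentIntensityAnalyzer()
for comment in ["Great fix, thanks!", "This build is broken again..."]:
    # compound score lies in [-1, 1]; negative values = negative sentiment
    print(comment, "->", sia.polarity_scores(comment)["compound"])
```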

  15. FieldTrip: Open source software for advanced analysis of MEG, EEG, and invasive electrophysiological data

    NARCIS (Netherlands)

    Oostenveld, R.; Fries, P.; Maris, E.G.G.; Schoffelen, J.M.

    2011-01-01

    This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow

  16. Simulation and Analysis of Isotope Separation System for Fusion Fuel Recovery System

    Science.gov (United States)

    Senevirathna, Bathiya; Gentile, Charles

    2011-10-01

    This paper presents results of a simulation of the Fuel Recovery System (FRS) for the Laser Inertial Fusion Engine (LIFE) reactor. The LIFE reaction will produce exhaust gases that will need to be recycled in the FRS along with xenon, the chamber's intervention gas. Solids and liquids will first be removed, and then vapor traps are used to remove large gas molecules such as lead. The gas will be reacted with lithium at high temperatures to extract the hydrogen isotopes protium, deuterium, and tritium in hydride form. The hydrogen isotopes will be recovered using a lithium blanket processing system already in place, and this product will be sent to the Isotope Separation System (ISS). The ISS will be modeled in software to analyze its effectiveness. Aspen HYSYS was chosen for this purpose for its widespread use in industrial gas processing systems. Reactants and corresponding chemical reactions had to be initialized in the software. The ISS primarily consists of four cryogenic distillation columns, and these were modeled in HYSYS based on design requirements. Fractional compositions of the distillate and liquid products were analyzed and used to optimize the overall system.

  17. Analysis and design of software ecosystem architectures – Towards the 4S telemedicine ecosystem

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Hansen, Klaus Marius; Kyng, Morten

    2014-01-01

    ... and application stove-pipes that inhibit the adoption of telemedical solutions. To which extent can a software ecosystem approach to telemedicine alleviate this? Objective In this article, we define the concept of software ecosystem architecture as the structure(s) of a software ecosystem comprising elements... Methods For i), we performed a descriptive, revelatory case study of the Danish telemedicine ecosystem and, for ii), we experimentally designed, implemented, and evaluated the architecture of 4S. Results We contribute in three areas. First, we define the software ecosystem architecture concept that captures organization... experience in creating and evolving the 4S telemedicine ecosystem. Conclusion The concept of software ecosystem architecture can be used analytically and constructively in the analysis and design of software ecosystems, respectively.

  18. The software safety analysis based on SFTA for reactor power regulating system in nuclear power plant

    International Nuclear Information System (INIS)

    Liu Zhaohui; Yang Xiaohua; Liao Longtao; Wu Zhiqiang

    2015-01-01

    The digitalized Instrumentation and Control (I and C) systems of nuclear power plants provide many advantages. However, digital control systems introduce new failure modes that differ from those of analog control systems. While the cost effectiveness and flexibility of software is widely recognized, it is very difficult to achieve and prove high levels of dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. Software safety analysis (SSA) is one way to improve software safety by identifying the system hazards caused by software failure. This paper describes the application of software fault tree analysis (SFTA) at the software design phase. First, we evaluated all the software modules of the reactor power regulating system in a nuclear power plant and identified various hazards. SFTA was then applied to critical modules selected in the previous step. Finally, we obtained several new hazards that had not been identified during the earlier document evaluation, which were helpful for our design. (author)
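
    To make the SFTA idea concrete, the sketch below propagates basic-event failure probabilities through AND/OR gates under an independence assumption. The tree structure and all probabilities are invented for illustration; they are not from the reactor power regulating system study.

```python
# Minimal fault-tree evaluation: AND/OR gates over independent events.

def gate_and(*p):
    """All inputs must fail."""
    out = 1.0
    for x in p:
        out *= x
    return out

def gate_or(*p):
    """At least one input fails."""
    out = 1.0
    for x in p:
        out *= 1.0 - x
    return 1.0 - out

# Hypothetical basic events for a control-software hazard
p_input_handling = 1e-4   # faulty sensor-input handling
p_signal_filter = 5e-5    # faulty signal filtering
p_setpoint_calc = 2e-5    # wrong setpoint computation
p_watchdog_miss = 1e-3    # watchdog fails to catch the fault

p_wrong_demand = gate_or(p_input_handling, p_signal_filter, p_setpoint_calc)
p_top = gate_and(p_wrong_demand, p_watchdog_miss)  # top event
print(f"P(top event) ~ {p_top:.2e}")
```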

  19. Suitability of selected free-gas and dissolved-gas sampling containers for carbon isotopic analysis.

    Science.gov (United States)

    Eby, P; Gibson, J J; Yi, Y

    2015-07-15

    Storage trials were conducted for 2 to 3 months using a hydrocarbon and carbon dioxide gas mixture with known carbon isotopic composition to simulate typical hold times for gas samples prior to isotopic analysis. A range of containers (both pierced and unpierced) was periodically sampled to test for δ(13)C isotopic fractionation. Seventeen containers were tested for free-gas storage (20°C, 1 atm pressure) and 7 containers were tested for dissolved-gas storage, the latter prepared by bubbling free gas through tap water until saturated (20°C, 1 atm) and then preserved to avoid biological activity by acidifying to pH 2 with phosphoric acid and stored in the dark at 5°C. Samples were extracted using valves or by piercing septa, and then introduced into an isotope ratio mass spectrometer for compound-specific δ(13)C measurements. For free gas, stainless steel canisters and crimp-top glass serum bottles with butyl septa were most effective at preventing isotopic fractionation (pierced and unpierced), whereas silicone and PTFE-butyl septa allowed significant isotopic fractionation. FlexFoil and Tedlar bags were found to be effective only for storage of up to 1 month. For dissolved gas, crimp-top glass serum bottles with butyl septa were again effective, whereas silicone and PTFE-butyl were not. FlexFoil bags were reliable for up to 2 months. Our results suggest a range of preferred containers as well as several that did not perform very well for isotopic analysis. Overall, the results help establish better QA/QC procedures to avoid isotopic fractionation when storing environmental gas samples. Recommended containers for air transportation include steel canisters and glass serum bottles with butyl septa (pierced and unpierced). Copyright © 2015 John Wiley & Sons, Ltd.
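
    The acceptance test implicit in such storage trials can be sketched as follows: a container is flagged as fractionating when the stored-gas δ13C deviates from the known reference value by more than twice the analytical precision. The numbers below are illustrative, not data from the study.

```python
# Flagging isotopic fractionation in storage-trial data (invented values).
reference_d13C = -42.0   # known d13C of the test gas (per mil)
sigma = 0.3              # assumed 1-sigma analytical precision (per mil)

stored = {
    "steel canister": -42.1,
    "serum bottle (butyl septum)": -41.8,
    "bag after 3 months": -40.6,
}
for container, d in stored.items():
    delta = d - reference_d13C
    flag = abs(delta) > 2 * sigma  # 2-sigma acceptance criterion
    print(f"{container}: delta = {delta:+.1f} per mil "
          f"{'FRACTIONATED' if flag else 'ok'}")
```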

  20. Linking cases of illegal shootings of the endangered California condor using stable lead isotope analysis

    Energy Technology Data Exchange (ETDEWEB)

    Finkelstein, Myra E., E-mail: myraf@ucsc.edu [Microbiology and Environmental Toxicology Department, University of California, Santa Cruz, CA 95064 (United States); Kuspa, Zeka E. [Microbiology and Environmental Toxicology Department, University of California, Santa Cruz, CA 95064 (United States); Welch, Alacia [National Park Service, Pinnacles National Park, 5000 Highway 146, Paicines, CA 95043 (United States); Eng, Curtis; Clark, Michael [Los Angeles Zoo and Botanical Gardens, 5333 Zoo Drive, Los Angeles, CA 90027 (United States); Burnett, Joseph [Ventana Wildlife Society, 19045 Portola Dr. Ste. F-1, Salinas, CA 93908 (United States); Smith, Donald R. [Microbiology and Environmental Toxicology Department, University of California, Santa Cruz, CA 95064 (United States)

    2014-10-15

    Lead poisoning is preventing the recovery of the critically endangered California condor (Gymnogyps californianus), and lead isotope analyses have demonstrated that ingestion of spent lead ammunition is the principal source of lead poisoning in condors. Over an 8 month period in 2009, three lead-poisoned condors were independently presented with birdshot embedded in their tissues, evidence that they had been shot. No information connecting these illegal shooting events existed and the timing of the shooting(s) was unknown. Using lead concentration and stable lead isotope analyses of feathers, blood, and recovered birdshot, we observed that: i) lead isotope ratios of embedded shot from all three birds were measurably indistinguishable from each other, suggesting a common source; ii) lead exposure histories re-constructed from feather analysis suggested that the shooting(s) occurred within the same timeframe; and iii) two of the three condors were lead poisoned from a lead source isotopically indistinguishable from the embedded birdshot, implicating ingestion of this type of birdshot as the source of poisoning. One of the condors was subsequently lead poisoned the following year from ingestion of a lead buckshot (blood lead 556 µg/dL), illustrating that ingested shot poses a substantially greater lead poisoning risk than embedded shot retained in tissue (blood lead ∼20 µg/dL). To our knowledge, this is the first study to use lead isotopes as a tool to retrospectively link wildlife shooting events. - Highlights: • We conducted a case-based analysis of illegal shootings of California condors. • Blood and feather Pb isotopes were used to reconstruct the illegal shooting events. • Embedded birdshot from the three condors had the same Pb isotope ratios. • Feather and blood Pb isotopes indicated that the condors were shot in a common event. • Ingested shot causes substantially greater lead exposure compared to embedded shot.

  1. Verification of LRFD Bridge Design and Analysis Software for INDOT

    OpenAIRE

    Varma, Amit H.; Seo, Jungil

    2009-01-01

    NCHRP Process 12-50 was implemented to evaluate and verify composite steel I-girder bridge design software used commonly in Indiana. A test-bed of twenty-one bridges was developed with guidance from an Indiana Department of Transportation-appointed research advisory panel (RAP). The test-bed included five simple-span and sixteen multi-span bridge superstructures. More than 80 parameters were required to define a bridge, including bridge span, girder spacing, number of beams, section...

  2. Analysis of carbon stable isotope to determine the origin and migration of gaseous hydrocarbon in the Brazilian sedimentary basins

    International Nuclear Information System (INIS)

    Takaki, T.; Rodrigues, R.

    1986-01-01

    The carbon isotopic composition of natural gases is analysed to determine the origin and migration of gaseous hydrocarbons in Brazilian sedimentary basins. The carbon isotopic ratio of methane from natural gases depends on the process of gas formation and the stage of organic matter maturation. In geochemical surface exploration, biogenic gases are differentiated from thermogenic gases because the latter are isotopically heavier. As the isotopic composition of methane does not change during migration, gases that have migrated from deeper and more mature source rocks are identified by their relative 13C enrichment. The methane was separated by chromatography and the isotopic analysis was done with a mass spectrometer. (M.C.K.)
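
    The biogenic/thermogenic distinction described above is often reduced to simple δ13C-of-methane cut-offs. The sketch below applies commonly cited approximate thresholds; these values are general rules of thumb treated here as assumptions, not thresholds from this paper.

```python
# Rough genetic classification of a natural gas from d13C of methane,
# following the principle that thermogenic methane is isotopically
# heavier. Cut-off values are commonly cited approximations.
def classify_methane(d13C_per_mil):
    if d13C_per_mil < -60.0:
        return "likely biogenic (microbial)"
    elif d13C_per_mil > -50.0:
        return "likely thermogenic"
    return "mixed / transitional"

for d in (-72.0, -55.0, -42.0):
    print(f"d13C = {d} per mil -> {classify_methane(d)}")
```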

  3. Application of Gaia Analysis Software AGIS to Nano-JASMINE

    Science.gov (United States)

    Yamada, Y.; Lammers, U.; Gouda, N.

    2011-07-01

    The core data reduction for the Nano-JASMINE mission is planned to be done with Gaia's Astrometric Global Iterative Solution (AGIS). Nano-JASMINE is an ultra-small (35 kg) satellite for astrometry observations in Japan, and Gaia is ESA's large (over 1000 kg) next-generation astrometry mission. The accuracy of Nano-JASMINE is about 3 mas, comparable to that of the Hipparcos mission, Gaia's predecessor some 20 years ago. Performing real scientific observations with such a small satellite is challenging. The collaboration for sharing software started in 2007. In addition to the similar design and operating principles of the two missions, this is possible thanks to the encapsulation of all Gaia-specific aspects of AGIS in a Parameter Database. Nano-JASMINE will be the test bench for the Gaia AGIS software. We present this idea in detail and the practical steps necessary to make AGIS work with Nano-JASMINE data. We also show the key mission parameters, goals, and status of the data reduction for Nano-JASMINE.

  4. SEDA: A software package for the Statistical Earthquake Data Analysis

    Science.gov (United States)

    Lombardi, A. M.

    2017-03-01

    In this paper, the first version of the software SEDA (SEDAv1.0), designed to help seismologists statistically analyze earthquake data, is presented. The package consists of a user-friendly Matlab-based interface, which allows the user to easily interact with the application, and a computational core of Fortran codes, to guarantee maximum speed. The primary factor driving the development of SEDA is to guarantee research reproducibility, a growing movement among scientists that is highly recommended by the most important scientific journals. SEDAv1.0 is mainly devoted to producing accurate and fast outputs; less care has been taken over graphical appeal, which will be improved in the future. The main part of SEDAv1.0 is devoted to ETAS modeling. SEDAv1.0 contains a set of consistent tools on ETAS, allowing the estimation of parameters, the testing of models on data, the simulation of catalogs, the identification of sequences and the calculation of forecasts. The peculiarities of the routines inside SEDAv1.0 are discussed in this paper. More specific details on the software are presented in the manual accompanying the program package.
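
    For context, the quantity at the heart of ETAS fitting is the conditional intensity. The sketch below evaluates the standard Omori-Utsu form on a toy catalog; parameter values are illustrative assumptions, not SEDA defaults.

```python
# ETAS conditional intensity:
# lambda(t) = mu + sum_{t_i < t} K * exp(alpha*(m_i - m0)) * (t - t_i + c)**(-p)
import numpy as np

def etas_intensity(t, times, mags, mu=0.2, K=0.05, alpha=1.0,
                   c=0.01, p=1.1, m0=3.0):
    times, mags = np.asarray(times), np.asarray(mags)
    past = times < t  # only events before t contribute
    aftershock = (K * np.exp(alpha * (mags[past] - m0))
                  * (t - times[past] + c) ** (-p))
    return mu + aftershock.sum()

# Toy catalog: occurrence times (days) and magnitudes
rate = etas_intensity(3.0, [0.0, 0.5, 2.0], [5.0, 3.5, 4.2])
print(f"lambda(t=3.0) ~ {rate:.3f} events/day")
```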

  5. Diet of spotted bats (Euderma maculatum) in Arizona as indicated by fecal analysis and stable isotopes

    Science.gov (United States)

    We assessed diet of spotted bats (Euderma maculatum (J.A. Allen, 1891)) by visual analysis of bat feces and stable carbon (δ13C) and nitrogen (δ15N) isotope analysis of bat feces, wing, hair, and insect prey. We collected 33 fecal samples from spotted bats and trapped 3755 insect...

  6. Stable Isotope Quantitative N-Glycan Analysis by Liquid Separation Techniques and Mass Spectrometry.

    Science.gov (United States)

    Mittermayr, Stefan; Albrecht, Simone; Váradi, Csaba; Millán-Martín, Silvia; Bones, Jonathan

    2017-01-01

    Liquid phase separation analysis and subsequent quantitation remain challenging tasks for protein-derived oligosaccharides due to their inherent structural complexity and diversity. Incomplete resolution or co-detection of multiple glycan species complicates peak area-based quantitation and associated statistical analysis when optical detection methods are used. The approach outlined herein describes the utilization of stable isotope variants of commonly used fluorescent tags that allow for mass-based glycan identification and relative quantitation following separation by liquid chromatography (LC) or capillary electrophoresis (CE). Comparability assessment of glycoprotein-derived oligosaccharides is performed by derivatization with commercially available isotope variants of 2-aminobenzoic acid or aniline and analysis by LC- and CE-mass spectrometry. Quantitative information is attained from the extracted ion chromatogram/electropherogram ratios generated from the light and heavy isotope clusters.

  7. Application of software quality assurance methods in validation and maintenance of reactor analysis computer codes

    International Nuclear Information System (INIS)

    Reznik, L.

    1994-01-01

    Various computer codes employed at Israel Electricity Company for preliminary reactor design analysis and fuel cycle scoping calculations have often been subject to program source modifications. Although most changes were due to computer or operating system compatibility problems, a number of significant modifications were due to model improvements and enhancements of algorithm efficiency and accuracy. With growing acceptance of software quality assurance requirements and methods, a program of extensive testing of modified software has been adopted within the regular maintenance activities. In this work, a survey has been performed of various software quality assurance methods of software testing, which belong mainly to the two major categories of implementation-based ('white box') and specification-based ('black box') testing. The results of this survey exhibit a clear preference for specification-based testing. In particular, the equivalence class partitioning method and the boundary value method have been selected as especially suitable functional methods for testing reactor analysis codes. A separate study of software analysis and specification methods has been performed in this work with the objective of establishing appropriate pre-test software specification methods. Two methods have been selected as the most suitable for this purpose: the data flow diagram method has been shown to be particularly valuable for functional/procedural software specification, while entity-relationship diagrams proved efficient for specifying the software data/information domain. The feasibility of these two methods has been analyzed in particular for software uncertainty analysis and overall code accuracy estimation. (author). 14 refs
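
    The two preferred techniques named above combine naturally, as in the sketch below: partition the input domain into equivalence classes and generate boundary-value test points for the valid range. The parameter and its range are assumptions for illustration, not taken from the paper.

```python
# Equivalence class partitioning + boundary value analysis for one
# hypothetical input parameter of a reactor analysis code.

def boundary_value_cases(lo, hi, eps=1e-6):
    """Classic test points: just below/at/just above each boundary,
    plus a nominal mid-range value."""
    return [lo - eps, lo, lo + eps, (lo + hi) / 2.0, hi - eps, hi, hi + eps]

# Hypothetical valid enrichment range (wt%) for a fuel input parameter
print(boundary_value_cases(0.7, 5.0))
# Equivalence classes: invalid below 0.7, valid [0.7, 5.0], invalid
# above 5.0; one representative per class plus the boundary cases gives
# a compact functional test set.
```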

  8. Direct uranium isotope ratio analysis of single micrometer-sized glass particles

    International Nuclear Information System (INIS)

    Kappel, Stefanie; Boulyga, Sergei F.; Prohaska, Thomas

    2012-01-01

    We present the application of nanosecond laser ablation (LA) coupled to a 'Nu Plasma HR' multi-collector inductively coupled plasma mass spectrometer (MC-ICP-MS) for the direct analysis of U isotope ratios in single, 10–20 μm-sized, U-doped glass particles. Method development included studies with respect to (1) external correction of the measured U isotope ratios in glass particles, (2) the applied laser ablation carrier gas (i.e. Ar versus He) and (3) the accurate determination of the lower-abundance 236U/238U isotope ratios (i.e. 10^-5). In addition, a data processing procedure was developed for evaluation of transient signals, which is of potential use for routine application of the developed method. We demonstrate that the developed method is reliable and well suited for determining U isotope ratios of individual particles. Analyses of twenty-eight S1 glass particles, measured under optimized conditions, yielded average biases of less than 0.6% from the certified values for 234U/238U and 235U/238U ratios. Experimental results obtained for 236U/238U isotope ratios deviated by less than −2.5% from the certified values. Expanded relative total combined standard uncertainties Uc (k = 2) of 2.6%, 1.4% and 5.8% were calculated for 234U/238U, 235U/238U and 236U/238U, respectively. - Highlights: ► LA-MC-ICP-MS was fully validated for the direct analysis of individual particles. ► Traceability was established by using an IRMM glass particle reference material. ► Measured U isotope ratios were in agreement with the certified range. ► A comprehensive total combined uncertainty evaluation was performed. ► The analysis of 236U/238U isotope ratios was improved by using a deceleration filter.

  9. Growth versus metabolic tissue replacement in mouse tissues determined by stable carbon and nitrogen isotope analysis

    Science.gov (United States)

    Macavoy, S. E.; Jamil, T.; Macko, S. A.; Arneson, L. S.

    2003-12-01

    Stable isotope analysis is becoming an extensively used tool in animal ecology. The isotopes most commonly used for analysis in terrestrial systems are those of carbon and nitrogen, due to differential carbon fractionation in C3 and C4 plants, and the approximately 3‰ enrichment in 15N per trophic level. Although isotope signatures in animal tissues presumably reflect the local food web, analysis is often complicated by differential nutrient routing and fractionation by tissues, and by the possibility that large organisms are not in isotopic equilibrium with the foods available in their immediate environment. Additionally, the rate at which organisms incorporate the isotope signature of a food through both growth and metabolic tissue replacement is largely unknown. In this study we have assessed the rate of carbon and nitrogen isotopic turnover in liver, muscle and blood in mice following a diet change. By determining growth rates, we were able to determine the proportion of tissue turnover caused by growth versus that caused by metabolic tissue replacement. Growth was found to account for approximately 10% of observed tissue turnover in sexually mature mice (Mus musculus). Blood carbon was found to have the shortest half-life (16.9 days), followed by muscle (24.7 days). Liver carbon turnover was not as well described by the exponential decay equations as other tissues. However, substantial liver carbon turnover was observed by the 28th day after diet switch. Surprisingly, these tissues primarily reflect the carbon signature of the protein, rather than carbohydrate, source in their diet. The nitrogen signature in all tissues was enriched by 3-5‰ over their dietary protein source, depending on tissue type, and the isotopic turnover rates were comparable to those observed in carbon.
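
    Turnover half-lives like the 16.9 days reported for blood carbon are typically derived by fitting an exponential approach to the new diet's signature. The sketch below shows that fit on synthetic data; the δ values and starting guesses are invented.

```python
# Fit delta13C(t) = d_final + (d_initial - d_final)*exp(-k*t);
# half-life = ln(2)/k. Synthetic data for illustration.
import numpy as np
from scipy.optimize import curve_fit

def turnover(t, d_final, d_initial, k):
    return d_final + (d_initial - d_final) * np.exp(-k * t)

t = np.array([0, 7, 14, 28, 56, 84], dtype=float)            # days
d13C = np.array([-21.0, -18.9, -17.4, -15.6, -14.3, -14.0])  # per mil

popt, _ = curve_fit(turnover, t, d13C, p0=(-14.0, -21.0, 0.05))
print(f"half-life ~ {np.log(2) / popt[2]:.1f} days")
```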

  10. Microbial degradation of alpha-cypermethrin in soil by compound-specific stable isotope analysis

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Zemin [MOE Key Laboratory of Environmental Remediation and Ecosystem Health, College of Environmental and Resource Sciences, Zhejiang University, Hangzhou 310058 (China); Shen, Xiaoli [MOE Key Laboratory of Environmental Remediation and Ecosystem Health, College of Environmental and Resource Sciences, Zhejiang University, Hangzhou 310058 (China); Department of Environmental Engineering, Quzhou University, Quzhou 324000 (China); Zhang, Xi-Chang [Laboratory for Teaching in Environmental and Resource Sciences, Zhejiang University, Hangzhou 310058 (China); Liu, Weiping [MOE Key Laboratory of Environmental Remediation and Ecosystem Health, College of Environmental and Resource Sciences, Zhejiang University, Hangzhou 310058 (China); Yang, Fangxing, E-mail: fxyang@zju.edu.cn [MOE Key Laboratory of Environmental Remediation and Ecosystem Health, College of Environmental and Resource Sciences, Zhejiang University, Hangzhou 310058 (China); Department of Effect-Directed Analysis, Helmholtz Center for Environmental Research – UFZ, Leipzig 04318 (Germany)

    2015-09-15

    Highlights: • Alpha-cypermethrin (α-CP) can be degraded by microorganisms in soil. • Biodegradation of α-CP resulted in carbon isotope fractionation. • A relationship was found between carbon isotope ratios and concentrations of α-CP. • An enrichment factor ε of α-CP was determined as −1.87‰. • CSIA is applicable to assess biodegradation of α-CP. - Abstract: To assess microbial degradation of alpha-cypermethrin in soil, attenuation of alpha-cypermethrin was investigated by compound-specific stable isotope analysis. The variations in the residual concentrations and stable carbon isotope ratios of alpha-cypermethrin were measured in unsterilized and sterilized soils spiked with alpha-cypermethrin. After an 80-day incubation, the concentrations of alpha-cypermethrin decreased to 0.47 and 3.41 mg/kg in the unsterilized soils spiked with 2 and 10 mg/kg, while they decreased to 1.43 and 6.61 mg/kg in the sterilized soils. Meanwhile, the carbon isotope ratios shifted to −29.14 ± 0.22‰ and −29.86 ± 0.33‰ in the unsterilized soils spiked with 2 and 10 mg/kg, respectively. The results revealed that microbial degradation contributed to the attenuation of alpha-cypermethrin and induced the carbon isotope fractionation. In order to quantitatively assess microbial degradation, a relationship between the carbon isotope ratios and residual concentrations of alpha-cypermethrin was established according to the Rayleigh equation. An enrichment factor, ε = −1.87‰, was obtained, which can be employed to assess microbial degradation of alpha-cypermethrin. The significant carbon isotope fractionation during microbial degradation suggests that CSIA is a proper approach to qualitatively detect and quantitatively assess biodegradation during the attenuation of alpha-cypermethrin in the field.
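
    The practical use of such an enrichment factor is to convert a measured isotope shift into an extent of biodegradation via the linearized Rayleigh relationship. The sketch below uses the ε = −1.87‰ reported in the abstract; the example δ13C values are illustrative.

```python
# Rayleigh (linearized): ln(f) ~ (delta_t - delta_0)/eps, all in per mil;
# fraction biodegraded B = 1 - f.
import numpy as np

def fraction_biodegraded(delta_t, delta_0, eps_per_mil):
    f_remaining = np.exp((delta_t - delta_0) / eps_per_mil)
    return 1.0 - f_remaining

# Hypothetical sample: source d13C = -30.5 per mil, measured -29.1 per mil,
# with the enrichment factor from the abstract (eps = -1.87 per mil)
B = fraction_biodegraded(-29.1, -30.5, -1.87)
print(f"estimated fraction biodegraded ~ {B:.0%}")
```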

  11. Software for tomographic analysis: application in ceramic filters

    International Nuclear Information System (INIS)

    Figuerola, W.B.; Assis, J.T.; Oliveira, L.F.; Lopes, R.T.

    2001-01-01

    New methods for acquiring data have been developed with technological advances. With these, it has been possible to obtain more precise data and, consequently, to produce results with greater reliability. Among the variety of acquisition methods available, those that provide a volume description, such as CT (computerized tomography) and NMR (nuclear magnetic resonance), stand out. Volumetric data models (groups of data that describe a solid object in three-dimensional space) are widely used in a diversity of areas as a means of inspection, modeling and simulation of objects in three-dimensional space. Applications of this model are already found in mechanical engineering, geosciences, medicine and other areas. In engineering it is sometimes necessary to use industrial CT as the only non-invasive way of inspecting the interior of parts without destroying them. 3D micro-focus X-ray tomography is a non-destructive testing technique used in many different areas of science and technology, given its capacity to generate clean images (practically free of the unsharpness effect) and high-resolution reconstructions. The unsharpness effect minimization and spatial resolution improvement are consequences of reducing the focal spot size of the X-ray micro-focus tube to dimensions smaller than 50 μm. Ceramic filters are widely used in the metallurgical industry, particularly for cast aluminum, where they are used to remove waste carried in the liquid aluminum. The ceramic filters used in this work are manufactured by FUSICO (a German company) and are constructed from foams. They are manufactured in three models: 10, 20 and 30 ppi (pores per inch). In this paper we present the development of software to analyze and characterize ceramic filters, which can be divided into four stages. This software was developed in C++, using object-oriented programming, and is capable of being executed on multiple platforms (Windows

  12. Software project profitability analysis using temporal probabilistic reasoning; an empirical study with the CASSE framework

    CSIR Research Space (South Africa)

    Balikuddembe, JK

    2009-04-01

    Undertaking adequate risk management by understanding project requirements and ensuring that viable estimates are made on software projects requires extensive application of sophisticated techniques of analysis and interpretation. Informative...

  13. Development of the Free-space Optical Communications Analysis Software (FOCAS)

    Science.gov (United States)

    Jeganathan, M.; Mecherle, G.; Lesh, J.

    1998-01-01

    The Free-space Optical Communications Analysis Software (FOCAS) was developed at the Jet Propulsion Laboratory (JPL) to provide mission planners, systems engineers and communications engineers with an easy-to-use tool to analyze optical communications links.

  14. Development of tools for safety analysis of control software in advanced reactors

    Energy Technology Data Exchange (ETDEWEB)

    Guarro, S.; Yau, M.; Motamed, M. [Advanced Systems Concepts Associates, El Segundo, CA (United States)

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described.

  15. Development of tools for safety analysis of control software in advanced reactors

    International Nuclear Information System (INIS)

    Guarro, S.; Yau, M.; Motamed, M.

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described

  16. Recent developments in application of stable isotope analysis on agro-product authenticity and traceability.

    Science.gov (United States)

    Zhao, Yan; Zhang, Bin; Chen, Gang; Chen, Ailiang; Yang, Shuming; Ye, Zhihua

    2014-02-15

    With the globalisation of agro-product markets and convenient transportation of food across countries and continents, the potential for distribution of mis-labelled products increases accordingly, highlighting the need for measures to identify the origin of food. High quality food with identified geographic origin is a concern not only for consumers, but also for agriculture farmers, retailers and administrative authorities. Currently, stable isotope ratio analysis in combination with other chemical methods gradually becomes a promising approach for agro-product authenticity and traceability. In the last five years, a growing number of research papers have been published on tracing agro-products by stable isotope ratio analysis and techniques combining with other instruments. In these reports, the global variety of stable isotope compositions has been investigated, including light elements such as C, N, H, O and S, and heavy isotopes variation such as Sr and B. Several factors also have been considered, including the latitude, altitude, evaporation and climate conditions. In the present paper, an overview is provided on the authenticity and traceability of the agro-products from both animal and plant sources by stable isotope ratio analysis. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. JULIDE: a software tool for 3D reconstruction and statistical analysis of autoradiographic mouse brain sections.

    Directory of Open Access Journals (Sweden)

    Delphine Ribes

    In this article we introduce JULIDE, a software toolkit developed to perform the 3D reconstruction, intensity normalization, volume standardization by 3D image registration and voxel-wise statistical analysis of autoradiographs of mouse brain sections. This software tool has been developed in the open-source ITK software framework and is freely available under a GPL license. The article presents the complete image processing chain from raw data acquisition to 3D statistical group analysis. Results of the group comparison in the context of a study on spatial learning are shown as an illustration of the data that can be obtained with this tool.

  18. Analysis for Parallel Execution without Performing Hardware/Software Co-simulation

    OpenAIRE

    Muhammad Rashid

    2014-01-01

    Hardware/software co-simulation improves the performance of embedded applications by executing the applications on a virtual platform before the actual hardware is available in silicon. However, the virtual platform of the target architecture is often not available during early stages of the embedded design flow. Consequently, analysis for parallel execution without performing hardware/software co-simulation is required. This article presents an analysis methodology for parallel execution of ...

  19. An assessment of software for flow cytometry analysis in banana plants

    Directory of Open Access Journals (Sweden)

    Renata Alves Lara Silva

    2014-02-01

    Flow cytometry is a technique that yields rapid results in analyses of cell properties such as volume, morphological complexity and quantitative DNA content, and it is considered more convenient than other techniques. However, the analysis usually generates histograms marked by variations that can be produced by many factors, including differences between the software packages that capture the data generated by the flow cytometer. The objective of the present work was to evaluate the performance of four software products commonly used in flow cytometry, based on quantifications of DNA content and analyses of the coefficients of variation associated with the software outputs. Readings were obtained from 25 'NBA' (AA) banana leaf samples using the FACSCalibur (BD) flow cytometer, and 25 histograms from each software product (CellQuest™, WinMDI™, FlowJo™ and FCS Express™) were analyzed to obtain the estimated DNA content and the coefficient of variation (CV) of the estimates. The values of DNA content obtained from the software did not differ significantly. However, the CV analysis showed that the precision of the WinMDI™ software was low and that its CV values were underestimated, whereas the remaining software showed CV values in relatively close agreement with those found in the literature. The CellQuest™ software is recommended because it was developed by the same company that produces the flow cytometer used in the present study.

  20. pH-dependent equilibrium isotope fractionation associated with the compound specific nitrogen and carbon isotope analysis of substituted anilines by SPME-GC/IRMS.

    Science.gov (United States)

    Skarpeli-Liati, Marita; Turgeon, Aurora; Garr, Ashley N; Arnold, William A; Cramer, Christopher J; Hofstetter, Thomas B

    2011-03-01

    Solid-phase microextraction (SPME) coupled to gas chromatography/isotope ratio mass spectrometry (GC/IRMS) was used to elucidate the effects of N-atom protonation on the analysis of N and C isotope signatures of selected aromatic amines. Precise and accurate isotope ratios were measured using polydimethylsiloxane/divinylbenzene (PDMS/DVB) as the SPME fiber material at solution pH values that exceeded the pK(a) of the substituted aniline's conjugate acid by two pH units. Deviations of δ(15)N and δ(13)C values from reference measurements by elemental analyzer IRMS were small. Under these conditions, the detection limits for accurate isotope ratio measurements were between 0.64 and 2.1 mg L(-1) for δ(15)N and between 0.13 and 0.54 mg L(-1) for δ(13)C, respectively. Substantial inverse N isotope fractionation was observed by SPME-GC/IRMS as the fraction of protonated species increased with decreasing pH, leading to deviations of up to −20‰, while the corresponding δ(13)C values were largely invariant. From isotope ratio analysis at different solution pHs and theoretical calculations by density functional theory, we derived equilibrium isotope effects, EIEs, pertinent to aromatic amine protonation of 0.980 and 1.001 for N and C, respectively, which were very similar for all compounds investigated. Our work shows that N-atom protonation can compromise accurate compound-specific N isotope analysis of aromatic amines.
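
    The pH dependence reported here follows directly from the speciation: the protonated fraction obeys Henderson-Hasselbalch, and with an N equilibrium isotope effect of EIE ≈ 0.980 the extracted neutral fraction shifts by roughly 1000·ln(EIE) ≈ −20‰ at full protonation, matching the deviation quoted above. The sketch below illustrates this; the pKa value and the simple linear mixing are assumptions for illustration.

```python
# pH-dependent d15N deviation of the SPME-extractable neutral fraction.
import numpy as np

def protonated_fraction(pH, pKa):
    """Henderson-Hasselbalch fraction of the protonated (anilinium) form."""
    return 1.0 / (1.0 + 10.0 ** (pH - pKa))

def d15N_shift(pH, pKa, EIE_N=0.980):
    """Approximate deviation (per mil), scaling 1000*ln(EIE) by the
    protonated fraction; generic illustration, not the paper's model."""
    return protonated_fraction(pH, pKa) * 1000.0 * np.log(EIE_N)

for pH in (7.0, 5.0, 3.0):
    print(f"pH {pH}: shift ~ {d15N_shift(pH, pKa=4.6):+.1f} per mil")
```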

  1. 39Ar Detection at the 10^-16 Isotopic Abundance Level with Atom Trap Trace Analysis

    Science.gov (United States)

    Jiang, W.; Williams, W.; Bailey, K.; Davis, A. M.; Hu, S.-M.; Lu, Z.-T.; O'Connor, T. P.; Purtschert, R.; Sturchio, N. C.; Sun, Y. R.; Mueller, P.

    2011-03-01

    Atom trap trace analysis, a laser-based atom counting method, has been applied to analyze atmospheric 39Ar (half-life = 269 yr), a cosmogenic isotope with an isotopic abundance of 8 × 10^-16. In addition to the superior selectivity demonstrated in this work, the counting rate and efficiency of atom trap trace analysis have been improved by 2 orders of magnitude over prior results. The significant applications of this new analytical capability lie in radioisotope dating of ice and water samples and in the development of dark matter detectors.

  2. Field ionization mass spectrometry (FIMS) applied to tracer studies and isotope dilution analysis

    International Nuclear Information System (INIS)

    Anbar, M.; Heck, H.d'A.; McReynolds, J.H.; St John, G.A.

    1975-01-01

    The nonfragmenting nature of field ionization mass spectrometry makes it a preferred technique for the isotopic analysis of multilabeled organic compounds. The possibility of field ionization of nonvolatile thermolabile materials significantly extends the potential uses of this technique beyond those of conventional ionization methods. Multilabeled tracers may be studied in biological systems with a sensitivity comparable to that of radioactive tracers. Isotope dilution analysis may be performed reliably by this technique down to picogram levels. These techniques will be illustrated by a number of current studies using multilabeled metabolites and drugs. The scope and limitations of the methodology are discussed

  3. Microcalorimeter Q-spectroscopy for rapid isotopic analysis of trace actinide samples

    Energy Technology Data Exchange (ETDEWEB)

    Croce, M.P., E-mail: mpcroce@lanl.gov [Los Alamos National Laboratory, Los Alamos, NM (United States); Bond, E.M.; Hoover, A.S.; Kunde, G.J.; Mocko, V.; Rabin, M.W.; Weisse-Bernstein, N.R.; Wolfsberg, L.E. [Los Alamos National Laboratory, Los Alamos, NM (United States); Bennett, D.A.; Hays-Wehle, J.; Schmidt, D.R.; Ullom, J.N. [National Institute of Standards and Technology, Boulder, CO (United States)

    2015-06-01

    We are developing superconducting transition-edge sensor (TES) microcalorimeters that are optimized for rapid isotopic analysis of trace actinide samples by Q-spectroscopy. By designing mechanically robust TESs and simplified detector assembly methods, we have developed a detector for Q-spectroscopy of actinides that can be assembled in minutes. We have characterized the effects of each simplification and present the results. Finally, we show results of isotopic analysis of plutonium samples with Q-spectroscopy detectors and compare the results to mass spectrometry.

  4. Microcalorimeter Q-spectroscopy for rapid isotopic analysis of trace actinide samples

    International Nuclear Information System (INIS)

    Croce, M.P.; Bond, E.M.; Hoover, A.S.; Kunde, G.J.; Mocko, V.; Rabin, M.W.; Weisse-Bernstein, N.R.; Wolfsberg, L.E.; Bennett, D.A.; Hays-Wehle, J.; Schmidt, D.R.; Ullom, J.N.

    2015-01-01

    We are developing superconducting transition-edge sensor (TES) microcalorimeters that are optimized for rapid isotopic analysis of trace actinide samples by Q-spectroscopy. By designing mechanically robust TESs and simplified detector assembly methods, we have developed a detector for Q-spectroscopy of actinides that can be assembled in minutes. We have characterized the effects of each simplification and present the results. Finally, we show results of isotopic analysis of plutonium samples with Q-spectroscopy detectors and compare the results to mass spectrometry

  5. Sensitivity Analysis for Design Optimization Integrated Software Tools, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this proposed project is to provide a new set of sensitivity analysis theory and codes, the Sensitivity Analysis for Design Optimization Integrated...

  6. Development of computer software for pavement life cycle cost analysis.

    Science.gov (United States)

    1988-01-01

    The life cycle cost analysis program (LCCA) is designed to automate and standardize life cycle costing in Virginia. It allows the user to input information necessary for the analysis, and it then completes the calculations and produces a printed copy...

  7. Essentials of iron, chromium, and calcium isotope analysis of natural materials by thermal ionization mass spectrometry

    Science.gov (United States)

    Fantle, M.S.; Bullen, T.D.

    2009-01-01

    The use of isotopes to understand the behavior of metals in geological, hydrological, and biological systems has rapidly expanded in recent years. One of the mass spectrometric techniques used to analyze metal isotopes is thermal ionization mass spectrometry, or TIMS. While TIMS has been a useful analytical technique for the measurement of isotopic composition for decades and TIMS instruments are widely distributed, there are significant difficulties associated with using TIMS to analyze isotopes of the lighter alkaline earth elements and transition metals. Overcoming these difficulties to produce relatively long-lived and stable ion beams from microgram-sized samples is a non-trivial task. We focus here on TIMS analysis of three geologically and environmentally important elements (Fe, Cr, and Ca) and present an in-depth look at several key aspects that we feel have the greatest potential to trouble new users. Our discussion includes accessible descriptions of different analytical approaches and issues, including filament loading procedures, collector cup configurations, peak shapes and interferences, and the use of isotopic double spikes and related error estimation. Building on previous work, we present quantitative simulations, applied specifically in this study to Fe and Ca, that explore the effects of (1) time-variable evaporation of isotopically homogeneous spots from a filament and (2) interferences on the isotope ratios derived from a double spike subtraction routine. We discuss how and to what extent interferences at spike masses, as well as at other measured masses, affect the double spike-subtracted isotope ratio of interest (44Ca/40Ca in the case presented, though a similar analysis can be used to evaluate 56Fe/54Fe and 53Cr/52Cr). The conclusions of these simulations are neither intuitive nor immediately obvious, making this examination useful for those who are developing new methodologies. While all simulations are carried out in the context of a

  8. Testing sequential extraction methods for the analysis of multiple stable isotope systems from a bone sample

    Science.gov (United States)

    Sahlstedt, Elina; Arppe, Laura

    2017-04-01

    Stable isotope compositions of bones, analysed either from the mineral phase (hydroxyapatite) or from the organic phase (mainly collagen), carry important climatological and ecological information and are therefore widely used in paleontological and archaeological research. For the analysis of stable isotope compositions, both phases, hydroxyapatite and collagen, have their more or less well-established separation and analytical techniques. Recent developments in IRMS and wet chemical extraction methods have facilitated the analysis of very small bone fractions (500 μg or less of starting material) for PO4(3-) O-isotope composition. However, the uniqueness and (pre-)historical value of each archaeological and paleontological finding leave precious little material available for stable isotope analyses, encouraging further development of microanalytical methods. Here we present the first results in developing extraction methods for combining collagen C- and N-isotope analyses with PO4(3-) O-isotope analyses from a single bone sample fraction. We tested sequential extraction starting with dilute acid demineralization and collection of both the collagen and PO4(3-) fractions, followed by a further purification step with H2O2 (PO4(3-) fraction). First results show that bone sample separates as small as 2 mg may be analysed for their δ15N, δ13C and δ18O(PO4) values. The method may be incorporated in detailed investigations of sequentially developing skeletal material such as teeth, potentially allowing the investigation of interannual variability in climatological/environmental signals or of the early life history of an individual.

  9. Maintaining high precision of isotope ratio analysis over extended periods of time.

    Science.gov (United States)

    Brand, Willi A

    2009-06-01

    Stable isotope ratios are reliable and long-lasting process tracers. In order to compare data from different locations or different sampling times at a high level of precision, a measurement strategy must include reliable traceability to an international stable isotope scale via a reference material (RM). Since these international RMs are available in low quantities only, we have developed our own analysis schemes involving laboratory working RMs. In addition, quality assurance RMs are used to control the long-term performance of the delta-value assignments. The analysis schemes allow the construction of quality assurance performance charts over years of operation. In this contribution, the performance of three typical techniques established in IsoLab at the MPI-BGC in Jena is discussed. The techniques are (1) isotope ratio mass spectrometry with an elemental analyser for delta(15)N and delta(13)C analysis of bulk (organic) material, (2) high-precision delta(13)C and delta(18)O analysis of CO(2) in clean-air samples, and (3) stable isotope analysis of water samples using a high-temperature reaction with carbon. In addition, reference strategies on a laser ablation system for high-spatial-resolution delta(13)C analysis in tree rings are exemplified briefly.

  10. Isotope analysis by emission spectroscopy; Analyse isotopique par spectroscopie d'emission

    Energy Technology Data Exchange (ETDEWEB)

    Artaud, J; Gerstenkorn, S [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires; Blaise, J [Centre National de la Recherche Scientifique (CNRS), Lab. Aime Cotton, 92 - Meudon-Bellevue (France)

    1959-07-01

    Quantitative analysis of isotope mixtures by emission spectroscopy rests on the phenomenon called 'isotope shift', that is, on the fact that the spectral lines produced by a mixture of isotopes of the same element are complex: each spectral line is in fact composed of several lines corresponding to the individual isotopes. The isotopic components lie close to one another, and their separation is achieved by means of a Fabry-Perot etalon: the instrument used to measure abundances is the photoelectric Fabry-Perot spectrometer designed in 1948 by Jacquinot and Dufour. This method has been used to determine abundances for helium, lithium, lead and uranium. In the case of lithium, the analysis line used depends on the composition of the isotopic mixture examined. For mixtures containing 7 to 93 per cent of one of the lithium isotopes, this line is the lithium blue line λ = 4603 angstrom. In other cases the red line λ = 6707 angstrom is preferable, though it readily allows only relative determinations. Helium presents no particular difficulty, and the analysis line selected was λ = 6678 angstrom. For lead, the line λ = 5201 angstrom makes it possible to determine the abundances of the four lead isotopes notwithstanding the hyperfine structure of 207Pb. For uranium, the line λ = 5027 angstrom is used, and this method allows the composition of isotope mixtures whose 235U content may be as low as 0.1 per cent to be determined. The relative precision is about 2 per cent for 235U contents above 1 per cent. For lower contents, the line λ = 5027 angstrom allows relative measurements using previously assayed mixtures. (author)

  11. Rapid-swept CW cavity ring-down laser spectroscopy for carbon isotope analysis

    International Nuclear Information System (INIS)

    Tomita, Hideki; Watanabe, Kenichi; Takiguchi, Yu; Kawarabayashi, Jun; Iguchi, Tetsuo

    2006-01-01

    With the aim of developing a portable system for in-field isotope analysis, we investigate an isotope analysis based on rapid-swept CW cavity ring-down laser spectroscopy, in which the concentration of a chemical species is derived from its photo absorbance. Such a system can identify the isotopomer and still be constructed as a quite compact system. We have made some basic experimental measurements of the overtone absorption lines of carbon dioxide (12C16O2, 13C16O2) by rapid-swept cavity ring-down spectroscopy with a CW infrared diode laser at 6200 cm^-1 (1.6 μm). The isotopic ratio has been obtained as (1.07 ± 0.13) × 10^-2, in good agreement with the natural abundance within experimental uncertainty. The detection sensitivity in absorbance has been estimated to be 3 × 10^-8 cm^-1. (author)
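
    The core relation behind such measurements: the sample's absorption coefficient follows from the change in ring-down time with and without the absorber, α = (1/c)(1/τ − 1/τ0). The sketch below evaluates it; the example τ values are invented but chosen to land near the sensitivity scale quoted above.

```python
# Cavity ring-down absorption coefficient from ring-down times.
from scipy.constants import c  # speed of light, m/s

def absorption_coefficient(tau_s, tau0_s):
    """alpha = (1/c) * (1/tau - 1/tau0), in m^-1."""
    return (1.0 / tau_s - 1.0 / tau0_s) / c

tau0 = 30e-6   # empty-cavity ring-down time (s), illustrative
tau = 29.2e-6  # ring-down time with absorbing CO2 sample (s)
alpha_m = absorption_coefficient(tau, tau0)
print(f"alpha ~ {alpha_m * 1e-2:.2e} cm^-1")  # 1 m^-1 = 0.01 cm^-1
```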

  12. Feasibility study of plutonium isotopic analysis of resin beads by nondestructive gamma-ray spectroscopy

    International Nuclear Information System (INIS)

    Li, T.K.

    1985-01-01

    We have initiated a feasibility study on the use of nondestructive low-energy gamma-ray spectroscopy for plutonium isotopic analysis on resin beads. Seven resin bead samples were measured, with each sample containing an average of 9 μg of plutonium; the isotopic compositions of the samples varied over a wide range. The gamma-ray spectroscopy results, obtained from 4-h counting-time measurements, were compared with mass spectrometry results. The average ratios of gamma-ray spectroscopy to mass spectrometry were 1.014 ± 0.025 for 238Pu/239Pu, 0.996 ± 0.018 for 240Pu/239Pu, and 0.980 ± 0.038 for 241Pu/239Pu. The rapid, automated, and accurate nondestructive isotopic analysis of resin beads may be very useful to process technicians and International Atomic Energy Agency inspectors. 3 refs., 1 fig., 3 tabs

  13. Analytical developments in thermal ionization mass spectrometry for the isotopic analysis of very small amounts

    International Nuclear Information System (INIS)

    Mialle, S.

    2011-01-01

    In the framework of the French transmutation project for nuclear wastes, experiments consisted of irradiating a few milligrams of isotopically enriched powders in a fast neutron reactor. Hence, the isotopic analysis of very small amounts of irradiation products is one of the main issues. The aim of this study was to achieve analytical developments in thermal ionization mass spectrometry in order to analyze these samples accurately. Several avenues were studied, including the new total evaporation method, deposition techniques, electron multiplier potentialities, and comparison between different isotope measurement techniques. Results showed that it was possible to drastically decrease the amounts needed for analysis, especially for Eu and Nd, while maintaining an uncertainty level in agreement with the project requirements. (author)

  14. Quantification of the carbonaceous matter origin in submicron marine aerosol particles by dual carbon isotope analysis

    Science.gov (United States)

    Ceburnis, D.; Garbaras, A.; Szidat, S.; Rinaldi, M.; Fahrni, S.; Perron, N.; Wacker, L.; Leinert, S.; Remeikis, V.; Facchini, M. C.; Prevot, A. S. H.; Jennings, S. G.; O'Dowd, C. D.

    2011-01-01

    Dual carbon isotope analysis has been performed for the first time, demonstrating its potential for apportioning organic matter between three principal sources (marine, terrestrial non-fossil and fossil fuel) on the basis of their unique isotopic signatures. The results presented here, utilising combinations of dual carbon isotope analysis, provide conclusive evidence of a dominant biogenic fraction in organic aerosol over biologically active oceans. In particular, the NE Atlantic, which is also subject to notable anthropogenic influences via pollution transport processes, was found to contain 80% organic aerosol matter of biogenic origin directly linked to plankton emissions. The remaining carbonaceous aerosol was of fossil-fuel origin. By contrast, for polluted air advecting out from Europe into the NE Atlantic, the source apportionment is 30% marine biogenic, 40% fossil fuel, and 30% continental non-fossil fuel. The dominant marine organic aerosol source in the atmosphere has significant implications for climate change feedback processes.
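
    Apportionments like the 30/40/30% split above come from a dual-tracer mass balance: mass closure, the 14C fraction modern, and δ13C give three equations in three unknown source fractions. The sketch below solves such a system; all end-member values and the measured values are generic assumptions, not numbers from the study.

```python
# Three-source carbon apportionment from a dual-isotope mass balance.
import numpy as np

# Columns: marine biogenic, terrestrial non-fossil, fossil fuel
A = np.array([
    [1.0, 1.0, 1.0],        # f_mar + f_terr + f_foss = 1
    [1.05, 1.05, 0.0],      # assumed 14C fraction modern of each source
    [-21.0, -27.0, -27.5],  # assumed d13C of each source (per mil)
])
# Measured aerosol values: closure, fraction modern, d13C
b = np.array([1.0, 0.70, -25.0])

fractions = np.linalg.solve(A, b)
print("marine, terrestrial, fossil:", np.round(fractions, 2))
```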

  15. Stable-isotope analysis: a neglected tool for placing parasites in food webs.

    Science.gov (United States)

    Sabadel, A J M; Stumbo, A D; MacLeod, C D

    2018-02-28

    Parasites are often overlooked in the construction of food webs, despite their ubiquitous presence in almost every type of ecosystem. Researchers who do recognize their importance often struggle to include parasites using classical food-web theory, mainly due to the parasites' multiple hosts and life stages. A novel approach using compound-specific stable-isotope analysis promises to provide considerable insight into the energetic exchanges of parasite and host, which may solve some of the issues inherent in incorporating parasites using a classical approach. Understanding the role of parasites within food webs, and tracing the associated biomass transfers, are crucial to constructing new models that will expand our knowledge of food webs. This mini-review focuses on stable-isotope studies published in the past decade, and introduces compound-specific stable-isotope analysis as a powerful, but underutilized, newly developed tool that may answer many unresolved questions regarding the role of parasites in food webs.

  16. GRO/EGRET data analysis software: An integrated system of custom and commercial software using standard interfaces

    Science.gov (United States)

    Laubenthal, N. A.; Bertsch, D.; Lal, N.; Etienne, A.; Mcdonald, L.; Mattox, J.; Sreekumar, P.; Nolan, P.; Fierro, J.

    1992-01-01

    The Energetic Gamma Ray Telescope Experiment (EGRET) on the Compton Gamma Ray Observatory has been in orbit for more than a year and is being used to map the full sky for gamma rays in a wide energy range from 30 to 20,000 MeV. Already these measurements have resulted in a wide range of exciting new information on quasars, pulsars, galactic sources, and diffuse gamma ray emission. The central part of the analysis is done with sky maps that typically cover an 80 x 80 degree section of the sky for an exposure time of several days. Specific software developed for this program generates the counts, exposure, and intensity maps. The analysis is done on a network of UNIX based workstations and takes full advantage of a custom-built user interface called X-dialog. The maps that are generated are stored in the FITS format for a collection of energies. These, along with similar diffuse emission background maps generated from a model calculation, serve as input to a maximum likelihood program that produces maps of likelihood with optional contours that are used to evaluate regions for sources. Likelihood also evaluates the background corrected intensity at each location for each energy interval from which spectra can be generated. Being in a standard FITS format permits all of the maps to be easily accessed by the full complement of tools available in several commercial astronomical analysis systems. In the EGRET case, IDL is used to produce graphics plots in two and three dimensions and to quickly implement any special evaluation that might be desired. Other custom-built software, such as the spectral and pulsar analyses, take advantage of the XView toolkit for display and Postscript output for the color hard copy. This poster paper outlines the data flow and provides examples of the user interfaces and output products. It stresses the advantages that are derived from the integration of the specific instrument-unique software and powerful commercial tools for graphics and

  17. Learning from examples - Generation and evaluation of decision trees for software resource analysis

    Science.gov (United States)

    Selby, Richard W.; Porter, Adam A.

    1988-01-01

    A general solution method for the automatic generation of decision (or classification) trees is investigated. The approach is to provide insights through in-depth empirical characterization and evaluation of decision trees for software resource data analysis. The trees identify classes of objects (software modules) that had high development effort. Sixteen software systems ranging from 3,000 to 112,000 source lines were selected for analysis from a NASA production environment. The collection and analysis of 74 attributes (or metrics) for over 4,700 objects captured information about development effort, faults, changes, design style, and implementation style. A total of 9,600 decision trees were automatically generated and evaluated. The trees correctly identified 79.3 percent of the software modules that had high development effort or faults, and the trees generated from the best parameter combinations correctly identified 88.4 percent of the modules on average.
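
    The same classification idea is easy to reproduce with modern tooling. A minimal sketch using scikit-learn on invented per-module metrics, not the paper's 1988 implementation or its data:

```python
# A modern sketch of the technique described above: learn a decision
# tree that flags software modules likely to need high effort, from
# per-module metrics. Metric names and data are illustrative only.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.integers(50, 5000, n),   # source lines
    rng.integers(1, 50, n),      # number of changes
    rng.integers(0, 20, n),      # design complexity score
])
# Toy label: big, frequently changed modules tend to be high-effort.
y = (X[:, 0] > 2000) & (X[:, 1] > 20)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy: %.2f" % tree.score(X_te, y_te))
```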

  18. Emergency diesel generator reliability analysis high flux isotope reactor

    International Nuclear Information System (INIS)

    Merryman, L.; Christie, B.

    1993-01-01

    A program to apply some of the techniques of reliability engineering to the High Flux Isotope Reactor (HFIR) was started on August 8, 1992. Part of the program was to track the conditional probabilities of the emergency diesel generators responding to a valid demand. This was done to determine whether the performance of the emergency diesel generators (which are more than 25 years old) has deteriorated. The conditional probabilities of the diesel generators were computed and trended for the period from May 1990 to December 1992. The calculations indicate that the performance of the emergency diesel generators has not deteriorated in recent years, i.e., the conditional probabilities of the emergency diesel generators have been fairly stable over the last few years. This information will be one factor that may be considered in the decision to replace the emergency diesel generators.
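
    A minimal sketch of the trending calculation implied above, with invented demand counts: the conditional probability for each period is simply successful responses divided by valid demands.

```python
# A minimal sketch of trending diesel-generator reliability: the
# conditional probability per period is successes over valid demands.
# All data below are invented for illustration.
demands_by_quarter = [12, 9, 14, 11, 10]     # valid demands
successes_by_quarter = [12, 9, 13, 11, 10]   # successful responses

for quarter, (d, s) in enumerate(zip(demands_by_quarter,
                                     successes_by_quarter), start=1):
    print(f"quarter {quarter}: {s}/{d} = {s / d:.3f}")
```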

  19. Nutritional assessment by isotope dilution analysis of body composition

    International Nuclear Information System (INIS)

    Szeluga, D.J.; Stuart, R.K.; Utermohlen, V.; Santos, G.W.

    1984-01-01

    The three components of body mass, body cell mass (BCM), extracellular fluid (ECF), and fat + extracellular solids (ECS: bone, tendon, etc.), can be quantified using established isotope dilution techniques. With these techniques, total body water (TBW) and ECF are measured using 3H2O and 82Br, respectively, as tracers. BCM is calculated from intracellular fluid (ICF), where ICF = TBW - ECF. Fat + ECS is estimated as: body weight - (BCM + ECF). TBW and ECF can be determined by either of two calculation methods, one requiring several timed plasma samples (extrapolation method) and one requiring a single plasma sample and a 4-h urine collection (urine-corrected method). The comparability of the two calculation methods was evaluated in 20 studies in 12 bone marrow transplant recipients. We found that for determination of TBW and ECF there was a very strong linear relationship (r2 greater than 0.98) between the calculation methods. Further comparisons (by t test, 2-sided) indicated that for the determination of ECF the methods were not significantly (p greater than 0.90) different; however, TBW determined by the urine-corrected method was slightly (0.1 to 6%), but significantly (p less than 0.01), greater than that determined by the extrapolation method. Therefore, relative to the extrapolation method, the urine-corrected method "over-estimates" BCM and "under-estimates" fat + ECS, since determination of these compartment sizes depends on measurement of TBW. We currently use serial isotope dilution studies to monitor the body composition changes of patients receiving therapeutic nutritional support.
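
    A minimal sketch of the compartment arithmetic above, with invented inputs. The ICF-to-BCM conversion factor k is an assumed placeholder, since the abstract states only that BCM is derived from ICF; 1 L of body water is taken as roughly 1 kg.

```python
# A minimal sketch of the isotope-dilution compartment arithmetic:
# ICF = TBW - ECF, BCM is derived from ICF, and fat + ECS is the
# remainder of body mass. Inputs are invented; k is an assumption.
def body_composition(weight_kg, tbw_l, ecf_l, k_bcm_per_icf=1.0):
    icf = tbw_l - ecf_l                        # ICF = TBW - ECF
    bcm = k_bcm_per_icf * icf                  # BCM calculated from ICF
    fat_plus_ecs = weight_kg - (bcm + ecf_l)   # remainder of body mass
    return {"ICF_l": icf, "BCM_kg": bcm, "fat_plus_ECS_kg": fat_plus_ecs}

print(body_composition(weight_kg=70.0, tbw_l=42.0, ecf_l=17.0))
```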

  20. Automatic sample changer control software for automation of neutron activation analysis process in Malaysian Nuclear Agency

    Science.gov (United States)

    Yussup, N.; Ibrahim, M. M.; Rahman, N. A. A.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.; Lombigit, L.; Azman, A.; Omar, S. A.

    2018-01-01

    Most of the procedures in the neutron activation analysis (NAA) process established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s have been performed manually. These manual procedures, carried out by the NAA laboratory personnel, are time-consuming and inefficient, especially for the sample counting and measurement process: the sample needs to be changed and the measurement software set up for every one-hour counting period, and both of these steps are performed manually for every sample. Hence, an automatic sample changer (ASC) system consisting of hardware and software was developed to automate the sample counting process for up to 30 samples consecutively. This paper describes the ASC control software for the NAA process, which is designed and developed to control the ASC hardware and to invoke the GammaVision software for sample measurement. The software is developed using the National Instruments LabVIEW development package.
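
    The control flow is essentially a loop over changer positions. A hypothetical Python sketch of that loop (the real system is written in LabVIEW and drives GammaVision; move_to_position and run_measurement below are invented stand-ins for the hardware and spectrometry calls):

```python
# A hypothetical sketch (not the LabVIEW implementation) of the
# automation loop described above: move each sample into place,
# then trigger a one-hour acquisition, for up to 30 samples.
import time

COUNT_TIME_S = 3600          # one-hour counting time per sample
NUM_POSITIONS = 30           # changer capacity

def move_to_position(pos):
    print(f"moving changer to position {pos}")    # hardware command here

def run_measurement(pos, seconds):
    print(f"acquiring spectrum for sample {pos} ({seconds} s)")
    time.sleep(0.01)          # placeholder for the real acquisition wait

for pos in range(1, NUM_POSITIONS + 1):
    move_to_position(pos)
    run_measurement(pos, COUNT_TIME_S)
print("batch complete")
```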

  1. Analysis and recommendations for a reliable programming of software based safety systems

    International Nuclear Information System (INIS)

    Nunez McLeod, J.; Nunez McLeod, J.E.; Rivera, S.S.

    1997-01-01

    The present paper summarizes the results of several studies performed for the development of high-reliability software on i486 microprocessors, towards its utilization in control and safety systems for nuclear power plants. The work is based on software programmed in the C language. Several recommendations oriented to high-reliability software are analyzed, relating requirements at the high-level-language level to their influence at the assembler level. Several metrics that allow quantification of the results achieved are implemented. New metrics were developed and others were adapted in order to obtain more efficient indexes for the software description. Such metrics help visualize how well the software under development conforms to the quality rules in use. A specific program developed to assist the reliability analyst in this quantification is also presented in the paper. It analyzes an executable program written in C, disassembling it and evaluating its internal structures. (author)
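
    A toy analogue of this kind of assembler-level analysis, not the authors' tool: disassemble a compiled binary with objdump and build a crude instruction-mnemonic histogram as a structural metric. Assumes a Unix system with binutils installed.

```python
# A toy analogue of analyzing a compiled C program at the assembler
# level: disassemble it with objdump and count instruction mnemonics
# as a crude structural metric. Assumes binutils is installed.
import subprocess
from collections import Counter

def mnemonic_histogram(binary_path):
    out = subprocess.run(["objdump", "-d", binary_path],
                         capture_output=True, text=True, check=True).stdout
    counts = Counter()
    for line in out.splitlines():
        parts = line.split("\t")                 # addr, hex bytes, disassembly
        if len(parts) >= 3 and parts[2].strip():
            counts[parts[2].split()[0]] += 1
    return counts

for mnemonic, n in mnemonic_histogram("/bin/true").most_common(5):
    print(f"{mnemonic:8s} {n}")
```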

  2. Experimental analysis of specification language diversity impact on NPP software diversity

    International Nuclear Information System (INIS)

    Yoo, Chang Sik

    1999-02-01

    In order to increase computer system reliability, software fault tolerance methods have been adopted in some safety-critical systems, including NPPs. Prevention of software common mode failure is a crucial problem in software fault tolerance, but no effective method for this problem has yet been found. In our research, to find an effective method for the prevention of software common mode failure, the impact of specification language diversity on NPP software diversity was examined experimentally. Three specification languages were used to compose three requirements specifications, and programmers produced twelve product codes from the specifications. From analysis of the product codes using fault diversity criteria, we concluded that the diverse specification language method would enhance program diversity through diversification of requirements specification imperfections.

  3. Monitoring of the aerobe biodegradation of chlorinated organic solvents by stable isotope analysis

    Science.gov (United States)

    Horváth, Anikó; Futó, István; Palcsu, László

    2014-05-01

    Our basic chemical-biological research aims to eliminate chlorinated environmental contaminants from aquifers around industrial areas, within the framework of a research program supported by the European Social Fund (TÁMOP-4.2.2.A-11/1/KONV-2012-0043). The gentlest and simplest approach is in situ biodegradation with the help of cultured, compound-specific strains. Numerous members of the genus Pseudomonas are well known for their bioremediation capabilities; they can metabolize environmentally hazardous chemicals such as gas oils, dyes, and organic solvents. Our research is based on the Pseudomonas putida F1 strain because of its ability to degrade halogenated hydrocarbons such as trichloroethylene. Several methods were investigated to estimate the rate of biodegradation, such as measurement of the pollutant concentration along the contamination pathway, microcosm studies, and compound-specific stable isotope analysis. In this area of the Transcarpathian basin we are pioneers in the stable isotope monitoring of biodegradation. The main goal is to determine stable isotope fractionation factors by stable isotope analysis, which can help us estimate the rate and effectiveness of the biodegradation. The subsequent research period includes investigation of the method and testing of its feasibility and adaptation in the environment. Last but not least, the research provides an opportunity to identify the producer of a contaminant based on its stable isotope composition.
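
    Fractionation factors of this kind are normally applied through the Rayleigh model, delta_t = delta_0 + epsilon * ln(f), where f is the remaining fraction of the contaminant. A minimal sketch with illustrative values; the enrichment factor below is not from this study.

```python
# A minimal sketch of how an enrichment factor is used to estimate
# biodegradation extent via the Rayleigh model:
#   delta_t = delta_0 + epsilon * ln(f),  f = remaining fraction.
# All values below are illustrative, not from this study.
import math

def remaining_fraction(delta_0, delta_t, epsilon_permil):
    """Invert the Rayleigh equation for the remaining fraction f."""
    return math.exp((delta_t - delta_0) / epsilon_permil)

delta_0 = -25.0      # source signature, per mil (assumed)
delta_t = -20.0      # measured downgradient signature (assumed)
epsilon = -10.0      # carbon enrichment factor, per mil (assumed)

f = remaining_fraction(delta_0, delta_t, epsilon)
print(f"remaining fraction: {f:.2f}, degraded: {1 - f:.0%}")
```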

  4. Cl and C isotope analysis to assess the effectiveness of chlorinated ethene degradation by zero-valent iron: Evidence from dual element and product isotope values

    International Nuclear Information System (INIS)

    Audí-Miró, Carme; Cretnik, Stefan; Otero, Neus; Palau, Jordi; Shouakar-Stash, Orfan; Soler, Albert

    2013-01-01

    Highlights: ► TCE and cis-DCE Cl isotope fractionation was investigated for the first time with ZVI. ► A C-Cl bond is broken in the rate-limiting step during chlorinated ethene dechlorination by ZVI. ► A dual C/Cl isotope plot is a promising tool to discriminate abiotic degradation. ► Product-related carbon isotopic fractionation gives evidence of abiotic degradation. ► Hydrogenolysis and β-dichloroelimination pathways occur simultaneously. - Abstract: This study investigated C and, for the first time, Cl isotope fractionation of trichloroethene (TCE) and cis-dichloroethene (cis-DCE) during reductive dechlorination by cast zero-valent iron (ZVI). Hydrogenolysis and β-dichloroelimination pathways occurred as parallel reactions, with ethene and ethane deriving from the β-dichloroelimination pathway. Carbon isotope fractionation of TCE and cis-DCE was consistent across the different batches of Fe studied. Transformation of TCE and cis-DCE showed Cl isotopic enrichment factors (εCl) of −2.6‰ ± 0.1‰ (TCE) and −6.2‰ ± 0.8‰ (cis-DCE), with apparent kinetic isotope effects for Cl (AKIE-Cl) of 1.008 ± 0.001 (TCE) and 1.013 ± 0.002 (cis-DCE). This indicates that a C-Cl bond breakage is rate-determining in TCE and cis-DCE transformation by ZVI. Two approaches were investigated to evaluate whether isotope fractionation analysis can distinguish the effectiveness of transformation by ZVI as opposed to natural biodegradation. (i) Dual isotope plots. This study reports the first dual (C, Cl) element isotope plots for TCE and cis-DCE degradation by ZVI. The pattern for cis-DCE differs markedly from that reported for biodegradation of the same compound by KB-1, a commercially available Dehalococcoides-containing culture. The different trends suggest an expedient approach to distinguish abiotic and biotic transformation, but this needs to be confirmed in future studies. (ii) Product-related isotope fractionation. Carbon isotope ratios of the hydrogenolysis product cis-DCE
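
    The reported AKIE values follow from the enrichment factors by one common form of the conversion, assuming all n chlorine positions of the molecule are potentially reactive: AKIE = 1 / (1 + n * epsilon / 1000). A worked check with the paper's numbers:

```python
# A minimal sketch of the standard conversion from a bulk enrichment
# factor to an apparent kinetic isotope effect, assuming all n halogen
# positions are potentially reactive:
#   AKIE = 1 / (1 + n * epsilon_bulk / 1000)
# Using the epsilon values above reproduces the reported AKIE values.
def akie(epsilon_bulk_permil, n_positions):
    return 1.0 / (1.0 + n_positions * epsilon_bulk_permil / 1000.0)

print(f"TCE:     AKIE-Cl = {akie(-2.6, 3):.3f}")   # 3 Cl atoms -> ~1.008
print(f"cis-DCE: AKIE-Cl = {akie(-6.2, 2):.3f}")   # 2 Cl atoms -> ~1.013
```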

  5. BASTILLE - Better Analysis Software to Treat ILL Experiments - a unified, unifying approach to data reduction and analysis

    International Nuclear Information System (INIS)

    Johnson, M.

    2011-01-01

    Data reduction and analysis is a key component in the production of scientific results. If this component, like any other in the chain, is weak, the final output is compromised. The current situation for data reduction and analysis may be regarded as adequate, but it is variable, depending on the instrument, and should be improved. In particular the delivery of new and upgraded instruments in Millennium Phase I and those proposed for Phase II will bring new demands and challenges for software development. Failure to meet these challenges will hamper the exploitation of higher data rates and the delivery of new science. The proposed project is to provide a single, underpinning software infrastructure for data analysis, which would ensure: 1) a clear vision of software provision at ILL; 2) a clear role for the 'Computing for Science' Group (CS) in maintaining and developing the infrastructure and the codes; 3) a well-defined framework for recruiting and training CS staff; 4) ease and efficiency of development within a common, well-defined software environment; 5) safeguarding of key, existing software; and 6) ease of communication with other software like instrument control software to allow real-time data analysis and experiment control, or software from other institutes or sources

  6. Analysis of transuranic isotopes in irradiated U3Si2-Al fuel by alpha spectrometry

    International Nuclear Information System (INIS)

    Dian Anggraini; Aslina B Ginting; Arif Nugroho

    2011-01-01

    Separation and analysis of transuranic isotopes (uranium and plutonium) in irradiated U3Si2-Al plate have been performed. The analysis included sample preparation (cutting, dissolving, filtering, dilution), separation of fission products from heavy elements, and determination of the transuranic isotope content with an alpha spectrometer. The separation of the transuranic isotopes (U, Pu) was done by two methods: a direct method and an ion-exchange method with zeolite. Measurement of a standard transuranic isotope source (AMR 43) and standard U3O8 was done in advance in order to determine the 235U recovery percentage and the detector efficiency. The recovery of the 235U isotope was 92.58%, which fulfills the validation requirement, and the detector efficiency was 0.314. Based on the measured recovery and detector efficiency, the direct method was carried out by electrodeposition of 250 µL of irradiated U3Si2-Al solution; the deposited sample was subsequently analyzed with the alpha spectrometer. The ion-exchange separation was done by mixing and shaking 300 µL of irradiated U3Si2-Al solution with 0.5 g of zeolite to separate the liquid phase from the solid phase; the liquid phase was then electrodeposited and analyzed with the alpha spectrometer. The two methods gave different results: the heavy-element (238U, 236U, 234U, 239Pu) content obtained by the direct method was 0.0525 g/g with 235U = 0.0076 g/g, while separation using the zeolite ion exchanger gave a heavy-element content of 0.0253 g/g and 235U = 0.0092 g/g. (author)
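
    A minimal sketch of the counting arithmetic implied by these figures: net alpha counts are converted to activity by correcting for live time, detector efficiency and chemical recovery. The count data below are invented; the efficiency and recovery are the values quoted above.

```python
# A minimal sketch of alpha-spectrometry counting arithmetic:
# activity = net counts / (live time * efficiency * recovery).
# Counts and live time are invented; efficiency and recovery are
# the values quoted in the abstract.
def activity_bq(net_counts, live_time_s, efficiency, recovery):
    return net_counts / (live_time_s * efficiency * recovery)

counts = 12500.0     # invented net counts in the 235U peak region
t_live = 60000.0     # invented live time, s
a = activity_bq(counts, t_live, efficiency=0.314, recovery=0.9258)
print(f"activity: {a:.3e} Bq")
```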

  7. Isotope analysis of micro metal particles by adopting laser-ablation mass spectrometry

    International Nuclear Information System (INIS)

    Song, Kyu Seok; Ha, Young Kyung; Han, Sun Ho; Park, Yong Joon; Kim, Won Ho

    2005-01-01

    The isotope analysis of microparticles in environmental samples as well as laboratory samples is an important task. Particular care is needed in the particle analysis of swipe samples. Microparticles are normally analyzed either by dissolving them in solvents and applying conventional analytical methods, or by direct analysis methods such as laser-ablation ICP mass spectrometry (LA-ICP-MS), SIMS, or sputtered neutral mass spectrometry (SNMS). However, LA-ICP-MS consumes a large amount of sample, because the laser beam is normally focused tightly on the target particle for complete ablation. SIMS and SNMS utilize ion beams to generate sample ions from the particle, but in SIMS the number of ions generated by the ion beam is less than 5% of the total sputtered particles. SNMS is also an excellent analytical technique for particle analysis; however, it requires both an ion beam and a frequency-tunable laser system. Recently, direct analysis of elements as well as isotopes using laser ablation has been recognized as one of the most efficient detection technologies for particle samples. Laser-ablation mass spectrometry requires only a single laser source, without frequency tunability, and no sample pretreatment; it is therefore one of the simplest analysis techniques for solid samples as well as particles. In this study, as part of the development of new isotope analysis techniques for particle samples, direct laser ablation was coupled with mass spectrometry. Zinc and gadolinium were chosen as target samples, since these elements have isotopes of minor abundance (0.62% for Zn and 0.2% for Gd). The preliminary results indicate that the isotopes of these two elements can be analyzed to within 10% of natural abundance, with good mass resolution, using direct laser-ablation mass spectrometry.

  8. EZ-Rhizo software: the gateway to root architecture analysis.

    Science.gov (United States)

    Armengaud, Patrick

    2009-02-01

    Plants are sessile organisms that have to cope with the available nutritional resources and environmental constraints in the place where they germinate. To fully exploit their nearby resources, they have evolved a highly plastic and responsive root system. Adaptations to limited nutrients include a wide range of specific root responses, e.g., the emergence of new root types, root branching, or specific growth of lateral roots. These root system architecture (RSA) features are of utmost importance when investigating the underlying mechanisms by forward, reverse or quantitative genetic approaches. The EZ-Rhizo software was developed to facilitate such root measurements in a fast, simple and accurate way. The performance of EZ-Rhizo in providing about 20 primary and derived RSA parameters was illustrated by examining natural variability across 23 Arabidopsis accessions. The different RSA profiles obtained from plants grown under favorable conditions illustrate the wide reservoir of natural genetic resources underlying specific features of root growth. This diversity was used here to correlate RSA genetic variability with growth, developmental and environmental properties of the accessions' origins.

  9. The R software fundamentals of programming and statistical analysis

    CERN Document Server

    Lafaye de Micheaux, Pierre; Liquet, Benoit

    2013-01-01

    The contents of The R Software are presented so as to be both comprehensive and easy for the reader to use. Besides its application as a self-learning text, this book can support lectures on R at any level from beginner to advanced. It can serve as a textbook on R for beginners as well as more advanced users, working on Windows, macOS or Linux. The first part of the book deals with the heart of the R language and its fundamental concepts, including data organization, import and export, various manipulations, documentation, plots, programming and maintenance. The last chapter in this part deals with object-oriented programming as well as interfacing R with C/C++ or Fortran, and contains a section on debugging techniques. This is followed by the second part of the book, which provides detailed explanations on how to perform many standard statistical analyses, mainly in the field of biostatistics. Topics from mathematical and statistical settings that are included are matrix operations, integration, o...

  10. Analysis on flexible manufacturing system layout using arena simulation software

    Science.gov (United States)

    Fadzly, M. K.; Saad, Mohd Sazli; Shayfull, Z.

    2017-09-01

    A flexible manufacturing system (FMS) is defined as a highly automated group-technology machine cell, consisting of a group of processing stations interconnected by an automated material handling and storage system and controlled by an integrated computer system. An FMS can produce parts or products in the mid-volume, mid-variety production range. The layout is an important criterion in designing an FMS to produce a part or product. Facility layout of an FMS involves the positioning of cells within given boundaries so as to minimize the total projected travel time between cells. Defining the layout includes specifying the spatial coordinates of each cell, its orientation in either a horizontal or vertical position, and the location of its load or unload point. There are many types of FMS layout, such as in-line, loop, ladder and robot-centered cell layouts. The research concentrates on the design and optimization of the FMS layout. It is concluded that the design and optimization objective was achieved: based on effective time and cost in the ARENA simulation, the in-line layout is the best layout for this study.
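
    A minimal sketch of the stated layout objective, computing the total projected travel time as the flow-weighted sum of distances between cell load/unload points. Cell coordinates, flows and transporter speed below are invented; Arena itself is commercial discrete-event software, so this is only an illustration of the objective function, not of the simulation.

```python
# A minimal sketch of the layout objective described above: total
# projected travel time = sum over cell pairs of trips * distance / speed.
import math

cells = {                       # cell -> (x, y) of its load/unload point
    "load":  (0.0, 0.0),
    "mill":  (5.0, 0.0),
    "lathe": (10.0, 0.0),
}
flows = {                       # (from, to) -> trips per shift (invented)
    ("load", "mill"): 30,
    ("mill", "lathe"): 25,
    ("lathe", "load"): 25,
}
SPEED = 0.5                     # transporter speed, m/s (assumed)

def total_travel_time(cells, flows, speed):
    t = 0.0
    for (a, b), trips in flows.items():
        (xa, ya), (xb, yb) = cells[a], cells[b]
        t += trips * math.hypot(xb - xa, yb - ya) / speed
    return t

print(f"total projected travel time: {total_travel_time(cells, flows, SPEED):.0f} s")
```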

  11. Comparing Results from Constant Comparative and Computer Software Methods: A Reflection about Qualitative Data Analysis

    Science.gov (United States)

    Putten, Jim Vander; Nolen, Amanda L.

    2010-01-01

    This study compared qualitative research results obtained by manual constant comparative analysis with results obtained by computer software analysis of the same data. An investigation of issues of trustworthiness and accuracy ensued. Results indicated that the inductive constant comparative data analysis generated 51 codes and two coding levels…

  12. Standard gamma-ray spectra for the comparison of spectral analysis software

    International Nuclear Information System (INIS)

    Woods, S.; Hemingway, J.; Bowles, N.

    1997-01-01

    Three sets of standard γ-ray spectra have been produced for use in assessing the performance of spectral analysis software. The origin of and rationale behind the spectra are described. Nine representative analysis systems have been tested both in terms of component performance and in terms of overall performance, and the problems encountered in the analysis are discussed. (author)

  13. Standard gamma-ray spectra for the comparison of spectral analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Woods, S.; Hemingway, J.; Bowles, N. [and others]

    1997-08-01

    Three sets of standard γ-ray spectra have been produced for use in assessing the performance of spectral analysis software. The origin of and rationale behind the spectra are described. Nine representative analysis systems have been tested both in terms of component performance and in terms of overall performance, and the problems encountered in the analysis are discussed. (author)

  14. Separation of polybrominated diphenyl ethers in fish for compound-specific stable carbon isotope analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zeng, Yan-Hong [State Key Laboratory of Organic Geochemistry, Guangzhou Institute of Geochemistry, Chinese Academy of Sciences, Guangzhou 510640 (China); Graduate University of Chinese Academy of Sciences, Beijing, 100049 (China); Luo, Xiao-Jun, E-mail: luoxiaoj@gig.ac.cn [State Key Laboratory of Organic Geochemistry, Guangzhou Institute of Geochemistry, Chinese Academy of Sciences, Guangzhou 510640 (China); Chen, Hua-Shan; Wu, Jiang-Ping; Chen, She-Jun; Mai, Bi-Xian [State Key Laboratory of Organic Geochemistry, Guangzhou Institute of Geochemistry, Chinese Academy of Sciences, Guangzhou 510640 (China)

    2012-05-15

    A separation and isotopic analysis method was developed to accurately measure the stable carbon isotope ratios of polybrominated diphenyl ethers (PBDEs) with three to six substituted bromine atoms in fish samples. Sample extracts were treated with concentrated sulfuric acid to remove lipids, purified using complex silica gel column chromatography, and finally processed using alumina/silica (Al/Si) gel column chromatography. The purities of the extracts were verified by gas chromatography-mass spectrometry (GC-MS) in full-scan mode. The average recoveries of all compounds across the purification method were between 60% and 110%, with the exception of BDE-154. The stable carbon isotopic compositions of PBDEs can be measured with a standard deviation of less than 0.5‰. No significant isotopic fractionation was found during the purification of the main PBDE congeners. A significant change in the stable carbon isotope ratio of BDE-47 was observed in fish carcasses compared to the original isotopic signatures, implying that PBDE stable carbon isotopic compositions can be used to trace the biotransformation of PBDEs in biota. - Highlights: ► A method for the purification of PBDEs for CSIA was developed. ► The δ13C of PBDE congeners can be measured with a standard deviation of less than 0.5‰. ► Common carp were exposed to a PBDE mixture to investigate debromination. ► Ratios of the δ13C values can be used to trace the debromination of PBDEs in fish.

  15. Development of a gamma ray spectrometry software for neutron activation analysis using the open source concept

    International Nuclear Information System (INIS)

    Lucia, Silvio Rogerio de; Maihara, Vera Akiko; Menezes, Mario O. de

    2009-01-01

    In this work, a new software package - SAANI (Instrumental Neutron Activation Analysis Software) - was developed and used for gamma-ray spectrum analysis in the Neutron Activation Laboratory (LAN) of the Nuclear and Energetic Research Institute (IPEN-CNEN/SP). The software was developed to completely replace the old one, VISPECT. Besides the visual improvement in the user interface, the new software allows the standardization of several procedures which are currently done in different ways by each researcher, avoiding intermediate steps in the calculations. By using a modern programming language, Python, together with the graphical library Qt (by Trolltech), both multi-platform, the new software is able to run on Windows, Linux and other platforms. In addition, the new software has been designed to be extensible through plug-ins. In order to achieve the proposed initial scope, that is, to completely replace the old software, SAANI has undergone several different kinds of tests, using spectra from certified reference materials, standards, and common spectra already analyzed by other software or used in international inter-comparisons. The results obtained by SAANI in all tests were considered very good. Some small discrepancies were found and, after careful investigation, their source was identified as an accuracy bug in the old software. Usability and robustness tests were conducted by installing SAANI on several laboratory computers and following them during daily use. The results of these tests also indicated that SAANI was ready to be used by all researchers at LAN-IPEN. (author)
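
    A hypothetical sketch, not SAANI's actual code, of the plug-in pattern described above: a Python application that discovers and loads analysis plug-ins from a directory at start-up. The directory name and the register() entry point are invented conventions.

```python
# A hypothetical sketch of a plug-in loader: each plug-in is a Python
# file in a directory that exposes a register() function. The layout
# and entry-point name are invented, not SAANI's.
import importlib.util
import pathlib

def load_plugins(plugin_dir="plugins"):
    plugins = []
    for path in pathlib.Path(plugin_dir).glob("*.py"):
        spec = importlib.util.spec_from_file_location(path.stem, path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)          # run the plug-in module
        if hasattr(module, "register"):          # expose register() to join
            plugins.append(module.register())
    return plugins

for plugin in load_plugins():
    print("loaded plug-in:", plugin)
```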

  16. Progress in the analysis and interpretation of N2O isotopes: Potential and future challenges

    Science.gov (United States)

    Mohn, Joachim; Tuzson, Béla; Zellweger, Christoph; Harris, Eliza; Ibraim, Erkan; Yu, Longfei; Emmenegger, Lukas

    2017-04-01

    In recent years, research on nitrous oxide (N2O) stable isotopes has advanced significantly, addressing an increasing number of research questions in the biogeochemical and atmospheric sciences [1]. An important milestone was the development of quantum cascade laser based spectroscopic devices [2], which are inherently specific for structural isomers (15N14N16O vs. 14N15N16O) and capable of collecting real-time data with high temporal resolution, complementary to the well-established isotope-ratio mass spectrometry (IRMS) method. In combination with automated preconcentration, optical isotope ratio spectroscopy (OIRS) has been applied to disentangle source processes in suburban, rural and pristine environments [e.g. 3, 4]. Within the European Metrology Research Programme (EMRP) ENV52 project "Metrology for high-impact greenhouse gases (HIGHGAS)", the quality of N2O stable isotope analysis by OIRS, the comparability between laboratories, and the traceability to the international isotope ratio scales have been addressed. An inter-laboratory comparison between eleven IRMS and OIRS laboratories, organised within HIGHGAS, indicated limited comparability for 15N site preference, i.e. the difference between the 15N abundance in the central (N*NO) and end (*NNO) positions [5]. In addition, the accuracy of the NH4NO3 decomposition reaction, which provides the link between 15N site preference and the international 15N/14N scale, was found to be limited by non-quantitative NH4NO3 decomposition in combination with substantially different isotope enrichment factors for the two nitrogen atoms [6]. The results of the HIGHGAS project indicate that the following research tasks have to be completed to foster research on N2O isotopes: 1) develop improved techniques to link the 15N and 18O abundance and the 15N site preference in N2O to the international stable isotope ratio scales; 2) provide N2O reference materials, pure and diluted in an air matrix, to improve inter-laboratory compatibility. These tasks
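
    A minimal sketch of the quantities discussed above: site preference is the difference between the δ15N of the central (alpha, N*NO) and terminal (beta, *NNO) nitrogen atoms of N2O, and the bulk δ15N is their average. The values used below are illustrative.

```python
# A minimal sketch of N2O isotopomer bookkeeping: site preference (SP)
# is delta15N(alpha, central N) minus delta15N(beta, terminal N), and
# bulk delta15N is their average. Values are illustrative only.
def site_preference(delta15N_alpha, delta15N_beta):
    return delta15N_alpha - delta15N_beta

def delta15N_bulk(delta15N_alpha, delta15N_beta):
    return 0.5 * (delta15N_alpha + delta15N_beta)

alpha, beta = 15.0, -14.0     # per mil, invented values
print(f"SP = {site_preference(alpha, beta):.1f} per mil")
print(f"bulk d15N = {delta15N_bulk(alpha, beta):.1f} per mil")
```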

  17. Enhanced understanding of ectoparasite: host trophic linkages on coral reefs through stable isotope analysis

    Science.gov (United States)

    Demopoulos, Amanda W. J.; Sikkel, Paul C.

    2015-01-01

    Parasitism, although the most common type of ecological interaction, is usually ignored in food web models and studies of trophic connectivity. Stable isotope analysis is widely used in assessing the flow of energy in ecological communities and thus is a potentially valuable tool in understanding the cryptic trophic relationships mediated by parasites. In an effort to assess the utility of stable isotope analysis in understanding the role of parasites in complex coral-reef trophic systems, we performed stable isotope analysis on three common Caribbean reef fish hosts and two kinds of ectoparasitic isopods: temporarily parasitic gnathiids (Gnathia marleyi) and permanently parasitic cymothoids (Anilocra). To further track the transfer of fish-derived carbon (energy) from parasites to parasite consumers, gnathiids from host fish were also fed to captive Pederson shrimp (Ancylomenes pedersoni) for at least 1 month. Parasitic isopods had δ13C and δ15N values similar to their host, comparable with results from the small number of other host-parasite studies that have employed stable isotopes. Adult gnathiids were enriched in 15N and depleted in 13C relative to juvenile gnathiids, providing insights into the potential isotopic fractionation associated with blood-meal assimilation and subsequent metamorphosis. Gnathiid-fed Pederson shrimp also had δ13C values consistent with their food source and were enriched in 15N as predicted due to trophic fractionation. These results further indicate that stable isotopes can be an effective tool in deciphering cryptic feeding relationships involving parasites and their consumers, and the role of parasites and cleaners in carbon transfer in coral-reef ecosystems specifically.
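
    A minimal sketch of how such 15N enrichment is commonly interpreted: trophic position is estimated from a consumer's enrichment over a baseline, assuming a per-step fractionation of about 3.4‰. That fractionation value is a common literature assumption, not one stated by this study, and all numbers below are invented.

```python
# A minimal sketch of trophic-position estimation from 15N enrichment:
#   TP = base_level + (d15N_consumer - d15N_base) / per_step
# The 3.4 per mil per-step fractionation is an assumed literature value.
def trophic_position(delta15N_consumer, delta15N_base,
                     base_level=1.0, per_step=3.4):
    return base_level + (delta15N_consumer - delta15N_base) / per_step

# Invented values: gnathiid-fed shrimp relative to its gnathiid food.
print(f"TP = {trophic_position(11.8, 8.4, base_level=2.0):.2f}")
```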

  18. Enhanced understanding of ectoparasite–host trophic linkages on coral reefs through stable isotope analysis

    Directory of Open Access Journals (Sweden)

    Amanda W.J. Demopoulos

    2015-04-01

    Full Text Available Parasitism, although the most common type of ecological interaction, is usually ignored in food web models and studies of trophic connectivity. Stable isotope analysis is widely used in assessing the flow of energy in ecological communities and thus is a potentially valuable tool in understanding the cryptic trophic relationships mediated by parasites. In an effort to assess the utility of stable isotope analysis in understanding the role of parasites in complex coral-reef trophic systems, we performed stable isotope analysis on three common Caribbean reef fish hosts and two kinds of ectoparasitic isopods: temporarily parasitic gnathiids (Gnathia marleyi) and permanently parasitic cymothoids (Anilocra). To further track the transfer of fish-derived carbon (energy) from parasites to parasite consumers, gnathiids from host fish were also fed to captive Pederson shrimp (Ancylomenes pedersoni) for at least 1 month. Parasitic isopods had δ13C and δ15N values similar to their host, comparable with results from the small number of other host-parasite studies that have employed stable isotopes. Adult gnathiids were enriched in 15N and depleted in 13C relative to juvenile gnathiids, providing insights into the potential isotopic fractionation associated with blood-meal assimilation and subsequent metamorphosis. Gnathiid-fed Pederson shrimp also had δ13C values consistent with their food source and were enriched in 15N as predicted due to trophic fractionation. These results further indicate that stable isotopes can be an effective tool in deciphering cryptic feeding relationships involving parasites and their consumers, and the role of parasites and cleaners in carbon transfer in coral-reef ecosystems specifically.

  19. Key attributes of the SAPHIRE risk and reliability analysis software for risk-informed probabilistic applications

    International Nuclear Information System (INIS)

    Smith, Curtis; Knudsen, James; Kvarfordt, Kellie; Wood, Ted

    2008-01-01

    The Idaho National Laboratory is a primary developer of probabilistic risk and reliability analysis (PRRA) tools, with work dating back over 35 years. Evolving from mainframe-based software, the current state of the practice has led to the creation of the SAPHIRE software. Currently, agencies such as the Nuclear Regulatory Commission, the National Aeronautics and Space Administration, the Department of Energy, and the Department of Defense use version 7 of the SAPHIRE software for many of their risk-informed activities. In order to better understand and appreciate the power of software as part of risk-informed applications, we need to recall that our current analysis and solution methods build upon pioneering work done 30-40 years ago. We contrast this work with the current capabilities of the SAPHIRE analysis package. As part of this discussion, we provide information on both the typical features and the special analysis capabilities that are available. We also present the applications and results typically found with state-of-the-practice PRRA models. By providing both a high-level and a detailed look at the SAPHIRE software, we give a snapshot in time of the current use of software tools in a risk-informed decision arena.
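
    A minimal sketch of a core PRRA calculation of the kind SAPHIRE automates: quantifying a fault-tree top event from minimal cut sets using the rare-event approximation. The cut sets and basic-event probabilities below are invented.

```python
# A minimal sketch of fault-tree quantification: the top-event
# probability is approximated by the sum of minimal-cut-set
# probabilities (rare-event approximation). All data are invented.
basic_events = {"pump_fails": 3e-3, "valve_fails": 1e-3,
                "power_lost": 5e-4, "operator_error": 1e-2}

minimal_cut_sets = [
    {"pump_fails", "valve_fails"},
    {"power_lost"},
    {"pump_fails", "operator_error"},
]

def cut_set_probability(cut_set, probs):
    p = 1.0
    for event in cut_set:       # events in a cut set assumed independent
        p *= probs[event]
    return p

p_top = sum(cut_set_probability(cs, basic_events) for cs in minimal_cut_sets)
print(f"top event probability ~ {p_top:.2e}")
```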

  20. HeteroGenius: A Framework for Hybrid Analysis of Heterogeneous Software Specifications

    Directory of Open Access Journals (Sweden)

    Manuel Giménez

    2014-01-01

    Full Text Available Nowadays, software artifacts are ubiquitous in our lives, being an essential part of home appliances, cars and cell phones, and even of more critical activities like aeronautics and the health sciences. In this context, software failures may produce enormous losses, either economic or, in the worst case, in human lives. Software analysis is an area of software engineering concerned with the application of diverse techniques to prove the absence of errors in software. In many cases, different analysis techniques are applied in specific methodological combinations that ensure better results. These interactions between tools are usually carried out at the user level and are not supported by the tools themselves. In this work we present HeteroGenius, a framework conceived for developing tools that allow users to perform hybrid analysis of heterogeneous software specifications. HeteroGenius was designed to prioritise the possibility of adding new specification languages and analysis tools, enabling a synergistic relation between the techniques under a graphical interface that satisfies several well-known usability criteria. As a case study we implemented the functionality of Dynamite on top of HeteroGenius.