WorldWideScience

Sample records for analysis software suite

  1. The ESA's Space Trajectory Analysis software suite

    Science.gov (United States)

    Ortega, Guillermo

    In 2005 the European Space Agency (ESA) initiated an internal activity to develop an open-source software suite, involving university science departments and research institutions all over the world. This project is called "Space Trajectory Analysis", or STA. This article describes the birth of STA and its present configuration. One of STA's aims is to promote the exchange of technical ideas and to raise knowledge and competence in the areas of applied mathematics, space engineering, and informatics at the university level. Conceived as a research and education tool to support the analysis phase of a space mission, STA is able to visualize a wide range of space trajectories, including ascent, re-entry, descent and landing trajectories, orbits around planets and moons, interplanetary trajectories, and rendezvous trajectories. The article explains that the STA project is an original idea of the Technical Directorate of ESA, born in August 2005 to provide a framework for astrodynamics research at the university level. As research and education software applicable to academia, the development is supported by a number of universities that have joined ESA in leading it. The partnership between ESA and the universities is expressed in the STA Steering Board: together with ESA, each university holds a chair on the board, whose tasks are to develop, control, promote, maintain, and expand the software suite. The article describes that STA provides calculations in the fields of spacecraft tracking, attitude analysis, coverage and visibility analysis, orbit determination, and the position and velocity of solar system bodies. STA implements the concept of a "space scenario" composed of solar system bodies, spacecraft, ground stations, pads, etc. It includes orbital propagators and is able to propagate the orbit of a spacecraft. STA is able to compute communication links between objects of a scenario (coverage, line of sight), and to represent the trajectory computations and...
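
    As a rough illustration of what an orbital propagator inside a tool like STA does, the sketch below integrates simple two-body (Keplerian) motion; the orbit, constants and tolerances are invented for the example, and real propagators add perturbations, event handling and reference-frame transformations.

```python
# Minimal two-body (Keplerian) propagation sketch -- illustrative only,
# not STA's propagator. Assumes Earth-centered motion and SI units.
import numpy as np
from scipy.integrate import solve_ivp

MU_EARTH = 3.986004418e14  # gravitational parameter, m^3/s^2

def two_body(t, state):
    r = state[:3]
    a = -MU_EARTH * r / np.linalg.norm(r) ** 3  # point-mass gravity
    return np.concatenate([state[3:], a])

# Circular low Earth orbit: r = 7000 km, v = sqrt(mu/r)
r0 = np.array([7.0e6, 0.0, 0.0])
v0 = np.array([0.0, np.sqrt(MU_EARTH / 7.0e6), 0.0])
sol = solve_ivp(two_body, (0.0, 5400.0), np.concatenate([r0, v0]),
                rtol=1e-9, atol=1e-6)
print("position after 90 min:", sol.y[:3, -1])
```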

  2. Freud: a software suite for high-throughput simulation analysis

    Science.gov (United States)

    Harper, Eric; Spellings, Matthew; Anderson, Joshua; Glotzer, Sharon

    Computer simulation is an indispensable tool for the study of a wide variety of systems. As simulations scale to fill petascale and exascale supercomputing clusters, so too does the size of the data produced, as well as the difficulty in analyzing these data. We present Freud, a software suite for efficient analysis of simulation data. Freud makes no assumptions about the system being analyzed, allowing general analysis methods to be applied to nearly any type of simulation. Freud includes standard analysis methods such as the radial distribution function, as well as new methods including the potential of mean force and torque and local crystal environment analysis. Freud combines a Python interface with fast, parallel C++ analysis routines to run efficiently on laptops, workstations, and supercomputing clusters. Data analysis on clusters reduces data transfer requirements, a prohibitive cost for petascale computing. Used in conjunction with simulation software, Freud allows for smart simulations that adapt to the current state of the system, enabling the study of phenomena such as nucleation and growth, intelligent investigation of phases and phase transitions, and determination of effective pair potentials.
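
    To make the first of those standard methods concrete, the sketch below computes a radial distribution function in plain NumPy under minimum-image periodic boundary conditions; it is a conceptual illustration, not Freud's API, whose parallel C++ routines do this far more efficiently.

```python
# Conceptual radial distribution function g(r) for a cubic periodic box,
# written in plain NumPy for clarity.
import numpy as np

def rdf(points, box_length, r_max, bins=100):
    n = len(points)
    # Minimum-image pairwise separations
    diff = points[:, None, :] - points[None, :, :]
    diff -= box_length * np.round(diff / box_length)
    dist = np.linalg.norm(diff, axis=-1)
    dist = dist[np.triu_indices(n, k=1)]          # unique pairs only
    counts, edges = np.histogram(dist, bins=bins, range=(0.0, r_max))
    r = 0.5 * (edges[1:] + edges[:-1])
    shell_vol = 4.0 * np.pi * r**2 * np.diff(edges)
    density = n / box_length**3
    # Normalize pair counts by the ideal-gas expectation
    g = counts / (shell_vol * density * n / 2.0)
    return r, g

pts = np.random.uniform(0.0, 10.0, size=(500, 3))
r, g = rdf(pts, box_length=10.0, r_max=4.0)   # g(r) ~ 1 for an ideal gas
```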

  3. eXtended CASA Line Analysis Software Suite (XCLASS)

    Science.gov (United States)

    Möller, T.; Endres, C.; Schilke, P.

    2017-02-01

    The eXtended CASA Line Analysis Software Suite (XCLASS) is a toolbox for the Common Astronomy Software Applications package (CASA) containing new functions for modeling interferometric and single-dish data. Among the tools is the myXCLASS program, which calculates synthetic spectra by solving the radiative transfer equation for an isothermal object in one dimension, taking finite source size and dust attenuation into account as well. Molecular data required by the myXCLASS program are taken from an embedded SQLite3 database containing entries from the Cologne Database for Molecular Spectroscopy (CDMS) and JPL, accessed via the Virtual Atomic and Molecular Data Center (VAMDC) portal. Additionally, the toolbox provides an interface for the model optimizer package Modeling and Analysis Generic Interface for eXternal numerical codes (MAGIX), which helps to find the best description of observational data using myXCLASS (or another external model program), that is, the parameter set that most closely reproduces the data. http://www.astro.uni-koeln.de/projects/schilke/myXCLASSInterface A copy of the code is available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/598/A7
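
    The one-dimensional isothermal radiative transfer solution mentioned above has a compact closed form, I = S(1 - e^(-tau)) + I_bg e^(-tau). The sketch below evaluates it for a single Gaussian line; the line frequency, width, opacity and temperatures are invented, and myXCLASS itself handles many lines plus source size and dust terms.

```python
# Sketch of the 1-D isothermal radiative-transfer solution that underlies
# synthetic-spectrum codes like myXCLASS (simplified: single component,
# single line, no beam filling or dust term).
import numpy as np

def planck(nu, T):
    h, k, c = 6.62607015e-34, 1.380649e-23, 2.99792458e8
    return 2 * h * nu**3 / c**2 / np.expm1(h * nu / (k * T))

nu = np.linspace(230.530e9, 230.545e9, 2000)   # Hz, fictitious line region
nu0, fwhm = 230.5375e9, 2.0e6                  # line centre, width (assumed)
tau0, T_ex, T_bg = 1.5, 60.0, 2.73             # peak opacity, temperatures
tau = tau0 * np.exp(-4 * np.log(2) * (nu - nu0)**2 / fwhm**2)

# I = S*(1 - e^-tau) + I_bg * e^-tau for an isothermal slab
intensity = planck(nu, T_ex) * (1 - np.exp(-tau)) + planck(nu, T_bg) * np.exp(-tau)
```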

  4. Controlatron Neutron Tube Test Suite Software Manual - Operation Manual (V2.2)

    CERN Document Server

    Noel, W P; Hertrich, R J; Martinez, M L; Wallace, D L

    2002-01-01

    The Controlatron Software Suite is a custom-built application for performing automated testing of Controlatron neutron tubes. The software package was designed to allow users to design tests and to run a series of test suites on a tube. The data are output to ASCII files of a pre-defined format for analysis and viewing with the Controlatron Data Viewer application. This manual covers the operation of the Controlatron Test Suite Software and includes a brief discussion of state machine theory, as a state machine is the functional basis of the software.
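
    Since a state machine is the functional basis of the software, a toy table-driven state machine may help fix the idea; the states and events below are invented for illustration and are not the Controlatron suite's actual states.

```python
# Toy table-driven state machine of the kind the manual describes as the
# functional basis of the test software (states/events invented).
IDLE, RUNNING, DONE = "IDLE", "RUNNING", "DONE"

TRANSITIONS = {
    (IDLE, "start_test"): RUNNING,
    (RUNNING, "measurement_complete"): DONE,
    (RUNNING, "abort"): IDLE,
    (DONE, "reset"): IDLE,
}

def step(state, event):
    """Return the next state, or stay put on an unrecognized event."""
    return TRANSITIONS.get((state, event), state)

state = IDLE
for event in ["start_test", "measurement_complete", "reset"]:
    state = step(state, event)
    print(event, "->", state)
```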

  5. The BTeV Software Tutorial Suite

    International Nuclear Information System (INIS)

    Kutschke, Robert K.

    2004-01-01

    The BTeV Collaboration is starting to develop its C++ based offline software suite, an integral part of which is a series of tutorials. These tutorials are targeted at a diverse audience, including new graduate students, experienced physicists with little or no C++ experience, those with just enough C++ to be dangerous, and experts who need only an overview of the available tools. The tutorials must teach both C++ in general and the BTeV-specific tools in particular. Finally, they must teach physicists how to find and use the detailed documentation. This report reviews the status of the BTeV experiment, gives an overview of the plans for and the state of the software, and then describes the plans for the tutorial suite.

  6. Tier-3 Monitoring Software Suite (T3MON) proposal

    CERN Document Server

    Andreeva, J; The ATLAS collaboration; Klimentov, A; Korenkov, V; Oleynik, D; Panitkin, S; Petrosyan, A

    2011-01-01

    The ATLAS Distributed Computing activities have so far concentrated on the "central" part of the experiment's computing system, namely the first three tiers (the CERN Tier0, the 10 Tier1 centres and the 60+ Tier2s). This is a coherent system for performing data processing and management on a global scale, hosting (re)processing and simulation activities down to group and user analysis. Many ATLAS institutes and national communities have built, or plan to build, Tier-3 facilities. The definition of the Tier-3 concept has been outlined (REFERENCE). Tier-3 centres consist of non-pledged resources, mostly dedicated to data analysis by geographically close or local scientific groups. Tier-3 sites comprise a range of architectures, and many do not run Grid middleware, which renders Tier-2 monitoring systems inapplicable. This document describes a strategy to develop a software suite for monitoring Tier-3 sites. This software suite will enable local monitoring of the Tier-3 sites and the global vie...

  7. Improvements to the APBS biomolecular solvation software suite

    Energy Technology Data Exchange (ETDEWEB)

    Jurrus, Elizabeth [Pacific Northwest National Laboratory, Richland Washington; Engel, Dave [Pacific Northwest National Laboratory, Richland Washington; Star, Keith [Pacific Northwest National Laboratory, Richland Washington; Monson, Kyle [Pacific Northwest National Laboratory, Richland Washington; Brandi, Juan [Pacific Northwest National Laboratory, Richland Washington; Felberg, Lisa E. [University of California, Berkeley California; Brookes, David H. [University of California, Berkeley California; Wilson, Leighton [University of Michigan, Ann Arbor Michigan; Chen, Jiahui [Southern Methodist University, Dallas Texas; Liles, Karina [Pacific Northwest National Laboratory, Richland Washington; Chun, Minju [Pacific Northwest National Laboratory, Richland Washington; Li, Peter [Pacific Northwest National Laboratory, Richland Washington; Gohara, David W. [St. Louis University, St. Louis Missouri; Dolinsky, Todd [FoodLogiQ, Durham North Carolina; Konecny, Robert [University of California San Diego, San Diego California; Koes, David R. [University of Pittsburgh, Pittsburgh Pennsylvania; Nielsen, Jens Erik [Protein Engineering, Novozymes A/S, Copenhagen Denmark; Head-Gordon, Teresa [University of California, Berkeley California; Geng, Weihua [Southern Methodist University, Dallas Texas; Krasny, Robert [University of Michigan, Ann Arbor Michigan; Wei, Guo-Wei [Michigan State University, East Lansing Michigan; Holst, Michael J. [University of California San Diego, San Diego California; McCammon, J. Andrew [University of California San Diego, San Diego California; Baker, Nathan A. [Pacific Northwest National Laboratory, Richland Washington; Brown University, Providence Rhode Island

    2017-10-24

    The Adaptive Poisson-Boltzmann Solver (APBS) software was developed to solve the equations of continuum electrostatics for large biomolecular assemblages and has had broad impact on the study of a wide range of chemical, biological, and biomedical applications. APBS addresses three key technology challenges for understanding solvation and electrostatics in biomedical applications: accurate and efficient models for biomolecular solvation and electrostatics, robust and scalable software for applying those theories to biomolecular systems, and mechanisms for sharing and analyzing biomolecular electrostatics data in the scientific community. To address new research applications and advances in computational capabilities, we have continually updated APBS and its suite of accompanying software since its release in 2001. In this manuscript, we discuss the models and capabilities that have recently been implemented within the APBS software package, including: a Poisson-Boltzmann analytical and a semi-analytical solver, an optimized boundary element solver, a geometry-based geometric flow solvation model, a graph theory-based algorithm for determining pKa values, and an improved web-based visualization tool for viewing electrostatics.
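
    The continuum electrostatics APBS solves numerically reduces, for a single ion in salt water, to the linearized Poisson-Boltzmann (Debye-Hueckel) screened potential. The sketch below evaluates that analytical limit; the salt concentration and permittivity are assumed values, and this kind of closed form is useful mainly as a sanity check on a numerical solver.

```python
# Linearized Poisson-Boltzmann (Debye-Hueckel) potential of a single ion
# in 1:1 salt water -- an analytical sketch of the physics APBS solves
# numerically for full biomolecules.
import numpy as np

eps0 = 8.8541878128e-12            # vacuum permittivity, F/m
eps_r = 78.5                       # relative permittivity of water (assumed)
kT = 1.380649e-23 * 298.0          # thermal energy at 298 K, J
e = 1.602176634e-19                # elementary charge, C
n_ion = 0.150 * 1000 * 6.02214076e23   # 150 mM salt, ions/m^3 per species

# Inverse Debye length: kappa^2 = 2 e^2 n / (eps0 eps_r kT)
kappa = np.sqrt(2 * e**2 * n_ion / (eps0 * eps_r * kT))
print("Debye length:", 1e9 / kappa, "nm")   # ~0.78 nm at 150 mM

def phi(r, q=e):
    """Screened Coulomb potential phi(r) = q e^{-kappa r} / (4 pi eps r)."""
    return q * np.exp(-kappa * r) / (4 * np.pi * eps0 * eps_r * r)
```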

  8. NGSUtils: a software suite for analyzing and manipulating next-generation sequencing datasets

    OpenAIRE

    Breese, Marcus R.; Liu, Yunlong

    2013-01-01

    Summary: NGSUtils is a suite of software tools for manipulating data common to next-generation sequencing experiments, such as FASTQ, BED and BAM format files. These tools provide a stable and modular platform for data management and analysis.
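
    As a flavor of the file formats involved, the sketch below is a minimal FASTQ reader with a per-read mean-quality calculation; it assumes four-line records and Phred+33 quality encoding, and it is an illustration rather than the NGSUtils API.

```python
# Minimal FASTQ reader illustrating the kind of file handling a suite
# like NGSUtils wraps (sketch only; assumes 4-line records, Phred+33).
def read_fastq(path):
    """Yield (name, sequence, quality) records from a FASTQ file."""
    with open(path) as fh:
        while True:
            header = fh.readline().rstrip()
            if not header:
                break                       # end of file
            seq = fh.readline().rstrip()
            fh.readline()                   # '+' separator line
            qual = fh.readline().rstrip()
            yield header[1:], seq, qual

# Example: mean Phred quality per record
for name, seq, qual in read_fastq("reads.fastq"):
    mean_q = sum(ord(c) - 33 for c in qual) / len(qual)
    print(name, mean_q)
```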

  9. Extending and Enhancing SAS (Static Analysis Suite)

    CERN Document Server

    Ho, David

    2016-01-01

    The Static Analysis Suite (SAS) is an open-source software package used to perform static analysis on C and C++ code, helping to ensure safety, readability and maintainability. In this Summer Student project, SAS was enhanced to improve ease of use and user customisation. A straightforward method of integrating static analysis into a project at compilation time was provided using the automated build tool CMake. The process of adding checkers to the suite was streamlined and simplified by developing an automatic code generator. To make SAS more suitable for continuous integration, a reporting mechanism summarising results was added. This suitability has been demonstrated by inclusion of SAS in the Future Circular Collider Software nightly build system. Scalability of the improved package was demonstrated by using the tool to analyse the ROOT code base.
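
    SAS itself targets C and C++, but the checker pattern it builds on, walking a syntax tree and reporting rule violations with locations, can be shown compactly with Python's ast module; the rule below (flagging bare except clauses) is an invented example, not one of SAS's checkers.

```python
# Toy static-analysis checker using Python's ast module, illustrating the
# general pattern: parse, walk the tree, report findings with line numbers.
import ast

class BareExceptChecker(ast.NodeVisitor):
    """Flag 'except:' clauses that swallow every exception."""
    def __init__(self):
        self.findings = []

    def visit_ExceptHandler(self, node):
        if node.type is None:               # bare 'except:'
            self.findings.append(f"line {node.lineno}: bare 'except:'")
        self.generic_visit(node)

source = """
try:
    risky()
except:
    pass
"""
checker = BareExceptChecker()
checker.visit(ast.parse(source))
print("\n".join(checker.findings))
```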

  10. CLMSVault: A Software Suite for Protein Cross-Linking Mass-Spectrometry Data Analysis and Visualization.

    Science.gov (United States)

    Courcelles, Mathieu; Coulombe-Huntington, Jasmin; Cossette, Émilie; Gingras, Anne-Claude; Thibault, Pierre; Tyers, Mike

    2017-07-07

    Protein cross-linking mass spectrometry (CL-MS) enables the sensitive detection of protein interactions and the inference of protein complex topology. The detection of chemical cross-links between protein residues can identify intra- and interprotein contact sites or provide physical constraints for molecular modeling of protein structure. Recent innovations in cross-linker design, sample preparation, mass spectrometry, and software tools have significantly improved CL-MS approaches. Although a number of algorithms now exist for the identification of cross-linked peptides from mass spectral data, a dearth of user-friendly analysis tools represents a practical bottleneck to the broad adoption of the approach. To facilitate the analysis of CL-MS data, we developed CLMSVault, a software suite designed to leverage existing CL-MS algorithms and provide intuitive and flexible tools for cross-platform data interpretation. CLMSVault stores and combines complementary information obtained from different cross-linkers and search algorithms. CLMSVault provides filtering, comparison, and visualization tools to support CL-MS analyses and includes a workflow for label-free quantification of cross-linked peptides. An embedded 3D viewer enables the visualization of quantitative data and the mapping of cross-linked sites onto PDB structural models. We demonstrate the application of CLMSVault for the analysis of a noncovalent Cdc34-ubiquitin protein complex cross-linked under different conditions. CLMSVault is open-source software (available at https://gitlab.com/courcelm/clmsvault.git), and a live demo is available at http://democlmsvault.tyerslab.com/.

  11. Software FMEA analysis for safety-related application software

    International Nuclear Information System (INIS)

    Park, Gee-Yong; Kim, Dong Hoon; Lee, Dong Young

    2014-01-01

    Highlights: • We develop a modified FMEA analysis suited to software architectures. • A template for failure modes in a specific software language is established. • A detailed-level software FMEA analysis of nuclear safety software is presented.

    Abstract: A method of software safety analysis for safety-related application software is described in this paper. The target software system is the code installed in the Automatic Test and Interface Processor (ATIP) of a digital reactor protection system (DRPS). For the ATIP software safety analysis, an overall safety (hazard) analysis is first performed over the software architecture and modules, and then a detailed safety analysis based on the software FMEA (Failure Modes and Effects Analysis) method is applied to the ATIP program. For an efficient analysis, the software FMEA is carried out using a so-called failure-mode template extracted from the function blocks used in the function block diagram (FBD) of the ATIP software. This FMEA-based safety analysis, applied to the ATIP software code, which had been integrated and had passed a very rigorous system test procedure, proved able to provide very valuable results (i.e., software defects) that could not be identified during the various system tests.
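
    The failure-mode-template idea generalizes readily: keep one list of failure modes per function-block type and cross it with every block instance in the diagram. The sketch below shows that expansion; the block types and failure modes are invented placeholders, not the paper's actual template.

```python
# Sketch of a failure-mode template keyed to function-block types, in the
# spirit of an FBD-based software FMEA (entries invented for illustration).
FAILURE_MODE_TEMPLATE = {
    "AND":   ["output stuck high", "output stuck low", "input not evaluated"],
    "TIMER": ["timer never expires", "timer expires early", "preset corrupted"],
    "COMP":  ["comparison inverted", "setpoint corrupted"],
}

def expand_fmea(blocks):
    """Cross each block instance with the failure modes for its type."""
    rows = []
    for name, block_type in blocks:
        for mode in FAILURE_MODE_TEMPLATE.get(block_type, ["unknown mode"]):
            rows.append({"block": name, "type": block_type,
                         "failure_mode": mode,
                         "effect": "TBD", "criticality": "TBD"})
    return rows

for row in expand_fmea([("trip_logic_1", "AND"), ("watchdog", "TIMER")]):
    print(row)
```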

  12. CAMEO (Computer-Aided Management of Emergency Operations) Software Suite

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — CAMEO is the umbrella name for a system of software applications used widely to plan for and respond to chemical emergencies. All of the programs in the suite work...

  13. The IMBA suite: integrated modules for bioassay analysis

    Energy Technology Data Exchange (ETDEWEB)

    Birchall, A.; Jarvis, N.S.; Peace, M.S.; Riddell, A.E.; Battersby, W.P

    1998-07-01

    The increasing complexity of models representing the biokinetic behaviour of radionuclides in the body following intake poses problems for people who are required to implement these models. The problem is exacerbated by the current paucity of suitable software. In order to remedy this situation, a collaboration between British Nuclear Fuels, Westlakes Research Institute and the National Radiological Protection Board has started with the aim of producing a suite of modules for estimating intakes and doses from bioassay measurements using the new ICRP models. Each module will have a single purpose (e.g. to calculate respiratory tract deposition) and will interface with other software using data files. The elements to be implemented initially are plutonium, uranium, caesium, iodine and tritium. It is intended to make the software available to other parties under terms yet to be decided. This paper describes the proposed suite of integrated modules for bioassay analysis, IMBA. (author)

  14. MODEL: A software suite for data acquisition

    Energy Technology Data Exchange (ETDEWEB)

    Sendall, D M; Boissat, C; Bozzoli, W; Burkimsher, P; Jones, R; Matheys, J P; Mornacchi, G; Nguyen, T; Vyvre, P vande; Vascotto, A; Weaver, D [European Organization for Nuclear Research, Geneva (Switzerland). DD Div.

    1989-12-01

    MODEL is a new suite of modular data-acquisition software. It is aimed at the needs of the LEP experiments, and is also general enough to be more widely used. It can accommodate a variety of user styles. It runs on a set of loosely coupled processors and makes use of the remote procedure call technique. Implemented originally for the VAX family, some of its services have already been extended to other systems, including embedded microprocessors. The software modules available include facilities for data-flow management, a framework for monitoring programs, a window-oriented human interface, an error message utility, a process control utility and a run control scheme. It is already in use in a variety of experiments, and is still under development in the light of user experience. (orig.).
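
    The remote procedure call technique MODEL relies on can be demonstrated with Python's standard xmlrpc modules; the server, port and the read_event_counter function below are invented stand-ins for illustration, not part of MODEL.

```python
# Minimal remote-procedure-call pair using Python's standard xmlrpc
# modules -- illustrating the RPC technique, not the MODEL software.
from xmlrpc.server import SimpleXMLRPCServer

def read_event_counter():
    return 42    # stand-in for a real data-acquisition query

server = SimpleXMLRPCServer(("localhost", 8000), allow_none=True)
server.register_function(read_event_counter)
# server.serve_forever()   # uncomment to run the server process

# In another (client) process:
# import xmlrpc.client
# daq = xmlrpc.client.ServerProxy("http://localhost:8000")
# print(daq.read_event_counter())
```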

  15. Knowledge Architect: A Tool Suite for Managing Software Architecture Knowledge

    NARCIS (Netherlands)

    Liang, Peng; Jansen, Anton; Avgeriou, Paris

    2009-01-01

    Management of software architecture knowledge (AK) is vital for improving an organization’s architectural capabilities. To support the architecting process at our industrial partner Astron, the Dutch radio astronomy institute, we implemented the Knowledge Architect (KA): a tool suite for...

  16. Improvements to the APBS biomolecular solvation software suite.

    Science.gov (United States)

    Jurrus, Elizabeth; Engel, Dave; Star, Keith; Monson, Kyle; Brandi, Juan; Felberg, Lisa E; Brookes, David H; Wilson, Leighton; Chen, Jiahui; Liles, Karina; Chun, Minju; Li, Peter; Gohara, David W; Dolinsky, Todd; Konecny, Robert; Koes, David R; Nielsen, Jens Erik; Head-Gordon, Teresa; Geng, Weihua; Krasny, Robert; Wei, Guo-Wei; Holst, Michael J; McCammon, J Andrew; Baker, Nathan A

    2018-01-01

    The Adaptive Poisson-Boltzmann Solver (APBS) software was developed to solve the equations of continuum electrostatics for large biomolecular assemblages and has had broad impact on the study of a wide range of chemical, biological, and biomedical applications. APBS addresses the three key technology challenges for understanding solvation and electrostatics in biomedical applications: accurate and efficient models for biomolecular solvation and electrostatics, robust and scalable software for applying those theories to biomolecular systems, and mechanisms for sharing and analyzing biomolecular electrostatics data in the scientific community. To address new research applications and advances in computational capabilities, we have continually updated APBS and its suite of accompanying software since its release in 2001. In this article, we discuss the models and capabilities that have recently been implemented within the APBS software package including a Poisson-Boltzmann analytical and a semi-analytical solver, an optimized boundary element solver, a geometry-based geometric flow solvation model, a graph theory-based algorithm for determining pKa values, and an improved web-based visualization tool for viewing electrostatics. © 2017 The Protein Society.

  17. Modular Algorithm Testbed Suite (MATS): A Software Framework for Automatic Target Recognition

    Science.gov (United States)

    2017-01-01

    Technical report NSWC PCD TR-2017-004, Naval Surface Warfare Center Panama City Division, Panama City, FL 32407-7001, 31-01-2017. ...a flexible platform to facilitate the development and testing of ATR algorithms. To that end, NSWC PCD has created the Modular Algorithm Testbed Suite (MATS).

  18. SSAGES: Software Suite for Advanced General Ensemble Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Sidky, Hythem [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; Colón, Yamil J. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Institute for Molecular Engineering and Materials Science Division, Argonne National Laboratory, Lemont, Illinois 60439, USA; Helfferich, Julian [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Steinbuch Center for Computing, Karlsruhe Institute of Technology, Hermann-von-Helmholtz-Platz 1, 76344 Eggenstein-Leopoldshafen, Germany; Sikora, Benjamin J. [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; Bezik, Cody [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Chu, Weiwei [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Giberti, Federico [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Guo, Ashley Z. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Jiang, Xikai [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Lequieu, Joshua [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Li, Jiyuan [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Moller, Joshua [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Quevillon, Michael J. [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; Rahimi, Mohammad [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Ramezani-Dakhel, Hadi [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Department of Biochemistry and Molecular Biology, University of Chicago, Chicago, Illinois 60637, USA; Rathee, Vikramjit S. [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; Reid, Daniel R. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Sevgen, Emre [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Thapar, Vikram [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Webb, Michael A. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Institute for Molecular Engineering and Materials Science Division, Argonne National Laboratory, Lemont, Illinois 60439, USA; Whitmer, Jonathan K. [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; de Pablo, Juan J. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Institute for Molecular Engineering and Materials Science Division, Argonne National Laboratory, Lemont, Illinois 60439, USA

    2018-01-28

    Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods, and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulation packages. SSAGES allows facile application of a variety of enhanced sampling techniques—including adaptive biasing force, string methods, and forward flux sampling—that extract meaningful free energy and transition path data from all-atom and coarse-grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite.
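
    The essence of the enhanced sampling that frameworks like SSAGES orchestrate can be seen in one dimension: bias a poorly sampled coordinate so the dynamics cross barriers, then reweight afterwards. The sketch below applies a harmonic umbrella to overdamped Langevin dynamics on a double well; it is a conceptual illustration, not SSAGES' API, and the potential and parameters are invented.

```python
# Minimal 1-D umbrella-sampling example on a double-well potential --
# a conceptual sketch of biased sampling, not SSAGES itself.
import numpy as np

rng = np.random.default_rng(0)

def force(x, k_bias=5.0, x0=0.0):
    f_physical = -4 * x**3 + 4 * x          # V(x) = x^4 - 2 x^2
    f_bias = -k_bias * (x - x0)             # harmonic umbrella at x0
    return f_physical + f_bias

# Overdamped Langevin dynamics: dx = F dt + sqrt(2 kT dt) * noise
x, dt, kT, n_steps = -1.0, 1e-3, 1.0, 200_000
traj = np.empty(n_steps)
for i in range(n_steps):
    x += force(x) * dt + np.sqrt(2 * kT * dt) * rng.standard_normal()
    traj[i] = x

# With the umbrella centred at x0 = 0, the barrier region is now sampled;
# reweighting (e.g. WHAM) would recover the unbiased free energy.
print("mean, std:", traj.mean(), traj.std())
```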

  19. SSAGES: Software Suite for Advanced General Ensemble Simulations

    Science.gov (United States)

    Sidky, Hythem; Colón, Yamil J.; Helfferich, Julian; Sikora, Benjamin J.; Bezik, Cody; Chu, Weiwei; Giberti, Federico; Guo, Ashley Z.; Jiang, Xikai; Lequieu, Joshua; Li, Jiyuan; Moller, Joshua; Quevillon, Michael J.; Rahimi, Mohammad; Ramezani-Dakhel, Hadi; Rathee, Vikramjit S.; Reid, Daniel R.; Sevgen, Emre; Thapar, Vikram; Webb, Michael A.; Whitmer, Jonathan K.; de Pablo, Juan J.

    2018-01-01

    Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulation packages. SSAGES allows facile application of a variety of enhanced sampling techniques—including adaptive biasing force, string methods, and forward flux sampling—that extract meaningful free energy and transition path data from all-atom and coarse-grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite. The code may be found at: https://github.com/MICCoM/SSAGES-public.

  20. Data analysis and graphing in an introductory physics laboratory: spreadsheet versus statistics suite

    International Nuclear Information System (INIS)

    Peterlin, Primoz

    2010-01-01

    Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared by analysing data collected in three selected experiments from an introductory physics laboratory, covering a linear dependence, a nonlinear dependence and a histogram. The merits of each method are compared.
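
    For comparison with both tools discussed in the paper, the same linear-dependence analysis can also be scripted; the sketch below fits a line with NumPy and reads parameter uncertainties off the fit covariance (the data values are invented).

```python
# A linear-dependence analysis done a third way: weighted least squares
# in NumPy (invented data, e.g. a Hooke's-law style measurement).
import numpy as np

x = np.array([0.1, 0.2, 0.3, 0.4, 0.5])       # e.g. mass (kg)
y = np.array([0.99, 2.03, 2.95, 4.10, 4.98])  # e.g. force (N)

# Fit y = a*x + b; parameter uncertainties come from the covariance matrix
coeffs, cov = np.polyfit(x, y, deg=1, cov=True)
a, b = coeffs
da, db = np.sqrt(np.diag(cov))
print(f"slope = {a:.3f} +/- {da:.3f}, intercept = {b:.3f} +/- {db:.3f}")
```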

  1. Data Analysis and Graphing in an Introductory Physics Laboratory: Spreadsheet versus Statistics Suite

    Science.gov (United States)

    Peterlin, Primoz

    2010-01-01

    Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared by analysing data collected in three selected experiments from an introductory physics laboratory, covering a linear dependence, a nonlinear dependence and a histogram. The merits of each method are compared. (Contains 7…

  2. MASH Suite: a user-friendly and versatile software interface for high-resolution mass spectrometry data interpretation and visualization.

    Science.gov (United States)

    Guner, Huseyin; Close, Patrick L; Cai, Wenxuan; Zhang, Han; Peng, Ying; Gregorich, Zachery R; Ge, Ying

    2014-03-01

    The rapid advancements in mass spectrometry (MS) instrumentation, particularly in Fourier transform (FT) MS, have made the acquisition of high-resolution and high-accuracy mass measurements routine. However, the software tools for the interpretation of high-resolution MS data are underdeveloped. Although several algorithms for the automatic processing of high-resolution MS data are available, there is still an urgent need for a user-friendly interface with functions that allow users to visualize and validate the computational output. Therefore, we have developed MASH Suite, a user-friendly and versatile software interface for processing high-resolution MS data. MASH Suite contains a wide range of features that allow users to easily navigate through data analysis, visualize complex high-resolution MS data, and manually validate automatically processed results. Furthermore, it provides easy, fast, and reliable interpretation of top-down, middle-down, and bottom-up MS data. MASH Suite is convenient, easily operated, and freely available. It can greatly facilitate the comprehensive interpretation and validation of high-resolution MS data with high accuracy and reliability.

  3. Strengthening Software Authentication with the ROSE Software Suite

    International Nuclear Information System (INIS)

    White, G

    2006-01-01

    Many recent nonproliferation and arms control software projects include a software authentication regime. These include U.S. Government-sponsored projects both in the United States and in the Russian Federation (RF). This trend toward requiring software authentication is only accelerating. Demonstrating assurance that software performs as expected without hidden "backdoors" is crucial to a project's success. In this context, "authentication" is defined as determining that a software package performs only its intended purpose and performs said purpose correctly and reliably over the planned duration of an agreement. In addition to visual inspections by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and not extensible. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool has to be based on a complete language compiler. ROSE is precisely such a compiler infrastructure, developed within the Department of Energy (DOE) and targeted at the optimization of scientific applications and user-defined libraries within large-scale applications (typically applications of a million lines of code). ROSE is a robust, source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C and C++ (handling the full C, C99, and C++ languages, with current collaborations to support Fortran90). We propose to extend ROSE to address a number of security-specific requirements and apply it to software authentication for nonproliferation and arms control projects.

  4. Engineering Software Suite Validates System Design

    Science.gov (United States)

    2007-01-01

    EDAptive Computing Inc.'s (ECI) EDAstar engineering software tool suite, created to capture and validate system design requirements, was significantly funded by NASA's Ames Research Center through five Small Business Innovation Research (SBIR) contracts. These programs specifically developed Syscape, used to capture executable specifications of multi-disciplinary systems, and VectorGen, used to automatically generate tests to ensure system implementations meet specifications. According to the company, the VectorGen tests considerably reduce the time and effort required to validate implementation of components, thereby ensuring their safe and reliable operation. EDASHIELD, an additional product offering from ECI, can be used to diagnose, predict, and correct errors after a system has been deployed, using EDAstar-created models. Initial commercialization of EDAstar included application by a large prime contractor in a military setting, and customers include various branches within the U.S. Department of Defense, industry giants like the Lockheed Martin Corporation, Science Applications International Corporation, and Ball Aerospace and Technologies Corporation, as well as NASA's Langley and Glenn Research Centers.

  5. A tool to include gamma analysis software into a quality assurance program.

    Science.gov (United States)

    Agnew, Christina E; McGarry, Conor K

    2016-03-01

    To provide a tool that enables gamma analysis software algorithms to be included in a quality assurance (QA) program. Four image sets were created: two geometric images to independently test the distance-to-agreement (DTA) and dose-difference (DD) elements of the gamma algorithm, a clinical step-and-shoot IMRT field, and a clinical VMAT arc. The images were analysed using global and local gamma analysis with two in-house and eight commercially available software packages, encompassing 15 software versions. The effect of image resolution on gamma pass rates was also investigated. All but one software package accurately calculated the gamma passing rate for the geometric images. Variation in global gamma passing rates of 1% at 3%/3 mm and over 2% at 1%/1 mm was measured between software packages and software versions with analysis of appropriately sampled images. This study provides a suite of test images and the gamma pass rates achieved for a selection of commercially available software. This image suite will enable validation of gamma analysis software within a QA program and provide a frame of reference by which to compare results reported in the literature from various manufacturers and software versions. Copyright © 2015. Published by Elsevier Ireland Ltd.
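
    The gamma algorithm under test combines the DTA and DD criteria into a single index, gamma = min over reference points of sqrt(dr^2/DTA^2 + dD^2/DD^2), with a point passing when gamma <= 1. A one-dimensional, globally normalized sketch follows; real QA software evaluates 2-D or 3-D dose grids, and the profiles here are invented.

```python
# One-dimensional gamma-index sketch (global normalization), following
# the standard Low et al. definition.
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dta=3.0, dd=0.03):
    """dta in mm; dd as a fraction of the global maximum dose."""
    d_norm = dd * d_ref.max()
    gammas = np.empty(len(x_eval))
    for i, (xe, de) in enumerate(zip(x_eval, d_eval)):
        dist2 = ((x_ref - xe) / dta) ** 2
        dose2 = ((d_ref - de) / d_norm) ** 2
        gammas[i] = np.sqrt((dist2 + dose2).min())
    return gammas

x = np.linspace(-20, 20, 401)                  # mm
ref = np.exp(-x**2 / 50.0)                     # reference dose profile
ev = np.exp(-(x - 0.5)**2 / 50.0) * 1.01       # shifted, rescaled copy
g = gamma_1d(x, ref, x, ev)
print("pass rate:", (g <= 1.0).mean() * 100, "%")
```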

  6. TypingSuite: Integrated Software for Presenting Stimuli, and Collecting and Analyzing Typing Data

    Science.gov (United States)

    Mazerolle, Erin L.; Marchand, Yannick

    2015-01-01

    Research into typing patterns has broad applications in both psycholinguistics and biometrics (i.e., improving security of computer access via each user's unique typing patterns). We present a new software package, TypingSuite, which can be used for presenting visual and auditory stimuli, collecting typing data, and summarizing and analyzing the…

  7. A Software Suite for Testing SpaceWire Devices and Networks

    Science.gov (United States)

    Mills, Stuart; Parkes, Steve

    2015-09-01

    SpaceWire is a data-handling network for use on-board spacecraft, which connects together instruments, mass-memory, processors, downlink telemetry, and other on-board sub-systems. SpaceWire is simple to implement and has some specific characteristics that help it support data-handling applications in space: high-speed, low-power, simplicity, relatively low implementation cost, and architectural flexibility making it ideal for many space missions. SpaceWire provides high-speed (2 Mbits/s to 200 Mbits/s), bi-directional, full-duplex data-links, which connect together SpaceWire enabled equipment. Data-handling networks can be built to suit particular applications using point-to-point data-links and routing switches. STAR-Dundee’s STAR-System software stack has been designed to meet the needs of engineers designing and developing SpaceWire networks and devices. This paper describes the aims of the software and how those needs were met.

  8. Data processing software suite SITENNO for coherent X-ray diffraction imaging using the X-ray free-electron laser SACLA

    International Nuclear Information System (INIS)

    Sekiguchi, Yuki; Oroguchi, Tomotaka; Takayama, Yuki; Nakasako, Masayoshi

    2014-01-01

    The software suite SITENNO is developed for processing diffraction data collected in coherent X-ray diffraction imaging experiments on non-crystalline particles using an X-ray free-electron laser. Coherent X-ray diffraction imaging is a promising technique for visualizing the structures of non-crystalline particles with dimensions of micrometers to sub-micrometers. Recently, X-ray free-electron laser sources have enabled efficient experiments in the ‘diffraction before destruction’ scheme. Diffraction experiments have been conducted at the SPring-8 Angstrom Compact free-electron LAser (SACLA) using the custom-made diffraction apparatus KOTOBUKI-1 and two multiport CCD detectors. In these experiments, tens of thousands of single-shot diffraction patterns can be collected within several hours. Then, diffraction patterns with intensity levels suitable for structural analysis must be found, direct-beam positions in the diffraction patterns determined, diffraction patterns from the two CCD detectors merged, and phase-retrieval calculations for structural analyses performed. A software suite named SITENNO has been developed to semi-automatically apply this four-step processing to a huge number of diffraction data. Here, details of the algorithm used in the suite are described and its performance on approximately 9000 diffraction patterns collected from cuboid-shaped copper oxide particles is reported. Using the SITENNO suite, it is possible to conduct experiments with data processing immediately after the data collection, and to characterize the size distribution and internal structures of the non-crystalline particles.
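
    The phase-retrieval step can be illustrated with the classic error-reduction iteration, which alternates between imposing the measured Fourier moduli and real-space constraints; the sketch below runs it on simulated data and is a generic textbook scheme, not SITENNO's actual implementation.

```python
# Conceptual error-reduction phase retrieval on simulated diffraction
# amplitudes -- a generic scheme, not SITENNO's algorithm.
import numpy as np

rng = np.random.default_rng(1)
n = 64
support = np.zeros((n, n), bool)
support[24:40, 24:40] = True                    # known object support
truth = np.where(support, rng.uniform(0.5, 1.0, (n, n)), 0.0)
measured_amp = np.abs(np.fft.fft2(truth))       # "measured" amplitudes

obj = rng.uniform(size=(n, n)) * support        # random starting guess
for _ in range(500):
    F = np.fft.fft2(obj)
    F = measured_amp * np.exp(1j * np.angle(F)) # impose measured moduli
    obj = np.fft.ifft2(F).real
    obj[~support] = 0.0                         # impose support constraint
    obj[obj < 0] = 0.0                          # and positivity
```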

  9. A comprehensive software suite for protein family construction and functional site prediction.

    Directory of Open Access Journals (Sweden)

    David Renfrew Haft

    In functionally diverse protein families, conservation in short signature regions may outperform full-length sequence comparisons for identifying proteins that belong to a subgroup within which one specific aspect of their function is conserved. The SIMBAL workflow (Sites Inferred by Metabolic Background Assertion Labeling) is a data-mining procedure for finding such signature regions. It begins by using clues from genomic context, such as co-occurrence or conserved gene neighborhoods, to build a useful training set from a large number of uncharacterized but mutually homologous proteins. When training set construction is successful, the YES partition is enriched in proteins that share function with the user's query sequence, while the NO partition is depleted. A selected query sequence is then mined for short signature regions whose closest matches overwhelmingly favor proteins from the YES partition. High-scoring signature regions typically contain key residues critical to functional specificity, so proteins with the highest sequence similarity across these regions tend to share the same function. The SIMBAL algorithm was described previously, but significant manual effort, expertise, and a supporting software infrastructure were required to prepare the requisite training sets. Here, we describe a new, distributable software suite that speeds up and simplifies the process of using SIMBAL, most notably by providing tools that automate training set construction. These tools have broad utility for comparative genomics, allowing for flexible collection of proteins or protein domains based on genomic context as well as homology, a capability that can greatly assist in protein family construction. Armed with this new software suite, SIMBAL can serve as a fast and powerful in silico alternative to direct experimentation for characterizing proteins and their functional interactions.
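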

  10. SAGES: a suite of freely-available software tools for electronic disease surveillance in resource-limited settings.

    Directory of Open Access Journals (Sweden)

    Sheri L Lewis

    Public health surveillance is undergoing a revolution driven by advances in the field of information technology. Many countries have experienced vast improvements in the collection, ingestion, analysis, visualization, and dissemination of public health data. Resource-limited countries have lagged behind due to challenges in information technology infrastructure, public health resources, and the costs of proprietary software. The Suite for Automated Global Electronic bioSurveillance (SAGES) is a collection of modular, flexible, freely-available software tools for electronic disease surveillance in resource-limited settings. One or more SAGES tools may be used in concert with existing surveillance applications, or the SAGES tools may be used en masse for an end-to-end biosurveillance capability. This flexibility allows for the development of an inexpensive, customized, and sustainable disease surveillance system. The ability to rapidly assess anomalous disease activity may lead to more efficient use of limited resources and better compliance with World Health Organization International Health Regulations.

  11. User Guide for the STAYSL PNNL Suite of Software Tools

    Energy Technology Data Exchange (ETDEWEB)

    Greenwood, Lawrence R.; Johnson, Christian D.

    2013-02-27

    The STAYSL PNNL software suite provides a set of tools for working with neutron activation rates measured in a nuclear fission reactor, an accelerator-based neutron source, or any neutron field to determine the neutron flux spectrum through a generalized least-squares approach. This process is referred to as neutron spectral adjustment since the preferred approach is to use measured data to adjust neutron spectra provided by neutron physics calculations. The input data consist of the reaction rates based on measured activities, an initial estimate of the neutron flux spectrum, neutron activation cross sections and their associated uncertainties (covariances), and relevant correction factors. The output consists of the adjusted neutron flux spectrum and associated covariance matrix, which is useful for neutron dosimetry and radiation damage calculations.
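
    The generalized least-squares adjustment has a compact matrix form: with prior spectrum phi0 and covariance C_phi, response matrix S, and measured reaction rates a with covariance C_a, the update is phi = phi0 + C_phi S^T (S C_phi S^T + C_a)^{-1} (a - S phi0). The toy numbers below are invented; STAYSL PNNL adds cross-section covariances, correction factors and much larger group structures.

```python
# Generalized least-squares spectral adjustment in matrix form (toy
# 3-group spectrum, 2 dosimetry reactions; all numbers invented).
import numpy as np

phi0 = np.array([1.0e12, 5.0e11, 1.0e11])        # prior flux, n/cm^2/s
C_phi = np.diag((0.2 * phi0) ** 2)               # 20% prior uncertainty
S = np.array([[1.0e-24, 5.0e-25, 1.0e-25],       # group cross sections, cm^2
              [2.0e-25, 1.0e-24, 4.0e-25]])
a_meas = np.array([1.35e-12, 8.1e-13])           # measured reaction rates
C_a = np.diag((0.05 * a_meas) ** 2)              # 5% measurement uncertainty

# phi = phi0 + C_phi S^T (S C_phi S^T + C_a)^{-1} (a - S phi0)
K = C_phi @ S.T @ np.linalg.inv(S @ C_phi @ S.T + C_a)
phi = phi0 + K @ (a_meas - S @ phi0)
C_post = C_phi - K @ S @ C_phi                   # adjusted covariance
print("adjusted spectrum:", phi)
```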

  12. PLEASE: The Python Low-energy Electron Analysis SuitE – Enabling Rapid Analysis of LEEM and LEED Data

    Directory of Open Access Journals (Sweden)

    Maxwell Grady

    2018-02-01

    PLEASE, the Python Low-energy Electron Analysis SuitE, provides an open-source and cross-platform graphical user interface (GUI) for rapid analysis and visualization of low energy electron microscopy (LEEM) data sets. LEEM and the associated technique, selected area micro-spot low energy electron diffraction (μ-LEED), are powerful tools for analysis of the surface structure of many novel materials. Specifically, these tools are uniquely suited to the characterization of two-dimensional materials. PLEASE offers a user-friendly point-and-click method for extracting intensity-voltage curves from LEEM and LEED data sets. Analysis of these curves provides insight into the atomic structure of the target material surface with unparalleled resolution.

  13. Real-time fluoroscopic needle guidance in the interventional radiology suite using navigational software for percutaneous bone biopsies in children

    Energy Technology Data Exchange (ETDEWEB)

    Shellikeri, Sphoorti; Srinivasan, Abhay; Krishnamurthy, Ganesh; Vatsky, Seth; Zhu, Xiaowei; Keller, Marc S.; Cahill, Anne Marie [The Children's Hospital of Philadelphia, Department of Radiology, Philadelphia, PA (United States); Setser, Randolph M. [Siemens Medical Solutions USA, Inc., Hoffman Estates, IL (United States); Hwang, Tiffany J. [University of Southern California, Keck School of Medicine, Los Angeles, CA (United States); Girard, Erin [Siemens Medical Solutions USA, Inc., Princeton, NJ (United States)

    2017-07-15

    Navigational software provides real-time fluoroscopic needle guidance for percutaneous procedures in the Interventional Radiology (IR) suite. We describe our experience with navigational software for pediatric percutaneous bone biopsies in the IR suite and compare technical success, diagnostic accuracy, radiation dose and procedure time with that of CT-guided biopsies. Pediatric bone biopsies performed using navigational software (Syngo iGuide, Siemens Healthcare) from 2011 to 2016 were prospectively included and anatomically matched CT-guided bone biopsies from 2008 to 2016 were retrospectively reviewed with institutional review board approval. C-arm CT protocols used for navigational software-assisted cases included institution-developed low-dose (0.1/0.17 μGy/projection), regular-dose (0.36 μGy/projection), or a combination of low-dose/regular-dose protocols. Estimated effective radiation dose and procedure times were compared between software-assisted and CT-guided biopsies. Twenty-six patients (15 male; mean age: 10 years) underwent software-assisted biopsies (15 pelvic, 7 lumbar and 4 lower extremity) and 33 patients (13 male; mean age: 9 years) underwent CT-guided biopsies (22 pelvic, 7 lumbar and 4 lower extremity). Both modality biopsies resulted in a 100% technical success rate. Twenty-five of 26 (96%) software-assisted and 29/33 (88%) CT-guided biopsies were diagnostic. Overall, the effective radiation dose was significantly lower in software-assisted than CT-guided cases (3.0±3.4 vs. 6.6±7.7 mSv, P=0.02). The effective dose difference was most dramatic in software-assisted cases using low-dose C-arm CT (1.2±1.8 vs. 6.6±7.7 mSv, P=0.001) or combined low-dose/regular-dose C-arm CT (1.9±2.4 vs. 6.6±7.7 mSv, P=0.04), whereas effective dose was comparable in software-assisted cases using regular-dose C-arm CT (6.0±3.5 vs. 6.6±7.7 mSv, P=0.7). Mean procedure time was significantly lower for software-assisted cases (91±54 vs. 141±68 min, P=0

  14. Approaches in highly parameterized inversion—PEST++ Version 3, a Parameter ESTimation and uncertainty analysis software suite optimized for large environmental models

    Science.gov (United States)

    Welter, David E.; White, Jeremy T.; Hunt, Randall J.; Doherty, John E.

    2015-09-18

    The PEST++ Version 1 object-oriented parameter estimation code is here extended to Version 3 to incorporate additional algorithms and tools to further improve support for large and complex environmental modeling problems. PEST++ Version 3 includes the Gauss-Marquardt-Levenberg (GML) algorithm for nonlinear parameter estimation, Tikhonov regularization, integrated linear-based uncertainty quantification, options of integrated TCP/IP based parallel run management or external independent run management by use of a Version 2 update of the GENIE Version 1 software code, and utilities for global sensitivity analyses. The Version 3 code design is consistent with PEST++ Version 1 and continues to be designed to lower the barriers of entry for users as well as developers while providing efficient and optimized algorithms capable of accommodating large, highly parameterized inverse problems. As such, this effort continues the original focus of (1) implementing the most popular and powerful features of the PEST software suite in a fashion that is easy for novice or experienced modelers to use and (2) developing a software framework that is easy to extend.
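
    The core GML iteration solves a damped normal-equations system at each step, (J^T J + lambda I) dp = J^T r. The sketch below applies it to an invented two-parameter model with a finite-difference Jacobian; PEST++ wraps this kernel with regularization, uncertainty quantification and parallel run management.

```python
# One Gauss-Marquardt-Levenberg step in matrix form (toy problem with a
# finite-difference Jacobian; not the PEST++ implementation).
import numpy as np

def model(p, t):
    return p[0] * np.exp(-p[1] * t)              # toy 2-parameter model

t_obs = np.linspace(0.0, 4.0, 9)
d_obs = model(np.array([2.0, 0.7]), t_obs)       # synthetic "observations"

def gml_step(p, lam=0.1, h=1e-6):
    r = d_obs - model(p, t_obs)                  # residual vector
    J = np.empty((len(t_obs), len(p)))           # finite-difference Jacobian
    for j in range(len(p)):
        dp = np.zeros_like(p); dp[j] = h
        J[:, j] = (model(p + dp, t_obs) - model(p, t_obs)) / h
    # Solve (J^T J + lambda I) delta = J^T r for the parameter update
    A = J.T @ J + lam * np.eye(len(p))
    return p + np.linalg.solve(A, J.T @ r)

p = np.array([1.0, 1.0])
for _ in range(20):
    p = gml_step(p)
print("estimated parameters:", p)                # approaches [2.0, 0.7]
```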

  15. Analytical Tools for Space Suit Design

    Science.gov (United States)

    Aitchison, Lindsay

    2011-01-01

    As indicated by the implementation of multiple small project teams within the agency, NASA is adopting a lean approach to hardware development that emphasizes quick product realization and rapid response to shifting program and agency goals. Over the past two decades, space suit design has been evolutionary in approach, with emphasis on building prototypes and then testing with the largest practical range of subjects possible. The results of these efforts show continuous improvement but make scaled design and performance predictions almost impossible with limited budgets and little time. Thus, in an effort to start changing the way NASA approaches space suit design and analysis, the Advanced Space Suit group has initiated the development of an integrated design and analysis tool. It is a multi-year, if not decadal, development effort that, when fully implemented, is envisioned to generate analysis of any given space suit architecture or, conversely, predictions of ideal space suit architectures given specific mission parameters. The master tool will exchange information to and from a set of five sub-tool groups in order to generate the desired output. The basic functions of each sub-tool group, the initial relationships between the sub-tools, and a comparison to state-of-the-art software and tools are discussed.

  16. Problem of Office Suite Training at the University

    Directory of Open Access Journals (Sweden)

    Natalia A. Nastashchuk

    2013-01-01

    The paper considers the problem of training users in office suite applications, a problem caused by the rapid change of application versions, the variety of software developers, and the rapid development of software and hardware platforms. The content of office suite training, based on a system of office suite notions, their basic functionality, and standards for information technology development (OpenDocument Format Standard, ISO 26300-200X), is presented.

  17. HAZARD ANALYSIS SOFTWARE

    International Nuclear Information System (INIS)

    Sommer, S; Tinh Tran, T.

    2008-01-01

    Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at the Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process, which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements was developed and compiled into a requirements traceability matrix from which software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive, with a number of suggestions for improvement, which are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures. The software has significantly improved the efficiency and standardization of the hazard analysis process.

  18. Salvus: A scalable software suite for full-waveform modelling & inversion

    Science.gov (United States)

    Afanasiev, M.; Boehm, C.; van Driel, M.; Krischer, L.; Fichtner, A.

    2017-12-01

    Full-waveform inversion (FWI), whether at the lab, exploration, or planetary scale, requires the cooperation of five principal components. (1) The geometry of the domain needs to be properly discretized and an initial guess of the model parameters must be projected onto it; (2) Large volumes of recorded waveform data must be collected, organized, and processed; (3) Synthetic waveform data must be efficiently and accurately computed through complex domains; (4) Suitable misfit functions and optimization techniques must be used to relate discrepancies in data space to perturbations in the model; and (5) Some form of workflow management must be employed to schedule and run (1) - (4) in the correct order. Each one of these components can represent a formidable technical challenge which redirects energy from the true task at hand: using FWI to extract new information about some underlying continuum. In this presentation we give an overview of the current status of the Salvus software suite, which was introduced to address the challenges listed above. Specifically, we touch on (1) salvus_mesher, which eases the discretization of complex Earth models into hexahedral meshes; (2) salvus_seismo, which integrates with LASIF and ObsPy to streamline the processing and preparation of seismic data; (3) salvus_wave, a high-performance and scalable spectral-element solver capable of simulating waveforms through general unstructured 2- and 3-D domains, and (4) salvus_opt, an optimization toolbox specifically designed for full-waveform inverse problems. Tying everything together, we also discuss (5) salvus_flow: a workflow package designed to orchestrate and manage the rest of the suite. It is our hope that these developments represent a step towards the automation of large-scale seismic waveform inversion, while also lowering the barrier of entry for new applications. We include several examples of Salvus' use in (extra-) planetary seismology, non-destructive testing, and medical

  19. Technical Note: DIRART- A software suite for deformable image registration and adaptive radiotherapy research

    International Nuclear Information System (INIS)

    Yang Deshan; Brame, Scott; El Naqa, Issam; Aditya, Apte; Wu Yu; Murty Goddu, S.; Mutic, Sasa; Deasy, Joseph O.; Low, Daniel A.

    2011-01-01

    Purpose: Recent years have witnessed tremendous progress in image-guided radiotherapy technology and a growing interest in the possibilities for adapting treatment planning and delivery over the course of treatment. One obstacle faced by the research community has been the lack of a comprehensive open-source software toolkit dedicated to adaptive radiotherapy (ART). To address this need, the authors have developed a software suite called the Deformable Image Registration and Adaptive Radiotherapy Toolkit (DIRART). Methods: DIRART is an open-source toolkit developed in MATLAB. It is designed in an object-oriented style with a focus on user-friendliness, features, and flexibility. It contains four classes of DIR algorithms, including the newer inverse consistency algorithms, which provide consistent displacement vector fields in both directions. It also contains common ART functions, an integrated graphical user interface, a variety of visualization and image-processing features, dose metric analysis functions, and interface routines. These interface routines make DIRART a powerful complement to the Computational Environment for Radiotherapy Research (CERR) and popular image-processing toolkits such as ITK. Results: DIRART provides a set of image processing/registration algorithms and postprocessing functions to facilitate the development and testing of DIR algorithms. It also offers a wide range of options for DIR results visualization, evaluation, and validation. Conclusions: By exchanging data with treatment planning systems via DICOM-RT files and CERR, and by bringing image registration algorithms closer to radiotherapy applications, DIRART is potentially a convenient and flexible platform that may facilitate ART and DIR research.
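
    The basic operation every DIR toolkit builds on is warping an image through a displacement vector field (DVF); the sketch below does this with SciPy as an illustrative Python analogue (DIRART itself is MATLAB), using an invented constant DVF.

```python
# Applying a displacement vector field (DVF) to a 2-D image with SciPy --
# the core warping operation of deformable image registration (sketch).
import numpy as np
from scipy import ndimage

img = np.zeros((64, 64))
img[20:44, 20:44] = 1.0                             # toy "anatomy"

yy, xx = np.meshgrid(np.arange(64), np.arange(64), indexing="ij")
dvf_y = 3.0 * np.ones_like(img)                     # sample 3 px further down
dvf_x = -2.0 * np.ones_like(img)                    # and 2 px to the left

# Pull-back interpolation: warped(y, x) = img(y + dy, x + dx)
warped = ndimage.map_coordinates(img, [yy + dvf_y, xx + dvf_x],
                                 order=1, mode="constant")
```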

  20. Free software for performing physical analysis of systems for digital radiography and mammography

    Energy Technology Data Exchange (ETDEWEB)

    Donini, Bruno; Lanconelli, Nico, E-mail: nico.lanconelli@unibo.it [Alma Mater Studiorum, Department of Physics and Astronomy, University of Bologna, Bologna 40127 (Italy); Rivetti, Stefano [Fisica Medica, Ospedale di Sassuolo S.p.A., Sassuolo 41049 (Italy); Bertolini, Marco [Medical Physics Unit, Azienda Ospedaliera ASMN, Istituto di Ricovero e Cura a Carattere Scientifico, Reggio Emilia 42123 (Italy)

    2014-05-15

    Purpose: In this paper, the authors present free software for assisting users in the physical characterization of x-ray digital systems and image quality checks. Methods: The program was developed as a plugin for the well-known public-domain suite ImageJ. The software assists users in calculating various physical parameters such as the response curve (also termed signal transfer property), modulation transfer function (MTF), noise power spectra (NPS), and detective quantum efficiency (DQE). It also includes the computation of some image quality checks: defective pixel analysis, uniformity, dark analysis, and lag. Results: The software was made available in 2009 and has been used during the last couple of years by many users, who gave us valuable feedback for improving its usability. It was tested for the physical characterization of several clinical systems for digital radiography and mammography, and various published papers have made use of the outcomes of the plugin. Conclusions: This software is potentially beneficial to a variety of users: physicists working in hospitals and staff working in radiological departments, such as medical physicists, physicians, and engineers. The plugin, together with a brief user manual, is freely available online ( http://www.medphys.it/downloads.htm ). With our plugin, users can estimate all three of the most important parameters used for physical characterization (MTF, NPS, and also DQE). The plugin can run on any operating system equipped with the ImageJ suite. The authors validated the software by comparing MTF and NPS curves on a common set of images with those obtained with other dedicated programs, achieving very good agreement.
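    For readers unfamiliar with the underlying computation, the sketch below estimates a 2-D noise power spectrum from flat-field ROIs, one of the quantities the plugin reports. It is an illustrative re-implementation following the usual periodogram-averaging convention, not the plugin's code (the plugin itself is Java/ImageJ).

        import numpy as np

        def nps_2d(rois, pixel_pitch):
            """rois: equally sized 2-D flat-field patches; pixel_pitch in mm."""
            rois = [np.asarray(r, dtype=float) for r in rois]
            ny, nx = rois[0].shape
            acc = np.zeros((ny, nx))
            for roi in rois:
                roi = roi - roi.mean()                # remove the DC component
                acc += np.abs(np.fft.fft2(roi)) ** 2  # periodogram of each ROI
            # Average over ROIs and scale to physical units (mm^2).
            return acc * pixel_pitch ** 2 / (len(rois) * nx * ny)

    In one common convention, the DQE then follows by combining this with the squared MTF: DQE(f) = S^2 MTF(f)^2 / (q NPS(f)), with S the mean flat-field signal and q the incident photon fluence.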

  1. Free software for performing physical analysis of systems for digital radiography and mammography.

    Science.gov (United States)

    Donini, Bruno; Rivetti, Stefano; Lanconelli, Nico; Bertolini, Marco

    2014-05-01

    In this paper, the authors present free software for assisting users in the physical characterization of x-ray digital systems and image quality checks. The program was developed as a plugin for the well-known public-domain suite ImageJ. The software assists users in calculating various physical parameters such as the response curve (also termed signal transfer property), modulation transfer function (MTF), noise power spectra (NPS), and detective quantum efficiency (DQE). It also includes the computation of some image quality checks: defective pixel analysis, uniformity, dark analysis, and lag. The software was made available in 2009 and has been used during the last couple of years by many users, who gave us valuable feedback for improving its usability. It was tested for the physical characterization of several clinical systems for digital radiography and mammography, and various published papers have made use of the outcomes of the plugin. This software is potentially beneficial to a variety of users: physicists working in hospitals and staff working in radiological departments, such as medical physicists, physicians, and engineers. The plugin, together with a brief user manual, is freely available online (www.medphys.it/downloads.htm). With our plugin, users can estimate all three of the most important parameters used for physical characterization (MTF, NPS, and also DQE). The plugin can run on any operating system equipped with the ImageJ suite. The authors validated the software by comparing MTF and NPS curves on a common set of images with those obtained with other dedicated programs, achieving very good agreement.

  2. Free software for performing physical analysis of systems for digital radiography and mammography

    International Nuclear Information System (INIS)

    Donini, Bruno; Lanconelli, Nico; Rivetti, Stefano; Bertolini, Marco

    2014-01-01

    Purpose: In this paper, the authors present free software for assisting users in the physical characterization of x-ray digital systems and image quality checks. Methods: The program was developed as a plugin for the well-known public-domain suite ImageJ. The software assists users in calculating various physical parameters such as the response curve (also termed signal transfer property), modulation transfer function (MTF), noise power spectra (NPS), and detective quantum efficiency (DQE). It also includes the computation of some image quality checks: defective pixel analysis, uniformity, dark analysis, and lag. Results: The software was made available in 2009 and has been used during the last couple of years by many users, who gave us valuable feedback for improving its usability. It was tested for the physical characterization of several clinical systems for digital radiography and mammography, and various published papers have made use of the outcomes of the plugin. Conclusions: This software is potentially beneficial to a variety of users: physicists working in hospitals and staff working in radiological departments, such as medical physicists, physicians, and engineers. The plugin, together with a brief user manual, is freely available online ( http://www.medphys.it/downloads.htm ). With our plugin, users can estimate all three of the most important parameters used for physical characterization (MTF, NPS, and also DQE). The plugin can run on any operating system equipped with the ImageJ suite. The authors validated the software by comparing MTF and NPS curves on a common set of images with those obtained with other dedicated programs, achieving very good agreement.

  3. EpiTools, A software suite for presurgical brain mapping in epilepsy: Intracerebral EEG.

    Science.gov (United States)

    Medina Villalon, S; Paz, R; Roehri, N; Lagarde, S; Pizzo, F; Colombet, B; Bartolomei, F; Carron, R; Bénar, C-G

    2018-03-29

    In pharmacoresistant epilepsy, exploration with depth electrodes may be needed to precisely define the epileptogenic zone. Accurate localization of these electrodes is thus essential for the interpretation of Stereotaxic EEG (SEEG) signals. As SEEG analysis increasingly relies on signal processing, it is crucial to link these results to the patient's anatomy. Our aims were thus to develop a suite of software tools, called "EpiTools", able to i) precisely and automatically localize the position of each SEEG contact and ii) display the results of signal analysis in each patient's anatomy. The first tool, GARDEL (GUI for Automatic Registration and Depth Electrode Localization), automatically localizes SEEG contacts and labels each contact according to a pre-specified nomenclature (for instance that of FreeSurfer or MarsAtlas). The second tool, 3Dviewer, enables visualization, within the patient's 3D anatomy, of signal-processing results such as biomarker rates, connectivity graphs, or the Epileptogenicity Index. GARDEL was validated in 30 patients by clinicians and proved highly reliable in determining the actual location of contacts within each patient's individual anatomy. GARDEL is a fully automatic electrode localization tool requiring limited user interaction (only for electrode naming or contact correction). The 3Dviewer is able to read signal-processing results and display them in relation to the patient's anatomy. EpiTools can thus help speed up the interpretation of SEEG data and improve its precision. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. The IFPUG guide to IT and software measurement

    CERN Document Server

    IFPUG

    2012-01-01

    The widespread deployment of millions of current and emerging software applications has placed software economic studies among the most critical of any form of business analysis. Unfortunately, a lack of an integrated suite of metrics makes software economic analysis extremely difficult. The International Function Point Users Group (IFPUG), a nonprofit and member-governed organization, has become the recognized leader in promoting the effective management of application software development and maintenance activities. The IFPUG Guide to IT and Software Measurement brings together 52 leading so
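    As a concrete example of the kind of integrated size metric IFPUG promotes, the sketch below computes an IFPUG-style adjusted function point count using average complexity weights; a real count assigns each function a low/average/high weight per the counting manual, so treat the numbers as illustrative.

        # Average IFPUG complexity weights for the five function types.
        AVERAGE_WEIGHTS = {"EI": 4, "EO": 5, "EQ": 4, "ILF": 10, "EIF": 7}

        def adjusted_function_points(counts, gsc_ratings):
            """counts: e.g. {"EI": 20, ...}; gsc_ratings: 14 values in 0..5."""
            ufp = sum(AVERAGE_WEIGHTS[t] * n for t, n in counts.items())
            vaf = 0.65 + 0.01 * sum(gsc_ratings)   # value adjustment factor
            return ufp * vaf

        print(adjusted_function_points(
            {"EI": 20, "EO": 10, "EQ": 5, "ILF": 8, "EIF": 3},
            [3] * 14))   # UFP = 251, VAF = 1.07 -> 268.57 adjusted FPs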

  5. A Parametric Model of Shoulder Articulation for Virtual Assessment of Space Suit Fit

    Science.gov (United States)

    Kim, K. Han; Young, Karen S.; Bernal, Yaritza; Boppana, Abhishektha; Vu, Linh Q.; Benson, Elizabeth A.; Jarvis, Sarah; Rajulu, Sudhakar L.

    2016-01-01

    Suboptimal suit fit is a known risk factor for crewmember shoulder injury. Suit fit assessment, however, is prohibitively time-consuming and cannot be generalized across wide variations of body shapes and poses. In this work, we have developed a new design tool based on the statistical analysis of body shape scans. This tool is aimed at predicting the skin deformation and shape variations for any body size and shoulder pose for a target population. This new process, when incorporated with CAD software, will enable virtual suit fit assessments, predictively quantifying the contact volume and clearance between the suit and body surface at reduced time and cost.
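    A statistical shape model of this kind is commonly built by principal component analysis over aligned scan vertices; the sketch below shows the basic construction and how new shapes are synthesized from mode coefficients. It is a generic illustration under that assumption, not the NASA tool's actual pipeline.

        import numpy as np

        def build_shape_model(scans):
            """scans: (n_subjects, n_vertices*3) matrix of aligned coordinates."""
            mean = scans.mean(axis=0)
            # SVD of the centered data yields the principal shape modes.
            u, s, vt = np.linalg.svd(scans - mean, full_matrices=False)
            sigmas = s / np.sqrt(len(scans) - 1)   # std dev along each mode
            return mean, vt, sigmas

        def synthesize(mean, modes, sigmas, coeffs):
            """New body shape from standardized mode coefficients."""
            coeffs = np.asarray(coeffs, dtype=float)
            k = len(coeffs)
            return mean + (coeffs * sigmas[:k]) @ modes[:k]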

  6. Dexterity: A MATLAB-based analysis software suite for processing and visualizing data from tasks that measure arm or forelimb function.

    Science.gov (United States)

    Butensky, Samuel D; Sloan, Andrew P; Meyers, Eric; Carmel, Jason B

    2017-07-15

    Hand function is critical for independence, and neurological injury often impairs dexterity. To measure hand function in people or forelimb function in animals, sensors are employed to quantify manipulation. These sensors make assessment easier and more quantitative and allow automation of these tasks. While automated tasks improve objectivity and throughput, they also produce large amounts of data that can be burdensome to analyze. We created software called Dexterity that simplifies data analysis of automated reaching tasks. Dexterity is MATLAB software that enables quick analysis of data from forelimb tasks. Through a graphical user interface, files are loaded and data are identified and analyzed. These data can be annotated or graphed directly. Analysis is saved, and the graph and corresponding data can be exported. For additional analysis, Dexterity provides access to custom scripts created by other users. To determine the utility of Dexterity, we performed a study to evaluate the effects of task difficulty on the degree of impairment after injury. Dexterity analyzed two months of data and allowed new users to annotate the experiment, visualize results, and save and export data easily. Previously, analysis of these tasks required custom scripts and expertise with analysis software. Dexterity made the tools required to analyze, visualize, and annotate data easy to use for investigators without data science experience. Dexterity increases accessibility to automated tasks that measure dexterity by making analysis of large data sets intuitive, robust, and efficient. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Personal computer security: part 1. Firewalls, antivirus software, and Internet security suites.

    Science.gov (United States)

    Caruso, Ronald D

    2003-01-01

    Personal computer (PC) security in the era of the Health Insurance Portability and Accountability Act of 1996 (HIPAA) involves two interrelated elements: safeguarding the basic computer system itself and protecting the information it contains and transmits, including personal files. HIPAA regulations have toughened the requirements for securing patient information, requiring every radiologist with such data to take further precautions. Security starts with physically securing the computer. Account passwords and a password-protected screen saver should also be set up. A modern antivirus program can easily be installed and configured. File scanning and updating of virus definitions are simple processes that can largely be automated and should be performed at least weekly. A software firewall is also essential for protection from outside intrusion, and an inexpensive hardware firewall can provide yet another layer of protection. An Internet security suite yields additional safety. Regular updating of the security features of installed programs is important. Obtaining a moderate degree of PC safety and security is somewhat inconvenient but is necessary and well worth the effort. Copyright RSNA, 2003

  8. A 'Toolbox' Equivalent Process for Safety Analysis Software

    International Nuclear Information System (INIS)

    O'Kula, K.R.; Eng, Tony

    2004-01-01

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 (Quality Assurance for Safety-Related Software) identified a number of quality assurance issues concerning the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls that prevent or mitigate potential accidents. The development and maintenance of a collection, or 'toolbox', of multiple-site use, standard solution, Software Quality Assurance (SQA)-compliant safety software is one of the major improvements identified in the associated DOE Implementation Plan (IP). The DOE safety analysis toolbox will contain a set of appropriately quality-assured, configuration-controlled, safety analysis codes, recognized for DOE-broad, safety basis applications. Currently, six widely applied safety analysis computer codes have been designated for toolbox consideration. While the toolbox concept considerably reduces SQA burdens among DOE users of these codes, many users of unique, single-purpose, or single-site software may still have sufficient technical justification to continue use of their computer code of choice, but are thwarted by the multiple-site condition on toolbox candidate software. The process discussed here provides a roadmap for an equivalency argument, i.e., establishing satisfactory SQA credentials for single-site software that can be deemed ''toolbox-equivalent''. The process is based on the model established to meet IP Commitment 4.2.1.2: Establish SQA criteria for the safety analysis ''toolbox'' codes. The implementing criteria that establish the set of prescriptive SQA requirements are based on implementation plans/procedures from the Savannah River Site, also incorporating aspects of those from the Waste Isolation Pilot Plant (SNL component) and the Yucca Mountain Project. The major requirements are met with evidence of a software quality assurance plan, software requirements and design documentation, user's instructions, test report, a

  9. Dtest Testing Software

    Science.gov (United States)

    Jain, Abhinandan; Cameron, Jonathan M.; Myint, Steven

    2013-01-01

    This software runs a suite of arbitrary software tests spanning various software languages and types of tests (unit level, system level, or file comparison tests). The dtest utility can be set to automate periodic testing of large suites of software, as well as running individual tests. It supports distributing multiple tests over multiple CPU cores, if available. The dtest tool is a utility program (written in Python) that scans through a directory (and its subdirectories) and finds all directories that match a certain pattern and then executes any tests in that directory as described in simple configuration files.
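    The scanning-and-dispatch pattern described above is easy to picture in a few lines of Python; the sketch below walks a directory tree for test configuration files and runs the tests across CPU cores. The config file name and its one-command-per-file layout are assumptions made for illustration, not dtest's actual format.

        import os, subprocess
        from multiprocessing import Pool

        def find_tests(root, config_name="TESTCONFIG"):
            # Yield every directory containing a test configuration file.
            for dirpath, _dirs, files in os.walk(root):
                if config_name in files:
                    yield os.path.join(dirpath, config_name)

        def run_test(config_path):
            # Hypothetical layout: first line of the config is the command.
            with open(config_path) as fh:
                cmd = fh.readline().strip()
            proc = subprocess.run(cmd, shell=True, cwd=os.path.dirname(config_path))
            return config_path, proc.returncode == 0

        if __name__ == "__main__":
            with Pool() as pool:   # distribute tests over available cores
                for path, ok in pool.imap_unordered(run_test, list(find_tests("."))):
                    print("PASS" if ok else "FAIL", path)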

  10. Software architecture analysis tool : software architecture metrics collection

    NARCIS (Netherlands)

    Muskens, J.; Chaudron, M.R.V.; Westgeest, R.

    2002-01-01

    The Software Engineering discipline lacks the ability to evaluate software architectures. Here we describe a tool for software architecture analysis that is based on metrics. Metrics can be used to detect possible problems and bottlenecks in software architectures. Even though metrics do not give a

  11. Application of Metric-based Software Reliability Analysis to Example Software

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Smidts, Carol

    2008-07-01

    The software reliability of the TELLERFAST ATM software is analyzed using two metric-based software reliability analysis methods: a state transition diagram-based method and a test coverage-based method. The procedures for software reliability analysis using the two methods and the analysis results are provided in this report. It is found that the two methods are complementary, and further research on combining the two methods to exploit this complementary effect in software reliability analysis is therefore recommended.

  12. Space suit bioenergetics: framework and analysis of unsuited and suited activity.

    Science.gov (United States)

    Carr, Christopher E; Newman, Dava J

    2007-11-01

    Metabolic costs limit the duration and intensity of extravehicular activity (EVA), an essential component of future human missions to the Moon and Mars. Energetics Framework: We present a framework for comparison of energetics data across and between studies. This framework, applied to locomotion, differentiates between muscle efficiency and energy recovery, two concepts often confused in the literature. The human run-walk transition in Earth gravity occurs at the point for which energy recovery is approximately the same for walking and running, suggesting a possible role for recovery in gait transitions. Muscular Energetics: Muscle physiology limits the overall efficiency by which chemical energy is converted through metabolism to useful work. Unsuited Locomotion: Walking and running use different methods of energy storage and release. These differences contribute to the relative changes in the metabolic cost of walking and running as gravity is varied, with the metabolic cost of locomoting at a given velocity changing in proportion to gravity for running and less than in proportion for walking. Space Suits: Major factors affecting the energetic cost of suited movement include suit pressurization, gravity, velocity, surface slope, and space suit configuration. Apollo lunar surface EVA traverse metabolic rates, while unexpectedly low, were higher than other activity categories. The Lunar Roving Vehicle facilitated even lower metabolic rates, thus longer duration EVAs. Muscles and tendons act like springs during running; similarly, longitudinal pressure forces in gas pressure space suits allow spring-like storage and release of energy when suits are self-supporting.

  13. ATSAS 2.8: a comprehensive data analysis suite for small-angle scattering from macromolecular solutions.

    Science.gov (United States)

    Franke, D; Petoukhov, M V; Konarev, P V; Panjkovich, A; Tuukkanen, A; Mertens, H D T; Kikhney, A G; Hajizadeh, N R; Franklin, J M; Jeffries, C M; Svergun, D I

    2017-08-01

    ATSAS is a comprehensive software suite for the analysis of small-angle scattering data from dilute solutions of biological macromolecules or nanoparticles. It contains applications for primary data processing and assessment, ab initio bead modelling, and model validation, as well as methods for the analysis of flexibility and mixtures. In addition, approaches are supported that utilize information from X-ray crystallography, nuclear magnetic resonance spectroscopy or atomistic homology modelling to construct hybrid models based on the scattering data. This article summarizes the progress made during the 2.5-2.8 ATSAS release series and highlights the latest developments. These include AMBIMETER, an assessment of the reconstruction ambiguity of experimental data; DATCLASS, a multiclass shape classification based on experimental data; SASRES, for estimating the resolution of ab initio model reconstructions; CHROMIXS, a convenient interface to analyse in-line size exclusion chromatography data; SHANUM, to evaluate the useful angular range in measured data; SREFLEX, to refine available high-resolution models using normal mode analysis; SUPALM, for a rapid superposition of low- and high-resolution models; and SASPy, the ATSAS plugin for interactive modelling in PyMOL. All these features and other improvements are included in the ATSAS release 2.8, freely available for academic users from https://www.embl-hamburg.de/biosaxs/software.html.

  14. Results and Analysis from Space Suit Joint Torque Testing

    Science.gov (United States)

    Matty, Jennifer

    2010-01-01

    This joint mobility KC lecture included information from two papers, "A Method for and Issues Associated with the Determination of Space Suit Joint Requirements" and "Results and Analysis from Space Suit Joint Torque Testing," as presented for the International Conference on Environmental Systems in 2009 and 2010, respectively. The first paper discusses historical joint torque testing methodologies and approaches that were tested in 2008 and 2009. The second paper discusses the testing that was completed in 2009 and 2010.

  15. An ion beam analysis software based on ImageJ

    International Nuclear Information System (INIS)

    Udalagama, C.; Chen, X.; Bettiol, A.A.; Watt, F.

    2013-01-01

    The suite of techniques (RBS, STIM, ERDS, PIXE, IL, IF,…) available in ion beam analysis yields a variety of rich information. Typically, after the initial challenge of acquiring data, we are then faced with the task of having to extract relevant information or to present the data in a format with the greatest impact. This process sometimes requires developing new software tools. When faced with such situations, the usual practice at the Centre for Ion Beam Applications (CIBA) in Singapore has been to use our computational expertise to develop ad hoc software tools as and when we need them. It then became apparent that the whole ion beam community could benefit from such tools; specifically from a common software toolset that can be developed and maintained by everyone, with freedom to use and allowance to modify. In addition to the benefits of ready-made tools and sharing the onus of development, this also opens up the possibility for collaborators to access and analyse ion beam data without having to depend on an ion beam lab. This has the virtue of making ion beam techniques more accessible to a broader scientific community. We have identified ImageJ as an appropriate software base on which to develop such a common toolset. In addition to being in the public domain and set up for collaborative tool development, ImageJ is accompanied by hundreds of modules (plugins) that allow great breadth in analysis. The present work is the first step towards integrating ion beam analysis into ImageJ. Some of the features of the current version of the ImageJ ‘ion beam’ plugin are: (1) reading list mode or event-by-event files, (2) energy gates/sorts, (3) sort stacks, (4) colour function, (5) real time map updating, (6) real time colour updating and (7) median and average map creation.
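    To make feature (2) concrete, the sketch below applies an energy gate to list-mode (event-by-event) data and accumulates a 2-D map, the basic operation behind features (1), (2) and (7). The record layout is an assumption made for illustration, and the plugin itself is written in Java for ImageJ.

        import numpy as np

        def gated_map(events, e_min, e_max, shape=(256, 256)):
            """events: structured array with 'x', 'y' and 'energy' fields."""
            sel = (events['energy'] >= e_min) & (events['energy'] <= e_max)
            counts = np.zeros(shape)
            # Accumulate one count per event that passes the energy gate.
            np.add.at(counts, (events['y'][sel], events['x'][sel]), 1)
            return counts

        # Synthetic list-mode data for demonstration:
        rng = np.random.default_rng(0)
        ev = np.zeros(10000, dtype=[('x', int), ('y', int), ('energy', float)])
        ev['x'], ev['y'] = rng.integers(0, 256, (2, 10000))
        ev['energy'] = rng.uniform(0.0, 100.0, 10000)
        elemental_map = gated_map(ev, 20.0, 40.0)   # map for one energy window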

  16. An ion beam analysis software based on ImageJ

    Energy Technology Data Exchange (ETDEWEB)

    Udalagama, C., E-mail: chammika@nus.edu.sg [Centre for Ion Beam Applications (CIBA), Department of Physics, National University of Singapore, 2 Science Drive 3, Singapore 117 542 (Singapore); Chen, X.; Bettiol, A.A.; Watt, F. [Centre for Ion Beam Applications (CIBA), Department of Physics, National University of Singapore, 2 Science Drive 3, Singapore 117 542 (Singapore)

    2013-07-01

    The suite of techniques (RBS, STIM, ERDS, PIXE, IL, IF,…) available in ion beam analysis yields a variety of rich information. Typically, after the initial challenge of acquiring data, we are then faced with the task of having to extract relevant information or to present the data in a format with the greatest impact. This process sometimes requires developing new software tools. When faced with such situations, the usual practice at the Centre for Ion Beam Applications (CIBA) in Singapore has been to use our computational expertise to develop ad hoc software tools as and when we need them. It then became apparent that the whole ion beam community could benefit from such tools; specifically from a common software toolset that can be developed and maintained by everyone, with freedom to use and allowance to modify. In addition to the benefits of ready-made tools and sharing the onus of development, this also opens up the possibility for collaborators to access and analyse ion beam data without having to depend on an ion beam lab. This has the virtue of making ion beam techniques more accessible to a broader scientific community. We have identified ImageJ as an appropriate software base on which to develop such a common toolset. In addition to being in the public domain and set up for collaborative tool development, ImageJ is accompanied by hundreds of modules (plugins) that allow great breadth in analysis. The present work is the first step towards integrating ion beam analysis into ImageJ. Some of the features of the current version of the ImageJ ‘ion beam’ plugin are: (1) reading list mode or event-by-event files, (2) energy gates/sorts, (3) sort stacks, (4) colour function, (5) real time map updating, (6) real time colour updating and (7) median and average map creation.

  17. An Analysis of Security and Privacy Issues in Smart Grid Software Architectures on Clouds

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Kumbhare, Alok; Cao, Baohua; Prasanna, Viktor K.

    2011-07-09

    Power utilities globally are increasingly upgrading to Smart Grids that use bi-directional communication with the consumer to enable an information-driven approach to distributed energy management. Clouds offer features well suited for Smart Grid software platforms and applications, such as elastic resources and shared services. However, the security and privacy concerns inherent in an information-rich Smart Grid environment are further exacerbated by their deployment on Clouds. Here, we present an analysis of security and privacy issues in a Smart Grid software architecture operating on different Cloud environments, in the form of a taxonomy. We use the Los Angeles Smart Grid Project that is underway in the largest U.S. municipal utility to drive this analysis, which will benefit both Cloud practitioners targeting Smart Grid applications and Cloud researchers investigating security and privacy.

  18. Software safety hazard analysis

    International Nuclear Information System (INIS)

    Lawrence, J.D.

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper

  19. Real-time development of data acquisition and analysis software for hands-on physiology education in neuroscience: G-PRIME.

    Science.gov (United States)

    Lott, Gus K; Johnson, Bruce R; Bonow, Robert H; Land, Bruce R; Hoy, Ronald R

    2009-01-01

    We report on the real-time creation of an application for hands-on neurophysiology in an advanced undergraduate teaching laboratory. Enabled by the rapid software development tools included in the Matlab technical computing environment (The Mathworks, Natick, MA), a team consisting of a neurophysiology educator and a biophysicist trained as an electrical engineer interfaced with a course of approximately 15 students from engineering and biology backgrounds. The result is the powerful freeware data acquisition and analysis environment, "g-PRIME." The software was developed from week to week in response to curriculum demands and student feedback. The program evolved from a simple software oscilloscope, enabling RC circuit analysis, to a suite of tools supporting analysis of neuronal excitability and synaptic transmission in invertebrate model systems. The program has subsequently expanded in application to university courses, research, and high school projects in the US and abroad as free courseware.

  20. RAJA Performance Suite

    Energy Technology Data Exchange (ETDEWEB)

    2017-09-01

    The RAJA Performance Suite is designed to evaluate performance of the RAJA performance portability library on a wide variety of important high performance computing (HPC) algorithmic kernels. These kernels assess compiler optimizations and various parallel programming model backends accessible through RAJA, such as OpenMP, CUDA, etc. The initial version of the suite contains 25 computational kernels, each of which appears in 6 variants: Baseline Sequential, RAJA Sequential, Baseline OpenMP, RAJA OpenMP, Baseline CUDA, RAJA CUDA. All variants of each kernel perform essentially the same mathematical operations and the loop body code for each kernel is identical across all variants. There are a few kernels, such as those that contain reduction operations, that require CUDA-specific coding for their CUDA variants. Actual computer instructions executed and how they run in parallel differ depending on the parallel programming model backend used and which optimizations are performed by the compiler used to build the Performance Suite executable. The Suite will be used primarily by RAJA developers to perform regular assessments of RAJA performance across a range of hardware platforms and compilers as RAJA features are being developed. It will also be used by LLNL hardware and software vendor partners for defining requirements for future computing platform procurements and acceptance testing. In particular, the RAJA Performance Suite will be used for compiler acceptance testing of the upcoming CORAL/Sierra machine (initial LLNL delivery expected in late 2017/early 2018) and the CORAL-2 procurement. The Suite will also be used to generate concise source code reproducers of compiler and runtime issues we uncover so that we may provide them to relevant vendors to be fixed.

  1. Technical Reference Suite Addressing Challenges of Providing Assurance for Fault Management Architectural Design

    Science.gov (United States)

    Fitz, Rhonda; Whitman, Gerek

    2016-01-01

    Research into complexities of software systems Fault Management (FM) and how architectural design decisions affect safety, preservation of assets, and maintenance of desired system functionality has coalesced into a technical reference (TR) suite that advances the provision of safety and mission assurance. The NASA Independent Verification and Validation (IV&V) Program, with Software Assurance Research Program support, extracted FM architectures across the IV&V portfolio to evaluate robustness, assess visibility for validation and test, and define software assurance methods applied to the architectures and designs. This investigation spanned IV&V projects with seven different primary developers, a wide range of sizes and complexities, and encompassed Deep Space Robotic, Human Spaceflight, and Earth Orbiter mission FM architectures. The initiative continues with an expansion of the TR suite to include Launch Vehicles, adding the benefit of investigating differences intrinsic to model-based FM architectures and insight into complexities of FM within an Agile software development environment, in order to improve awareness of how nontraditional processes affect FM architectural design and system health management. The identification of particular FM architectures, visibility, and associated IV&V techniques provides a TR suite that enables greater assurance that critical software systems will adequately protect against faults and respond to adverse conditions. Additionally, the role FM has with regard to strengthened security requirements, with potential to advance overall asset protection of flight software systems, is being addressed with the development of an adverse conditions database encompassing flight software vulnerabilities. Capitalizing on the established framework, this TR suite provides assurance capability for a variety of FM architectures and varied development approaches. Research results are being disseminated across NASA, other agencies, and the

  2. Application of Software Safety Analysis Methods

    International Nuclear Information System (INIS)

    Park, G. Y.; Hur, S.; Cheon, S. W.; Kim, D. H.; Lee, D. Y.; Kwon, K. C.; Lee, S. J.; Koo, Y. H.

    2009-01-01

    A fully digitalized reactor protection system, which is called the IDiPS-RPS, was developed through the KNICS project. The IDiPS-RPS has four redundant and separated channels. Each channel is mainly composed of a group of bistable processors which redundantly compare process variables with their corresponding setpoints and a group of coincidence processors that generate a final trip signal when a trip condition is satisfied. Each channel also contains a test processor called the ATIP and a display and command processor called the COM. All the functions were implemented in software. During the development of the safety software, various software safety analysis methods were applied, in parallel to the verification and validation (V and V) activities, along the software development life cycle. The software safety analysis methods employed were the software hazard and operability (Software HAZOP) study, the software fault tree analysis (Software FTA), and the software failure modes and effects analysis (Software FMEA)

  3. Software Application Profile: RVPedigree: a suite of family-based rare variant association tests for normally and non-normally distributed quantitative traits.

    Science.gov (United States)

    Oualkacha, Karim; Lakhal-Chaieb, Lajmi; Greenwood, Celia Mt

    2016-04-01

    RVPedigree (Rare Variant association tests in Pedigrees) implements a suite of programs facilitating genome-wide analysis of association between a quantitative trait and autosomal region-based genetic variation. The main features are the ability to appropriately test for association of rare variants with non-normally distributed quantitative traits, and to adjust appropriately for related individuals, whether from families or from population structure and cryptic relatedness. RVPedigree is available as an R package. The package includes calculation of kinship matrices, various options for coping with non-normality, three different ways of estimating statistical significance incorporating triaging to enable efficient use of the most computationally intensive calculations, and a parallelization option for genome-wide analysis. The software is available from the Comprehensive R Archive Network [CRAN.R-project.org] under the name 'RVPedigree' and at [https://github.com/GreenwoodLab]. It has been published under General Public License (GPL) version 3 or newer. © The Author 2016; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.
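    Central to any family-based test is the kinship matrix mentioned above; the sketch below computes pedigree kinship coefficients with the classic recursion (founders assumed unrelated and non-inbred). It is a generic illustration in Python rather than RVPedigree's own R implementation.

        def kinship(pedigree):
            """pedigree: dict id -> (father, mother), both None for founders;
            ids must be ordered so parents precede children."""
            ids = list(pedigree)
            phi = {}
            for a, i in enumerate(ids):
                f, m = pedigree[i]
                # Self-kinship: 1/2, plus half the parents' kinship if inbred.
                phi[i, i] = 0.5 if f is None else 0.5 + 0.5 * phi.get((f, m), 0.0)
                for j in ids[:a]:
                    # Kinship with an earlier individual comes via i's parents.
                    val = 0.0 if f is None else 0.5 * (phi.get((f, j), 0.0) +
                                                       phi.get((m, j), 0.0))
                    phi[i, j] = phi[j, i] = val
            return phi

        # Example: parents p1, p2 and child c1 -> phi(c1, p1) = 0.25.
        print(kinship({"p1": (None, None), "p2": (None, None),
                       "c1": ("p1", "p2")})[("c1", "p1")])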

  4. Development of Advanced Suite of Deterministic Codes for VHTR Physics Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kang Seog; Cho, J. Y.; Lee, K. H. (and others)

    2007-07-15

    An advanced suite of deterministic codes for VHTR physics analysis has been developed for detailed analysis of current and advanced reactor designs as part of a US-ROK collaborative I-NERI project. The code suite includes the conventional 2-step procedure, in which few-group constants are generated by a transport lattice calculation and the reactor physics analysis is performed by a 3-dimensional diffusion calculation, as well as a whole-core transport code that can model local heterogeneities directly at the core level. Particular modeling issues in the physics analysis of gas-cooled VHTRs were resolved, including the double heterogeneity of the coated fuel particles, neutron streaming in the coolant channels, a strong core-reflector interaction, and large spectrum shifts due to changes in the surrounding environment, temperature, and burnup. The geometry handling capability of the DeCART code was also extended to deal with the hexagonal fuel elements of the VHTR core. The developed code suite was validated and verified by comparing the computational results with those of Monte Carlo calculations for benchmark problems.

  5. Dependability Analysis Methods For Configurable Software

    International Nuclear Information System (INIS)

    Dahll, Gustav; Pulkkinen, Urho

    1996-01-01

    Configurable software systems are systems which are built up from standard software components in the same way as a hardware system is built up from standard hardware components. Such systems are often used in the control of NPPs, including in safety-related applications. A reliability analysis of such systems is therefore necessary. This report discusses what configurable software is and what is particular about the reliability assessment of such software. Two techniques very commonly used in traditional reliability analysis, viz. failure mode, effect and criticality analysis (FMECA) and fault tree analysis, are investigated. A real example is used to illustrate the discussed methods. Various aspects relevant to the assessment of software reliability in such systems are discussed. Finally, some models for quantitative software reliability assessment applicable to configurable software systems are described. (author)

  6. Reliability analysis of software based safety functions

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1993-05-01

    The methods applicable to the reliability analysis of software-based safety functions are described in the report. Although the safety functions also include other components, the main emphasis in the report is on the reliability analysis of software. Checklist-type qualitative reliability analysis methods, such as failure mode and effects analysis (FMEA), are described, as well as software fault tree analysis. Safety analysis based on Petri nets is discussed. The most essential concepts and models of quantitative software reliability analysis are described. The most common software metrics and their combined use with software reliability models are discussed. The application of software reliability models in PSA is evaluated; it is observed that recent software reliability models do not directly produce the estimates needed in PSA. As a result of the study, some recommendations and conclusions are drawn: the need for formal methods in the analysis and development of software-based systems, the applicability of qualitative reliability engineering methods in connection with PSA, and the need to make the requirements for software-based systems and their analyses in the regulatory guides more precise should be mentioned. (orig.). (46 refs., 13 figs., 1 tab.)

  7. Supervised Semi-Automated Data Analysis Software for Gas Chromatography / Differential Mobility Spectrometry (GC/DMS) Metabolomics Applications.

    Science.gov (United States)

    Peirano, Daniel J; Pasamontes, Alberto; Davis, Cristina E

    2016-09-01

    Modern differential mobility spectrometers (DMS) produce complex and multi-dimensional data streams that allow for near-real-time or post-hoc chemical detection for a variety of applications. An active area of interest for this technology is metabolite monitoring for biological applications, and these data sets regularly have unique technical and data analysis end user requirements. While there are initial publications on how investigators have individually processed and analyzed their DMS metabolomic data, there are no user-ready commercial or open source software packages that are easily used for this purpose. We have created custom software uniquely suited to analyze gas chromatograph / differential mobility spectrometry (GC/DMS) data from biological sources. Here we explain the implementation of the software, describe the user features that are available, and provide an example of how this software functions using a previously-published data set. The software is compatible with many commercial or home-made DMS systems. Because the software is versatile, it can also potentially be used for other similarly structured data sets, such as GC/GC and other IMS modalities.
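    As a flavor of what such processing involves, the sketch below smooths a 2-D GC/DMS intensity matrix (retention time by compensation voltage) and picks local maxima above an estimated noise floor, a common first step in semi-automated peak detection. The parameters and method are illustrative assumptions, not the published tool's algorithm.

        import numpy as np
        from scipy.ndimage import gaussian_filter, maximum_filter

        def pick_peaks(matrix, sigma=1.5, snr=5.0):
            """matrix: retention-time x compensation-voltage intensities."""
            smooth = gaussian_filter(matrix, sigma)   # suppress shot noise
            # Robust noise estimate via the median absolute deviation.
            noise = np.median(np.abs(smooth - np.median(smooth))) + 1e-12
            is_max = smooth == maximum_filter(smooth, size=5)  # local maxima
            return np.argwhere(is_max & (smooth > snr * noise))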

  8. Software development processes and analysis software: a mismatch and a novel framework

    International Nuclear Information System (INIS)

    Kelly, D.; Harauz, J.

    2011-01-01

    This paper discusses the salient characteristics of analysis software and the impact of those characteristics on its development. From this discussion, it can be seen that mainstream software development processes, usually characterized as Plan Driven or Agile, are built upon assumptions that are mismatched to the development and maintenance of analysis software. We propose a novel software development framework that would match the process normally observed in the development of analysis software. In the discussion of this framework, we suggest areas of research and directions for future work. (author)

  9. Fault tree analysis of KNICS RPS software

    International Nuclear Information System (INIS)

    Park, Gee Yong; Kwon, Kee Choon; Koh, Kwang Yong; Jee, Eun Kyoung; Seong, Poong Hyun; Lee, Dae Hyung

    2008-01-01

    This paper describes the application of a software Fault Tree Analysis (FTA) as one of the analysis techniques for a Software Safety Analysis (SSA) at the design phase and its analysis results for the safety-critical software of a digital reactor protection system, which is called the KNICS RPS, being developed in the KNICS (Korea Nuclear Instrumentation and Control Systems) project. The software modules in the design description were represented by Function Blocks (FBs), and the software FTA was performed based on the well-defined fault tree templates for the FBs. The SSA, which is part of the verification and validation (V and V) activities, was activated at each phase of the software lifecycle for the KNICS RPS. At the design phase, the software HAZOP (Hazard and Operability) and the software FTA were employed in the SSA in such a way that the software HAZOP was performed first and then the software FTA was applied. The software FTA was applied to some critical modules selected from the software HAZOP analysis
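    The quantitative step behind such an analysis is the evaluation of AND/OR gates over basic-event probabilities; the toy sketch below shows that computation under the usual independence assumption. The tree itself is illustrative, not the KNICS RPS fault tree.

        from functools import reduce

        def p_and(*ps):   # an AND gate fails only if all inputs fail
            return reduce(lambda a, b: a * b, ps, 1.0)

        def p_or(*ps):    # an OR gate fails if any input fails
            return 1.0 - reduce(lambda a, b: a * (1.0 - b), ps, 1.0)

        # Hypothetical top event: the trip output fails if both redundant
        # function blocks fail, or the voting logic fails.
        p_top = p_or(p_and(1e-3, 1e-3), 5e-5)
        print(f"P(top event) = {p_top:.2e}")   # ~5.10e-05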

  10. BuddySuite: Command-Line Toolkits for Manipulating Sequences, Alignments, and Phylogenetic Trees.

    Science.gov (United States)

    Bond, Stephen R; Keat, Karl E; Barreira, Sofia N; Baxevanis, Andreas D

    2017-06-01

    The ability to manipulate sequence, alignment, and phylogenetic tree files has become an increasingly important skill in the life sciences, whether to generate summary information or to prepare data for further downstream analysis. The command line can be an extremely powerful environment for interacting with these resources, but only if the user has the appropriate general-purpose tools on hand. BuddySuite is a collection of four independent yet interrelated command-line toolkits that facilitate each step in the workflow of sequence discovery, curation, alignment, and phylogenetic reconstruction. Most common sequence, alignment, and tree file formats are automatically detected and parsed, and over 100 tools have been implemented for manipulating these data. The project has been engineered to easily accommodate the addition of new tools, is written in the popular programming language Python, and is hosted on the Python Package Index and GitHub to maximize accessibility. Documentation for each BuddySuite tool, including usage examples, is available at http://tiny.cc/buddysuite_wiki. All software is open source and freely available through http://research.nhgri.nih.gov/software/BuddySuite. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution 2017. This work is written by US Government employees and is in the public domain in the US.
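    BuddySuite's tools are built on this read-manipulate-write pattern; the sketch below reproduces a single such step with Biopython (reverse-complementing every record in a FASTA stream) to show the shape of the operation, rather than using BuddySuite's own code.

        import sys
        from Bio import SeqIO

        def reverse_complement_stream(in_handle, out_handle, fmt="fasta"):
            records = []
            for rec in SeqIO.parse(in_handle, fmt):
                rec.seq = rec.seq.reverse_complement()   # manipulate in place
                rec.description += " (reverse complement)"
                records.append(rec)
            SeqIO.write(records, out_handle, fmt)        # write back out

        if __name__ == "__main__":
            # Usage: cat seqs.fasta | python rc.py > seqs_rc.fasta
            reverse_complement_stream(sys.stdin, sys.stdout)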

  11. ROCKETSHIP: a flexible and modular software tool for the planning, processing and analysis of dynamic MRI studies

    International Nuclear Information System (INIS)

    Barnes, Samuel R.; Ng, Thomas S. C.; Santa-Maria, Naomi; Montagne, Axel; Zlokovic, Berislav V.; Jacobs, Russell E.

    2015-01-01

    Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is a promising technique to characterize pathology and evaluate treatment response. However, analysis of DCE-MRI data is complex and benefits from concurrent analysis of multiple kinetic models and parameters. Few software tools are currently available that specifically focus on DCE-MRI analysis with multiple kinetic models. Here, we developed ROCKETSHIP, an open-source, flexible and modular software tool for DCE-MRI analysis. ROCKETSHIP incorporates analyses with multiple kinetic models, including data-driven nested model analysis. ROCKETSHIP was implemented using the MATLAB programming language. Robustness of the software in providing reliable fits using multiple kinetic models is demonstrated using simulated data. Simulations also demonstrate the utility of the data-driven nested model analysis. Applicability of ROCKETSHIP to both preclinical and clinical studies is shown using DCE-MRI studies of the human brain and a murine tumor model. A DCE-MRI software suite was thus implemented and tested using simulations, and its applicability to both preclinical and clinical datasets is shown. ROCKETSHIP was designed to be easily accessible for the beginner, but flexible enough for changes or additions to be made by the advanced user as well. The availability of a flexible analysis tool will aid future studies using DCE-MRI. A public release of ROCKETSHIP is available at (https://github.com/petmri/ROCKETSHIP).
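    To give a concrete sense of the kinetic modelling involved, the sketch below fits the standard Tofts model to a single voxel's concentration curve with nonlinear least squares; it is a minimal illustration assuming uniformly sampled time points, not ROCKETSHIP's MATLAB implementation.

        import numpy as np
        from scipy.optimize import curve_fit

        def tofts(t, ktrans, kep, cp, dt):
            """Ct(t) = Ktrans * convolution of Cp with exp(-kep * t)."""
            kernel = np.exp(-kep * t)
            return ktrans * np.convolve(cp, kernel)[:len(t)] * dt

        def fit_voxel(t, ct, cp):
            """t: time points; ct: tissue curve; cp: arterial input function."""
            dt = t[1] - t[0]
            model = lambda tt, ktrans, kep: tofts(tt, ktrans, kep, cp, dt)
            (ktrans, kep), _ = curve_fit(model, t, ct, p0=(0.1, 0.5),
                                         bounds=([0.0, 0.0], [5.0, 10.0]))
            return ktrans, kep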

  12. On-Orbit Software Analysis

    Science.gov (United States)

    Moran, Susanne I.

    2004-01-01

    The On-Orbit Software Analysis Research Infusion Project was done by Intrinsyx Technologies Corporation (Intrinsyx) at the National Aeronautics and Space Administration (NASA) Ames Research Center (ARC). The Project was a joint collaborative effort between NASA Codes IC and SL, Kestrel Technology (Kestrel), and Intrinsyx. The primary objectives of the Project were: Discovery and verification of software program properties and dependencies, Detection and isolation of software defects across different versions of software, and Compilation of historical data and technical expertise for future applications

  13. An Effective Strategy to Build Up a Balanced Test Suite for Spectrum-Based Fault Localization

    Directory of Open Access Journals (Sweden)

    Ning Li

    2016-01-01

    Full Text Available During past decades, many automated software fault diagnosis techniques, including Spectrum-Based Fault Localization (SBFL), have been proposed to improve the efficiency of software debugging. In SBFL, suspiciousness calculation is closely related to the number of failed and passed test cases. Studies have shown that the ratio of failed to passed test cases has a more significant impact on the accuracy of SBFL than the total number of test cases, and that a balanced test suite is more beneficial to the accuracy of SBFL. Based on theoretical analysis, we propose a PNF (Passed test cases, Not execute Faulty statement) strategy to reduce a test suite and build a more balanced one for SBFL, which can be used in regression testing. We evaluated the strategy in experiments using the Siemens programs and the Space program. Experiments indicated that our PNF strategy can be used to construct a new test suite effectively. Compared with the original test suite, the new one is smaller (on average, 90% of test cases were removed in our experiments) and has a more balanced ratio of failed to passed test cases, while retaining the same statement coverage and fault localization accuracy.
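    For context, the suspiciousness formulas in question reduce to simple ratios of these counts; the sketch below implements two standard ones (Tarantula and Ochiai), making clear why the failed/passed balance matters.

        import math

        def tarantula(ef, ep, total_f, total_p):
            """ef/ep: failed/passed tests that execute the statement."""
            if total_f == 0 or ef + ep == 0:
                return 0.0
            f_ratio = ef / total_f
            p_ratio = ep / total_p if total_p else 0.0
            return f_ratio / (f_ratio + p_ratio) if f_ratio + p_ratio else 0.0

        def ochiai(ef, ep, total_f, total_p):
            denom = math.sqrt(total_f * (ef + ep))
            return ef / denom if denom else 0.0

        print(tarantula(ef=4, ep=1, total_f=5, total_p=50))   # 0.9756...
        print(ochiai(ef=4, ep=1, total_f=5, total_p=50))      # 0.8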

  14. Buying in to bioinformatics: an introduction to commercial sequence analysis software.

    Science.gov (United States)

    Smith, David Roy

    2015-07-01

    Advancements in high-throughput nucleotide sequencing techniques have brought with them state-of-the-art bioinformatics programs and software packages. Given the importance of molecular sequence data in contemporary life science research, these software suites are becoming an essential component of many labs and classrooms, and as such are frequently designed for non-computer specialists and marketed as one-stop bioinformatics toolkits. Although beautifully designed and powerful, user-friendly bioinformatics packages can be expensive and, as more arrive on the market each year, it can be difficult for researchers, teachers and students to choose the right software for their needs, especially if they do not have a bioinformatics background. This review highlights some of the currently available and most popular commercial bioinformatics packages, discussing their prices, usability, features and suitability for teaching. Although several commercial bioinformatics programs are arguably overpriced and overhyped, many are well designed, sophisticated and, in my opinion, worth the investment. Whether you are just beginning your foray into molecular sequence analysis or are an experienced genomicist, I encourage you to explore proprietary software bundles. They have the potential to streamline your research, increase your productivity, energize your classroom and, if anything, add a bit of zest to the often dry, detached world of bioinformatics. © The Author 2014. Published by Oxford University Press.

  15. A study of software safety analysis system for safety-critical software

    International Nuclear Information System (INIS)

    Chang, H. S.; Shin, H. K.; Chang, Y. W.; Jung, J. C.; Kim, J. H.; Han, H. H.; Son, H. S.

    2004-01-01

    The core factors and requirements for safety-critical software are traced, and the methodology adopted in each stage of the software life cycle is presented. In the concept phase, a Failure Modes and Effects Analysis (FMEA) for the system has been performed. The feasibility evaluation of the selected safety parameter was performed, and a Preliminary Hazards Analysis list was prepared using the HAZOP (Hazard and Operability) technique. A check list for management control has been produced via a walk-through technique. Based on the evaluation of the check list, activities to be performed in the requirement phase have been determined. In the design phase, hazard analysis has been performed to check the safety capability of the system with regard to the safety software algorithm using Fault Tree Analysis (FTA). In the test phase, the test items based on FMEA have been checked for fitness, guided by an accident scenario. The pressurizer low pressure trip algorithm has been selected as a sample to which the FTA method is applied for software safety analysis. By applying a CASE tool, the requirements traceability of the safety-critical system has been enhanced during all phases of the software life cycle.

  16. The Sample Analysis at Mars Investigation and Instrument Suite

    Science.gov (United States)

    Mahaffy, Paul; Webster, Christopher R.; Conrad, Pamela G.; Arvey, Robert; Bleacher, Lora; Brinckerhoff, William B.; Eigenbrode, Jennifer L.; Chalmers, Robert A.; Dworkin, Jason P.; Errigo, Therese; et al.

    2012-01-01

    The Sample Analysis at Mars (SAM) investigation of the Mars Science Laboratory (MSL) addresses the chemical and isotopic composition of the atmosphere and volatiles extracted from solid samples. The SAM investigation is designed to contribute substantially to the mission goal of quantitatively assessing the habitability of Mars as an essential step in the search for past or present life on Mars. SAM is a 40 kg instrument suite located in the interior of MSL's Curiosity rover. The SAM instruments are a quadrupole mass spectrometer, a tunable laser spectrometer, and a 6-column gas chromatograph, all coupled through solid and gas processing systems to provide complementary information on the same samples. The SAM suite is able to measure a suite of light isotopes and to analyze volatiles directly from the atmosphere or thermally released from solid samples. In addition to measurements of simple inorganic compounds and noble gases, SAM will conduct a sensitive search for organic compounds with either thermal or chemical extraction from sieved samples delivered by the sample processing system on the Curiosity rover's robotic arm.

  17. The application of formal software engineering methods to the unattended and remote monitoring software suite at Los Alamos National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Determan, John Clifford [Los Alamos National Laboratory; Longo, Joseph F [Los Alamos National Laboratory; Michel, Kelly D [Los Alamos National Laboratory

    2009-01-01

    The Unattended and Remote Monitoring (UNARM) system is a collection of specialized hardware and software used by the International Atomic Energy Agency (IAEA) to institute nuclear safeguards at many nuclear facilities around the world. The hardware consists of detectors, instruments, and networked computers for acquiring various forms of data, including but not limited to radiation data, global position coordinates, camera images, isotopic data, and operator declarations. The software provides two primary functions: the secure and reliable collection of this data from the instruments and the ability to perform an integrated review and analysis of the disparate data sources. Several years ago the team responsible for maintaining the software portion of the UNARM system began the process of formalizing its operations. These formal operations include a configuration management system, a change control board, an issue tracking system, and extensive formal testing, for both functionality and reliability. Functionality is tested with formal test cases chosen to fully represent the data types and methods of analysis that will be commonly encountered. Reliability is tested with iterative, concurrent testing where up to five analyses are executed simultaneously for thousands of cycles. Iterative concurrent testing helps ensure that there are no resource conflicts or leaks when multiple system components are in use simultaneously. The goal of this work is to provide a high quality, reliable product, commensurate with the criticality of the application. Testing results will be presented that demonstrate that this goal has been achieved and the impact of the introduction of a formal software engineering framework to the UNARM product will be presented.

  18. The CMSSW benchmarking suite: Using HEP code to measure CPU performance

    International Nuclear Information System (INIS)

    Benelli, G

    2010-01-01

    The demanding computing needs of the CMS experiment require thoughtful planning and management of its computing infrastructure. A key factor in this process is the use of realistic benchmarks when assessing the computing power of the different architectures available. In recent years a discrepancy has been observed between the CPU performance estimates given by the reference benchmark for HEP computing (SPECint) and the actual performance of HEP code. Making use of the CPU performance tools from the CMSSW performance suite, comparative CPU performance studies have been carried out on several architectures. A benchmarking suite has been developed and integrated in the CMSSW framework to allow computing centers and interested third parties to benchmark architectures directly with CMSSW. The CMSSW benchmarking suite can be used out of the box to test and compare several machines in terms of CPU performance and to report the different benchmarking scores (e.g. by processing step) at the desired level of detail. In this talk we briefly describe the CMSSW software performance suite and, in detail, the CMSSW benchmarking suite client/server design, the performance data analysis, and the available CMSSW benchmark scores. The experience in the use of HEP code for benchmarking will be discussed and CMSSW benchmark results presented.

  19. The Effects of Development Team Skill on Software Product Quality

    Science.gov (United States)

    Beaver, Justin M.; Schiavone, Guy A.

    2006-01-01

    This paper provides an analysis of the effect of the skill/experience of the software development team on the quality of the final software product. A method for the assessment of software development team skill and experience is proposed, derived from a workforce management tool currently in use by the National Aeronautics and Space Administration. Using data from 26 small-scale software development projects, the team skill measures are correlated to 5 software product quality metrics from the ISO/IEC 9126 Software Engineering Product Quality standard. In the analysis of the results, development team skill is found to be a significant factor in the adequacy of the design and implementation. In addition, the results imply that inexperienced software developers are tasked with responsibilities ill-suited to their skill level, and thus have a significant adverse effect on the quality of the software product. Keywords: software quality, development skill, software metrics
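
    The core statistical step, correlating a team-skill score against a product-quality metric, amounts to a few lines. The data below are fabricated for illustration and are not the paper's project measurements:

        from scipy import stats

        team_skill = [2.1, 3.4, 4.0, 2.8, 3.9, 4.5, 1.9, 3.1]      # assessed skill scores per project
        defect_density = [5.2, 3.1, 2.0, 4.4, 2.5, 1.6, 6.0, 3.8]  # defects per kloc per project

        r, p = stats.pearsonr(team_skill, defect_density)
        print(f"r = {r:.2f}, p = {p:.3f}")   # a negative r: higher skill, fewer defects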

  20. CCP4 Software Suite: history, evolution, content, challenges and future developments

    Directory of Open Access Journals (Sweden)

    Krissinel, Eugene

    2015-04-01

    Full Text Available Collaborative Computational Project Number 4 (CCP4) in Protein Crystallography is a public resource for producing and supporting a world-leading, integrated suite of programs that allows researchers to determine macromolecular structures by X-ray crystallography and other biophysical techniques. CCP4 supports the widest possible researcher community, embracing academic, not-for-profit, and for-profit research. The primary aims of CCP4 include the development, and support for the development, of cutting-edge approaches to the experimental determination and analysis of protein structure, and their integration into the suite for worldwide dissemination. In addition, CCP4 plays an important role in the education and training of scientists in experimental structural biology. In this paper, we overview CCP4’s 35-year-long history and the technical milestones of its evolution. We also consider how the particular structure of the CCP4 Suite and Collaboration has emerged, its main functionality, its current state, and plans for the future.

  1. An integrative variant analysis suite for whole exome next-generation sequencing data

    Directory of Open Access Journals (Sweden)

    Challis Danny

    2012-01-01

    Full Text Available Abstract Background Whole exome capture sequencing allows researchers to cost-effectively sequence the coding regions of the genome. Although the exome capture sequencing methods have become routine and well established, there is currently a lack of tools specialized for variant calling in this type of data. Results Using statistical models trained on validated whole-exome capture sequencing data, the Atlas2 Suite is an integrative variant analysis pipeline optimized for variant discovery on all three of the widely used next generation sequencing platforms (SOLiD, Illumina, and Roche 454. The suite employs logistic regression models in conjunction with user-adjustable cutoffs to accurately separate true SNPs and INDELs from sequencing and mapping errors with high sensitivity (96.7%. Conclusion We have implemented the Atlas2 Suite and applied it to 92 whole exome samples from the 1000 Genomes Project. The Atlas2 Suite is available for download at http://sourceforge.net/projects/atlas2/. In addition to a command line version, the suite has been integrated into the Genboree Workbench, allowing biomedical scientists with minimal informatics expertise to remotely call, view, and further analyze variants through a simple web interface. The existing genomic databases displayed via the Genboree browser also streamline the process from variant discovery to functional genomics analysis, resulting in an off-the-shelf toolkit for the broader community.
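
    The statistical heart of such a pipeline, a logistic-regression classifier with a user-adjustable posterior cutoff separating true variants from errors, can be sketched as follows. This is not the trained Atlas2 model; the per-site features and the synthetic training data are invented for illustration:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        # Hypothetical per-site features: [variant read fraction, mean base quality, read depth].
        X_true = rng.normal([0.50, 30.0, 60.0], [0.10, 3.0, 15.0], size=(200, 3))  # validated variants
        X_err = rng.normal([0.15, 22.0, 25.0], [0.10, 4.0, 10.0], size=(200, 3))   # sequencing errors
        X = np.vstack([X_true, X_err])
        y = np.array([1] * 200 + [0] * 200)

        model = LogisticRegression(max_iter=1000).fit(X, y)
        posterior = model.predict_proba(X)[:, 1]

        cutoff = 0.8   # the user-adjustable knob trading sensitivity for specificity
        print(f"sites retained at cutoff {cutoff}: {(posterior >= cutoff).sum()} / {len(posterior)}")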

  2. The capabilities and applications of the saphire 5.0 safety assessment software

    International Nuclear Information System (INIS)

    Russell, K.D.; Wood, S.T.; Kvarfordt, K.J.

    1994-01-01

    The System Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a suite of computer programs that were developed to create and analyze a probabilistic risk assessment (PRA) of a nuclear power plant. The programs in this suite include: Models and Results Data Base (MAR-D) software, Integrated Reliability and Risk Analysis System (IRRAS) software, System Analysis and Risk Assessment (SARA) software, and the Fault tree, Event tree, and Piping and instrumentation diagram (FEP) graphical editor. Each of these programs performs a specific function in taking a PRA from the conceptual state all the way to publication. This paper provides an overview of the features and capabilities provided in version 5.0 of this software system. Some major new features include the ability to store unlimited cut sets, the ability to perform location transformations, the ability to perform seismic analysis, the ability to perform automated rule-based recovery analysis and end-state cut set partitioning, the ability to perform end-state analysis, a new alphanumeric fault tree editor, and a new alphanumeric event tree editor. Many enhancements and improvements to the user interface, as well as a significant reduction in the time required to perform an analysis, are included in version 5.0. These new features and capabilities provide a powerful set of PC-based PRA analysis tools.
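
    To make the cut-set vocabulary concrete, the sketch below derives minimal cut sets from a toy fault tree by plain boolean expansion. It illustrates the concept only, not SAPHIRE's algorithms (which must scale to unlimited cut sets); the gate structure and basic-event names are hypothetical:

        from itertools import product

        # Toy fault tree: gates are ("OR"/"AND", *children); leaves are basic-event names.
        TREE = ("OR",
                ("AND", "pump_A_fails", "pump_B_fails"),
                ("AND", "power_loss", ("OR", "valve_stuck", "operator_error")))

        def cut_sets(node):
            """Expand a gate into a list of cut sets, each a frozenset of basic events."""
            if isinstance(node, str):
                return [frozenset([node])]
            op, *children = node
            expanded = [cut_sets(c) for c in children]
            if op == "OR":                       # OR: union of the children's cut sets
                return [cs for sets in expanded for cs in sets]
            # AND: combine one cut set from each child (cartesian product)
            return [frozenset().union(*combo) for combo in product(*expanded)]

        def minimal(sets):
            """Keep only cut sets that contain no other cut set as a proper subset."""
            return [s for s in sets if not any(o < s for o in sets)]

        for cs in minimal(cut_sets(TREE)):
            print(sorted(cs))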

  3. Computer-assisted qualitative data analysis software.

    Science.gov (United States)

    Cope, Diane G

    2014-05-01

    Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually, and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.

  4. Gamma-Ray Spectrum Analysis Software GDA

    International Nuclear Information System (INIS)

    Wanabongse, P.

    1998-01-01

    The developmental work on computer software for gamma-ray spectrum analysis has been completed as a software package, version 1.02, named GDA, an acronym for Gamma-spectrum Deconvolution and Analysis. The software package consists of three 3.5-inch diskettes for setup and a user's manual. GDA can be installed on a personal computer running the Windows 95 or Windows NT 4.0 operating system; a machine with an 80486 CPU and 8 megabytes of memory is sufficient.

  5. Development of software tools for 4-D visualization and quantitative analysis of PHITS simulation results

    International Nuclear Information System (INIS)

    Furutaka, Kazuyoshi

    2015-02-01

    A suite of software tools has been developed to facilitate the development of apparatus using the radiation transport simulation code PHITS by enabling 4D visualization (3D space and time) and quantitative analysis of so-called die-away plots. To deliver usable tools as soon as possible, existing software was utilized as much as possible: ParaView is used for the 4D visualization of the results, whereas die-away plots are analyzed with the ROOT toolkit through a tool named “diana”. To enable 4D visualization using ParaView, a group of tools (angel2vtk, DispDCAS1, CamPos) has been developed to convert the data format to one that can be read by ParaView and to ease the visualization. (author)
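
    The format-conversion step can be illustrated briefly. ParaView reads, among other formats, legacy VTK; the function below is a hypothetical stand-in for a converter like angel2vtk, writing a 3D scalar grid (say, a tally on a regular mesh) as a legacy-VTK STRUCTURED_POINTS file that ParaView opens directly:

        import numpy as np

        def write_vtk_structured_points(path, grid, spacing=(1.0, 1.0, 1.0)):
            """Write a 3D numpy array (indexed [ix, iy, iz]) as a legacy-VTK
            STRUCTURED_POINTS file, a format ParaView reads directly."""
            nx, ny, nz = grid.shape
            with open(path, "w") as f:
                f.write("# vtk DataFile Version 3.0\n")
                f.write("simulated tally grid\nASCII\nDATASET STRUCTURED_POINTS\n")
                f.write(f"DIMENSIONS {nx} {ny} {nz}\n")
                f.write("ORIGIN 0 0 0\n")
                f.write(f"SPACING {spacing[0]} {spacing[1]} {spacing[2]}\n")
                f.write(f"POINT_DATA {nx * ny * nz}\n")
                f.write("SCALARS tally float 1\nLOOKUP_TABLE default\n")
                for iz in range(nz):             # VTK expects x to vary fastest
                    for iy in range(ny):
                        f.write(" ".join(f"{grid[ix, iy, iz]:.6g}" for ix in range(nx)) + "\n")

        grid = np.fromfunction(lambda x, y, z: np.exp(-0.1 * (x + y + z)), (8, 8, 8))
        write_vtk_structured_points("tally.vtk", grid)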

  6. A suite of models to support the quantitative assessment of spread in pest risk analysis

    NARCIS (Netherlands)

    Robinet, C.; Kehlenbeck, H.; Werf, van der W.

    2012-01-01

    In the frame of the EU project PRATIQUE (KBBE-2007-212459 Enhancements of pest risk analysis techniques) a suite of models was developed to support the quantitative assessment of spread in pest risk analysis. This dataset contains the model codes (R language) for the four models in the suite. Three

  7. Software component quality evaluation

    Science.gov (United States)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  8. The PyRosetta Toolkit: a graphical user interface for the Rosetta software suite.

    Science.gov (United States)

    Adolf-Bryfogle, Jared; Dunbrack, Roland L

    2013-01-01

    The Rosetta Molecular Modeling suite is a command-line-only collection of applications that enable high-resolution modeling and design of proteins and other molecules. Although extremely useful, Rosetta can be difficult to learn for scientists with little computational or programming experience. To that end, we have created a Graphical User Interface (GUI) for Rosetta, called the PyRosetta Toolkit, for creating and running protocols in Rosetta for common molecular modeling and protein design tasks and for analyzing the results of Rosetta calculations. The program is highly extensible so that developers can add new protocols and analysis tools to the PyRosetta Toolkit GUI.

  9. Software quality testing process analysis

    OpenAIRE

    Mera Paz, Julián

    2016-01-01

    Introduction: This article is the result of reading, review, analysis of books, magazines and articles well known for their scientific and research quality, which have addressed the software quality testing process. The author, based on his work experience in software development companies, teaching and other areas, has compiled and selected information to argue and substantiate the importance of the software quality testing process. Methodology: the existing literature on the software qualit...

  10. Infusing Reliability Techniques into Software Safety Analysis

    Science.gov (United States)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  11. Metadata database and data analysis software for the ground-based upper atmospheric data developed by the IUGONET project

    Science.gov (United States)

    Hayashi, H.; Tanaka, Y.; Hori, T.; Koyama, Y.; Shinbori, A.; Abe, S.; Kagitani, M.; Kouno, T.; Yoshida, D.; Ueno, S.; Kaneda, N.; Yoneda, M.; Tadokoro, H.; Motoba, T.; Umemura, N.; Iugonet Project Team

    2011-12-01

    The Inter-university Upper atmosphere Global Observation NETwork (IUGONET) is a Japanese inter-university project by the National Institute of Polar Research (NIPR), Tohoku University, Nagoya University, Kyoto University, and Kyushu University to build a database of metadata for ground-based observations of the upper atmosphere. The IUGONET institutes/universities have been collecting various types of data by radars, magnetometers, photometers, radio telescopes, helioscopes, etc. at various locations all over the world and at various altitude layers from the Earth's surface to the Sun. The metadata database will be of great help to researchers in efficiently finding and obtaining these observational data spread over the institutes/universities. This should also facilitate synthetic analysis of multi-disciplinary data, which will lead to new types of research in the upper atmosphere. The project has also been developing software to help researchers download, visualize, and analyze the data provided from the IUGONET institutes/universities. The metadata database system is built on the platform of DSpace, which is open-source software for digital repositories. The data analysis software is written in the IDL language with the TDAS (THEMIS Data Analysis Software suite) library. These products have just been released for beta testing.

  12. UFMulti: A new parallel processing software system for HEP

    Science.gov (United States)

    Avery, Paul; White, Andrew

    1989-12-01

    UFMulti is a multiprocessing software package designed for general purpose high energy physics applications, including physics and detector simulation, data reduction and DST physics analysis. The system is particularly well suited for installations where several workstations or computers are connected through a local area network (LAN). The initial configuration of the software is currently running on VAX/VMS machines with a planned extension to ULTRIX, using the new RISC CPUs from Digital, in the near future.

  13. UFMULTI: A new parallel processing software system for HEP

    International Nuclear Information System (INIS)

    Avery, P.; White, A.

    1989-01-01

    UFMulti is a multiprocessing software package designed for general purpose high energy physics applications, including physics and detector simulation, data reduction and DST physics analysis. The system is particularly well suited for installations where several workstations or computers are connected through a local area network (LAN). The initial configuration of the software is currently running on VAX/VMS machines with a planned extension to ULTRIX, using the new RISC CPUs from Digital, in the near future. (orig.)

  14. Fault Tree Analysis for Safety/Security Verification in Aviation Software

    Directory of Open Access Journals (Sweden)

    Andrew J. Kornecki

    2013-01-01

    Full Text Available The Next Generation Air Traffic Management system (NextGen) is a blueprint of the future National Airspace System. Supporting NextGen is a nation-wide Aviation Simulation Network (ASN), which allows integration of a variety of real-time simulations to facilitate development and validation of the NextGen software by simulating a wide range of operational scenarios. The ASN system is an environment including both simulated and human-in-the-loop real-life components (pilots and air traffic controllers). Real Time Distributed Simulation (RTDS), developed at Embry-Riddle Aeronautical University, a suite of applications providing low- and medium-fidelity en-route simulation capabilities, is one of the simulations contributing to the ASN. To support the interconnectivity with the ASN, we designed and implemented a dedicated gateway acting as an intermediary, providing logic for two-way communication and message transfer between RTDS and ASN, and storage for the exchanged data. It has been necessary to develop and analyze safety/security requirements for the gateway software based on analysis of system assets, hazards, threats and attacks related to the ultimate real-life future implementation. Due to the nature of the system, the focus was placed on communication security and the related safety of the impacted aircraft in the simulation scenario. To support the development of safety/security requirements, the well-established fault tree analysis technique was used. This fault tree model-based analysis, supported by a commercial tool, was a foundation for proposing mitigations assuring the gateway system's safety and security.
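
    For readers unfamiliar with the technique, a fault tree reduces to simple arithmetic once basic-event probabilities are assigned and the events are assumed independent. The gate structure and event names below are hypothetical, loosely inspired by the gateway scenario, and not taken from the paper's model:

        # Hypothetical basic-event probabilities for a gateway-outage scenario.
        PROB = {"gw_crash": 1e-3, "backup_fail": 1e-2, "link_jam": 1e-4, "spoofed_msg": 5e-5}

        TREE = ("OR",
                ("AND", "gw_crash", "backup_fail"),     # gateway down with no fallback
                "link_jam",
                "spoofed_msg")

        def p_top(node):
            """Top-event probability, assuming independent basic events."""
            if isinstance(node, str):
                return PROB[node]
            op, *children = node
            ps = [p_top(c) for c in children]
            if op == "AND":                             # all children must fail
                out = 1.0
                for p in ps:
                    out *= p
                return out
            out = 1.0                                   # OR: 1 - prod(1 - p_i)
            for p in ps:
                out *= 1.0 - p
            return 1.0 - out

        print(f"P(top event) = {p_top(TREE):.3e}")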

  15. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    Science.gov (United States)

    Sun, Z. J.; Wells, D.; Segebade, C.; Green, J.

    2011-06-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time and manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it possible to use it anywhere the internet is accessible. By switching the underlying nuclide library and the related formulae, the new software can be easily expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.

  16. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    International Nuclear Information System (INIS)

    Sun, Z. J.; Wells, D.; Green, J.; Segebade, C.

    2011-01-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time and manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it possible to use it anywhere the internet is accessible. By switching the underlying nuclide library and the related formulae, the new software can be easily expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.
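
    The concentration arithmetic at the heart of such a program can be illustrated with the relative (comparator) method, in which the sample's decay-corrected specific count rate is ratioed to that of a standard of known concentration. The sketch below uses fabricated numbers and omits the irradiation-time and geometry corrections a production code would apply:

        import math

        def decay_corrected_rate(peak_area, live_time, cool_time, half_life):
            """Count rate corrected back to the end of irradiation (all times in seconds)."""
            lam = math.log(2) / half_life
            return (peak_area / live_time) * math.exp(lam * cool_time)

        def concentration(sample, standard, std_conc_ppm):
            """Relative method: ratio of specific activities, scaled by the standard's concentration."""
            a_sample = decay_corrected_rate(**sample["counting"]) / sample["mass_g"]
            a_standard = decay_corrected_rate(**standard["counting"]) / standard["mass_g"]
            return std_conc_ppm * a_sample / a_standard

        # Fabricated peak areas, masses and timing for one nuclide in sample and standard.
        sample = {"mass_g": 0.512,
                  "counting": dict(peak_area=10500, live_time=3600, cool_time=7200, half_life=53940)}
        standard = {"mass_g": 0.250,
                    "counting": dict(peak_area=24800, live_time=3600, cool_time=3600, half_life=53940)}
        print(f"concentration = {concentration(sample, standard, std_conc_ppm=100.0):.1f} ppm")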

  17. The Development of an Analyses-Intensive Software for Improved ...

    African Journals Online (AJOL)

    The computer-aided software developed in this research work is used in designing cam systems by generating various follower motions and cam profiles. It is highly suited for extensive dynamics, kinematics and geometric design analysis based on some inherent features that are unique. The plate cam with either flat-face ...

  18. Analysis of open source GIS software

    OpenAIRE

    Božnis, Andrius

    2006-01-01

    GIS is one of the most promising spheres of information technology. GIS combines digital image analysis and database systems, which makes it widely applicable and demanding of very high skills. There is a lot of commercial GIS software which is well advertised and whose functionality is well known, while open-source software tends to be overlooked. This diploma work analyses the open-source GIS software available on the Internet, in the scope of different projects interr...

  19. Journal and Wave Bearing Impedance Calculation Software

    Science.gov (United States)

    Hanford, Amanda; Campbell, Robert

    2012-01-01

    The wave bearing software suite is a MATLAB application that computes bearing properties for user-specified wave bearing conditions, as well as plain journal bearings. Wave bearings are fluid film journal bearings with multi-lobed wave patterns around the circumference of the bearing surface. In this software suite, the dynamic coefficients are output in a form that allows easy implementation in a finite element model used in rotor dynamics analysis. The software has a graphical user interface (GUI) for inputting bearing geometry parameters, and uses MATLAB's structure interface for ease of interpreting data. This innovation was developed to provide the stiffness and damping components of wave bearing impedances. The computational method for computing bearing coefficients was originally designed for plain journal bearings and tilting pad bearings. Modifications to include a wave bearing profile consisted of changing the film thickness profile given by an equation, and writing an algorithm to locate the integration limits for each fluid region. Careful consideration was needed to implement the correct integration limits while computing the dynamic coefficients, depending on the form of the input/output variables specified in the algorithm.

  20. Software Design for Smile Analysis

    Directory of Open Access Journals (Sweden)

    A. Sarkhosh

    2010-12-01

    Full Text Available Introduction: Esthetics and attractiveness of the smile is one of the major demands in contemporary orthodontic treatment. In order to improve a smile design, it is necessary to record a “posed smile” as an intentional, non-pressure, static, natural and reproducible smile. The record should then be analyzed to determine its characteristics. In this study, we intended to design and introduce software to analyze the smile rapidly and precisely in order to produce an attractive smile for the patients. Materials and Methods: For this purpose, a practical study was performed to design the multimedia software “Smile Analysis”, which can receive patients’ photographs and videographs. After giving records to the software, the operator should mark the points and lines which are displayed on the system’s guide and also define the correct scale for each image. Thirty-three variables are measured by the software and displayed on the report page. Reliability of measurements in both image and video was significantly high (0.7-1). Results: In order to evaluate intra-operator and inter-operator reliability, five cases were selected randomly. Statistical analysis showed that calculations performed in the smile analysis software were both valid and highly reliable (for both video and photo). Conclusion: The results obtained from smile analysis could be used in diagnosis, treatment planning and evaluation of the treatment progress.

  1. Intercomparison of gamma ray analysis software packages

    International Nuclear Information System (INIS)

    1998-04-01

    The IAEA undertook an intercomparison exercise to review available software for gamma ray spectra analysis. This document describes the methods used in the intercomparison exercise, characterizes the software packages reviewed and presents the results obtained. Only direct results are given, without any recommendation for a particular software package or method for gamma ray spectra analysis.

  2. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report describes software safety analysis techniques and engineering guidelines for developing safety-critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the code-and-standards layer and the design-methodology-and-documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures, high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity (D-in-D and D) analysis guidelines.

  3. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    International Nuclear Information System (INIS)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report describes software safety analysis techniques and engineering guidelines for developing safety-critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the code-and-standards layer and the design-methodology-and-documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures, high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity (D-in-D and D) analysis guidelines.

  4. Computing and software

    Directory of Open Access Journals (Sweden)

    White, G. C.

    2004-06-01

    Full Text Available The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementation of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK requires that each new model’s likelihood must be programmed specifically for that model. They wishfully think that future software might allow the user to combine

  5. Effectiveness comparison of partially executed t-way test suites generated by existing strategies

    Science.gov (United States)

    Othman, Rozmie R.; Ahmad, Mohd Zamri Zahir; Ali, Mohd Shaiful Aziz Rashid; Zakaria, Hasneeza Liza; Rahman, Md. Mostafijur

    2015-05-01

    Consuming 40 to 50 percent of software development cost, software testing is one of the most resource-consuming activities in the software development lifecycle. To ensure an acceptable level of quality and reliability of a typical software product, it is desirable to test every possible combination of input data under various configurations. Due to the combinatorial explosion problem, exhaustive testing is practically impossible. Resource constraints, costing factors as well as strict time-to-market deadlines are amongst the main factors that inhibit such consideration. Earlier work suggests that a sampling strategy (i.e. one based on t-way parameter interaction, referred to as t-way testing) can be effective in reducing the number of test cases without affecting the fault detection capability. However, for a very large system, even a t-way strategy will produce a large test suite that needs to be executed. In the end, only part of the planned test suite can be executed in order to meet the aforementioned constraints. Here, there is a need for test engineers to measure the effectiveness of a partially executed test suite in order for them to assess the risk they have to take. Motivated by the abovementioned problem, this paper presents an effectiveness comparison of partially executed t-way test suites generated by existing strategies, using the tuple coverage method.
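
    Tuple coverage itself is straightforward to compute. The sketch below measures the fraction of all t-way value combinations that a partially executed suite covers; the parameter model and test cases are toy examples, not data from the paper:

        from itertools import combinations, product

        def tuple_coverage(executed, parameters, t=2):
            """Fraction of all t-way value tuples covered by the executed test cases.
            `parameters` lists the value set of each parameter, in order."""
            required = set()
            for cols in combinations(range(len(parameters)), t):
                for values in product(*(parameters[c] for c in cols)):
                    required.add((cols, values))
            covered = set()
            for test in executed:
                for cols in combinations(range(len(test)), t):
                    covered.add((cols, tuple(test[c] for c in cols)))
            return len(covered & required) / len(required)

        params = [("a", "b"), ("x", "y"), (0, 1)]                      # 3 parameters, 2 values each
        partial_suite = [("a", "x", 0), ("b", "y", 1), ("a", "y", 1)]  # only part of the full suite
        print(f"2-way coverage: {tuple_coverage(partial_suite, params):.0%}")   # 8 of 12 pairs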

  6. DelPhi: a comprehensive suite for DelPhi software and associated resources

    Directory of Open Access Journals (Sweden)

    Li Lin

    2012-05-01

    Full Text Available Abstract Background Accurate modeling of electrostatic potential and corresponding energies becomes increasingly important for understanding properties of biological macromolecules and their complexes. However, this is not an easy task due to the irregular shape of biological entities and the presence of water and mobile ions. Results Here we report a comprehensive suite for the well-known Poisson-Boltzmann solver, DelPhi, enriched with additional features to facilitate DelPhi usage. The suite allows for easy download of both DelPhi executable files and source code along with a makefile for local installations. The users can obtain the DelPhi manual and parameter files required for the corresponding investigation. Non-experienced researchers can download examples containing all necessary data to carry out DelPhi runs on a set of selected examples illustrating various DelPhi features and demonstrating DelPhi’s accuracy against analytical solutions. Conclusions The DelPhi suite offers not only the DelPhi executable and source files, examples and parameter files, but also provides links to third-party-developed resources either utilizing DelPhi or providing plugins for DelPhi. In addition, the users and developers are offered a forum to share ideas, resolve issues, report bugs and seek help with respect to the DelPhi package. The resource is available free of charge for academic users from URL: http://compbio.clemson.edu/DelPhi.php.

  7. An Overview of the XGAM Code and Related Software for Gamma-ray Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Younes, W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-11-13

    The XGAM spectrum-fitting code and associated software were developed specifically to analyze the complex gamma-ray spectra that can result from neutron-induced reactions. The XGAM code is designed to fit a spectrum over the entire available gamma-ray energy range as a single entity, in contrast to the more traditional piecewise approaches. This global-fit philosophy enforces background continuity as well as consistency between local and global behavior throughout the spectrum, and in a natural way. This report presents XGAM and the suite of programs built around it with an emphasis on how they fit into an overall analysis methodology for complex gamma-ray data. An application to the analysis of time-dependent delayed gamma-ray yields from 235U fission is shown in order to showcase the codes and how they interact.
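
    The global-fit idea, treating the spectrum as a single entity with a continuous background, can be contrasted with piecewise fitting in a small example. The following generic sketch (not XGAM itself) fits a linear background plus two Gaussian peaks to a simulated spectrum in one least-squares pass:

        import numpy as np
        from scipy.optimize import curve_fit

        def spectrum(e, b0, b1, *peaks):
            """Linear background plus Gaussian peaks given as (amplitude, centroid, sigma) triples."""
            y = b0 + b1 * e
            for amp, cen, sig in zip(peaks[0::3], peaks[1::3], peaks[2::3]):
                y = y + amp * np.exp(-0.5 * ((e - cen) / sig) ** 2)
            return y

        e = np.linspace(100, 1000, 900)                          # energy axis in keV
        truth = (50, -0.02, 400, 300, 3.0, 250, 700, 4.0)        # background + two peaks
        counts = np.random.default_rng(1).poisson(spectrum(e, *truth)).astype(float)

        p0 = (40, 0.0, 350, 302, 2.5, 200, 698, 3.5)             # rough initial guesses
        popt, pcov = curve_fit(spectrum, e, counts, p0=p0)       # one fit for the whole spectrum
        for i in (2, 5):
            area = popt[i] * popt[i + 2] * np.sqrt(2 * np.pi)
            print(f"peak at {popt[i + 1]:.1f} keV, area ~ {area:.0f} counts")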

  8. Performance testing of 3D point cloud software

    Science.gov (United States)

    Varela-González, M.; González-Jorge, H.; Riveiro, B.; Arias, P.

    2013-10-01

    LiDAR systems are being used widely in recent years for many applications in the engineering field: civil engineering, cultural heritage, mining, industry and environmental engineering. One of the most important limitations of this technology is the large computational requirements involved in data processing, especially for large mobile LiDAR datasets. Several software solutions for data management are available in the market, including open-source suites; however, users often do not know methodologies to verify their performance properly. In this work a methodology for LiDAR software performance testing is presented and four different suites are studied: QT Modeler, VR Mesh, AutoCAD Civil 3D and the Point Cloud Library running in software developed at the University of Vigo (SITEGI). The software based on the Point Cloud Library shows better results in the loading time of the point clouds and CPU usage. However, it is not as strong as commercial suites in working set and commit size tests.

  9. Performance testing of 3D point cloud software

    Directory of Open Access Journals (Sweden)

    M. Varela-González

    2013-10-01

    Full Text Available LiDAR systems are being used widely in recent years for many applications in the engineering field: civil engineering, cultural heritage, mining, industry and environmental engineering. One of the most important limitations of this technology is the large computational requirements involved in data processing, especially for large mobile LiDAR datasets. Several software solutions for data management are available in the market, including open-source suites; however, users often do not know methodologies to verify their performance properly. In this work a methodology for LiDAR software performance testing is presented and four different suites are studied: QT Modeler, VR Mesh, AutoCAD Civil 3D and the Point Cloud Library running in software developed at the University of Vigo (SITEGI). The software based on the Point Cloud Library shows better results in the loading time of the point clouds and CPU usage. However, it is not as strong as commercial suites in working set and commit size tests.
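
    A measurement harness for this kind of testing can be assembled from standard tooling. The sketch below is a generic illustration rather than the paper's methodology: it times the loading of a point cloud and records resident-set and virtual-memory deltas, rough proxies for the working-set and commit-size figures mentioned above, using the psutil package:

        import time
        import numpy as np
        import psutil

        def measure_load(path):
            """Time a point-cloud load and record memory deltas for the current process."""
            proc = psutil.Process()
            before = proc.memory_info()
            t0 = time.perf_counter()
            cloud = np.loadtxt(path)                 # stand-in for the loader under test
            elapsed = time.perf_counter() - t0
            after = proc.memory_info()
            return {"points": len(cloud),
                    "load_s": round(elapsed, 2),
                    "rss_delta_MB": round((after.rss - before.rss) / 2**20, 1),   # ~working set
                    "vms_delta_MB": round((after.vms - before.vms) / 2**20, 1)}   # ~commit size

        np.savetxt("cloud.xyz", np.random.rand(200_000, 3))      # small synthetic XYZ cloud
        print(measure_load("cloud.xyz"))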

  10. Software criticality analysis of COTS/SOUP

    Energy Technology Data Exchange (ETDEWEB)

    Bishop, Peter; Bloomfield, Robin; Clement, Tim; Guerra, Sofia

    2003-09-01

    This paper describes the Software Criticality Analysis (SCA) approach that was developed to support the justification of using commercial off-the-shelf software (COTS) in a safety-related system. The primary objective of SCA is to assess the importance to safety of the software components within the COTS and to show there is segregation between software components with different safety importance. The approach taken was a combination of Hazops based on design documents and on a detailed analysis of the actual code (100 kloc). Considerable effort was spent on validation and ensuring the conservative nature of the results. The results from reverse engineering from the code showed that results based only on architecture and design documents would have been misleading.

  11. Software criticality analysis of COTS/SOUP

    International Nuclear Information System (INIS)

    Bishop, Peter; Bloomfield, Robin; Clement, Tim; Guerra, Sofia

    2003-01-01

    This paper describes the Software Criticality Analysis (SCA) approach that was developed to support the justification of using commercial off-the-shelf software (COTS) in a safety-related system. The primary objective of SCA is to assess the importance to safety of the software components within the COTS and to show there is segregation between software components with different safety importance. The approach taken was a combination of Hazops based on design documents and on a detailed analysis of the actual code (100 kloc). Considerable effort was spent on validation and ensuring the conservative nature of the results. The results from reverse engineering from the code showed that results based only on architecture and design documents would have been misleading.

  12. An Integrated Software Suite for Surface-based Analyses of Cerebral Cortex

    Science.gov (United States)

    Van Essen, David C.; Drury, Heather A.; Dickson, James; Harwell, John; Hanlon, Donna; Anderson, Charles H.

    2001-01-01

    The authors describe and illustrate an integrated trio of software programs for carrying out surface-based analyses of cerebral cortex. The first component of this trio, SureFit (Surface Reconstruction by Filtering and Intensity Transformations), is used primarily for cortical segmentation, volume visualization, surface generation, and the mapping of functional neuroimaging data onto surfaces. The second component, Caret (Computerized Anatomical Reconstruction and Editing Tool Kit), provides a wide range of surface visualization and analysis options as well as capabilities for surface flattening, surface-based deformation, and other surface manipulations. The third component, SuMS (Surface Management System), is a database and associated user interface for surface-related data. It provides for efficient insertion, searching, and extraction of surface and volume data from the database. PMID:11522765

  13. An integrated software suite for surface-based analyses of cerebral cortex

    Science.gov (United States)

    Van Essen, D. C.; Drury, H. A.; Dickson, J.; Harwell, J.; Hanlon, D.; Anderson, C. H.

    2001-01-01

    The authors describe and illustrate an integrated trio of software programs for carrying out surface-based analyses of cerebral cortex. The first component of this trio, SureFit (Surface Reconstruction by Filtering and Intensity Transformations), is used primarily for cortical segmentation, volume visualization, surface generation, and the mapping of functional neuroimaging data onto surfaces. The second component, Caret (Computerized Anatomical Reconstruction and Editing Tool Kit), provides a wide range of surface visualization and analysis options as well as capabilities for surface flattening, surface-based deformation, and other surface manipulations. The third component, SuMS (Surface Management System), is a database and associated user interface for surface-related data. It provides for efficient insertion, searching, and extraction of surface and volume data from the database.

  14. Acoustic Emission Analysis Applet (AEAA) Software

    Science.gov (United States)

    Nichols, Charles T.; Roth, Don J.

    2013-01-01

    NASA Glenn Research Center and NASA White Sands Test Facility have developed software supporting an automated pressure vessel structural health monitoring (SHM) system based on acoustic emissions (AE). The software, referred to as the Acoustic Emission Analysis Applet (AEAA), provides analysts with a tool that can interrogate data collected on Digital Wave Corp. and Physical Acoustics Corp. software using a wide spectrum of powerful filters and charts. This software can be made to work with any data once the data format is known. The applet will compute basic AE statistics, and statistics as a function of time and pressure. AEAA provides value added beyond the analysis provided by the respective vendors' analysis software. The software can handle data sets of unlimited size. A wide variety of government and commercial applications could benefit from this technology, notably requalification and usage tests for compressed gas and hydrogen-fueled vehicles. Future enhancements will add features similar to a "check engine" light on a vehicle. Once installed, the system will ultimately be used to alert International Space Station crewmembers to critical structural instabilities, but will otherwise have little impact on missions. Diagnostic information could then be transmitted to experienced technicians on the ground in a timely manner to determine whether pressure vessels have been impacted, are structurally unsound, or can be safely used to complete the mission.

  15. Software safety analysis practice in installation phase

    Energy Technology Data Exchange (ETDEWEB)

    Huang, H. W.; Chen, M. H.; Shyu, S. S., E-mail: hwhwang@iner.gov.t [Institute of Nuclear Energy Research, No. 1000 Wenhua Road, Chiaan Village, Longtan Township, 32546 Taoyuan County, Taiwan (China)]

    2010-10-15

    This work performed a software safety analysis in the installation phase of the Lungmen nuclear power plant in Taiwan, under the cooperation of the Institute of Nuclear Energy Research and TPC. The US Nuclear Regulatory Commission requests licensees to perform software safety analysis and software verification and validation in each phase of the software development life cycle, per Branch Technical Position 7-14. In this work, 37 safety-grade digital instrumentation and control systems were analyzed by failure mode and effects analysis, as suggested by IEEE Standard 7-4.3.2-2003. During the installation phase, skew tests for the safety-grade network and point-to-point tests were performed. The failure mode and effects analysis showed that all the single failure modes can be resolved by redundant means. Most of the common mode failures can be resolved by operator manual actions. (Author)

  16. Software safety analysis practice in installation phase

    International Nuclear Information System (INIS)

    Huang, H. W.; Chen, M. H.; Shyu, S. S.

    2010-10-01

    This work performed a software safety analysis in the installation phase of the Lungmen nuclear power plant in Taiwan, under the cooperation of the Institute of Nuclear Energy Research and TPC. The US Nuclear Regulatory Commission requests licensees to perform software safety analysis and software verification and validation in each phase of the software development life cycle, per Branch Technical Position 7-14. In this work, 37 safety-grade digital instrumentation and control systems were analyzed by failure mode and effects analysis, as suggested by IEEE Standard 7-4.3.2-2003. During the installation phase, skew tests for the safety-grade network and point-to-point tests were performed. The failure mode and effects analysis showed that all the single failure modes can be resolved by redundant means. Most of the common mode failures can be resolved by operator manual actions. (Author)

  17. Measuring CMS Software Performance in the first years of LHC collisions

    CERN Document Server

    Benelli, Gabriele; Pfeiffer, Andreas; Piparo, Danilo; Zemleris, Vidmantas

    2011-01-01

    The CMSSW software framework is a complex project enabling the CMS collaboration to investigate the fast-growing LHC collision data sample. A software performance suite of tools has been developed and integrated in CMSSW to keep track of CPU time, memory footprint and event size on disk. These three metrics are key constraints in software development in order to meet the computing requirements used in the planning and management of the CMS computing infrastructure. The performance suite allows the measurement and tracking of the performance across the framework, publishing the results in a dedicated database. A web application makes the results easily accessible to software release managers, allowing for automatic integration in the CMSSW release cycle quality assurance. The performance suite is also available to individual developers for dedicated code optimization, and the web application allows historic regression and comparisons across releases. The performance suite tools and the performance of the CMSSW frame...

  18. Software development for teleroentgenogram analysis

    Science.gov (United States)

    Goshkoderov, A. A.; Khlebnikov, N. A.; Obabkov, I. N.; Serkov, K. V.; Gajniyarov, I. M.; Aliev, A. A.

    2017-09-01

    A framework for the analysis and calculation of teleroentgenograms was developed. Software development was carried out in the Department of Children's Dentistry and Orthodontics at Ural State Medical University. The software calculates the teleroentgenogram by an original method developed in this department. The program also allows users to design their own methods for calculating teleroentgenograms. It is planned to use machine learning (neural networks) in the software; this will make the calculation of teleroentgenograms easier, because methodological points will be placed automatically.

  19. Performance testing of LiDAR exploitation software

    Science.gov (United States)

    Varela-González, M.; González-Jorge, H.; Riveiro, B.; Arias, P.

    2013-04-01

    Mobile LiDAR systems are being used widely in recent years for many applications in the field of geoscience. One of the most important limitations of this technology is the large computational requirements involved in data processing. Several software solutions for data processing are available in the market, but users often do not know methodologies to verify their performance accurately. In this work a methodology for LiDAR software performance testing is presented and six different suites are studied: QT Modeler, AutoCAD Civil 3D, Mars 7, Fledermaus, Carlson and TopoDOT (all of them in x64). Results show that QT Modeler, TopoDOT and AutoCAD Civil 3D allow the loading of large datasets, while Fledermaus, Mars 7 and Carlson do not achieve this level of performance. AutoCAD Civil 3D needs long loading times in comparison with the most capable suites such as QT Modeler and TopoDOT. The Carlson suite shows the poorest results among all the software under study: point clouds larger than 5 million points cannot be loaded, and loading times are very long in comparison with the other suites, even for the smaller datasets. AutoCAD Civil 3D, Carlson and TopoDOT use more threads than the other suites, such as QT Modeler, Mars 7 and Fledermaus.

  20. A coherent environment of software improvement tools for CMS

    International Nuclear Information System (INIS)

    Eulisse, G.; Muzaffar, S.; Osborne, I.; Taylor, L.; Tuura, L.A.

    2004-01-01

    CMS has developed approximately one million lines of C++ code and uses many more from HEP, Grid and public domain projects. We describe a suite of tools which help to manage this complexity by measuring software dependencies, quality metrics, and CPU and memory performance. This coherent environment integrates and extends existing open-source tools where possible and provides new in-house components where a suitable solution does not already exist. This is a freely available environment with graphical user interface which can be run on any software without the need to recompile or instrument it. We have developed ignominy which performs software dependency analysis of source code, binary products and external software. CPU profiling is provided based on oprofile, with added features such as profile snapshots, distributed profiling and aggregate profiles for farm systems including server-side tools for collecting profile data. Finally, we have developed a low-overhead performance and memory profiling tool, MemProf, which can perform (gprof-style) hierarchical performance profiling, in a way that works with multiple threads and dynamically loaded libraries (unlike gprof). It also gathers exact memory allocation profiles including which code allocates most, in what sizes of chunks, for how long, where the memory is getting freed and where it is getting leaked. We describe this tool suite and how it has been used to enhance the quality of CMS software

  1. Numerical methods in software and analysis

    CERN Document Server

    Rice, John R

    1992-01-01

    Numerical Methods, Software, and Analysis, Second Edition introduces science and engineering students to the methods, tools, and ideas of numerical computation. Introductory courses in numerical methods face a fundamental problem: there is too little time to learn too much. This text solves that problem by using high-quality mathematical software. In fact, the objective of the text is to present scientific problem solving using standard mathematical software. This book discusses numerous programs and software packages focusing on the IMSL library (including the PROTRAN system) and ACM Algorithm

  2. FPGAs for software programmers

    CERN Document Server

    Hannig, Frank; Ziener, Daniel

    2016-01-01

    This book makes powerful Field Programmable Gate Array (FPGA) and reconfigurable technology accessible to software engineers by covering different state-of-the-art high-level synthesis approaches (e.g., OpenCL and several C-to-gates compilers). It introduces FPGA technology, its programming model, and how various applications can be implemented on FPGAs without going through low-level hardware design phases. Readers will get a realistic sense for problems that are suited for FPGAs and how to implement them from a software designer’s point of view. The authors demonstrate that FPGAs and their programming model reflect the needs of stream processing problems much better than traditional CPU or GPU architectures, making them well-suited for a wide variety of systems, from embedded systems performing sensor processing to large setups for Big Data number crunching. This book serves as an invaluable tool for software designers and FPGA design engineers who are interested in high design productivity through behavi...

  3. Software for Graph Analysis and Visualization

    Directory of Open Access Journals (Sweden)

    M. I. Kolomeychenko

    2014-01-01

    Full Text Available This paper describes software for graph storage, analysis and visualization. The article presents a comparative analysis of existing software for the analysis and visualization of graphs, and describes the overall architecture of the application and the basic principles of construction and operation of its main modules. Furthermore, a description of the developed graph store, oriented to the storage and processing of large-scale graphs, is presented. The developed community-detection algorithm and the implemented automatic graph-layout algorithms are the main functionality of the product. The main advantage of the developed software is high-speed processing of large networks (up to millions of nodes and links). Moreover, the proposed graph storage architecture is unique and has no analogues. The developed approaches and algorithms are optimized for operating on big graphs and have high productivity.
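
    As a point of reference for the two headline features, community detection and automatic layout, the sketch below demonstrates both on a toy graph with the networkx library. This is a generic illustration; the paper's own algorithms and graph store are not described in enough detail to reproduce:

        import networkx as nx
        from networkx.algorithms import community

        # Two dense clusters joined by a single bridge edge.
        G = nx.Graph([(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)])

        # Community detection by greedy modularity maximization.
        for i, c in enumerate(community.greedy_modularity_communities(G)):
            print(f"community {i}: {sorted(c)}")

        # A simple force-directed autolayout: 2D positions suitable for drawing.
        pos = nx.spring_layout(G, seed=42)
        print({n: (round(x, 2), round(y, 2)) for n, (x, y) in pos.items()})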

  4. The VI-Suite: a set of environmental analysis tools with geospatial data applications

    NARCIS (Netherlands)

    Southall, Ryan; Biljecki, F.

    2017-01-01

    Background: The VI-Suite is a free and open-source addon for the 3D content creation application Blender, developed primarily as a tool for the contextual and performative analysis of buildings. Its functionality has grown from simple, static lighting analysis to fully parametric lighting,

  5. GWAMA: software for genome-wide association meta-analysis

    Directory of Open Access Journals (Sweden)

    Mägi Reedik

    2010-05-01

    Full Text Available Abstract Background Despite the recent success of genome-wide association studies in identifying novel loci contributing effects to complex human traits, such as type 2 diabetes and obesity, much of the genetic component of variation in these phenotypes remains unexplained. One way to improve power to detect further novel loci is through meta-analysis of studies from the same population, increasing the sample size over any individual study. Although statistical analysis software packages incorporate routines for meta-analysis, they are ill equipped to meet the challenges of the scale and complexity of data generated in genome-wide association studies. Results We have developed flexible, open-source software for the meta-analysis of genome-wide association studies. The software incorporates a variety of error-trapping facilities and provides a range of meta-analysis summary statistics. The software is distributed with scripts that allow simple formatting of files containing the results of each association study and generate graphical summaries of genome-wide meta-analysis results. Conclusions The GWAMA (Genome-Wide Association Meta-Analysis) software has been developed to perform meta-analysis of summary statistics generated from genome-wide association studies of dichotomous phenotypes or quantitative traits. Software with source files, documentation and example data files are freely available online at http://www.well.ox.ac.uk/GWAMA.
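
    The standard computation behind such tools is an inverse-variance-weighted (fixed-effect) combination of per-study effect estimates. The sketch below uses fabricated summary statistics for a single variant; it is not GWAMA's code, which as described above adds error trapping and a wider range of summary statistics:

        import math

        def fixed_effect_meta(betas, ses):
            """Inverse-variance-weighted combination of per-study effects and standard errors."""
            weights = [1.0 / se ** 2 for se in ses]
            beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
            se = math.sqrt(1.0 / sum(weights))
            z = beta / se
            p = math.erfc(abs(z) / math.sqrt(2))     # two-sided p from the normal approximation
            return beta, se, z, p

        # Fabricated per-study summary statistics for one variant.
        betas = [0.12, 0.08, 0.15]
        ses = [0.05, 0.04, 0.07]
        print("beta=%.3f se=%.3f z=%.2f p=%.2e" % fixed_effect_meta(betas, ses))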

  6. pcircle - A Suite of Scalable Parallel File System Tools

    Energy Technology Data Exchange (ETDEWEB)

    2015-10-01

    Most file system software is written for conventional local file systems; it is serialized and cannot take advantage of a large-scale parallel file system. The "pcircle" software builds on top of the ubiquitous MPI in a cluster computing environment and a "work-stealing" pattern to provide a scalable, high-performance suite of file system tools. In particular, it implements parallel data copy and parallel data checksumming, with advanced features such as asynchronous progress reporting, checkpoint and restart, and integrity checking.
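
    The flavor of parallel checksumming can be conveyed in plain Python. The sketch below is not pcircle, which distributes work across nodes with MPI and work stealing; instead it splits a file into fixed-size chunks, digests them in a local process pool, and folds the per-chunk digests, ordered by offset, into one file-level signature:

        import hashlib
        import os
        from multiprocessing import Pool

        CHUNK = 1 << 20                              # 1 MiB chunks, digested independently

        def chunk_digest(args):
            """Digest one (path, offset) chunk; chunks may be processed in any order."""
            path, offset = args
            with open(path, "rb") as f:
                f.seek(offset)
                return offset, hashlib.sha1(f.read(CHUNK)).hexdigest()

        def parallel_checksum(path, workers=4):
            """Combine per-chunk digests, sorted by offset, into one file-level signature."""
            offsets = range(0, os.path.getsize(path), CHUNK)
            with Pool(workers) as pool:
                parts = pool.map(chunk_digest, [(path, off) for off in offsets])
            combined = hashlib.sha1()
            for _, digest in sorted(parts):
                combined.update(digest.encode())
            return combined.hexdigest()

        if __name__ == "__main__":
            with open("big.bin", "wb") as f:
                f.write(os.urandom(8 * CHUNK))       # synthetic 8 MiB test file
            print(parallel_checksum("big.bin"))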

  7. A new paradigm for the development of analysis software

    International Nuclear Information System (INIS)

    Kelly, D.; Harauz, J.

    2012-01-01

    For the CANDU industry, analysis software is an important tool for scientists and engineers to examine issues related to safety, operation, and design. However, the software quality assurance approach currently used for these tools assumes the software is the delivered product. In this paper, we present a model that shifts the emphasis from software being the end-product to software being support for the end-product, the science. We describe a novel software development paradigm that supports this shift and provides the groundwork for re-examining the quality assurance practices used for analysis software. (author)

  8. Adobe Creative Suite 6 Design and Web Premium all-in-one for dummies

    CERN Document Server

    Smith, Jennifer; Gerantabee, Fred

    2012-01-01

    The must-have book on the leading suite of software for graphic and web designers. Fully revised and updated, this hands-on resource offers a one-stop learning opportunity through eight mini-books, each dedicated to a product inside Adobe's Design & Web Premium Suite. The mini-books cover Adobe Creative Suite Basics, InDesign, Illustrator, Photoshop, Acrobat, Dreamweaver, Flash, and Fireworks. The book also covers new image enhancements to After Effects, 64-bit versions of Illustrator and Flash Professional, and a new tool, dubbed Helium, that will enable designers to create content using HTML5 and

  9. ANALYSIS OF DESIGN ELEMENTS IN SKI SUITS

    Directory of Open Access Journals (Sweden)

    Birsen Çileroğlu

    2014-06-01

    Full Text Available The popularity of ski sport in the 19th century necessitated a new perspective on protective skiing clothing against mountain climates and excessive cold. Winter clothing was the basis of ski attire during this period. By the beginning of the 20th century, lining cloth was used to minimize the wind effect. The difference between men's and women's ski attire of the time consisted of knee-length skirts worn over golf trousers. Subsequent to the First World War, skiing suit models were influenced by the period's uniforms, and producers reflected fashion trends in ski clothing. In conformance with the prevailing trends, ski trousers were designed and produced for women, thus reducing gender differences. Increases in ski tourism and the holding of the first Winter Olympics in 1924 resulted in variations in ski attire, development of design characteristics, growth in user numbers, and enlargement of production capacities. Designers emphasized the combined presence of elegance and practicality in their skiing attire collections. In the 1930s, ski suits influenced by pilots' uniforms included characteristics permitting freedom of motion, and the design elements exhibited changes in terms of style, material and aerodynamics. In time, ski attire showed varying design features distinguishing professionals from amateurs. While protective functionality was the primary consideration for amateurs, for professionals aerodynamic design was also a leading factor. Eventually, the increased differences in design characteristics were exhibited in ski suit collections, world-renowned brands were formed, and production and sales volumes rose significantly. During the 20th century, ski suits influenced by fashion trends acquired unique styles and reached a position of dominance impacting current fashion trends; apart from sports attire, they became a style determinant in the clothing of cold climates. Ski suits

  10. A proposal for performing software safety hazard analysis

    International Nuclear Information System (INIS)

    Lawrence, J.D.; Gallagher, J.M.

    1997-01-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper. The method concentrates on finding hazards during the early stages of the software life cycle, using an extension of HAZOP

  11. Software testing for evolutionary iterative rapid prototyping

    OpenAIRE

    Davis, Edward V., Jr.

    1990-01-01

    Approved for public release; distribution unlimited. Rapid prototyping is emerging as a promising software development paradigm. It provides a systematic and automatable means of developing a software system under circumstances where initial requirements are not well known or where requirements change frequently during development. To provide high software quality assurance requires sufficient software testing. The unique nature of evolutionary iterative prototyping is not well-suited for ...

  12. Software safety analysis application in installation phase

    International Nuclear Information System (INIS)

    Huang, H. W.; Yih, S.; Wang, L. H.; Liao, B. C.; Lin, J. M.; Kao, T. M.

    2010-01-01

    This work performed a software safety analysis (SSA) in the installation phase of the Lungmen nuclear power plant (LMNPP) in Taiwan, under the cooperation of INER and TPC. The US Nuclear Regulatory Commission (USNRC) requests licensees to perform software safety analysis (SSA) and software verification and validation (SV and V) in each phase of the software development life cycle, per Branch Technical Position (BTP) 7-14. In this work, 37 safety-grade digital instrumentation and control (I and C) systems were analyzed by Failure Mode and Effects Analysis (FMEA), as suggested by IEEE Standard 7-4.3.2-2003. During the installation phase, skew tests for the safety-grade network and point-to-point tests were performed. The FMEA showed that all single failure modes can be resolved by redundant means. Most of the common mode failures can be resolved by operator manual actions. (authors)

  13. Development of a New VLBI Data Analysis Software

    Science.gov (United States)

    Bolotin, Sergei; Gipson, John M.; MacMillan, Daniel S.

    2010-01-01

    We present an overview of new VLBI analysis software under development at NASA GSFC. The new software will replace CALC/SOLVE and many related utility programs. It will have the capabilities of the current system and will also incorporate new models and data analysis techniques. In this paper we give a conceptual overview of the new software and formulate its main goals. The software should be flexible and modular, to implement models and estimation techniques that currently exist or will appear in the future. On the other hand, it should be reliable and of production quality for processing standard VLBI sessions. It also needs to be capable of processing observations from a fully deployed network of VLBI2010 stations in a reasonable time. We describe the software development process and outline the software architecture.

  14. Comprehensive understandings of energy confinement in LHD plasmas through extensive application of the integrated transport analysis suite

    International Nuclear Information System (INIS)

    Yokoyama, M.; Seki, R.; Suzuki, C.; Ida, K.; Osakabe, M.; Satake, S.; Yamada, H.; Murakami, S.

    2014-10-01

    The integrated transport analysis suite, TASK3D-a, has enhanced energy transport analyses in LHD. It has clearly elucidated (1) the systematic dependence of ion and electron energy confinement on a wide variation of plasma parameters, and (2) statistically derived fitting expressions for the ion and electron heat diffusivities (χ_i and χ_e) separately, also taking radial-profile information into account. In particular, the latter approach can outstrip the conventional scaling laws for the global confinement time (τ_E) by taking profiles (temperature, density, heating depositions, etc.) into consideration. This has been made possible with the analysis database accumulated by the extensive application of the integrated transport analysis suite to experiment data. In this proceeding, the TASK3D-a analysis database for high-ion-temperature (high-T_i) plasmas in LHD (Large Helical Device) is exemplified. This approach should be applicable to any other combination of integrated transport analysis suites and fusion experiments. (author)
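
    A statistically derived fitting expression of the kind described above can be obtained by a log-linear least-squares fit of a power law; the sketch below uses synthetic data and illustrative coefficients, not LHD results.

        import numpy as np

        rng = np.random.default_rng(1)
        # Synthetic density n and temperature T samples with a known power law chi = C * n**a * T**b.
        n = rng.uniform(1.0, 5.0, 200)
        T = rng.uniform(1.0, 8.0, 200)
        chi = 0.3 * n**-0.5 * T**1.2 * rng.lognormal(sigma=0.05, size=200)

        # Taking logs turns the power law into a linear model solvable by least squares.
        A = np.column_stack([np.ones_like(n), np.log(n), np.log(T)])
        coef, *_ = np.linalg.lstsq(A, np.log(chi), rcond=None)
        print('C, a, b =', np.exp(coef[0]), coef[1], coef[2])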

  15. Gendermetrics.NET: a novel software for analyzing the gender representation in scientific authoring.

    Science.gov (United States)

    Bendels, Michael H K; Brüggmann, Dörthe; Schöffel, Norman; Groneberg, David A

    2016-01-01

    Imbalances in female career promotion are believed to be strong in the field of academic science. A primary parameter for analyzing gender inequalities is gender authorship in scientific publications. Since the presently available data on gender distribution is largely limited to underpowered studies, we here develop a new approach to analyze authors' genders in large bibliometric databases. A SQL-Server-based multiuser software suite was developed that serves as an integrative tool for analyzing bibliometric data with a special emphasis on gender and topographical analysis. The presented system allows seamless integration, inspection, modification, evaluation and visualization of bibliometric data. By providing an adaptive and almost fully automatic integration and analysis process, the inter-individual variability of analysis is kept at a low level. Depending on the scientific question, the system enables the user to perform a scientometric analysis, including its visualization, within a short period of time. In summary, a new software suite for analyzing gender representations in scientific articles was established. The system is suitable for the comparative analysis of scientific structures on the level of continents, countries, cities, city regions, institutions, research fields and journals.
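
    The flavor of such gender-by-region aggregation can be sketched with an in-memory database; the schema and rows below are invented for illustration and do not reflect the actual SQL Server design.

        import sqlite3

        con = sqlite3.connect(':memory:')
        con.executescript("""
            CREATE TABLE authorships (paper_id INTEGER, gender TEXT, country TEXT);
            INSERT INTO authorships VALUES
                (1, 'f', 'DE'), (1, 'm', 'DE'), (2, 'm', 'US'), (3, 'f', 'US'), (3, 'f', 'US');
        """)
        # Share of female authorships per country (SQLite evaluates the comparison as 0/1).
        for country, share in con.execute(
                "SELECT country, AVG(gender = 'f') FROM authorships GROUP BY country"):
            print(country, round(share, 2))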

  16. Long-term preservation of analysis software environment

    International Nuclear Information System (INIS)

    Toppe Larsen, Dag; Blomer, Jakob; Buncic, Predrag; Charalampidis, Ioannis; Haratyunyan, Artem

    2012-01-01

    Long-term preservation of scientific data represents a challenge to experiments, especially regarding the analysis software. Preserving data is not enough; the full software and hardware environment is needed. Virtual machines (VMs) make it possible to preserve hardware "in software". A complete infrastructure package has been developed for easy deployment and management of VMs, based on the CERN virtual machine (CernVM). Further, an HTTP-based file system, the CernVM file system (CVMFS), is used for the distribution of the software. It is possible to process data with any given software version and a matching, regenerated VM version. A point-and-click web user interface is being developed for setting up the complete processing chain, including VM and software versions, number and type of processing nodes, and the particular type of analysis and data. This paradigm also allows for distributed cloud computing on private and public clouds, for both legacy and contemporary experiments.

  17. PIV/HPIV Film Analysis Software Package

    Science.gov (United States)

    Blackshire, James L.

    1997-01-01

    A PIV/HPIV film analysis software system was developed that calculates the 2-dimensional spatial autocorrelations of subregions of Particle Image Velocimetry (PIV) or Holographic Particle Image Velocimetry (HPIV) film recordings. The software controls three hardware subsystems: (1) a Kodak Megaplus 1.4 camera and EPIX 4MEG framegrabber subsystem, (2) an IEEE/Unidex 11 precision motion control subsystem, and (3) an Alacron I860 array processor subsystem. The software runs on an IBM PC/AT host computer running either the Microsoft Windows 3.1 or Windows 95 operating system. It is capable of processing five PIV or HPIV displacement vectors per second, and is completely automated except for user input to a configuration file, prior to analysis execution, to update various system parameters.
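
    The central operation, a 2D spatial autocorrelation of an image subregion, is commonly computed via the Wiener-Khinchin theorem; the NumPy sketch below illustrates that approach, whereas the original system ran on dedicated array-processor hardware.

        import numpy as np

        def autocorrelation_2d(subregion):
            # Wiener-Khinchin: the autocorrelation is the inverse FFT of the power spectrum.
            f = np.fft.fft2(subregion - subregion.mean())
            acf = np.fft.ifft2(np.abs(f) ** 2).real
            acf /= acf[0, 0]                # normalise the zero-lag peak to 1
            return np.fft.fftshift(acf)     # move zero lag to the array centre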

  18. [Pitfalls in informed consent: a statistical analysis of malpractice law suits].

    Science.gov (United States)

    Echigo, Junko

    2014-05-01

    In medical malpractice law suits, the notion of informed consent is often relevant in assessing whether negligence can be attributed to the medical practitioner who has caused injury to a patient. Furthermore, it is not rare that courts award damages for a lack of appropriate informed consent alone. In this study, two results were arrived at from a statistical analysis of medical malpractice law suits. One, unexpectedly, was that the severity of a patient's illness made no significant difference to whether damages were awarded. The other was that cases of typical medical treatment that national medical insurance does not cover were involved significantly more often than insured treatment cases. In cases where damages were awarded, the courts required more disclosure and written documents of information by medical practitioners, especially about complications and adverse effects that the patient might suffer.

  19. Software Quality Control at Belle II

    Science.gov (United States)

    Ritter, M.; Kuhr, T.; Hauth, T.; Gebard, T.; Kristof, M.; Pulvermacher, C.; Belle Software Group, II

    2017-10-01

    Over the last seven years the software stack of the next-generation B factory experiment Belle II has grown to over one million lines of C++ and Python code, counting only the part included in offline software releases. There are several thousand commits to the central repository by about 100 individual developers per year. Keeping a coherent software stack of high quality, so that it can be sustained and used efficiently for data acquisition, simulation, reconstruction, and analysis over the lifetime of the Belle II experiment, is a challenge. A set of tools is employed to monitor the quality of the software and provide fast feedback to the developers. They are integrated in a machinery that is controlled by a buildbot master and automates the quality checks. The tools include different compilers, cppcheck, the clang static analyzer, valgrind memcheck, doxygen, a geometry overlap checker, a check for missing or extra library links, unit tests, steering file level tests, a sophisticated high-level validation suite, and an issue tracker. The technological development infrastructure is complemented by organizational means to coordinate the development.

  20. Inertial motion capture system for biomechanical analysis in pressure suits

    Science.gov (United States)

    Di Capua, Massimiliano

    A non-invasive system has been developed at the University of Maryland Space Systems Laboratory with the goal of providing a new capability for quantifying the motion of the human inside a space suit. Based on an array of six microprocessors and eighteen microelectromechanical (MEMS) inertial measurement units (IMUs), the Body Pose Measurement System (BPMS) allows the monitoring of the kinematics of the suit occupant in an unobtrusive, self-contained, lightweight and compact fashion, without requiring any external equipment such as that necessary with modern optical motion capture systems. BPMS measures and stores the accelerations, angular rates and magnetic fields acting upon each IMU, which are mounted on the head, torso, and each segment of each limb. In order to convert the raw data into a more useful form, such as a set of body segment angles quantifying pose and motion, a series of geometrical models and a non-linear complementary filter were implemented. The first portion of this work focuses on assessing system performance, which was measured by comparing the BPMS filtered data against rigid-body angles measured through an external VICON optical motion capture system. This type of system is the industry standard and is used here for independent measurement of body pose angles. By comparing the two sets of data, performance metrics such as BPMS system operational conditions, accuracy, and drift were evaluated and correlated against VICON data. After the system and models were verified and their capabilities and limitations assessed, a series of pressure suit evaluations were conducted. Three different pressure suits were used to identify the relationship between usable range of motion and internal suit pressure. In addition to addressing range of motion, a series of exploration tasks were also performed, recorded, and analysed in order to identify different motion patterns and trajectories as suit pressure is increased and overall suit mobility is reduced.
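
    The sensor-fusion step can be conveyed with a basic linear complementary filter for a single angle; the BPMS filter is non-linear and works on full 3D orientations, so this is only a simplified sketch with hypothetical inputs.

        import numpy as np

        def complementary_filter(gyro_rate, accel_angle, dt=0.01, alpha=0.98):
            # Fuse integrated gyro rate (accurate short-term, drifts long-term) with
            # accelerometer tilt (noisy short-term, stable long-term).
            angle = accel_angle[0]
            out = []
            for rate, acc in zip(gyro_rate, accel_angle):
                angle = alpha * (angle + rate * dt) + (1.0 - alpha) * acc
                out.append(angle)
            return np.array(out)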

  1. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Science.gov (United States)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool proved to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation, also provided with the ETA Tool software release package, that were used to generate the reports presented in the manual.

  2. MEASUREMENT PROCESS OF SOFTWARE DEVELOPMENT PROJECTS FOR SUPPORTING STRATEGIC BUSINESS OBJECTIVES IN SOFTWARE DEVELOPING COMPANIES

    Directory of Open Access Journals (Sweden)

    Sandra Lais Pedroso

    2013-08-01

    Full Text Available Software developing companies work in a competitive market and are often challenged to make business decisions with impact on competitiveness. Models assessing the maturity of software development process quality, such as CMMI and MPS-BR, comprise process measurement systems (PMS). However, these models are not necessarily suitable to support business decisions, nor to achieve strategic goals. The objective of this work is to analyze how the PMS of software development projects could support business strategies for software developing companies. Results from this work show that PMS results from maturity models for software processes can be used to help evaluate operating capabilities and support strategic business decisions.

  3. A development methodology for scientific software

    International Nuclear Information System (INIS)

    Cort, G.; Barrus, D.M.; Goldstone, J.A.; Miller, L.; Nelson, R.O.; Poore, R.V.

    1985-01-01

    We present the details of a software development methodology that addresses all phases of the software life cycle, yet is well suited for application by small projects with limited resources. The methodology has been developed at the Los Alamos Weapons Neutron Research (WNR) Facility and was utilized during the recent development of the WNR Data Acquisition Command Language. The methodology emphasizes the development and maintenance of comprehensive documentation for all software components. The impact of the methodology upon software quality and programmer productivity is assessed

  4. Software Architectures – Present and Visions

    Directory of Open Access Journals (Sweden)

    Catalin STRIMBEI

    2015-01-01

    Full Text Available Nowadays, software architectures are increasingly important because they can determine the success of the entire system. In this article we rigorously analyze the most common types of system architectures and present a personal opinion about the specifics of university architectures. After analyzing monolithic architectures, SOA architectures, and microservice-based architectures, we present specific issues and criteria for university software systems. Each type of architecture is reviewed and analyzed against specific academic challenges. In the analysis, we took into account the factors that determine the success of each architecture as well as the common causes of failure. At the end of the article, we objectively decide which architecture is best suited to be implemented in the university area.

  5. SUIT

    DEFF Research Database (Denmark)

    Algreen-Ussing, Gregers; Wedebrunn, Ola

    2003-01-01

    Leaflet on the SUIT project published by the European Commission. The publication briefly explains the results of the SUIT project: cultural values in environmental matters, and the assessment of projects and their impact on the environment.

  6. Development of output user interface software to support analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wahanani, Nursinta Adi, E-mail: sintaadi@batan.go.id; Natsir, Khairina, E-mail: sintaadi@batan.go.id; Hartini, Entin, E-mail: sintaadi@batan.go.id [Center for Development of Nuclear Informatics - National Nuclear Energy Agency, PUSPIPTEK, Serpong, Tangerang, Banten (Indonesia)

    2014-09-30

    Data processing software packages such as VSOP and MCNPX have been scientifically proven and are complete. The results of VSOP and MCNPX are huge and complex text files. In the analysis process, users need additional processing, such as Microsoft Excel, to produce informative results. This research develops user interface software for the output of VSOP and MCNPX. The VSOP program output is used to support neutronic analysis and the MCNPX program output is used to support burn-up analysis. Software development used iterative development methods, which allow for revision and addition of features according to user needs. Processing time with this software is 500 times faster than with conventional methods using Microsoft Excel. PYTHON is used as the programming language, because Python is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac, Amiga, among others. Values that support neutronic analysis are k-eff, burn-up, and the masses of Pu-239 and Pu-241. Burn-up analysis used the mass inventory values of actinides (thorium, plutonium, neptunium and uranium). Values are visualized in graphical form to support analysis.
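
    A post-processing step of this kind reduces to parsing the text output and plotting the extracted values; in the sketch below, the file name and the line format assumed by the regular expression are illustrative, since the actual VSOP output layout is not given here.

        import re
        import matplotlib.pyplot as plt

        pairs = []
        with open('vsop_output.txt') as f:     # hypothetical output file
            for line in f:
                # Hypothetical record format: "... BURNUP = 1.2E+04 ... K-EFF = 1.0321 ..."
                m = re.search(r'BURNUP\s*=\s*([\d.Ee+-]+).+K-EFF\s*=\s*([\d.Ee+-]+)', line)
                if m:
                    pairs.append((float(m.group(1)), float(m.group(2))))

        burnup, keff = zip(*pairs)
        plt.plot(burnup, keff, marker='o')
        plt.xlabel('Burn-up (MWd/t)')
        plt.ylabel('k-eff')
        plt.show()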

  7. Development of output user interface software to support analysis

    International Nuclear Information System (INIS)

    Wahanani, Nursinta Adi; Natsir, Khairina; Hartini, Entin

    2014-01-01

    Data processing software packages such as VSOP and MCNPX have been scientifically proven and are complete. The results of VSOP and MCNPX are huge and complex text files. In the analysis process, users need additional processing, such as Microsoft Excel, to produce informative results. This research develops user interface software for the output of VSOP and MCNPX. The VSOP program output is used to support neutronic analysis and the MCNPX program output is used to support burn-up analysis. Software development used iterative development methods, which allow for revision and addition of features according to user needs. Processing time with this software is 500 times faster than with conventional methods using Microsoft Excel. PYTHON is used as the programming language, because Python is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac, Amiga, among others. Values that support neutronic analysis are k-eff, burn-up, and the masses of Pu-239 and Pu-241. Burn-up analysis used the mass inventory values of actinides (thorium, plutonium, neptunium and uranium). Values are visualized in graphical form to support analysis.

  8. Space Suit Joint Torque Testing

    Science.gov (United States)

    Valish, Dana J.

    2011-01-01

    In 2009 and early 2010, a test was performed to quantify the torque required to manipulate joints in several existing operational and prototype space suits, in an effort to develop joint torque requirements appropriate for a new Constellation Program space suit system. The same test method was levied on the Constellation space suit contractors to verify that their suit designs meet the requirements. However, because the original test was set up and conducted by a single test operator, there was some question as to whether this method was repeatable enough to be considered a standard verification method for Constellation or other future space suits. In order to validate the method itself, a representative subset of the previous test was repeated, using the same information that would be available to space suit contractors, but set up and conducted by someone not familiar with the previous test. The resultant data were compared using graphical and statistical analysis, and a variance in torque values for some of the tested joints was apparent. Potential variables that could have affected the data were identified and re-testing was conducted in an attempt to eliminate these variables. The results of the retest will be used to determine whether further testing and modification are necessary before the method can be validated.

  9. Model-driven design of simulation support for the TERRA robot software tool suite

    NARCIS (Netherlands)

    Lu, Zhou; Bezemer, M.M.; Broenink, Johannes F.

    2015-01-01

    Model-Driven Development (MDD) – based on the concepts of model, meta-model and model transformation – is an approach to develop predictable and reliable software for Cyber-Physical Systems (CPS). The work presented here concerns a methodology to design simulation software based on MDD techniques,

  10. Improving Software Systems By Flow Control Analysis

    Directory of Open Access Journals (Sweden)

    Piotr Poznanski

    2012-01-01

    Full Text Available Using agile methods during the implementation of a system that meets mission-critical requirements can be a real challenge. Change in a system built of dozens or even hundreds of specialized devices with embedded software requires the cooperation of a large group of engineers. This article presents a solution that supports the parallel work of groups of system analysts and software developers. Applying formal rules to the requirements, written in natural language, enables formal analysis of artifacts that bridge software and system requirements. The formalism and textual form of the requirements allowed the automatic generation of a message flow graph for the (sub)system, called the "big-picture model". Flow diagram analysis helped to avoid a large number of defects whose repair cost, in extreme cases, could undermine the legitimacy of agile methods in projects of this scale. Retrospectively, a reduction of technical debt was observed. Continuous analysis of the "big-picture model" improves control of the quality parameters of the software architecture. The article also tries to explain why a commercial platform based on the UML modeling language may not be sufficient in projects of this complexity.

  11. Time Management in the Operating Room: An Analysis of the Dedicated Minimally Invasive Surgery Suite

    Science.gov (United States)

    Hsiao, Kenneth C.; Machaidze, Zurab

    2004-01-01

    Background: Dedicated minimally invasive surgery suites are available that contain specialized equipment to facilitate endoscopic surgery. Laparoscopy performed in a general operating room is hampered by the multitude of additional equipment that must be transported into the room. The objective of this study was to compare the preparation times between procedures performed in traditional operating rooms versus dedicated minimally invasive surgery suites to see whether operating room efficiency is improved in the specialized room. Methods: The records of 50 patients who underwent laparoscopic procedures between September 2000 and April 2002 were retrospectively reviewed. Twenty-three patients underwent surgery in a general operating room and 18 patients in a minimally invasive surgery suite. Nine patients were excluded because of cystoscopic procedures undergone prior to laparoscopy. Various time points were recorded, from which time intervals were derived, such as preanesthesia time, anesthesia induction time, and total preparation time. A 2-tailed, unpaired Student t test was used for statistical analysis. Results: The mean preanesthesia time was significantly faster in the minimally invasive surgery suite (12.2 minutes) compared with that in the traditional operating room (17.8 minutes) (P=0.013). Mean anesthesia induction time in the minimally invasive surgery suite (47.5 minutes) was similar to the time in the traditional operating room (45.7 minutes) (P=0.734). The average total preparation time for the minimally invasive surgery suite (59.6 minutes) was not significantly faster than that in the general operating room (63.5 minutes) (P=0.481). Conclusion: The amount of time that elapses between the patient entering the room and anesthesia induction is statistically shorter in a dedicated minimally invasive surgery suite. Laparoscopic surgery is performed more efficiently in a dedicated minimally invasive surgery suite versus a traditional operating room.
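
    The comparison uses a standard 2-tailed, unpaired Student t test, which can be reproduced in a few lines; the timing samples below are illustrative, not the study's raw data.

        from scipy import stats

        # Illustrative preanesthesia times in minutes (invented values).
        mis_suite  = [11.5, 12.0, 13.1, 12.8, 11.9]
        general_or = [18.2, 17.1, 16.9, 18.5, 17.4]

        t, p = stats.ttest_ind(mis_suite, general_or)   # 2-tailed, unpaired by default
        print(f"t = {t:.2f}, p = {p:.4f}")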

  12. The CMS software performance at the start of data taking

    CERN Document Server

    Benelli, Gabriele

    2009-01-01

    The CMS software framework (CMSSW) is a complex project evolving very rapidly as the first LHC colliding beams approach. The computing requirements constrain performance in terms of CPU time, memory footprint and event size on disk, to allow for planning and managing the computing infrastructure necessary to handle the needs of the experiment. A suite of performance tools has been developed to track all aspects of code performance through the software release cycles, allowing for regression detection and guiding code development for optimization. In this talk, we describe the CMSSW performance suite tools used and present some sample performance results from the release integration process for the CMS software.

  13. A Flexible Method for Producing F.E.M. Analysis of Bone Using Open-Source Software

    Science.gov (United States)

    Boppana, Abhishektha; Sefcik, Ryan; Meyers, Jerry G.; Lewandowski, Beth E.

    2016-01-01

    This project, performed in support of the NASA GRC Space Academy summer program, sought to develop an open-source workflow methodology that segmented medical image data, created a 3D model from the segmented data, and prepared the model for finite-element analysis. In an initial step, a technological survey evaluated the performance of various existing open-source software packages that claim to perform these tasks. The survey concluded, however, that no single package exhibited the wide array of functionality required for the potential NASA application in the area of bone, muscle and biofluidic studies. As a result, a series of Python scripts was developed to bridge the shortcomings of the available open-source tools. The VTK library provided the quickest and most effective means of segmenting regions of interest from the medical images; it allowed for the export of a 3D model by using the marching cubes algorithm to build a surface mesh. Developing the model domain from this extracted information required the surface mesh to be processed in the open-source software packages Blender and Gmsh. The Preview program of the FEBio suite proved sufficient for volume-filling the model with an unstructured mesh and preparing boundary specifications for finite element analysis. To fully enable FEM modeling, an in-house-developed Python script allowed assignment of material properties on an element-by-element basis by performing a weighted interpolation of the voxel intensity of the parent medical image, correlated to published relations between image intensity and material properties, such as ash density. A graphical user interface combined the Python scripts and other software into a user-friendly interface. The work using Python scripts provides a potential alternative to expensive commercial software and inadequate, limited open-source freeware programs for the creation of 3D computational models. More work
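
    The segmentation-to-surface step can be sketched with the VTK marching cubes pipeline; the reader class, file names, and iso-value below are assumptions, and the project's actual scripts are more elaborate.

        import vtk

        reader = vtk.vtkNIFTIImageReader()        # assumed input format (NIfTI volume)
        reader.SetFileName('ct_volume.nii')

        mc = vtk.vtkMarchingCubes()               # build a surface mesh at an iso-value
        mc.SetInputConnection(reader.GetOutputPort())
        mc.SetValue(0, 300)                       # illustrative threshold for bone

        writer = vtk.vtkSTLWriter()               # export the mesh for Blender/Gmsh
        writer.SetInputConnection(mc.GetOutputPort())
        writer.SetFileName('bone_surface.stl')
        writer.Write()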

  14. SECIMTools: a suite of metabolomics data analysis tools.

    Science.gov (United States)

    Kirpich, Alexander S; Ibarra, Miguel; Moskalenko, Oleksandr; Fear, Justin M; Gerken, Joseph; Mi, Xinlei; Ashrafi, Ali; Morse, Alison M; McIntyre, Lauren M

    2018-04-20

    Metabolomics has the promise to transform the area of personalized medicine with the rapid development of high-throughput technology for untargeted analysis of metabolites. Open-access, easy-to-use analytic tools that are broadly accessible to the biological community need to be developed. While the technology used in metabolomics varies, most metabolomics studies have a set of identified features. Galaxy is an open-access platform that enables scientists at all levels to interact with big data. Galaxy promotes reproducibility by saving histories and enabling the sharing of workflows among scientists. SECIMTools (SouthEast Center for Integrated Metabolomics) is a set of Python applications that are available both as standalone tools and wrapped for use in Galaxy. The suite includes a comprehensive set of quality control metrics (retention time window evaluation and various peak evaluation tools), visualization techniques (hierarchical cluster heatmap, principal component analysis, modular modularity clustering), basic statistical analysis methods (partial least squares discriminant analysis, analysis of variance, t-test, Kruskal-Wallis non-parametric test), advanced classification methods (random forest, support vector machines), and advanced variable selection tools (least absolute shrinkage and selection operator (LASSO) and Elastic Net). SECIMTools leverages the Galaxy platform and enables integrated workflows for metabolomics data analysis made from building blocks designed for easy use and interpretability. Standard data formats and a set of utilities allow arbitrary linkages between tools to encourage novel workflow designs. The Galaxy framework enables future data integration for metabolomics studies with other omics data.
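
    One of the building blocks the suite wraps for Galaxy, principal component analysis of a samples-by-features table, can be sketched as follows; the data are synthetic and the scikit-learn call is a stand-in, not SECIMTools code.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        X = rng.normal(size=(20, 100))            # 20 samples x 100 metabolite features

        scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
        print(scores[:3])                         # first three samples in PC space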

  15. Software Process Improvement Using Force Field Analysis ...

    African Journals Online (AJOL)

    An improvement plan is then drawn and implemented. This paper studied the state of Nigerian software development organizations based on selected attributes. Force field analysis is used to partition the factors obtained into driving and restraining forces. An attempt was made to improve the software development process ...

  16. Development of a fatigue analysis software system

    International Nuclear Information System (INIS)

    Choi, B. I.; Lee, H. J.; Han, S. W.; Kim, J. Y.; Hwang, K. H.; Kang, J. Y.

    2001-01-01

    A general-purpose fatigue analysis software package to predict the fatigue lives of mechanical components and structures was developed. The software has some characteristic features, including functions for searching weak regions on the free surface in order to reduce computing time significantly, a database of fatigue properties for various materials, and an expert system that can assist users in obtaining more appropriate results. The software can be used in an environment consisting of commercial finite element packages. Using the software, fatigue analyses of a SAE keyhole specimen and an automobile knuckle were carried out. The results were observed to agree well with those from commercial packages.
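
    As an illustration of the stress-life computation such software automates, the sketch below applies Basquin's relation; the material constants are illustrative, and the paper does not state which life model it uses.

        def cycles_to_failure(sigma_a, sigma_f=900.0, b=-0.095):
            # Basquin: sigma_a = sigma_f * (2N)**b  =>  N = 0.5 * (sigma_a / sigma_f)**(1 / b)
            return 0.5 * (sigma_a / sigma_f) ** (1.0 / b)

        print(f"{cycles_to_failure(400.0):.3e} cycles at 400 MPa amplitude")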

  17. Z-2 Suit Support Stand and MKIII Suit Center of Gravity Test

    Science.gov (United States)

    Nguyen, Tuan Q.

    2014-01-01

    NASA's next-generation spacesuits are the Z-Series suits, made for a range of possible exploration missions in the near future. The prototype Z-1 suit has been developed and assembled to incorporate new technologies that have never been utilized before, in the Apollo suits or the Extravehicular Mobility Unit (EMU). NASA engineers tested the Z-1 suit extensively in order to develop design requirements for the new Z-2 suit. At the end of 2014, NASA will receive the new Z-2 suit to perform more testing and to further develop the suit's new technologies. To support this, a suit support stand will be designed and fabricated to support the Z-2 suit during maintenance, sizing, and structural leakage testing. The Z-2 Suit Support Stand (Z2SSS) will be utilized for these purposes in the early testing stages of the Z-2 suit.

  18. Elementary study on γ analysis software for low level measurement

    International Nuclear Information System (INIS)

    Ruan Guanglin; Huang Xianguo; Xing Shixiong

    2001-01-01

    The difficulty of using popular γ analysis software for low-level measurement is discussed. The ROI report file of the ORTEC operation system has been chosen as the interface file for writing γ analysis software for low-level measurement. The authors give the software flowchart and an applied example, and discuss the existing problems.

  19. Continuous software quality analysis for the ATLAS experiment

    CERN Document Server

    Washbrook, Andrew; The ATLAS collaboration

    2017-01-01

    The software for the ATLAS experiment on the Large Hadron Collider at CERN has evolved over many years to meet the demands of Monte Carlo simulation, particle detector reconstruction and data analysis. At present over 3.8 million lines of C++ code (and close to 6 million total lines of code) are maintained by an active worldwide developer community. In order to run the experiment software efficiently at hundreds of computing centres it is essential to maintain high software quality standards. Methods are proposed to improve software quality practices by incorporating checks into the new ATLAS software build infrastructure.

  20. Software hazard analysis for nuclear digital protection system by Colored Petri Net

    International Nuclear Information System (INIS)

    Bai, Tao; Chen, Wei-Hua; Liu, Zhen; Gao, Feng

    2017-01-01

    Highlights: • A dynamic hazard analysis method is proposed for safety-critical software. • The mechanism relies on Colored Petri Nets. • Complex interactions between software and hardware are captured properly. • Common failure modes in software are identified effectively. Abstract: The software safety of a nuclear digital protection system is critical for the safety of nuclear power plants, as any software defect may result in severe damage. In order to ensure the safety and reliability of safety-critical digital system products and their applications, software hazard analysis is required to be performed during the software development lifecycle. A dynamic software hazard modeling and analysis method based on Colored Petri Nets is proposed and applied to the safety-critical control software of a nuclear digital protection system in this paper. The analysis results show that the proposed method can explain the complex interactions between software and hardware and identify the potential common cause failures in software properly and effectively. Moreover, the method can find the dominant software-induced hazards to safety control actions, which aids in increasing software quality.
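
    The colored-Petri-net idea of guarded transitions over colored tokens can be conveyed with a toy token game; this fragment is purely illustrative and far simpler than the paper's models.

        # Places hold colored tokens; a transition fires only if its guard matches.
        places = {'cmd_in': [('trip', 'channel_A')], 'actuator': []}

        def fire_trip(places):
            for token in list(places['cmd_in']):
                if token[0] == 'trip':                  # guard on the token color
                    places['cmd_in'].remove(token)
                    places['actuator'].append(token)    # safety action reaches the actuator
                    return True
            return False                                # transition not enabled

        print(fire_trip(places), places)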

  1. Failure mode and effects analysis of software-based automation systems

    International Nuclear Information System (INIS)

    Haapanen, P.; Helminen, A.

    2002-08-01

    Failure mode and effects analysis (FMEA) is one of the well-known analysis methods with an established position in traditional reliability analysis. The purpose of FMEA is to identify possible failure modes of the system components, evaluate their influences on system behaviour and propose proper countermeasures to suppress these effects. The generic nature of FMEA has enabled its wide use in various branches of industry, reaching from business management to the design of spaceships. The popularity and diverse use of the analysis method have led to multiple interpretations, practices and standards presenting the same analysis method. FMEA is well understood at the systems and hardware levels, where the potential failure modes usually are known and the task is to analyse their effects on system behaviour. Nowadays, more and more system functions are realised at the software level, which has created the need to apply the FMEA methodology also to software-based systems. Software failure modes generally are unknown - 'software modules do not fail, they only display incorrect behaviour' - and depend on the dynamic behaviour of the application. These facts set special requirements on the FMEA of software-based systems and make it difficult to realise. In this report the failure mode and effects analysis is studied for use in the reliability analysis of software-based systems. More precisely, the target system of FMEA is defined to be a safety-critical software-based automation application in a nuclear power plant, implemented on an industrial automation system platform. Through a literature study the report tries to clarify the questions related to the practical use of software failure mode and effects analysis. The study is a part of the research project 'Programmable Automation System Safety Integrity assessment (PASSI)', belonging to the Finnish Nuclear Safety Research Programme (FINNUS, 1999-2002). In the project various safety assessment methods and tools for

  2. Automated Software Vulnerability Analysis

    Science.gov (United States)

    Sezer, Emre C.; Kil, Chongkyung; Ning, Peng

    Despite decades of research, software continues to have vulnerabilities. Successful exploitation of these vulnerabilities by attackers costs millions of dollars to businesses and individuals. Unfortunately, the most effective defensive measures, such as patching and intrusion prevention systems, require an intimate knowledge of the vulnerabilities. Many systems for detecting attacks have been proposed. However, the analysis of the exploited vulnerabilities is left to security experts and programmers. Both the human effort involved and the slow analysis process are unfavorable for timely defensive measures to be deployed. The problem is exacerbated by zero-day attacks.

  3. Error-Free Software

    Science.gov (United States)

    1989-01-01

    001 is an integrated tool suite for automatically developing ultra-reliable models, simulations and software systems. Developed and marketed by Hamilton Technologies, Inc. (HTI), it has been applied in engineering, manufacturing, banking and software tools development. The software provides the ability to simplify the complex. A system developed with 001 can be a prototype or fully developed with production-quality code. It is free of interface errors, consistent, logically complete and has no data or control flow errors. Systems can be designed, developed and maintained with maximum productivity. Margaret Hamilton, President of Hamilton Technologies, also directed the research and development of USE.IT, an earlier product which was the first computer-aided software engineering product in the industry to concentrate on automatically supporting the development of an ultra-reliable system throughout its life cycle. Both products originated in NASA technology developed under a Johnson Space Center contract.

  4. IFDOTMETER: A New Software Application for Automated Immunofluorescence Analysis.

    Science.gov (United States)

    Rodríguez-Arribas, Mario; Pizarro-Estrella, Elisa; Gómez-Sánchez, Rubén; Yakhine-Diop, S M S; Gragera-Hidalgo, Antonio; Cristo, Alejandro; Bravo-San Pedro, Jose M; González-Polo, Rosa A; Fuentes, José M

    2016-04-01

    Most laboratories interested in autophagy use different imaging software for managing and analyzing heterogeneous parameters in immunofluorescence experiments (e.g., LC3-puncta quantification and determination of the number and size of lysosomes). One solution would be software that works on a user's laptop or workstation that can access all image settings and provide quick and easy-to-use analysis of data. Thus, we have designed and implemented an application called IFDOTMETER, which can run on all major operating systems because it has been programmed using JAVA (Sun Microsystems). Briefly, IFDOTMETER software has been created to quantify a variety of biological hallmarks, including mitochondrial morphology and nuclear condensation. The program interface is intuitive and user-friendly, making it useful for users not familiar with computer handling. By setting previously defined parameters, the software can automatically analyze a large number of images without the supervision of the researcher. Once analysis is complete, the results are stored in a spreadsheet. Using software for high-throughput cell image analysis offers researchers the possibility of performing comprehensive and precise analysis of a high number of images in an automated manner, making this routine task easier. © 2015 Society for Laboratory Automation and Screening.

  5. Power and performance software analysis and optimization

    CERN Document Server

    Kukunas, Jim

    2015-01-01

    Power and Performance: Software Analysis and Optimization is a guide to solving performance problems in modern Linux systems. Power-efficient chips are no help if the software those chips run on is inefficient. Starting with the necessary architectural background as a foundation, the book demonstrates the proper usage of performance analysis tools in order to pinpoint the cause of performance problems, and includes best practices for handling common performance issues those tools identify. Provides expert perspective from a key member of Intel's optimization team on how processors and memory

  6. One-Click Data Analysis Software for Science Operations

    Science.gov (United States)

    Navarro, Vicente

    2015-12-01

    One of the important activities of the ESA Science Operations Centre is to provide Data Analysis Software (DAS) to enable users and scientists to process data further to higher levels. During operations and post-operations, the DAS is fully maintained and updated for new OS and library releases. Nonetheless, once a mission goes into the "legacy" phase, funds are very limited and long-term preservation becomes more and more difficult. Building on Virtual Machine (VM), cloud computing and Software as a Service (SaaS) technologies, this project has aimed at providing long-term preservation of Data Analysis Software for the following missions: PIA for ISO (1995), SAS for XMM-Newton (1999), Hipe for Herschel (2009), and EXIA for EXOSAT (1983). The following goals have guided the architecture: support for all operations, post-operations and archive/legacy phases; support for local (user's computer) and cloud environments (ESAC Cloud, Amazon AWS); support for expert users, requiring full capabilities; and provision of a simple web-based interface. This talk describes the architecture, challenges, results and lessons learnt in this project.

  7. Evaluating Suit Fit Using Performance Degradation

    Science.gov (United States)

    Margerum, Sarah E.; Cowley, Matthew; Harvill, Lauren; Benson, Elizabeth; Rajulu, Sudhakar

    2011-01-01

    The Mark III suit has multiple sizes of suit components (arm, leg, and gloves) as well as sizing inserts to tailor the fit of the suit to an individual. This study sought to determine a way to identify the point at which an ideal suit fit transforms into a bad fit, and how to quantify this breakdown using mobility-based physical performance data. The study examined the changes in human physical performance via degradation of the elbow and wrist range of motion of the planetary suit prototype (Mark III) with respect to changes in sizing, as well as how to apply that knowledge to suit sizing options and improvements in suit fit. The methods implemented in this study focused on changes in elbow and wrist mobility due to incremental suit sizing modifications. This incremental sizing was within a range that included both optimum and poor fit. Suited range-of-motion data were collected using a motion analysis system for nine isolated and functional tasks encompassing the elbow and wrist joints. A total of four subjects were tested, with motions involving both arms simultaneously as well as the right arm only. The results were then compared across sizing configurations. The results of this study indicate that range of motion may be used as a viable parameter to quantify at what stage suit sizing causes a detriment in performance; however, the human performance decrement appeared to be based on the interaction of multiple joints along a limb, not a single joint angle. The study identified a preliminary method to quantify the impact of size on performance and developed a means to gauge tolerances around optimal size. More work is needed to improve the assessment of optimal fit and to compensate for multiple joint interactions.

  8. Visual querying and analysis of large software repositories

    NARCIS (Netherlands)

    Voinea, Lucian; Telea, Alexandru

    We present a software framework for mining software repositories. Our extensible framework enables the integration of data extraction from repositories with data analysis and interactive visualization. We demonstrate the applicability of the framework by presenting several case studies performed on

  9. Selection of bioprocess simulation software for industrial applications.

    Science.gov (United States)

    Shanklin, T; Roper, K; Yegneswaran, P K; Marten, M R

    2001-02-20

    Two commercially available process-simulation software packages (Aspen Batch Plus v1.2, Aspen Technology, Inc., Cambridge, Massachusetts, and Intelligen SuperPro v3.0, INTELLIGEN, INC., Scotch Plains, New Jersey) are evaluated for use in modeling industrial biotechnology processes. The software is quantitatively evaluated by Kepner-Tregoe Decision Analysis (Kepner and Tregoe, 1981). This evaluation shows that Aspen Batch Plus v1.2 (ABP) and Intelligen SuperPro v3.0 (ISP) can successfully perform specific simulation tasks but do not provide a complete model of all phenomena occurring within a biotechnology process. The software is best suited to providing a format for process management, using material and energy balances to answer scheduling questions, explore equipment change-outs, and calculate cost data. The ability of simulation software to accurately predict unit operation scale-up and optimize bioprocesses is limited. To realistically evaluate the software, a vaccine manufacturing process under development at Merck & Company is simulated. Case studies from the vaccine process are presented as examples of how ABP and ISP can be used to shed light on real-world processing issues. Copyright 2001 John Wiley & Sons, Inc.
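
    Kepner-Tregoe decision analysis reduces to a weighted scoring matrix; the weights and ratings below are invented for illustration and are not the paper's numbers.

        weights = {'balances': 10, 'scheduling': 8, 'costing': 7, 'scale-up': 5}
        ratings = {                                  # 1-10 per criterion, illustrative only
            'Aspen Batch Plus': {'balances': 9, 'scheduling': 8, 'costing': 8, 'scale-up': 4},
            'SuperPro':         {'balances': 8, 'scheduling': 7, 'costing': 9, 'scale-up': 5},
        }
        for package, r in ratings.items():
            score = sum(w * r[c] for c, w in weights.items())   # weighted total score
            print(package, score)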

  10. Effective Results Analysis for the Similar Software Products’ Orthogonality

    Directory of Open Access Journals (Sweden)

    Ion Ivan

    2009-10-01

    Full Text Available The concept of similar software is defined. Conditions for archiving software components are established. The orthogonality evaluation is carried out, and the correlation between the orthogonality and the complexity of homogeneous software components is analyzed. Groups of similar software products are then built, belonging to the orthogonality intervals. The results of the analysis are presented in graphical form, and aspects of the functioning of the software products allocated to each orthogonality interval are detailed.

  11. Introducing a New Software for Geodetic Analysis

    Science.gov (United States)

    Hjelle, Geir Arne; Dähnn, Michael; Fausk, Ingrid; Kirkvik, Ann-Silje; Mysen, Eirik

    2017-04-01

    At the Norwegian Mapping Authority, we are currently developing Where, a new software for geodetic analysis. Where is built on our experiences with the Geosat software, and will be able to analyse and combine data from VLBI, SLR, GNSS and DORIS. The software is mainly written in Python which has proved very fruitful. The code is quick to write and the architecture is easily extendable and maintainable, while at the same time taking advantage of well-tested code like the SOFA and IERS libraries. This presentation will show some of the current capabilities of Where, including benchmarks against other software packages, and outline our plans for further progress. In addition we will report on some investigations we have done experimenting with alternative weighting strategies for VLBI.

  12. Automated Structure Solution with the PHENIX Suite

    Energy Technology Data Exchange (ETDEWEB)

    Zwart, Peter H.; Zwart, Peter H.; Afonine, Pavel; Grosse-Kunstleve, Ralf W.; Hung, Li-Wei; Ioerger, Tom R.; McCoy, A.J.; McKee, Eric; Moriarty, Nigel; Read, Randy J.; Sacchettini, James C.; Sauter, Nicholas K.; Storoni, L.C.; Terwilliger, Tomas C.; Adams, Paul D.

    2008-06-09

    Significant time and effort are often required to solve and complete a macromolecular crystal structure. The development of automated computational methods for the analysis, solution and completion of crystallographic structures has the potential to produce minimally biased models in a short time without the need for manual intervention. The PHENIX software suite is a highly automated system for macromolecular structure determination that can rapidly arrive at an initial partial model of a structure without significant human intervention, given moderate resolution and good quality data. This achievement has been made possible by the development of new algorithms for structure determination, maximum-likelihood molecular replacement (PHASER), heavy-atom search (HySS), template and pattern-based automated model-building (RESOLVE, TEXTAL), automated macromolecular refinement (phenix.refine), and iterative model-building, density modification and refinement that can operate at moderate resolution (RESOLVE, AutoBuild). These algorithms are based on a highly integrated and comprehensive set of crystallographic libraries that have been built and made available to the community. The algorithms are tightly linked and made easily accessible to users through the PHENIX Wizards and the PHENIX GUI.

  13. Automated structure solution with the PHENIX suite

    Energy Technology Data Exchange (ETDEWEB)

    Terwilliger, Thomas C [Los Alamos National Laboratory; Zwart, Peter H [LBNL; Afonine, Pavel V [LBNL; Grosse - Kunstleve, Ralf W [LBNL

    2008-01-01

    Significant time and effort are often required to solve and complete a macromolecular crystal structure. The development of automated computational methods for the analysis, solution, and completion of crystallographic structures has the potential to produce minimally biased models in a short time without the need for manual intervention. The PHENIX software suite is a highly automated system for macromolecular structure determination that can rapidly arrive at an initial partial model of a structure without significant human intervention, given moderate resolution and good quality data. This achievement has been made possible by the development of new algorithms for structure determination, maximum-likelihood molecular replacement (PHASER), heavy-atom search (HySS), template- and pattern-based automated model-building (RESOLVE, TEXTAL), automated macromolecular refinement (phenix.refine), and iterative model-building, density modification and refinement that can operate at moderate resolution (RESOLVE, AutoBuild). These algorithms are based on a highly integrated and comprehensive set of crystallographic libraries that have been built and made available to the community. The algorithms are tightly linked and made easily accessible to users through the PHENIX Wizards and the PHENIX GUI.

  14. Development of Emittance Analysis Software for Ion Beam Characterization

    International Nuclear Information System (INIS)

    Padilla, M.J.; Liu, Yuan

    2007-01-01

    Transverse beam emittance is a crucial property of charged particle beams that describes their angular and spatial spread. It is a figure of merit frequently used to determine the quality of ion beams, the compatibility of an ion beam with a given beam transport system, and the ability to suppress neighboring isotopes at on-line mass separator facilities. Generally, a high-quality beam is characterized by a small emittance. In order to determine and improve the quality of ion beams used at the Holifield Radioactive Ion Beam Facility (HRIBF) for nuclear physics and nuclear astrophysics research, the emittances of the ion beams are measured at the off-line Ion Source Test Facilities. In this project, emittance analysis software was developed to perform various data processing tasks for noise reduction, to evaluate root-mean-square emittance, Twiss parameters, and area emittance of different beam fractions. The software also provides 2D and 3D graphical views of the emittance data, beam profiles, emittance contours, and RMS. Noise exclusion is essential for accurate determination of beam emittance values. A Self-Consistent, Unbiased Elliptical Exclusion (SCUBEEx) method is employed. Numerical data analysis techniques such as interpolation and nonlinear fitting are also incorporated into the software. The software will provide a simplified, fast tool for comprehensive emittance analysis. The main functions of the software package have been completed. In preliminary tests with experimental emittance data, the analysis results using the software were shown to be accurate
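
    The root-mean-square emittance evaluated by the software follows the standard definition eps_rms = sqrt(<x^2><x'^2> - <x x'>^2); a short NumPy sketch over particle position (x) and divergence (x') coordinates:

        import numpy as np

        def rms_emittance(x, xp):
            # Centre the position (x) and divergence (xp) coordinates first.
            x = x - x.mean()
            xp = xp - xp.mean()
            return np.sqrt(np.mean(x**2) * np.mean(xp**2) - np.mean(x * xp)**2)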

  15. DEVELOPMENT OF EMITTANCE ANALYSIS SOFTWARE FOR ION BEAM CHARACTERIZATION

    Energy Technology Data Exchange (ETDEWEB)

    Padilla, M. J.; Liu, Y.

    2007-01-01

    Transverse beam emittance is a crucial property of charged particle beams that describes their angular and spatial spread. It is a figure of merit frequently used to determine the quality of ion beams, the compatibility of an ion beam with a given beam transport system, and the ability to suppress neighboring isotopes at on-line mass separator facilities. Generally a high quality beam is characterized by a small emittance. In order to determine and improve the quality of ion beams used at the Holifield Radioactive Ion Beam Facility (HRIBF) for nuclear physics and nuclear astrophysics research, the emittances of the ion beams are measured at the off-line Ion Source Test Facilities. In this project, emittance analysis software was developed to perform various data processing tasks for noise reduction, to evaluate root-mean-square emittance, Twiss parameters, and area emittance of different beam fractions. The software also provides 2D and 3D graphical views of the emittance data, beam profiles, emittance contours, and RMS. Noise exclusion is essential for accurate determination of beam emittance values. A Self-Consistent, Unbiased Elliptical Exclusion (SCUBEEx) method is employed. Numerical data analysis techniques such as interpolation and nonlinear fitting are also incorporated into the software. The software will provide a simplified, fast tool for comprehensive emittance analysis. The main functions of the software package have been completed. In preliminary tests with experimental emittance data, the analysis results using the software were shown to be accurate.

  16. Swallowing quantitative analysis software (Software para análise quantitativa da deglutição)

    Directory of Open Access Journals (Sweden)

    André Augusto Spadotto

    2008-02-01

    Full Text Available OBJECTIVE: The present paper is aimed at introducing software to allow a detailed analysis of the swallowing dynamics. MATERIALS AND METHODS: The sample included ten (six male and four female) stroke patients, with a mean age of 57.6 years. Swallowing videofluoroscopy was performed and the images were digitized for posterior analysis of the pharyngeal transit time with the aid of a chronometer and the software. RESULTS: Differences were observed in the average pharyngeal swallowing transit time as a result of measurements with the chronometer and the software. CONCLUSION: This software is a useful tool for the analysis of parameters such as swallowing time and speed, allowing a better understanding of the swallowing dynamics, both in the clinical approach of patients with oropharyngeal dysphagia and for scientific research purposes.

  17. Recent software developments for biomechanical assessment

    Science.gov (United States)

    Greaves, John O. B.

    1990-08-01

    While much of the software developed in research laboratories is narrow in focus and suited for a specific experiment, some of it is broad enough and of high enough quality to be useful to others in solving similar problems. Several biomechanical assessment packages are now beginning to emerge, including:
    * 3D research biomechanics (5- and 6-DOF) with kinematics, kinetics, 32-channel analog data subsystem, and project management.
    * 3D full-body gait analysis with kinematics, kinetics, EMG charts, and force plate charts.
    * 2D dynamic rear-foot assessment.
    * 2D occupational biomechanics lifting task and personnel assessments.
    * 2D dynamic gait analysis.
    * Multiple 2D dynamic spine assessments.
    * 2D sport and biomechanics assessments with kinematics and kinetics.
    * 2D and 3D equine gait assessments.

  18. NASA Data Acquisition System Software Development for Rocket Propulsion Test Facilities

    Science.gov (United States)

    Herbert, Phillip W., Sr.; Elliot, Alex C.; Graves, Andrew R.

    2015-01-01

    Current NASA propulsion test facilities include Stennis Space Center in Mississippi, Marshall Space Flight Center in Alabama, Plum Brook Station in Ohio, and White Sands Test Facility in New Mexico. Within and across these centers, a diverse set of data acquisition systems exist with different hardware and software platforms. The NASA Data Acquisition System (NDAS) is a software suite designed to operate and control many critical aspects of rocket engine testing. The software suite combines real-time data visualization, data recording to a variety of formats, short-term and long-term acquisition system calibration capabilities, test stand configuration control, and a variety of data post-processing capabilities. Additionally, data stream conversion functions exist to translate test facility data streams to and from downstream systems, including engine customer systems. The primary design goals for NDAS are flexibility, extensibility, and modularity. Providing a common user interface for a variety of hardware platforms helps drive consistency and error reduction during testing. In addition, with an understanding that test facilities have different requirements and setups, the software is designed to be modular. One engine program may require real-time displays and data recording; others may require more complex data stream conversion, measurement filtering, or test stand configuration management. The NDAS suite allows test facilities to choose which components to use based on their specific needs. The NDAS code is primarily written in LabVIEW, a graphical, data-flow driven language. Although LabVIEW is a general-purpose programming language, large-scale software development in the language is relatively rare compared to more commonly used languages. The NDAS software suite also makes extensive use of a new, advanced development framework called the Actor Framework. The Actor Framework provides a level of code reuse and extensibility that has previously been difficult

  19. User-driven integrated software lives: "Paleomag" paleomagnetics analysis on the Macintosh

    Science.gov (United States)

    Jones, Craig H.

    2002-12-01

    "PaleoMag," a paleomagnetics analysis package originally developed for the Macintosh operating system in 1988, allows examination of demagnetization of individual samples and analysis of directional data from collections of samples. Prior to recent reinvigorated development of the software for both Macintosh and Windows, it was widely used despite not running properly on machines and operating systems sold after 1995. This somewhat surprising situation demonstrates that there is a continued need for integrated analysis software within the earth sciences, in addition to well-developed scripting and batch-mode software. One distinct advantage of software like PaleoMag is in the ability to combine quality control with analysis within a unique graphical environment. Because such demands are frequent within the earth sciences, means of nurturing the development of similar software should be found.

  20. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Huang, H.; Tu, W.; Shih, C.; Chen, C.; Yang, W.; Yih, S.; Kuo, C.; Chen, M.

    2006-01-01

    Full text: Following the massive adoption of digital Instrumentation and Control (I and C) systems for nuclear power plants (NPP), various Software Safety Analysis (SSA) techniques are used to evaluate the safety of the adopted digital I and C systems, and then to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be complementarily incorporated, the SSA combination would be more acceptable. As a result, if proper evaluation criteria are available, the analyst can then choose an appropriate technique combination to perform analysis on the basis of resources. This research evaluated the applicable software safety analysis techniques available nowadays, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis, and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help the decision makers and the software safety analysts to choose the best SSA combination and arrange their own software safety plan. By this proposed method, the analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for realizing the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio. However, their disadvantages are completeness and complexity.

  1. Buying behavior and competition in the software market

    OpenAIRE

    Rosário, João Carlos Vinagre Ferreira do; Reis, António Palma dos

    2012-01-01

    In today's world, the impact of software buying decisions has a rising relevance in social and economic terms. This research tries to explain it by focusing on organizations' buying decisions for Operating Systems and Office Suites for personal computers, and on the impact on the competition between incumbent and alternative players in these software categories, although the research hypotheses and conclusions may extend to other software categories and platforms. We concluded...

  2. Fault tree synthesis for software design analysis of PLC based safety-critical systems

    International Nuclear Information System (INIS)

    Koo, S. R.; Cho, C. H.; Seong, P. H.

    2006-01-01

    As software verification and validation should be performed for the development of PLC-based safety-critical systems, software safety analysis is also considered throughout the entire software life cycle. In this paper, we propose a technique for software safety analysis in the design phase. Among various software hazard analysis techniques, fault tree analysis is the most widely used for the safety analysis of nuclear power plant systems. Fault tree analysis also has the most intuitive notation and makes both qualitative and quantitative analyses possible. To analyze the design phase more effectively, we propose a technique of fault tree synthesis, along with a universal fault tree template for the architecture modules of nuclear software. Consequently, we can analyze the safety of software on the basis of fault tree synthesis. (authors)
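
    To make the quantitative side of fault tree analysis concrete, a synthesized tree reduces to nested AND/OR gates over basic events. A minimal sketch assuming independent basic events with hypothetical failure probabilities (the paper's universal template is not reproduced here):

      def and_gate(probs):
          """Output fails only if all inputs fail (independence assumed)."""
          p = 1.0
          for q in probs:
              p *= q
          return p

      def or_gate(probs):
          """Output fails if any input fails (independence assumed)."""
          p = 1.0
          for q in probs:
              p *= 1.0 - q
          return 1.0 - p

      # Hypothetical top event: the processing module fails, or both
      # redundant output modules fail.
      p_top = or_gate([1e-4, and_gate([1e-3, 1e-3])])
      print(p_top)  # ~1.0e-4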

  3. Effective Results Analysis for the Similar Software Products’ Orthogonality

    OpenAIRE

    Ion Ivan; Daniel Milodin

    2009-01-01

    The concept of similar software is defined. Conditions for archiving the software components are established. The orthogonality evaluation is carried out, and the correlation between the orthogonality and the complexity of the homogenous software components is analyzed. Groups of similar software products, belonging to the orthogonality intervals, are then built. The results of the analysis are presented in graphical form. There are detailed aspects of the functioning o...

  4. Integrated analysis software for bulk power system stability

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, T; Nagao, T; Takahashi, K [Central Research Inst. of Electric Power Industry, Tokyo (Japan)

    1994-12-31

    This paper presents three software packages for bulk power network analysis developed by the Central Research Institute of Electric Power Industry (CRIEPI), together with the user support system which arranges, easily and with high reliability, the tremendous amount of data necessary for these packages. (author) 3 refs., 7 figs., 2 tabs.

  5. Software design specification and analysis (NuFDS) approach for the safety-critical software based on programmable logic controller (PLC)

    International Nuclear Information System (INIS)

    Koo, Seo Ryong; Seong, Poong Hyun; Jung, Jin Yong; Choi, Seong Soo

    2004-01-01

    This paper introduces the software design specification and analysis technique for the safety-critical system based on Programmable Logic Controller (PLC). During software development phases, the design phase should perform an important role in connecting the requirements phase and the implementation phase, as a process of translating problem requirements into software structures. In this work, the Nuclear FBD-style Design Specification and analysis (NuFDS) approach was proposed. The NuFDS approach for nuclear Instrumentation and Control (I and C) software is suggested in a straightforward manner. It consists of four major specifications, as follows: Database, Software Architecture, System Behavior, and PLC Hardware Configuration. Additionally, correctness, completeness, consistency, and traceability check techniques are also suggested for the formal design analysis in the NuFDS approach. In addition, for tool support, we are developing the NuSDS tool based on the NuFDS approach, which is a tool especially for the software design specification in nuclear fields.

  6. Risk Analysis and Decision-Making Software Package (1997 Version) User Manual

    Energy Technology Data Exchange (ETDEWEB)

    Chung, F.T.H.

    1999-02-11

    This manual provides instructions for using the U.S. Department of Energy's (DOE) risk analysis and decision making software (1997 version) developed at BDM Petroleum Technologies by BDM-Oklahoma, Inc. for DOE, under contract No. DE-AC22-94PC91008. This software provides petroleum producers with a simple, handy tool for exploration and production risk analysis and decision-making. It collects useful risk analysis tools in one package so that users do not have to use several programs separately. The software is simple to use, but still provides many functions. The 1997 version of the software package includes the following tools: (1) Investment risk (Gambler's ruin) analysis; (2) Monte Carlo simulation; (3) Best fit for distribution functions; (4) Sample and rank correlation; (5) Enhanced oil recovery method screening; and (6) Artificial neural network. This software package is subject to change. Suggestions and comments from users are welcome and will be considered for future modifications and enhancements of the software. Please check the opening screen of the software for the current contact information. In the future, more tools will be added to this software package. This manual includes instructions on how to use the software but does not attempt to fully explain the theory and algorithms used to create it.
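
    Of the listed tools, the investment risk (Gambler's ruin) analysis has a classical closed form that is easy to illustrate. A minimal sketch of that textbook result (the manual's exact inputs and formulation are not shown here):

      def ruin_probability(stake, target, p):
          """Probability of losing an initial stake (in units) before
          reaching the target, winning one unit per round with
          probability p (classic gambler's-ruin formula)."""
          if not (0.0 < p < 1.0 and 0 < stake < target):
              raise ValueError("need 0 < p < 1 and 0 < stake < target")
          if abs(p - 0.5) < 1e-12:
              return 1.0 - stake / target
          r = (1.0 - p) / p
          return 1.0 - (1.0 - r ** stake) / (1.0 - r ** target)

      print(ruin_probability(10, 50, 0.50))  # 0.8
      print(ruin_probability(10, 50, 0.55))  # ~0.13: a small edge helps a lot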

  7. Software safety analysis on the model specified by NuSCR and SMV input language at requirements phase of software development life cycle using SMV

    International Nuclear Information System (INIS)

    Koh, Kwang Yong; Seong, Poong Hyun

    2005-01-01

    The safety-critical software process is composed of a development process, a verification and validation (V and V) process, and a safety analysis process. The safety analysis process has often been treated as an additional process and is not found in a conventional software process. But software safety analysis (SSA) is required if software is applied to a safety system, and the SSA shall be performed independently for the safety software throughout the software development life cycle (SDLC). Of all the phases in software development, requirements engineering is generally considered to play the most critical role in determining the overall software quality. NASA data demonstrate that nearly 75% of failures found in operational software were caused by errors in the requirements. The verification process in the requirements phase checks the correctness of the software requirements specification, and the safety analysis process analyzes the safety-related properties in detail. In this paper, a method for safety analysis at the requirements phase of the software development life cycle using the symbolic model verifier (SMV) is proposed. Hazards are discovered by hazard analysis and, in order to use SMV for the safety analysis, the safety-related properties are expressed in computation tree logic (CTL).
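
    For illustration only (a hypothetical property, not one taken from the paper), a hazard such as "overpressure occurs but a trip signal never follows" is negated and written as a CTL safety property of the form

      AG\,(\mathit{overpressure} \rightarrow AF\,\mathit{trip\_signal})

    i.e., on all paths, whenever overpressure holds, every continuation eventually raises the trip signal; SMV checks such formulae exhaustively against the model.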

  8. PuMA: the Porous Microstructure Analysis software

    Science.gov (United States)

    Ferguson, Joseph C.; Panerai, Francesco; Borner, Arnaud; Mansour, Nagi N.

    2018-01-01

    The Porous Microstructure Analysis (PuMA) software has been developed in order to compute effective material properties and perform material response simulations on digitized microstructures of porous media. PuMA is able to import digital three-dimensional images obtained from X-ray microtomography or to generate artificial microstructures. PuMA also provides a module for interactive 3D visualizations. Version 2.1 includes modules to compute porosity, volume fractions, and surface area. Two finite difference Laplace solvers have been implemented to compute the continuum tortuosity factor, effective thermal conductivity, and effective electrical conductivity. A random walk method has been developed to compute tortuosity factors from the continuum to rarefied regimes. Representative elementary volume analysis can be performed on each property. The software also includes a time-dependent, particle-based model for the oxidation of fibrous materials. PuMA was developed for Linux operating systems and is available as NASA software under a US & Foreign release.
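
    The simplest of the listed modules is easy to sketch. Assuming a segmented voxel image in which 0 labels void and positive integers label solid phases (a labeling convention adopted here for illustration, not necessarily PuMA's own), porosity and volume fractions are voxel-count ratios:

      import numpy as np

      def porosity_and_fractions(voxels):
          """Porosity and per-phase volume fractions of a segmented
          3D voxel image (0 = void, positive integers = solid phases)."""
          voxels = np.asarray(voxels)
          labels, counts = np.unique(voxels, return_counts=True)
          fractions = {int(v): c / voxels.size for v, c in zip(labels, counts)}
          return fractions.get(0, 0.0), fractions

      rng = np.random.default_rng(0)
      sample = (rng.random((64, 64, 64)) < 0.7).astype(int)  # ~70% solid
      print(porosity_and_fractions(sample)[0])               # ~0.3 porosity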

  9. Software for computerised analysis of cardiotocographic traces.

    Science.gov (United States)

    Romano, M; Bifulco, P; Ruffo, M; Improta, G; Clemente, F; Cesarelli, M

    2016-02-01

    Despite the widespread use of cardiotocography in foetal monitoring, the evaluation of foetal status suffers from considerable inter- and intra-observer variability. In order to overcome the main limitations of visual cardiotocographic assessment, computerised methods to analyse cardiotocographic recordings have recently been developed. In this study, a new software package for automated analysis of foetal heart rate is presented. It allows an automatic procedure for measuring the most relevant parameters derivable from cardiotocographic traces. Simulated and real cardiotocographic traces were analysed to test software reliability. In artificial traces, we simulated a set number of events (accelerations, decelerations and contractions) to be recognised. In the case of real signals, instead, results of the computerised analysis were compared with the visual assessment performed by 18 expert clinicians, and three performance indexes were computed to gain information about the performance of the proposed software. The software showed preliminary performance that we judged satisfactory, in that the results matched the requirements completely, as proved by tests on artificial signals in which all simulated events were detected by the software. Performance indexes computed in comparison with the obstetricians' evaluations are, on the contrary, not so satisfactory; in fact, they yielded the following values: sensitivity equal to 93%, positive predictive value equal to 82% and accuracy equal to 77%. Very probably this arises from the high variability of trace annotation carried out by clinicians. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
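
    The three indexes quoted above are standard event-level statistics. A minimal sketch with illustrative counts (the paper's event-matching rules are not described in the abstract):

      def performance_indexes(tp, fp, fn, tn):
          """Sensitivity, positive predictive value and accuracy from
          true/false positive and negative event counts."""
          sensitivity = tp / (tp + fn)
          ppv = tp / (tp + fp)
          accuracy = (tp + tn) / (tp + fp + fn + tn)
          return sensitivity, ppv, accuracy

      # Hypothetical counts from matching detected events to annotations
      print(performance_indexes(tp=93, fp=20, fn=7, tn=30))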

  10. A Heuristic for Improving Legacy Software Quality during Maintenance: An Empirical Case Study

    Science.gov (United States)

    Sale, Michael John

    2017-01-01

    Many organizations depend on the functionality of mission-critical legacy software and the continued maintenance of this software is vital. Legacy software is defined here as software that contains no testing suite, is often foreign to the developer performing the maintenance, lacks meaningful documentation, and over time, has become difficult to…

  11. Integration and validation of a data grid software

    Science.gov (United States)

    Carenton-Madiec, Nicolas; Berger, Katharina; Cofino, Antonio

    2014-05-01

    The Earth System Grid Federation (ESGF) Peer-to-Peer (P2P) is a software infrastructure for the management, dissemination, and analysis of model output and observational data. The ESGF grid is composed of several types of nodes which have different roles. About 40 data nodes host model outputs and datasets using THREDDS catalogs. About 25 compute nodes offer remote visualization and analysis tools. About 15 index nodes crawl data node catalogs and implement faceted and federated search in a web interface. About 15 identity provider nodes manage accounts, authentication and authorization. Here we will present an actual-size test federation spread across different institutes in different countries, and a Python test suite that was started in December 2013. The first objective of the test suite is to provide a simple tool that helps to test and validate a single data node and its closest index, compute and identity provider peers. The next objective will be to run this test suite on every data node of the federation and therefore test and validate every single node of the whole federation. The suite already uses the nosetests, requests, myproxy-logon, subprocess, selenium and fabric Python libraries in order to test both web front ends, back ends and security services. The goal of this project is to improve the quality of deliverables in a small developer team context. Developers are widely spread around the world, working collaboratively and without hierarchy. This kind of working organization highlighted the need for a federated integration test and validation process.

  12. Modular Open-Source Software for Item Factor Analysis

    Science.gov (United States)

    Pritikin, Joshua N.; Hunter, Micheal D.; Boker, Steven M.

    2015-01-01

    This article introduces an item factor analysis (IFA) module for "OpenMx," a free, open-source, and modular statistical modeling package that runs within the R programming environment on GNU/Linux, Mac OS X, and Microsoft Windows. The IFA module offers a novel model specification language that is well suited to programmatic generation…

  13. An effective technique for the software requirements analysis of NPP safety-critical systems, based on software inspection, requirements traceability, and formal specification

    International Nuclear Information System (INIS)

    Koo, Seo Ryong; Seong, Poong Hyun; Yoo, Junbeom; Cha, Sung Deok; Yoo, Yeong Jae

    2005-01-01

    A thorough requirements analysis is indispensable for developing and implementing safety-critical software systems such as nuclear power plant (NPP) software systems, because a single error in the requirements can generate serious software faults. However, it is very difficult to completely analyze system requirements. In this paper, an effective technique for software requirements analysis is suggested. For requirements verification and validation (V and V) tasks, our technique uses software inspection, requirements traceability, and formal specification with structural decomposition. Software inspection and requirements traceability analysis are widely considered the most effective software V and V methods. Although formal methods are also considered an effective V and V activity, they are difficult to use properly in the nuclear field, as well as in other fields, because of their mathematical nature. In this work, we propose an integrated environment (IE) approach for requirements, which enables easy inspection by combining requirements traceability with effective use of a formal method. The paper also introduces computer-aided tools supporting the IE approach for requirements. Called the nuclear software inspection support and requirements traceability (NuSISRT) tool, it incorporates software inspection, requirements traceability, and formal specification capabilities. We designed NuSISRT to partially automate software inspection and the analysis of requirements traceability. In addition, for the formal specification and analysis, we used the formal requirements specification and analysis tool for nuclear engineering (NuSRS).

  14. NuFTA: A CASE Tool for Automatic Software Fault Tree Analysis

    International Nuclear Information System (INIS)

    Yun, Sang Hyun; Lee, Dong Ah; Yoo, Jun Beom

    2010-01-01

    Software fault tree analysis (SFTA) is widely used for analyzing software requiring high reliability. In SFTA, experts predict failures of a system through HAZOP (Hazard and Operability study) or FMEA (Failure Mode and Effects Analysis) and draw software fault trees for those failures. The quality and cost of the software fault tree, therefore, depend on the knowledge and experience of the experts. This paper proposes a CASE tool, NuFTA, in order to assist experts in safety analysis. NuFTA automatically generates software fault trees from NuSCR formal requirements specifications. NuSCR is a formal specification language used for specifying the software requirements of the KNICS RPS (Reactor Protection System) in Korea. We used previously proposed SFTA templates in order to generate software fault trees automatically. NuFTA also generates logical formulae summarizing each failure's cause, and we plan to make use of these formulae through formal verification techniques.

  15. Usability study of clinical exome analysis software: top lessons learned and recommendations.

    Science.gov (United States)

    Shyr, Casper; Kushniruk, Andre; Wasserman, Wyeth W

    2014-10-01

    New DNA sequencing technologies have revolutionized the search for genetic disruptions. Targeted sequencing of all protein coding regions of the genome, called exome analysis, is actively used in research-oriented genetics clinics, with the transition to exomes as a standard procedure underway. This transition is challenging; identification of potentially causal mutation(s) amongst ∼10^6 variants requires specialized computation in combination with expert assessment. This study analyzes the usability of user interfaces for clinical exome analysis software. There are two study objectives: (1) To ascertain the key features of successful user interfaces for clinical exome analysis software based on the perspective of expert clinical geneticists, (2) To assess user-system interactions in order to reveal strengths and weaknesses of existing software, inform future design, and accelerate the clinical uptake of exome analysis. Surveys, interviews, and cognitive task analysis were performed for the assessment of two next-generation exome sequence analysis software packages. The subjects included ten clinical geneticists who interacted with the software packages using the "think aloud" method. Subjects' interactions with the software were recorded in their clinical office within an urban research and teaching hospital. All major user interface events (from the user interactions with the packages) were time-stamped and annotated with coding categories to identify usability issues in order to characterize desired features and deficiencies in the user experience. We detected 193 usability issues, the majority of which concern interface layout and navigation, and the resolution of reports. Our study highlights gaps in specific software features typical within exome analysis. The clinicians perform best when the flow of the system is structured into well-defined yet customizable layers for incorporation within the clinical workflow. The results highlight opportunities to

  16. The UEA Small RNA Workbench: A Suite of Computational Tools for Small RNA Analysis.

    Science.gov (United States)

    Mohorianu, Irina; Stocks, Matthew Benedict; Applegate, Christopher Steven; Folkes, Leighton; Moulton, Vincent

    2017-01-01

    RNA silencing (RNA interference, RNAi) is a complex, highly conserved mechanism mediated by short, typically 20-24 nt in length, noncoding RNAs known as small RNAs (sRNAs). They act as guides for the sequence-specific transcriptional and posttranscriptional regulation of target mRNAs and play a key role in the fine-tuning of biological processes such as growth, response to stresses, or defense mechanism. High-throughput sequencing (HTS) technologies are employed to capture the expression levels of sRNA populations. The processing of the resulting big data sets facilitated the computational analysis of the sRNA patterns of variation within biological samples such as time point experiments, tissue series or various treatments. Rapid technological advances enable larger experiments, often with biological replicates, leading to a vast amount of raw data. As a result, in this fast-evolving field, the existing methods for sequence characterization and prediction of interaction (regulatory) networks periodically require adapting or, in extreme cases, a complete redesign to cope with the data deluge. In addition, the presence of numerous tools focused only on particular steps of HTS analysis hinders the systematic parsing of the results and their interpretation. The UEA small RNA Workbench (v1-4), described in this chapter, provides a user-friendly, modular, interactive analysis in the form of a suite of computational tools designed to process and mine sRNA datasets for interesting characteristics that can be linked back to the observed phenotypes. First, we show how to preprocess the raw sequencing output and prepare it for downstream analysis. Then we review some quality checks that can be used as a first indication of sources of variability between samples. Next we show how the Workbench can provide a comparison of the effects of different normalization approaches on the distributions of expression, enhanced methods for the identification of differentially expressed
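
    As a small illustration of the normalization step mentioned above (a generic reads-per-million scaling, not the Workbench's own implementation, which compares several approaches):

      import numpy as np

      def reads_per_million(counts):
          """Scale one sample's raw sRNA read counts to reads per million,
          so libraries of different sequencing depths become comparable."""
          counts = np.asarray(counts, dtype=float)
          return counts * 1e6 / counts.sum()

      print(reads_per_million([10, 90, 900]))  # [10000., 90000., 900000.]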

  17. Advanced EVA Suit Camera System Development Project

    Science.gov (United States)

    Mock, Kyla

    2016-01-01

    The National Aeronautics and Space Administration (NASA) at the Johnson Space Center (JSC) is developing a new extra-vehicular activity (EVA) suit known as the Advanced EVA Z2 Suit. All of the improvements to the EVA Suit provide the opportunity to update the technology of the video imagery. My summer internship project involved improving the video streaming capabilities of the cameras that will be used on the Z2 Suit for data acquisition. To accomplish this, I familiarized myself with the architecture of the camera that is currently being tested to be able to make improvements on the design. Because there is a lot of benefit to saving space, power, and weight on the EVA suit, my job was to use Altium Designer to start designing a much smaller and simplified interface board for the camera's microprocessor and external components. This involved checking datasheets of various components and checking signal connections to ensure that this architecture could be used for both the Z2 suit and potentially other future projects. The Orion spacecraft is a specific project that may benefit from this condensed camera interface design. The camera's physical placement on the suit also needed to be determined and tested so that image resolution can be maximized. Many of the options of the camera placement may be tested along with other future suit testing. There are multiple teams that work on different parts of the suit, so the camera's placement could directly affect their research or design. For this reason, a big part of my project was initiating contact with other branches and setting up multiple meetings to learn more about the pros and cons of the potential camera placements we are analyzing. Collaboration with the multiple teams working on the Advanced EVA Z2 Suit is absolutely necessary and these comparisons will be used as further progress is made for the overall suit design. This prototype will not be finished in time for the scheduled Z2 Suit testing, so my time was

  18. BIM Software Capability and Interoperability Analysis: An analytical approach toward structural usage of BIM software (S-BIM)

    OpenAIRE

    A. Taher, Ali

    2016-01-01

    This study focused on the structural analysis of BIM models. Different commercial software packages (Autodesk products and Rhinoceros) are presented through the modelling and analysis of different structures with varying complexity, section properties, geometry, and material. Besides the commercial software, different architectural tools and different tools for structural analysis are evaluated (Dynamo, Grasshopper, add-on tools, direct link, indirect link via IFC). BIM and Structural BIM (S-BIM)

  19. Research and Development of Statistical Analysis Software System of Maize Seedling Experiment

    OpenAIRE

    Hui Cao

    2014-01-01

    In this study, software engineering methods were used to develop a software system for the statistics and analysis of maize seedling experiments. During development, the B/S (browser/server) structure software design method was used, and a set of statistical indicators for maize seedling evaluation was established. The experimental results indicated that this software system could perform quality statistics and analysis for maize seedlings very well. The development of this software system explored a...

  20. Software Tools for Fault Management Technologies, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Fault Management (FM) is a key requirement for safety, efficient onboard and ground operations, maintenance, and repair. QSI's TEAMS Software suite is a leading...

  1. AGSuite: Software to conduct feature analysis of artificial grammar learning performance.

    Science.gov (United States)

    Cook, Matthew T; Chubala, Chrissy M; Jamieson, Randall K

    2017-10-01

    To simplify the problem of studying how people learn natural language, researchers use the artificial grammar learning (AGL) task. In this task, participants study letter strings constructed according to the rules of an artificial grammar and subsequently attempt to discriminate grammatical from ungrammatical test strings. Although the data from these experiments are usually analyzed by comparing the mean discrimination performance between experimental conditions, this practice discards information about the individual items and participants that could otherwise help uncover the particular features of strings associated with grammaticality judgments. However, feature analysis is tedious to compute, often complicated, and ill-defined in the literature. Moreover, the data violate the assumption of independence underlying standard linear regression models, leading to Type I error inflation. To solve these problems, we present AGSuite, a free Shiny application for researchers studying AGL. The suite's intuitive Web-based user interface allows researchers to generate strings from a database of published grammars, compute feature measures (e.g., Levenshtein distance) for each letter string, and conduct a feature analysis on the strings using linear mixed effects (LME) analyses. The LME analysis solves the inflation of Type I errors that afflicts more common methods of repeated measures regression analysis. Finally, the software can generate a number of graphical representations of the data to support an accurate interpretation of results. We hope the ease and availability of these tools will encourage researchers to take full advantage of item-level variance in their datasets in the study of AGL. We moreover discuss the broader applicability of the tools for researchers looking to conduct feature analysis in any field.
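
    Of the feature measures mentioned, Levenshtein distance is the easiest to pin down: the minimum number of insertions, deletions and substitutions turning one string into another. A minimal dynamic-programming sketch (AGSuite itself is a Shiny application; Python is used here purely for illustration):

      def levenshtein(a, b):
          """Edit distance between strings a and b."""
          prev = list(range(len(b) + 1))
          for i, ca in enumerate(a, start=1):
              curr = [i]
              for j, cb in enumerate(b, start=1):
                  cost = 0 if ca == cb else 1
                  curr.append(min(prev[j] + 1,          # deletion
                                  curr[j - 1] + 1,      # insertion
                                  prev[j - 1] + cost))  # substitution
              prev = curr
          return prev[-1]

      print(levenshtein("VXVS", "VXXVS"))  # 1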

  2. Essence: Team-Based Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2012-01-01

    Essence is a methodology supporting innovative software teams. It is designed with agile development in mind to allow for the problem situation to talk back to the team as they go along building solutions. Traditional software development teams – and for that matter probably also development teams ... using technologies other than software – might also enjoy adapting Essence to suit their situation. Essence is not yet another method for generating ideas. There are plenty of good methods already, and for that reason I choose to focus less on idea generation and more on what comes thereafter. Most teams ... Essence is based on the idea that challenges are open to interpretation and choice. We may often choose how we understand a challenge and choose among several strategies for answering it. Software development and indeed software innovation are far from linear. Essence is built on structures rather than ...

  3. Continuous Software Quality analysis for the ATLAS experiment

    CERN Document Server

    Washbrook, Andrew; The ATLAS collaboration

    2017-01-01

    The regular application of software quality tools in large collaborative projects is required to reduce code defects to an acceptable level. If left unchecked, the accumulation of defects invariably results in performance degradation at scale and problems with the long-term maintainability of the code. Although software quality tools are effective for identification, there remains a non-trivial sociological challenge to resolve defects in a timely manner. This is an ongoing concern for the ATLAS software, which has evolved over many years to meet the demands of Monte Carlo simulation, detector reconstruction and data analysis. At present over 3.8 million lines of C++ code (and close to 6 million total lines of code) are maintained by a community of hundreds of developers worldwide. It is therefore preferable to address code defects before they are introduced into a widely used software release. Recent wholesale changes to the ATLAS software infrastructure have provided an ideal opportunity to apply software quali...

  4. Design and validation of Segment - freely available software for cardiovascular image analysis

    International Nuclear Information System (INIS)

    Heiberg, Einar; Sjögren, Jane; Ugander, Martin; Carlsson, Marcus; Engblom, Henrik; Arheden, Håkan

    2010-01-01

    Commercially available software for cardiovascular image analysis often has limited functionality and frequently lacks the careful validation that is required for clinical studies. We have already implemented a cardiovascular image analysis software package and released it as freeware for the research community. However, it was distributed as a stand-alone application and other researchers could not extend it by writing their own custom image analysis algorithms. We believe that the work required to make a clinically applicable prototype can be reduced by making the software extensible, so that researchers can develop their own modules or improvements. Such an initiative might then serve as a bridge between image analysis research and cardiovascular research. The aim of this article is therefore to present the design and validation of a cardiovascular image analysis software package (Segment) and to announce its release in a source code format. Segment can be used for image analysis in magnetic resonance imaging (MRI), computed tomography (CT), single photon emission computed tomography (SPECT) and positron emission tomography (PET). Some of its main features include loading of DICOM images from all major scanner vendors, simultaneous display of multiple image stacks and plane intersections, automated segmentation of the left ventricle, quantification of MRI flow, tools for manual and general object segmentation, quantitative regional wall motion analysis, myocardial viability analysis and image fusion tools. Here we present an overview of the validation results and validation procedures for the functionality of the software. We describe a technique to ensure continued accuracy and validity of the software by implementing and using a test script that tests the functionality of the software and validates the output. The software has been made freely available for research purposes in a source code format on the project home page (http://segment.heiberg.se). Segment

  5. Residence time distribution software analysis. User's manual

    International Nuclear Information System (INIS)

    1996-01-01

    Radiotracer applications cover a wide range of industrial activities in chemical and metallurgical processes, water treatment, mineral processing, environmental protection and civil engineering. Experiment design, data acquisition, treatment and interpretation are the basic elements of tracer methodology. The application of radiotracers to determine the impulse response, known as the residence time distribution (RTD), as well as the technical conditions for conducting experiments in industry and in the environment, creates a need for data processing using special software. Important progress has been made during recent years in the preparation of software programs for data treatment and interpretation. The software package developed for industrial process analysis and diagnosis by the stimulus-response methods contains all the methods for data processing for radiotracer experiments.
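
    The core reduction step behind such software is short enough to sketch. Given a sampled tracer impulse response c(t) (hypothetical inputs; the package's own algorithms are not detailed here), the RTD E(t) and the mean residence time follow from normalization and the first moment:

      import numpy as np

      def _trapz(y, x):
          """Trapezoidal integration (kept local for numpy-version safety)."""
          return float(np.sum((y[1:] + y[:-1]) * (x[1:] - x[:-1]) / 2.0))

      def rtd_from_impulse_response(t, c):
          """Normalize a tracer impulse response c(t) into the residence
          time distribution E(t) and return the mean residence time."""
          t = np.asarray(t, dtype=float)
          c = np.asarray(c, dtype=float)
          e = c / _trapz(c, t)         # E(t) integrates to 1
          return e, _trapz(t * e, t)   # first moment = mean residence time

      t = np.linspace(0.0, 60.0, 601)
      e, t_mean = rtd_from_impulse_response(t, np.exp(-t / 5.0))
      print(t_mean)  # ~5 (perfect mixer with a 5-time-unit mean)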

  6. Simulation software support (S3) system: a software testing and debugging tool

    International Nuclear Information System (INIS)

    Burgess, D.C.; Mahjouri, F.S.

    1990-01-01

    The largest percentage of technical effort in the software development process is accounted for by debugging and testing. It is not unusual for a software development organization to spend over 50% of the total project effort on testing. In the extreme, testing of human-rated software (e.g., nuclear reactor monitoring, training simulators) can cost three to five times as much as all other software engineering steps combined. The Simulation Software Support (S3) System, developed by the Link-Miles Simulation Corporation, is ideally suited for real-time simulation applications which involve a large database with models programmed in FORTRAN. This paper will focus on the testing elements of the S3 system. In this paper, system support software utilities are described which enable the loading and execution of modules in the development environment. These elements include the Linking/Loader (LLD) for dynamically linking program modules and loading them into memory, and the interactive executive (IEXEC) for controlling the execution of the modules. Features of the Interactive Symbolic Debugger (SD) and the Real Time Executive (RTEXEC) to support unit and integrated testing will be explored.

  7. A software package for biomedical image processing and analysis

    International Nuclear Information System (INIS)

    Goncalves, J.G.M.; Mealha, O.

    1988-01-01

    The decreasing cost of computing power and the introduction of low-cost imaging boards justify the increasing number of applications of digital image processing techniques in the area of biomedicine. There is, however, a large software gap to be filled between the application and the equipment. The requirements to bridge this gap are twofold: good knowledge of the hardware provided and its interface to the host computer, and expertise in digital image processing and analysis techniques. A software package incorporating these two requirements was developed using the C programming language, in order to create a user-friendly image processing programming environment. The software package can be considered in two different ways: as a data structure adapted to image processing and analysis, which acts as the backbone and the standard of communication for all the software; and as a set of routines implementing the basic algorithms used in image processing and analysis. Hardware dependency is restricted to a single module upon which all hardware calls are based. The data structure that was built has four main features: it is hierarchical, open, object-oriented, and has object-dependent dimensions. Considering the vast amount of memory needed by imaging applications and the memory available in small imaging systems, an effective image memory management scheme was implemented. This software package has been in use for more than one and a half years by users with different applications. It proved to be an excellent tool for helping people to get adapted to the system, and for standardizing and exchanging software, yet preserving flexibility and allowing for users' specific implementations. The philosophy of the software package is discussed and the data structure that was built is described in detail.

  8. LEAP2000: tools for sustainable energy analysis

    Energy Technology Data Exchange (ETDEWEB)

    Heaps, C.; Lazarus, M.; Raskin, P. [SEI-Boston, Boston, MA (USA)]

    2000-09-01

    LEAP2000 is a collaborative initiative, led by the Boston Center of the Stockholm Environment Institute, to create a new suite of analytical software and databases for integrated energy-environment analysis. The LEAP2000 software and the Technology and Environmental Database (TED) are described. 5 refs., 5 figs.

  9. Clinical trials radiotherapy treatment plan review software: is this the first quantified assessment

    International Nuclear Information System (INIS)

    Hatton, J.A.; Cornes, D.A.

    2011-01-01

    Full text: Clinical trials require robust quality assurance (QA) procedures to ensure commonality of all treatments, with independent reviews to assess compliance with trial protocols. All clinical trials tools, including QA software, require testing for validity and reliability, enabling inter- and intra-trial comparison. Unlike clinical radiotherapy treatment planning (RTP) systems, review software has no published guidelines. This study describes the design and development of a test suite to quantify the performance of review software in TROG clinical trials. Test areas are image handling and reconstruction; geometric accuracy; dosimetric accuracy; dose-volume histogram (DVH) calculation; and display of plan parameters. TROG has developed tests for commissioning plan review software, assessed with SWAN 2.3 and CMS Elekta FocalPro. While image handling tests were based on published guidelines for RTP systems, dosimetric tests used the TROG QA case review requirements. Treatment plans represented systems of all manufacturers (Pinnacle, Eclipse, Xio and Oncentra) used in Australasian centres. The test suite identified areas for SWAN software development, including the DVH algorithm, which was changed to reduce calculation time. Results, in Fig. 1, for known volumes of varying shapes and sizes, demonstrate differences between SWAN 2.1 and 2.3 when compared with Eclipse. Liaison with SWAN programmers enabled reinstatement of the 2.1 algorithm. The test suite has quantified the RTP review software, prioritised areas for development with the programmers, and improved the user experience.
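
    Of the test areas listed, DVH calculation is the most self-contained to illustrate. A minimal cumulative-DVH sketch over per-voxel doses in a contoured structure (inputs and bin width are hypothetical; no SWAN internals are implied):

      import numpy as np

      def cumulative_dvh(doses, bin_width=0.1):
          """Cumulative DVH: fraction of structure volume receiving at
          least each dose level (equal voxel volumes assumed)."""
          doses = np.asarray(doses, dtype=float)
          levels = np.arange(0.0, doses.max() + bin_width, bin_width)
          volume = np.array([(doses >= d).mean() for d in levels])
          return levels, volume

      rng = np.random.default_rng(1)
      levels, volume = cumulative_dvh(rng.normal(60.0, 2.0, 10000))  # Gy
      print(volume[0], volume[-1])  # 1.0 at zero dose, ~0 at the maximum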

  10. R suite for the Reduction and Analysis of UFO Orbit Data

    Science.gov (United States)

    Campbell-Burns, P.; Kacerek, R.

    2016-02-01

    This paper presents work undertaken by UKMON to compile a suite of simple R scripts for the reduction and analysis of meteor data. The application of R in this context is by no means an original idea and there is no doubt that it has been used already in many reports to the IMO. However, we are unaware of any common libraries or shared resources available to the meteor community. By sharing our work we hope to stimulate interest and discussion. Graphs shown in this paper are illustrative and are based on current data from both EDMOND and UKMON.

  11. HeteroGenius: A Framework for Hybrid Analysis of Heterogeneous Software Specifications

    Directory of Open Access Journals (Sweden)

    Manuel Giménez

    2014-01-01

    Full Text Available Nowadays, software artifacts are ubiquitous in our lives, being an essential part of home appliances, cars, cell phones, and even of more critical activities like aeronautics and the health sciences. In this context, software failures may produce enormous losses, either economic or, in the worst case, in human lives. Software analysis is an area of software engineering concerned with the application of diverse techniques in order to prove the absence of errors in software pieces. In many cases different analysis techniques are applied by following specific methodological combinations that ensure better results. These interactions between tools are usually carried out at the user level and are not supported by the tools themselves. In this work we present HeteroGenius, a framework conceived for developing tools that allow users to perform hybrid analysis of heterogeneous software specifications. HeteroGenius was designed prioritising the possibility of adding new specification languages and analysis tools, and enabling a synergic relation of the techniques under a graphical interface satisfying several well-known usability-enhancement criteria. As a case study we implemented the functionality of Dynamite on top of HeteroGenius.

  12. Using recurrence plot analysis for software execution interpretation and fault detection

    Science.gov (United States)

    Mosdorf, M.

    2015-09-01

    This paper shows a method targeted at software execution interpretation and fault detection using recurrence plot analysis. In the proposed approach, recurrence plot analysis is applied to a software execution trace that contains the executed assembly instructions. Results of this analysis are subject to further processing with the PCA (Principal Component Analysis) method, which reduces the number of coefficients used for software execution classification. This method was used for the analysis of five algorithms: Bubble Sort, Quick Sort, Median Filter, FIR, SHA-1. Results show that some of the collected traces could be easily assigned to particular algorithms (logs from the Bubble Sort and FIR algorithms) while others are more difficult to distinguish.
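
    The underlying construction is compact. A minimal sketch of a binary recurrence matrix for a numerically encoded instruction trace (the encoding and threshold are assumptions here; the paper's exact settings are not given in the abstract):

      import numpy as np

      def recurrence_matrix(trace, eps):
          """R[i, j] = 1 where samples i and j of the (numerically
          encoded) execution trace lie within distance eps."""
          x = np.asarray(trace, dtype=float)
          return (np.abs(x[:, None] - x[None, :]) < eps).astype(int)

      # Toy trace: opcode IDs of a loop repeating the same three instructions
      print(recurrence_matrix([3, 7, 9, 3, 7, 9], eps=0.5))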

  13. Hazard Analysis of Software Requirements Specification for Process Module of FPGA-based Controllers in NPP

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Sejin; Kim, Eui-Sub; Yoo, Junbeom [Konkuk University, Seoul (Korea, Republic of)]; Keum, Jong Yong; Lee, Jang-Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)]

    2016-10-15

    Software in the PLCs and FPGAs that are used to develop I and C systems should also be analyzed for hazards and risks before use. NUREG/CR-6430 proposes a method for performing software hazard analysis. It suggests analysis techniques for software-affected hazards and states that software hazard analysis should be performed across the software life cycle phases, such as requirements analysis, design, detailed design, and implementation. It also provides guide phrases for applying software hazard analysis. HAZOP (Hazard and Operability analysis) is one of the analysis techniques introduced in NUREG/CR-6430, and it is a useful technique for working with guide phrases. HAZOP is sometimes used to analyze the safety of software. The analysis method of NUREG/CR-6430 had been used for PLC software development in Korean nuclear power plants. In that work, appropriate guide phrases and analysis processes were selected for efficient application, and it was identified that NUREG/CR-6430 provides applicable methods for software hazard analysis. We perform software hazard analysis of an FPGA software requirements specification with two approaches: NUREG/CR-6430, and HAZOP using general guide words (GW). We also perform a comparative analysis of the two. The NUREG/CR-6430 approach has several pros and cons compared with the HAZOP approach using general guide words. It is sufficiently applicable for analyzing the software requirements specification of an FPGA.

  14. Change Impact Analysis of Crosscutting in Software Architectural Design

    NARCIS (Netherlands)

    van den Berg, Klaas

    2006-01-01

    Software architectures should be amenable to changes in user requirements and implementation technology. The analysis of the impact of these changes can be based on traceability of architectural design elements. Design elements have dependencies with other software artifacts but also evolve in time.

  15. Development of design and analysis software for advanced nuclear system

    International Nuclear Information System (INIS)

    Wu Yican; Hu Liqin; Long Pengcheng; Luo Yuetong; Li Yazhou; Zeng Qin; Lu Lei; Zhang Junjun; Zou Jun; Xu Dezheng; Bai Yunqing; Zhou Tao; Chen Hongli; Peng Lei; Song Yong; Huang Qunying

    2010-01-01

    A series of professional codes, which are necessary software tools and data libraries for advanced nuclear system design and analysis, was developed by the FDS Team, including codes for automatic modeling, physics and engineering calculation, virtual simulation and visualization, system engineering and safety analysis, and the related database management, etc. The development of this software series was proposed as an exercise in the development of nuclear informatics. This paper introduces the main functions and key techniques of the software series, as well as some tests and practical applications. (authors)

  16. Automating risk analysis of software design models.

    Science.gov (United States)

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling, two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  17. Practicality for Software Hazard Analysis for Nuclear Safety I and C System

    International Nuclear Information System (INIS)

    Kim, Yong-Ho; Moon, Kwon-Ki; Chang, Young-Woo; Jeong, Soo-Hyun

    2016-01-01

    We are using the concept of system safety in engineering. It is difficult to make any system perfectly safe, and a complete system may not easily be achieved. The standard definition of a system from MIL-STD-882E is: "The organization of hardware, software, material, facilities, personnel, data, and services needed to perform a designated function within a stated environment with specified results." From the perspective of the system safety engineer and the hazard analysis process, software is considered a subsystem. Regarding hazard analysis, to date, methods for identifying software failures and determining their effects are still a research problem. Since the success of software development is based on rigorous testing of hardware and software, it is necessary to check the balance between software testing and hardware testing in terms of efficiency. Lessons learned and experience from similar systems are important for the work of hazard analysis. No major hazard has been issued for the software developed and verified in Korean NPPs. In addition to hazard analysis, software development and verification and validation were thoroughly performed. It is reasonable that test implementation, including the development of test cases, stress and abnormal conditions, error recovery situations, and high-risk hazardous situations, plays a key role in detecting and preventing software faults.

  19. Development of Cell Analysis Software for Cultivated Corneal Endothelial Cells.

    Science.gov (United States)

    Okumura, Naoki; Ishida, Naoya; Kakutani, Kazuya; Hongo, Akane; Hiwa, Satoru; Hiroyasu, Tomoyuki; Koizumi, Noriko

    2017-11-01

    To develop analysis software for cultured human corneal endothelial cells (HCECs). The software was designed to recognize cell borders and to provide parameters such as cell density, coefficient of variation, and polygonality of cultured HCECs based on phase contrast images. Cultured HCECs with high or low cell density were incubated with Ca-free and Mg-free phosphate-buffered saline for 10 minutes to reveal the cell borders and were then analyzed with the software (n = 50). Phase contrast images showed that cell borders were not distinctly outlined, but these borders became more distinctly outlined after phosphate-buffered saline treatment and were recognized by the cell analysis software. The cell density value provided by the software was similar to that obtained by manual cell counting by an experienced researcher. Morphometric parameters, such as the coefficient of variation and polygonality, were also produced by the software, and these values were significantly correlated with cell density (Pearson correlation coefficients -0.62 and 0.63, respectively). The software described here provides morphometric information from phase contrast images, and it enables objective and noninvasive quality assessment for tissue engineering therapy of the corneal endothelium.
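
    For readers unfamiliar with these morphometric parameters, the sketch below shows one plausible way to compute them from per-cell segmentation output; the formulas (density, coefficient of variation of cell area, percentage of hexagonal cells) are standard, but the function and data layout are assumptions, not the published tool's code:

        # Hedged sketch of the morphometric parameters named in the abstract,
        # computed from hypothetical per-cell segmentation results (areas in
        # square micrometers and vertex counts per cell).
        import statistics

        def morphometrics(cell_areas_um2, cell_sides, field_area_mm2):
            density = len(cell_areas_um2) / field_area_mm2            # cells/mm^2
            cv = statistics.pstdev(cell_areas_um2) / statistics.mean(cell_areas_um2)
            hexagonality = 100.0 * sum(1 for s in cell_sides if s == 6) / len(cell_sides)
            return density, cv, hexagonality

        areas = [310, 295, 420, 380, 330]   # toy segmentation output
        sides = [6, 6, 5, 7, 6]
        print(morphometrics(areas, sides, field_area_mm2=0.0017))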

  20. Study of gamma-ray analysis software. Application to activation analysis of geological samples

    International Nuclear Information System (INIS)

    Silva, Luiz Roberto Nogueira da

    1998-01-01

    A comparative evaluation of the gamma-ray analysis software VISPECT, in relation to two commercial gamma-ray analysis software packages, OMNIGAM (EG and G Ortec) and SAMPO 90 (Canberra), was performed. For this evaluation, artificial gamma-ray spectra were created, presenting peaks of different intensities located at four different regions of the spectrum. Multiplet peaks with equal and different intensities, but with different channel separations, were also created. The results obtained showed a good performance of VISPECT in detecting and analysing single and multiplet peaks of different intensities in the gamma-ray spectrum. Neutron activation analysis of the geological reference material GS-N (IWG-GIT) and of the granite G-94, used in a Proficiency Testing Trial of Analytical Geochemistry Laboratories, was also performed in order to evaluate the VISPECT software in the analysis of real samples. The results obtained using VISPECT were as good as or better than the ones obtained using the other programs. (author)

  1. Development of interactive software for fuel management analysis

    International Nuclear Information System (INIS)

    Graves, H.W. Jr.

    1986-01-01

    Electronic computation plays a central part in engineering analysis of all types. Utilization of microcomputers for calculations that were formerly carried out on large mainframe computers presents a unique opportunity to develop software that not only takes advantage of the lower cost of using these machines, but also increases the efficiency of the engineers performing these calculations. This paper reviews the use of electronic computers in engineering analysis, discusses the potential for microcomputer utilization in this area, and describes a series of steps to be followed in software development that can yield significant gains in engineering design efficiency.

  2. MAUS: MICE Analysis User Software

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The Muon Ionization Cooling Experiment (MICE) has developed the MICE Analysis User Software (MAUS) to simulate and analyse experimental data. It serves as the primary codebase for the experiment, providing for online data quality checks and offline batch simulation and reconstruction. The code is structured in a Map-Reduce framework to allow parallelization whether on a personal machine or in the control room. Various software engineering practices from industry are also used to ensure correct and maintainable physics code, which include unit, functional and integration tests, continuous integration and load testing, code reviews, and distributed version control systems. Lastly, there are various small design decisions like using JSON as the data structure, using SWIG to allow developers to write components in either Python or C++, or using the SCons python-based build system that may be of interest to other experiments.
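
    A toy sketch of the Map-Reduce pattern described above, with events carried as JSON documents and the map step parallelized; this mirrors the structure of such a framework, not MAUS's actual mapper/reducer API:

        # Minimal Map-Reduce sketch with JSON-carried events; names and the
        # "reconstruction" stand-in are illustrative assumptions.
        import json
        from multiprocessing import Pool

        def map_event(event_json):
            # Map step: transform one event independently of all others.
            event = json.loads(event_json)
            event["energy_sum"] = sum(event["hits"])
            return event

        def reduce_events(events):
            # Reduce step: aggregate per-run statistics from mapped events.
            return {"n_events": len(events),
                    "mean_energy": sum(e["energy_sum"] for e in events) / len(events)}

        if __name__ == "__main__":
            raw = [json.dumps({"hits": [1.0, 2.0]}), json.dumps({"hits": [3.0]})]
            with Pool(2) as pool:              # parallel even on a personal machine
                mapped = pool.map(map_event, raw)
            print(reduce_events(mapped))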

  3. An analysis software of tritium distribution in food and environmental water in China

    International Nuclear Information System (INIS)

    Li Wenhong; Xu Cuihua; Ren Tianshan; Deng Guilong

    2006-01-01

    Objective: The purpose of developing this analysis software for the distribution of tritium in food and environmental water is to collect tritium monitoring data, to analyze the data automatically, statistically, and graphically, and to make the data easy to study and share. Methods: Based on the data obtained before, the analysis software was written using VC++.NET as the development tool. The software first transfers data from EXCEL into a database. It also has a data-append function, so operators can easily add new monitoring data. Results: Once the monitoring data saved as EXCEL files by the original researchers have been turned into a database, they can be accessed easily. The software provides a tool for analyzing the distribution of tritium. Conclusion: This software is a first attempt at analyzing data on tritium levels in food and environmental water in China. Data retrieval, searching, and analysis become easy and direct with the software. (authors)

  4. SIMPATIQCO: a server-based software suite which facilitates monitoring the time course of LC-MS performance metrics on Orbitrap instruments.

    Science.gov (United States)

    Pichler, Peter; Mazanek, Michael; Dusberger, Frederico; Weilnböck, Lisa; Huber, Christian G; Stingl, Christoph; Luider, Theo M; Straube, Werner L; Köcher, Thomas; Mechtler, Karl

    2012-11-02

    While the performance of liquid chromatography (LC) and mass spectrometry (MS) instrumentation continues to increase, applications such as analyses of complete or near-complete proteomes and quantitative studies require constant and optimal system performance. For this reason, research laboratories and core facilities alike are recommended to implement quality control (QC) measures as part of their routine workflows. Many laboratories perform sporadic quality control checks. However, successive and systematic longitudinal monitoring of system performance would be facilitated by dedicated automatic or semiautomatic software solutions that aid an effortless analysis and display of QC metrics over time. We present the software package SIMPATIQCO (SIMPle AuTomatIc Quality COntrol) designed for evaluation of data from LTQ Orbitrap, Q-Exactive, LTQ FT, and LTQ instruments. A centralized SIMPATIQCO server can process QC data from multiple instruments. The software calculates QC metrics supervising every step of data acquisition from LC and electrospray to MS. For each QC metric the software learns the range indicating adequate system performance from the uploaded data using robust statistics. Results are stored in a database and can be displayed in a comfortable manner from any computer in the laboratory via a web browser. QC data can be monitored for individual LC runs as well as plotted over time. SIMPATIQCO thus assists the longitudinal monitoring of important QC metrics such as peptide elution times, peak widths, intensities, total ion current (TIC) as well as sensitivity, and overall LC-MS system performance; in this way the software also helps identify potential problems. The SIMPATIQCO software package is available free of charge.
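
    One plausible way to "learn the range indicating adequate system performance" with robust statistics is a band of median plus or minus k times the MAD, as sketched below; the exact estimator SIMPATIQCO uses is not stated in the abstract, so treat this as an assumption:

        # Hedged sketch: a robust "adequate performance" band for one QC
        # metric (median +/- k * scaled MAD), resistant to outlier runs.
        import statistics

        def robust_range(values, k=3.0):
            med = statistics.median(values)
            mad = statistics.median(abs(v - med) for v in values)
            half_width = k * 1.4826 * mad   # 1.4826 scales MAD to sigma for normal data
            return med - half_width, med + half_width

        peak_widths = [12.1, 11.8, 12.4, 12.0, 30.5, 12.2]   # one outlier run
        lo, hi = robust_range(peak_widths)
        print([v for v in peak_widths if not lo <= v <= hi])  # flags the outlier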

  5. Validation suite for MCNP

    International Nuclear Information System (INIS)

    Mosteller, Russell D.

    2002-01-01

    Two validation suites, one for criticality and another for radiation shielding, have been defined and tested for the MCNP Monte Carlo code. All of the cases in the validation suites are based on experiments so that calculated and measured results can be compared in a meaningful way. The cases in the validation suites are described, and results from those cases are discussed. For several years, the distribution package for the MCNP Monte Carlo code has included an installation test suite to verify that MCNP has been installed correctly. However, the cases in that suite have been constructed primarily to test options within the code and to execute quickly. Consequently, they do not produce well-converged answers, and many of them are physically unrealistic. To remedy these deficiencies, sets of validation suites are being defined and tested for specific types of applications. All of the cases in the validation suites are based on benchmark experiments. Consequently, the results from the measurements are reliable and quantifiable, and calculated results can be compared with them in a meaningful way. Currently, validation suites exist for criticality and radiation-shielding applications.

  6. Ionospheric Mapping Software Ensures Accuracy of Pilots' GPS

    Science.gov (United States)

    2015-01-01

    IonoSTAGE and SuperTruth software are part of a suite created at the Jet Propulsion Laboratory to enable the Federal Aviation Administration's Wide Area Augmentation System, which provides pinpoint accuracy in aircraft GPS units. The system, used by more than 73,000 planes, facilitates landings under adverse conditions at small airports. In 2013, IonoSTAGE and SuperTruth found their first commercial license when NEC, based in Japan, with US headquarters in Irving, Texas, licensed the entire suite.

  7. A reflection on Software Engineering in HEP

    International Nuclear Information System (INIS)

    Carminati, Federico

    2012-01-01

    High Energy Physics (HEP) has been making very extensive usage of computers to achieve its research goals. Fairly large program suites have been developed, maintained and used over the years and it is fair to say that, overall, HEP has been successful in software development. Yet, HEP software development has not used classical Software Engineering techniques, which have been invented and refined to help the production of large programmes. In this paper we will review the development of HEP code with its strengths and weaknesses. Using several well-known HEP software projects as examples, we will try to demonstrate that our community has used a form of Software Engineering, albeit in an informal manner. The software development techniques employed in these projects are indeed very close in many aspects to the modern tendencies of Software Engineering itself, in particular the so-called “agile technologies”. The paper will conclude with an outlook on the future of software development in HEP.

  8. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Directory of Open Access Journals (Sweden)

    Jyri Pakarinen

    2010-01-01

    Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
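
    As a minimal example of the kind of measurement such a tool automates, the sketch below estimates total harmonic distortion (THD) by driving a memoryless nonlinearity with a sine and inspecting its spectrum; the toolkit itself offers richer analyses, and the tanh stage here is only a stand-in for a real distorting device:

        # Hedged sketch: THD of a memoryless nonlinearity via FFT.
        import numpy as np

        fs, f0, n = 48000, 1000, 48000
        t = np.arange(n) / fs
        x = np.sin(2 * np.pi * f0 * t)
        y = np.tanh(3.0 * x)                    # stand-in for a distorting device

        spectrum = np.abs(np.fft.rfft(y * np.hanning(n)))
        bins = [round(k * f0 * n / fs) for k in range(1, 6)]  # harmonics 1..5
        fundamental, harmonics = spectrum[bins[0]], spectrum[bins[1:]]
        thd = np.sqrt(np.sum(harmonics ** 2)) / fundamental
        print(f"THD: {100 * thd:.1f}%")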

  9. BASTILLE - Better Analysis Software to Treat ILL Experiments - a unified, unifying approach to data reduction and analysis

    International Nuclear Information System (INIS)

    Johnson, M.

    2011-01-01

    Data reduction and analysis is a key component in the production of scientific results. If this component, like any other in the chain, is weak, the final output is compromised. The current situation for data reduction and analysis may be regarded as adequate, but it is variable, depending on the instrument, and should be improved. In particular the delivery of new and upgraded instruments in Millennium Phase I and those proposed for Phase II will bring new demands and challenges for software development. Failure to meet these challenges will hamper the exploitation of higher data rates and the delivery of new science. The proposed project is to provide a single, underpinning software infrastructure for data analysis, which would ensure: 1) a clear vision of software provision at ILL; 2) a clear role for the 'Computing for Science' Group (CS) in maintaining and developing the infrastructure and the codes; 3) a well-defined framework for recruiting and training CS staff; 4) ease and efficiency of development within a common, well-defined software environment; 5) safeguarding of key, existing software; and 6) ease of communication with other software like instrument control software to allow real-time data analysis and experiment control, or software from other institutes or sources.

  10. [Development of analysis software package for the two kinds of Japanese fluoro-d-glucose-positron emission tomography guideline].

    Science.gov (United States)

    Matsumoto, Keiichi; Endo, Keigo

    2013-06-01

    Two kinds of Japanese guidelines for the data acquisition protocol of oncology fluoro-D-glucose-positron emission tomography (FDG-PET)/computed tomography (CT) scans were created by the joint task force of the Japanese Society of Nuclear Medicine Technology (JSNMT) and the Japanese Society of Nuclear Medicine (JSNM), and published in Kakuigaku-Gijutsu 27(5): 425-456, 2007 and 29(2): 195-235, 2009. These guidelines aim to standardize PET image quality among facilities and different PET/CT scanner models. The objective of this study was to develop a personal computer-based performance measurement and image quality processor for the two kinds of Japanese guidelines for oncology (18)F-FDG PET/CT scans. We call this software package the "PET quality control tool" (PETquact). Microsoft Corporation's Windows™ is used as the operating system for PETquact, which requires 1070×720 image resolution and includes 12 different applications. The accuracy was examined for numerous applications of PETquact. For example, in the sensitivity application, the system sensitivity measurement results were equivalent when comparing the two PET sinograms obtained from PETquact and from the report. PETquact is suited to analysis under the two kinds of Japanese guidelines, and it performs excellently in performance measurement and image quality analysis. PETquact can be used at any facility if the software package is installed on a laptop computer.

  11. Modeling the Impact of Space Suit Components and Anthropometry on the Center of Mass of a Seated Crewmember

    Science.gov (United States)

    Rajulu, Sudhakar; Blackledge, Christopher; Ferrer, Mike; Margerum, Sarah

    2009-01-01

    subset of the possible maximum and minimum sized crewmembers, were segmented using point-cloud software to create 17 major body segments. The general approach used to calculate the human mass properties was to utilize center of volume outputs from the software for each body segment and apply a homogeneous density function to determine segment mass 3-D coordinates. Suit components, based on the current consensus regarding predicted suit configuration values, were treated as point masses and were positioned using vector mathematics along the body segments based on anthropometry and COM position. A custom MATLAB® script then articulates the body segment and suit positions into a selected seated configuration, using joint angles that characterize a standard seated position and a CEV-specific seated position. Additional MATLAB® scripts are finally used to calculate the composite COM positions in 3-D space for all 12 manikins in both suited and unsuited conditions for both seated configurations. The analysis focused on two aspects: (1) quantifying how much the whole-body COM varied from the smallest to the largest subject and (2) the impact of the suit components on the overall COM in each seat configuration. The location of the anterior-posterior COM across all boundary manikins varied by approximately 7 cm, the vertical COM varied by approximately 9-10 cm, and the mediolateral COM varied by approximately 1.2 cm from the midline sagittal plane for both seat configurations. This variation was surprisingly large given the relative proportionality of the mass distribution of the human body. The suit components caused an anterior shift of the total COM by approximately 2 cm and a shift to the right along the mediolateral axis of 0.4 cm for both seat configurations. When the seat configuration is in the standard posture, the suited vertical COM shifts inferiorly by up to 1 cm, whereas in the CEV posture the vertical COM has no appreciable change. These general differences were due to the
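
    The underlying calculation is a mass-weighted average of segment and suit-component positions, r_COM = (sum of m_i * r_i) / (sum of m_i). A minimal sketch, with invented toy masses and positions rather than the study's manikin data:

        # Hedged sketch of a composite center-of-mass calculation: body
        # segments and suit components combined as point masses.
        import numpy as np

        def composite_com(masses, positions):
            m = np.asarray(masses, dtype=float)
            r = np.asarray(positions, dtype=float)      # one xyz row per mass
            return (m[:, None] * r).sum(axis=0) / m.sum()

        body_masses = [30.0, 8.0, 8.0]                  # torso, thighs (kg, toy)
        body_coms = [[0.0, 0.0, 0.5], [0.1, 0.1, 0.2], [0.1, -0.1, 0.2]]
        suit_masses = [12.0]                            # one suit component as a point mass
        suit_coms = [[-0.15, 0.0, 0.45]]

        print(composite_com(body_masses + suit_masses, body_coms + suit_coms))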

  12. APMS: An Integrated Suite of Tools for Measuring Performance and Safety

    Science.gov (United States)

    Statler, Irving C.; Lynch, Robert E.; Connors, Mary M. (Technical Monitor)

    1997-01-01

    This is a report of work in progress. In it, I summarize the status of the research and development of the Aviation Performance Measuring System (APMS) for managing, processing, and analyzing digital flight-recorded data. The objectives of the NASA-FAA APMS research project are to establish a sound scientific and technological basis for flight-data analysis, to define an open and flexible architecture for flight-data-analysis systems, and to articulate guidelines for a standardized database structure on which to continue to build future flight-data-analysis extensions. APMS will offer to the air transport community an open, voluntary standard for flight-data-analysis software, a standard that will help to ensure suitable functionality, and data interchangeability, among competing software programs. APMS will develop and document the methodologies, algorithms, and procedures for data management and analyses to enable users to easily interpret the implications regarding safety and efficiency of operations. APMS does not entail the implementation of a nationwide flight-data-collection system. It is intended to provide technical tools to ease the large-scale implementation of flight-data analyses at both the air-carrier and the national-airspace levels in support of their Flight Operations and Quality Assurance (FOQA) Programs and Advanced Qualifications Programs (AQP). APMS cannot meet its objectives unless it develops tools that go substantially beyond the capabilities of the current commercially available software and supporting analytic methods that are mainly designed to count special events. These existing capabilities, while of proven value, were created primarily with the needs of air crews in mind. APMS tools must serve the needs of the government and air carriers, as well as air crews, to fully support the FOQA and AQP programs. They must be able to derive knowledge not only through the analysis of single flights (special-event detection), but through

  13. New software library of geometrical primitives for modelling of solids used in Monte Carlo detector simulations

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    We present our effort for the creation of a new software library of geometrical primitives, which are used for solid modelling in Monte Carlo detector simulations. We plan to replace and unify current geometrical primitive classes in the CERN software projects Geant4 and ROOT with this library. Each solid is represented by a C++ class with methods suited for measuring distances of particles from the surface of a solid and for determination as to whether the particles are located inside, outside or on the surface of the solid. We use numerical tolerance for determining whether the particles are located on the surface. The class methods also contain basic support for visualization. We use dedicated test suites for validation of the shape codes. These include also special performance and numerical value comparison tests for help with analysis of possible candidates of class methods as well as to verify that our new implementation proposals were designed and implemented properly. Currently, bridge classes are u...
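
    A sphere makes a compact stand-in for the interface sketched above: a signed distance-to-surface query plus an inside/surface/outside classifier that applies a numerical tolerance. Class and method names below are illustrative assumptions, not the Geant4/ROOT API:

        # Hedged sketch of a solid-primitive interface with tolerance-based
        # surface classification; a sphere stands in for the general shapes.
        import math

        TOLERANCE = 1e-9

        class Sphere:
            def __init__(self, radius):
                self.radius = radius

            def safety(self, point):
                # Signed distance to the surface (negative means inside).
                return math.dist(point, (0.0, 0.0, 0.0)) - self.radius

            def locate(self, point):
                d = self.safety(point)
                if abs(d) <= TOLERANCE:
                    return "surface"
                return "inside" if d < 0 else "outside"

        s = Sphere(1.0)
        print(s.locate((0.0, 0.0, 1.0)), s.locate((0.5, 0.0, 0.0)), s.locate((2.0, 0.0, 0.0)))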

  14. Quick Spacecraft Thermal Analysis Tool, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — For spacecraft design and development teams concerned with cost and schedule, the Quick Spacecraft Thermal Analysis Tool (QuickSTAT) is an innovative software suite...

  15. Crisis management for software development and knowledge transfer

    CERN Document Server

    Zykov, Sergey V

    2016-01-01

    This well-structured book discusses lifecycle optimization of software projects for crisis management by means of software engineering methods and tools. Its outcomes are based on lessons learned from the software engineering crisis which started in the 1960s. The book presents a systematic approach to overcoming the crisis in software engineering, which depends not only on technology-related but also on human-related factors. It proposes an adaptive methodology for software product development, which optimizes the software product lifecycle in order to avoid “local” crises of software production. The general lifecycle pattern and its stages are discussed, and their impact on the time and budget of software product development is analyzed. The book identifies key advantages and disadvantages of the various models selected and concludes that there is no “silver bullet”, or universal model, which suits all software products equally well. It approaches software architecture in terms of process, dat...

  16. OST: analysis tool for real-time software by simulation of hardware and software environments

    International Nuclear Information System (INIS)

    Boulc'h; Le Meur; Lapassat; Salichon; Segalard

    1988-07-01

    The use of microprocessor systems in nuclear installation control requires a high level of operational safety for the installation and for protection of the environment. For the safety analysis of these installations, the Institute of Protection and Nuclear Safety (IPSN) needs tools that permit checks to be made throughout the whole life of the software. The simulation and test tool (OST) that has been created is implemented entirely in software. It runs on VAX computers and is easily portable to other machines [fr]

  17. The software analysis project for the Office of Human Resources

    Science.gov (United States)

    Tureman, Robert L., Jr.

    1994-01-01

    There were two major sections of the project for the Office of Human Resources (OHR). The first section was to conduct a planning study to analyze software use, with the goal of recommending software purchases and determining whether the need exists for a file server. The second section was analysis and distribution planning for a retirement planning computer program entitled VISION, provided by NASA Headquarters. The software planning study was developed to help OHR analyze the current administrative desktop computing environment and make decisions regarding software acquisition and implementation. There were three major areas addressed by the study: the current environment, new software requirements, and strategies regarding the implementation of a server in the Office. To gather data on the current environment, employees were surveyed and an inventory of computers was produced. The surveys were compiled and analyzed by the ASEE fellow, with interpretation help from OHR staff. New software requirements represented a compilation and analysis of the surveyed requests of OHR personnel. Finally, the information on the use of a server represents research done by the ASEE fellow and analysis of survey data to determine software requirements for a server. This included selection of a methodology to estimate the number of copies of each software program required given current use and estimated growth. The report presents the results of the computing survey, a description of the current computing environment, recommendations for changes in the computing environment, current software needs, management advantages of using a server, and management considerations in the implementation of a server. In addition, detailed specifications were presented for the hardware and software recommendations to offer a complete picture to OHR management. The retirement planning computer program available to NASA employees will aid in long-range retirement planning. The intended audience is the NASA civil

  18. Space Suit Performance: Methods for Changing the Quality of Quantitative Data

    Science.gov (United States)

    Cowley, Matthew; Benson, Elizabeth; Rajulu, Sudhakar

    2014-01-01

    NASA is currently designing a new space suit capable of working in deep space and on Mars. Designing a suit is very difficult and often requires trade-offs between performance, cost, mass, and system complexity. To verify that new suits will enable astronauts to perform to their maximum capacity, prototype suits must be built and tested with human subjects. However, engineers and flight surgeons often have difficulty understanding and applying traditional representations of human data without training. To overcome these challenges, NASA is developing modern simulation and analysis techniques that focus on 3D visualization. Understanding actual performance early in the design cycle is extremely advantageous for increasing performance capabilities, reducing the risk of injury, and reducing costs. The primary objective of this project was to test modern simulation and analysis techniques for evaluating the performance of a human operating in extra-vehicular space suits.

  19. Potential of a suite of robot/computer-assisted motivating systems for personalized, home-based, stroke rehabilitation

    Directory of Open Access Journals (Sweden)

    Feng Xin

    2007-03-01

    Background: There is a need to improve semi-autonomous stroke therapy in home environments, which are often characterized by low supervision from clinical experts and low extrinsic motivation. Our distributed device approach to this problem consists of an integrated suite of low-cost robotic/computer-assistive technologies driven by a novel universal access software framework called UniTherapy. Our design strategy for personalizing the therapy, providing extrinsic motivation, and assessing outcomes is presented and evaluated. Methods: Three studies were conducted to evaluate the potential of the suite. A conventional force-reflecting joystick, a modified joystick therapy platform (TheraJoy), and a steering wheel platform (TheraDrive) were tested separately with the UniTherapy software. Stroke subjects with hemiparesis and able-bodied subjects completed tracking activities with the devices in different positions. We quantified motor performance across subject groups and across device platforms, and muscle activation across devices at two positions in the arm workspace. Results: Trends in the assessment metrics were consistent across devices, with able-bodied and high-functioning stroke subjects being significantly more accurate and quicker in their motor performance than low-functioning subjects. Muscle activation patterns were different for shoulder and elbow across different devices and locations. Conclusion: The Robot/CAMR suite has potential for stroke rehabilitation. By manipulating hardware and software variables, we can create personalized therapy environments that engage patients, address their therapy needs, and track their progress. A larger longitudinal study is still needed to evaluate these systems in under-supervised environments such as the home.

  20. Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages

    Science.gov (United States)

    Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro

    2017-01-01

    Computer Assisted Qualitative Data Analysis Software (CAQDAS) are tools that help researchers to develop qualitative research projects. These software packages help the users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…

  1. Monitoring and reporting software for the coal industry

    Energy Technology Data Exchange (ETDEWEB)

    Okanovic, M. [Advanced Systems Integration Pty Ltd. (Australia)

    2001-08-01

    This paper explains the development and launch of MineSuite software, designed to facilitate report production in coal mines. Advanced Systems Integration (ASI) has developed a system that is generic to all mining operations. Mine personnel can define all processes, KPIs, equipment, delays, reports etc. that are vital in monitoring mining operations. Its capabilities have been realised in opencut, underground and preparation plants throughout Australia. Written in Java, MineSuite is a multi-user, multi-threaded, multi-tasking distributed application. 3 figs.

  2. The MicroAnalysis Toolkit: X-ray Fluorescence Image Processing Software

    International Nuclear Information System (INIS)

    Webb, S. M.

    2011-01-01

    The MicroAnalysis Toolkit is an analysis suite designed for the processing of x-ray fluorescence microprobe data. The program contains a wide variety of analysis tools, including image maps, correlation plots, simple image math, image filtering, multiple energy image fitting, semi-quantitative elemental analysis, x-ray fluorescence spectrum analysis, principal component analysis, and tomographic reconstructions. To be as widely useful as possible, data formats from many synchrotron sources can be read by the program, with more formats available by request. An overview of the most common features will be presented.

  3. A relational approach to support software architecture analysis

    NARCIS (Netherlands)

    Feijs, L.M.G.; Krikhaar, R.L.; van Ommering, R.C.

    1998-01-01

    This paper reports on our experience with a relational approach to support the analysis of existing software architectures. The analysis options provide for visualization and view calculation. The approach has been applied for reverse engineering. It is also possible to check concrete designs

  4. Software Tools to Support the Assessment of System Health

    Science.gov (United States)

    Melcher, Kevin J.

    2013-01-01

    This presentation provides an overview of three software tools that were developed by the NASA Glenn Research Center to support the assessment of system health: the Propulsion Diagnostic Method Evaluation Strategy (ProDIMES), the Systematic Sensor Selection Strategy (S4), and the Extended Testability Analysis (ETA) tool. Originally developed to support specific NASA projects in aeronautics and space, these software tools are currently available to U.S. citizens through the NASA Glenn Software Catalog. The ProDIMES software tool was developed to support a uniform comparison of propulsion gas path diagnostic methods. Methods published in the open literature are typically applied to dissimilar platforms with different levels of complexity. They often address different diagnostic problems and use inconsistent metrics for evaluating performance. As a result, it is difficult to perform a one-to-one comparison of the various diagnostic methods. ProDIMES solves this problem by serving as a theme problem to aid in propulsion gas path diagnostic technology development and evaluation. The overall goal is to provide a tool that will serve as an industry standard and will truly facilitate the development and evaluation of significant Engine Health Management (EHM) capabilities. ProDIMES has been developed under a collaborative project of The Technical Cooperation Program (TTCP) based on feedback provided by individuals within the aircraft engine health management community. The S4 software tool provides a framework that supports the optimal selection of sensors for health management assessments. S4 is structured to accommodate user-defined applications, diagnostic systems, search techniques, and system requirements/constraints. It identifies one or more sensor suites that maximize diagnostic performance while meeting other user-defined system requirements that are presumed to exist. S4 provides a systematic approach for evaluating combinations of sensors to determine the set or sets of
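
    The S4 search can be pictured as scoring combinations of candidate sensors against a user-defined performance measure under constraints; the brute-force sketch below (with an invented additive score and cost cap) illustrates the framing only, not S4's actual search techniques:

        # Hedged sketch: exhaustive evaluation of sensor suites against a toy
        # additive performance score, subject to a cost constraint.
        from itertools import combinations

        # candidate sensor -> (toy performance contribution, toy cost)
        sensors = {"EGT": (0.40, 3), "N1": (0.35, 2), "FF": (0.30, 2), "P3": (0.25, 4)}

        def best_suites(budget, size):
            scored = []
            for suite in combinations(sensors, size):
                score = sum(sensors[s][0] for s in suite)
                cost = sum(sensors[s][1] for s in suite)
                if cost <= budget:                 # constraint check
                    scored.append((score, cost, suite))
            return sorted(scored, reverse=True)    # best score first

        for score, cost, suite in best_suites(budget=5, size=2):
            print(suite, f"score={score:.2f}", f"cost={cost}")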

  5. Exploration Space Suit Architecture: Destination Environmental-Based Technology Development

    Science.gov (United States)

    Hill, Terry R.

    2010-01-01

    This paper picks up where EVA Space Suit Architecture: Low Earth Orbit Vs. Moon Vs. Mars (Hill, Johnson, IEEEAC paper #1209) left off in the development of a space suit architecture that is modular in design and interfaces and could be reconfigured to meet the mission, or reconfigured during any given mission, depending on the tasks or destination. This paper walks through the continued development of a space suit system architecture and how it should evolve to meet the future exploration EVA needs of the United States space program. In looking forward to future US space exploration, and in determining how the work performed to date in the CxP would map to a future space suit architecture with maximum re-use of technology and functionality, a series of thought exercises and analyses have provided a strong indication that the CxP space suit architecture is well postured to provide a viable solution for future exploration missions. Through the destination environmental analysis presented in this paper, the modular architecture approach provides the lowest mass and lowest mission cost for the protection of the crew, given any human mission outside of low Earth orbit. Some of the studies presented here provide a look at, and validation of, the non-environmental design drivers that will become ever-increasingly important the further away from Earth humans venture and the longer they are away. Additionally, the analysis demonstrates a logical clustering of design environments that allows a very focused approach to technology prioritization, development, and design that will maximize the return on investment independent of any particular program and provide architecture and design solutions for space suit systems in time for, or ahead of, being required for any particular manned flight program in the future. In presenting the new approach to space suit design and interface definition, the discussion will show how the architecture is very adaptable to programmatic and funding changes with

  6. NASA's Software Safety Standard

    Science.gov (United States)

    Ramsay, Christopher M.

    2007-01-01

    NASA relies more and more on software to control, monitor, and verify its safety critical systems, facilities and operations. Since the 1960's there has hardly been a spacecraft launched that does not have a computer on board that will provide command and control services. There have been recent incidents where software has played a role in high-profile mission failures and hazardous incidents. For example, the Mars Orbiter, Mars Polar Lander, the DART (Demonstration of Autonomous Rendezvous Technology), and MER (Mars Exploration Rover) Spirit anomalies were all caused or contributed to by software. The Mission Control Centers for the Shuttle, ISS, and unmanned programs are highly dependent on software for data displays, analysis, and mission planning. Despite this growing dependence on software control and monitoring, there has been little to no consistent application of software safety practices and methodology to NASA's projects with safety critical software. Meanwhile, academia and private industry have been stepping forward with procedures and standards for safety critical systems and software, for example Dr. Nancy Leveson's book Safeware: System Safety and Computers. The NASA Software Safety Standard, originally published in 1997, was widely ignored due to its complexity and poor organization. It also focused on concepts rather than definite procedural requirements organized around a software project lifecycle. Led by NASA Headquarters Office of Safety and Mission Assurance, the NASA Software Safety Standard has recently undergone a significant update. This new standard provides the procedures and guidelines for evaluating a project for safety criticality and then lays out the minimum project lifecycle requirements to assure the software is created, operated, and maintained in the safest possible manner. This update of the standard clearly delineates the minimum set of software safety requirements for a project without detailing the implementation for those

  7. Deep in Data: Empirical Data Based Software Accuracy Testing Using the Building America Field Data Repository: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Neymark, J.; Roberts, D.

    2013-06-01

    An opportunity is available for using home energy consumption and building description data to develop a standardized accuracy test for residential energy analysis tools. That is, to test the ability of uncalibrated simulations to match real utility bills. Empirical data collected from around the United States have been translated into a uniform Home Performance Extensible Markup Language format that may enable software developers to create translators to their input schemes for efficient access to the data. This may facilitate the possibility of modeling many homes expediently, and thus implementing software accuracy test cases by applying the translated data. This paper describes progress toward, and issues related to, developing a usable, standardized, empirical data-based software accuracy test suite.

  8. REVEAL - A tool for rule driven analysis of safety critical software

    International Nuclear Information System (INIS)

    Miedl, H.; Kersken, M.

    1998-01-01

    As the determination of ultrahigh reliability figures for safety critical software is hardly possible, national and international guidelines and standards mainly give requirements for the qualitative evaluation of software. An analysis of whether all these requirements are fulfilled is time- and effort-consuming and prone to errors if performed manually by analysts, and should instead be delegated to tools as far as possible. There are many ''general-purpose'' software analysis tools, both static and dynamic, which help in analyzing the source code. However, they are not designed to assess adherence to specific requirements of guidelines and standards in the nuclear field. Against the background of the development of I and C systems in the nuclear field which are based on digital techniques and implemented in a high level language, it is essential that the assessor or licenser has a tool with which he can automatically and uniformly qualify as many aspects as possible of the high level language software. For this purpose the software analysis tool REVEAL has been developed at ISTec and the Halden Reactor Project. (author)

  9. Progress in Addressing DNFSB Recommendation 2002-1 Issues: Improving Accident Analysis Software Applications

    International Nuclear Information System (INIS)

    VINCENT, ANDREW

    2005-01-01

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 (''Quality Assurance for Safety-Related Software'') identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls to prevent or mitigate potential accidents. Over the last year, DOE has begun several processes and programs as part of the Implementation Plan commitments, and in particular, has made significant progress in addressing several sets of issues particularly important in the application of software for performing hazard and accident analysis. The work discussed here demonstrates that through these actions, Software Quality Assurance (SQA) guidance and software tools are available that can be used to improve the resulting safety analysis. Specifically, five of the primary actions corresponding to the commitments made in the Implementation Plan to Recommendation 2002-1 are identified and discussed in this paper. Included are the web-based DOE SQA Knowledge Portal and the Central Registry, guidance and gap analysis reports, an electronic bulletin board and discussion forum, and a DOE safety software guide. These SQA products can benefit DOE safety contractors in the development of hazard and accident analyses by precluding inappropriate software applications and utilizing best practices when incorporating software results into safety basis documentation. The improvement actions discussed here mark a beginning to establishing stronger, standards-compliant programs, practices, and processes in SQA among safety software users, managers, and reviewers throughout the DOE Complex. Additional effort is needed, however, particularly in: (1) processes to add new software applications to the DOE Safety Software Toolbox; (2) improving the effectiveness of software issue communication; and (3) promoting a safety software quality assurance culture.

  10. Applications of the BEam Cross section Analysis Software (BECAS)

    DEFF Research Database (Denmark)

    Blasques, José Pedro Albergaria Amaral; Bitsche, Robert; Fedorov, Vladimir

    2013-01-01

    A newly developed framework is presented for structural design and analysis of long slender beam-like structures, e.g., wind turbine blades. The framework is based on the BEam Cross section Analysis Software – BECAS – a finite element based cross section analysis tool. BECAS is used for the generation of beam finite element models which correctly account for effects stemming from material anisotropy and inhomogeneity in cross sections of arbitrary geometry. This type of modelling approach allows for an accurate yet computationally inexpensive representation of a general class of three...

  11. How to do Meta-Analysis using HLM software

    OpenAIRE

    Petscher, Yaacov

    2013-01-01

    This is a step-by-step presentation of how to run a meta-analysis using the HLM software. Because it is a variance-known model, it is not run through the GUI but in batch mode. These slides show how to prepare the data and run the analysis.
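
    The "variance-known" model weights each study's effect size by the inverse of its known sampling variance; the short sketch below computes the pooled fixed-effect estimate this way, independent of the HLM batch-mode syntax the slides cover:

        # Hedged sketch of the variance-known (fixed-effect) pooled estimate:
        # weight each effect d_i by 1 / v_i, its known sampling variance.
        effects   = [0.30, 0.12, 0.45]   # toy study effect sizes d_i
        variances = [0.02, 0.01, 0.05]   # their known sampling variances v_i

        weights = [1.0 / v for v in variances]
        pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
        se = (1.0 / sum(weights)) ** 0.5
        print(f"pooled effect = {pooled:.3f} (SE = {se:.3f})")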

  12. ANALYSIS OF DESIGN ELEMENTS IN SKI SUITS

    OpenAIRE

    Çileroğlu, Birsen; Kelleci Özeren, Figen; Kıvılcımlar, İnci Seda

    2015-01-01

    The popularity of ski sport in the 19th century necessitated a new perspective on protective skiing clothing against mountain climates and excessive cold. Winter clothing was the basis of ski attire during this period. By the beginning of the 20th century, lining cloth was used to minimize the wind effect. The difference between the men's and women's ski attire of the time consisted of knee-length skirts worn over golf trousers. Subsequent to the First World War, skiing suit models were influe...

  13. Development of neutron activation analysis software

    International Nuclear Information System (INIS)

    Wang Liyu

    1987-10-01

    Software for quantitative neutron activation analysis was developed to run under the MS/DOS operating system. The programmes of the IBM/SPAN package include: spectra file transfer from and to a Canberra Series 35 multichannel analyzer, spectrum evaluation routines, calibration subprogrammes, and quantitative analysis. The programmes for spectrum analysis include a fitting routine for the separation of multiple lines, which reproduces the peak shape with a combination of Gaussian and exponential terms. The programmes were tested on an IBM/AT-compatible computer. The programmes and their sources are available cost-free for IAEA projects of Technical Cooperation. 7 refs, 3 figs
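
    A modern equivalent of the peak-shape fit described above, assuming a Gaussian plus a low-side exponential tail and using scipy's curve_fit; the exact tail parameterization in the original programmes is an assumption here:

        # Hedged sketch: fit a Gaussian-plus-exponential-tail peak model to a
        # toy gamma-ray peak by least squares.
        import numpy as np
        from scipy.optimize import curve_fit

        def peak(x, amp, mu, sigma, tail, bkg):
            gauss = amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2)
            left = x < mu                                   # tail on the low side
            tail_term = np.where(left, amp * tail * np.exp((x - mu) / sigma), 0.0)
            return gauss + tail_term + bkg

        x = np.arange(100, 140, dtype=float)                # channel numbers
        truth = peak(x, 500, 120, 2.5, 0.05, 20)
        counts = np.random.default_rng(0).poisson(truth)    # toy counting noise

        popt, _ = curve_fit(peak, x, counts, p0=[400, 119, 2.0, 0.1, 10])
        print("fitted centroid:", popt[1])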

  14. Analysis for Parallel Execution without Performing Hardware/Software Co-simulation

    OpenAIRE

    Muhammad Rashid

    2014-01-01

    Hardware/software co-simulation improves the performance of embedded applications by executing the applications on a virtual platform before the actual hardware is available in silicon. However, the virtual platform of the target architecture is often not available during early stages of the embedded design flow. Consequently, analysis for parallel execution without performing hardware/software co-simulation is required. This article presents an analysis methodology for parallel execution of ...

  15. Space Suit Joint Torque Measurement Method Validation

    Science.gov (United States)

    Valish, Dana; Eversley, Karina

    2012-01-01

    In 2009 and early 2010, a test method was developed and performed to quantify the torque required to manipulate joints in several existing operational and prototype space suits. This was done in an effort to develop joint torque requirements appropriate for a new Constellation Program space suit system. The same test method was levied on the Constellation space suit contractors to verify that their suit design met the requirements. However, because the original test was set up and conducted by a single test operator there was some question as to whether this method was repeatable enough to be considered a standard verification method for Constellation or other future development programs. In order to validate the method itself, a representative subset of the previous test was repeated, using the same information that would be available to space suit contractors, but set up and conducted by someone not familiar with the previous test. The resultant data was compared using graphical and statistical analysis; the results indicated a significant variance in values reported for a subset of the re-tested joints. Potential variables that could have affected the data were identified and a third round of testing was conducted in an attempt to eliminate and/or quantify the effects of these variables. The results of the third test effort will be used to determine whether or not the proposed joint torque methodology can be applied to future space suit development contracts.

  16. Confirmatory Factor Analysis Alternative: Free, Accessible CBID Software.

    Science.gov (United States)

    Bott, Marjorie; Karanevich, Alex G; Garrard, Lili; Price, Larry R; Mudaranthakam, Dinesh Pal; Gajewski, Byron

    2018-02-01

    New software that performs Classical and Bayesian Instrument Development (CBID) is reported that seamlessly integrates expert (content validity) and participant data (construct validity) to produce entire reliability estimates with smaller sample requirements. The free CBID software can be accessed through a website and used by clinical investigators in new instrument development. Demonstrations are presented of the three approaches using the CBID software: (a) traditional confirmatory factor analysis (CFA), (b) Bayesian CFA using flat uninformative prior, and (c) Bayesian CFA using content expert data (informative prior). Outcomes of usability testing demonstrate the need to make the user-friendly, free CBID software available to interdisciplinary researchers. CBID has the potential to be a new and expeditious method for instrument development, adding to our current measurement toolbox. This allows for the development of new instruments for measuring determinants of health in smaller diverse populations or populations of rare diseases.

  17. MARTA: a suite of Java-based tools for assigning taxonomic status to DNA sequences.

    Science.gov (United States)

    Horton, Matthew; Bodenhausen, Natacha; Bergelson, Joy

    2010-02-15

    We have created a suite of Java-based software to better provide taxonomic assignments to DNA sequences. We anticipate that the program will be useful for protistologists, virologists, mycologists and other microbial ecologists. The program relies on NCBI utilities including the BLAST software and Taxonomy database and is easily manipulated at the command-line to specify a BLAST candidate's query-coverage or percent identity requirements; other options include the ability to set minimal consensus requirements (%) for each of the eight major taxonomic ranks (Domain, Kingdom, Phylum, ...) and whether to consider lower scoring candidates when the top-hit lacks taxonomic classification.
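
    The rank-wise consensus idea can be sketched as follows: walk the eight major ranks in order and assign a name only while agreement among the retained BLAST candidates stays above the user's threshold. The data layout and function are illustrative, not MARTA's Java implementation:

        # Hedged sketch: per-rank consensus over candidate lineages, stopping
        # at the first rank where agreement falls below the threshold (%).
        from collections import Counter

        RANKS = ["domain", "kingdom", "phylum", "class", "order",
                 "family", "genus", "species"]

        def consensus(lineages, threshold=80.0):
            assignment = {}
            for i, rank in enumerate(RANKS):
                names = [lin[i] for lin in lineages if len(lin) > i]
                if not names:
                    break
                name, count = Counter(names).most_common(1)[0]
                if 100.0 * count / len(names) < threshold:
                    break                      # first unresolved rank ends the walk
                assignment[rank] = name
            return assignment

        hits = [["Eukaryota", "Fungi", "Ascomycota"],
                ["Eukaryota", "Fungi", "Ascomycota"],
                ["Eukaryota", "Fungi", "Basidiomycota"]]
        print(consensus(hits))   # resolves domain and kingdom, not phylum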

  18. Research and Development on Food Nutrition Statistical Analysis Software System

    OpenAIRE

    Du Li; Ke Yun

    2013-01-01

    Designing and developing food nutrition component statistical analysis software can automate nutrition calculation, improve nutrition professionals' working efficiency, and achieve the informatization of nutrition propaganda and education. In the software development process, software engineering methods and database technology are used to calculate the human daily nutritional intake, and the intelligent system is used to evaluate the user's hea...

  19. Developing safety-critical software: a practical guide for aviation software and DO-178c compliance

    National Research Council Canada - National Science Library

    Rierson, Leanna

    2013-01-01

  20. DIII-D Thomson Scattering Diagnostic Data Acquisition, Processing and Analysis Software

    International Nuclear Information System (INIS)

    Middaugh, K.R.; Bray, B.D.; Hsieh, C.L.; McHarg, B.B.Jr.; Penaflor, B.G.

    1999-01-01

    One of the diagnostic systems critical to the success of the DIII-D tokamak experiment is the Thomson scattering diagnostic. This diagnostic is unique in that it measures local electron temperature and density: (1) at multiple locations within the tokamak plasma; and (2) at different times throughout the plasma duration. Thomson ''raw'' data are digitized signals of scattered light, measured at different times and locations, from the laser beam paths fired into the plasma. Real-time acquisition of this data is performed by specialized hardware. Once obtained, the raw data are processed into meaningful temperature and density values which can be analyzed for measurement quality. This paper will provide an overview of the entire Thomson scattering diagnostic software and will focus on the data acquisition, processing, and analysis software implementation. The software falls into three general categories: (1) Set-up and Control: Initializes and controls all Thomson hardware and software, synchronizes with other DIII-D computers, and invokes other Thomson software as appropriate. (2) Data Acquisition and Processing: Obtains raw measured data from memory and processes it into temperature and density values. (3) Analysis: Provides a graphical user interface in which to perform analysis and sophisticated plotting of analysis parameters.

  1. Automated Freedom from Interference Analysis for Automotive Software

    OpenAIRE

    Leitner-Fischer , Florian; Leue , Stefan; Liu , Sirui

    2016-01-01

    Freedom from Interference for automotive software systems developed according to the ISO 26262 standard means that a fault in a less safety critical software component will not lead to a fault in a more safety critical component. It is an important concern in the realm of functional safety for automotive systems. We present an automated method for the analysis of concurrency-related interferences based on the QuantUM approach and tool that we have previously developed....

  2. Instant Spring Tool Suite

    CERN Document Server

    Chiang, Geoff

    2013-01-01

    Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks. A tutorial guide that walks you through how to use the features of Spring Tool Suite using well defined sections for the different parts of Spring.Instant Spring Tool Suite is for novice to intermediate Java developers looking to get a head-start in enterprise application development using Spring Tool Suite and the Spring framework. If you are looking for a guide for effective application development using Spring Tool Suite, then this book is for you.

  3. Latest Developments in Data Analysis and Structure Determination and Refinement: Software for Chemical Crystallography

    International Nuclear Information System (INIS)

    Dix, I.; Adam, M.; Jacob, H. F.; Roter, A.

    2003-01-01

    The introduction of a two-dimensional CCD X-ray detector nearly 10 years ago by Bruker started a revolution in chemical crystallography. Since then, crystallographers can accomplish a complete data collection even of small and poorly scattering crystals in a few hours instead of days. The launch of the kappa geometry by Nonius a few years earlier equally revolutionized the field of single crystal diffractometry. Currently Bruker Nonius has far more than 500 CCD systems installed. The latest development of Bruker Nonius, the X8 APEX, is the powerful combination of both: the APEX CCD detector and the unique Kappa four-circle goniometer. The APEX 4K CCD detector provides the utmost sensitivity, while the Kappa four-circle goniometer offers a very open geometry, granting all the flexibility to align any crystallographic axis. This provides a more efficient data collection for axial photographs to investigate e.g. diffuse scattering or incommensurate structures. Even the crystal-detector distance is computer-controlled for precise and superior data collection. The X8 APEX software suite gives a whole new look to the CCD user interface. It not only has improved data collection abilities, but also guides the chemist or mineralogist from gathering the raw crystal data to producing the final crystal structure. It provides context-dependent menus, which are well-known from business software packages such as Outlook. The tools for unit cell determination, views into reciprocal space, optimisation of the data collection strategy, data integration, scaling and correcting (SADABS), as well as tools for structure solving and refining (SHELXTL package), will be presented. Low temperature work has become an essential tool for challenging samples. The Bruker Nonius Kryo-Flex cryogenic device makes chemical crystallography at low temperatures a routine method in your laboratory. Of course, the Kryo-Flex is fully controlled by the new graphical user interface of the X8 APEX.

  4. Radiation and environmental data analysis computer (REDAC) hardware, software and analysis procedures

    International Nuclear Information System (INIS)

    Hendricks, T.J.

    1985-01-01

    The REDAC was conceived originally as a tape verifier for the Radiation and Environmental Data Acquisition Recorder (REDAR). From that simple beginning in 1971, the REDAC has evolved into a family of systems used for complete analysis of data obtained by the REDAR and other acquisition systems. Portable or mobile REDACs are deployed to support checkout and analysis tasks in the field. Laboratory systems are additionally used for software development, physics investigations, data base management and graphics. System configurations range from man-portable systems to a large laboratory-based system which supports time-shared analysis and development tasks. Custom operating software allows the analyst to process data either interactively or by batch procedures. Analysis packages are provided for numerous necessary functions. All these analysis procedures can be performed even on the smallest man-portable REDAC. Examples of multi-isotope stripping and radiation isopleth mapping are presented. Techniques utilized for these operations are also presented.

  5. A software tool for design of process monitoring and analysis systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    A well designed process monitoring and analysis system is necessary to consistently achieve any predefined end product quality. Systematic computer-aided methods and tools provide the means to design the necessary process monitoring and analysis systems and/or to validate any existing monitoring...... and analysis system. Software to achieve this has been developed. Two supporting tools for the design, a knowledge base (consisting of the process knowledge as well as the knowledge on measurement methods and tools) and a model library (consisting of the process operational models), have been extended...... rigorously and integrated with the user interface, which makes the software more generic and applicable to a wide range of problems. The software for the design of a process monitoring and analysis system is presented and illustrated with a tablet manufacturing process example....

  6. Exoskeletons, Robots and System Software: Tools for the Warfighter

    Science.gov (United States)

    2012-04-24

    Exoskeletons, Robots and System Software: Tools for the Warfighter? Paul Flanagan, Tuesday, April 24, 2012, 11:15 am–12:00 pm. “The views... Emerging technologies such as exoskeletons, robots, drones, and the underlying software are and will change the face of the battlefield. Warfighters will... global hub for educating, informing, and connecting Information Age leaders.” What is an exoskeleton? An exoskeleton is a wearable robot suit that

  7. CosmoQuest Transient Tracker: Opensource Photometry & Astrometry software

    Science.gov (United States)

    Myers, Joseph L.; Lehan, Cory; Gay, Pamela; Richardson, Matthew; CosmoQuest Team

    2018-01-01

    CosmoQuest is moving from online citizen science to observational astronomy with the creation of Transient Tracker. This open source software is designed to identify asteroids and other transient/variable objects in image sets. Transient Tracker's features in final form will include: astrometric and photometric solutions, identification of moving/transient objects, identification of variable objects, and lightcurve analysis. In this poster we present our initial v0.1 release and seek community input. This software builds on the existing NIH-funded ImageJ libraries. Creation of this suite of open source image manipulation routines is led by Wayne Rasband and is released primarily under the MIT license. In this release, we are building on these libraries to add source identification for point / point-like sources, and to do astrometry. Our materials are released under the Apache 2.0 license on GitHub (http://github.com/CosmoQuestTeam) and documentation can be found at http://cosmoquest.org/TransientTracker.

  8. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed software to help manage large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; analyses, core photos and images; waveforms and NMR; and external file documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board. The product includes emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software that features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In

  9. A Knowledge-based Environment for Software Process Performance Analysis

    Directory of Open Access Journals (Sweden)

    Natália Chaves Lessa Schots

    2015-08-01

    Background: Process performance analysis is a key step for implementing continuous improvement in software organizations. However, the knowledge needed to execute such analysis is not trivial, and the person responsible for executing it must be provided with appropriate support. Aim: This paper presents a knowledge-based environment, named SPEAKER, proposed for supporting software organizations during the execution of process performance analysis. SPEAKER comprises a body of knowledge and a set of activities and tasks for software process performance analysis, along with supporting tools for executing these activities and tasks. Method: We conducted an informal literature review and a systematic mapping study, which provided basic requirements for the proposed environment. We implemented the SPEAKER environment by integrating supporting tools for the execution of the activities and tasks of performance analysis with the knowledge necessary to execute them, in order to meet the variability presented by the characteristics of these activities. Results: In this paper, we describe each SPEAKER module and the individual evaluations of these modules, and also present an example of use showing how the environment can guide the user through a specific performance analysis activity. Conclusion: Although we only conducted individual evaluations of SPEAKER's modules, the example of use indicates the feasibility of the proposed environment. Therefore, the environment as a whole will be further evaluated to verify whether it attains its goal of assisting in the execution of process performance analysis by non-specialist people.

  10. Evaluation of peak-fitting software for gamma spectrum analysis

    International Nuclear Information System (INIS)

    Zahn, Guilherme S.; Genezini, Frederico A.; Moralles, Mauricio

    2009-01-01

    In all applications of gamma-ray spectroscopy, one of the most important and delicate parts of the data analysis is the fitting of the gamma-ray spectra, where information such as the number of counts, the position of the centroid and the width, for instance, is associated with each peak of each spectrum. There is a huge choice of computer programs that perform this type of analysis, and the ones most commonly used in routine work are those that automatically locate and fit the peaks. This fit can be made in several different ways - the most common are to fit a Gaussian function to each peak or simply to integrate the area under the peak - but some programs go far beyond this and include several small corrections to the simple Gaussian peak function, in order to compensate for secondary effects. In this work several gamma-ray spectroscopy programs are compared in the task of finding and fitting the gamma-ray peaks in spectra taken with standard sources of 137Cs, 60Co, 133Ba and 152Eu. The results show that all of the automatic programs can be properly used in the task of finding and fitting peaks, with the exception of GammaVision; also, it was possible to verify that the automatic peak-fitting software performed as well as - and sometimes even better than - manual peak-fitting software. (author)
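
    For illustration, a minimal sketch of the simplest approach the comparison covers - a Gaussian on a linear background fitted to a single peak - is given below. This is our own example, not code from any of the evaluated packages; the channel range and parameter values are invented.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def peak_model(ch, area, centroid, sigma, b0, b1):
        # single Gaussian peak sitting on a linear background
        gauss = area / (sigma * np.sqrt(2.0 * np.pi)) * np.exp(-0.5 * ((ch - centroid) / sigma) ** 2)
        return gauss + b0 + b1 * ch

    rng = np.random.default_rng(0)
    channels = np.arange(640.0, 700.0)
    counts = rng.poisson(peak_model(channels, 5000.0, 662.0, 2.5, 30.0, 0.01)).astype(float)

    # initial guesses: summed counts for the area, the highest channel for the centroid
    p0 = [counts.sum(), channels[counts.argmax()], 2.0, counts.min(), 0.0]
    popt, pcov = curve_fit(peak_model, channels, counts, p0=p0)
    print(f"centroid = {popt[1]:.2f} +/- {np.sqrt(pcov[1, 1]):.2f} channels")
    ```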

  11. Specdata: Automated Analysis Software for Broadband Spectra

    Science.gov (United States)

    Oliveira, Jasmine N.; Martin-Drumel, Marie-Aline; McCarthy, Michael C.

    2017-06-01

    With the advancement of chirped-pulse techniques, broadband rotational spectra with a few tens to several hundred GHz of spectral coverage are now routinely recorded. When studying multi-component mixtures that might result, for example, from the use of an electrical discharge, lines of new chemical species are often obscured by those of known compounds, and analysis can be laborious. To address this issue, we have developed SPECdata, an open source, interactive tool which is designed to simplify and greatly accelerate spectral analysis and discovery. Our software tool combines both automated and manual components that free the user from computation, while giving them considerable flexibility to assign, manipulate, interpret and export their analysis. The automated - and key - component of the new software is a database query system that rapidly assigns transitions of known species in an experimental spectrum. For each experiment, the software identifies spectral features and subsequently assigns them to known molecules within an in-house database (Pickett .cat files, lists of frequencies...), or those catalogued in Splatalogue (using automatic on-line queries). With suggested assignments, control is then handed over to the user, who can choose to accept, decline or add additional species. Data visualization, statistical information, and interactive widgets assist the user in making decisions about their data. SPECdata has several other useful features intended to improve the user experience. Exporting a full report of the analysis, or a peak file in which assigned lines are removed, is among several options. A user may also save their progress to continue at another time. Additional features of SPECdata help the user to maintain and expand their database for future use. A user-friendly interface allows one to search, upload, edit or update catalog or experiment entries.
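
    The core assignment step described above can be pictured as a tolerance search of measured line frequencies against catalogued ones. The sketch below is a hypothetical simplification, not SPECdata code; the species, frequencies and tolerance are illustrative only.

    ```python
    def assign_lines(peaks_mhz, catalog, tol_mhz=0.05):
        """Match measured frequencies (MHz) to catalog entries within a tolerance."""
        assignments = {}
        for f in peaks_mhz:
            hits = [(species, cf) for species, freqs in catalog.items()
                    for cf in freqs if abs(cf - f) <= tol_mhz]
            assignments[f] = hits  # an empty list marks a candidate new species
        return assignments

    # toy catalog; real queries would go to an in-house database or Splatalogue
    catalog = {"OCS": [12162.98, 24325.92], "HC3N": [9098.33, 18196.31]}
    print(assign_lines([9098.35, 15000.00], catalog))
    ```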

  12. RVA. 3-D Visualization and Analysis Software to Support Management of Oil and Gas Resources

    Energy Technology Data Exchange (ETDEWEB)

    Keefer, Donald A. [Univ. of Illinois, Champaign, IL (United States); Shaffer, Eric G. [Univ. of Illinois, Champaign, IL (United States); Storsved, Brynne [Univ. of Illinois, Champaign, IL (United States); Vanmoer, Mark [Univ. of Illinois, Champaign, IL (United States); Angrave, Lawrence [Univ. of Illinois, Champaign, IL (United States); Damico, James R. [Univ. of Illinois, Champaign, IL (United States); Grigsby, Nathan [Univ. of Illinois, Champaign, IL (United States)

    2015-12-01

    A free software application, RVA, has been developed as a plugin to the US DOE-funded ParaView visualization package, to provide support in the visualization and analysis of complex reservoirs being managed using multi-fluid EOR techniques. RVA, for Reservoir Visualization and Analysis, was developed as an open-source plugin to the 64 bit Windows version of ParaView 3.14. RVA was developed at the University of Illinois at Urbana-Champaign, with contributions from the Illinois State Geological Survey, Department of Computer Science and National Center for Supercomputing Applications. RVA was designed to utilize and enhance the state-of-the-art visualization capabilities within ParaView, readily allowing joint visualization of geologic framework and reservoir fluid simulation model results. Particular emphasis was placed on enabling visualization and analysis of simulation results highlighting multiple fluid phases, multiple properties for each fluid phase (including flow lines), multiple geologic models and multiple time steps. Additional advanced functionality was provided through the development of custom code to implement data mining capabilities. The built-in functionality of ParaView provides the capacity to process and visualize data sets ranging from small models on local desktop systems to extremely large models created and stored on remote supercomputers. The RVA plugin that we developed and the associated User Manual provide improved functionality through new software tools, and instruction in the use of ParaView-RVA, targeted to petroleum engineers and geologists in industry and research. The RVA web site (http://rva.cs.illinois.edu) provides an overview of functions, and the development web site (https://github.com/shaffer1/RVA) provides ready access to the source code, compiled binaries, user manual, and a suite of demonstration data sets. Key functionality has been included to support a range of reservoir visualization and analysis needs, including

  13. Off-line radiometric analysis of Planck-LFI data

    Energy Technology Data Exchange (ETDEWEB)

    Tomasi, M; Mennella, A; Bersanelli, M [Dipartimento di Fisica, Universita degli Studi di Milano, Via Celoria 16, 20133 Milano (Italy); Galeotta, S; Maris, M [LFI-DPC INAF-OATs, Via Tiepolo 11, 34131 Trieste (Italy); Lowe, S R [Jodrell Bank Centre for Astrophysics, The University of Manchester, Manchester, M13 9PL (United Kingdom); Mendes, L [Planck Science Office, European Space Agency, ESAC, P.O. box 78, 28691 Villanueva de la Canada, Madrid (Spain); Leonardi, R; Meinhold, P [Department of Physics, University of California, Santa Barbara, CA 93106-9530 (United States); Villa, F; Sandri, M; Cuttaia, F; Terenzi, L; Valenziano, L; Butler, R C [INAF-IASF Bologna, Via Gobetti, 101, 40129, Bologna (Italy); Cappellini, B [INAF-IASF Milano, Via E. Bassini 15, 20133 Milano (Italy); Gregorio, A [Department of Physics, University of Trieste, Via Valerio, 2 Trieste I-34127 (Italy); Salmon, M J [Departamento de IngenierIa de Comunicaciones, Universidad de Cantabria, Avenida de los Castros s/n. 39005 Santander (Spain); Binko, P [ISDC Data Centre for Astrophysics, University of Geneva, ch. d' Ecogia 16, 1290 Versoix (Switzerland); D' Arcangelo, O, E-mail: tomasi@lambrate.inaf.i [IFP-CNR, Via Cozzi 53, Milano (Italy)

    2009-12-15

    The Planck Low Frequency Instrument (LFI) is an array of 22 pseudo-correlation radiometers on-board the Planck satellite to measure temperature and polarization anisotropies in the Cosmic Microwave Background (CMB) in three frequency bands (30, 44 and 70 GHz). To calibrate and verify the performances of the LFI, a software suite named LIFE has been developed. Its aims are to provide a common platform to use for analyzing the results of the tests performed on the single components of the instrument (RCAs, Radiometric Chain Assemblies) and on the integrated Radiometric Array Assembly (RAA). Moreover, its analysis tools are designed to be used during the flight as well to produce periodic reports on the status of the instrument. The LIFE suite has been developed using a multi-layered, cross-platform approach. It implements a number of analysis modules written in RSI IDL, each accessing the data through a portable and heavily optimized library of functions written in C and C++. One of the most important features of LIFE is its ability to run the same data analysis codes both using ground test data and real flight data as input. The LIFE software suite has been successfully used during the RCA/RAA tests and the Planck Integrated System Tests. Moreover, the software has also passed the verification for its in-flight use during the System Operations Verification Tests, held in October 2008.

  14. Off-line radiometric analysis of Planck-LFI data

    International Nuclear Information System (INIS)

    Tomasi, M; Mennella, A; Bersanelli, M; Galeotta, S; Maris, M; Lowe, S R; Mendes, L; Leonardi, R; Meinhold, P; Villa, F; Sandri, M; Cuttaia, F; Terenzi, L; Valenziano, L; Butler, R C; Cappellini, B; Gregorio, A; Salmon, M J; Binko, P; D'Arcangelo, O

    2009-01-01

    The Planck Low Frequency Instrument (LFI) is an array of 22 pseudo-correlation radiometers on-board the Planck satellite to measure temperature and polarization anisotropies in the Cosmic Microwave Background (CMB) in three frequency bands (30, 44 and 70 GHz). To calibrate and verify the performances of the LFI, a software suite named LIFE has been developed. Its aims are to provide a common platform to use for analyzing the results of the tests performed on the single components of the instrument (RCAs, Radiometric Chain Assemblies) and on the integrated Radiometric Array Assembly (RAA). Moreover, its analysis tools are designed to be used during the flight as well to produce periodic reports on the status of the instrument. The LIFE suite has been developed using a multi-layered, cross-platform approach. It implements a number of analysis modules written in RSI IDL, each accessing the data through a portable and heavily optimized library of functions written in C and C++. One of the most important features of LIFE is its ability to run the same data analysis codes both using ground test data and real flight data as input. The LIFE software suite has been successfully used during the RCA/RAA tests and the Planck Integrated System Tests. Moreover, the software has also passed the verification for its in-flight use during the System Operations Verification Tests, held in October 2008.

  15. Development of Software for Measurement and Analysis of Solar Radiation

    International Nuclear Information System (INIS)

    Mohamad Idris Taib; Abul Adli Anuar; Noor Ezati Shuib

    2015-01-01

    This software was under development using LabVIEW, to be used with the StellarNet spectrometer system, which communicates with the computer via USB. LabVIEW has capabilities in hardware interfacing, graphical user interfaces and mathematical calculation, including array manipulation and processing. This software reads data from the StellarNet spectrometer in real time and then processes it for analysis. Several measurements and analyses of solar radiation have been performed. Solar radiation comprises mainly infrared, visible light and ultraviolet. With solar radiation spectrum data, information on weather and plant suitability can be gathered and analyzed. Furthermore, optimization of the utilization of solar radiation and related safety precautions can be planned. Using this software, more research and development in the utilization and safety of solar radiation can be explored. (author)

  16. STAMPS: development and verification of swallowing kinematic analysis software.

    Science.gov (United States)

    Lee, Woo Hyung; Chun, Changmook; Seo, Han Gil; Lee, Seung Hak; Oh, Byung-Mo

    2017-10-17

    Swallowing impairment is a common complication in various geriatric and neurodegenerative diseases. Swallowing kinematic analysis is essential to quantitatively evaluate the swallowing motion of the oropharyngeal structures. This study aims to develop novel swallowing kinematic analysis software, called the spatio-temporal analyzer for motion and physiologic study (STAMPS), and to verify its validity and reliability. STAMPS was developed in MATLAB, which is one of the most popular platforms for biomedical analysis. The software was constructed to acquire, process, and analyze the data of swallowing motion. The target swallowing structures include bony structures (hyoid bone, mandible, maxilla, and cervical vertebral bodies), cartilages (epiglottis and arytenoid), soft tissues (larynx and upper esophageal sphincter), and the food bolus. Numerous functions are available for the spatiotemporal parameters of the swallowing structures. Testing for validity and reliability was performed in 10 dysphagia patients with diverse etiologies and with an instrumental swallowing model designed to mimic the motion of the hyoid bone and the epiglottis. The intra- and inter-rater reliability tests showed excellent agreement for displacement and moderate to excellent agreement for velocity. The Pearson correlation coefficients between the measured and instrumental reference values were nearly 1.00. The software is expected to be useful for researchers who are interested in swallowing motion analysis.
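
    As a toy illustration of the kind of spatiotemporal parameters such software derives, the sketch below computes displacement and speed of one tracked landmark from frame-by-frame coordinates. It is our own simplification, not STAMPS code; the frame rate and coordinates are made up.

    ```python
    import numpy as np

    def displacement_and_speed(xy, fps=30.0):
        """xy: (n_frames, 2) array of landmark coordinates (e.g., the hyoid bone)."""
        xy = np.asarray(xy, dtype=float)
        disp = np.linalg.norm(xy - xy[0], axis=1)   # distance from the rest position
        vel = np.gradient(xy, 1.0 / fps, axis=0)    # per-axis velocity
        speed = np.linalg.norm(vel, axis=1)
        return disp, speed

    track = [[0, 0], [1, 2], [3, 5], [4, 7]]        # invented 4-frame trajectory
    disp, speed = displacement_and_speed(track)
    print(disp.max(), speed.max())
    ```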

  17. PyPWA: A partial-wave/amplitude analysis software framework

    Science.gov (United States)

    Salgado, Carlos

    2016-05-01

    The PyPWA project aims to develop a software framework for partial-wave and amplitude analysis of data, providing the user with software tools to identify resonances from multi-particle final states in photoproduction. Most of the code is written in Python. The software is divided into two main branches: one general shell where amplitude parameters (or any parametric model) are estimated from the data; this branch also includes software to produce simulated data sets using the fitted amplitudes. A second branch contains a specific realization of the isobar model (with room to include Deck-type and other isobar model extensions) to perform PWA with an interface into the computing resources at Jefferson Lab. We are currently implementing parallelism and vectorization using Intel's Xeon Phi family of coprocessors.

  18. GRACAT, Software for grounding and collision analysis

    DEFF Research Database (Denmark)

    Friis-Hansen, Peter; Simonsen, Bo Cerup

    2002-01-01

    and grounding accidents. The software consists of three basic analysis modules and one risk mitigation module: 1) frequency, 2) damage, and 3) consequence. These modules can be used individually or in series and the analyses can be performed in deterministic or probabilistic mode. Finally, in the mitigation...

  19. XP in a Small Software Development Business

    DEFF Research Database (Denmark)

    Babb, Jeffry; Hoda, Rashina; Nørbjerg, Jacob

    2014-01-01

    While small software development shops have trended towards the adoption of Agile methods, local conditions and high iteration pressure typically cause adaptations and appropriations of Agile methods. This paper shares evidence from a study concerning how a small software development company adopts...... and adapts XP to suit its business. Based on a Dialogical Action Research project, the study reflects on the conditions leading to Agile process adaptation, and why ad hoc and “a la carte” approaches may be problematic. Limitations and drawbacks of aspects of XP are also discussed. The Agile practices...

  20. Decision Engines for Software Analysis Using Satisfiability Modulo Theories Solvers

    Science.gov (United States)

    Bjorner, Nikolaj

    2010-01-01

    The area of software analysis, testing and verification is now undergoing a revolution thanks to the use of automated and scalable support for logical methods. A well-recognized premise is that at the core of software analysis engines is invariably a component using logical formulas for describing states and transformations between system states. The process of using this information for discovering and checking program properties (including such important properties as safety and security) amounts to automatic theorem proving. In particular, theorem provers that directly support common software constructs offer a compelling basis. Such provers are commonly called satisfiability modulo theories (SMT) solvers. Z3 is a state-of-the-art SMT solver. It is developed at Microsoft Research. It can be used to check the satisfiability of logical formulas over one or more theories such as arithmetic, bit-vectors, lists, records and arrays. The talk describes some of the technology behind modern SMT solvers, including the solver Z3. Z3 is currently mainly targeted at solving problems that arise in software analysis and verification. It has been applied in various contexts, such as systems for dynamic symbolic simulation (Pex, SAGE, Vigilante), for program verification and extended static checking (Spec#/Boogie, VCC, HAVOC), for software model checking (Yogi, SLAM), model-based design (FORMULA), security protocol code (F7), and program run-time analysis and invariant generation (VS3). We will describe how it integrates support for a variety of theories that arise naturally in the context of the applications. There are several new promising avenues and the talk will touch on some of these and the challenges related to SMT solvers.
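
    As a flavor of what such a solver does, the minimal example below (ours, not from the talk) uses Z3's Python bindings to combine integer arithmetic with the theory of arrays; the formula is unsatisfiable because it contradicts the array axioms. It assumes the z3-solver package is installed.

    ```python
    from z3 import Int, Array, IntSort, Select, Store, Solver

    x = Int("x")
    a = Array("a", IntSort(), IntSort())

    s = Solver()
    s.add(x > 0)
    s.add(Select(Store(a, x, 42), x) != 42)  # reading back a stored value must give 42

    print(s.check())  # -> unsat: no program state satisfies these constraints
    ```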

  1. Software failure events derivation and analysis by frame-based technique

    International Nuclear Information System (INIS)

    Huang, H.-W.; Shih, C.; Yih, Swu; Chen, M.-H.

    2007-01-01

    A frame-based technique, including a physical frame, a logical frame, and a cognitive frame, was adopted to perform digital I and C failure event derivation and analysis for a generic ABWR. The physical frame was structured with a modified PCTran-ABWR plant simulation code, which was extended and enhanced for the feedwater system, recirculation system, and steam line system. The logical frame was structured with MATLAB, which was incorporated into PCTran-ABWR to improve the pressure control system, feedwater control system, recirculation control system, and automated power regulation control system. As a result, software failures of these digital control systems can be properly simulated and analyzed. The cognitive frame was simulated by the operator awareness status in the scenarios. Moreover, via an internal characteristics tuning technique, the modified PCTran-ABWR can precisely reflect the characteristics of the power-core flow. Hence, in addition to the transient plots, the analysis results can be demonstrated on the power-core flow map. A number of postulated I and C system software failure events were derived to achieve the dynamic analyses. The basis for event derivation includes the published classification of software anomalies, the digital I and C design data for the ABWR, the chapter 15 accident analysis of the generic SAR, and reported NPP I and C software failure events. The case study of this research includes: (1) software CMF analysis for the major digital control systems; and (2) derivation of postulated ABWR digital I and C software failure events from actual non-ABWR digital I and C software failure events, which were reported to the LER system of the USNRC or the IRS of the IAEA. These events were analyzed by PCTran-ABWR. Conflicts among plant status, computer status, and human cognitive status are successfully identified. The operator might not easily recognize the abnormal condition, because the computer status seems to progress normally. However, a well

  2. Analysis of Software Development Methodologies to Build Safety Software Applications for the SATEX-II: A Mexican Experimental Satellite

    Science.gov (United States)

    Aguilar Cisneros, Jorge; Vargas Martinez, Hector; Pedroza Melendez, Alejandro; Alonso Arevalo, Miguel

    2013-09-01

    Mexico is a country where experience in building software for satellite applications is just beginning. This is a delicate situation because in the near future we will need to develop software for the SATEX-II (Mexican Experimental Satellite). SATEX-II is a project of SOMECyTA (the Mexican Society of Aerospace Science and Technology). We have experience applying software development methodologies, like TSP (Team Software Process) and SCRUM, in other areas. We analyzed these methodologies and concluded that they can be applied to develop software for the SATEX-II; we also supported these methodologies with the ESA PSS-05-0 standard, in particular with ESA PSS-05-11. Our analysis focused on the main characteristics of each methodology and how these methodologies could be used together with the ESA PSS-05-0 standards. Our outcomes, in general, may be used by teams who need to build small satellites, but, in particular, they will be used when we build the on-board software applications for the SATEX-II.

  3. Saphire models and software for ASP evaluations

    International Nuclear Information System (INIS)

    Sattison, M.B.

    1997-01-01

    Over the past three years, the Idaho National Engineering Laboratory (INEL) has created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of ASP evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both the U.S. Nuclear Regulatory Commission's (NRC's) Office of Nuclear Reactor Regulation (NRR) and the Office for Analysis and Evaluation of Operational Data (AEOD). This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response, with a unique set of event trees for each plant class; (2) plant-specific fault trees using supercomponents; (3) generation and retention of all system and sequence cutsets; (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results; and (5) a user interface for streamlined evaluation of ASP events. Future plans for the ASP models are also presented

  4. Open source software and crowdsourcing for energy analysis

    International Nuclear Information System (INIS)

    Bazilian, Morgan; Rice, Andrew; Rotich, Juliana; Howells, Mark; DeCarolis, Joseph; Macmillan, Stuart; Brooks, Cameron; Bauer, Florian; Liebreich, Michael

    2012-01-01

    Informed energy decision making requires effective software, high-quality input data, and a suitably trained user community. Developing these resources can be expensive and time consuming. Even when data and tools are intended for public re-use they often come with technical, legal, economic and social barriers that make them difficult to adopt, adapt and combine for use in new contexts. We focus on the promise of open, publicly accessible software and data as well as crowdsourcing techniques to develop robust energy analysis tools that can deliver crucial, policy-relevant insight, particularly in developing countries, where planning resources are highly constrained—and the need to adapt these resources and methods to the local context is high. We survey existing research, which argues that these techniques can produce high-quality results, and also explore the potential role that linked, open data can play in both supporting the modelling process and in enhancing public engagement with energy issues. - Highlights: ► We focus on the promise of open, publicly accessible software and data. ► These emerging techniques can produce high-quality results for energy analysis. ► Developing economies require new techniques for energy planning.

  5. Comparison of two software versions for assessment of body-composition analysis by DXA

    DEFF Research Database (Denmark)

    Vozarova, B; Wang, J; Weyer, C

    2001-01-01

    To compare two software versions provided by the Lunar Co. for assessment of body-composition analysis by DXA.

  6. Apparatus for storing protective suits

    International Nuclear Information System (INIS)

    Englemann, H.J.; Koller, J.; Schrader, H.R.; Schade, G.; Pedrerol, J.

    1975-01-01

    Arrangements are described for storing one or more protective suits when contaminated on the outside. In order to permit a person wearing a contaminated suit to leave a contaminated area safely, and without contaminating the environment, it has hitherto been the practice for the suit to be passed through a 'lock' and cleansed under decontaminating showers whilst still being worn. This procedure is time wasting and not always completely effective, and it may be necessary to provide a second suit for use whilst the first suit is being decontaminated. Repeated decontamination may also result in undue wear and tear. The arrangements described provide a 'lock' chamber in which a contaminated suit may be stowed away without its interior becoming contaminated, thus allowing repeated use by persons donning and shedding it. (U.K.)

  7. Generating an Automated Test Suite by Variable Strength Combinatorial Testing for Web Services

    Directory of Open Access Journals (Sweden)

    Yin Li

    2016-09-01

    Testing Web services has become a focus of software engineering as an important means of assuring the quality of Web applications. Because Web services lack a graphical interface and accessible source code, they require automated testing methods, in which efficiently designing and generating the test suite is an important part. However, existing testing methods may lead to redundant test suites and decreased fault-detecting ability, since they cannot handle scenarios where the strengths of the different interactions are not uniform. To solve this problem, a formal tree model based on WSDL is first constructed, taking the actual interaction relationships of each node fully into consideration; then combinatorial testing is applied to generate a variable-strength combinatorial test suite based on a one-test-at-a-time strategy. Finally, test cases are minimized according to constraint rules. The results show that, compared with conventional random testing, the proposed approach can detect more errors with the same number of test cases, and produces test suites smaller in size than existing ones.
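
    To make the combinatorial idea concrete, the sketch below greedily builds a plain strength-2 (pairwise) suite one test at a time. It is a hedged simplification of the general technique, not the paper's algorithm, which additionally supports variable strength and constraint rules.

    ```python
    from itertools import combinations, product

    def pairwise_suite(parameters):
        """parameters: one list of candidate values per service parameter."""
        uncovered = {((i, a), (j, b))
                     for i, j in combinations(range(len(parameters)), 2)
                     for a in parameters[i] for b in parameters[j]}
        suite = []
        while uncovered:
            # one-test-at-a-time: pick the test covering the most uncovered pairs
            best = max(product(*parameters),
                       key=lambda t: sum(((i, t[i]), (j, t[j])) in uncovered
                                         for i, j in combinations(range(len(t)), 2)))
            suite.append(best)
            uncovered -= {((i, best[i]), (j, best[j]))
                          for i, j in combinations(range(len(best)), 2)}
        return suite

    print(pairwise_suite([["GET", "POST"], ["xml", "json"], [0, 1]]))
    ```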

  8. Pharmacy settles suit.

    Science.gov (United States)

    1998-10-02

    A suit was filed by an HIV-positive man against a pharmacy that inadvertently disclosed his HIV status to his ex-wife and children. His ex-wife tried to use the information in a custody battle for their two children. The suit against the pharmacy was settled, but the terms of the settlement remain confidential.

  9. NASGRO 3.0: A Software for Analyzing Aging Aircraft

    Science.gov (United States)

    Mettu, S. R.; Shivakumar, V.; Beek, J. M.; Yeh, F.; Williams, L. C.; Forman, R. G.; McMahon, J. J.; Newman, J. C., Jr.

    1999-01-01

    Structural integrity analysis of aging aircraft is a critical necessity in view of the increasing numbers of such aircraft in general aviation, the airlines and the military. Efforts are in progress by NASA, the FAA and the DoD to focus attention on aging aircraft safety. The present paper describes the NASGRO software which is well-suited for effectively analyzing the behavior of defects that may be found in aging aircraft. The newly revised Version 3.0 has many features specifically implemented to suit the needs of the aircraft community. The fatigue crack growth computer program NASA/FLAGRO 2.0 was originally developed to analyze space hardware such as the Space Shuttle, the International Space Station and the associated payloads. Due to popular demand, the software was enhanced to suit the needs of the aircraft industry. Major improvements in Version 3.0 are the incorporation of the ability to read aircraft spectra of unlimited size, generation of common aircraft fatigue load blocks, and the incorporation of crack-growth models which include load-interaction effects such as retardation due to overloads and acceleration due to underloads. Five new crack-growth models, viz., generalized Willenborg, modified generalized Willenborg, constant closure model, Walker-Chang model and the deKoning-Newman strip-yield model, have been implemented. To facilitate easier input of geometry, material properties and load spectra, a Windows-style graphical user interface has been developed. Features to quickly change the input and rerun the problem as well as examine the output are incorporated. NASGRO has been organized into three modules, the crack-growth module being the primary one. The other two modules are the boundary element module and the material properties module. The boundary-element module provides the ability to model and analyze complex two-dimensional problems to obtain stresses and stress-intensity factors. The material properties module allows users to store and

  10. The use of prime implicants in dependability analysis of software controlled systems

    International Nuclear Information System (INIS)

    Yau, Michael; Apostolakis, George; Guarro, Sergio

    1998-01-01

    The behavior of software controlled systems is usually non-binary and dynamic. It is, thus, convenient to employ multi-valued logic to model these systems. Multi-valued logic functions can be used to represent the functional and temporal relationships between the software and hardware components. The resulting multi-valued logic model can be analyzed deductively, i.e. by tracking causality in reverse from undesirable 'top' events to identify faults that may be present in the system. The result of this deductive analysis is a set of prime implicants for a user-defined system top event. The prime implicants represent all the combinations of basic component conditions and software input conditions that may result in the top event; they are the extension to multi-valued logic of the concept of minimal cut sets that is used routinely in the analysis of binary fault trees. This paper discusses why prime implicants are needed in the dependability analysis of software controlled systems, how they are generated, and how they are used to identify faults in a software controlled system

  11. The use of prime implicants in dependability analysis of software controlled systems

    Energy Technology Data Exchange (ETDEWEB)

    Yau, Michael; Apostolakis, George; Guarro, Sergio

    1998-11-01

    The behavior of software controlled systems is usually non-binary and dynamic. It is, thus, convenient to employ multi-valued logic to model these systems. Multi-valued logic functions can be used to represent the functional and temporal relationships between the software and hardware components. The resulting multi-valued logic model can be analyzed deductively, i.e. by tracking causality in reverse from undesirable 'top' events to identify faults that may be present in the system. The result of this deductive analysis is a set of prime implicants for a user-defined system top event. The prime implicants represent all the combinations of basic component conditions and software input conditions that may result in the top event; they are the extension to multi-valued logic of the concept of minimal cut sets that is used routinely in the analysis of binary fault trees. This paper discusses why prime implicants are needed in the dependability analysis of software controlled systems, how they are generated, and how they are used to identify faults in a software controlled system.
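
    The notion of a prime implicant can be illustrated with a Boolean toy example: an implicant is a partial assignment of component conditions that forces the top event, and it is prime if no condition can be dropped. The brute-force sketch below is ours and only handles the binary case; the papers' method extends the idea to multi-valued, dynamic models.

    ```python
    from itertools import combinations, product

    def implies_top(partial, top, n):
        """True if every completion of the partial assignment triggers the top event."""
        free = [i for i in range(n) if i not in partial]
        return all(top({**partial, **dict(zip(free, bits))})
                   for bits in product([0, 1], repeat=len(free)))

    def prime_implicants(top, n):
        primes = []
        for k in range(1, n + 1):                      # smallest assignments first
            for variables in combinations(range(n), k):
                for values in product([0, 1], repeat=k):
                    cand = dict(zip(variables, values))
                    subsumed = any(set(p) <= set(cand) and
                                   all(cand[v] == p[v] for v in p) for p in primes)
                    if not subsumed and implies_top(cand, top, n):
                        primes.append(cand)
        return primes

    # top event: (pump fails AND valve stuck) OR controller software fault
    top = lambda a: (a[0] and a[1]) or a[2]
    print(prime_implicants(top, 3))  # -> [{2: 1}, {0: 1, 1: 1}]
    ```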

  12. Development and use of mathematical models and software frameworks for integrated analysis of agricultural systems and associated water use impacts

    Science.gov (United States)

    Fowler, K. R.; Jenkins, E.W.; Parno, M.; Chrispell, J.C.; Colón, A. I.; Hanson, Randall T.

    2016-01-01

    The development of appropriate water management strategies requires, in part, a methodology for quantifying and evaluating the impact of water policy decisions on regional stakeholders. In this work, we describe the framework we are developing to enhance the body of resources available to policy makers, farmers, and other community members in their efforts to understand, quantify, and assess the often competing objectives water consumers have with respect to usage. The foundation for the framework is the construction of a simulation-based optimization software tool using two existing software packages. In particular, we couple a robust optimization software suite (DAKOTA) with the USGS MF-OWHM water management simulation tool to provide a flexible software environment that will enable the evaluation of one or multiple (possibly competing) user-defined (or stakeholder) objectives. We introduce the individual software components and outline the communication strategy we defined for the coupled development. We present numerical results for case studies related to crop portfolio management with several defined objectives. The objectives are not optimally satisfied for any single user class, demonstrating the capability of the software tool to aid in the evaluation of a variety of competing interests.
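
    The coupling pattern described - an optimizer repeatedly invoking the water-management simulator as a black box - can be sketched as below. The simulator stub, objective and numbers are invented placeholders, not MF-OWHM or DAKOTA interfaces.

    ```python
    from scipy.optimize import minimize

    def run_simulation(allocations):
        """Stand-in for a full simulation run; returns (farm profit, aquifer drawdown)."""
        profit = sum(10.0 * a - 0.5 * a ** 2 for a in allocations)
        drawdown = 0.1 * sum(allocations)
        return profit, drawdown

    def objective(allocations):
        profit, drawdown = run_simulation(allocations)
        penalty = 50.0 * max(0.0, drawdown - 2.0)   # penalize excessive drawdown
        return -(profit - penalty)                  # minimize the negative benefit

    result = minimize(objective, x0=[5.0, 5.0, 5.0], bounds=[(0.0, 20.0)] * 3)
    print(result.x)  # water allocations balancing the competing objectives
    ```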

  13. Featureous: infrastructure for feature-centric analysis of object-oriented software

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    The decentralized nature of collaborations between objects in object-oriented software makes it difficult to understand how user-observable program features are implemented and how their implementations relate to each other. It is worthwhile to improve this situation, since feature-centric program...... understanding and modification are essential during software evolution and maintenance. In this paper, we present an infrastructure built on top of the NetBeans IDE called Featureous that allows for rapid construction of tools for feature-centric analysis of object-oriented software. Our infrastructure...... encompasses a lightweight feature location mechanism, a number of analytical views and an API allowing for addition of third-party extensions. To form a common conceptual framework for future feature-centric extensions, we propose to structure feature centric analysis along three dimensions: perspective...

  14. Potku – New analysis software for heavy ion elastic recoil detection analysis

    International Nuclear Information System (INIS)

    Arstila, K.; Julin, J.; Laitinen, M.I.; Aalto, J.; Konu, T.; Kärkkäinen, S.; Rahkonen, S.; Raunio, M.; Itkonen, J.; Santanen, J.-P.; Tuovinen, T.; Sajavaara, T.

    2014-01-01

    Time-of-flight elastic recoil detection (ToF-ERD) analysis software has been developed. The software combines a Python-language graphical front-end with a C code computing back-end in a user-friendly way. The software uses a list of coincident time-of-flight–energy (ToF–E) events as an input. The ToF calibration can be determined with a simple graphical procedure. The graphical interface allows the user to select different elements and isotopes from a ToF–E histogram and to convert the selections to individual elemental energy and depth profiles. The resulting sample composition can be presented as relative or absolute concentrations by integrating the depth profiles over user-defined ranges. Beam induced composition changes can be studied by displaying the event-based data in fractions relative to the substrate reference data. Optional angular input data allows for kinematic correction of the depth profiles. This open source software is distributed under the GPL license for Linux, Mac, and Windows environments

  15. Potku – New analysis software for heavy ion elastic recoil detection analysis

    Energy Technology Data Exchange (ETDEWEB)

    Arstila, K., E-mail: kai.arstila@jyu.fi [Department of Physics, University of Jyväskylä, P.O. Box 35, FI-40014, Jyväskylä (Finland); Julin, J.; Laitinen, M.I. [Department of Physics, University of Jyväskylä, P.O. Box 35, FI-40014, Jyväskylä (Finland); Aalto, J.; Konu, T.; Kärkkäinen, S.; Rahkonen, S.; Raunio, M.; Itkonen, J.; Santanen, J.-P.; Tuovinen, T. [Department of Mathematical Information Technology, University of Jyväskylä, P.O. Box 35, FI-40014, Jyväskylä (Finland); Sajavaara, T. [Department of Physics, University of Jyväskylä, P.O. Box 35, FI-40014, Jyväskylä (Finland)

    2014-07-15

    Time-of-flight elastic recoil detection (ToF-ERD) analysis software has been developed. The software combines a Python-language graphical front-end with a C code computing back-end in a user-friendly way. The software uses a list of coincident time-of-flight–energy (ToF–E) events as an input. The ToF calibration can be determined with a simple graphical procedure. The graphical interface allows the user to select different elements and isotopes from a ToF–E histogram and to convert the selections to individual elemental energy and depth profiles. The resulting sample composition can be presented as relative or absolute concentrations by integrating the depth profiles over user-defined ranges. Beam induced composition changes can be studied by displaying the event-based data in fractions relative to the substrate reference data. Optional angular input data allows for kinematic correction of the depth profiles. This open source software is distributed under the GPL license for Linux, Mac, and Windows environments.
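
    The basic ToF-to-energy conversion behind such depth profiling follows from E = m(L/t)^2/2 for a recoil of mass m crossing the flight length L in time t. The sketch below is a back-of-the-envelope illustration, not Potku code, and the flight length and timing values are invented.

    ```python
    AMU = 1.66053906660e-27      # kg per atomic mass unit
    EV = 1.602176634e-19         # J per eV

    def tof_to_energy_mev(t_ns, mass_amu, flight_length_m=0.684):
        v = flight_length_m / (t_ns * 1e-9)       # recoil velocity, m/s
        e_joule = 0.5 * mass_amu * AMU * v ** 2   # kinetic energy, J
        return e_joule / EV / 1e6                 # convert to MeV

    # a 28Si recoil crossing a hypothetical 0.684 m ToF telescope in 70 ns
    print(f"{tof_to_energy_mev(70.0, 28.0):.1f} MeV")  # ~13.9 MeV
    ```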

  16. Capi text V.1--data analysis software for nailfold skin capillaroscopy.

    Science.gov (United States)

    Dobrev, Hristo P

    2007-01-01

    Nailfold skin capillaroscopy is a simple non-invasive method used to assess conditions of disturbed microcirculation such as Raynaud's phenomenon, acrocyanosis, perniones, connective tissue diseases, psoriasis, diabetes mellitus, neuropathy and vibration disease. To develop data analysis software aimed at assisting the documentation and analysis of a capillaroscopic investigation. SOFTWARE DESCRIPTION: The programme is based on a modular principle. The module "Nomenclatures" includes menus for the patients' data. The module "Examinations" includes menus for all general and specific aspects of the medical examination and capillaroscopic investigations. The modules "Settings" and "Information" include customization menus for the programme. The results of nailfold capillaroscopy can be printed in short or expanded form. This software allows physicians to perform quick searches using various specified criteria and to prepare analyses and reports. This software programme will facilitate the work of any practitioner who performs nailfold skin capillaroscopy.

  17. Informed-Proteomics: open-source software package for top-down proteomics

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jungkap; Piehowski, Paul D.; Wilkins, Christopher; Zhou, Mowei; Mendoza, Joshua; Fujimoto, Grant M.; Gibbons, Bryson C.; Shaw, Jared B.; Shen, Yufeng; Shukla, Anil K.; Moore, Ronald J.; Liu, Tao; Petyuk, Vladislav A.; Tolić, Nikola; Paša-Tolić, Ljiljana; Smith, Richard D.; Payne, Samuel H.; Kim, Sangtae

    2017-08-07

    Top-down proteomics involves the analysis of intact proteins. This approach is very attractive as it allows for analyzing proteins in their endogenous form without proteolysis, preserving valuable information about post-translational modifications, isoforms, proteolytic processing, or their combinations, collectively called proteoforms. Moreover, the quality of top-down LC-MS/MS datasets is rapidly increasing due to advances in liquid chromatography and mass spectrometry instrumentation and sample processing protocols. However, top-down mass spectra are substantially more complex compared to the more conventional bottom-up data. To take full advantage of the increasing quality of top-down LC-MS/MS datasets there is an urgent need to develop algorithms and software tools for confident proteoform identification and quantification. In this study we present a new open source software suite for top-down proteomics analysis consisting of an LC-MS feature finding algorithm, a database search algorithm, and an interactive results viewer. The presented tool, along with several other popular tools, was evaluated using human-in-mouse xenograft luminal and basal breast tumor samples that are known to have significant differences in protein abundance based on bottom-up analysis.

  18. Digital PIV (DPIV) Software Analysis System

    Science.gov (United States)

    Blackshire, James L.

    1997-01-01

    A software package was developed to provide a Digital PIV (DPIV) capability for NASA LaRC. The system provides automated image capture, test correlation, and autocorrelation analysis capabilities for the Kodak Megaplus 1.4 digital camera system for PIV measurements. The package includes three separate programs that, when used together with the PIV data validation algorithm, constitute a complete DPIV analysis capability. The programs are run on an IBM PC/AT host computer running either Microsoft Windows 3.1 or Windows 95, using a 'quickwin' format that allows a simple user interface and output capabilities in the Windows environment.
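
    The heart of a DPIV correlation analysis can be sketched in a few lines: cross-correlate an interrogation window between two frames and read the displacement off the correlation peak. The example below is our own FFT-based illustration, not code from the LaRC package.

    ```python
    import numpy as np

    def window_displacement(win_a, win_b):
        """Displacement (dy, dx) of win_b relative to win_a via cross-correlation."""
        fa = np.fft.rfft2(win_a - win_a.mean())
        fb = np.fft.rfft2(win_b - win_b.mean())
        corr = np.fft.irfft2(fa.conj() * fb, s=win_a.shape)
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        # map circular FFT indices to signed shifts
        return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, win_a.shape))

    rng = np.random.default_rng(1)
    frame = rng.random((64, 64))                          # synthetic particle image
    shifted = np.roll(frame, shift=(3, -2), axis=(0, 1))  # known displacement
    print(window_displacement(frame, shifted))            # -> (3, -2)
    ```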

  19. Summary of Documentation for DYNA3D-ParaDyn's Software Quality Assurance Regression Test Problems

    Energy Technology Data Exchange (ETDEWEB)

    Zywicz, Edward [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-08-18

    The Software Quality Assurance (SQA) regression test suite for DYNA3D (Zywicz and Lin, 2015) and ParaDyn (DeGroot, et al., 2015) currently contains approximately 600 problems divided into 21 suites, and is a required component of ParaDyn’s SQA plan (Ferencz and Oliver, 2013). The regression suite allows developers to ensure that software modifications do not unintentionally alter the code response. The entire regression suite is run prior to permanently incorporating any software modification or addition. When code modifications alter test problem results, the specific cause must be determined and fully understood before the software changes and revised test answers can be incorporated. The regression suite is executed on LLNL platforms using a Python script and an associated data file. The user specifies the DYNA3D or ParaDyn executable, number of processors to use, test problems to run, and other options to the script. The data file details how each problem and its answer extraction scripts are executed. For each problem in the regression suite there exists an input deck, an eight-processor partition file, an answer file, and various extraction scripts. These scripts assemble a temporary answer file in a specific format from the simulation results. The temporary and stored answer files are compared to a specific level of numerical precision, and when differences are detected the test problem is flagged as failed. Presently, numerical results are stored and compared to 16 digits. At this accuracy level different processor types, compilers, number of partitions, etc. impact the results to various degrees. Thus, for consistency purposes the regression suite is run with ParaDyn using 8 processors on machines with a specific processor type (currently the Intel Xeon E5530 processor). For non-parallel regression problems, i.e., the two XFEM problems, DYNA3D is used instead. When environments or platforms change, executables using the current source code and the new
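
    The comparison step can be pictured as follows: extract all numbers from the temporary and stored answer files and compare them after rounding to a fixed number of significant digits. This is a hedged sketch of the idea only; the actual harness is a Python script whose file format and paths are not described here, so the names below are assumptions.

    ```python
    def load_answers(path):
        """Read every whitespace-separated number from an answer file."""
        with open(path) as f:
            return [float(tok) for line in f for tok in line.split()]

    def answers_match(new_path, ref_path, digits=16):
        new, ref = load_answers(new_path), load_answers(ref_path)
        if len(new) != len(ref):
            return False
        def sig(x):  # round to `digits` significant figures
            return 0.0 if x == 0 else float(f"{x:.{digits - 1}e}")
        return all(sig(a) == sig(b) for a, b in zip(new, ref))

    # hypothetical usage: flag a regression problem as failed on any mismatch
    print(answers_match("run/answers.tmp", "suite/answers.ref"))
    ```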

  20. HORACE: Software for the analysis of data from single crystal spectroscopy experiments at time-of-flight neutron instruments

    International Nuclear Information System (INIS)

    Ewings, R.A.; Buts, A.; Le, M.D.; Duijn, J. van; Bustinduy, I.; Perring, T.G.

    2016-01-01

    The HORACE suite of programs has been developed to work with large multiple-measurement data sets collected from time-of-flight neutron spectrometers equipped with arrays of position-sensitive detectors. The software allows exploratory studies of the four dimensions of reciprocal space and excitation energy to be undertaken, enabling multi-dimensional subsets to be visualized and algebraically manipulated, and models for the scattering to be simulated or fitted to the data. The software is designed to be an extensible framework, thus allowing user-customized operations to be performed on the data. Examples of the use of its features are given for measurements exploring the spin waves of the simple antiferromagnet RbMnF3 and of ferromagnetic iron, and the phonons in URu2Si2.

  1. HORACE: Software for the analysis of data from single crystal spectroscopy experiments at time-of-flight neutron instruments

    Energy Technology Data Exchange (ETDEWEB)

    Ewings, R.A.; Buts, A.; Le, M.D. [ISIS Facility, STFC Rutherford Appleton Laboratory, Harwell Campus, Didcot OX11 0QX (United Kingdom); Duijn, J. van [Departamento de Mecánica, Universidad de Córdoba, 14071 Córdoba (Spain); Bustinduy, I. [ESS Bilbao, Poligono Ugaldeguren III, Pol. A, 7B - 48170 Zamudio, Bizkaia – País Vasco (Spain); Perring, T.G., E-mail: toby.perring@stfc.ac.uk [ISIS Facility, STFC Rutherford Appleton Laboratory, Harwell Campus, Didcot OX11 0QX (United Kingdom); London Centre for Nanotechnology and Department of Physics and Astronomy, University College London, Gower Street, London WC1E 6BT (United Kingdom)

    2016-10-21

    The HORACE suite of programs has been developed to work with large multiple-measurement data sets collected from time-of-flight neutron spectrometers equipped with arrays of position-sensitive detectors. The software allows exploratory studies of the four dimensions of reciprocal space and excitation energy to be undertaken, enabling multi-dimensional subsets to be visualized and algebraically manipulated, and models for the scattering to be simulated or fitted to the data. The software is designed to be an extensible framework, thus allowing user-customized operations to be performed on the data. Examples of the use of its features are given for measurements exploring the spin waves of the simple antiferromagnet RbMnF3 and of ferromagnetic iron, and the phonons in URu2Si2.

  2. An assessment of software for flow cytometry analysis in banana plants

    Directory of Open Access Journals (Sweden)

    Renata Alves Lara Silva

    2014-02-01

    Flow cytometry is a technique that yields rapid results in analyses of cell properties such as volume, morphological complexity and quantitative DNA content, and it is considered more convenient than other techniques. However, the analysis usually generates histograms marked by variations that can be produced by many factors, including differences between the software packages that capture the data generated by the flow cytometer. The objective of the present work was to evaluate the performance of four software products commonly used in flow cytometry, based on quantifications of DNA content and analyses of the coefficients of variation associated with the software outputs. Readings were obtained from 25 'NBA' (AA) banana leaf samples using the FACSCalibur (BD) flow cytometer, and 25 histograms from each software product (CellQuest™, WinMDI™, FlowJo™ and FCS Express™) were analyzed to obtain the estimated DNA content and the coefficient of variation (CV) of the estimates. The values of DNA content obtained from the software did not differ significantly. However, the CV analysis showed that the precision of the WinMDI™ software was low and that its CV values were underestimated, whereas the remaining software showed CV values that were in relatively close agreement with those found in the literature. The CellQuest™ software is recommended because it was developed by the same company that produces the flow cytometer used in the present study.
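
    The statistic at issue is simply the coefficient of variation of a fluorescence peak, CV = standard deviation / mean. A minimal sketch, with synthetic data standing in for a measured peak:

    ```python
    import numpy as np

    def coefficient_of_variation(fluorescence):
        x = np.asarray(fluorescence, dtype=float)
        return 100.0 * x.std(ddof=1) / x.mean()   # CV in percent

    rng = np.random.default_rng(2)
    peak = rng.normal(loc=200.0, scale=6.0, size=5000)  # synthetic fluorescence peak
    print(f"CV = {coefficient_of_variation(peak):.1f}%")  # ~3%
    ```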

  3. Functional Mobility Testing: A Novel Method to Create Suit Design Requirements

    Science.gov (United States)

    England, Scott A.; Benson, Elizabeth A.; Rajulu, Sudhakar L.

    2008-01-01

    This study was performed to aid in the creation of design requirements for the next generation of space suits that more accurately describe the level of mobility necessary for a suited crewmember, through the use of an innovative methodology utilizing functional mobility. A novel method was employed involving the collection of kinematic data while 20 subjects (10 male, 10 female) performed pertinent functional tasks that will be required of a suited crewmember during various phases of a lunar mission. These tasks were selected, based on relevance and criticality, from a larger list of tasks that may be carried out by the crew. Kinematic data were processed through Vicon BodyBuilder software to calculate joint angles for the ankle, knee, hip, torso, shoulder, elbow, and wrist. Maximum functional mobility was consistently lower than maximum isolated mobility. This study suggests that conventional methods for establishing design requirements for human-systems interfaces based on maximal isolated joint capabilities may overestimate the required mobility. Additionally, this method provides a valuable means of evaluating systems created from these requirements by comparing the mobility available in a new spacesuit, or the mobility required to use a new piece of hardware, to this newly established database of functional mobility.

  4. Model extension and improvement for simulator-based software safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Huang, H.-W. [Department of Engineering and System Science, National Tsing Hua University (NTHU), 101 Section 2 Kuang Fu Road, Hsinchu, Taiwan (China) and Institute of Nuclear Energy Research (INER), No. 1000 Wenhua Road, Chiaan Village, Longtan Township, Taoyuan County 32546, Taiwan (China)]. E-mail: hwhwang@iner.gov.tw; Shih Chunkuan [Department of Engineering and System Science, National Tsing Hua University (NTHU), 101 Section 2 Kuang Fu Road, Hsinchu, Taiwan (China); Yih Swu [Department of Computer Science and Information Engineering, Ching Yun University, 229 Chien-Hsin Road, Jung-Li, Taoyuan County 320, Taiwan (China); Chen, M.-H. [Institute of Nuclear Energy Research (INER), No. 1000Wenhua Road, Chiaan Village, Longtan Township, Taoyuan County 32546, Taiwan (China); Lin, J.-M. [Taiwan Power Company (TPC), 242 Roosevelt Road, Section 3, Taipei 100, Taiwan (China)

    2007-05-15

    One of the major concerns when employing digital I and C systems in nuclear power plants is that digital systems may introduce new failure modes, which differ from those of previous analog I and C systems. Various techniques are under development to analyze the hazards originating from software faults in digital systems. Preliminary hazard analysis, failure modes and effects analysis, and fault tree analysis are the most extensively used techniques. However, these are static analysis methods that cannot capture dynamic behavior or the interactions among systems. This research utilizes the 'simulator/plant model testing' technique classified in IEEE Std 7-4.3.2-2003 (IEEE Standard for Digital Computers in Safety Systems of Nuclear Power Generating Stations) to identify hazards which might be induced by nuclear I and C software defects. The recirculation flow system, control rod system, feedwater system, steam line model, dynamic power-core flow map, and related control systems of the PCTran-ABWR model were successfully extended and improved. The benchmark against the ABWR SAR shows that this modified model is capable of accomplishing dynamic, system-level software safety analysis and performs better than the static methods. This improved plant simulation can then be further applied to hazard analysis for operator/digital I and C interface interaction failure studies, and to hardware-in-the-loop fault injection studies.

  5. Intraprocedural dataflow analysis for software product lines

    DEFF Research Database (Denmark)

    Brabrand, Claus; Ribeiro, Márcio; Tolêdo, Társis

    2013-01-01

    Software product lines (SPLs) developed using annotative approaches such as conditional compilation come with an inherent risk of constructing erroneous products. For this reason, it is essential to be able to analyze such SPLs. However, as dataflow analysis techniques are not able to deal with SP...... and memory characteristics on five qualitatively different SPLs. On our benchmarks, the combined analysis strategy is up to almost eight times faster than the brute-force approach....

  6. Spacelab user implementation assessment study. (Software requirements analysis). Volume 2: Technical report

    Science.gov (United States)

    1976-01-01

    The engineering analyses and evaluation studies conducted for the Software Requirements Analysis are discussed. Included are the development of the study data base, synthesis of implementation approaches for software required by both mandatory onboard computer services and command/control functions, and identification and implementation of software for ground processing activities.

  7. Software ion scan functions in analysis of glycomic and lipidomic MS/MS datasets.

    Science.gov (United States)

    Haramija, Marko

    2018-03-01

    Hardware ion scan functions unique to the tandem mass spectrometry (MS/MS) mode of data acquisition, such as precursor ion scan (PIS) and neutral loss scan (NLS), are important for selective extraction of key structural data from complex MS/MS spectra. However, their software counterparts, software ion scan (SIS) functions, are still not regularly available. Software ion scan functions can be easily coded for additional functionalities, such as software multiple precursor ion scan, software no ion scan, and software variable ion scan functions. These are often necessary, since they allow more efficient analysis of the complex MS/MS datasets often encountered in glycomics and lipidomics. Software ion scan functions can be easily coded using modern script languages and can be independent of the instrument manufacturer. Here we demonstrate the utility of SIS functions on a medium-size glycomic MS/MS dataset. Knowledge of sample properties, as well as of diagnostic and conditional diagnostic ions crucial for data analysis, was needed. Based on the tables constructed with the output data from the SIS functions performed, a detailed analysis of a complex MS/MS glycomic dataset could be carried out in a quick, accurate, and efficient manner. Glycomic research is progressing slowly, and with respect to the MS experiments, one of the key obstacles to moving forward is the lack of appropriate bioinformatic tools necessary for fast analysis of glycomic MS/MS datasets. Adding novel SIS functionalities to the glycomic MS/MS toolbox has the potential to significantly speed up the glycomic data analysis process. Similar tools are useful for the analysis of lipidomic MS/MS datasets as well, as will be discussed briefly. Copyright © 2017 John Wiley & Sons, Ltd.
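
    As an illustration of the idea (not the author's code), a software neutral-loss scan reduces to filtering spectra for a fragment at the precursor m/z minus a fixed loss; singly charged ions and invented example masses are assumed for simplicity:

```python
def neutral_loss_scan(spectra, loss, tol=0.02):
    """Software NLS: return precursor m/z values whose MS/MS spectrum
    contains a fragment at (precursor - loss), within tolerance.
    spectra: list of (precursor_mz, [fragment_mz, ...]) pairs."""
    hits = []
    for precursor, fragments in spectra:
        target = precursor - loss
        if any(abs(f - target) <= tol for f in fragments):
            hits.append(precursor)
    return hits

# Example: scan for the loss of a hexose residue (162.05 Da)
spectra = [(933.32, [771.27, 528.19]), (771.27, [528.19])]
print(neutral_loss_scan(spectra, 162.05))  # [933.32]
```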

  8. A parallel and sensitive software tool for methylation analysis on multicore platforms.

    Science.gov (United States)

    Tárraga, Joaquín; Pérez, Mariano; Orduña, Juan M; Duato, José; Medina, Ignacio; Dopazo, Joaquín

    2015-10-01

    DNA methylation analysis suffers from very long processing times, as the advent of Next-Generation Sequencers has shifted the bottleneck of genomic studies from the sequencers that obtain the DNA samples to the software that performs the analysis of these samples. The existing software for methylation analysis does not seem to scale efficiently with either the size of the dataset or the length of the reads to be analyzed. As sequencers are expected to provide longer and longer reads in the near future, efficient and scalable methylation software should be developed. We present a new software tool, called HPG-Methyl, which efficiently maps bisulphite sequencing reads on DNA, analyzing DNA methylation. The strategy used by this software consists of leveraging the speed of the Burrows-Wheeler Transform to map a large number of DNA fragments (reads) rapidly, as well as the accuracy of the Smith-Waterman algorithm, which is employed exclusively to deal with the most ambiguous and shortest reads. Experimental results on platforms with Intel multicore processors show that HPG-Methyl significantly outperforms state-of-the-art software such as Bismark, BS-Seeker or BSMAP in both execution time and sensitivity, particularly for long bisulphite reads. The software is distributed as C libraries and functions, together with instructions to compile and execute it. Available by sftp to anonymous@clariano.uv.es (password 'anonymous'). juan.orduna@uv.es or jdopazo@cipf.es. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
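
    The two-tier strategy described, a fast index lookup with a Smith-Waterman fallback for short or ambiguous reads, can be sketched as follows. An exact substring search stands in for the BWT index, bisulphite C-to-T conversion handling is omitted, and none of this is HPG-Methyl's actual code:

```python
import numpy as np

def smith_waterman(read, ref, match=2, mismatch=-1, gap=-2):
    """Local alignment score (Smith-Waterman); quadratic cost is why it
    is reserved for the shortest / most ambiguous reads."""
    h = np.zeros((len(read) + 1, len(ref) + 1))
    for i in range(1, len(read) + 1):
        for j in range(1, len(ref) + 1):
            s = match if read[i - 1] == ref[j - 1] else mismatch
            h[i, j] = max(0, h[i-1, j-1] + s, h[i-1, j] + gap, h[i, j-1] + gap)
    return h.max()

def map_read(read, ref, min_len=40):
    """Hybrid dispatch sketch: fast lookup first, Smith-Waterman fallback
    for reads that are short or that the fast path cannot place."""
    pos = ref.find(read)                 # stand-in for the BWT index
    if pos >= 0 and len(read) >= min_len:
        return ("exact", pos)
    return ("sw", smith_waterman(read, ref))

ref = "ACGTACGTTTGGCCAAACGT" * 3
print(map_read("ACGTACGTTTGGCCAAACGT" * 2, ref))  # long exact read: fast path
print(map_read("TTGGCAAA", ref))                  # short read: SW fallback
```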

  9. Software quality assurance documentation for the release of NUFT 2.0s for SUN platforms

    International Nuclear Information System (INIS)

    Fernandez, M.W.; Shaffer, R.J.

    1998-01-01

    The purpose of this Individual Software Plan (ISP) is to plan and organize the activities required to certify the NUFT code for quality-affecting work involving problems that include cross drift analysis of the Yucca Mountain Repository facility. NUFT is software for application to the solution of a class of coupled mass and heat transport problems in porous geologic media, including the Yucca Mountain Repository Cross Drift Problem (YMRCDP), also known as the Enhanced Characterization of the Repository Block (ECRB). Solution of this class of problems requires a suite of multiphase, multi-component models for numerical solution of non-isothermal flow and transport in porous media, with applications to subsurface contaminant transport problems.

  10. The Implication of Using NVivo Software in Qualitative Data Analysis: Evidence-Based Reflections.

    Science.gov (United States)

    Zamawe, F C

    2015-03-01

    For a long time, electronic data analysis has been associated with quantitative methods. However, Computer Assisted Qualitative Data Analysis Software (CAQDAS) are increasingly being developed. Although the CAQDAS has been there for decades, very few qualitative health researchers report using it. This may be due to the difficulties that one has to go through to master the software and the misconceptions that are associated with using CAQDAS. While the issue of mastering CAQDAS has received ample attention, little has been done to address the misconceptions associated with CAQDAS. In this paper, the author reflects on his experience of interacting with one of the popular CAQDAS (NVivo) in order to provide evidence-based implications of using the software. The key message is that unlike statistical software, the main function of CAQDAS is not to analyse data but rather to aid the analysis process, which the researcher must always remain in control of. In other words, researchers must equally know that no software can analyse qualitative data. CAQDAS are basically data management packages, which support the researcher during analysis.

  11. PIPER: Performance Insight for Programmers and Exascale Runtimes: Guiding the Development of the Exascale Software Stack

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [Rice Univ., Houston, TX (United States)

    2017-10-20

    The PIPER project set out to develop methodologies and software for measurement, analysis, attribution, and presentation of performance data for extreme-scale systems. Goals of the project were to support analysis of massive multi-scale parallelism, heterogeneous architectures, multi-faceted performance concerns, and to support both post-mortem performance analysis to identify program features that contribute to problematic performance and on-line performance analysis to drive adaptation. This final report summarizes the research and development activity at Rice University as part of the PIPER project. Producing a complete suite of performance tools for exascale platforms during the course of this project was impossible since both hardware and software for exascale systems is still a moving target. For that reason, the project focused broadly on the development of new techniques for measurement and analysis of performance on modern parallel architectures, enhancements to HPCToolkit’s software infrastructure to support our research goals or use on sophisticated applications, engaging developers of multithreaded runtimes to explore how support for tools should be integrated into their designs, engaging operating system developers with feature requests for enhanced monitoring support, engaging vendors with requests that they add hardware measurement capabilities and software interfaces needed by tools as they design new components of HPC platforms including processors, accelerators and networks, and finally collaborations with partners interested in using HPCToolkit to analyze and tune scalable parallel applications.

  12. Comparison of two three-dimensional cephalometric analysis computer software.

    Science.gov (United States)

    Sawchuk, Dena; Alhadlaq, Adel; Alkhadra, Thamer; Carlyle, Terry D; Kusnoto, Budi; El-Bialy, Tarek

    2014-10-01

    Three-dimensional cephalometric analyses are attracting increasing attention in orthodontics. The aim of this study was to compare two software packages for evaluating three-dimensional cephalometric analyses of orthodontic treatment outcomes. Twenty cone beam computed tomography images were obtained using the i-CAT(®) imaging system from patients' records as part of their regular orthodontic records. The images were analyzed using InVivoDental5.0 (Anatomage Inc.) and 3DCeph™ (University of Illinois at Chicago, Chicago, IL, USA) software. Before- and after-treatment data were analyzed using the t-test. Reliability testing using the interclass correlation coefficient showed stronger reliability for InVivoDental5.0 (0.83-0.98) compared with 3DCeph™ (0.51-0.90). Paired t-test comparison of the two packages showed no statistically significant difference in the measurements made with them. InVivoDental5.0 measurements are more reproducible and user friendly when compared to 3DCeph™. There was no statistical difference between the two packages in linear or angular measurements. 3DCeph™ is more time-consuming in performing three-dimensional analysis compared with InVivoDental5.0.

  13. Choosing your weapons : on sentiment analysis tools for software engineering research

    NARCIS (Netherlands)

    Jongeling, R.M.; Datta, S.; Serebrenik, A.; Koschke, R.; Krinke, J.; Robillard, M.

    2015-01-01

    Recent years have seen increasing attention to social aspects of software engineering, including studies of emotions and sentiments experienced and expressed by software developers. Most of these studies reuse existing sentiment analysis tools such as SentiStrength and NLTK. However, these

  14. VOLCWORKS: A suite for optimization of hazards mapping

    Science.gov (United States)

    Delgado Granados, H.; Ramírez Guzmán, R.; Villareal Benítez, J. L.; García Sánchez, T.

    2012-04-01

    Making hazards maps is a process linking basic science, applied science and engineering for the benefit of society. The methodologies for hazards map construction have evolved enormously, together with the tools that allow forecasting the behavior of the materials produced by different eruptive processes. However, in spite of the development of tools and the evolution of methodologies, the utility of hazards maps has not changed: prevention and mitigation of volcanic disasters. Integration of different tools for simulating different processes at a single volcano is a challenge, to be solved using software tools that include processing, simulation and visualization techniques, and data structures, in order to build up a suite that helps in the construction process, starting from the integration of the geological data and simulations and the simplification of the output, to design a hazards/scenario map. Scientific visualization is a powerful tool to explore and gain insight into complex data from instruments and simulations. The workflow from data collection, through quality control and preparation for simulations, to achieving a visual and appropriate presentation, is a process that is usually disconnected, using in most cases different applications for each of the needed steps, because it requires many tools that are not built for the solution of a specific problem, or were developed by research groups to solve particular tasks but remain disconnected. In volcanology, due to its complexity, groups typically examine only one aspect of the phenomenon: ash dispersal, laharic flows, pyroclastic flows, lava flows, and ballistic projectile ejection, among others. However, when studying the hazards associated with the activity of a volcano, it is important to analyze all the processes comprehensively, especially for communication of results to the end users: decision makers and planners. In order to solve this problem and connect the different parts of a workflow we are developing the

  15. Calibration Analysis Software for the ATLAS Pixel Detector

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00372086; The ATLAS collaboration

    2016-01-01

    The calibration of the ATLAS Pixel detector at the LHC fulfils two main purposes: to tune the front-end configuration parameters to establish the best operational settings, and to measure the tuning performance through a subset of scans. An analysis framework has been set up to take actions on the detector given the outcome of a calibration scan (e.g. to create a mask for disabling noisy pixels). The software framework that controls all aspects of the Pixel detector scans and analyses is called the Calibration Console. The introduction of a new layer, equipped with new Front End-I4 chips, required an update of the Console architecture. It now handles scans and scan analyses applied together to chips with different characteristics. An overview of the newly developed Calibration Analysis Software is presented, together with some preliminary results.

  16. Planning and Analysis of the Company’s Financial Performances by Using a Software Simulation

    Directory of Open Access Journals (Sweden)

    Meri BOSHKOSKA

    2017-06-01

    Full Text Available Information technology includes a wide range of software solutions that help managers in decision-making processes in order to increase the company's business performance. Using software in financial analysis is a valuable tool for managers in the financial decision-making process. The objective of the study was accomplished by developing software that easily determines the financial performance of the company through the integration of financial indicator analysis and the DuPont profitability analysis model. Through this software, managers will be able to calculate the current financial state and visually analyze how their actions will affect the financial performance of the company. This will enable them to identify the best ways to improve the company's financial performance. The software can perform a financial analysis and give a clear, useful overview of current business performance, and can also help in planning the growth of the company. The software can also be used for educational purposes by students and managers in the field of financial management.
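
    The DuPont model the software integrates decomposes return on equity into three standard ratios. A minimal sketch with invented figures:

```python
def dupont_roe(net_income, revenue, total_assets, equity):
    """DuPont decomposition: ROE = margin x turnover x leverage."""
    margin   = net_income / revenue        # net profit margin
    turnover = revenue / total_assets      # asset turnover
    leverage = total_assets / equity       # equity multiplier
    return margin * turnover * leverage, (margin, turnover, leverage)

# Illustrative annual figures
roe, parts = dupont_roe(net_income=120_000, revenue=1_500_000,
                        total_assets=900_000, equity=400_000)
print(f"ROE = {roe:.1%}")   # 30.0% = 8.0% margin x 1.67 turnover x 2.25 leverage
```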

  17. Development of the software dead time methodology for the 4πβ-γ software coincidence system analysis program

    International Nuclear Information System (INIS)

    Toledo, Fabio de; Brancaccio, Franco; Dias, Mauro da Silva

    2009-01-01

    The Laboratorio de Metrologia Nuclear (LMN, Nuclear Metrology Laboratory) at IPEN-CNEN/SP, Sao Paulo, Brazil, developed a new Software Coincidence System (SCS) for 4πβ-γ radioisotope standardization. SCS is composed of the data acquisition hardware, for the coincidence data recording, and the coincidence data analysis program that calculates the radioactive activity of the target sample. Due to the intrinsic signal sampling characteristics of the hardware, multiple undesired data recordings occur from a single saturated pulse. Pulse pileup also leads to bad data recording. As the beta counting rates are much greater than the gamma ones, due to the high 4π geometry beta detection efficiencies, the beta counting increases significantly because of multiple pulse recordings, resulting in a corresponding increase in the calculated activity value. In order to minimize the effect of such bad recordings, a software dead time value was introduced in the coincidence analysis program, under development at LMN, discarding multiple recordings due to pulse pileup or saturation. This work presents the methodology developed to determine the optimal software dead time value for attaining more accurate results, and discusses the results, pointing out possibilities for software improvement. (author)
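
    A minimal sketch of one common way to impose such a software dead time on a stream of event timestamps: a non-extendable dead time keeps an event only if it arrives at least τ after the last accepted event, so a burst of re-recorded samples from one saturated pulse collapses to a single count. The variant and the value of τ are illustrative choices, not necessarily the LMN scheme:

```python
def apply_dead_time(timestamps, tau):
    """Non-extendable software dead time: keep an event only if it
    arrives at least tau after the last *accepted* event."""
    accepted, last = [], None
    for t in sorted(timestamps):
        if last is None or t - last >= tau:
            accepted.append(t)
            last = t
    return accepted

# Bursts of re-recorded samples from one pulse collapse to one event
events = [10.0, 10.2, 10.3, 25.0, 25.1, 40.0]   # microseconds, invented
print(apply_dead_time(events, tau=2.0))          # [10.0, 25.0, 40.0]
```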

  18. EARLY IDENTIFICATION OF SOFTWARE DEFECTS USING OCL PREDICATES TO IMPROVE SOFTWARE QUALITY

    Directory of Open Access Journals (Sweden)

    A. JALILA

    2015-03-01

    Full Text Available Formal specification-based testing has been used widely to assess potential faults, or prove their absence, in a given system at the earliest possible stage. This research work proposes an automated fault-based testing framework to test the specification of a system written in the Object Constraint Language (OCL). Accordingly, the possible faults in the OCL specification of the system are anticipated by mutating its methods based on OCL predicate-based fault classes. Test cases are then generated using a Genetic Algorithm with a simulated annealing technique. In this paper, a novel OCL-predicate based fitness function is defined to evaluate the generated test data. Finally, this paper presents the experimental results, which indicate that the proposed methodology provides more test coverage with a reduced test suite and fewer test runs. This results in cost-effective software development and improved software quality.

  19. SIMA: Python software for analysis of dynamic fluorescence imaging data

    Directory of Open Access Journals (Sweden)

    Patrick eKaifosh

    2014-09-01

    Full Text Available Fluorescence imaging is a powerful method for monitoring dynamic signals in the nervous system. However, analysis of dynamic fluorescence imaging data remains burdensome, in part due to the shortage of available software tools. To address this need, we have developed SIMA, an open source Python package that facilitates common analysis tasks related to fluorescence imaging. Functionality of this package includes correction of motion artifacts occurring during in vivo imaging with laser-scanning microscopy, segmentation of imaged fields into regions of interest (ROIs), and extraction of signals from the segmented ROIs. We have also developed a graphical user interface (GUI) for manual editing of the automatically segmented ROIs and automated registration of ROIs across multiple imaging datasets. This software has been designed with flexibility in mind to allow for future extension with different analysis methods and potential integration with other packages. Software, documentation, and source code for the SIMA package and ROI Buddy GUI are freely available at http://www.losonczylab.org/sima/.

  20. Multicriteria analysis of ontologically represented information

    Science.gov (United States)

    Wasielewska, K.; Ganzha, M.; Paprzycki, M.; Bǎdicǎ, C.; Ivanovic, M.; Lirkov, I.

    2014-11-01

    Our current work concerns the development of a decision support system for the software selection problem. The main idea is to utilize expert knowledge to help the user in selecting the best software / method / computational resource to solve a computational problem. Obviously, this involves multicriterial decision making, and the key open question is: which method to choose. The context of the work is provided by the Agents in Grid (AiG) project, where the software selection (and thus multicriterial analysis) is to be realized when all information concerning the problem, the hardware and the software is ontologically represented. Initially, we considered the Analytical Hierarchy Process (AHP), which is well suited for hierarchical data structures (e.g., such as have been formulated in terms of ontologies). However, due to its well-known shortcomings, we decided to extend our search for the multicriterial analysis method best suited for the problem in question. In this paper we report the results of our search, which involved: (i) TOPSIS (Technique for Order Preference by Similarity to Ideal Solution), (ii) PROMETHEE, and (iii) GRIP (Generalized Regression with Intensities of Preference). We also briefly argue why other methods have not been considered as valuable candidates.
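
    Of the candidate methods named here, TOPSIS is the most mechanical to implement: alternatives are scored by their relative closeness to an ideal solution. A compact numpy sketch, with equal weights by default; the example matrix and criterion directions are invented for illustration:

```python
import numpy as np

def topsis(matrix, weights=None, benefit=None):
    """Rank alternatives (rows) over criteria (columns) by relative
    closeness to the ideal solution; higher score = better."""
    m = np.asarray(matrix, dtype=float)
    n_crit = m.shape[1]
    w = np.ones(n_crit) / n_crit if weights is None else np.asarray(weights)
    b = np.ones(n_crit, dtype=bool) if benefit is None else np.asarray(benefit)
    v = (m / np.linalg.norm(m, axis=0)) * w    # weighted normalized matrix
    ideal = np.where(b, v.max(axis=0), v.min(axis=0))
    anti  = np.where(b, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)

# Three software offers scored on [cost, memory, features]
scores = topsis([[250, 16, 12], [200, 16, 8], [300, 32, 16]],
                benefit=[False, True, True])   # cost is a cost criterion
print(scores.argsort()[::-1])                  # ranking, best first
```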

  1. 7. annual software survey 2009

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2009-07-15

    This article presented a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In addition to a description of the software application, this article listed the name of software providers and the new features available in each product. The featured software developed by Calgary-based providers included: OpenInvoice software developed by DO2 Technologies Inc; oil and gas solutions by Energy Navigator; WellSpring planning system by Enersight; Entero MOSAIC and Entero ONE software packages by Entero Corporation; Emission Manager by Envirosoft Corporation; ResSurveil, ResBalance and ResAssist by Epic Consulting Services Ltd.; OMNI 3D, VISTA 2D/3D seismic software by Gedco; geoSCOUT, petroCUBE and gDC by GeoLOGIC Systems Ltd.; IHS AccuMap and PETRA by IHS; WELLFLO, PIPEFLO and FORGAS wellbore solutions by Neotec; AFENexus, FANexus, GeoNexus, JVNexus, PANexus software by Pandell Technology Corporation; Oil and gas solutions by the Risk Advisory division of SAS; Petrel, ECLIPSE, Avocet, Osprey and Merak by Schlumberger Information Solutions; esi.manage and esi.executive by 3esi; and STABView, ROCKSBank by Weatherford Advanced Geotechnology. The featured software developed by Texas-based providers included the HTRI Xchanger Suite by Heat Transfer Research Inc.; the RFID-based asset tracking system by Merrick Systems; oil and gas solutions by Neuralog Inc.; geoscience data programs by OpenSpirit; and oil and gas solutions by Seismic Micro-Technology Inc. The featured software developed by Vancouver-based providers included the oil and gas solutions by Sustainet Software Solutions Inc.

  2. MEDBASE: Strategic Planning and Implementation of an Army Medical Department Software Application

    Science.gov (United States)

    2003-07-17

    Only fragments of this report survive extraction: implementation risk planning items (software/hardware performance testing, downtime procedures, contingency plans, people management, planning for upgrades) and the report's examples of role-model 'Big Hairy Audacious Goals' (BHAGs) that suit up-and-coming organizations, such as Nike's 1960s goal to "Crush Adidas!" and Honda's 1970s goal to "destroy Yamaha!".

  3. VDJtools: Unifying Post-analysis of T Cell Receptor Repertoires.

    Directory of Open Access Journals (Sweden)

    Mikhail Shugay

    2015-11-01

    Full Text Available Despite the growing number of immune repertoire sequencing studies, the field still lacks software for analysis and comprehension of this high-dimensional data. Here we report VDJtools, a complementary software suite that solves a wide range of T cell receptor (TCR) repertoire post-analysis tasks, provides detailed tabular output and publication-ready graphics, and is built on top of a flexible API. Using TCR datasets for a large cohort of unrelated healthy donors, twins, and multiple sclerosis patients, we demonstrate that VDJtools greatly facilitates the analysis and leads to sound biological conclusions. VDJtools software and documentation are available at https://github.com/mikessh/vdjtools.

  4. Modularity analysis of automotive control software

    OpenAIRE

    Dajsuren, Y.; Brand, van den, M.G.J.; Serebrenik, A.

    2013-01-01

    A design language and tool like MATLAB/Simulink is used for the graphical modelling and simulation of automotive control software. As the functionality based on electronics and software systems increases in motor vehicles, it is becoming increasingly important for system/software architects and control engineers in the automotive industry to ensure the quality of the highly complex MATLAB/Simulink control software. For automotive software, modularity is recognized as being a crucial quality a...

  5. Modular reweighting software for statistical mechanical analysis of biased equilibrium data

    Science.gov (United States)

    Sindhikara, Daniel J.

    2012-07-01

    Here a simple, useful, modular approach and software suite designed for statistical reweighting and analysis of equilibrium ensembles is presented. Statistical reweighting is useful, and sometimes necessary, for the analysis of equilibrium enhanced sampling methods, such as umbrella sampling or replica exchange, and also in experimental cases where biasing factors are explicitly known. Essentially, statistical reweighting allows extrapolation of data from one or more equilibrium ensembles to another. Here, the fundamental separable steps of statistical reweighting are broken up into modules, allowing for application to the general case and avoiding the black-box nature of some “all-inclusive” reweighting programs. Additionally, the programs included are, by design, written with few dependencies. The compilers required are either pre-installed on most systems or freely available for download with minimal trouble. Examples of the use of this suite applied to umbrella sampling and replica exchange molecular dynamics simulations are shown, along with advice on how to apply it in the general case. New version program summary: Program title: Modular reweighting version 2. Catalogue identifier: AEJH_v2_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJH_v2_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public License, version 3. No. of lines in distributed program, including test data, etc.: 179 118. No. of bytes in distributed program, including test data, etc.: 8 518 178. Distribution format: tar.gz. Programming language: C++, Python 2.6+, Perl 5+. Computer: Any. Operating system: Any. RAM: 50-500 MB. Supplementary material: An updated version of the original manuscript (Comput. Phys. Commun. 182 (2011) 2227) is available. Classification: 4.13. Catalogue identifier of previous version: AEJH_v1_0. Journal reference of previous version: Comput. Phys. Commun. 182 (2011) 2227. Does the new
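
    For the single-ensemble case the arithmetic is compact: samples drawn with a known bias V added to the potential are reweighted by exp(+V/kT) to recover unbiased ensemble averages. A minimal sketch with invented values; the actual suite is modular C++/Python/Perl and also handles WHAM-style combination of multiple ensembles, which is omitted here:

```python
import numpy as np

def reweighted_average(obs, bias, kT):
    """Unbiased ensemble average of an observable from samples drawn
    under a known biasing potential: <A> = sum(A_i w_i) / sum(w_i),
    with w_i = exp(+V_i / kT) removing the bias V_i."""
    v = np.asarray(bias) / kT
    w = np.exp(v - v.max())        # subtract max for numerical stability
    return np.average(obs, weights=w)

# Observable samples and the bias energy at each sample (illustrative)
obs  = np.array([1.0, 1.2, 0.8, 1.1])
bias = np.array([0.5, 0.1, 0.9, 0.3])   # kcal/mol
print(reweighted_average(obs, bias, kT=0.596))   # kT at ~300 K in kcal/mol
```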

  6. Development of a software for INAA analysis automation

    International Nuclear Information System (INIS)

    Zahn, Guilherme S.; Genezini, Frederico A.; Figueiredo, Ana Maria G.; Ticianelli, Regina B.

    2013-01-01

    In this work, software to automate the post-counting tasks in comparative INAA has been developed, aiming to be more flexible than the available options, integrating with some of the routines currently in use in the IPEN Activation Analysis Laboratory and allowing the user to choose between a fully automatic analysis and an Excel-oriented one. The software makes use of the Genie 2000 data importing and analysis routines and stores each 'energy-counts-uncertainty' table as a separate ASCII file that can be used later if required by the analyst. Moreover, it generates an Excel-compatible CSV (comma separated values) file with only the relevant results from the analyses for each sample or comparator, as well as the results of the concentration calculations and the results obtained with four different statistical tools (unweighted average, weighted average, normalized residuals and the Rajeval technique), allowing the analyst to double-check the results. Finally, a 'summary' CSV file is also produced, with the final concentration results obtained for each element in each sample. (author)
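
    Two of the four statistical tools named, the weighted average and normalized residuals, reduce to a few lines; a minimal sketch with invented concentration values (the Rajeval technique involves further iterative outlier treatment not shown here):

```python
import numpy as np

def weighted_mean(values, sigmas):
    """Inverse-variance weighted mean and its standard uncertainty."""
    w = 1.0 / np.asarray(sigmas) ** 2
    mean = np.sum(w * values) / np.sum(w)
    return mean, np.sqrt(1.0 / np.sum(w))

def normalized_residuals(values, sigmas):
    """Residuals in units of sigma against the weighted mean; entries
    with |r| well above ~2 are candidates for re-inspection."""
    mean, _ = weighted_mean(values, sigmas)
    return (np.asarray(values) - mean) / np.asarray(sigmas)

conc  = [41.2, 39.8, 40.5, 44.0]   # mg/kg, illustrative
sigma = [0.8, 0.9, 0.7, 1.0]
print(weighted_mean(conc, sigma))
print(normalized_residuals(conc, sigma))   # last value stands out
```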

  7. Software Performs Complex Design Analysis

    Science.gov (United States)

    2008-01-01

    Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help gain greater understanding of the structural response of components to loads, stresses and strains, and the prediction of failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: 1) inadequate shape parameterization algorithms, and 2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to utilize its revolutionary, one-of-a-kind arbitrary shape deformation (ASD) capability, a major advancement in solving these two aforementioned problems, to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing the CFD designers to freely create their own shape parameters, therefore eliminating the restriction of only being able to use the computer-aided design (CAD) parameters. The problem of inadequate algorithms for CFD grid modification is solved by the fact that the new software performs a smooth volumetric deformation. This eliminates the extremely costly process of having to remesh the grid for every shape change desired. The program can perform a design change in a markedly reduced amount of time, a process that would traditionally involve the designer returning to the CAD model to reshape and then remesh the shapes, something that has been known to take hours, days, even weeks or months, depending upon the size of the model.

  8. Technical Performance Assessment: Mission Success in Software Acquisition Management

    Science.gov (United States)

    2010-04-27

    Only fragments of this report survive extraction. Recoverable points: design constraints make software acquisition and development extremely critical; examples include the application domain (e.g., an Operational Flight Program) and the environment used to produce the software; risk management requires established and maintained risk management systems; and milestone reviews.

  9. Experimental analysis of specification language diversity impact on NPP software diversity

    International Nuclear Information System (INIS)

    Yoo, Chang Sik

    1999-02-01

    In order to increase computer system reliability, software fault tolerance methods have been adopted in some safety-critical systems, including NPPs. Prevention of software common mode failure is a crucial problem in software fault tolerance, but no effective method for this problem has been found yet. In our research, to find an effective method for the prevention of software common mode failure, the impact of specification language diversity on NPP software diversity was examined experimentally. Three specification languages were used to compose three requirements specifications, and programmers produced twelve product codes from the specifications. From the analysis of the product codes, using fault diversity criteria, we concluded that the diverse specification language method would enhance program diversity through diversification of requirements specification imperfections.

  10. Synchronized analysis of testbeam data with the Judith software

    CERN Document Server

    McGoldrick, Garrin; Gorišek, Andrej

    2014-01-01

    The Judith software performs pixel detector analysis tasks utilizing two different data streams such as those produced by the reference and tested devices typically found in a testbeam. This software addresses and fixes problems arising from the desynchronization of the two simultaneously triggered data streams by detecting missed triggers in either of the streams. The software can perform all tasks required to generate particle tracks using multiple detector planes: it can align the planes, cluster hits and generate tracks from these clusters. This information can then be used to measure the properties of a particle detector with very fine spatial resolution. It was tested at DESY in the Kartel telescope, a silicon tracking detector, with ATLAS Diamond Beam Monitor modules as a device under test.

  11. Durable Suit Bladder with Improved Water Permeability for Pressure and Environment Suits

    Science.gov (United States)

    Bue, Grant C.; Kuznetz, Larry; Orndoff, Evelyne; Tang, Henry; Aitchison, Lindsay; Ross, Amy

    2009-01-01

    Water vapor permeability is shown to be useful in rejecting heat and managing moisture accumulation in launch-and-entry pressure suits. Currently this is accomplished through a porous Gortex layer in the Advanced Crew and Escape Suit (ACES) and in the baseline design of the Constellation Suit System Element (CSSE) Suit 1. Available non-porous dense monolithic membranes (DMMs) offer potential improvements in water vapor permeability with reduced gas leakage. Accordingly, three different pressure bladder materials were investigated for water vapor permeability and oxygen leakage: Elasthane™ 80A (thermoplastic polyether urethane), provided from stock polymer material, and two custom thermoplastic polyether urethanes. Water vapor, carbon dioxide and oxygen permeability of the DMMs was measured in a 0.13 mm thick stand-alone layer, and in 0.08 mm and 0.05 mm thick layers, each bonded to two different nylon and polyester woven reinforcing materials. Additional water vapor permeability and mechanical compression measurements were made with the reinforced 0.05 mm thick layers, further bonded with a polyester wicking layer and overlaid with moistened polyester fleece thermal underwear. This simulated the pressure from a supine crew person. The 0.05 mm thick nylon-reinforced sample with the polyester wicking layer was further mechanically tested for wear and abrasion. Concepts for incorporating these materials in launch/entry and Extravehicular Activity pressure suits are presented.

  12. A Suite of Tools for Technology Assessment

    Science.gov (United States)

    2007-09-01

    Only fragments of this report survive extraction: it cites Sadin, Povinelli & Rosen (1989); notes that this was a significant change in emphasis on the part of NASA, where technology had previously been viewed as merely ...; and references the NASA Cost Analysis Symposium, April 13, 2005.

  13. An Agent Based Software Approach towards Building Complex Systems

    Directory of Open Access Journals (Sweden)

    Latika Kharb

    2015-08-01

    Full Text Available Agent-oriented techniques represent an exciting new means of analyzing, designing and building complex software systems. They have the potential to significantly improve current practice in software engineering and to extend the range of applications that can feasibly be tackled. Yet, to date, there have been few serious attempts to cast agent systems as a software engineering paradigm. This paper seeks to rectify this omission. Specifically, the points to be argued include: firstly, that the conceptual apparatus of agent-oriented systems is well suited to building software solutions for complex systems; and secondly, that agent-oriented approaches represent a genuine advance over the current state of the art for engineering complex systems. Following on from this view, the major issues raised by adopting an agent-oriented approach to software engineering are highlighted and discussed in this paper.

  14. Requirement analysis of the safety-critical software implementation for the nuclear power plant

    International Nuclear Information System (INIS)

    Chang, Hoon Seon; Jung, Jae Cheon; Kim, Jae Hack; Nam, Sang Ku; Kim, Hang Bae

    2005-01-01

    Safety-critical software shall be implemented under strict regulation and standards, along with hardware qualification. In general, safety-critical software has been implemented using functional block language (FBL) and structured languages like C in real projects. Software design shall comply with such characteristics as modularity, simplicity, minimal use of sub-routines, and exclusion of interrupt logic. To meet these prerequisites, we used a computer-aided software engineering (CASE) tool to substantiate the requirements traceability matrix that had previously been developed manually using word processors or spreadsheets. In addition, a coding standard and manual have been developed to confirm the quality of the software development process, covering readability, consistency, and maintainability in compliance with NUREG/CR-6463. A system-level preliminary hazard analysis (PHA) is performed by analyzing the preliminary safety analysis report (PSAR) and the FMEA document. The modularity concept is effectively implemented for the overall module configurations and functions using the RTP software development tool. The response time imposed on the basis of the deterministic structure of the safety-critical software was measured.

  15. TOUGH2 software qualification

    Energy Technology Data Exchange (ETDEWEB)

    Pruess, K.; Simmons, A.; Wu, Y.S.; Moridis, G.

    1996-02-01

    TOUGH2 is a numerical simulation code for multi-dimensional coupled fluid and heat flow of multiphase, multicomponent fluid mixtures in porous and fractured media. It belongs to the MULKOM ("MULti-KOMponent") family of codes and is a more general version of the TOUGH simulator. The MULKOM family of codes was originally developed with a focus on geothermal reservoir simulation. They are suited to modeling systems which contain different fluid mixtures, with applications to flow problems arising in the context of high-level nuclear waste isolation, oil and gas recovery and storage, and groundwater resource protection. TOUGH2 is essentially a subset of MULKOM, consisting of a selection of the better tested and documented MULKOM program modules. The purpose of this package of reports is to provide all software baseline documents necessary for the software qualification of TOUGH2.

  16. TOUGH2 software qualification

    International Nuclear Information System (INIS)

    Pruess, K.; Simmons, A.; Wu, Y.S.; Moridis, G.

    1996-02-01

    TOUGH2 is a numerical simulation code for multi-dimensional coupled fluid and heat flow of multiphase, multicomponent fluid mixtures in porous and fractured media. It belongs to the MULKOM ("MULti-KOMponent") family of codes and is a more general version of the TOUGH simulator. The MULKOM family of codes was originally developed with a focus on geothermal reservoir simulation. They are suited to modeling systems which contain different fluid mixtures, with applications to flow problems arising in the context of high-level nuclear waste isolation, oil and gas recovery and storage, and groundwater resource protection. TOUGH2 is essentially a subset of MULKOM, consisting of a selection of the better tested and documented MULKOM program modules. The purpose of this package of reports is to provide all software baseline documents necessary for the software qualification of TOUGH2.

  17. STAR: Software Toolkit for Analysis Research

    International Nuclear Information System (INIS)

    Doak, J.E.; Prommel, J.M.; Whiteson, R.; Hoffbauer, B.L.; Thomas, T.R.; Helman, P.

    1993-01-01

    Analyzing vast quantities of data from diverse information sources is an increasingly important element of nonproliferation and arms control analysis. Much of the work in this area has used human analysts to assimilate, integrate, and interpret complex information gathered from various sources. With the advent of fast computers, we now have the capability to automate this process, thereby shifting the burden away from humans. In addition, there now exist huge data storage capabilities, which have made it possible to formulate large integrated databases comprising many terabytes of information spanning a variety of subjects. We are currently designing a Software Toolkit for Analysis Research (STAR) to address these issues. The goal of STAR is to produce a research tool that facilitates the development and interchange of algorithms for locating phenomena of interest to nonproliferation and arms control experts. One major component deals with the preparation of information. The ability to manage and effectively transform raw data into a meaningful form is a prerequisite for analysis by any methodology. The relevant information to be analyzed can be unstructured text, structured data, signals, or images. Text can be numerical and/or character, stored in raw data files, databases, streams of bytes, or compressed into bits, in formats ranging from fixed, to character-delimited, to a count followed by content. The data can be analyzed in real-time or batch mode. Once the data are preprocessed, different analysis techniques can be applied. Some are built using expert knowledge. Others are trained using data collected over a period of time. Currently, we are considering three classes of analyzers for use in our software toolkit: (1) traditional machine learning techniques, (2) purely statistical systems, and (3) expert systems

  18. Analysis of Variance: What Is Your Statistical Software Actually Doing?

    Science.gov (United States)

    Li, Jian; Lomax, Richard G.

    2011-01-01

    Users assume statistical software packages produce accurate results. In this article, the authors systematically examined Statistical Package for the Social Sciences (SPSS) and Statistical Analysis System (SAS) for 3 analysis of variance (ANOVA) designs, mixed-effects ANOVA, fixed-effects analysis of covariance (ANCOVA), and nested ANOVA. For each…

  19. Quality control in dual head γ-cameras: comparison between methods and software used for image analysis

    International Nuclear Information System (INIS)

    Nayl E, A.; Fornasier, M. R.; De Denaro, M.; Sulieman, A.; Alkhorayef, M.; Bradley, D.

    2017-10-01

    Patient radiation dose and image quality are the main issues in nuclear medicine (Nm) procedures. Currently, many protocols are used for image acquisition and analysis of quality control (Qc) tests. National Electrical Manufacturers Association (Nema) methods and protocols are widely accepted for providing accurate description, measurement and reporting of γ-camera performance parameters. However, no standard software is available for image analysis. The aim of this study was to compare the vendor Qc software analysis with three software packages from different developers, downloaded free from the internet: NMQC, Nm Tool kit and ImageJ-Nm Tool kit. The three packages are used for image analysis of some Qc tests for γ-cameras based on Nema protocols, including non-uniformity evaluation. Ten non-uniformity Qc images were taken from a dual head γ-camera (Siemens Symbia) installed in Trieste general hospital (Italy) and analyzed. Excel analysis was used as the baseline calculation of the non-uniformity test according to Nema procedures. The results of the non-uniformity analysis showed good agreement between the three independent packages and the Excel calculation (the average differences were 0.3%, 2.9%, 1.3% and 1.6% for UFOV integral, UFOV differential, CFOV integral and CFOV differential, respectively), while a significant difference was detected in the analysis of the vendor Qc software compared to the Excel analysis (the average differences were 14.6%, 20.7%, 25.7% and 31.9% for UFOV integral, UFOV differential, CFOV integral and CFOV differential, respectively). NMQC was the best in comparison with the Excel calculations. The variation in the results is due to the different pixel sizes used for analysis in the three packages and the γ-camera Qc software. Therefore, it is important to perform the tests with the vendor Qc software as well as with independent analysis to understand the differences between the values. Moreover, the medical physicist should know

  20. Quality control in dual head γ-cameras: comparison between methods and software used for image analysis

    Energy Technology Data Exchange (ETDEWEB)

    Nayl E, A. [Sudan Atomic Energy Commission, Radiation Safety Institute, Khartoum (Sudan); Fornasier, M. R.; De Denaro, M. [Azienda Sanitaria Universitaria Integrata di Trieste, Medical Physics Department, Via Giovanni Sai 7, 34128 Trieste (Italy); Sulieman, A. [Prince Sattam bin Abdulaziz University, College of Applied Medical Sciences, Radiology and Medical Imaging Department, P. O. Box 422, 11942 Al-Kharj (Saudi Arabia); Alkhorayef, M.; Bradley, D., E-mail: abdwsh10@hotmail.com [University of Surrey, Department of Physics, GU2-7XH Guildford, Surrey (United Kingdom)

    2017-10-15

    Patient radiation dose and image quality are the main issues in nuclear medicine (Nm) procedures. Currently, many protocols are used for image acquisition and analysis of quality control (Qc) tests. National Electrical Manufacturers Association (Nema) methods and protocols are widely accepted for providing accurate description, measurement and reporting of γ-camera performance parameters. However, no standard software is available for image analysis. The aim of this study was to compare the vendor Qc software analysis with three software packages from different developers, downloaded free from the internet: NMQC, Nm Tool kit and ImageJ-Nm Tool kit. The three packages are used for image analysis of some Qc tests for γ-cameras based on Nema protocols, including non-uniformity evaluation. Ten non-uniformity Qc images were taken from a dual head γ-camera (Siemens Symbia) installed in Trieste general hospital (Italy) and analyzed. Excel analysis was used as the baseline calculation of the non-uniformity test according to Nema procedures. The results of the non-uniformity analysis showed good agreement between the three independent packages and the Excel calculation (the average differences were 0.3%, 2.9%, 1.3% and 1.6% for UFOV integral, UFOV differential, CFOV integral and CFOV differential, respectively), while a significant difference was detected in the analysis of the vendor Qc software compared to the Excel analysis (the average differences were 14.6%, 20.7%, 25.7% and 31.9% for UFOV integral, UFOV differential, CFOV integral and CFOV differential, respectively). NMQC was the best in comparison with the Excel calculations. The variation in the results is due to the different pixel sizes used for analysis in the three packages and the γ-camera Qc software. Therefore, it is important to perform the tests with the vendor Qc software as well as with independent analysis to understand the differences between the values. Moreover, the medical physicist should know
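
    The integral and differential uniformity figures compared above follow the usual Nema-style definitions: integral uniformity is 100*(max-min)/(max+min) over the field of view, and differential uniformity is the worst such value in a five-pixel sliding window along every row and column. A rough numpy sketch; the smoothing and UFOV/CFOV masking steps Nema prescribes are omitted:

```python
import numpy as np

def integral_uniformity(fov):
    """Integral uniformity (%) over a field of view."""
    px = fov[~np.isnan(fov)]
    return 100.0 * (px.max() - px.min()) / (px.max() + px.min())

def differential_uniformity(fov, window=5):
    """Differential uniformity (%): worst integral uniformity in a
    sliding window along every row and column."""
    worst = 0.0
    for image in (fov, fov.T):                  # rows, then columns
        for line in image:
            line = line[~np.isnan(line)]
            for i in range(len(line) - window + 1):
                seg = line[i:i + window]
                worst = max(worst, 100.0 * (seg.max() - seg.min())
                                         / (seg.max() + seg.min()))
    return worst

# Synthetic flood-field image with Poisson counting noise
flood = np.random.default_rng(0).poisson(10_000, size=(64, 64)).astype(float)
print(integral_uniformity(flood), differential_uniformity(flood))
```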

  1. Methodology for object-oriented real-time systems analysis and design: Software engineering

    Science.gov (United States)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structurings of the system, so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation and the original specification, and perhaps the high-level design, is not object-oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects whose operations or methods correspond to processes in the data flow diagrams, and then designing in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time-behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced; it becomes the basic building block of a series of seamlessly connected models which progress from the logical models of object-oriented real-time systems analysis, through the physical architectural models, to the high-level design stages. The methodology is appropriate to the overall specification including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.

  2. A Survey of Automatic Code Generating Software

    Science.gov (United States)

    1988-09-01

    Only fragments of this survey survive extraction: a description of a login mechanism (a user ID must match a name in the user's database file and must then match the password specified for that user; a successful login allows access to the software) and entries from a vendor directory (Software Corp., Chicago, IL; PRO-2 by Prodata, Inc., Boise, ID; GTP).

  3. A field study on root cause analysis of defects in space software

    International Nuclear Information System (INIS)

    Silva, Nuno; Cunha, João Carlos; Vieira, Marco

    2017-01-01

    Critical systems, such as space systems, are developed under strict requirements envisaging high integrity in accordance to specific standards. For such software systems, an independent assessment is put into effect (Independent Software Verification and Validation – ISVV) after the regular development lifecycle and V&V activities, aiming at finding residual faults and raising confidence in the software. However, it has been observed that there is still a significant number of defects remaining at this stage, questioning the effectiveness of the previous engineering processes. This paper presents a root cause analysis of 1070 defects found in four space software projects during ISVV, by applying an improved Orthogonal Defect Classification (ODC) taxonomy and examining the defect types, triggers and impacts, in order to identify why they reached such a later stage in the development. The paper also puts forward proposals for modifications to both the software development (to prevent defects) and the V&V activities (to better detect defects) and an assessment methodology for future works on root cause analysis. - Highlights: • Root cause analysis of space software defects by using an enhanced ODC taxonomy. • Prioritization of the root causes according to the more important defect impacts. • Identification of improvements to systems engineering and development processes. • Improvements to V&V activities as means to reduce the occurrence of defects. • Generic process to achieve the defects root causes and the corrections suggestions.

  4. Software selection based on analysis and forecasting methods, practised in 1C

    Science.gov (United States)

    Vazhdaev, A. N.; Chernysheva, T. Y.; Lisacheva, E. I.

    2015-09-01

    The research focuses on the built-in data analysis and forecasting mechanisms of the "1C: Enterprise 8" platform. It is important to evaluate and select proper software in order to develop effective strategies for customer relationship management in terms of sales, as well as for the implementation and further maintenance of software. The research data allow creating new forecast models to schedule further software distribution.

  5. The vTAS suite: A simulator for classical and multiplexed three-axis neutron spectrometers

    International Nuclear Information System (INIS)

    Boehm, M.; Filhol, A.; Raoul, Y.; Kulda, J.; Schmidt, W.; Schmalzl, K.; Farhi, E.

    2013-01-01

    The vTAS suite provides graphical assistance to prepare and perform inelastic neutron scattering experiments on a TAS instrument, including latest multiplexed instrumental configurations, such as FlatCone, IMPS and UFO. The interactive display allows for flexible translation between instrument positions in real space and neutron scattering conditions represented in reciprocal space. It is a platform independent public domain software tool, available for download from the website of the Institut Laue Langevin (ILL).

  6. The Aviation Performance Measuring System (APMS): An Integrated Suite of Tools for Measuring Performance and Safety

    Science.gov (United States)

    Statler, Irving C.; Connor, Mary M. (Technical Monitor)

    1998-01-01

    This is a report of work in progress. In it, I summarize the status of the research and development of the Aviation Performance Measuring System (APMS) for managing, processing, and analyzing digital flight-recorded data. The objectives of the NASA-FAA APMS research project are to establish a sound scientific and technological basis for flight-data analysis, to define an open and flexible architecture for flight-data analysis systems, and to articulate guidelines for a standardized database structure on which to continue to build future flight-data-analysis extensions. APMS offers to the air transport community an open, voluntary standard for flight-data-analysis software; a standard that will help to ensure suitable functionality and data interchangeability among competing software programs. APMS will develop and document the methodologies, algorithms, and procedures for data management and analyses to enable users to easily interpret the implications regarding safety and efficiency of operations. APMS does not entail the implementation of a nationwide flight-data-collection system. It is intended to provide technical tools to ease the large-scale implementation of flight-data analyses at both the air-carrier and the national-airspace levels in support of their Flight Operations and Quality Assurance (FOQA) Programs and Advanced Qualifications Programs (AQP). APMS cannot meet its objectives unless it develops tools that go substantially beyond the capabilities of the current commercially available software and supporting analytic methods that are mainly designed to count special events. These existing capabilities, while of proven value, were created primarily with the needs of aircrews in mind. APMS tools must serve the needs of the government and air carriers, as well as aircrews, to fully support the FOQA and AQP programs. They must be able to derive knowledge not only through the analysis of single flights (special-event detection), but also through

  7. Chaste: A test-driven approach to software development for biological modelling

    KAUST Repository

    Pitt-Francis, Joe; Pathmanathan, Pras; Bernabeu, Miguel O.; Bordas, Rafel; Cooper, Jonathan; Fletcher, Alexander G.; Mirams, Gary R.; Murray, Philip; Osborne, James M.; Walter, Alex; Chapman, S. Jon; Garny, Alan; van Leeuwen, Ingeborg M.M.; Maini, Philip K.; Rodrí guez, Blanca; Waters, Sarah L.; Whiteley, Jonathan P.; Byrne, Helen M.; Gavaghan, David J.

    2009-01-01

    Chaste ('Cancer, heart and soft-tissue environment') is a software library and a set of test suites for computational simulations in the domain of biology. Current functionality has arisen from modelling in the fields of cancer, cardiac physiology

  8. The Use of Flexible, Interactive, Situation-Focused Software for the E-Learning of Mathematics.

    Science.gov (United States)

    Farnsworth, Ralph Edward

    This paper discusses the classroom, home, and distance use of new, flexible, interactive, application-oriented software known as Active Learning Suite. The actual use of the software, not just a controlled experiment, is reported on. Designed for the e-learning of university mathematics, the program was developed by a joint U.S.-Russia team and…

  9. IMMAN: free software for information theory-based chemometric analysis.

    Science.gov (United States)

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis named IMMAN (an acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of the Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty, are incorporated into the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing-value processing, dataset partitioning, and browsing. Moreover, single-parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
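
    A minimal sketch of the kind of ranking IMMAN computes, assuming Shannon information gain with equal-interval discretization (the function and variable names below are illustrative, not IMMAN's API):

        import numpy as np

        def discretize(x, bins=10):
            """Equal-interval discretization of a continuous descriptor."""
            edges = np.linspace(x.min(), x.max(), bins + 1)
            return np.digitize(x, edges[1:-1])  # bin index 0..bins-1 per sample

        def entropy(labels):
            """Shannon entropy H(Y) in bits."""
            _, counts = np.unique(labels, return_counts=True)
            p = counts / counts.sum()
            return -np.sum(p * np.log2(p))

        def information_gain(x, y, bins=10):
            """IG(Y; X) = H(Y) - H(Y | discretized X)."""
            xd = discretize(x, bins)
            h_cond = sum((xd == v).mean() * entropy(y[xd == v]) for v in np.unique(xd))
            return entropy(y) - h_cond

        # Rank descriptors (columns of X) by information gain with respect to class y.
        X = np.random.rand(100, 5)        # toy descriptor matrix
        y = np.random.randint(0, 2, 100)  # toy class labels
        ranking = sorted(range(X.shape[1]), key=lambda j: -information_gain(X[:, j], y))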

  10. Learning from examples - Generation and evaluation of decision trees for software resource analysis

    Science.gov (United States)

    Selby, Richard W.; Porter, Adam A.

    1988-01-01

    A general solution method for the automatic generation of decision (or classification) trees is investigated. The approach is to provide insights through in-depth empirical characterization and evaluation of decision trees for software resource data analysis. The trees identify classes of objects (software modules) that had high development effort. Sixteen software systems ranging from 3,000 to 112,000 source lines were selected for analysis from a NASA production environment. The collection and analysis of 74 attributes (or metrics), for over 4,700 objects, captured information about the development effort, faults, changes, design style, and implementation style. A total of 9,600 decision trees were automatically generated and evaluated. The trees correctly identified 79.3 percent of the software modules that had high development effort or faults, and the trees generated from the best parameter combinations correctly identified 88.4 percent of the modules on the average.
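
    For a sense of what such automatic tree generation looks like with today's tooling, a minimal sketch using scikit-learn on synthetic stand-in metrics (the metric mix, target definition, and parameters below are invented for illustration, not the paper's setup):

        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(0)
        # Toy stand-ins for module metrics (e.g., size, changes, faults) and a
        # binary target marking modules in the top quartile of development effort.
        X = rng.random((200, 3))
        effort = X @ np.array([5.0, 2.0, 1.0]) + rng.normal(0, 0.3, 200)
        y = effort > np.quantile(effort, 0.75)

        tree = DecisionTreeClassifier(max_depth=4, min_samples_leaf=5)
        print(cross_val_score(tree, X, y, cv=5).mean())  # fraction correctly classified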

  11. STEM - software test and evaluation methods: fault detection using static analysis techniques

    International Nuclear Information System (INIS)

    Bishop, P.G.; Esp, D.G.

    1988-08-01

    STEM is a software reliability project with the objective of evaluating a number of fault detection and fault estimation methods which can be applied to high integrity software. This report gives some interim results of applying both manual and computer-based static analysis techniques, in particular SPADE, to an early CERL version of the PODS software containing known faults. The main results of this study are as follows: the scope for thorough verification is determined by the quality of the design documentation; documentation defects become especially apparent when verification is attempted. For well-defined software, the thoroughness of SPADE-assisted verification for detecting a large class of faults was successfully demonstrated. For imprecisely defined software (not recommended for high-integrity systems), the use of tools such as SPADE is difficult and inappropriate. Analysis and verification tools are helpful because of their reliability and thoroughness. However, they are designed to assist, not replace, a human in validating software. Manual inspection can still reveal errors (such as errors in specification and errors of transcription of system constants) which current tools cannot detect. There is a need for tools to automatically detect typographical errors in system constants, for example by reporting outliers to patterns. To obtain the maximum benefit from advanced tools, they should be applied during software development (when verification problems can be detected and corrected) rather than retrospectively. (author)
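
    The suggested tool for catching transcription errors in system constants, reporting outliers to patterns, might look like this toy sketch (entirely illustrative; the report describes no such implementation):

        # Report pairs of constants whose values are suspiciously close but not equal,
        # a crude "outlier to pattern" check for transcription slips.
        from itertools import combinations

        constants = {"C_LIGHT": 2.99792458e8, "C_LIGHT_KM": 2.99792458e5,
                     "BOLTZMANN": 1.380649e-23, "BOLTZMAN": 1.380469e-23}  # typo'd twin

        for (a, va), (b, vb) in combinations(constants.items(), 2):
            if va != vb and abs(va - vb) / max(abs(va), abs(vb)) < 1e-3:
                print(f"suspicious near-duplicate: {a}={va} vs {b}={vb}")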

  12. Application of software quality assurance methods in validation and maintenance of reactor analysis computer codes

    International Nuclear Information System (INIS)

    Reznik, L.

    1994-01-01

    Various computer codes employed at Israel Electricity Company for preliminary reactor design analysis and fuel cycle scoping calculations have often been subject to program source modifications. Although most changes were due to computer or operating system compatibility problems, a number of significant modifications were due to model improvement and enhancements of algorithm efficiency and accuracy. With growing acceptance of software quality assurance requirements and methods, a program of implementing extensive testing of modified software has been adopted within the regular maintenance activities. In this work, a survey was performed of various software testing methods, which belong mainly to the two major categories of implementation-based ('white box') and specification-based ('black box') testing. The results of this survey exhibit a clear preference for specification-based testing. In particular, the equivalence class partitioning method and the boundary value method have been selected as especially suitable functional methods for testing reactor analysis codes. A separate study of software analysis and specification methods was performed with the objective of establishing appropriate pre-test software specification methods. Two methods were selected as the most suitable for this purpose: the data flow diagram method was shown to be particularly valuable for performing the functional/procedural software specification, while the entity-relationship diagram method proved efficient for specifying the software data/information domain. The feasibility of these two methods has been analyzed, in particular for software uncertainty analysis and overall code accuracy estimation. (author). 14 refs
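
    As an illustration of the two selected functional methods, a boundary value and equivalence class test of a hypothetical input-range check (the routine and its limits are invented for the example):

        def enrichment_valid(enrichment_pct):
            """Hypothetical input check: accept fuel enrichment in [0.7, 5.0] percent."""
            return 0.7 <= enrichment_pct <= 5.0

        # Boundary value method: test at, just below, and just above each boundary;
        # equivalence class partitioning adds one representative per class.
        cases = {0.69: False, 0.70: True, 0.71: True,   # lower boundary
                 4.99: True, 5.00: True, 5.01: False,   # upper boundary
                 3.0: True, -1.0: False, 100.0: False}  # class representatives
        for value, expected in cases.items():
            assert enrichment_valid(value) is expected, value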

  13. Linear Programming Approaches for Power Savings in Software-defined Networks

    NARCIS (Netherlands)

    Moghaddam, F.A.; Grosso, P.

    2016-01-01

    Software-defined networks have been proposed as a viable solution to decrease the power consumption of the networking component in data center networks. Still the question remains on which scheduling algorithms are most suited to achieve this goal. We propose 4 different linear programming
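
    A minimal sketch of how a power-minimizing traffic assignment can be cast as a linear program (illustrative only; the paper's actual formulations, which typically also involve integer on/off variables for links and switches, are not reproduced here):

        import numpy as np
        from scipy.optimize import linprog

        # Route a traffic demand of 10 units over 3 parallel links; each unit of
        # traffic on link i costs power p[i], and each link carries at most cap[i].
        p = np.array([1.0, 1.5, 3.0])    # power per unit of traffic
        cap = np.array([6.0, 6.0, 6.0])  # link capacities
        demand = 10.0

        res = linprog(c=p,                               # minimize total power
                      A_eq=[[1, 1, 1]], b_eq=[demand],   # meet the demand exactly
                      bounds=list(zip([0, 0, 0], cap)))
        print(res.x)  # expected: fill the cheapest links first -> [6, 4, 0]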

  14. Analysis of signal acquisition in GPS receiver software

    Directory of Open Access Journals (Sweden)

    Vlada S. Sokolović

    2011-01-01

    Full Text Available This paper presents a critical analysis of the signal processing flow carried out in GPS receiver software, which served as a basis for a critical comparison of different signal processing architectures within the GPS receiver. Increased flexibility and reduced commercial costs of GPS devices, including mobile devices, can be achieved by using software-defined radio (SDR) technology. The SDR application can be realized when certain hardware components in a GPS receiver are replaced. Signal processing in the SDR is implemented using a programmable DSP (Digital Signal Processing) or FPGA (Field Programmable Gate Array) circuit, which allows a simple change of digital signal processing algorithms and a simple change of the receiver parameters. The starting point of the research is the signal generated on the satellite, the structure of which is shown in the paper. Based on the GPS signal structure, a receiver is realized with the task of extracting the appropriate signal from the spectrum and detecting it. Based on collected navigation data, the receiver calculates the position of the end user. The signal coming from the satellite may be at the carrier frequencies L1 and L2. Since the SPS is used in the civil service, all the tests shown in the work were performed on the L1 signal. The signal coming to the receiver is generated with spread spectrum technology and is situated below the noise level. Such signals often interfere with signals from the environment, which makes it difficult for a receiver to perform proper detection and signal processing. Therefore, signal processing technology is continually being improved, aiming at more accurate and faster signal processing. All tests were carried out on a signal acquired from the satellite using the SE4110 input circuit used for filtering, amplification and signal selection. The samples of the received signal were forwarded to a computer for data post-processing, i. e
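
    Acquisition in such a software receiver is, at its core, a two-dimensional search over Doppler frequency and code phase; a toy numpy sketch of that search (the sample rate, IF, Doppler grid, and stand-in PRN code are invented, not the SE4110 front-end parameters):

        import numpy as np

        fs, f_if = 4.0e6, 1.25e6             # sample rate and nominal IF, illustrative
        n = 4000                             # 1 ms of samples at 4 MHz
        code = np.sign(np.random.randn(n))   # stand-in for a sampled C/A PRN code

        # Received signal: code delayed by 700 samples, 2 kHz Doppler, buried in noise.
        t = np.arange(n) / fs
        rx = np.roll(code, 700) * np.cos(2*np.pi*(f_if + 2000)*t) + 3*np.random.randn(n)

        best = (0, 0, -np.inf)
        for doppler in np.arange(-5000, 5001, 500):  # Doppler bins
            baseband = rx * np.exp(-2j*np.pi*(f_if + doppler)*t)
            # Circular correlation over all code phases at once via the FFT.
            corr = np.abs(np.fft.ifft(np.fft.fft(baseband) * np.conj(np.fft.fft(code))))
            phase = int(np.argmax(corr))
            if corr[phase] > best[2]:
                best = (doppler, phase, corr[phase])

        print("detected Doppler %d Hz, code phase %d samples" % best[:2])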

  15. Software testability and its application to avionic software

    Science.gov (United States)

    Voas, Jeffrey M.; Miller, Keith W.; Payne, Jeffery E.

    1993-01-01

    Randomly generated black-box testing is an established yet controversial method of estimating software reliability. Unfortunately, as software applications have required higher reliabilities, practical difficulties with black-box testing have become increasingly problematic. These practical problems are particularly acute in life-critical avionics software, where requirements of 10^-7 failures per hour of system reliability can translate into a probability of failure (POF) of perhaps 10^-9 or less for each individual execution of the software. This paper describes the application of one type of testability analysis called 'sensitivity analysis' to B-737 avionics software; one application of sensitivity analysis is to quantify whether software testing is capable of detecting faults in a particular program and thus whether we can be confident that a tested program is not hiding faults. We do so by finding the testabilities of the individual statements of the program, and then use those statement testabilities to find the testabilities of the functions and modules. For the B-737 system we analyzed, we were able to isolate those functions that are more prone to hide errors during system/reliability testing.
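
    A toy sketch of the propagation part of such a testability estimate: perturb a statement's data state over random inputs and count how often the program's output changes (a simplification for illustration, not the published sensitivity analysis procedure):

        import random

        def program(x):
            y = x * x                         # statement under analysis
            return y + 1 if y > 10 else 0

        def perturbed(x):
            y = x * x
            y += random.choice([-1, 1])       # inject a small data-state perturbation
            return y + 1 if y > 10 else 0

        # Estimate how often a perturbation of the statement's data state
        # propagates to a different program output over random inputs.
        trials = 10000
        propagated = sum(program(x) != perturbed(x)
                         for x in (random.uniform(-10, 10) for _ in range(trials)))
        print("estimated propagation probability: %.2f" % (propagated / trials))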

  16. Development of Power Assisting Suit

    Science.gov (United States)

    Yamamoto, Keijiro; Ishii, Mineo; Hyodo, Kazuhito; Yoshimitsu, Toshihiro; Matsuo, Takashi

    In order to realize a wearable power assisting suit for assisting a nurse to carry a patient in her arms, the power supply and control systems of the suit have to be miniaturized, and it has to be wireless and pipeline-less. The new wearable suit consists of shoulder, arm, back, waist and leg units to be fitted on the nurse's body. The arms, waist and legs have new pneumatic rotary actuators driven directly by micro air pumps supplied by portable Ni-Cd batteries. The muscle forces are sensed by a new muscle hardness sensor utilizing a sensing tip mounted on a force sensing film device. An embedded microcomputer is used for the calculation of control signals. The new wearable suit was applied practically to a human body, and a series of movement experiments was performed in which weights held in the arms were raised and lowered. Each unit of the suit could transmit assisting torque directly to each joint, verifying its practicability.

  17. Study of the suit inflation effect on crew safety during landing using a full-pressure IVA suit for new-generation reentry space vehicles

    Science.gov (United States)

    Wataru, Suzuki

    Manned space capsules have recently been recognized again as beneficial and practical human space vehicles. The Dragon capsule has already achieved several significant successes. The Orion capsule is going to be sent to a high-apogee orbit without crew for experimental purposes in September 2014. For such human-rated space capsules, the study of acceleration impacts on the human body during splashdown is essential to ensure the safety of crews. Moreover, it is also known that wearing a full pressure rescue suit significantly increases the safety of a crew, compared to wearing a partial pressure suit. This is mainly because it enables the independent use of a personal life support system in addition to the one installed in the space vehicle. However, it is unclear how the inflation of the full pressure suit due to pressurization affects crew safety during splashdown, especially in the case of new-generation manned space vehicles. Therefore, the purpose of this work is to investigate the effect of suit inflation on crew safety against acceleration impact during splashdown. For this objective, the displacements of the safety harness in relation to the suit, a human surrogate, and the crew seats are measured while the suit is pressurized, in order to determine whether the safety and survivability of a crew can be improved by wearing a full pressure suit. For these tests, the DL/H-1 full pressure IVA suit, developed by Pablo de Leon and Gary L. Harris, will be used. These tests use image analysis techniques to determine the displacements. It is expected, as a result of these tests, that wearing a full pressure suit will help to mitigate the impacts and will increase the safety and survivability of a crew during landing, since it works as a buffer to mitigate impact forces during splashdown. This work also proposes a future plan for sled test experiments using a sled facility such as the one in use by the Civil Aerospace Medical Institute (CAMI) for experimental validation

  18. Modularity analysis of automotive control software

    NARCIS (Netherlands)

    Dajsuren, Y.; Brand, van den M.G.J.; Serebrenik, A.

    2013-01-01

    A design language and tool like MATLAB/Simulink is used for the graphical modelling and simulation of automotive control software. As the functionality based on electronics and software systems increases in motor vehicles, it is becoming increasingly important for system/software architects and

  19. Integrating open-source software applications to build molecular dynamics systems.

    Science.gov (United States)

    Allen, Bruce M; Predecki, Paul K; Kumosa, Maciej

    2014-04-05

    Three open-source applications, NanoEngineer-1, packmol, and msi2lmp, are integrated using an open-source file format to quickly create molecular dynamics (MD) cells for simulation. The three software applications collectively make up the open-source software (OSS) suite known as MD Studio (MDS). The software is validated through software engineering practices and is verified through simulation of the diglycidyl ether of bisphenol-A and isophorone diamine (DGEBA/IPD) system. Multiple simulations are run using the MDS software to create MD cells, and the data generated are used to calculate the density, bulk modulus, and glass transition temperature of the DGEBA/IPD system. Simulation results compare well with published experimental and numerical results. The MDS software prototype confirms that OSS applications can be analyzed against real-world research requirements and integrated to create a new capability. Copyright © 2014 Wiley Periodicals, Inc.
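
    As one example of the post-processing described, the density of an MD cell follows directly from the total atomic mass and the box volume; a generic sketch (not MDS code):

        # Density of a cubic MD cell: rho = total mass / volume.
        AVOGADRO = 6.02214076e23  # particles per mole

        def density_g_cm3(particle_masses_amu, box_edge_angstrom):
            """Masses in amu (g/mol) per particle; cubic box edge in Angstroms."""
            mass_g = sum(particle_masses_amu) / AVOGADRO
            volume_cm3 = (box_edge_angstrom * 1e-8) ** 3
            return mass_g / volume_cm3

        # Toy cell: 1000 water molecules (18.015 amu) in a 31.04 A box -> ~1.0 g/cm3.
        print(density_g_cm3([18.015] * 1000, 31.04))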

  20. Thermal-hydraulic software development for nuclear waste transportation cask design and analysis

    International Nuclear Information System (INIS)

    Brown, N.N.; Burns, S.P.; Gianoulakis, S.E.; Klein, D.E.

    1991-01-01

    This paper describes the development of a state-of-the-art thermal-hydraulic software package intended for spent fuel and high-level nuclear waste transportation cask design and analysis. The objectives of this software development effort are threefold: (1) to take advantage of advancements in computer hardware and software to provide a more efficient user interface, (2) to provide a tool for reducing inefficient conservatism in spent fuel and high-level waste shipping cask design by including convection as well as conduction and radiation heat transfer modeling capabilities, and (3) to provide a thermal-hydraulic analysis package which is developed under a rigorous quality assurance program established at Sandia National Laboratories. 20 refs., 5 figs., 2 tabs

  1. An architectural model for software reliability quantification: sources of data

    International Nuclear Information System (INIS)

    Smidts, C.; Sova, D.

    1999-01-01

    Software reliability assessment models in use today treat software as a monolithic block. An aversion towards 'atomic' models seems to exist. These models appear to add complexity to the modeling and to the data collection, and seem intrinsically difficult to generalize. In 1997, we introduced an architecturally based software reliability model called FASRE. The model is based on an architecture derived from the requirements, which captures both functional and nonfunctional requirements, and on a generic classification of functions, attributes and failure modes. The model focuses on evaluation of failure mode probabilities and uses a Bayesian quantification framework. Failure mode probabilities of functions and attributes are propagated to the system level using fault trees. It can incorporate any type of prior information, such as results of developers' testing or historical information on a specific functionality and its attributes, and is ideally suited for reusable software. By building an architecture and deriving its potential failure modes, the model forces early appraisal and understanding of the weaknesses of the software, allows reliability analysis of the structure of the system, and provides assessments at a functional level as well as at the system level. In order to quantify the probability of failure (or the probability of success) of a specific element of our architecture, data are needed. The term element of the architecture is used here in its broadest sense to mean a single failure mode or a higher level of abstraction such as a function. The paper surveys the potential sources of software reliability data available during software development. Next, the mechanisms for incorporating these sources of relevant data into the FASRE model are identified
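
    The propagation step described, combining failure mode probabilities up to system level through fault trees, reduces to simple gate algebra when events are independent; a generic sketch (not FASRE itself):

        from functools import reduce

        def or_gate(probs):
            """P(any of several independent events) = 1 - prod(1 - p_i)."""
            return 1 - reduce(lambda acc, p: acc * (1 - p), probs, 1.0)

        def and_gate(probs):
            """P(all of several independent events) = prod(p_i)."""
            return reduce(lambda acc, p: acc * p, probs, 1.0)

        # Toy tree: the system fails if function A fails OR both redundant B-channels fail.
        p_A = or_gate([1e-3, 5e-4])   # A's two failure modes
        p_B = and_gate([1e-2, 1e-2])  # redundant pair
        print(or_gate([p_A, p_B]))    # top event probability, ~1.6e-3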

  2. Carotid artery stenosis: Performance of advanced vessel analysis software in evaluating CTA

    International Nuclear Information System (INIS)

    Tsiflikas, Ilias; Biermann, Christina; Thomas, Christoph; Ketelsen, Dominik; Claussen, Claus D.; Heuschmid, Martin

    2012-01-01

    Objectives: The aim of this study was to evaluate the time efficiency and diagnostic reproducibility of advanced vessel analysis software for the diagnosis of carotid artery stenosis. Material and methods: 40 patients with suspected carotid artery stenosis received head and neck DE-CTA as part of their pre-interventional workup. Acquired data were evaluated by 2 independent radiologists. Stenosis grading was performed by MPR eyeballing with freely adjustable MPRs and with a preliminary prototype of the meanwhile available client-server and advanced visualization software syngo.via CT Vascular (Siemens Healthcare, Erlangen, Germany). Stenoses were graded according to the following 5 categories: I: 0%, II: 1–50%, III: 51–69%, IV: 70–99% and V: total occlusion. Furthermore, the time to diagnosis for each carotid artery was recorded. Results: Both readers achieved very good specificity values and good and very good sensitivity values, respectively, without significant differences between the two reading methods. Furthermore, there was a very good correlation between both readers for both reading methods, without significant differences (kappa value: standard image interpretation k = 0.809; advanced vessel analysis software k = 0.863). Using the advanced vessel analysis software resulted in a significant time saving (p < 0.0001) for both readers. Time to diagnosis could be decreased by approximately 55%. Conclusions: The advanced vessel analysis application CT Vascular of the new imaging software syngo.via (Siemens Healthcare, Forchheim, Germany) provides a high rate of reproducibility in the assessment of carotid artery stenosis. Furthermore, a significant time saving in comparison to standard image interpretation is achievable

  3. Carotid artery stenosis: Performance of advanced vessel analysis software in evaluating CTA

    Energy Technology Data Exchange (ETDEWEB)

    Tsiflikas, Ilias, E-mail: ilias.tsiflikas@med.uni-tuebingen.de [University Hospital of Tuebingen, Diagnostic and Interventional Radiology, Hoppe-Seyler-Str. 3, 72076 Tuebingen (Germany); Biermann, Christina, E-mail: christina.biermann@siemens.com [University Hospital of Tuebingen, Diagnostic and Interventional Radiology, Hoppe-Seyler-Str. 3, 72076 Tuebingen (Germany); Siemens AG, Siemens Healthcare Consulting, Allee am Röthelheimpark 3A, 91052 Erlangen (Germany); Thomas, Christoph, E-mail: christoph.thomas@med.uni-tuebingen.de [University Hospital of Tuebingen, Diagnostic and Interventional Radiology, Hoppe-Seyler-Str. 3, 72076 Tuebingen (Germany); Ketelsen, Dominik, E-mail: dominik.ketelsen@med.uni-tuebingen.de [University Hospital of Tuebingen, Diagnostic and Interventional Radiology, Hoppe-Seyler-Str. 3, 72076 Tuebingen (Germany); Claussen, Claus D., E-mail: claus.claussen@med.uni-tuebingen.de [University Hospital of Tuebingen, Diagnostic and Interventional Radiology, Hoppe-Seyler-Str. 3, 72076 Tuebingen (Germany); Heuschmid, Martin, E-mail: martin.heuschmid@med.uni-tuebingen.de [University Hospital of Tuebingen, Diagnostic and Interventional Radiology, Hoppe-Seyler-Str. 3, 72076 Tuebingen (Germany)

    2012-09-15

    Objectives: The aim of this study was to evaluate the time efficiency and diagnostic reproducibility of advanced vessel analysis software for the diagnosis of carotid artery stenosis. Material and methods: 40 patients with suspected carotid artery stenosis received head and neck DE-CTA as part of their pre-interventional workup. Acquired data were evaluated by 2 independent radiologists. Stenosis grading was performed by MPR eyeballing with freely adjustable MPRs and with a preliminary prototype of the meanwhile available client-server and advanced visualization software syngo.via CT Vascular (Siemens Healthcare, Erlangen, Germany). Stenoses were graded according to the following 5 categories: I: 0%, II: 1–50%, III: 51–69%, IV: 70–99% and V: total occlusion. Furthermore, the time to diagnosis for each carotid artery was recorded. Results: Both readers achieved very good specificity values and good and very good sensitivity values, respectively, without significant differences between the two reading methods. Furthermore, there was a very good correlation between both readers for both reading methods, without significant differences (kappa value: standard image interpretation k = 0.809; advanced vessel analysis software k = 0.863). Using the advanced vessel analysis software resulted in a significant time saving (p < 0.0001) for both readers. Time to diagnosis could be decreased by approximately 55%. Conclusions: The advanced vessel analysis application CT Vascular of the new imaging software syngo.via (Siemens Healthcare, Forchheim, Germany) provides a high rate of reproducibility in the assessment of carotid artery stenosis. Furthermore, a significant time saving in comparison to standard image interpretation is achievable.

  4. CMS Software and Computing Ready for Run 2

    CERN Document Server

    Bloom, Kenneth

    2015-01-01

    In Run 1 of the Large Hadron Collider, software and computing was a strategic strength of the Compact Muon Solenoid experiment. The timely processing of data and simulation samples and the excellent performance of the reconstruction algorithms played an important role in the preparation of the full suite of searches used for the observation of the Higgs boson in 2012. In Run 2, the LHC will run at higher intensities and CMS will record data at a higher trigger rate. These new running conditions will provide new challenges for the software and computing systems. Over the two years of Long Shutdown 1, CMS has built upon the successes of Run 1 to improve the software and computing to meet these challenges. In this presentation we will describe the new features in software and computing that will once again put CMS in a position of physics leadership.

  5. SLIMarray: Lightweight software for microarray facility management

    Directory of Open Access Journals (Sweden)

    Marzolf Bruz

    2006-10-01

    Full Text Available Abstract Background Microarray core facilities are commonplace in biological research organizations, and need systems for accurately tracking various logistical aspects of their operation. Although these different needs could be handled separately, an integrated management system provides benefits in organization, automation and reduction in errors. Results We present SLIMarray (System for Lab Information Management of Microarrays), an open source, modular database web application capable of managing microarray inventories, sample processing and usage charges. The software allows modular configuration and is well suited for further development, providing users the flexibility to adapt it to their needs. SLIMarray Lite, a version of the software that is especially easy to install and run, is also available. Conclusion SLIMarray addresses the previously unmet need for free and open source software for managing the logistics of a microarray core facility.

  6. A Phenomenological Inquiry into the Perceptions of Software Professionals on the Asperger's Syndrome/High Functioning Autism Spectrum and the Success of Software Development Projects

    Science.gov (United States)

    Kendall, Leslie R.

    2013-01-01

    Individuals who have Asperger's Syndrome/High-Functioning Autism, as a group, are chronically underemployed and underutilized. Many in this group have abilities that are well suited for various roles within the practice of software development. Multiple studies have shown that certain organizational and management changes in the software…

  7. Web Viz 2.0: A versatile suite of tools for collaboration and visualization

    Science.gov (United States)

    Spencer, C.; Yuen, D. A.

    2012-12-01

    Most scientific applications on the web fail to realize the full collaborative potential of the internet by not utilizing web 2.0 technology. To relieve users from the struggle with software tools and allow them to focus on their research, new software developed for scientists and researchers must harness the full suite of web technology. For several years WebViz 1.0 enabled researchers with any web accessible device to interact with the peta-scale data generated by the Hierarchical Volume Renderer (HVR) system. We have developed a new iteration of WebViz that can be easily interfaced with many problem domains in addition to HVR by employing the best practices of software engineering and object-oriented programming. This is done by separating the core WebViz system from domain specific code at an interface, leveraging inheritance and polymorphism to allow newly developed modules access to the core services. We employed several design patterns (model-view-controller, singleton, observer, and application controller) to engineer this highly modular system implemented in Java.

  8. CMS Computing Software and Analysis Challenge 2006

    Energy Technology Data Exchange (ETDEWEB)

    De Filippis, N. [Dipartimento interateneo di Fisica M. Merlin and INFN Bari, Via Amendola 173, 70126 Bari (Italy)

    2007-10-15

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with the data handling model. With this purpose, the Computing, Software and Analysis Challenge 2006, namely CSA06, started on the 15th of September. It was a 50 million event exercise that included all the steps of the analysis chain, such as prompt reconstruction, data streaming, iterative execution of calibration and alignment, and data distribution to regional sites, up to the end-user analysis. Grid tools provided by the LCG project are also being exercised to gain access to the data and the resources, providing a user-friendly interface to the physicists submitting the production and the analysis jobs. An overview of the status and results of the CSA06 is presented in this work.

  9. CMS Computing Software and Analysis Challenge 2006

    International Nuclear Information System (INIS)

    De Filippis, N.

    2007-01-01

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with the data handling model. With this purpose, the Computing, Software and Analysis Challenge 2006, namely CSA06, started on the 15th of September. It was a 50 million event exercise that included all the steps of the analysis chain, such as prompt reconstruction, data streaming, iterative execution of calibration and alignment, and data distribution to regional sites, up to the end-user analysis. Grid tools provided by the LCG project are also being exercised to gain access to the data and the resources, providing a user-friendly interface to the physicists submitting the production and the analysis jobs. An overview of the status and results of the CSA06 is presented in this work

  10. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify the relief breakpoints (knickpoints) along drainage profiles. The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), an area of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
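
    The core detection step, locating relief breakpoints along a drainage profile extracted from the DEM, can be sketched as a search for abrupt slope changes (a generic illustration with a synthetic profile, not the Knickpoint Finder algorithm itself):

        import numpy as np

        # Toy longitudinal river profile: distance downstream (m) vs. elevation (m),
        # with an artificial slope break (knickpoint) at 5 km.
        distance = np.linspace(0, 10_000, 201)
        elevation = np.where(distance < 5_000,
                             500 - 0.01 * distance,
                             450 - 0.03 * (distance - 5_000))

        slope = np.gradient(elevation, distance)        # local channel slope
        d_slope = np.abs(np.gradient(slope, distance))  # rate of slope change
        candidates = distance[d_slope > 10 * d_slope.mean()]
        print("candidate knickpoint near %.0f m downstream" % candidates.mean())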

  11. Software architecture analysis of usability

    NARCIS (Netherlands)

    Folmer, Eelke

    2005-01-01

    One of the qualities that has received increased attention in recent decades is usability. A software product with poor usability is likely to fail in a highly competitive market; therefore software developing organizations are paying more and more attention to ensuring the usability of their

  12. Using Statistical Analysis Software to Advance Nitro Plasticizer Wettability

    Energy Technology Data Exchange (ETDEWEB)

    Shear, Trevor Allan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-29

    Statistical analysis in science is an extremely powerful tool that is often underutilized. Additionally, it is frequently the case that data is misinterpreted or not used to its fullest extent. Utilizing the advanced software JMP®, many aspects of experimental design and data analysis can be evaluated and improved. This overview will detail the features of JMP® and how they were used to advance a project, resulting in time and cost savings, as well as the collection of scientifically sound data. The project analyzed in this report addresses the inability of a nitro plasticizer to coat a gold coated quartz crystal sensor used in a quartz crystal microbalance. Through the use of the JMP® software, the wettability of the nitro plasticizer was increased by over 200% using an atmospheric plasma pen, ensuring good sample preparation and reliable results.

  13. IUE Data Analysis Software for Personal Computers

    Science.gov (United States)

    Thompson, R.; Caplinger, J.; Taylor, L.; Lawton , P.

    1996-01-01

    This report summarizes the work performed for the program titled, "IUE Data Analysis Software for Personal Computers" awarded under Astrophysics Data Program NRA 92-OSSA-15. The work performed was completed over a 2-year period starting in April 1994. As a result of the project, 450 IDL routines and eight database tables are now available for distribution for Power Macintosh computers and Personal Computers running Windows 3.1.

  14. Ground System Architectures Workshop GMSEC SERVICES SUITE (GSS): an Agile Development Story

    Science.gov (United States)

    Ly, Vuong

    2017-01-01

    The GMSEC (Goddard Mission Services Evolution Center) Services Suite (GSS) is a collection of tools and software services, along with a robust customizable web-based portal, that enables the user to capture, monitor, report, and analyze system-wide GMSEC data. Given our plug-and-play architecture and the need for rapid system development, we opted to follow the Scrum Agile Methodology for software development. Being one of the first few projects to implement the Agile methodology at NASA GSFC, in this presentation we will present our approaches, tools, successes, and challenges in implementing this methodology. The GMSEC architecture provides a scalable, extensible ground and flight system for existing and future missions. GMSEC comes with a robust Application Programming Interface (GMSEC API) and a core set of Java-based GMSEC components that facilitate the development of a GMSEC-based ground system. Over the past few years, we have seen an uptick in the number of customers who are moving from a native desktop application environment to a web-based environment, particularly for data monitoring and analysis. We also see a need to separate the business logic from the GUI display for our Java-based components and to consolidate all the GUI displays into one interface. This combination of separation and consolidation brings immediate value to a GMSEC-based ground system through increased ease of data access via a uniform interface, built-in security measures, centralized configuration management, and ease of feature extensibility.

  15. A concept of software testing for SMART MMIS software

    International Nuclear Information System (INIS)

    Seo, Yong Seok; Seong, Seung Hwan; Park, Keun Ok; Hur, Sub; Kim, Dong Hoon

    2001-01-01

    In order to achieve high quality of SMART MMIS software, a well-constructed software testing concept is required. This paper establishes a software testing concept to be applied to SMART MMIS software, in terms of software testing organization, documentation, procedure, and methods. The software testing methods are classified into source code static analysis and dynamic testing. The software dynamic testing methods are discussed from two aspects: white-box and black-box testing. As the software testing concept introduced in this paper is applied to the SMART MMIS software, high quality of the software will be produced. In the future, software failure data will be collected through the construction of the SMART MMIS prototyping facility to which the software testing concept of this paper is applied

  16. Comparative Performance Analysis of Machine Learning Techniques for Software Bug Detection

    OpenAIRE

    Saiqa Aleem; Luiz Fernando Capretz; Faheem Ahmed

    2015-01-01

    Machine learning techniques can be used to analyse data from different perspectives and enable developers to retrieve useful information. Machine learning techniques are proven to be useful in terms of software bug prediction. In this paper, a comparative performance analysis of different machine learning techniques is explored for software bug prediction on publicly available data sets. Results showed most of the mac ...
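
    A condensed sketch of such a comparison, here on a synthetic stand-in for a module-metrics defect data set (real studies typically use public repositories such as the PROMISE data sets):

        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score
        from sklearn.naive_bayes import GaussianNB
        from sklearn.tree import DecisionTreeClassifier

        # Synthetic, imbalanced stand-in for a defect data set (20% buggy modules).
        X, y = make_classification(n_samples=500, n_features=20,
                                   weights=[0.8], random_state=1)

        models = {"NaiveBayes": GaussianNB(),
                  "DecisionTree": DecisionTreeClassifier(max_depth=5),
                  "LogisticRegression": LogisticRegression(max_iter=1000),
                  "RandomForest": RandomForestClassifier(n_estimators=100)}
        for name, model in models.items():
            scores = cross_val_score(model, X, y, cv=5, scoring="f1")
            print("%s: mean F1 = %.3f" % (name, scores.mean()))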

  17. Ultraviolet Testing of Space Suit Materials for Mars

    Science.gov (United States)

    Larson, Kristine; Fries, Marc

    2017-01-01

    Human missions to Mars may require radical changes in the approach to extravehicular activity (EVA) suit design. A major challenge is the balance of building a suit robust enough to complete multiple EVAs under intense ultraviolet (UV) light exposure without losing mechanical strength or compromising the suit's mobility. To study how the materials degrade on Mars in-situ, the Jet Propulsion Laboratory (JPL) invited the Advanced Space Suit team at NASA's Johnson Space Center (JSC) to place space suit materials on the Scanning Habitable Environments with Raman & Luminescence for Organics and Chemicals (SHERLOC) instrument's calibration target of the Mars 2020 rover. In order to select materials for the rover and understand the effects of Mars-equivalent UV exposure, JSC conducted ground testing on both current and new space suit materials exposed to 2500 hours of Mars-mission-equivalent UV. To complete this testing, JSC partnered with NASA's Marshall Space Flight Center to utilize their UV vacuum chambers. Materials tested were Orthofabric, polycarbonate, Teflon, Dacron, Vectran, Spectra, bladder, nGimat-coated Teflon, and nGimat-coated Orthofabric. All samples were measured for mass, tensile strength, and chemical composition before and after irradiation. Mass loss was insignificant (less than 0.5%) among the materials. Most materials lost tensile strength after irradiation and became more brittle, with a loss of elongation. Changes in chemical composition were seen in all irradiated materials through spectral analysis. Results from this testing helped select the materials that will fly on the Mars 2020 rover. In addition, JSC can use these data to create a correlation between the chemical changes after irradiation, which is what the rover will send back while on Mars, and the mechanical changes, such as loss of tensile strength.

  18. Nuclear analysis software. Pt. 1: Spectrum transfer and reformatting (SPEDAC)

    International Nuclear Information System (INIS)

    1991-01-01

    GANAAS (Gamma, Activity, and Neutron Activation Analysis System) is one of a family of software packages developed under the auspices of the International Atomic Energy Agency. Primarily, the package was intended to support the IAEA Technical Assistance and Cooperation projects in developing countries. However, it is open domain software that can be copied and used by anybody, except for commercial purposes. All the nuclear analysis software provided by the IAEA has the same design philosophy and a similar structure. The intention was to provide the user with maximum flexibility and, at the same time, a simple and logical organization that requires minimum digging through the manuals. GANAAS is a modular system. It consists of several programmes that can be installed on the hard disk as they are needed. Obviously, some parts of the system are required in all cases. Those are installed at the beginning, without consulting the operator. GANAAS offers the opportunity to expand and improve the system. Gamma spectrum evaluation programmes using different fitting algorithms can be added to GANAAS, under the condition that the format of their input and output files corresponds to the rules of GANAAS. The same applies to the quantitative analysis parts of the programme

  19. Software system safety

    Science.gov (United States)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  20. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.

  1. The use of case tools in OPG safety analysis code qualification

    International Nuclear Information System (INIS)

    Pascoe, J.; Cheung, A.; Westbye, C.

    2001-01-01

    Ontario Power Generation (OPG) is currently qualifying its critical safety analysis software. The software quality assurance (SQA) framework is described. Given the legacy nature of much of the safety analysis software, a reverse engineering methodology has been adopted. The safety analysis suite of codes was developed over a period of many years to differing standards of quality and had sparse or incomplete documentation. Key elements of the reverse engineering process require recovery of design information from existing coding. This recovery, if performed manually, could represent an enormous effort. Driven by a need to maximize productivity and enhance the repeatability and objectivity of software qualification activities, the decision was made to acquire or develop and implement Computer Aided Software Engineering (CASE) tools. This paper presents relevant background information on CASE tools and discusses how the OPG SQA requirements were used to assess the suitability of available CASE tools. Key findings from the application of CASE tools to the qualification of the OPG safety analysis software are discussed. (author)

  2. Rapid Software Development for Experiment Control at OPAL

    International Nuclear Information System (INIS)

    Hathaway, P.V.; Lam, Tony; Franceschini, Ferdi; Hauser, Nick; Rayner, Hugh

    2005-01-01

    Full text: ANSTO is undertaking the parallel development of instrument control and graphical experiment interface software for seven neutron beam instruments at OPAL. Each instrument poses several challenges for a common system solution, including custom detector interfaces, a range of motion and beamline optics schema, and a spectrum of online data reduction requirements. To provide a superior system with the least development effort, the computing team have adopted proven, configurable, server-based control software (SICS)1., a highly Integrated Scientific Experimental Environment (GumTree)2. and industry-standard database management systems. The resulting graphical interfaces allow operation in a familiar experiment domain, with monitoring of data and parameters independent of control system specifics. GumTree presents the experimenter with a consistent interface for experiment management, instrument control and data reduction tasks. The facility instrument scientists can easily reconfigure instruments and add ancillaries. The user community can expect a reduced learning curve for performing each experiment. GumTree can be installed anywhere for pre-experiment familiarisation, postprocessing of acquired data sets, and integration with third party analysis tools. Instrument scientists are seeing faster software development iterations and have a solid basis to prepare for the next suite of instruments. 1. SICS from PSI (lns00.psi.ch). 2. GumTree (gumtree.sourceforge.net), new site: http://gumtree.sourceforge.net/wiki/index.php/Main_Page

  3. Key attributes of the SAPHIRE risk and reliability analysis software for risk-informed probabilistic applications

    International Nuclear Information System (INIS)

    Smith, Curtis; Knudsen, James; Kvarfordt, Kellie; Wood, Ted

    2008-01-01

    The Idaho National Laboratory is a primary developer of probabilistic risk and reliability analysis (PRRA) tools, dating back over 35 years. Evolving from mainframe-based software, the current state of the practice has led to the creation of the SAPHIRE software. Currently, agencies such as the Nuclear Regulatory Commission, the National Aeronautics and Space Administration, the Department of Energy, and the Department of Defense use version 7 of the SAPHIRE software for many of their risk-informed activities. In order to better understand and appreciate the power of software as part of risk-informed applications, we need to recall that our current analysis methods and solution methods build upon pioneering work done 30-40 years ago. We contrast this work with the current capabilities in the SAPHIRE analysis package. As part of this discussion, we provide information on both the typical features and the special analysis capabilities that are available. We also present the application and results typically found with state-of-the-practice PRRA models. By providing both a high-level and a detailed look at the SAPHIRE software, we give a snapshot in time of the current use of software tools in a risk-informed decision arena
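
    One staple computation in PRA tools of this kind is the minimal cut set upper bound, which approximates the top event probability from the individual cut set probabilities; a generic sketch of the formula (not SAPHIRE code):

        def min_cut_set_upper_bound(cut_set_probs):
            """P(top) <= 1 - prod(1 - P(cut set i)); exact for independent cut sets."""
            bound = 1.0
            for p in cut_set_probs:
                bound *= (1.0 - p)
            return 1.0 - bound

        # Three minimal cut sets with point estimates:
        print(min_cut_set_upper_bound([2.4e-3, 1.1e-4, 5.0e-6]))  # ~2.5e-3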

  4. Software Schedules Missions, Aids Project Management

    Science.gov (United States)

    2008-01-01

    NASA missions require advanced planning, scheduling, and management, and the Space Agency has worked extensively to develop the programs and software suites necessary to facilitate these complex missions. These enormously intricate undertakings have hundreds of active components that need constant management and monitoring. It is no surprise, then, that the software developed for these tasks is often applicable in other high-stress, complex environments, like in government or industrial settings. NASA work over the past few years has resulted in a handful of new scheduling, knowledge-management, and research tools developed under contract with one of NASA s partners. These tools have the unique responsibility of supporting NASA missions, but they are also finding uses outside of the Space Program.

  5. Fabrication and performance analysis of a DEA cuff designed for dry-suit applications

    International Nuclear Information System (INIS)

    Ahmadi, S; Camacho Mattos, A; Barbazza, A; Soleimani, M; Boscariol, P; Menon, C

    2013-01-01

    A method for manufacturing a cylindrical dielectric elastomer actuator (DEA) is presented. The cylindrical DEA can be used in fabricating the cuff area of dry-suits where the garment is very tight and wearing the suit is difficult. When electrically actuated, the DEA expands radially and the suit can be worn more comfortably. In order to study the performance of the DEA, a customized testing setup was designed, and silicone-made cuff samples with different material stiffnesses were tested. Analytical and FEM modeling were considered to evaluate the experimental output. The results revealed that although the stiffness of the DEA material has a direct relationship with the radial constrictive pressure caused by mechanically stretching the DEA, it has a minor effect on the actuation pressure. It was also found that stacking multiple layers of the DEA to fabricate a laminated structure enabled the attainment of a desired variation of pressure required for the implementation of an electrically tunable cuff. (paper)
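
    For orientation, the equivalent electrostatic (Maxwell) pressure commonly used in DEA analysis, p = eps0 * eps_r * (V/t)^2, gives a feel for the actuation magnitudes involved (a standard textbook relation with invented numbers, not taken from the paper):

        # Equivalent electrostatic (Maxwell) pressure on a dielectric elastomer film.
        EPS0 = 8.854e-12  # vacuum permittivity, F/m

        def maxwell_pressure(voltage_V, thickness_m, eps_r):
            """p = eps0 * eps_r * (V / t)^2, in Pa."""
            return EPS0 * eps_r * (voltage_V / thickness_m) ** 2

        # Illustrative numbers: 3 kV across a 50 um silicone film, eps_r ~ 2.8.
        print(maxwell_pressure(3e3, 50e-6, 2.8))  # ~89 kPa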

  6. Software for Probabilistic Risk Reduction

    Science.gov (United States)

    Hensley, Scott; Michel, Thierry; Madsen, Soren; Chapin, Elaine; Rodriguez, Ernesto

    2004-01-01

    A computer program implements a methodology, denoted probabilistic risk reduction, that is intended to aid in planning the development of complex software and/or hardware systems. This methodology integrates two complementary prior methodologies: (1) that of probabilistic risk assessment and (2) a risk-based planning methodology, implemented in a prior computer program known as Defect Detection and Prevention (DDP), in which multiple requirements and the beneficial effects of risk-mitigation actions are taken into account. The present methodology and the software are able to accommodate both process knowledge (notably of the efficacy of development practices) and product knowledge (notably of the logical structure of a system, the development of which one seeks to plan). Estimates of the costs and benefits of a planned development can be derived. Functional and non-functional aspects of software can be taken into account, and trades made among them. It becomes possible to optimize the planning process in the sense that it becomes possible to select the best suite of process steps and design choices to maximize the expectation of success while remaining within budget.
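
    The cost-benefit optimization described can be sketched as follows: each mitigation action removes a fraction of some risks at a cost, and one searches for the subset with the greatest risk reduction within budget (all names and numbers below are invented):

        from itertools import combinations

        risks = {"R1": 0.30, "R2": 0.20, "R3": 0.15}  # expected loss per risk
        # Each action: (cost, {risk: fraction of that risk removed}).
        actions = {"review":   (2.0, {"R1": 0.5, "R2": 0.2}),
                   "testing":  (3.0, {"R2": 0.7, "R3": 0.4}),
                   "redesign": (4.0, {"R1": 0.8})}
        budget = 5.0

        def benefit(subset):
            total = 0.0
            for r, loss in risks.items():
                keep = 1.0
                for a in subset:  # reductions combine multiplicatively
                    keep *= 1.0 - actions[a][1].get(r, 0.0)
                total += loss * (1.0 - keep)
            return total

        best = max((s for k in range(len(actions) + 1)
                    for s in combinations(actions, k)
                    if sum(actions[a][0] for a in s) <= budget),
                   key=benefit)
        print(best, benefit(best))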

  7. Critical Factors Analysis for Offshore Software Development Success by Structural Equation Modeling

    Science.gov (United States)

    Wada, Yoshihisa; Tsuji, Hiroshi

    In order to analyze the success/failure factors in offshore software development service by the structural equation modeling, this paper proposes to follow two approaches together; domain knowledge based heuristic analysis and factor analysis based rational analysis. The former works for generating and verifying of hypothesis to find factors and causalities. The latter works for verifying factors introduced by theory to build the model without heuristics. Following the proposed combined approaches for the responses from skilled project managers of the questionnaire, this paper found that the vendor property has high causality for the success compared to software property and project property.

  8. Surveillance Analysis Computer System (SACS): Software requirements specification (SRS). Revision 2

    International Nuclear Information System (INIS)

    Glasscock, J.A.

    1995-01-01

    This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) database, an Impact Level 3Q system. SACS stores information on tank temperatures, surface levels, and interstitial liquid levels. This information is retrieved by the customer through a PC-based interface and is then available to a number of other software tools. The software requirements specification (SRS) describes the system requirements for the SACS Project, and follows the Standard Engineering Practices (WHC-CM-6-1), Software Practices (WHC-CM-3-10) and Quality Assurance (WHC-CM-4-2, QR 19.0) policies

  9. Development of Software for Automatic Analysis of Intervention in the Field of Homeopathy.

    Science.gov (United States)

    Jain, Rajesh Kumar; Goyal, Shagun; Bhat, Sushma N; Rao, Srinath; Sakthidharan, Vivek; Kumar, Prasanna; Sajan, Kannanaikal Rappayi; Jindal, Sameer Kumar; Jindal, Ghanshyam D

    2018-05-01

    To study the effect of homeopathic medicines (in higher potencies) in normal subjects, a Peripheral Pulse Analyzer (PPA) was used to record physiologic variability parameters before and after administration of the medicine/placebo in 210 normal subjects. Data were acquired in seven rounds; placebo was administered in rounds 1 and 2 and medicine in potencies 6, 30, 200, 1 M, and 10 M was administered in rounds 3 to 7, respectively. Five different medicines in the said potencies were given to groups of around 40 subjects each. Although processing of the data required human intervention, a software application has been developed to analyze the processed data and detect the responses, eliminating undue delay as well as human bias in subjective analysis. This utility, named Automatic Analysis of Intervention in the Field of Homeopathy, is run on the processed PPA data, and its outcome has been compared with the manual analysis. The application software uses an adaptive threshold based on statistics for detecting responses, in contrast to the fixed threshold used in manual analysis. The automatic analysis detected 12.96% more responses than the subjective analysis. The additional responses have been manually verified to be true positives, which indicates the robustness of the application software. The automatic analysis software was also run on another set of pulse harmonic parameters derived from the same data set to study cardiovascular susceptibility, and 385 responses were detected, in contrast to 272 for the variability parameters. It was observed that 65% of the subjects eliciting a response were common to both sets. This not only validates the software utility for giving a consistent yield but also reveals the certainty of the response. This development may lead to electronic proving of homeopathic medicines (e-proving).
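
    The statistics-based adaptive threshold contrasted here with the fixed manual one can be illustrated schematically, e.g. flagging a post-intervention shift larger than a multiple of the subject's own baseline variability (a sketch, not the published algorithm):

        import numpy as np

        def detect_response(baseline, post, k=2.0):
            """Flag a response when the post-intervention mean shifts by more than
            k standard deviations of the subject's own baseline variability."""
            threshold = k * np.std(baseline, ddof=1)  # adaptive, per subject
            return abs(np.mean(post) - np.mean(baseline)) > threshold

        rng = np.random.default_rng(7)
        baseline = rng.normal(60, 2, size=120)  # e.g., a pulse variability parameter
        post = rng.normal(65, 2, size=120)      # shifted after the intervention
        print(detect_response(baseline, post))  # True: shift of ~5 > 2 * sd of ~2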

  10. Meta-Analyst: software for meta-analysis of binary, continuous and diagnostic data

    Directory of Open Access Journals (Sweden)

    Schmid Christopher H

    2009-12-01

    Full Text Available Abstract Background Meta-analysis is increasingly used as a key source of evidence synthesis to inform clinical practice. The theory and statistical foundations of meta-analysis continually evolve, providing solutions to many new and challenging problems. In practice, most meta-analyses are performed in general statistical packages or dedicated meta-analysis programs. Results Herein, we introduce Meta-Analyst, a novel, powerful, intuitive, and free meta-analysis program for the meta-analysis of a variety of problems. Meta-Analyst is implemented in C# atop the Microsoft .NET Framework, and features a graphical user interface. The software performs several meta-analysis and meta-regression models for binary and continuous outcomes, as well as analyses for diagnostic and prognostic test studies in the frequentist and Bayesian frameworks. Moreover, Meta-Analyst includes a flexible tool to edit and customize generated meta-analysis graphs (e.g., forest plots) and provides output in many formats (images, Adobe PDF, Microsoft Word-ready RTF). The software architecture employed allows for rapid changes to be made to either the Graphical User Interface (GUI) or to the analytic modules. We verified the numerical precision of Meta-Analyst by comparing its output with that from standard meta-analysis routines in Stata over a large database of 11,803 meta-analyses of binary outcome data and 6,881 meta-analyses of continuous outcome data from the Cochrane Library of Systematic Reviews. Results from analyses of diagnostic and prognostic test studies have been verified in a limited number of meta-analyses against MetaDisc and MetaTest. Bayesian statistical analyses use the OpenBUGS calculation engine (and are thus as accurate as the standalone OpenBUGS software). Conclusion We have developed and validated a new program for conducting meta-analyses that combines the advantages of existing software for this task.
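
    The core of any such tool is the pooling step; for instance, a fixed-effect inverse-variance combination of study effects (a textbook formula, not Meta-Analyst source code):

        import math

        def fixed_effect(effects, variances):
            """Inverse-variance weighted pooled effect and its 95% CI."""
            weights = [1.0 / v for v in variances]
            pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
            se = math.sqrt(1.0 / sum(weights))
            return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

        # Three studies reporting log odds ratios with their variances:
        print(fixed_effect([0.42, 0.31, 0.55], [0.04, 0.09, 0.12]))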

  11. Planetary Suit Hip Bearing Model for Predicting Design vs. Performance

    Science.gov (United States)

    Cowley, Matthew S.; Margerum, Sarah; Harvil, Lauren; Rajulu, Sudhakar

    2011-01-01

    The suited performance trends were comparable between the model and the suited subjects. With the three off-nominal bearing configurations compared to the nominal bearing configuration, human subjects showed decreases in hip flexion of 64%, 6%, and 13%, and in hip abduction of 59%, 2%, and 20%. Likewise, the solid model showed decreases in hip flexion of 58%, 1%, and 25%, and in hip abduction of 56%, 0%, and 30%, under the same condition changes from the nominal configuration. Differences seen between the model predictions and the human subject performance data could be attributed to the model lacking dynamic elements and performing kinematic analysis only, the level of fit of the subjects within the suit, and the subjects' level of suit experience.

  12. Effectiveness of an Automatic Tracking Software in Underwater Motion Analysis

    Directory of Open Access Journals (Sweden)

    Fabrício A. Magalhaes

    2013-12-01

    Full Text Available Tracking of markers placed on anatomical landmarks is a common practice in sports science to perform the kinematic analysis that interests both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of software developed for automatic tracking of underwater movements (DVP), based on the Kanade-Lucas-Tomasi feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 marker positions) were manually tracked to determine the markers' center coordinates. Then, the videos were automatically tracked using DVP and a commercially available software (COM). Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and to correct the position of the cursor whenever the distance between the calculated marker coordinate and the reference one was higher than 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, manual interventions were 10.4% lower for DVP (7.4%) than for COM (17.8%). Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing the type of marker rather than the type of exercise, with 9.9% fewer manual interventions for DVP than for COM. In conclusion, based on these results, the automatic tracking software presented here can be used as a valid and useful tool for underwater motion analysis.
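
    DVP itself is not distributed with the article, but the pyramidal Kanade-Lucas-Tomasi step it is reported to build on can be sketched with OpenCV; the video file name and tracking parameters below are illustrative only.

        import cv2

        cap = cv2.VideoCapture("underwater_trial.avi")      # hypothetical recording
        ok, frame = cap.read()
        prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        # Seed points; in DVP these would be the operator-selected marker centers
        pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=20,
                                      qualityLevel=0.3, minDistance=7)

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            new_pts, status, err = cv2.calcOpticalFlowPyrLK(
                prev_gray, gray, pts, None, winSize=(15, 15), maxLevel=2)
            pts = new_pts[status.ravel() == 1].reshape(-1, 1, 2)  # keep tracked markers
            prev_gray = gray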

  13. BAM/DASS: Data Analysis Software for Sub-Microarcsecond Astrometry Device

    Science.gov (United States)

    Gardiol, D.; Bonino, D.; Lattanzi, M. G.; Riva, A.; Russo, F.

    2010-12-01

    The INAF - Osservatorio Astronomico di Torino is part of the Data Processing and Analysis Consortium (DPAC) for Gaia, a cornerstone mission of the European Space Agency. Gaia will perform global astrometry by means of two telescopes looking at the sky along two different lines of sight oriented at a fixed angle, called the basic angle. Knowledge of the basic angle fluctuations at the sub-microarcsecond level over periods on the order of a minute is crucial to reach the mission goals. A specific device, the Basic Angle Monitoring (BAM), will be dedicated to this purpose. We present here the software system we are developing to analyze the BAM data and recover the basic angle variations. This tool is integrated into the whole DPAC data analysis software.

  14. Analyzing the State of Static Analysis: A Large-Scale Evaluation in Open Source Software

    NARCIS (Netherlands)

    Beller, M.; Bholanath, R.; McIntosh, S.; Zaidman, A.E.

    2016-01-01

    The use of automatic static analysis has been a software engineering best practice for decades. However, we still do not know a lot about its use in real-world software projects: How prevalent is the use of Automated Static Analysis Tools (ASATs) such as FindBugs and JSHint? How do developers use

  15. A software architectural framework specification for neutron activation analysis

    International Nuclear Information System (INIS)

    Preston, J.A.; Grant, C.N.

    2013-01-01

    Neutron Activation Analysis (NAA) is a sensitive multi-element nuclear analytical technique that has been routinely applied by research reactor (RR) facilities to environmental, nutritional, health-related, geological and geochemical studies. As RR facilities face calls to increase their research output and impact with existing or reduced budgets, automation of NAA offers a possible solution. However, automation has many challenges, not the least of which is the lack of system architecture standards establishing acceptable mechanisms for the various hardware/software and software/software interactions among data acquisition systems, specialised hardware such as sample changers and sample loaders, and data processing modules. This lack of standardization often results in automation hardware and software being incompatible with the existing system components of a facility looking to automate its NAA operations, which limits the availability of automation to the few RR facilities with adequate budgets or in-house engineering resources. What is needed is a modern open system architecture for NAA that provides the required set of functionalities. This paper describes such an 'architectural framework' (OpenNAA), and portions of a reference implementation. As an example of the benefits, calculations indicate that applying this architecture to the compilation and QA steps associated with the analysis of 35 elements in 140 samples, with 14 SRMs, can reduce the time required by over 80%. The adoption of open standards in the nuclear industry has been very successful over the years in promoting interchangeability and maximising the lifetime and output of nuclear measurement systems. OpenNAA will provide similar benefits within the NAA application space, safeguarding user investments in their current systems while providing a solid path for future development. (author)

  16. Software applications for flux balance analysis.

    Science.gov (United States)

    Lakshmanan, Meiyappan; Koh, Geoffrey; Chung, Bevan K S; Lee, Dong-Yup

    2014-01-01

    Flux balance analysis (FBA) is a widely used computational method for characterizing and engineering intrinsic cellular metabolism. The increasing number of its successful applications and growing popularity are possibly attributable to the availability of specific software tools for FBA. Each tool has its unique features and limitations with respect to operational environment, user interface and supported analysis algorithms. Presented herein is an in-depth evaluation of currently available FBA applications, focusing mainly on usability, functionality, graphical representation and inter-operability. Overall, most of the applications are able to perform the basic tasks of model creation and FBA simulation. The COBRA Toolbox, OptFlux and FASIMU are versatile, supporting advanced in silico algorithms that identify environmental and genetic targets for strain design. SurreyFBA, WEbcoli, Acorn, FAME, GEMSiRV and MetaFluxNet are distinct in providing user-friendly interfaces for model handling. In terms of software architecture, FBA-SimVis and OptFlux offer flexible environments, as they enable plug-ins/add-ons to aid prospective functional extensions. Notably, an increasing trend towards the implementation of more tailored e-services, such as central model repositories and assistance to collaborative efforts, was observed among the web-based applications with the help of advanced web technologies. Furthermore, most recent applications such as the Model SEED, FAME, MetaFlux and MicrobesFlux have even included several routines to facilitate the reconstruction of genome-scale metabolic models. Finally, future directions of FBA applications are briefly discussed for the benefit of potential tool developers.
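
    At its core, every tool reviewed here solves the same linear program: maximize an objective flux c·v subject to the steady-state constraint Sv = 0 and flux bounds. A minimal sketch with SciPy follows (toy network and bounds invented; no reviewed package is implied to work this way internally).

        import numpy as np
        from scipy.optimize import linprog

        # Toy network: R1: uptake -> A, R2: A -> B, R3: B -> biomass
        S = np.array([[1, -1,  0],    # metabolite A balance
                      [0,  1, -1]])   # metabolite B balance
        bounds = [(0, 10), (0, 1000), (0, 1000)]   # per-reaction flux bounds
        c = np.array([0, 0, -1])      # maximize v3 => minimize -v3

        res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
        print("optimal biomass flux:", -res.fun, "flux distribution:", res.x)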

  17. Adobe Creative Suite 4 Bible

    CERN Document Server

    Padova, Ted

    2009-01-01

    As one of the few books to cover integration and workflow issues between Photoshop, Illustrator, InDesign, GoLive, Acrobat, and Version Cue, this comprehensive reference is the one book that Creative Suite users need; Two well-known and respected authors cover topics such as developing consistent color-managed workflows, moving files among the Creative Suite applications, preparing files for print or the Web, repurposing documents, and using the Creative Suite with Microsoft Office documents; More than 1,200 pages are packed with valuable advice and techniques for tackling common everyday issues.

  18. UPVapor: Cofrentes nuclear power plant production results analysis software

    International Nuclear Information System (INIS)

    Curiel, M.; Palomo, M. J.; Baraza, A.; Vaquer, J.

    2010-10-01

    UPVapor software version 02 has been developed for the Data Analysis Department of the Cofrentes nuclear power plant (Spain). It is a graphical analysis environment in which users have access to all the plant variables registered in the process computer system (SIEC). To this end, UPVapor offers many advanced graphic tools that simplify the work, as well as a friendly environment that is easy to use and highly configurable. Plant variables are classified in the same way as in the SIEC computer, and their values are taken from it through the Iberdrola network. UPVapor can generate two different types of graphics: evolution graphs and X-Y graphs. The former analyse the evolution of up to twenty plant variables over a user-defined time period, based on historic plant files. Many tools are available: cursors, graphic configuration, moving averages, visualization of invalid data, and more. Moreover, a particular analysis configuration can be saved as a preselection, making it possible to load a preselection directly and quickly monitor a group of preselected plant variables. In X-Y graphs, it is possible to plot one variable against another over a defined time. Optionally, users can filter the data to a certain range of a variable, with up to five programmable filters. Like the other graph type, X-Y graphs have many configuration, saving and printing options. With UPVapor, data analysts can save valuable time in their daily work and, since it is easy to use, other users can perform their own analyses without asking the analysts to develop them. Besides, it can be used from any work centre with access to the network. (Author)

  19. UPVapor: Cofrentes nuclear power plant production results analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Curiel, M. [Logistica y Acondicionamientos Industriales SAU, Sorolla Center, local 10, Av. de las Cortes Valencianas No. 58, 46015 Valencia (Spain); Palomo, M. J. [ISIRYM, Universidad Politecnica de Valencia, Camino de Vera s/n, Valencia (Spain); Baraza, A. [Iberdrola Generacion S. A., Central Nuclear Cofrentes, Carretera Almansa Requena s/n, 04662 Cofrentes, Valencia (Spain); Vaquer, J., E-mail: m.curiel@lainsa.co [TITANIA Servicios Tecnologicos SL, Sorolla Center, local 10, Av. de las Cortes Valencianas No. 58, 46015 Valencia (Spain)

    2010-10-15

    UPVapor software version 02 has been developed for the Data Analysis Department of the Cofrentes nuclear power plant (Spain). It is a graphical analysis environment in which users have access to all the plant variables registered in the process computer system (SIEC). To this end, UPVapor offers many advanced graphic tools that simplify the work, as well as a friendly environment that is easy to use and highly configurable. Plant variables are classified in the same way as in the SIEC computer, and their values are taken from it through the Iberdrola network. UPVapor can generate two different types of graphics: evolution graphs and X-Y graphs. The former analyse the evolution of up to twenty plant variables over a user-defined time period, based on historic plant files. Many tools are available: cursors, graphic configuration, moving averages, visualization of invalid data, and more. Moreover, a particular analysis configuration can be saved as a preselection, making it possible to load a preselection directly and quickly monitor a group of preselected plant variables. In X-Y graphs, it is possible to plot one variable against another over a defined time. Optionally, users can filter the data to a certain range of a variable, with up to five programmable filters. Like the other graph type, X-Y graphs have many configuration, saving and printing options. With UPVapor, data analysts can save valuable time in their daily work and, since it is easy to use, other users can perform their own analyses without asking the analysts to develop them. Besides, it can be used from any work centre with access to the network. (Author)
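
    The two graph types can be approximated in a general-purpose language; a rough pandas/matplotlib sketch follows (the CSV export and column names are hypothetical, since SIEC data access is site-specific and UPVapor's internals are not published).

        import matplotlib.pyplot as plt
        import pandas as pd

        # Hypothetical export of SIEC plant variables sampled over time
        df = pd.read_csv("siec_export.csv", parse_dates=["timestamp"],
                         index_col="timestamp")

        # Evolution graph of one variable, with a moving-average overlay
        ax = df["steam_flow"].plot(label="raw")
        df["steam_flow"].rolling("30min").mean().plot(ax=ax, label="30 min moving average")
        ax.legend()

        # X-Y graph of one variable against another, filtered to a variable's range
        mask = df["reactor_power"].between(90, 100)   # one of up to five filters
        df[mask].plot.scatter(x="reactor_power", y="steam_flow")
        plt.show()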

  20. A controlled experiment on the impact of software structure on maintainability

    Science.gov (United States)

    Rombach, Dieter H.

    1987-01-01

    The impact of software structure on maintainability aspects including comprehensibility, locality, modifiability, and reusability in a distributed system environment is studied in a controlled maintenance experiment involving six medium-size distributed software systems implemented in LADY (language for distributed systems) and six in an extended version of sequential PASCAL. For all maintenance aspects except reusability, the results were quantitatively given in terms of complexity metrics which could be automated. The results showed LADY to be better suited to the development of maintainable software than the extension of sequential PASCAL. The strong typing combined with high parametrization of units is suggested to improve the reusability of units in LADY.

  1. Command and Data Handling Flight Software test framework: A Radiation Belt Storm Probes practice

    Science.gov (United States)

    Hill, T. A.; Reid, W. M.; Wortman, K. A.

    During the Radiation Belt Storm Probes (RBSP) mission, a test framework was developed by the Embedded Applications Group in the Space Department at the Johns Hopkins Applied Physics Laboratory (APL). The test framework is implemented for verification of the Command and Data Handling (C&DH) Flight Software. The RBSP C&DH Flight Software consists of applications developed for use with Goddard Space Flight Center's core Flight Executive (cFE) architecture. The test framework's initial concept originated with tests developed for verification of the Autonomy rules that execute with the Autonomy Engine application of the RBSP C&DH Flight Software. The test framework was adopted and expanded for system and requirements verification of the RBSP C&DH Flight Software. During the evolution of the RBSP C&DH Flight Software test framework design, a set of script conventions and a script library were developed. The script conventions and library eased integration of system and requirements verification tests into a comprehensive automated test suite. The comprehensive test suite is currently being used to verify releases of the RBSP C&DH Flight Software. In addition to providing the details and benefits of the test framework, the discussion will include several lessons learned throughout the verification process of the RBSP C&DH Flight Software. Our next mission, Solar Probe Plus (SPP), will use the cFE architecture for the C&DH Flight Software. SPP also plans to use the same ground system as RBSP. Many of the RBSP C&DH Flight Software applications are reusable on the SPP mission, therefore there is potential for test design and test framework reuse for system and requirements verification.

  2. OVERVIEW OF THE SAPHIRE PROBABILISTIC RISK ANALYSIS SOFTWARE

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L.; Wood, Ted; Knudsen, James; Ma, Zhegang

    2016-10-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer (PC) running the Microsoft Windows operating system. SAPHIRE Version 8 is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). INL's primary role in this project is that of software developer and tester. However, INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users, who constitute a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. In this paper, we provide an overview of the current technical capabilities found in SAPHIRE Version 8, including the user interface and enhanced solving algorithms.
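
    To make the kind of quantification concrete, here is a minimal sketch of cut-set evaluation as used in PRA generally (event names and probabilities invented; this is not SAPHIRE's interface or solving algorithm):

        # Basic event failure probabilities and minimal cut sets of a top event
        p = {"PUMP_A": 1e-3, "PUMP_B": 1e-3, "VALVE_C": 5e-4, "DG_FAIL": 2e-2}
        cut_sets = [{"PUMP_A", "PUMP_B"}, {"VALVE_C"}, {"PUMP_A", "DG_FAIL"}]

        def cut_set_prob(cs):
            prob = 1.0
            for event in cs:            # independent basic events
                prob *= p[event]
            return prob

        # Rare-event approximation: sum of cut-set probabilities
        rare = sum(cut_set_prob(cs) for cs in cut_sets)

        # Min-cut upper bound: 1 - prod(1 - P(cs)), slightly tighter
        mcub = 1.0
        for cs in cut_sets:
            mcub *= 1.0 - cut_set_prob(cs)
        mcub = 1.0 - mcub

        print(f"rare-event: {rare:.3e}, min-cut upper bound: {mcub:.3e}")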

  3. Software for analysis of waveforms acquired by digital Doppler broadening spectrometer

    International Nuclear Information System (INIS)

    Vlcek, M; Čížek, J; Procházka, I

    2013-01-01

    A high-resolution digital spectrometer for coincidence measurement of Doppler broadening of positron annihilation radiation was recently developed and tested. In this spectrometer, pulses from high-purity Ge (HPGe) detectors are sampled in real time by fast digitizers and subsequently analyzed off-line in software. We present a description of the software routines used for pulse shape analysis in two spectrometer configurations: (i) a semi-digital setup, in which detector pulses shaped in spectroscopic amplifiers (SAs) are digitized; (ii) a fully digital setup, in which pulses from the detector pre-amplifiers are digitized directly. The software developed in this work will be freely available in the form of source code and pre-compiled binaries.
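
    As a simplified illustration of what such off-line waveform analysis involves (the authors' actual routines are more elaborate, with shaping and pile-up handling), a basic pulse-height estimate from one digitized trace:

        import numpy as np

        def pulse_height(samples, n_baseline=200):
            """Amplitude of a digitized detector pulse above its baseline.

            samples    -- 1-D array of digitizer samples
            n_baseline -- number of pre-trigger samples used for the baseline
            """
            baseline = samples[:n_baseline].mean()
            return samples.max() - baseline

        # Synthetic preamplifier-like pulse: flat baseline, step rise, slow decay
        t = np.arange(2000)
        wave = 100.0 + 50.0 * (t > 500) * np.exp(-(t - 500) / 800.0)
        print(pulse_height(wave))   # ~50 ADC channels above baseline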

  4. ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra

    Science.gov (United States)

    Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to the traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semi-quantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, many processing packages are available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing a single spectrum or a few spectra at a time, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing a large number of spectra, it provides results in text and CSV formats, allowing further data analysis with spreadsheet programs or general analysis programs such as Matlab. The software is written in Java, and should therefore run on any platform providing Java Runtime Environment version 1.6 or newer; currently, however, it has only been tested with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request.
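
    To illustrate the batch-integration idea (this is not ImatraNMR's own format or code), the sketch below integrates named ppm windows across several exported two-column spectra and writes the results to CSV:

        import csv
        import numpy as np

        regions = {"CH3": (0.8, 1.1), "anomeric": (4.9, 5.3)}   # hypothetical windows

        def integrate(ppm, intensity, lo, hi):
            mask = (ppm >= lo) & (ppm <= hi)
            return abs(np.trapz(intensity[mask], ppm[mask]))

        with open("integrals.csv", "w", newline="") as out:
            writer = csv.writer(out)
            writer.writerow(["spectrum"] + list(regions))
            for name in ["sample_01.txt", "sample_02.txt"]:     # hypothetical exports
                ppm, intensity = np.loadtxt(name, unpack=True)  # two-column ASCII
                writer.writerow([name] + [integrate(ppm, intensity, lo, hi)
                                          for lo, hi in regions.values()])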

  5. Models for composing software : an analysis of software composition and objects

    NARCIS (Netherlands)

    Bergmans, Lodewijk

    1999-01-01

    In this report, we investigate component-based software construction with a focus on composition. In particular we try to analyze the requirements and issues for components and software composition. As a means to understand this research area, we introduce a canonical model for representing

  6. Improved airline-type supplied-air plastic suit

    International Nuclear Information System (INIS)

    Jolley, L. Jr.; Zippler, D.B.; Cofer, C.H.; Harper, J.A.

    1978-06-01

    Two-piece supplied-air plastic suits are used extensively at the Savannah River Plant for personnel protection against inhalation of airborne plutonium and tritium. Worker comfort and noise level problems gave impetus to the development of an improved suit and air distribution system. The resulting plastic suit and development work are discussed. The plastic suit unit cost is less than $20, the hearing zone noise level is less than 75 dBA, protection factors exceed 10,000, and user comfort is improved. This suit is expected to meet performance requirements for unrestricted use.

  7. Development of tools for safety analysis of control software in advanced reactors

    International Nuclear Information System (INIS)

    Guarro, S.; Yau, M.; Motamed, M.

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described.

  8. Development of tools for safety analysis of control software in advanced reactors

    Energy Technology Data Exchange (ETDEWEB)

    Guarro, S.; Yau, M.; Motamed, M. [Advanced Systems Concepts Associates, El Segundo, CA (United States)

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described.

  9. Fostering successful scientific software communities

    Science.gov (United States)

    Bangerth, W.; Heister, T.; Hwang, L.; Kellogg, L. H.

    2016-12-01

    Developing sustainable open source software packages for the sciences appears at first to be primarily a technical challenge: How can one create stable and robust algorithms, appropriate software designs, sufficient documentation, quality assurance strategies such as continuous integration and test suites, or backward compatibility approaches that yield high-quality software usable not only by the authors, but also the broader community of scientists? However, our experience from almost two decades of leading the development of the deal.II software library (http://www.dealii.org, a widely-used finite element package) and the ASPECT code (http://aspect.dealii.org, used to simulate convection in the Earth's mantle) has taught us that technical aspects are not the most difficult ones in scientific open source software. Rather, it is the social challenge of building and maintaining a community of users and developers interested in answering questions on user forums, contributing code, and jointly finding solutions to common technical and non-technical challenges. These problems are posed in an environment where project leaders typically have no resources to reward the majority of contributors, where very few people are specifically paid for the work they do on the project, and where contributor turnover is frequent as project members rotate into and out of jobs. In particular, much software work is done by graduate students who may become fluent enough in the software only a year or two before they leave academia. We will discuss strategies we have found do and do not work in maintaining and growing communities around the scientific software projects we lead. Specifically, we will discuss the management style necessary to keep contributors engaged, ways to give credit where credit is due, and structuring documentation to decrease reliance on forums and thereby allow user communities to grow without straining those who answer questions.

  10. WinDAM C earthen embankment internal erosion analysis software

    Science.gov (United States)

    Two primary causes of dam failure are overtopping and internal erosion. For the purpose of evaluating dam safety for existing earthen embankment dams and proposed earthen embankment dams, Windows Dam Analysis Modules C (WinDAM C) software will simulate either internal erosion or erosion resulting from overtopping.

  11. Software analysis by simulation for nuclear plant availability and safety goals

    International Nuclear Information System (INIS)

    Lapassat, A.M.; Segalard, J.; Salichon, M.; Le Meur, M.; Boulc'h, J.

    1988-01-01

    The use of microprocessors for the monitoring, protection and safety of nuclear reactors became a reality in the eighties. The authorities responsible for reactor safety systems have had to consider how to ensure the correct functioning of reactor control systems. The problems raised by the analysis of software first led us to develop an entirely software-based tool for the verification and validation of programs and specifications. In this paper, the CEA (French Atomic Energy Commission), responsible for reliable distributed techniques in nuclear plants, discusses the software test and simulation tools used to analyse real-time software. The O.S.T. tool is part of a larger programme of support for the design and evaluation of fault-tolerant systems, of which the European ESPRIT project SMART No. 1609 (System Measurement and Architecture Technique) will be the kernel. [fr]

  12. Software Safety Analysis of Digital Protection System Requirements Using a Qualitative Formal Method

    International Nuclear Information System (INIS)

    Lee, Jang-Soo; Kwon, Kee-Choon; Cha, Sung-Deok

    2004-01-01

    The safety analysis of requirements is a key problem area in the development of software for the digital protection systems of a nuclear power plant. When specifying requirements for software of the digital protection systems and conducting safety analysis, engineers find that requirements are often known only in qualitative terms and that existing fault-tree analysis techniques provide little guidance on formulating and evaluating potential failure modes. A framework for the requirements engineering process is proposed that consists of a qualitative method for requirements specification, called the qualitative formal method (QFM), and a safety analysis method for the requirements based on causality information, called the causal requirements safety analysis (CRSA). CRSA is a technique that qualitatively evaluates causal relationships between software faults and physical hazards. This technique, extending the qualitative formal method process and utilizing information captured in the state trajectory, provides specific guidelines on how to identify failure modes and the relationship among them. The QFM and CRSA processes are described using shutdown system 2 of the Wolsong nuclear power plants as the digital protection system example

  13. Z-1 Prototype Space Suit Testing Summary

    Science.gov (United States)

    Ross, Amy

    2013-01-01

    The Advanced Space Suit team of the NASA Johnson Space Center performed a series of tests with the Z-1 prototype space suit in 2012. This paper discusses, at a summary level, the tests performed and their results. The purpose of the tests was two-fold: 1) characterize suit performance so that the data could be used in the down-selection of components for the Z-2 space suit, and 2) develop interfaces with the suitport and exploration vehicles through pressurized suit evaluations. Tests performed included isolated and functional range-of-motion data capture, Z-1 waist and hip testing, joint torque testing, CO2 washout testing, fit checks and subject familiarizations, an exploration vehicle aft deck and suitport controls interface evaluation, delta-pressure suitport tests including pressurized suit donning and doffing, and gross mobility and suitport ingress and egress demonstrations in reduced gravity. Lessons learned specific to the Z-1 prototype and to suit testing techniques are presented.

  14. Hybrid Enhanced Epidermal SpaceSuit Design Approaches

    Science.gov (United States)

    Jessup, Joseph M.

    A space suit that does not rely on gas pressurization poses a multi-faceted design problem that requires major stability controls to be incorporated during design and construction. The Hybrid Epidermal Enhancement (HEE) space suit concept integrates evolved human anthropomorphic and physiological adaptations into its functionality, using commercially available bio-medical technologies to address the shortcomings of conventional gas pressure suits and the impracticalities of mechanical counter-pressure (MCP) suits. The prototype HEE space suit explored integumentary homeostasis, thermal control and mobility using advanced bio-medical materials technology and construction concepts. The goal was a space suit that functions as an enhanced, multi-functional bio-mimic of the human epidermal layer, working in attunement with the wearer rather than as a separate system. In addressing human physiological requirements for the design and construction of the HEE suit, testing regimes were devised and integrated into the prototype, which was then subjected to a series of detailed tests using both anatomical reproduction methods and human subjects.

  15. Software and package applicating for network meta-analysis: A usage-based comparative study.

    Science.gov (United States)

    Xu, Chang; Niu, Yuming; Wu, Junyi; Gu, Huiyun; Zhang, Chao

    2017-12-21

    To compare and analyze the characteristics and functions of software applications for network meta-analysis (NMA), we searched PubMed, EMbase, The Cochrane Library, the official websites of Bayesian inference Using Gibbs Sampling (BUGS), Stata and R, and Google to collect software and packages for performing NMA; software and packages published up to March 2016 were included. After collecting the software, packages, and their user guides, we used them to analyze a typical worked example, and all characteristics, functions, and computed results were compared and analyzed. Ten types of software were included, covering both programming and non-programming tools, developed mainly on Bayesian or frequentist theory. Most are easy to operate and master, calculate accurately, or produce excellent graphics; however, no single program combined accurate calculation with superior graphing, which could only be achieved by combining two or more programs. This study suggests that users should choose the appropriate software according to their programming background, working habits, and budget, and then consider combining BUGS with R (or Stata) to perform the NMA. © 2017 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.
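
    As a reminder of the arithmetic that underlies NMA, here is a minimal Bucher-style indirect comparison with invented numbers (real NMA software handles whole networks, heterogeneity, and inconsistency far more carefully):

        import math

        d_AB, se_AB = -0.50, 0.15   # log odds ratio, B vs common comparator A
        d_AC, se_AC = -0.80, 0.20   # log odds ratio, C vs common comparator A

        # Indirect estimate of C vs B; variances of independent estimates add
        d_BC = d_AC - d_AB
        se_BC = math.sqrt(se_AB ** 2 + se_AC ** 2)
        lo, hi = d_BC - 1.96 * se_BC, d_BC + 1.96 * se_BC

        print(f"indirect log OR (C vs B) = {d_BC:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")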

  16. Development of data acquisition and analysis software for multichannel detectors

    International Nuclear Information System (INIS)

    Chung, Y.

    1988-06-01

    This report describes the development of data acquisition and analysis software for Apple Macintosh computers, capable of controlling two multichannel detectors. With outstanding graphics capabilities, an easy-to-use user interface, and several other built-in convenience features, this application has enhanced the productivity and efficiency of data analysis. 2 refs., 6 figs

  17. An object-oriented software interface to VME

    International Nuclear Information System (INIS)

    Thomas, Timothy L; Gottlieb, Eric; Gold, Michael

    1996-01-01

    In the next millennium, data acquisition tasks for high energy physics will increasingly rely on distributed processing and the VME bus. To provide transparent, general-purpose access to VME hardware modules through a VME-embedded processor, we have created a simple, portable, easily configured object-oriented interface to the VME bus. This software is particularly well-suited for hardware development, providing rapid engineering level access to the VME interface of prototype modules. (author)

  18. OASYS (OrAnge SYnchrotron Suite): an open-source graphical environment for x-ray virtual experiments

    Science.gov (United States)

    Rebuffi, Luca; Sanchez del Rio, Manuel

    2017-08-01

    The evolution of hardware platforms, the modernization of software tools, the access of a large number of young people to the codes, and the popularization of open source software for scientific applications drove us to design OASYS (OrAnge SYnchrotron Suite), a completely new graphical environment for modelling X-ray experiments. The implemented software architecture yields not only an intuitive and very easy-to-use graphical interface, but also provides high flexibility and rapidity for interactive simulations, making configuration changes to quickly compare multiple beamline configurations. Its purpose is to integrate in a synergetic way the most powerful calculation engines available. OASYS integrates different simulation strategies via the implementation of adequate simulation tools for X-ray optics (e.g. ray tracing and wave optics packages). It provides a language to make them communicate by sending and receiving encapsulated data. Python has been chosen as the main programming language because of its universality and popularity in scientific computing. The software Orange, developed at the University of Ljubljana (SLO), is the high-level workflow engine that provides the interaction with the user and the communication mechanisms.

  19. Small-signal modelling and analysis of switching converters using MATLAB

    NARCIS (Netherlands)

    Duarte, J.L.

    1998-01-01

    A general procedure for the description of power electronic circuit dynamics is proposed, with the intention of control system design and discrete-time system simulation. The approach is especially suited to be used along with computer-aided analysis and synthesis software packages such as MATLAB.
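
    Although the paper targets MATLAB, the workflow translates to other environments; below is a sketch of the textbook small-signal duty-to-output transfer function of an ideal buck converter obtained by state-space averaging, with invented component values (not the paper's own model):

        from scipy import signal

        Vin, L, C, R = 12.0, 100e-6, 47e-6, 5.0      # invented converter values

        # G_vd(s) = Vin / (L*C*s^2 + (L/R)*s + 1) for the ideal buck converter
        Gvd = signal.TransferFunction([Vin], [L * C, L / R, 1])

        w, mag, phase = signal.bode(Gvd)             # frequency response for control design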

  20. Development of the Free-space Optical Communications Analysis Software (FOCAS)

    Science.gov (United States)

    Jeganathan, M.; Mecherle, G.; Lesh, J.

    1998-01-01

    The Free-space Optical Communications Analysis Software (FOCAS) was developed at the Jet Propulsion Laboratory (JPL) to provide mission planners, systems engineers and communications engineers with an easy-to-use tool to analyze optical communications links.

  1. Cost and Morbidity Analysis of Chest Port Insertion: Interventional Radiology Suite Versus Operating Room.

    Science.gov (United States)

    LaRoy, Jennifer R; White, Sarah B; Jayakrishnan, Thejus; Dybul, Stephanie; Ungerer, Dirk; Turaga, Kiran; Patel, Parag J

    2015-06-01

    To compare complications and cost, from a hospital perspective, of chest port insertions performed in an interventional radiology (IR) suite versus in surgery in an operating room (OR). This study was approved by an institutional review board and is HIPAA compliant. Medical records were retrospectively searched for consecutive chest port placement procedures, in the IR suite and the OR, between October 22, 2010 and February 26, 2013, to determine patients' demographic information and chest port-related complications and/or infections. A total of 478 charts were reviewed (age range: 21-85 years; 309 women, 169 men). Univariate and bivariate analyses were performed to identify risk factors associated with an increased complication rate. Cost data on 149 consecutive Medicare outpatients (100 treated in the IR suite; 49 treated in the OR) who had isolated chest port insertions between March 2012 and February 2013 were obtained for both the operative services and pharmacy. Nonparametric tests for heterogeneity were performed using the Kruskal-Wallis method. Early complications occurred in 9.2% (22 of 239) of the IR patients versus 13.4% (32 of 239) of the OR patients. Of the 478 implanted chest ports, 9 placed in IR and 18 placed in surgery required early removal. Infections from the ports placed in IR versus the OR were 0.25 versus 0.18 infections per 1000 catheters, respectively. Overall mean costs for chest port insertion were significantly higher in the OR, for both room and pharmacy costs (P < .05); the cost of placing chest ports in an OR setting was almost twice that of placement in the IR suite. Hospital costs to place a chest port were significantly lower in the IR suite than in the OR, whereas radiology and surgery patients did not show a significantly different rate of complications and/or infections. Copyright © 2015 American College of Radiology. Published by Elsevier Inc. All rights reserved.
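
    The Kruskal-Wallis comparison described above takes the following shape in code; the cost figures are invented, since the study's patient-level data are not reproduced here:

        from scipy.stats import kruskal

        ir_costs = [1850, 2010, 1720, 1940, 2100, 1800]   # hypothetical USD values
        or_costs = [3600, 3950, 3410, 3880, 4020, 3700]

        stat, p = kruskal(ir_costs, or_costs)
        print(f"Kruskal-Wallis H = {stat:.2f}, p = {p:.4f}")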

  2. Applied software risk management a guide for software project managers

    CERN Document Server

    Pandian, C Ravindranath

    2006-01-01

    Few software projects are completed on time, on budget, and to their original specifications. Focusing on what practitioners need to know about risk in the pursuit of delivering software projects, Applied Software Risk Management: A Guide for Software Project Managers covers key components of the risk management process and the software development process, as well as best practices for software risk identification, risk planning, and risk analysis. Written in a clear and concise manner, this resource presents concepts and practical insight into managing risk. It first covers risk-driven project management, risk management processes, risk attributes, risk identification, and risk analysis. The book continues by examining responses to risk, the tracking and modeling of risks, intelligence gathering, and integrated risk management. It concludes with details on drafting and implementing procedures. A diary of a risk manager provides insight into implementing risk management processes. Bringing together concepts ...

  3. Meta-DiSc: a software for meta-analysis of test accuracy data.

    Science.gov (United States)

    Zamora, Javier; Abraira, Victor; Muriel, Alfonso; Khan, Khalid; Coomarasamy, Arri

    2006-07-12

    Systematic reviews and meta-analyses of test accuracy studies are increasingly being recognised as central in guiding clinical practice. However, there is currently no dedicated and comprehensive software for meta-analysis of diagnostic data. In this article, we present Meta-DiSc, a Windows-based, user-friendly, freely available (for academic use) software that we have developed, piloted, and validated to perform diagnostic meta-analysis. Meta-DiSc a) allows exploration of heterogeneity, with a variety of statistics including chi-square, I-squared and Spearman correlation tests, b) implements meta-regression techniques to explore the relationships between study characteristics and accuracy estimates, c) performs statistical pooling of sensitivities, specificities, likelihood ratios and diagnostic odds ratios using fixed and random effects models, both overall and in subgroups and d) produces high quality figures, including forest plots and summary receiver operating characteristic curves that can be exported for use in manuscripts for publication. All computational algorithms have been validated through comparison with different statistical tools and published meta-analyses. Meta-DiSc has a Graphical User Interface with roll-down menus, dialog boxes, and online help facilities. Meta-DiSc is a comprehensive and dedicated test accuracy meta-analysis software. It has already been used and cited in several meta-analyses published in high-ranking journals. The software is publicly available at http://www.hrc.es/investigacion/metadisc_en.htm.
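
    For orientation, the per-study quantities that Meta-DiSc pools can be computed from 2x2 tables as in this sketch (counts invented; pooling would then proceed by inverse-variance weighting as in standard meta-analysis):

        # Each study: (true positives, false positives, false negatives, true negatives)
        studies = [(90, 10, 15, 185), (45, 8, 5, 92)]

        for tp, fp, fn, tn in studies:
            sens = tp / (tp + fn)
            spec = tn / (tn + fp)
            lr_pos = sens / (1 - spec)            # positive likelihood ratio
            dor = (tp * tn) / (fp * fn)           # diagnostic odds ratio
            print(f"sens={sens:.2f} spec={spec:.2f} LR+={lr_pos:.2f} DOR={dor:.1f}")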

  4. Software package for the design and analysis of DNA origami structures

    DEFF Research Database (Denmark)

    Andersen, Ebbe Sloth; Nielsen, Morten Muhlig; Dong, Mingdong

    A software package was developed for the semi-automated design of DNA origamis and further data analysis of Atomic Force Microscopy (AFM) images. As an example, we design the shape of a bottlenose dolphin and analyze it by means of high resolution AFM imaging. A high yield of DNA dolphins was observed on the mica surface, with a fraction of the dolphin nanostructures showing extensive tail flexibility of approximately 90 degrees. The Java editor and tools are free software distributed under the GNU license. The open architecture of the editor makes it easy for the scientific community to contribute new tools and functionalities. Documentation, tutorials and software will be made available online.

  5. The software safety analysis based on SFTA for reactor power regulating system in nuclear power plant

    International Nuclear Information System (INIS)

    Liu Zhaohui; Yang Xiaohua; Liao Longtao; Wu Zhiqiang

    2015-01-01

    The digitalized Instrumentation and Control (I&C) systems of nuclear power plants provide many advantages. However, digital control systems introduce new failure modes that differ from those of analog control systems. While the cost effectiveness and flexibility of software is widely recognized, it is very difficult to achieve and prove high levels of dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. Software safety analysis (SSA) is one way to improve software safety by identifying the system hazards caused by software failure. This paper describes the application of software fault tree analysis (SFTA) at the software design phase. First, we evaluated all the software modules of the reactor power regulating system in a nuclear power plant and identified various hazards. SFTA was then applied to some critical modules selected in the previous step. Finally, we identified some new hazards that had not been found in the earlier document-evaluation process, which was helpful for our design. (author)

  6. The use of CDS/ISIS software in Africa | Abboy | Innovation

    African Journals Online (AJOL)

    CDS/ISIS is free-of-charge software that has been developed by UNESCO and is distributed through a network of country distributors and more recently through the World Wide Web. CDS/ISIS is ideally suited for the manipulation of bibliographic data. Literature sources reveal that the use of CDS/ISIS is widespread in Africa ...

  7. Standard gamma-ray spectra for the comparison of spectral analysis software

    International Nuclear Information System (INIS)

    Woods, S.; Hemingway, J.; Bowles, N.

    1997-01-01

    Three sets of standard γ-ray spectra have been produced for use in assessing the performance of spectral analysis software. The origin of and rationale behind the spectra are described. Nine representative analysis systems have been tested both in terms of component performance and in terms of overall performance and the problems encountered in the analysis are discussed. (author)

  8. Standard gamma-ray spectra for the comparison of spectral analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Woods, S.; Hemingway, J.; Bowles, N. [and others]

    1997-08-01

    Three sets of standard {gamma}-ray spectra have been produced for use in assessing the performance of spectral analysis software. The origin of and rationale behind the spectra are described. Nine representative analysis systems have been tested both in terms of component performance and in terms of overall performance and the problems encountered in the analysis are discussed. (author)

  9. Using Business Analysis Software in a Business Intelligence Course

    Science.gov (United States)

    Elizondo, Juan; Parzinger, Monica J.; Welch, Orion J.

    2011-01-01

    This paper presents an example of a project used in an undergraduate business intelligence class which integrates concepts from statistics, marketing, and information systems disciplines. SAS Enterprise Miner software is used as the foundation for predictive analysis and data mining. The course culminates with a competition and the project is used…

  10. Analysis and design of software ecosystem architectures – Towards the 4S telemedicine ecosystem

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Hansen, Klaus Marius; Kyng, Morten

    2014-01-01

    Healthcare delivery is challenged by organization and application stove-pipes that inhibit the adoption of telemedical solutions. To which extent can a software ecosystem approach to telemedicine alleviate this? In this article, we define the concept of software ecosystem architecture as the structure(s) of a software ecosystem comprising elements, the relations among them, and the properties of both. To investigate the approach, we (i) performed a descriptive, revelatory case study of the Danish telemedicine ecosystem and (ii) experimentally designed, implemented, and evaluated the architecture of 4S. We contribute in three areas: we define the software ecosystem architecture concept, we apply it analytically to the Danish telemedicine ecosystem, and we report experience in creating and evolving the 4S telemedicine ecosystem. In conclusion, the concept of software ecosystem architecture can be used analytically and constructively in, respectively, the analysis and design of software ecosystems.

  11. A coverage and slicing dependencies analysis for seeking software security defects.

    Science.gov (United States)

    He, Hui; Zhang, Dongyan; Liu, Min; Zhang, Weizhe; Gao, Dongmin

    2014-01-01

    Software security defects have a serious impact on software quality and reliability, and security flaws are a major hidden danger to the operation of a system. As the scale of software increases, its vulnerabilities become much more difficult to find, and once these vulnerabilities are exploited, they may lead to great loss. In this situation, the concept of Software Assurance was put forward by experts, and automated fault localization techniques are one part of Software Assurance research. Current automated fault localization methods include coverage-based fault localization (CBFL) and program slicing; both methods have their own advantages and shortcomings. In this paper, we put forward a new model, named the Reverse Data Dependence Analysis Model, which integrates the two methods by analyzing the program structure. On this basis, we propose a new automated fault localization method. This method is not only lossless in its automation but also reduces the basic localization unit to a single statement, which makes localization more accurate. Through several experiments, we show that our method is more effective. Furthermore, we analyze the effectiveness of the existing methods on different faults.
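
    The paper's own model is not reproduced here, but the CBFL side it builds on can be sketched with the widely used Ochiai metric, which ranks statements by their association with failing tests (coverage data invented):

        import math

        coverage = {                      # test -> statements it executed
            "t1": {1, 2, 3}, "t2": {1, 3, 4}, "t3": {1, 2, 4}, "t4": {1, 4},
        }
        failed = {"t2", "t4"}             # tests that expose the fault

        for stmt in sorted(set().union(*coverage.values())):
            ef = sum(1 for t, cov in coverage.items() if stmt in cov and t in failed)
            ep = sum(1 for t, cov in coverage.items() if stmt in cov and t not in failed)
            score = ef / math.sqrt(len(failed) * (ef + ep)) if ef else 0.0
            print(f"statement {stmt}: Ochiai = {score:.2f}")   # statement 4 ranks highest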

  12. ATLAS software stack on ARM64

    CERN Document Server

    Smith, Joshua Wyatt; The ATLAS collaboration

    2016-01-01

    The ATLAS experiment explores new hardware and software platforms that, in the future, may be more suited to its data-intensive workloads. One such alternative hardware platform is the ARM architecture, which is designed to be extremely power efficient and is found in most smartphones and tablets. CERN openlab recently installed a small cluster of ARM 64-bit evaluation prototype servers. Each server is based on a single-socket ARM 64-bit system on a chip with 32 Cortex-A57 cores. In total, each server has 128 GB RAM connected with four fast memory channels. This paper reports on the port of the ATLAS software stack onto these new prototype ARM64 servers. This included building the "external" packages that the ATLAS software relies on. Patches were needed to introduce this new architecture into the build, as well as patches correcting platform-specific code that caused failures on non-x86 architectures. These patches were applied such that porting to further platforms will need no or only very little adjustment.

  13. Free/open source software: a study of some applications for scientific data analysis of nuclear experiments

    Energy Technology Data Exchange (ETDEWEB)

    Menezes, Mario Olimpio de [Instituto de Pesquisas Energeticas e Nucleares (IPEN), Sao Paulo, SP (Brazil)]. E-mail: mario@ipen.br; mo.menezes@gmail.com

    2005-07-01

    Free/Open Source Software (FOSS) has been used in science long before the formal social movement known as 'Free Software/Open Source Software' came into existence. After the Personal Computer (PC) boom in the 80s, commercial closed source software became widely available to scientists for data analysis on this platform. In this paper, we study some high-quality FOSS, also available free of charge, that can be used for complex data analysis tasks. We show the data analysis process and its results, aiming to expose the high quality and high productivity of both, while highlighting the different approaches used in some of the FOSS. We show that scientists today have in FOSS a viable, high-quality alternative to commercial closed source software which, besides being ready to use, also offers the possibility of extensive customization or extension to fit the very particular needs of many fields of scientific data analysis. Among the FOSS, we study in this paper GNU Octave and SCILAB, free alternatives to MATLAB, and Gnuplot, a free alternative to ORIGIN-like software. We also show that scientists have invaluable resources in modern FOSS programming languages such as Python and Perl, which can be used both for data analysis and for data manipulation, allowing very complex tasks to be automated with a few lines of simple programming. (author)

  14. Free/open source software: a study of some applications for scientific data analysis of nuclear experiments

    International Nuclear Information System (INIS)

    Menezes, Mario Olimpio de

    2005-01-01

    Free/Open Source Software (FOSS) has been used in science long before the formal social movement known as 'Free Software/Open Source Software' came into existence. After the Personal Computer (PC) boom in the 80s, commercial closed source software became widely available to scientists for data analysis on this platform. In this paper, we study some high-quality FOSS, also available free of charge, that can be used for complex data analysis tasks. We show the data analysis process and its results, aiming to expose the high quality and high productivity of both, while highlighting the different approaches used in some of the FOSS. We show that scientists today have in FOSS a viable, high-quality alternative to commercial closed source software which, besides being ready to use, also offers the possibility of extensive customization or extension to fit the very particular needs of many fields of scientific data analysis. Among the FOSS, we study in this paper GNU Octave and SCILAB, free alternatives to MATLAB, and Gnuplot, a free alternative to ORIGIN-like software. We also show that scientists have invaluable resources in modern FOSS programming languages such as Python and Perl, which can be used both for data analysis and for data manipulation, allowing very complex tasks to be automated with a few lines of simple programming. (author)

  15. The impact of new accelerator control software on LEP performance

    International Nuclear Information System (INIS)

    Bailey, R.; Belk, A.; Collier, P.; Lamont, M.; Rigk, G. de; Tarrant, M.

    1993-01-01

    After the first year of running LEP, it became apparent that a new generation of application software would be required for efficient long term exploitation of the accelerator. In response to this need, a suite of accelerator control software has been developed, which is new both in style and functionality. During 1992 this software has been extensively used for driving LEP in many different operational modes, which include several different optics, polarisation runs at different energies and 8 bunch operation with Pretzels. The software has performed well and has undoubtedly enhanced the efficiency of accelerator operations. In particular the turnaround time has been significantly reduced, giving an increase of around 20% in the integrated luminosity for the year. Furthermore the software has made the accelerator accessible to less experienced operators. After outlining the development strategy, the overall functionality and performance of the software is discussed, with particular emphasis on improvements in operating efficiency. Some evaluation of the performance and reliability of ORACLE as an on-line database is also given

  16. Static analysis of software the abstract interpretation

    CERN Document Server

    Boulanger, Jean-Louis

    2013-01-01

    The existing literature currently available to students and researchers is very general, covering only the formal techniques of static analysis. This book presents real examples of the formal techniques called "abstract interpretation" currently being used in various industrial fields: railway, aeronautics, space, automotive, etc. The purpose of this book is to present students and researchers, in a single book, with the wealth of experience of people who are intrinsically involved in the realization and evaluation of software-based safety critical systems. As the authors are people curr

  17. HITCal: a software tool for analysis of video head impulse test responses.

    Science.gov (United States)

    Rey-Martinez, Jorge; Batuecas-Caletrio, Angel; Matiño, Eusebi; Perez Fernandez, Nicolás

    2015-09-01

    The developed software (HITCal) may be a useful tool in the analysis and measurement of saccadic video head impulse test (vHIT) responses; with the experience obtained during its use, the authors suggest that HITCal is an excellent method for enhanced exploration of vHIT outputs. The objective was to develop a (software) method to analyze and explore vHIT responses, mainly saccades. HITCal was written using a computational development program; the function to access a vHIT file was programmed, extended head impulse exploration and measurement tools were created, and an automated saccade analysis was developed using an experimental algorithm. For pre-release HITCal laboratory tests, a database of head impulse tests (HITs) was created with data collected retrospectively in three reference centers. This HITs database was evaluated by humans and was also computed with HITCal. The authors successfully built HITCal and it has been released as open source software; the developed software was fully operative and all the proposed characteristics were incorporated in the released version. The automated saccades algorithm implemented in HITCal has good concordance with assessment by human observers (Cohen's kappa coefficient = 0.7).
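
    HITCal's adaptive algorithm is described only at a high level, but the usual starting point, a velocity-threshold saccade picker, can be sketched as follows (threshold and trace invented):

        import numpy as np

        def detect_saccades(velocity, threshold=60.0):
            """Return (start, end) sample pairs where |velocity| exceeds threshold.

            Assumes the trace starts and ends below threshold.
            """
            above = (np.abs(np.asarray(velocity)) > threshold).astype(int)
            rises = np.flatnonzero(np.diff(above) == 1) + 1
            falls = np.flatnonzero(np.diff(above) == -1) + 1
            return list(zip(rises, falls))

        trace = np.zeros(500)             # synthetic eye-velocity trace, deg/s
        trace[200:240] = 150.0            # one saccade-like burst
        print(detect_saccades(trace))     # -> [(200, 240)]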

  18. The BRITNeY Suite Animation Tool

    DEFF Research Database (Denmark)

    Westergaard, Michael; Lassen, Kristian Bisgaard

    2006-01-01

    This paper describes the BRITNeY suite, a tool which enables users to create visualizations of formal models. BRITNeY suite is integrated with CPN Tools, and we give an example of how to extend a simple stop-and-wait protocol with a visualization in the form of message sequence charts. We also show examples of animations created during industrial projects to give an impression of what is possible with the BRITNeY suite.

  19. Formal methods in software development: A road less travelled

    Directory of Open Access Journals (Sweden)

    John A van der Poll

    2010-08-01

    Full Text Available An integration of traditional verification techniques and formal specifications in software engineering is presented. Advocates of such techniques claim that mathematical formalisms allow them to produce quality, verifiably correct, or at least highly dependable software and that the testing and maintenance phases are shortened. Critics on the other hand maintain that software formalisms are hard to master, tedious to use and not well suited for the fast turnaround times demanded by industry. In this paper some popular formalisms and the advantages of using these during the early phases of the software development life cycle are presented. Employing the Floyd-Hoare verification principles during the formal specification phase facilitates reasoning about the properties of a specification. Some observations that may help to alleviate the formal-methods controversy are established and a number of formal-methods successes are presented. Possible conditions for an increased acceptance of formalisms in software development are discussed.
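
    As a one-line illustration of the Floyd-Hoare principles mentioned above (a textbook example, not drawn from the paper), the assignment axiom yields triples of the following form, which is the kind of property one reasons about during the formal specification phase:

        % Hoare triple for an assignment: if x = n holds before x := x + 1,
        % then x = n + 1 holds afterwards.
        \[
        \{\, x = n \,\}\quad x := x + 1 \quad \{\, x = n + 1 \,\}
        \]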

  20. An analysis of unit tests of a flight software product line

    NARCIS (Netherlands)

    Ganesan, D.; Lindvall, M.; McComas, D.; Bartholomew, M.; Slegel, S.; Medina, B.; Krikhaar, R.; Verhoef, C.; Dharmalingam, G.; Montgomery, L.P.

    2013-01-01

    This paper presents an analysis of the unit testing approach developed and used by the Core Flight Software System (CFS) product line team at the NASA Goddard Space Flight Center (GSFC). The goal of the analysis is to understand, review, and recommend strategies for improving the CFS' existing unit

  1. Variety in Performance: A comparative Analysis of Recorded Performances of Bach's Sixth Suite for Solo Cello from 1961 to 1998

    Directory of Open Access Journals (Sweden)

    Alistair Sung

    2011-08-01

    Full Text Available There exists a commonly held belief amongst musicologists that there has been a "general globalisation of styles" within recorded performances of the latter half of the twentieth century. For many, this is evidenced by an increased interaction between mainstream (MS) and historically informed performance (HIP) practices as well as a general decrease in the diversity of performances. Through a comparative aural and software-assisted analysis and measurement of performance features such as tempo, rhythmic flexibility, vibrato, portamento, bowing, and articulation in key interpretations of Bach's Suite No. 6 for Solo Cello recorded between 1961 and 1998, this study investigates the relationship between MS and HIP performances in the latter half of the 20th century. By limiting analyses to those performers who have made two recordings during the designated period (Tortelier, Starker, Bylsma, Ma, Wispelwey), it has been possible to identify overarching trends and individual differences. These have been examined with reference to their broader social and cultural context in order to test commonly held musicological assumptions about the modernist and postmodernist foundations of performances from this era. The results indicate that by the 1990s increased interaction between MS and HIP practices had resulted in a wider variety of differing performances, and that this can be seen as symptomatic of the postmodern condition that has become prevalent since the last decade of the 20th century.

  2. Enhancements to the opera-3d suite

    International Nuclear Information System (INIS)

    Riley, C.P.

    1997-01-01

    The OPERA-3D suite of programs has been enhanced to include two additional three-dimensional finite element based solvers, with complementary features in the pre- and postprocessing. SOPRANO computes electromagnetic fields at high frequency including displacement current effects. It has two modules: a deterministic solution at a user-defined frequency and an eigenvalue solution for modal analysis. It is suitable for designing microwave structures and cavities found in particle accelerators. SCALA computes electrostatic fields in the presence of space charge from charged particle beams. The user may define the emission characteristics of electrodes or plasma surfaces and compute the resultant space-charge-limited beams, including the presence of magnetic fields. Typical applications in particle accelerators are electron guns and ion sources. Other enhancements to the suite include additional capabilities in TOSCA and ELEKTRA, the static and dynamic solvers. copyright 1997 American Institute of Physics

  3. Microbial contamination determination of Cream suit, Traditional Ice Cream and Olovia in Yasuj City

    Directory of Open Access Journals (Sweden)

    SS Khoramrooz

    2015-09-01

    Full Text Available Background & aim: Diseases caused by the consumption of contaminated food have always been a problem all over the world, and the cost of treating them every year is considerable. Cream suit, ice cream and Olovia, because of their ingredients and their manufacturing and storage conditions, are highly susceptible to contamination. The aim of this study was to determine the microbial contamination of cream suit, traditional ice cream and Olovia in Yasuj city. Methods: This randomized cross-sectional study was performed on 64 samples. The samples were taken from ice cream and confectionery shops in Yasuj city, kept in a cold box and transported under sterile conditions to the medical microbiology laboratory of the medical university of Yasuj, where the rate of microbial contamination was evaluated by the national standard method. The collected data were analysed with SPSS software using descriptive statistics (measures of central tendency and dispersion), frequency tables and charts. Results: The results showed that 40% of the traditional ice cream and cream suit samples were infected; contamination with Staphylococcus aureus, Escherichia coli and Salmonella was (6.7, 87 and 0), (50, 30 and 0) and (0, 0 and 0) percent respectively, and no bacteria were found in Olovia. Conclusion: The contamination of traditional ice cream and cream suit with Staphylococcus aureus and Escherichia coli was very high. Measures to control bacterial growth, especially of E. coli, which is mostly transmitted by the fecal-oral route, including the use of healthy and safe raw materials and the promotion of health awareness among people involved in food preparation and production, are therefore essential.

  4. Model-Based GN and C Simulation and Flight Software Development for Orion Missions beyond LEO

    Science.gov (United States)

    Odegard, Ryan; Milenkovic, Zoran; Henry, Joel; Buttacoli, Michael

    2014-01-01

    For Orion missions beyond low Earth orbit (LEO), the Guidance, Navigation, and Control (GN&C) system is being developed using a model-based approach for simulation and flight software. Lessons learned from the development of GN&C algorithms and flight software for the Orion Exploration Flight Test One (EFT-1) vehicle have been applied to the development of further capabilities for Orion GN&C beyond EFT-1. Continuing the use of a Model-Based Development (MBD) approach with the Matlab®/Simulink® tool suite, the process for GN&C development and analysis has been largely improved. Furthermore, a model-based simulation environment in Simulink, rather than an external C-based simulation, greatly eases the process for development of flight algorithms. The benefits seen by employing lessons learned from EFT-1 are described, as well as the approach for implementing additional MBD techniques. Also detailed are the key enablers for improvements to the MBD process, including enhanced configuration management techniques for model-based software systems, automated code and artifact generation, and automated testing and integration.

  5. Dependability Assessment by Static Analysis of Software Important to Nuclear Power Plant Safety

    Energy Technology Data Exchange (ETDEWEB)

    Ourghanlian, Alain [EDF Lab, Chatou (France)

    2014-08-15

    We describe a practical experimentation of safety assessment of safety-critical software used in Nuclear Power Plants. To enhance the credibility of safety assessments and to optimize safety justification costs, Electricite de France (EDF) investigates the use of methods and tools for source code semantic analysis, to obtain indisputable evidence and help assessors focus on the most critical issues. EDF has been using the PolySpace tool for more than 10 years. Today, new industrial tools, based on the same formal approach, Abstract Interpretation, are available. Practical experimentation with these new tools shows that the precision obtained on the software of one of our shutdown systems is very significantly improved. In the first part, we present the analysis principles of the tools used in our experimentation. In the second part, we present the main characteristics of protection-system software, and why these characteristics are well adapted to the new analysis tools. In the last part, we present an overview of the results and the limitations of the tools.
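
    For readers unfamiliar with Abstract Interpretation, the following minimal sketch (purely illustrative, unrelated to PolySpace or the EDF tools) shows its simplest instance, the interval domain, where each variable is over-approximated by a range instead of a concrete value:

        # Interval domain: a sound over-approximation of integer values.
        class Interval:
            def __init__(self, lo, hi):
                self.lo, self.hi = lo, hi

            def __add__(self, other):
                # Sound addition: the result contains every concrete sum.
                return Interval(self.lo + other.lo, self.hi + other.hi)

            def join(self, other):
                # Least upper bound: merges states at control-flow joins.
                return Interval(min(self.lo, other.lo), max(self.hi, other.hi))

            def __repr__(self):
                return f"[{self.lo}, {self.hi}]"

        # x comes from a sensor known to return 0..100; y is x + 5 on one
        # branch and x + 10 on the other, so the analyser derives y in [5, 110].
        x = Interval(0, 100)
        y = (x + Interval(5, 5)).join(x + Interval(10, 10))
        print(y)  # [5, 110] -- an assertion such as y <= 110 is proved safe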

  6. Software Analysis of New Space Gravity Data for Geophysics and Climate Research

    Science.gov (United States)

    Deese, Rupert; Ivins, Erik R.; Fielding, Eric J.

    2012-01-01

    Both the Gravity Recovery and Climate Experiment (GRACE) and Gravity field and steady-state Ocean Circulation Explorer (GOCE) satellites are returning rich data for the study of the solid earth, the oceans, and the climate. Current software analysis tools do not provide researchers with the ease and flexibility required to make full use of these data. We evaluate the capabilities and shortcomings of existing software tools including Mathematica, the GOCE User Toolbox, the ICGEM's (International Center for Global Earth Models) web server, and Tesseroids. Using existing tools as necessary, we design and implement software with the capability to produce gridded data and publication-quality renderings from raw gravity data. The straightforward software interface marks an improvement over previously existing tools and makes new space gravity data more useful to researchers. Using the software we calculate Bouguer anomalies of the gravity tensor's vertical component in the Gulf of Mexico, Antarctica, and the 2010 Maule earthquake region. These maps identify promising areas of future research.

  7. Analysis and recommendations for a reliable programming of software based safety systems

    International Nuclear Information System (INIS)

    Nunez McLeod, J.; Nunez McLeod, J.E.; Rivera, S.S.

    1997-01-01

    The present paper summarizes the results of several studies performed for the development of high-reliability software on i486 microprocessors, towards its utilization in control and safety systems for nuclear power plants. The work is based on software programmed in C language. Several recommendations oriented to high-reliability software are analyzed, relating the requirements on the high-level language to their influence at assembler level. Several metrics are implemented that allow for the quantification of the results achieved. New metrics were developed and others were adapted, in order to obtain more efficient indexes for the software description. Such metrics are helpful to visualize the adaptation of the software under development to the quality rules in use. A specific program developed to assist the reliability analyst in this quantification is also presented in the paper. It performs the analysis of an executable program written in C language, disassembling it and evaluating its internal structures. (author)

  8. New analysis software for Viking Lander meteorological data

    Directory of Open Access Journals (Sweden)

    O. Kemppinen

    2013-02-01

    Full Text Available We have developed a set of tools that enable us to process Viking Lander meteorological data beyond what has been previously publicly available. Besides providing data for new periods of time, the existing data periods have been augmented by enhancing the data resolution significantly. This was accomplished by first transferring the original Prime computer version of the data analysis software to a standard Linux platform, and then by modifying the software to be able to process the data despite irregularities in the original raw data and reverse engineering various parameter files. In addition to this, the processing pipeline has been streamlined, making processing the data faster and easier. As a case example of new data, freshly processed Viking Lander 1 and 2 temperature records are described and briefly analyzed in ways that have not been previously possible due to the lack of data.

  9. UTOOLS: microcomputer software for spatial analysis and landscape visualization.

    Science.gov (United States)

    Alan A. Ager; Robert J. McGaughey

    1997-01-01

    UTOOLS is a collection of programs designed to integrate various spatial data in a way that allows versatile spatial analysis and visualization. The programs were designed for watershed-scale assessments in which a wide array of resource data must be integrated, analyzed, and interpreted. UTOOLS software combines raster, attribute, and vector data into "spatial...

  10. Deep in Data. Empirical Data Based Software Accuracy Testing Using the Building America Field Data Repository

    Energy Technology Data Exchange (ETDEWEB)

    Neymark, J. [J.Neymark and Associates, Golden, CO (United States); Roberts, D. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-06-01

    This paper describes progress toward developing a usable, standardized, empirical data-based software accuracy test suite using home energy consumption and building description data. Empirical data collected from around the United States have been translated into a uniform Home Performance Extensible Markup Language format that may enable software developers to create translators to their input schemes for efficient access to the data. This could allow for modeling many homes expediently, and thus implementing software accuracy test cases by applying the translated data.

  11. Hardware and software constructs for a vibration analysis network

    International Nuclear Information System (INIS)

    Cook, S.A.; Crowe, R.D.; Toffer, H.

    1985-01-01

    Vibration level monitoring and analysis has been initiated at N Reactor, the dual purpose reactor operated at Hanford, Washington by UNC Nuclear Industries (UNC) for the Department of Energy (DOE). The machinery to be monitored was located in several buildings scattered over the plant site, necessitating an approach using satellite stations to collect, monitor and temporarily store data. The satellite stations are, in turn, linked to a centralized processing computer for further analysis. The advantages of a networked data analysis system are discussed in this paper along with the hardware and software required to implement such a system

  12. Decision making model design for antivirus software selection using Factor Analysis and Analytical Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Nurhayati Ai

    2018-01-01

    Full Text Available Virus spread increased significantly through the internet in 2017. One protection method is using antivirus software. The wide variety of antivirus software in the market tends to create confusion among consumers, and selecting the right antivirus according to their needs has become difficult. This is the reason we conducted our research. We formulate a decision making model for antivirus software consumers. The model is constructed using factor analysis and the AHP method. First we spread questionnaires to consumers; from those questionnaires we identified 16 variables that need to be considered when selecting antivirus software. These 16 variables were then grouped into 5 factors by using the factor analysis method in SPSS software. These five factors are security, performance, internal, time and capacity. To rank those factors we spread questionnaires to 6 IT experts and analyzed the data using the AHP method. The result is that the performance factor gained the highest rank among all the factors. Thus, consumers can select antivirus software by judging the variables in the performance factor. Those variables are software loading speed, user friendliness, no excessive memory use, thorough scanning, and fast and accurate virus scanning.
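
    The AHP ranking step described above reduces to deriving a weight vector from a pairwise-comparison matrix. A minimal sketch follows (the matrix values are hypothetical, not the paper's survey data; the geometric-mean method is used as a standard approximation of the principal eigenvector):

        import math

        # Hypothetical pairwise comparisons for three factors: entry [i][j]
        # says how much more important factor i is than factor j.
        pairwise = [
            [1.0, 3.0, 5.0],
            [1/3, 1.0, 2.0],
            [1/5, 1/2, 1.0],
        ]

        # Geometric mean of each row, normalised to obtain the weights.
        geo_means = [math.prod(row) ** (1 / len(row)) for row in pairwise]
        total = sum(geo_means)
        weights = [g / total for g in geo_means]

        for name, w in zip(["performance", "security", "capacity"], weights):
            print(f"{name}: {w:.3f}")  # the first factor gets the largest weight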

  13. TLM-Tracker: software for cell segmentation, tracking and lineage analysis in time-lapse microscopy movies.

    Science.gov (United States)

    Klein, Johannes; Leupold, Stefan; Biegler, Ilona; Biedendieck, Rebekka; Münch, Richard; Jahn, Dieter

    2012-09-01

    Time-lapse imaging in combination with fluorescence microscopy techniques enables the investigation of gene regulatory circuits and has uncovered phenomena such as culture heterogeneity. In this context, computational image processing for the analysis of single cell behaviour plays an increasing role in systems biology and mathematical modelling approaches. Consequently, we developed a software package with a graphical user interface for the analysis of single bacterial cell behaviour. The new software, TLM-Tracker, allows for the flexible and user-friendly segmentation, tracking and lineage analysis of microbial cells in time-lapse movies. The software package, including manual, tutorial video and examples, is available as Matlab code or executable binaries at http://www.tlmtracker.tu-bs.de.

  14. Enabling interoperability in Geoscience with GI-suite

    Science.gov (United States)

    Boldrini, Enrico; Papeschi, Fabrizio; Santoro, Mattia; Nativi, Stefano

    2015-04-01

    GI-suite is a brokering framework targeting interoperability of heterogeneous systems in the Geoscience domain. The framework is composed of different brokers, each one focusing on a specific functionality: discovery, access and semantics (i.e. GI-cat, GI-axe, GI-sem). The brokering takes place between a set of heterogeneous publishing services and a set of heterogeneous consumer applications: the brokering target is represented by resources (e.g. coverages, features, or metadata information) required to seamlessly flow from the providers to the consumers. Different international and community standards are now supported by GI-suite, making possible the successful deployment of GI-suite in many international projects and initiatives (such as GEOSS, NSF BCube and several EU funded projects). On the publisher side, more than 40 standards and implementations are supported (e.g. Dublin Core, OAI-PMH, OGC W*S, Geonetwork, THREDDS Data Server, Hyrax Server, etc.). The support for each individual standard is provided by means of specific GI-suite components, called accessors. On the consumer application side, more than 15 standards and implementations are supported (e.g. ESRI ArcGIS, Openlayers, OGC W*S, OAI-PMH clients, etc.). The support for each individual standard is provided by means of specific profiler components. The GI-suite can be used in different scenarios by different actors: - A data provider with a pre-existing data repository can deploy and configure GI-suite to broker it, thus making its data resources available through different protocols to many different users (e.g. for data discovery and/or data access) - A data consumer can use GI-suite to discover and/or access resources from a variety of publishing services that are already publishing data according to well-known standards. - A community can deploy and configure GI-suite to build a community (or project-specific) broker: GI-suite can broker a set of community related repositories and

  15. Scrum2Kanban: Integrating Kanban and Scrum in a University Software Engineering Capstone Course

    OpenAIRE

    Matthies, Christoph

    2018-01-01

    Using university capstone courses to teach agile software development methodologies has become commonplace, as agile methods have gained support in professional software development. This usually means students are introduced to and work with the currently most popular agile methodology: Scrum. However, as the agile methods employed in the industry change and are adapted to different contexts, university courses must follow suit. A prime example of this is the Kanban method, which has recentl...

  16. The SAVI Vulnerability Analysis Software Package

    International Nuclear Information System (INIS)

    Mc Aniff, R.J.; Paulus, W.K.; Key, B.; Simpkins, B.

    1987-01-01

    SAVI (Systematic Analysis of Vulnerability to Intrusion) is a new PC-based software package for modeling Physical Protection Systems (PPS). SAVI utilizes a path analysis approach based on the Adversary Sequence Diagram (ASD) methodology. A highly interactive interface allows the user to accurately model complex facilities, maintain a library of these models on disk, and calculate the most vulnerable paths through any facility. Recommendations are provided to help the user choose facility upgrades which should reduce identified path vulnerabilities. Pop-up windows throughout SAVI are used for the input and display of information. A menu at the top of the screen presents all options to the user. These options are further explained on a message line directly below the menu. A diagram on the screen graphically represents the current protection system model. All input is checked for errors, and data are presented in a logical and clear manner. Print utilities provide the user with hard copies of all information and calculated results
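
    The path analysis described above can be pictured as a shortest-path search over the Adversary Sequence Diagram. A minimal sketch (a hypothetical facility model and scoring scheme, not SAVI's actual algorithm) using Dijkstra's algorithm to find the lowest-detection path:

        import heapq

        # Hypothetical facility: node -> list of (neighbour, detection score);
        # the most vulnerable path minimises the accumulated score.
        asd = {
            "offsite":  [("fence", 2.0), ("gate", 5.0)],
            "fence":    [("yard", 1.0)],
            "gate":     [("yard", 0.5)],
            "yard":     [("building", 3.0)],
            "building": [("vault", 4.0)],
            "vault":    [],
        }

        def most_vulnerable_path(graph, source, target):
            """Dijkstra search for the minimum-detection path."""
            queue = [(0.0, source, [source])]
            seen = set()
            while queue:
                score, node, path = heapq.heappop(queue)
                if node == target:
                    return score, path
                if node in seen:
                    continue
                seen.add(node)
                for nxt, w in graph[node]:
                    if nxt not in seen:
                        heapq.heappush(queue, (score + w, nxt, path + [nxt]))
            return float("inf"), []

        print(most_vulnerable_path(asd, "offsite", "vault"))
        # -> (10.0, ['offsite', 'fence', 'yard', 'building', 'vault'])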

  17. Software phantom with realistic speckle modeling for validation of image analysis methods in echocardiography

    Science.gov (United States)

    Law, Yuen C.; Tenbrinck, Daniel; Jiang, Xiaoyi; Kuhlen, Torsten

    2014-03-01

    Computer-assisted processing and interpretation of medical ultrasound images is one of the most challenging tasks within image analysis. Physical phenomena in ultrasonographic images, e.g., the characteristic speckle noise and shadowing effects, make the majority of standard methods from image analysis suboptimal. Furthermore, validation of adapted computer vision methods proves to be difficult due to missing ground truth information. There is no widely accepted software phantom in the community, and existing software phantoms are not flexible enough to support the use of specific speckle models for different tissue types, e.g., muscle and fat tissue. In this work we propose an anatomical software phantom with a realistic speckle pattern simulation to fill this gap and provide a flexible tool for validation purposes in medical ultrasound image analysis. We discuss the generation of speckle patterns and perform statistical analysis of the simulated textures to obtain quantitative measures of the realism and accuracy of the resulting textures.
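
    A common first approximation of the speckle phenomenon discussed above is multiplicative Rayleigh noise. The following minimal sketch (a generic model, not the paper's simulation) applies it to an idealized echogenicity map:

        import numpy as np

        rng = np.random.default_rng(42)

        # Hypothetical noise-free "tissue": a bright disc on a dark background.
        size = 128
        y, x = np.mgrid[:size, :size]
        inside = (x - 64) ** 2 + (y - 64) ** 2 < 30 ** 2
        tissue = np.where(inside, 0.8, 0.2)

        # Multiplicative speckle: each pixel is scaled by a Rayleigh-distributed
        # amplitude, so the noise strength follows the local echogenicity.
        noisy = tissue * rng.rayleigh(scale=1.0, size=tissue.shape)

        print(f"mean inside disc: {noisy[inside].mean():.3f}")
        print(f"mean background : {noisy[~inside].mean():.3f}")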

  18. What was uniform about the fin-de-siècle sailor suit?

    Science.gov (United States)

    Rose, Clare

    2011-01-01

    The sailor suits widely worn by children in late-nineteenth-century Britain have been interpreted at the time, and since, as expressions of an Imperial ethos. Yet, a closer examination of the ways that these garments were produced by mass manufacturers, mediated by advertisers and fashion advisors and consumed by families makes us question this characterization. Manufacturers interpreted sailor suits not as unchanging uniforms but as fashion items responding to seasonal changes. Consumers used them to assert social identities and social distinctions, selecting from the multiple variants available. Cultural commentators described sailor suits as emulating Royal practice—but also as ‘common’ and to be avoided. A close analysis of large samples of images and texts from the period 1870–1900 reveals how these different meanings overlapped, making the fin-de-siècle sailor suit a garment that undermines many of our assumptions.

  19. Application of software technology to automatic test data analysis

    Science.gov (United States)

    Stagner, J. R.

    1991-01-01

    The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.

  20. ANALYSIS OF CONTEMPORARY SOFTWARE BEING USED FOR FORWARDING SERVICES

    Directory of Open Access Journals (Sweden)

    Naumov, V.

    2013-01-01

    Full Text Available The role of information technologies in forwarding services has been specified. The typical structure of logistics sites that support the matching of requests from freight owners and carriers has been described. An analysis of the software used by transportation companies was conducted. Promising directions for improving the forwarding services process have been identified.

  1. PyElph - a software tool for gel images analysis and phylogenetics

    Directory of Open Access Journals (Sweden)

    Pavel Ana Brânduşa

    2012-01-01

    Full Text Available Abstract Background This paper presents PyElph, a software tool which automatically extracts data from gel images, computes the molecular weights of the analyzed molecules or fragments, compares DNA patterns which result from experiments with molecular genetic markers and, also, generates phylogenetic trees computed by five clustering methods, using the information extracted from the analyzed gel image. The software can be successfully used for population genetics, phylogenetics, taxonomic studies and other applications which require gel image analysis. Researchers and students working in molecular biology and genetics would benefit greatly from the proposed software because it is free, open source, easy to use, has a friendly Graphical User Interface and does not depend on specific image acquisition devices like other commercial programs with similar functionalities do. Results PyElph software tool is entirely implemented in Python which is a very popular programming language among the bioinformatics community. It provides a very friendly Graphical User Interface which was designed in six steps that gradually lead to the results. The user is guided through the following steps: image loading and preparation, lane detection, band detection, molecular weights computation based on a molecular weight marker, band matching and finally, the computation and visualization of phylogenetic trees. A strong point of the software is the visualization component for the processed data. The Graphical User Interface provides operations for image manipulation and highlights lanes, bands and band matching in the analyzed gel image. All the data and images generated in each step can be saved. The software has been tested on several DNA patterns obtained from experiments with different genetic markers. Examples of genetic markers which can be analyzed using PyElph are RFLP (Restriction Fragment Length Polymorphism, AFLP (Amplified Fragment Length Polymorphism, RAPD
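
    The molecular-weight step described above is typically a log-linear interpolation against the marker lane. A minimal sketch (hypothetical marker values, not PyElph's code):

        import numpy as np

        # Hypothetical marker lane: migration distance (px) vs fragment size (bp).
        marker_distance = np.array([50, 95, 140, 190, 240])
        marker_size_bp = np.array([10000, 5000, 2500, 1000, 500])

        def estimate_size(distance_px):
            # Migration is roughly linear in log(size), so interpolate the
            # distance against log10(size) and invert.
            log_size = np.interp(distance_px, marker_distance,
                                 np.log10(marker_size_bp))
            return 10 ** log_size

        for d in (120, 170, 210):
            print(f"band at {d} px -> ~{estimate_size(d):.0f} bp")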

  2. Software factory techniques applied to process control at CERN

    CERN Document Server

    Dutour, Mathias D

    2008-01-01

    The CERN Large Hadron Collider (LHC) requires constant monitoring and control of large quantities of parameters to guarantee operational conditions. For this purpose, a methodology called UNICOS (UNIfied Industrial COntrols Systems) has been implemented to standardize the design of process control applications. To further accelerate the development of these applications, we migrated our existing UNICOS tooling suite toward a software factory in charge of assembling project, domain and technical information seamlessly into deployable PLC (Programmable logic Controller) - SCADA (Supervisory Control And Data Acquisition) systems. This software factory delivers consistently high quality by reducing human error and repetitive tasks, and adapts to user specifications in a cost-efficient way. Hence, this production tool is designed to encapsulate and hide the PLC and SCADA target platforms, enabling the experts to focus on the business model rather than specific syntaxes and grammars. Based on industry standard software, ...

  3. Tools-4-Metatool (T4M): online suite of web-tools to process stoichiometric network analysis data from Metatool.

    Science.gov (United States)

    Xavier, Daniela; Vázquez, Sara; Higuera, Clara; Morán, Federico; Montero, Francisco

    2011-08-01

    Tools-4-Metatool (T4M) is a suite of web-tools, implemented in PERL, which analyses, parses, and manipulates files related to Metatool. Its main goal is to assist the work with Metatool. T4M has two major sets of tools: Analysis and Compare. Analysis visualizes the results of Metatool (convex basis, elementary flux modes, and enzyme subsets) and facilitates the study of metabolic networks. It is composed of five tools: MDigraph, MetaMatrix, CBGraph, EMGraph, and SortEM. Compare was developed to compare different Metatool results from different networks. This set consists of: Compara and ComparaSub, which compare network subsets providing outputs in different formats, and ComparaEM, which searches for identical elementary modes in two metabolic networks. The suite T4M also includes one script that generates Metatool input: CBasis2Metatool, based on a Metatool output file that is filtered by a list of convex basis' metabolites. Finally, the utility CheckMIn checks the consistency of the Metatool input file. T4M is available at http://solea.quim.ucm.es/t4m. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  4. Utilizing a Photo-Analysis Software for Content Identifying Method (CIM

    Directory of Open Access Journals (Sweden)

    Nejad Nasim Sahraei

    2015-01-01

    Full Text Available Content Identifying Methodology (CIM) was developed to measure public preferences in order to reveal the common characteristics of landscapes and aspects of underlying perceptions, including the individual's reactions to content and spatial configuration; it can therefore assist with the identification of factors that influence preference. Regarding the analysis of landscape photographs through CIM, several studies have utilized image analysis software, such as Adobe Photoshop, to identify the physical contents in the scenes. This study evaluates the public's preferences for aesthetic qualities of pedestrian bridges in urban areas through a photo-questionnaire survey, in which respondents evaluated images of pedestrian bridges in urban areas. Two groups of images were evaluated as the most and least preferred scenes, corresponding to the highest and lowest mean scores respectively. These two groups were analyzed by CIM and also evaluated based on the respondents' descriptions of each group to reveal the pattern of preferences and the factors that may affect them. Digimizer software was employed to triangulate the two approaches and to determine the role of these factors in people's preferences. This study introduces useful software for image analysis which can measure the physical contents and also their spatial organization in the scenes. According to the findings, Digimizer could be a useful tool in CIM approaches through preference studies that utilize photographs in place of the actual landscape to determine the most important factors in public preferences for pedestrian bridges in urban areas.

  5. Software Engineering Improvement Activities/Plan

    Science.gov (United States)

    2003-01-01

    bd Systems personnel accomplished the technical responsibilities for this reporting period, as planned. A close working relationship was maintained with personnel of the MSFC Avionics Department Software Group (ED14). Work accomplishments included development, evaluation, and enhancement of a software cost model, performing a literature search and evaluation of software tools available for code analysis and requirements analysis, and participating in other relevant software engineering activities. Monthly reports were submitted. This support was provided to the Flight Software Group/ED14 in accomplishing the software engineering improvement activities of the Marshall Space Flight Center (MSFC) Software Engineering Improvement Plan.

  6. Software for a measuring facility for activation analysis

    International Nuclear Information System (INIS)

    De Keyser, A.; De Roost, E.

    1985-01-01

    A software package has been developed for an APPLE P.C. The programs are intended to control an automated measuring station for photon activation analysis at GELINA, the linear accelerator of C.B.N.M. at Geel (Belgium). They allow the user to set up a measuring scheme, to execute it under computer control, to accumulate and store 2 K-spectra using a built-in ADC, and to output the results as listings, plots or evaluated reports

  7. HDX Workbench: Software for the Analysis of H/D Exchange MS Data

    Science.gov (United States)

    Pascal, Bruce D.; Willis, Scooter; Lauer, Janelle L.; Landgraf, Rachelle R.; West, Graham M.; Marciano, David; Novick, Scott; Goswami, Devrishi; Chalmers, Michael J.; Griffin, Patrick R.

    2012-09-01

    Hydrogen/deuterium exchange mass spectrometry (HDX-MS) is an established method for the interrogation of protein conformation and dynamics. While the data analysis challenge of HDX-MS has been addressed by a number of software packages, new computational tools are needed to keep pace with the improved methods and throughput of this technique. To address these needs, we report an integrated desktop program titled HDX Workbench, which facilitates automation, management, visualization, and statistical cross-comparison of large HDX data sets. Using the software, validated data analysis can be achieved at the rate of generation. The application is available at the project home page http://hdx.florida.scripps.edu.

  8. A Technique of Software Safety Analysis in the Design Phase for PLC Based Safety-Critical Systems

    International Nuclear Information System (INIS)

    Koo, Seo-Ryong; Kim, Chang-Hwoi

    2017-01-01

    The purpose of safety analysis, which is a method of identifying portions of a system that have the potential for unacceptable hazards, is firstly to encourage design changes that will reduce or eliminate hazards and, secondly, to conduct special analyses and tests that can provide increased confidence in especially vulnerable portions of the system. For the design and implementation phase of PLC-based systems, we proposed a technique for software design specification and analysis, and this technique enables us to generate software design specifications (SDSs) in nuclear fields. For the safety analysis in the design phase, we used the architecture design blocks of NuFDS to represent the architecture of the software. On the basis of the architecture design specification, we can directly generate the fault tree and then use the fault tree for qualitative analysis. Therefore, we proposed a technique of fault tree synthesis, along with a universal fault tree template for the architecture modules of nuclear software. Through the fault tree synthesis proposed in this work, users can use the architecture specification of the NuFDS approach to intuitively compose fault trees that help analyze the safety design features of software.
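
    To make the fault-tree idea concrete, here is a minimal sketch (illustrative gate logic and made-up probabilities, not the NuFDS templates) of how basic-event probabilities propagate through OR and AND gates to a top event:

        def or_gate(*p):
            # At least one input event occurs (assuming independence).
            q = 1.0
            for pi in p:
                q *= (1.0 - pi)
            return 1.0 - q

        def and_gate(*p):
            # All input events occur together (assuming independence).
            q = 1.0
            for pi in p:
                q *= pi
            return q

        # Hypothetical basic events for one architecture module.
        p_input_corrupt = 1e-4
        p_logic_defect = 5e-5
        p_watchdog_fails = 1e-3

        # Top event: erroneous output; a logic defect only propagates
        # if the watchdog also fails.
        top = or_gate(p_input_corrupt, and_gate(p_logic_defect, p_watchdog_fails))
        print(f"P(top event) = {top:.3e}")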

  9. Development of a gamma ray spectrometry software for neutron activation analysis using the open source concept

    International Nuclear Information System (INIS)

    Lucia, Silvio Rogerio de; Maihara, Vera Akiko; Menezes, Mario O. de

    2009-01-01

    In this work, a new software - SAANI (Instrumental Neutron Activation Analysis Software) - was developed and used for gamma ray spectra analysis in the Neutron Activation Laboratory (LAN) of the Nuclear and Energetic Research Institute (IPEN-CNEN/SP). The software was developed to completely replace the old one - VISPECT. Besides the visual improvement in the user interface, the new software allows the standardization of several procedures which are currently done in several different ways by each researcher, avoiding intermediate steps in the calculations. By using a modern programming language - Python - together with the graphical library Qt (by Trolltech), both multi-platform, the new software is able to run on Windows, Linux and other platforms. In addition to this, the new software has been designed to be extensible through plug-ins. In order to achieve the proposed initial scope, that is, to completely replace the old software, SAANI has undergone several different kinds of tests, using spectra from certified reference materials, standards and common spectra already analyzed by other software or used in international inter-comparisons. The results obtained by SAANI in all tests were considered very good. Some small discrepancies were found and, after careful search and analysis, their source was identified as an accuracy bug in the old software. Usability and robustness tests were conducted by installing SAANI on several laboratory computers and following its use during daily work. The results of these tests also indicated that SAANI was ready to be used by all researchers in LAN-IPEN. (author)

  10. Application of econometric and ecology analysis methods in physics software

    Science.gov (United States)

    Han, Min Cheol; Hoff, Gabriela; Kim, Chan Hyeong; Kim, Sung Hun; Grazia Pia, Maria; Ronchieri, Elisabetta; Saracco, Paolo

    2017-10-01

    Some data analysis methods typically used in econometric studies and in ecology have been evaluated and applied in physics software environments. They concern the evolution of observables through objective identification of change points and trends, and measurements of inequality, diversity and evenness across a data set. Within each analysis area, various statistical tests and measures have been examined. This conference paper gives a brief overview of some of these methods.
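
    As one concrete example of the inequality measures named above, the following minimal sketch (illustrative data) computes the Gini coefficient, here imagined applied to per-module contribution counts in a software project:

        import numpy as np

        def gini(values):
            v = np.sort(np.asarray(values, dtype=float))
            n = v.size
            cum = np.cumsum(v)
            # Equivalent to the normalised mean absolute difference.
            return (n + 1 - 2 * (cum / cum[-1]).sum()) / n

        print(f"equal shares : {gini([5, 5, 5, 5]):.2f}")    # -> 0.00
        print(f"concentrated : {gini([0, 0, 0, 20]):.2f}")   # -> 0.75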

  11. An instructional guide for leaf color analysis using digital imaging software

    Science.gov (United States)

    Paula F. Murakami; Michelle R. Turner; Abby K. van den Berg; Paul G. Schaberg

    2005-01-01

    Digital color analysis has become an increasingly popular and cost-effective method utilized by resource managers and scientists for evaluating foliar nutrition and health in response to environmental stresses. We developed and tested a new method of digital image analysis that uses Scion Image or NIH image public domain software to quantify leaf color. This...
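
    Digital leaf colour analysis of the kind described above usually reduces to per-pixel colour indices. A minimal sketch (a generic red-fraction index computed with Pillow and NumPy; the file name and index choice are assumptions, not the cited protocol):

        import numpy as np
        from PIL import Image

        # Hypothetical input image of a single leaf.
        img = np.asarray(Image.open("leaf.jpg").convert("RGB"), dtype=float)
        r, g, b = img[..., 0], img[..., 1], img[..., 2]

        # Red fraction per pixel; higher values indicate redder foliage.
        red_fraction = r / (r + g + b + 1e-9)
        print(f"mean red fraction: {red_fraction.mean():.3f}")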

  12. FieldTrip: Open source software for advanced analysis of MEG, EEG, and invasive electrophysiological data.

    Science.gov (United States)

    Oostenveld, Robert; Fries, Pascal; Maris, Eric; Schoffelen, Jan-Mathijs

    2011-01-01

    This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow experimental neuroscientists to analyze experimental data. It includes algorithms for simple and advanced analysis, such as time-frequency analysis using multitapers, source reconstruction using dipoles, distributed sources and beamformers, connectivity analysis, and nonparametric statistical permutation tests at the channel and source level. The implementation as toolbox allows the user to perform elaborate and structured analyses of large data sets using the MATLAB command line and batch scripting. Furthermore, users and developers can easily extend the functionality and implement new algorithms. The modular design facilitates the reuse in other software packages.

  13. ANALYSIS OF CELLULAR REACTION TO IFN-γ STIMULATION BY A SOFTWARE PACKAGE GeneExpressionAnalyser

    Directory of Open Access Journals (Sweden)

    A. V. Saetchnikov

    2014-01-01

    Full Text Available The software package GeneExpressionAnalyser for analysis of DNA microarray experimental data has been developed. The algorithms for data analysis and for identifying differentially expressed genes and biological functions of the cell are described. The efficiency of the developed package is tested on published experimental data devoted to the time-course study of the changes in the human melanoma cell under the influence of IFN-γ. The developed software has a number of advantages over existing software: it is free, has a simple and intuitive graphical interface, allows the analysis of different types of DNA microarrays, contains a set of methods for complete data analysis and performs effective gene annotation for a selected list of genes.

  14. ASDA - Advanced Suit Design Analyzer computer program

    Science.gov (United States)

    Bue, Grant C.; Conger, Bruce C.; Iovine, John V.; Chang, Chi-Min

    1992-01-01

    An ASDA model developed to evaluate the heat and mass transfer characteristics of advanced pressurized suit design concepts for low-pressure or vacuum planetary applications is presented. The model is based on a generalized 3-layer suit and uses the Systems Improved Numerical Differencing Analyzer '85 (SINDA '85) in conjunction with a 41-node FORTRAN routine. The latter simulates the transient heat transfer and respiratory processes of a human body in a suited environment. The user options for the suit encompass a liquid cooled garment, a removable jacket, a CO2/H2O permeable layer, and a phase change layer.

  15. ACEMAN (II): a PDP-11 software package for acoustic emission analysis

    International Nuclear Information System (INIS)

    Tobias, A.

    1976-01-01

    A powerful, but easy-to-use, software package (ACEMAN) for acoustic emission analysis has been developed at Berkeley Nuclear Laboratories. The system is based on a PDP-11 minicomputer with 24K of memory, an RK05 disk drive and a Tektronix 4010 graphics terminal. The operation of the system is described in detail in terms of the functions performed in response to the various command mnemonics. The ACEMAN software package offers many useful facilities not found on other acoustic emission monitoring systems. Its main features, many of which are unique, are summarised. The ACEMAN system automatically handles arrays of up to 12 sensors in real-time operation, during which data are acquired, analysed, stored on the computer disk for future analysis and displayed on the terminal if required. (author)

  16. Transient thermal-hydraulic characteristics analysis software for PWR nuclear power systems

    International Nuclear Information System (INIS)

    Wu Yingwei; Zhuang Chengjun; Su Guanghui; Qiu Suizheng

    2010-01-01

    A point reactor neutron kinetics model, a two-phase drift-flow U-tube steam generator model, an advanced non-equilibrium three-region pressurizer model, and a passive emergency core decay heat removal system model are adopted in this paper to develop a computerized analysis code for PWR transient thermal-hydraulic characteristics, written in Compaq Visual Fortran 6.0. Visual input, real-time processing and dynamic visualization output are achieved with Microsoft Visual Studio .NET. The reliability of the software has been verified against RELAP5, and the verification results show that the software offers high calculation precision and speed, a modern interface, rich functionality and good operability. The software was applied to calculate transient accident conditions for QSNP, and the analysis results are significant for practical engineering applications. (authors)
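
    The point reactor neutron kinetics model mentioned above is a small ODE system. A minimal sketch (one delayed-neutron group, explicit Euler integration, illustrative constants rather than the code's actual parameters):

        # Point kinetics with one delayed-neutron group.
        beta, lam, Lam = 0.0065, 0.08, 1e-4   # delayed fraction, decay const (1/s), generation time (s)
        rho = 0.001                           # hypothetical step reactivity insertion

        n = 1.0                               # relative power
        c = beta / (lam * Lam)                # precursor concentration at equilibrium
        dt = 1e-5
        for _ in range(int(1.0 / dt)):        # simulate 1 s
            dn = ((rho - beta) / Lam) * n + lam * c
            dc = (beta / Lam) * n - lam * c
            n += dn * dt
            c += dc * dt
        print(f"relative power after 1 s: {n:.2f}")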

  17. A new modification of summary-based analysis method for large software system testing

    Directory of Open Access Journals (Sweden)

    A. V. Sidorin

    2015-01-01

    Full Text Available Automated testing tools becoming common practice require thorough computer-aided testing of large software systems, including system inter-component interfaces. To achieve good coverage, one should overcome the scalability problems of different methods of analysis, which arise from the impossibility of analyzing all execution paths. The objective of this research is to build a method for inter-procedural analysis whose efficiency enables us to analyse large software systems (such as the Android OS codebase) as a whole in a reasonable time (no more than 4 hours). This article reviews existing methods of software analysis for detecting potential defects. It focuses on the symbolic execution method, since it is widely used both in static analysis of source code and in hybrid analysis of object files and intermediate representation (concolic testing). The method of symbolic execution involves separation of a set of input data values into equivalence classes while choosing an execution path. The paper also considers the advantages of this method and its shortcomings. One of the main scalability problems is related to inter-procedural analysis: analysis time grows rapidly if an inlining method is used. So this work proposes a summary-based analysis method to solve the scalability problems. Clang Static Analyzer, an open source static analyzer (a part of the LLVM project), has been chosen as a target system, allowing us to compare the performance of inlining and summary-based inter-procedural analysis. A mathematical model for preliminary estimations is described in order to identify possible factors of performance improvement.
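
    The scalability argument above (summaries versus inlining) can be caricatured in a few lines: analyse each callee once, cache a summary, and reuse it at every call site. A minimal sketch (the "analysis" here is a stand-in return-range computation, nothing like Clang Static Analyzer's machinery):

        summaries = {}  # function name -> cached summary

        def analyze_function(name, possible_returns):
            """Summarise a function by the range of values it may return."""
            if name in summaries:          # reuse: the key scalability win
                return summaries[name]
            summaries[name] = (min(possible_returns), max(possible_returns))
            return summaries[name]

        # Hypothetical callees and the values they may return.
        callees = {"f": [1, 5, 9], "g": [-3, 0, 2]}

        # Three call sites: f is analysed once and its summary is looked up
        # afterwards, instead of being re-analysed along every path.
        for call in ["f", "g", "f"]:
            print(call, "->", analyze_function(call, callees[call]))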

  18. Software libero per lo sviluppo in Africa orientale (Free software for development in East Africa)

    Directory of Open Access Journals (Sweden)

    Andrea Mandrici

    2010-03-01

    Full Text Available Free Software for development in East Africa: since 1999, Istituto Oikos (an Italian NGO), in partnership with Oikos East Africa (a Tanzanian NGO), has been carrying out integrated development projects in collaboration with local Government Authorities at Regional, District, and Village level. With the aim of ensuring the sustainability of interventions addressed at strengthening governance capacities and the use of IT technologies in rural areas, the projects have promoted the use of Free Software tools such as Ubuntu and Debian Linux as OS; QGIS and GRASS as GIS; SQLite/SpatiaLite and PostgreSQL/PostGIS as spatially-enabled RDBMS; and OpenOffice as productivity suite.

  20. Policy Process Editor for P3BM Software

    Science.gov (United States)

    James, Mark; Chang, Hsin-Ping; Chow, Edward T.; Crichton, Gerald A.

    2010-01-01

    A computer program enables generation, in the form of graphical representations of process flows with embedded natural-language policy statements, input to a suite of policy-, process-, and performance-based management (P3BM) software. This program (1) serves as an interface between users and the Hunter software, which translates the input into machine-readable form; and (2) enables users to initialize and monitor the policy-implementation process. This program provides an intuitive graphical interface for incorporating natural-language policy statements into business-process flow diagrams. Thus, the program enables users who dictate policies to intuitively embed their intended process flows as they state the policies, reducing the likelihood of errors and reducing the time between declaration and execution of policy.

  1. Spectrum analysis on quality requirements consideration in software design documents.

    Science.gov (United States)

    Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji

    2013-12-01

    Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as source code and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called "spectrum analysis for quality requirements" is applied not only to requirements specifications but also to design documents. The technique enables us to derive the spectrum of a document, in which quality requirements considerations are numerically represented. We can thus objectively identify whether the quality requirements considered in a requirements document are carried over into its design document. To validate the method, we applied it to commercial software systems with the help of a supporting tool, and we confirmed that the method worked well.
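
    A crude stand-in for the spectrum idea described above is to count quality-attribute keywords in both documents and compare the resulting vectors. A minimal sketch (the keyword lists and documents are invented; the paper's actual technique is more elaborate):

        import re

        QUALITY_TERMS = {
            "security":    ["encrypt", "authenticate", "access control"],
            "performance": ["latency", "throughput", "response time"],
            "reliability": ["failover", "recovery", "availability"],
        }

        def spectrum(text):
            text = text.lower()
            return {attr: sum(len(re.findall(term, text)) for term in terms)
                    for attr, terms in QUALITY_TERMS.items()}

        requirements = "The system shall encrypt all data and keep response time low."
        design = "Module A handles failover and recovery; latency targets are met by caching."

        print("requirements spectrum:", spectrum(requirements))
        print("design spectrum      :", spectrum(design))
        # An attribute present in the requirements spectrum but zero in the
        # design spectrum flags a consideration that may have been dropped.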

  2. JColorGrid: software for the visualization of biological measurements.

    Science.gov (United States)

    Joachimiak, Marcin P; Weisman, Jennifer L; May, Barnaby Ch

    2006-04-27

    Two-dimensional data colourings are an effective medium by which to represent three-dimensional data in two dimensions. Such "color-grid" representations have found increasing use in the biological sciences (e.g. microarray 'heat maps' and bioactivity data) as they are particularly suited to complex data sets and offer an alternative to the graphical representations included in traditional statistical software packages. The effectiveness of color-grids lies in their graphical design, which introduces a standard for customizable data representation. Currently, software applications capable of generating limited color-grid representations can be found only in advanced statistical packages or custom programs (e.g. micro-array analysis tools), often associated with steep learning curves and requiring expert knowledge. Here we describe JColorGrid, a Java library and platform independent application that renders color-grid graphics from data. The software can be used as a Java library, as a command-line application, and as a color-grid parameter interface and graphical viewer application. Data, titles, and data labels are input as tab-delimited text files or Microsoft Excel spreadsheets and the color-grid settings are specified through the graphical interface or a text configuration file. JColorGrid allows both user graphical data exploration as well as a means of automatically rendering color-grids from data as part of research pipelines. The program has been tested on Windows, Mac, and Linux operating systems, and the binary executables and source files are available for download at http://jcolorgrid.ucsf.edu.
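
    JColorGrid itself is Java, but the core idea, rendering a colour-grid from tab-delimited measurements, can be sketched in a few lines of Python (the input file name and layout are assumptions):

        import numpy as np
        import matplotlib.pyplot as plt

        # Hypothetical input: rows = samples, columns = measurements,
        # one header row, tab-delimited.
        data = np.loadtxt("measurements.tsv", delimiter="\t", skiprows=1)

        fig, ax = plt.subplots()
        grid = ax.imshow(data, cmap="viridis", aspect="auto")
        fig.colorbar(grid, ax=ax, label="measured value")
        ax.set_xlabel("measurement")
        ax.set_ylabel("sample")
        fig.savefig("colorgrid.png", dpi=150)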

  3. CAX a software for automated spectrum analysis

    International Nuclear Information System (INIS)

    Zahn, Guilherme S.; Genezini, Frederico A.

    2017-01-01

    In this work, the scripting capabilities of Genie-2000 were used to develop a software that automatically analyses all spectrum files in either Ortec's CHN or Canberra's MCA or CNF formats in a folder, generating two output files: a print-ready text file (.DAT) and a Comma-Separated Values (.CSV) file which can be easily imported in any major spreadsheet software. This software, named CAX ('Convert and Analyse for eXcel'), uses Genie-2000's functions to import spectrum files into Genie's native CNF format and analyze the converted spectra. The software can also, if requested, import energy and FWHM calibrations from a stored calibrated spectrum. The print-ready output file (.DAT) is generated by Genie-2000 using a customized script, and the CSV file is generated by a custom-built DAT2CSV software which generates a CSV file that complies with the Brazilian standards, with commas as decimal indicator and semicolons as field separators. This software is already used in the daily routines of IPEN's Neutron Activation Laboratory, greatly reducing the time required for sample analyses, as well as reducing the possibility of transcription errors. (author)
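
    The Brazilian CSV convention mentioned above (semicolons as field separators, commas as the decimal indicator) is easy to reproduce. A minimal sketch (illustrative nuclide rows, not the DAT2CSV code):

        import csv

        rows = [("Cs-137", 661.66, 0.851), ("Co-60", 1332.49, 0.9998)]

        with open("results.csv", "w", newline="", encoding="utf-8") as f:
            writer = csv.writer(f, delimiter=";")
            writer.writerow(["nuclide", "energy_keV", "intensity"])
            for nuclide, energy, intensity in rows:
                # Use a comma as the decimal indicator in numeric fields.
                writer.writerow([nuclide,
                                 f"{energy:.2f}".replace(".", ","),
                                 f"{intensity:.4f}".replace(".", ",")])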

  4. Model Based Analysis and Test Generation for Flight Software

    Science.gov (United States)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  5. Comparison of gamma knife validation film's analysis results of different film dose analysis software

    International Nuclear Information System (INIS)

    Cheng Xiaojun; Zhang Conghua; Liu Han; Dai Fuyou; Hu Chuanpeng; Liu Cheng; Yao Zhongfu

    2011-01-01

    Objective: To compare the analysis results obtained with different film dose analysis software packages for the same gamma knife, to analyze the reasons for the differences, and to explore measures and means for quality control and quality assurance when testing a gamma knife and analyzing the results. Methods: The Moon Deity gamma knife was tested with Kodak EDR2 film and the γ-Star gamma knife with GAFCHROMIC® EBT film, respectively. All validation films were scanned into an image format suitable for the dose analysis software with an EPSON PERFECTION V750 PRO scanner. Images from the Moon Deity gamma knife were then analyzed with Robot Knife Adjuvant 1.09 and Fas-09 1.0, and images from the γ-Star gamma knife with Fas-09 and MATLAB 7.0. Results: There was no significant difference between Robot Knife Adjuvant and Fas-09 in the maximum deviation of the radiation field size (Full Width at Half Maximum, FWHM) from its nominal value for the Moon Deity gamma knife (t=-2.133, P>0.05). The analysis of the penumbra width of the radiation field for collimators of different sizes showed significant differences (t=-8.154, P<0.05). There was no significant difference between Fas-09 and MATLAB in the maximum deviation of FWHM from its nominal value for the γ-Star gamma knife (t=-1.384, P>0.05). However, following national standards, analysis of the φ4 mm collimator gave different outcomes with the two software packages: the result from Fas-09 was not qualified while that from MATLAB was qualified. The analysis of the penumbra width of the radiation field for collimators of different sizes indicated significant differences (t=3.074, P<0.05). The images were also processed with Fas-09; comparison of the images before and after processing indicated no significant difference in the maximum deviation of FWHM from its nominal value (t=0.647, P>0.05), and the analysis of the penumbra width of the radiation field indicates that there is
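
    The FWHM compared throughout the study above is straightforward to extract from a scanned dose profile. A minimal sketch (a Gaussian stand-in profile, not film data):

        import numpy as np

        # Hypothetical 1-D dose profile across the radiation field.
        x = np.linspace(-10, 10, 401)              # position in mm
        profile = np.exp(-0.5 * (x / 2.0) ** 2)    # Gaussian stand-in, sigma = 2 mm

        half_max = profile.max() / 2.0
        above = np.where(profile >= half_max)[0]
        # Width between the outermost samples at or above half maximum.
        fwhm = x[above[-1]] - x[above[0]]
        print(f"FWHM = {fwhm:.2f} mm")  # Gaussian: FWHM = 2.355 * sigma = 4.71 mm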

  6. Optimizing the physical ergonomics indices for the use of partial pressure suits.

    Science.gov (United States)

    Ding, Li; Li, Xianxue; Hedge, Alan; Hu, Huimin; Feathers, David; Qin, Zhifeng; Xiao, Huajun; Xue, Lihao; Zhou, Qianxiang

    2015-03-01

    This study developed an ergonomic evaluation system for the design of high-altitude partial pressure suits (PPSs). A total of twenty-one Chinese males participated in the experiment, in which three types of ergonomics indices (manipulative mission, operational reach and operational strength) were studied using a three-dimensional video-based motion capture system, a target-pointing board, a hand dynamometer, and a step-tread apparatus. In total, 36 ergonomics indices were evaluated and optimized using regression and fitting analysis. Some indices that were found to be linearly related and redundant were removed from the study. An optimal ergonomics index system was established that can be used to conveniently and quickly evaluate the performance of different pressurized/non-pressurized suit designs. The resulting ergonomics index system will provide a theoretical basis and practical guidance for mission planners, suit designers and engineers to design equipment for human use, and to aid in assessing partial pressure suits. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.
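
    The removal of linearly related indices can be illustrated with a hedged sketch: compute pairwise correlations between index scores across subjects and drop one member of each near-collinear pair. The threshold, sample data and index names below are invented for illustration; the study's actual regression and fitting analysis is not detailed in the abstract.

        # Drop indices that are almost perfectly linearly related to ones
        # already kept. Threshold and data are illustrative assumptions.
        import numpy as np

        def prune_redundant_indices(scores, names, r_thresh=0.95):
            """scores: (n_subjects, n_indices) matrix of measurements."""
            corr = np.corrcoef(scores, rowvar=False)
            keep = []
            for j in range(len(names)):
                # Keep index j only if it is not collinear with a kept index.
                if all(abs(corr[j, k]) < r_thresh for k in keep):
                    keep.append(j)
            return [names[j] for j in keep]

        rng = np.random.default_rng(0)
        base = rng.normal(size=(21, 3))          # 21 subjects, 3 base indices
        duplicate = base[:, :1] * 1.8 + 0.1      # exact linear copy of index 0
        data = np.hstack([base, duplicate])
        print(prune_redundant_indices(data, ["reach", "strength", "mission", "reach_dup"]))
        # ['reach', 'strength', 'mission']  -- the redundant copy is removed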

  7. Software engineering processes principles and applications

    CERN Document Server

    Wang, Yingxu

    2000-01-01

    Fundamentals of the Software Engineering Process; Introduction; A Unified Framework of the Software Engineering Process; Process Algebra; Process-Based Software Engineering; Software Engineering Process System Modeling; The CMM Model; The ISO 9001 Model; The BOOTSTRAP Model; The ISO/IEC 15504 (SPICE) Model; The Software Engineering Process Reference Model: SEPRM; Software Engineering Process System Analysis; Benchmarking the SEPRM Processes; Comparative Analysis of Current Process Models; Transformation of Capability Levels Between Current Process Models; Software Engineering Process Establishment; Software Process Establish

  8. Lightweight and Continuous Architectural Software Quality Assurance using the aSQA Technique

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Hansen, Klaus Marius; Lindstrøm, Bo

    2010-01-01

    In this paper, we present a novel technique for assessing and prioritizing architectural quality in large-scale software development projects. The technique can be applied with relatively little effort by software architects and is thus suited for agile development in which quality attributes can be assessed and prioritized, e.g., within each development sprint. We outline the processes and metrics embodied in the technique, and report initial experiences on the benefits and liabilities. In conclusion, the technique is considered valuable and a viable tool, and has benefits in an architectural...

  9. Validation of a Video Analysis Software Package for Quantifying Movement Velocity in Resistance Exercises.

    Science.gov (United States)

    Sañudo, Borja; Rueda, David; Pozo-Cruz, Borja Del; de Hoyo, Moisés; Carrasco, Luis

    2016-10-01

    Sañudo, B, Rueda, D, del Pozo-Cruz, B, de Hoyo, M, and Carrasco, L. Validation of a video analysis software package for quantifying movement velocity in resistance exercises. J Strength Cond Res 30(10): 2934-2941, 2016. The aim of this study was to establish the validity of a video analysis software package in measuring mean propulsive velocity (MPV) and the maximal velocity during bench press. Twenty-one healthy males (21 ± 1 year) with weight training experience were recruited, and the MPV and the maximal velocity of the concentric phase (Vmax) were compared with a linear position transducer system during a standard bench press exercise. Participants performed a 1 repetition maximum test using the supine bench press exercise. The testing procedures involved the simultaneous assessment of bench press propulsive velocity using 2 kinematic (linear position transducer and semi-automated tracking software) systems. High Pearson's correlation coefficients for MPV and Vmax between both devices (r = 0.473 to 0.993) were observed. The intraclass correlation coefficients for barbell velocity data and the kinematic data obtained from video analysis were high (>0.79). In addition, the low coefficients of variation indicate that measurements had low variability. Finally, Bland-Altman plots with the limits of agreement of the MPV and Vmax with different loads showed a negative trend, which indicated that the video analysis had higher values than the linear transducer. In conclusion, this study has demonstrated that the software used for the video analysis was an easy-to-use and cost-effective tool with a very high degree of concurrent validity. This software can be used to evaluate changes in velocity of training load in resistance training, which may be important for the prescription and monitoring of training programmes.
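
    The Bland-Altman analysis reported above follows a standard recipe: the bias is the mean of the paired differences between methods, and the limits of agreement are bias ± 1.96 SD of those differences. A minimal sketch with synthetic velocity pairs in place of the study's data:

        # Bland-Altman bias and limits of agreement for two velocity-measuring
        # methods. The paired readings below are synthetic assumptions.
        import numpy as np

        def bland_altman_limits(a, b):
            """Return (bias, lower limit, upper limit) of agreement."""
            diff = np.asarray(a) - np.asarray(b)
            bias = diff.mean()
            sd = diff.std(ddof=1)
            return bias, bias - 1.96 * sd, bias + 1.96 * sd

        rng = np.random.default_rng(1)
        transducer = rng.uniform(0.3, 1.2, size=30)            # m/s, synthetic
        video = transducer + rng.normal(0.02, 0.03, size=30)   # video reads slightly high
        bias, lo, hi = bland_altman_limits(video, transducer)
        print(f"bias={bias:.3f} m/s, limits of agreement=[{lo:.3f}, {hi:.3f}] m/s")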

  10. Use MACES IVA Suit for EVA Mobility Evaluations

    Science.gov (United States)

    Watson, Richard D.

    2014-01-01

    The use of an Intra-Vehicular Activity (IVA) suit for a spacewalk or Extra-Vehicular Activity (EVA) was evaluated for mobility and usability in the Neutral Buoyancy Lab (NBL) environment. The Space Shuttle Advanced Crew Escape Suit (ACES) has been modified (MACES) to integrate with the Orion spacecraft. The first several missions of the Orion MPCV spacecraft will not have mass available to carry an EVA-specific suit, so any EVA required will have to be performed in the MACES. Since the MACES was not designed with EVA in mind, it was unknown what mobility the suit would provide for an EVA or whether a person could perform useful tasks for an extended time inside the pressurized suit. The suit was evaluated in multiple NBL runs by a variety of subjects, including crewmembers with significant EVA experience. The functional mobility tasks performed included translation, body positioning, carrying tools, body stabilization, equipment handling, and use of tools. Hardware configurations included with and without the TMG, the suit with IVA gloves, and the suit with EVA gloves. Most tasks were completed on ISS mockups with existing EVA tools. Some limited tasks were completed with prototype tools on a simulated rocky surface. Major findings include: demonstration of the ability to weigh out the suit, understanding of the need to have subjects perform multiple runs prior to giving feedback, determination of critical sizing factors, and the need for adjustment of the suit work envelope. The early testing has demonstrated the feasibility of EVAs of limited duration and scope. Further testing is required with more flight-like tasking and constraints to validate these early results. If the suit is used for EVA, it will require mission-specific modifications for umbilical management or PLSS integration, safety tether attachment, and tool interfaces. These evaluations are continuing through calendar year 2014.

  11. Morphing: A Novel Approach to Astronaut Suit Sizing

    Science.gov (United States)

    Margerum, Sarah; Clowers, Kurt; Rajulu, Sudhakar

    2006-01-01

    The fitting of a spacesuit to an astronaut is an iterative process consisting of two parts. The first uses anthropometric data to provide an approximation of the suit components that will fit the astronaut. The second part is the subjective fitting, where small adjustments are made based on the astronaut's preference. By providing a better approximation of the correct suit components, the entire fit process time can be reduced significantly. The goals of this project are twofold: (1) to evaluate the effectiveness of the existing sizing algorithm for the Mark III Hybrid suit and (2) to determine what additional components are needed in order to provide adequate sizing for the existing astronaut population. A single subject was scanned using a 3D whole-body scanner (VITUS 3D) in the Mark III suit in eight different poses, and four subjects in minimal clothing were also scanned in similar poses. The 3D external body scans of the suit and the subject are overlaid and visually aligned in a customized MATLAB program. The suit components were contracted or expanded linearly along the subject's limbs to match the subject's segmental lengths. Two independent measures were obtained from the morphing program on four subjects and compared with the existing sizing information. Two of the four subjects were in correspondence with the sizing algorithm and morphing results. The morphing outcome for a third subject, incompatible with the suit, suggested that an additional arm element at least 6 inches smaller than the existing smallest suit component would need to be acquired. The morphing result of the fourth subject, deemed incompatible with the suit using the sizing algorithm, indicated a different suit configuration which would be compatible. This configuration matched the existing suit fit check data.
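
    The linear contraction/expansion step can be pictured as an axis-aligned scaling of a suit component's 3D points about a joint. The function and geometry below are illustrative assumptions, not the customized MATLAB program used in the study.

        # Scale a suit component's points along the limb axis so that its
        # length matches the subject's segment length.
        import numpy as np

        def scale_segment(points, joint, axis, suit_len, subject_len):
            """Stretch/contract `points` (N, 3) along the unit `axis`,
            anchored at `joint`, by the subject-to-suit length ratio."""
            axis = axis / np.linalg.norm(axis)
            along = (points - joint) @ axis          # signed distance along the limb
            correction = (subject_len / suit_len - 1.0) * along
            return points + np.outer(correction, axis)

        arm = np.array([[0.00, 0.00, 0.00],
                        [0.15, 0.02, 0.00],
                        [0.30, 0.00, 0.01]])         # metres, synthetic
        scaled = scale_segment(arm, joint=np.zeros(3), axis=np.array([1.0, 0.0, 0.0]),
                               suit_len=0.30, subject_len=0.27)
        print(scaled)                                # x-coordinates contract by 10%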

  12. Overview of ACTYS project on development of indigenous state-of-the-art code suites for nuclear activation analysis

    International Nuclear Information System (INIS)

    Subhash, P.V.; Tadepalli, Sai Chaitanya; Deshpande, Shishir P.; Kanth, Priti; Srinivasan, R.

    2017-01-01

    Rigorous activation calculations are warranted for the safer and more efficient design of future fusion machines. Suitable activation codes, which yield accurate results with fast performance yet include all fusion-relevant reactions, are a prerequisite. To meet these requirements, an indigenous project called ACTYS-Project was initiated, and as a result four state-of-the-art codes have been developed so far. The goal of this project is to develop indigenous state-of-the-art code suites for nuclear activation analysis
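
    As background (standard textbook material, not taken from the abstract), the core relation such activation codes evaluate is the buildup of a product nuclide during irradiation and its subsequent decay:

        % Activity of a product nuclide after an irradiation of duration t_i
        % followed by a cooling time t_c (single-reaction, no-burnup case):
        \[
          A(t_c) = N_0 \, \sigma \, \phi \, \bigl(1 - e^{-\lambda t_i}\bigr) \, e^{-\lambda t_c}
        \]
        % N_0: number of target atoms, \sigma: reaction cross section,
        % \phi: neutron flux, \lambda: decay constant of the product.

    Full activation codes generalize this single-reaction form to coupled reaction and decay chains under a time-varying flux, which is what makes rigorous activation calculations computationally demanding.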

  13. Software factory techniques applied to process control at CERN

    OpenAIRE

    Dutour, Mathias D

    2007-01-01

    The CERN Large Hadron Collider (LHC) requires constant monitoring and control of large quantities of parameters to guarantee operational conditions. For this purpose, a methodology called UNICOS (UNIfied Industrial COntrols Systems) has been implemented to standardize the design of process control applications. To further accelerate the development of these applications, we migrated our existing UNICOS tooling suite toward a software factory in charge of assembling project, domain and technical inf...

  14. StOCNET : Software for the statistical analysis of social networks

    NARCIS (Netherlands)

    Huisman, M.; van Duijn, M.A.J.

    2003-01-01

    StOCNET is an open software system in a Windows environment for the advanced statistical analysis of social networks. It provides a platform to make a number of recently developed, and therefore not (yet) standard, statistical methods available to a wider audience. A flexible user interface utilizing

  15. FMT (Flight Software Memory Tracker) For Cassini Spacecraft-Software Engineering Using JAVA

    Science.gov (United States)

    Kan, Edwin P.; Uffelman, Hal; Wax, Allan H.

    1997-01-01

    The software engineering design of the Flight Software Memory Tracker (FMT) Tool is discussed in this paper. FMT is a ground analysis software set, consisting of utilities and procedures, designed to track the flight software, i.e., images of the memory load and updatable parameters of the computers on board the Cassini spacecraft. FMT is implemented in Java.

  16. Software Piracy in Research: A Moral Analysis.

    Science.gov (United States)

    Santillanes, Gary; Felder, Ryan Marshall

    2015-08-01

    Researchers in virtually every discipline rely on sophisticated proprietary software for their work. However, some researchers are unable to afford the licenses and instead procure the software illegally. We discuss the prohibition of software piracy by intellectual property laws, and argue that the moral basis for the copyright law offers the possibility of cases where software piracy may be morally justified. The ethics codes that scientific institutions abide by are informed by a rule-consequentialist logic: by preserving personal rights to authored works, people able to do so will be incentivized to create. By showing that the law has this rule-consequentialist grounding, we suggest that scientists who blindly adopt their institutional ethics codes will commit themselves to accepting that software piracy could be morally justified, in some cases. We hope that this conclusion will spark debate over important tensions between ethics codes, copyright law, and the underlying moral basis for these regulations. We conclude by offering practical solutions (other than piracy) for researchers.

  17. [Development and practice evaluation of blood acid-base imbalance analysis software].

    Science.gov (United States)

    Chen, Bo; Huang, Haiying; Zhou, Qiang; Peng, Shan; Jia, Hongyu; Ji, Tianxing

    2014-11-01

    To develop computer software for blood gas and acid-base imbalance analysis that systematically, rapidly, accurately and automatically determines the type of acid-base imbalance, and to evaluate its clinical application. A computer-aided diagnostic software package for the judgment of acid-base balance was developed using the VBA programming language. The clinical data of 220 patients admitted to the Second Affiliated Hospital of Guangzhou Medical University were retrospectively analyzed. Arterial blood gas values [pH value, HCO(3)(-), arterial partial pressure of carbon dioxide (PaCO₂)] and electrolyte data (Na⁺ and Cl⁻) were collected. The data were entered into the software for the judgment of acid-base imbalances. At the same time, the type of acid-base imbalance was determined manually from the same data using the Henderson-Hasselbalch (H-H) compensation formula. The consistency of the judgment results from the software and from manual calculation was evaluated, and the judgment times of the two methods were compared. The clinical diagnoses of the types of acid-base imbalance for the 220 patients were: 65 normal cases, 90 cases of simple type, 41 cases of mixed type, and 24 cases of triplex type. Compared with manual calculation, the accuracy of the software's judgment was 100% for the normal and triplex types, 98.9% for the simple type and 78.0% for the mixed type, with a total accuracy of 95.5%. The Kappa value between the software and manual judgments was 0.935, P=0.000, demonstrating very good consistency. The time for the software to determine acid-base imbalances was significantly shorter than that of manual judgment (seconds: 18.14 ± 3.80 vs. 43.79 ± 23.86, t=7.466, P=0.000), so the software method was much faster than the manual method. Software judgment can replace manual judgment with the characteristics of being rapid, accurate and convenient; it can improve the work efficiency and quality of clinical doctors and has great
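
    The kind of rule such software automates can be sketched, in much simplified form, for a single primary disorder. The thresholds and Winter's formula below are standard textbook values; this is not the authors' VBA implementation, which also distinguishes mixed and triplex types.

        # Simplified primary acid-base classification from pH, PaCO2 (mmHg)
        # and HCO3- (mmol/L). Textbook thresholds; illustrative only.
        def classify(ph, paco2, hco3):
            if ph < 7.35:
                if hco3 < 22:
                    expected_paco2 = 1.5 * hco3 + 8            # Winter's formula
                    if abs(paco2 - expected_paco2) <= 2:
                        return "metabolic acidosis, appropriately compensated"
                    return "metabolic acidosis plus a respiratory disturbance"
                if paco2 > 45:
                    return "respiratory acidosis"
            elif ph > 7.45:
                if hco3 > 26:
                    return "metabolic alkalosis"
                if paco2 < 35:
                    return "respiratory alkalosis"
            return "no simple primary disorder identified"

        print(classify(ph=7.28, paco2=28.0, hco3=13.0))  # synthetic example
        # -> metabolic acidosis, appropriately compensated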

  18. Image processing and analysis software development

    International Nuclear Information System (INIS)

    Shahnaz, R.

    1999-01-01

    The work presented in this project is aimed at developing software, 'IMAGE GALLERY', to investigate various image processing and analysis techniques. The work was divided into two parts, namely image processing techniques and pattern recognition, the latter comprising character and face recognition. Various image enhancement techniques including negative imaging, contrast stretching, dynamic-range compression, neon, diffuse, emboss, etc. have been studied. Segmentation techniques including point detection, line detection and edge detection have been studied, and some of the smoothing and sharpening filters have been investigated. All these imaging techniques have been implemented in a window-based computer program written in Visual Basic. Neural network techniques based on the Perceptron model have been applied for face and character recognition. (author)
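
    One of the enhancement techniques listed, contrast stretching, is simple enough to show in a few lines. The tiny input patch is a synthetic assumption, and Python is used here for brevity although the original program was written in Visual Basic.

        # Linear contrast stretching: map the occupied intensity range of an
        # 8-bit grayscale image onto the full 0-255 span.
        import numpy as np

        def contrast_stretch(img):
            lo, hi = int(img.min()), int(img.max())
            if hi == lo:
                return np.zeros_like(img)          # flat image: nothing to stretch
            out = (img.astype(float) - lo) * 255.0 / (hi - lo)
            return out.astype(np.uint8)

        patch = np.array([[90, 100],
                          [110, 120]], dtype=np.uint8)   # low-contrast patch
        print(contrast_stretch(patch))
        # [[  0  85]
        #  [170 255]]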

  19. LHCb - Automated Testing Infrastructure for the Software Framework Gaudi

    CERN Multimedia

    Clemencic, M

    2009-01-01

    An extensive test suite is the first step towards the delivery of robust software, but it is not always easy to implement, especially in projects with many developers. An easy-to-use and flexible infrastructure for writing and executing tests reduces the work each developer has to do to instrument his packages with tests. At the same time, the infrastructure gives the same look and feel to the tests and allows automated execution of the test suite. For Gaudi, we decided to develop the testing infrastructure on top of the free tool QMTest, already used in the LCG Application Area for the routine tests run in the nightly build system. The high flexibility of QMTest allowed us to integrate it into the Gaudi package structure. A specialized test class and some utility functions have been developed to simplify the definition of a test for a Gaudi-based application. Thanks to the testing infrastructure described here, we managed to quickly extend the standard Gaudi test suite and add tests to the main LHCb appli...
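
    A generic sketch (not the actual QMTest/Gaudi integration) of the kind of utility such an infrastructure provides: run an application with a given options file, then check its exit code and a few reference strings in the output. The file names in the usage comment are hypothetical.

        # Generic run-and-check test helper; illustrative only.
        import subprocess

        def run_and_check(executable, options, expected_strings, timeout=300):
            """Run `executable options`, assert success and expected output."""
            proc = subprocess.run([executable, options], capture_output=True,
                                  text=True, timeout=timeout)
            assert proc.returncode == 0, f"exit code {proc.returncode}"
            for fragment in expected_strings:
                assert fragment in proc.stdout, f"missing output: {fragment}"

        # Hypothetical usage for a Gaudi-based job:
        # run_and_check("gaudirun.py", "MyTest.opts",
        #               ["Application Manager Terminated successfully"])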

  20. Software engineering the current practice

    CERN Document Server

    Rajlich, Vaclav

    2011-01-01

    INTRODUCTION: History of Software Engineering; Software Properties; Origins of Software; Birth of Software Engineering; Third Paradigm: Iterative Approach; Software Life Span Models; Staged Model; Variants of Staged Model; Software Technologies: Programming Languages and Compilers; Object-Oriented Technology; Version Control System; Software Models; Class Diagrams; UML Activity Diagrams; Class Dependency Graphs and Contracts. SOFTWARE CHANGE: Introduction to Software Change; Characteristics of Software Change; Phases of Software Change; Requirements and Their Elicitation; Requirements Analysis and Change Initiation; Concepts and Concept