WorldWideScience

Sample records for analysis software suite

  1. The ESA's Space Trajectory Analysis software suite

    Science.gov (United States)

    Ortega, Guillermo

    The European Space Agency (ESA) initiated in 2005 an internal activity to develop an open source software suite involving university science departments and research institutions all over the world. This project is called the "Space Trajectory Analysis" or STA. This article describes the birth of STA and its present configuration. One of the STA aims is to promote the exchange of technical ideas, and to raise knowledge and competence in the areas of applied mathematics, space engineering, and informatics at university level. Conceived as a research and education tool to support the analysis phase of a space mission, STA is able to visualize a wide range of space trajectories, including ascent, re-entry, descent and landing trajectories, orbits around planets and moons, interplanetary trajectories, rendezvous trajectories, etc. The article explains that the STA project is an original idea of the Technical Directorate of ESA. It was born in August 2005 to provide a framework for astrodynamics research at university level. As research and education software applicable to academia, a number of universities support this development by joining ESA in leading it. The partnership between ESA and the universities is expressed in the STA Steering Board: together with ESA, each university has a chair on the board whose tasks are to develop, control, promote, maintain, and expand the software suite. The article describes that STA provides calculations in the fields of spacecraft tracking, attitude analysis, coverage and visibility analysis, orbit determination, position and velocity of solar system bodies, etc. STA implements the concept of a "space scenario" composed of solar system bodies, spacecraft, ground stations, pads, etc. It is able to propagate the orbit of a spacecraft using the orbital propagators included. STA is able to compute communication links between objects of a scenario (coverage, line of sight), and to represent the trajectory computations and

  2. Orbit Software Suite

    Science.gov (United States)

    Osgood, Cathy; Williams, Kevin; Gentry, Philip; Brownfield, Dana; Hallstrom, John; Stuit, Tim

    2012-01-01

    Orbit Software Suite is used to support a variety of NASA/DM mission planning and analysis activities on the IPS (Integrated Planning System) platform. The suite of Orbit software tools (Orbit Design and Orbit Dynamics) resides on IPS/Linux workstations, and is used to perform mission design and analysis tasks for trajectory/launch window, rendezvous, and proximity operations flight segments. A list of tools in Orbit Software Suite represents tool versions established during or after the Equipment Rehost-3 Project.

  3. eXtended CASA Line Analysis Software Suite (XCLASS)

    CERN Document Server

    Möller, T; Schilke, P

    2015-01-01

    The eXtended CASA Line Analysis Software Suite (XCLASS) is a toolbox for the Common Astronomy Software Applications package (CASA) containing new functions for modeling interferometric and single dish data. Among the tools is the myXCLASS program, which calculates synthetic spectra by solving the radiative transfer equation for an isothermal object in one dimension, taking finite source size and dust attenuation into account as well. Molecular data required by the myXCLASS program are taken from an embedded SQLite3 database containing entries from the Cologne Database for Molecular Spectroscopy (CDMS) and JPL using the Virtual Atomic and Molecular Data Center (VAMDC) portal. Additionally, the toolbox provides an interface for the model optimizer package Modeling and Analysis Generic Interface for eXternal numerical codes (MAGIX), which helps to find the best description of observational data using myXCLASS (or another external model program), i.e., finding the parameter set that most closely reproduces t...
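
    The core computation mentioned here has a compact textbook form: for an isothermal slab, the emergent brightness is (J(T_ex) - J(T_bg))·(1 - e^(-τ(ν))). The sketch below evaluates that 1D solution for a single Gaussian line; it is a minimal illustration, not myXCLASS code, and all names and parameter values are invented.

    ```python
    # Minimal sketch of the 1D isothermal radiative-transfer solution that
    # codes of this kind evaluate; function names, the Gaussian opacity
    # profile and all parameter values are illustrative assumptions.
    import numpy as np

    H = 6.626e-34   # Planck constant [J s]
    K = 1.381e-23   # Boltzmann constant [J/K]

    def planck_temp(nu, temp):
        """Planck brightness temperature J_nu(T) in kelvin."""
        return (H * nu / K) / np.expm1(H * nu / (K * temp))

    def line_brightness(nu, t_ex, t_bg, tau_peak, nu0, dnu):
        """Isothermal slab: (J(T_ex) - J(T_bg)) * (1 - exp(-tau(nu)))."""
        tau = tau_peak * np.exp(-0.5 * ((nu - nu0) / dnu) ** 2)  # Gaussian opacity
        return (planck_temp(nu, t_ex) - planck_temp(nu, t_bg)) * -np.expm1(-tau)

    nu = np.linspace(99.9e9, 100.1e9, 400)   # 200 MHz window around 100 GHz
    spectrum = line_brightness(nu, t_ex=150.0, t_bg=2.73, tau_peak=1.2,
                               nu0=100.0e9, dnu=20e6)
    print(f"peak brightness: {spectrum.max():.2f} K")
    ```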

  4. Design and Implementation of Convex Analysis of Mixtures Software Suite

    OpenAIRE

    Meng, Fan

    2012-01-01

    Various convex analysis of mixtures (CAM) based algorithms have been developed to address real-world blind source separation (BSS) problems, and have been shown in previous papers to perform well. This thesis reports the implementation of a comprehensive software package, CAM-Java, which contains three different CAM-based algorithms: CAM compartment modeling (CAM-CM), CAM non-negative independent component analysis (CAM-nICA), and CAM non-negative well-grounded component analysis (CAM-nWCA). The imp...

  5. Navigation/Prop Software Suite

    Science.gov (United States)

    Bruchmiller, Tomas; Tran, Sanh; Lee, Mathew; Bucker, Scott; Bupane, Catherine; Bennett, Charles; Cantu, Sergio; Kwong, Ping; Propst, Carolyn

    2012-01-01

    Navigation (Nav)/Prop software is used to support shuttle mission analysis, production, and some operations tasks. The Nav/Prop suite, containing configuration items (CIs), resides on IPS/Linux workstations. It features lifecycle documents and data files used for shuttle navigation and propellant analysis for all flight segments. This suite also includes trajectory server, archive server, and RAT software residing on MCC/Linux workstations. Navigation/Prop represents tool versions established during or after IPS Equipment Rehost-3 or after the MCC Rehost.

  6. A Comprehensive Software Suite for the Analysis of cDNAs

    Institute of Scientific and Technical Information of China (English)

    Kazuharu Arakawa; Haruo Suzuki; Kosuke Fujishima; Kenji Fujimoto; Sho Ueda; Motomu Matsui; Masaru Tomita

    2005-01-01

    We have developed a comprehensive software suite for bioinformatics research of cDNAs; it is aimed at rapid characterization of the features of genes and the proteins they encode. Methods implemented include the detection of translation initiation and termination signals, statistical analysis of codon usage, comparative study of amino acid composition, comparative modeling of the structures of product proteins, prediction of alternative splice forms, and metabolic pathway reconstruction. The software package is freely available under the GNU General Public License at http://www.g-language.org/data/cdna/.

  7. IMAGE Software Suite

    Science.gov (United States)

    Gallagher, Dennis L.; Rose, M. Franklin (Technical Monitor)

    2000-01-01

    The IMAGE Mission is generating a truly unique set of magnetospheric measurements through a first-of-its-kind complement of remote, global observations. These data are being distributed in the Universal Data Format (UDF), which consists of data, calibration, and documentation. This is an open dataset, available to all by request to the National Space Science Data Center (NSSDC) at NASA Goddard Space Flight Center. Browse data, consisting of summary observations, are also available through the NSSDC in the Common Data Format (CDF), together with graphic representations of the browse data. Access to the browse data can be achieved through the NSSDC CDAWeb services or by use of NSSDC-provided software tools. This presentation documents the software tools, provided by the IMAGE team, for use in viewing and analyzing the UDF telemetry data. Like the IMAGE data, these tools are openly available. What these tools can do, how they can be obtained, and how they are expected to evolve will be discussed.

  8. Developing a Comprehensive Software Suite for Advanced Reactor Performance and Safety Analysis

    International Nuclear Information System (INIS)

    This paper provides an introduction to the reactor analysis capabilities of the nuclear power reactor simulation tools that are being developed as part of the US Department of Energy’s Nuclear Energy Advanced Modeling and Simulation (NEAMS) Toolkit. The NEAMS Toolkit is an integrated suite of multiphysics simulation tools that leverage high performance computing to reduce uncertainty in the prediction of the performance and safety of advanced reactor and fuel designs. The toolkit effort is composed of two major components, the fuels product line, which provides tools for fuel performance analysis, and the reactor product line, which provides tools for reactor performance and safety analysis. This paper presents an overview of the NEAMS reactor product line development effort. (author)

  9. ORBS, ORCS, OACS, a Software Suite for Data Reduction and Analysis of the Hyperspectral Imagers SITELLE and SpIOMM

    Science.gov (United States)

    Martin, T.; Drissen, L.; Joncas, G.

    2015-09-01

    SITELLE (installed in 2015 at the Canada-France-Hawaii Telescope) and SpIOMM (a prototype attached to the Observatoire du Mont-Mégantic) are the first Imaging Fourier Transform Spectrometers (IFTS) capable of obtaining a hyperspectral data cube which samples a 12 arcminute field of view into four million visible spectra. The result of each observation is made up of two interferometric data cubes which need to be merged, corrected, transformed and calibrated in order to get a spectral cube of the observed region ready to be analysed. ORBS is a fully automatic data reduction software that has been entirely designed for this purpose. The data size (up to 68 GB for the larger science cases) and the computational needs are challenging, and the highly parallelized object-oriented architecture of ORBS reflects the solutions adopted, which made it possible to process 68 GB of raw data in less than 11 hours using 8 cores and 22.6 GB of RAM. It is based on a core framework (ORB) designed to support the whole software suite for data analysis (ORCS and OACS), data simulation (ORUS) and data acquisition (IRIS). They all aim to provide a strong basis for the creation and development of specialized analysis modules that could benefit the scientific community working with SITELLE and SpIOMM.

  10. The Software Architecture of the Upgraded ESA DRAMA Software Suite

    Science.gov (United States)

    Kebschull, Christopher; Flegel, Sven; Gelhaus, Johannes; Möckel, Marek; Braun, Vitali; Radtke, Jonas; Wiedemann, Carsten; Vörsmann, Peter; Sanchez-Ortiz, Noelia; Krag, Holger

    2013-08-01

    In the beginnings of man's space flight activities there was the belief that space is so big that everybody could use it without any repercussions. During the last six decades, however, the increasing use of Earth's orbits has led to a rapid growth in the space debris environment, which has a major influence on current and future space missions. For this reason ESA issued the "Requirements on Space Debris Mitigation for ESA Projects" [1] in 2008, which apply to all ESA missions henceforth. The DRAMA (Debris Risk Assessment and Mitigation Analysis) software suite had been developed to support the planning of space missions to comply with these requirements. During the last year the DRAMA software suite has been upgraded under ESA contract by TUBS and DEIMOS to include additional tools and increase the performance of existing ones. This paper describes the overall software architecture of the ESA DRAMA software suite. Specifically, the new graphical user interface, which manages the five main tools ARES (Assessment of Risk Event Statistics), MIDAS (MASTER-based Impact Flux and Damage Assessment Software), OSCAR (Orbital Spacecraft Active Removal), CROC (Cross Section of Complex Bodies) and SARA (Re-entry Survival and Risk Analysis), is discussed. The advancements are highlighted, as well as the challenges that arise from the integration of the five tool interfaces. A framework had been developed at the ILR and was used for MASTER-2009 and PROOF-2009. The Java-based GUI framework enables cross-platform deployment, and its underlying model-view-presenter (MVP) software pattern meets the strict design requirements necessary to ensure a robust and reliable method of operation in an environment where the GUI is separated from the processing back-end. While the GUI framework evolved with each project, allowing an increasing degree of integration of services like validators for input fields, it has also increased in complexity. The paper will conclude with an outlook on

  11. Splicing Express: a software suite for alternative splicing analysis using next-generation sequencing data

    OpenAIRE

    Kroll, Jose E.; Kim, JiHoon; Ohno-Machado, Lucila; de Souza, Sandro J.

    2015-01-01

    Motivation. Alternative splicing events (ASEs) are prevalent in the transcriptome of eukaryotic species and are known to influence many biological phenomena. The identification and quantification of these events are crucial for a better understanding of biological processes. Next-generation DNA sequencing technologies have allowed deep characterization of transcriptomes and made it possible to address these issues. The analysis of ASEs, however, represents a challenging task, especially when many dif...

  12. Spinal Test Suites for Software Product Lines

    OpenAIRE

    Beohar, Harsh; Mousavi, Mohammad Reza

    2014-01-01

    A major challenge in testing software product lines is efficiency. In particular, testing a product line should take less effort than testing each and every product individually. We address this issue in the context of input-output conformance testing, which is a formal theory of model-based testing. We extend the notion of conformance testing on input-output featured transition systems with the novel concept of spinal test suites. We show how this concept dispenses with retesting the common ...

  13. A metrics suite for coupling measurement of software architecture

    Institute of Scientific and Technical Information of China (English)

    KONG Qing-yan; LUN Li-jun; ZHAO Jia-hua; WANG Yi-he

    2009-01-01

    To better evaluate the quality of software architecture, a metrics suite is proposed to measure the coupling of software architecture models, in which CBC is used to measure the coupling between components, CBCC is used to measure the coupling of transferring messages between components, CBCCT is used to measure the coupling of the software architecture, WCBCC is used to measure the coupling of transferring messages with weight between components, and WCBCCT is used to measure the coupling of message transmission with weight in the whole software architecture. The proposed algorithm for the coupling metrics is applied to the design of server software architecture. Analysis of an example validates the feasibility of this metrics suite.
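
    To make the flavor of such metrics concrete, the sketch below counts component-level coupling over a toy connector list; the definitions used here for CBC and the weighted total are simplified stand-ins in the spirit of the abstract, not the paper's formulas.

    ```python
    # Toy coupling counts over an architecture model given as a connector list;
    # simplified stand-ins for the CBC/WCBCCT family of metrics.
    connectors = [  # (source component, target component, message weight)
        ("UI", "Core", 2),
        ("Core", "Storage", 3),
        ("Core", "Network", 1),
        ("UI", "Network", 1),
    ]

    def cbc(component, links):
        """Number of distinct components a given component is coupled to."""
        peers = {t for s, t, _ in links if s == component}
        peers |= {s for s, t, _ in links if t == component}
        return len(peers)

    def wcbcct(links):
        """Weighted message-transmission coupling over the whole architecture."""
        return sum(w for _, _, w in links)

    components = {c for s, t, _ in connectors for c in (s, t)}
    for comp in sorted(components):
        print(comp, "CBC =", cbc(comp, connectors))
    print("architecture-wide weighted coupling =", wcbcct(connectors))
    ```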

  14. Engineering Software Suite Validates System Design

    Science.gov (United States)

    2007-01-01

    EDAptive Computing Inc.'s (ECI) EDAstar engineering software tool suite, created to capture and validate system design requirements, was significantly funded by NASA's Ames Research Center through five Small Business Innovation Research (SBIR) contracts. These programs specifically developed Syscape, used to capture executable specifications of multi-disciplinary systems, and VectorGen, used to automatically generate tests to ensure system implementations meet specifications. According to the company, the VectorGen tests considerably reduce the time and effort required to validate implementation of components, thereby ensuring their safe and reliable operation. EDASHIELD, an additional product offering from ECI, can be used to diagnose, predict, and correct errors after a system has been deployed using EDAstar-created models. Initial commercialization for EDAstar included application by a large prime contractor in a military setting, and customers include various branches within the U.S. Department of Defense, industry giants like the Lockheed Martin Corporation, Science Applications International Corporation, and Ball Aerospace and Technologies Corporation, as well as NASA's Langley and Glenn Research Centers.

  15. Strengthening Software Authentication with the ROSE Software Suite

    Energy Technology Data Exchange (ETDEWEB)

    White, G

    2006-06-15

    Many recent nonproliferation and arms control software projects include a software authentication regime. These include U.S. Government-sponsored projects both in the United States and in the Russian Federation (RF). This trend toward requiring software authentication is only accelerating. Demonstrating assurance that software performs as expected without hidden "backdoors" is crucial to a project's success. In this context, "authentication" is defined as determining that a software package performs only its intended purpose and performs said purpose correctly and reliably over the planned duration of an agreement. In addition to visual inspections by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and not extensible. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool has to be based on a complete language compiler. ROSE is precisely such a compiler infrastructure developed within the Department of Energy (DOE) and targeted at the optimization of scientific applications and user-defined libraries within large-scale applications (typically applications of a million lines of code). ROSE is a robust, source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C and C++ (handling the full C, C99, C++ languages and with current collaborations to support Fortran90). We propose to extend ROSE to address a number of security-specific requirements, and apply it to software authentication for nonproliferation and arms control projects.

  16. Strengthening Software Authentication with the ROSE Software Suite

    International Nuclear Information System (INIS)

    Many recent nonproliferation and arms control software projects include a software authentication regime. These include U.S. Government-sponsored projects both in the United States and in the Russian Federation (RF). This trend toward requiring software authentication is only accelerating. Demonstrating assurance that software performs as expected without hidden "backdoors" is crucial to a project's success. In this context, "authentication" is defined as determining that a software package performs only its intended purpose and performs said purpose correctly and reliably over the planned duration of an agreement. In addition to visual inspections by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and not extensible. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool has to be based on a complete language compiler. ROSE is precisely such a compiler infrastructure developed within the Department of Energy (DOE) and targeted at the optimization of scientific applications and user-defined libraries within large-scale applications (typically applications of a million lines of code). ROSE is a robust, source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C and C++ (handling the full C, C99, C++ languages and with current collaborations to support Fortran90). We propose to extend ROSE to address a number of security-specific requirements, and apply it to software authentication for nonproliferation and arms control projects.

  17. BioImage Suite: An integrated medical image analysis suite: An update

    OpenAIRE

    Papademetris, Xenophon; Jackowski, Marcel P; Rajeevan, Nallakkandi; DiStasio, Marcello; Okuda, Hirohito; Constable, R. Todd; Staib, Lawrence H.

    2006-01-01

    BioImage Suite is an NIH-supported medical image analysis software suite developed at Yale. It leverages both the Visualization Toolkit (VTK) and the Insight Toolkit (ITK), and it includes many additional algorithms for image analysis, especially in the areas of segmentation, registration, diffusion-weighted image processing and fMRI analysis. BioImage Suite has a user-friendly interface developed in the Tcl scripting language. A final beta version is freely available for download.

  18. CAMEO (Computer-Aided Management of Emergency Operations) Software Suite

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — CAMEO is the umbrella name for a system of software applications used widely to plan for and respond to chemical emergencies. All of the programs in the suite work...

  19. Extending and Enhancing SAS (Static Analysis Suite)

    CERN Document Server

    Ho, David

    2016-01-01

    The Static Analysis Suite (SAS) is an open-source software package used to perform static analysis on C and C++ code, helping to ensure safety, readability and maintainability. In this Summer Student project, SAS was enhanced to improve ease of use and user customisation. A straightforward method of integrating static analysis into a project at compilation time was provided using the automated build tool CMake. The process of adding checkers to the suite was streamlined and simplified by developing an automatic code generator. To make SAS more suitable for continuous integration, a reporting mechanism summarising results was added. This suitability has been demonstrated by the inclusion of SAS in the Future Circular Collider Software nightly build system. Scalability of the improved package was demonstrated by using the tool to analyse the ROOT code base.

  20. Approaches in highly parameterized inversion—PEST++ Version 3, a Parameter ESTimation and uncertainty analysis software suite optimized for large environmental models

    Science.gov (United States)

    Welter, David E.; White, Jeremy T.; Hunt, Randall J.; Doherty, John E.

    2015-01-01

    The PEST++ Version 1 object-oriented parameter estimation code is here extended to Version 3 to incorporate additional algorithms and tools to further improve support for large and complex environmental modeling problems. PEST++ Version 3 includes the Gauss-Marquardt-Levenberg (GML) algorithm for nonlinear parameter estimation, Tikhonov regularization, integrated linear-based uncertainty quantification, options for integrated TCP/IP-based parallel run management or external independent run management via a Version 2 update of the GENIE Version 1 software code, and utilities for global sensitivity analyses. The Version 3 code design is consistent with PEST++ Version 1 and continues to be designed to lower the barriers of entry for users as well as developers while providing efficient and optimized algorithms capable of accommodating large, highly parameterized inverse problems. As such, this effort continues the original focus of (1) implementing the most popular and powerful features of the PEST software suite in a fashion that is easy for novice or experienced modelers to use and (2) developing a software framework that is easy to extend.
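
    The GML update named here is the damped, regularized Gauss-Newton step familiar from nonlinear least squares. Below is a minimal sketch of one such step on a toy model; the finite-difference Jacobian, damping constants and model are illustrative assumptions, not the PEST++ implementation.

    ```python
    # One damped Gauss-Marquardt-Levenberg step with a zeroth-order Tikhonov
    # term pulling toward a prior; toy model, not PEST++ code.
    import numpy as np

    def gml_step(model, params, obs, lam=1e-2, beta=1e-1, prior=None, eps=1e-6):
        """Return updated parameters after one damped, regularized GN step."""
        prior = params if prior is None else prior
        r = obs - model(params)                    # residual vector
        J = np.empty((r.size, params.size))        # finite-difference Jacobian
        for j in range(params.size):
            p = params.copy()
            p[j] += eps
            J[:, j] = (model(p) - model(params)) / eps
        JtJ = J.T @ J                              # normal-equations matrix
        A = JtJ + lam * np.diag(np.diag(JtJ)) + beta**2 * np.eye(params.size)
        b = J.T @ r + beta**2 * (prior - params)
        return params + np.linalg.solve(A, b)

    # toy exponential-decay calibration problem
    t = np.linspace(0.0, 4.0, 20)
    model = lambda p: p[0] * np.exp(-p[1] * t)
    obs = model(np.array([2.0, 0.7]))              # synthetic "observations"
    p = np.array([1.0, 1.0])
    for _ in range(25):
        p = gml_step(model, p, obs)
    print("estimated parameters:", np.round(p, 4))
    ```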

  1. Recent advances in the CRANK software suite for experimental phasing

    International Nuclear Information System (INIS)

    Recent developments in the CRANK software suite for experimental phasing have led to many more structures being built automatically. For its first release in 2004, CRANK was shown to effectively detect and phase anomalous scatterers from single-wavelength anomalous diffraction data. Since then, CRANK has been significantly improved and many more structures can be built automatically with single- or multiple-wavelength anomalous diffraction or single isomorphous replacement with anomalous scattering data. Here, the new algorithms that have led to these substantial improvements are discussed, and CRANK's performance on over 100 real data sets is shown. The latest version of CRANK is freely available for download at http://www.bfsc.leidenuniv.nl/software/crank/ and from CCP4 (http://www.ccp4.ac.uk/)

  2. User Guide for the STAYSL PNNL Suite of Software Tools

    Energy Technology Data Exchange (ETDEWEB)

    Greenwood, Lawrence R.; Johnson, Christian D.

    2013-02-27

    The STAYSL PNNL software suite provides a set of tools for working with neutron activation rates measured in a nuclear fission reactor, an accelerator-based neutron source, or any neutron field to determine the neutron flux spectrum through a generalized least-squares approach. This process is referred to as neutron spectral adjustment since the preferred approach is to use measured data to adjust neutron spectra provided by neutron physics calculations. The input data consist of the reaction rates based on measured activities, an initial estimate of the neutron flux spectrum, neutron activation cross sections and their associated uncertainties (covariances), and relevant correction factors. The output consists of the adjusted neutron flux spectrum and associated covariance matrix, which is useful for neutron dosimetry and radiation damage calculations.
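
    The adjustment itself is a standard generalized least-squares update with a closed form. The sketch below applies it to a toy three-reaction, ten-group problem; the dimensions, covariances and random "cross sections" are invented, and only the update formula follows the textbook GLS form.

    ```python
    # Toy generalized least-squares spectral adjustment:
    # phi = phi0 + C A^T (A C A^T + C_d)^-1 (d - A phi0)
    import numpy as np

    rng = np.random.default_rng(0)
    n_groups, n_reactions = 10, 3

    phi0 = np.ones(n_groups)                      # prior group fluxes
    C_phi = 0.04 * np.eye(n_groups)               # prior flux covariance
    A = rng.uniform(0.1, 1.0, (n_reactions, n_groups))  # activation cross sections
    d = A @ (phi0 * 1.1)                          # "measured" reaction rates
    C_d = 0.01 * np.eye(n_reactions)              # measurement covariance

    G = C_phi @ A.T @ np.linalg.inv(A @ C_phi @ A.T + C_d)   # gain matrix
    phi = phi0 + G @ (d - A @ phi0)               # adjusted spectrum
    C_post = C_phi - G @ A @ C_phi                # adjusted covariance

    print("adjusted fluxes:", np.round(phi, 3))
    print("variance reduction per group:",
          np.round(np.diag(C_phi) - np.diag(C_post), 4))
    ```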

  3. SUIT

    DEFF Research Database (Denmark)

    Algreen-Ussing, Gregers; Wedebrunn, Ola

    2003-01-01

    Leaflet on the SUIT project published by the European Commission. The publication briefly explains the results of the SUIT project: cultural values in environmental questions, and the assessment of projects and their impact on the environment.

  4. Software FMEA analysis for safety-related application software

    International Nuclear Information System (INIS)

    Highlights: • We develop a modified FMEA analysis suited to software architecture. • A template for failure modes in a specific software language is established. • A detailed-level software FMEA analysis of nuclear safety software is presented. - Abstract: A method of software safety analysis is described in this paper for safety-related application software. The target software system is the software code installed in an Automatic Test and Interface Processor (ATIP) in a digital reactor protection system (DRPS). For the ATIP software safety analysis, first an overall safety or hazard analysis is performed over the software architecture and modules, and then a detailed safety analysis based on the software FMEA (Failure Modes and Effects Analysis) method is applied to the ATIP program. For an efficient analysis, the software FMEA is carried out based on a so-called failure-mode template extracted from the function blocks used in the function block diagram (FBD) of the ATIP software. The software FMEA analysis, applied to the ATIP software code, which has been integrated and passed through a very rigorous system test procedure, has proven able to provide very valuable results (i.e., software defects) that could not be identified during the various system tests.
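
    The failure-mode-template idea lends itself to a very small illustration: each function-block type used in the FBD design carries a reusable list of failure modes, and a worksheet is generated by instantiating the template for every block instance. The block types, instance names and modes below are invented examples, not the paper's actual template.

    ```python
    # Illustrative failure-mode template keyed by FBD block type; the FMEA
    # worksheet is produced by expanding the template over the application's
    # block instances. All names and modes here are hypothetical.
    FAILURE_MODE_TEMPLATE = {
        "AND":   ["output stuck TRUE", "output stuck FALSE"],
        "TIMER": ["never expires", "expires early", "expires late"],
        "CMP":   ["wrong comparison result", "output stuck"],
    }

    atip_blocks = [  # (instance name, block type) taken from the FBD design
        ("trip_logic_1", "AND"),
        ("delay_1", "TIMER"),
        ("setpoint_check", "CMP"),
    ]

    def fmea_worksheet(blocks, template):
        """One FMEA row per (block instance, failure mode) pair."""
        for name, btype in blocks:
            for mode in template.get(btype, ["unanalyzed block type"]):
                yield {"block": name, "type": btype, "failure_mode": mode,
                       "effect": "TBD by analyst"}

    for row in fmea_worksheet(atip_blocks, FAILURE_MODE_TEMPLATE):
        print(row)
    ```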

  5. Assessment and Comparison of Fuzzy Based Test Suite Prioritization Method for GUI Based Software

    OpenAIRE

    Neha Chaudhary; O.P. Sangwan

    2016-01-01

    The testing of event-driven software has a significant role in improving the overall quality of software. Due to the event-driven nature of GUI-based software, many test cases are generated, and it is difficult to identify the test cases whose fault-revealing capability is high. To identify those test cases, test suite prioritization is done. Various test suite prioritization methods exist for GUI-based software in the literature. Prioritization methods improve the rate of fault detection. In our previous work we ...

  6. Test Suite Reduction for Regression Testing of Simple Interactions between Two Software Modules

    OpenAIRE

    Dmitry, Kichigin

    2007-01-01

    This paper presents a new test suite reduction technique for regression testing of simple interactions between two software modules. The idea of the technique consists of building models of the interactions between two modules and using those models for test suite reduction. Interaction models are built using sequences of interface functions invoked during software execution.

  7. SOFAS: Software Analysis Services

    OpenAIRE

    Ghezzi, G

    2010-01-01

    We propose a distributed and collaborative software analysis platform to enable seamless interoperability of software analysis tools across platform, geographical and organizational boundaries. In particular, we devise software analysis tools as services that can be accessed and composed over the Internet. These distributed services shall be widely accessible through a software analysis broker where organizations and research groups can register and share their tools. To enable (semi)-automat...

  8. Design and Development of Ontology Suite for Software Risk Planning, Software Risk Tracking and Software Risk Control

    Directory of Open Access Journals (Sweden)

    C. R.R. Robin

    2011-01-01

    Problem statement: Ontology as a conceptual courseware structure may work as a mind tool for effective teaching and as a visual navigation interface to the learning objects. Knowledge visualization is defined as the use of visual representations to transfer knowledge between at least two persons. This study presents the design, development and visualization of ontologies for Software Risk Planning, Software Risk Tracking and Software Risk Controlling. Approach: The ontologies are developed using the Protégé tool, an effective ontology editor, and are represented in the formal knowledge representation language OWL. In order to increase the richness of the knowledge available in the ontologies, its semantic representation is presented using an ontology document generator. Finally, the ontologies are effectively visualized using OntoViz. Results: The ontologies represent the domain knowledge of Software Risk Planning, Software Risk Tracking and Software Risk Controlling respectively, and were developed with the intention of serving as a knowledge base for effective knowledge representation, knowledge management and e-learning applications. The constructed ontologies are evaluated using quantitative and qualitative analysis. Conclusion: Since the average reuse ratio is 0.95, the developed ontologies are highly cohesive. Comparison of the concepts and properties used in the ontologies proved that the developed ontologies are concept-oriented. Both the quantitative and qualitative analyses indicate that the developed ontologies are ready to use for applications such as e-learning and knowledge management.

  9. HAZARD ANALYSIS SOFTWARE

    International Nuclear Information System (INIS)

    Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and to ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements were developed and compiled into a requirements traceability matrix from which software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive with a number of suggestions for improvement which are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures. The software has significantly improved the efficiency and standardization of the hazard analysis process

  10. Multi Objective Test Suite Reduction for GUI Based Software Using NSGA-II

    Directory of Open Access Journals (Sweden)

    Neha Chaudhary

    2016-08-01

    Regression testing is performed to ensure that modified code does not have any unintended side effects on the software. If regression testing is performed with the retest-all method, it is a very time-consuming activity. Therefore, test suite reduction methods are used to reduce the size of the original test suite. The objective of test suite reduction is to remove those test cases which are redundant or less important in their fault-revealing capability. Test suite reduction is only applicable when there is not enough time to run all test cases and only selective testing can be done. Various methods exist in the literature for test suite reduction of traditional software. Most of the methods are based on single-objective optimization. In the case of multi-objective optimization of a test suite, researchers usually assign different weight values to different objectives and combine them into a single objective. However, since multiple Pareto-optimal solutions are present in test suite reduction, it is difficult to select one test case over another. Since GUI-based software is our concern, very few reduction techniques exist for it, and none of them consider multi-objective reduction. In this work we propose a new test suite reduction technique based on two objectives: event weight and the number of faults identified by a test case. We evaluated our results for two different applications and achieved a 20% reduction in test suite size for both. For the Terp Paint 3.0 application, 15.6% of fault-revealing capability is compromised, and for Notepad, 11.1% of fault-revealing capability is lost.
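
    The two-objective selection rests on Pareto dominance. The sketch below extracts the first non-dominated front from invented (event weight, faults revealed) scores; the real method evolves whole sub-suites with NSGA-II rather than filtering individual test cases, so this shows only the underlying idea.

    ```python
    # First Pareto front over two maximization objectives:
    # (event weight, number of faults identified). Scores are invented.
    tests = {
        "t1": (9, 4), "t2": (7, 4), "t3": (9, 2), "t4": (3, 5), "t5": (2, 1),
    }

    def dominates(a, b):
        """a dominates b: no worse in every objective, strictly better in one."""
        return all(x >= y for x, y in zip(a, b)) and a != b

    front = [name for name, score in tests.items()
             if not any(dominates(other, score)
                        for o, other in tests.items() if o != name)]
    print("first Pareto front:", front)   # -> ['t1', 't4']
    ```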

  11. Rietveld analysis software for J-PARC

    International Nuclear Information System (INIS)

    A new analysis software suite, Z-Code, is under development for powder diffraction data analyses in the Materials and Life Science Facility (MLF) of the Japan Proton Accelerator Research Complex (J-PARC). This software suite comprises data processing, data analyses, graphical user interface and visualization software. As a part of Z-Code, a Rietveld analysis program for neutron (TOF and angle dispersive) and X-ray data, Z-Rietveld, has been developed. Here we report the basic traits and some significant features of Z-Rietveld.

  12. Software Suite to Support In-Flight Characterization of Remote Sensing Systems

    Science.gov (United States)

    Stanley, Thomas; Holekamp, Kara; Gasser, Gerald; Tabor, Wes; Vaughan, Ronald; Ryan, Robert; Pagnutti, Mary; Blonski, Slawomir; Kenton, Ross

    2014-01-01

    A characterization software suite was developed to facilitate NASA's in-flight characterization of commercial remote sensing systems. Characterization of aerial and satellite systems requires knowledge of ground characteristics, or ground truth. This information is typically obtained with instruments taking measurements prior to or during a remote sensing system overpass. Acquired ground-truth data, which can consist of hundreds of measurements with different data formats, must be processed before they can be used in the characterization. Accurate in-flight characterization of remote sensing systems relies on multiple field data acquisitions that are efficiently processed, with minimal error. To address the need for timely, reproducible ground-truth data, a characterization software suite was developed to automate the data processing methods. The characterization software suite is engineering code, requiring some prior knowledge and expertise to run. The suite consists of component scripts for each of the three main in-flight characterization types: radiometric, geometric, and spatial. The component scripts for the radiometric characterization operate primarily by reading the raw data acquired by the field instruments, combining it with other applicable information, and then reducing it to a format that is appropriate for input into MODTRAN (MODerate resolution atmospheric TRANsmission), an Air Force Research Laboratory-developed radiative transport code used to predict at-sensor measurements. The geometric scripts operate by comparing identified target locations from the remote sensing image to known target locations, producing circular error statistics defined by the Federal Geographic Data Committee Standards. The spatial scripts analyze a target edge within the image, and produce estimates of Relative Edge Response and the value of the Modulation Transfer Function at the Nyquist frequency. The software suite enables rapid, efficient, automated processing of

  13. Data analysis and graphing in an introductory physics laboratory: spreadsheet versus statistics suite

    International Nuclear Information System (INIS)

    Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared by analysing data collected in three selected experiments taken from an introductory physics laboratory, which include a linear dependence, a nonlinear dependence and a histogram. The merits of each method are compared.

  14. Data Analysis and Graphing in an Introductory Physics Laboratory: Spreadsheet versus Statistics Suite

    Science.gov (United States)

    Peterlin, Primoz

    2010-01-01

    Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared by analysing data collected in three selected experiments taken from an introductory physics laboratory, which include a linear dependence, a nonlinear dependence and a histogram. The merits of each method are compared. (Contains 7…

  15. Technical Note: DIRART- A software suite for deformable image registration and adaptive radiotherapy research

    International Nuclear Information System (INIS)

    Purpose: Recent years have witnessed tremendous progress in image-guided radiotherapy technology and a growing interest in the possibilities for adapting treatment planning and delivery over the course of treatment. One obstacle faced by the research community has been the lack of a comprehensive open-source software toolkit dedicated to adaptive radiotherapy (ART). To address this need, the authors have developed a software suite called the Deformable Image Registration and Adaptive Radiotherapy Toolkit (DIRART). Methods: DIRART is an open-source toolkit developed in MATLAB. It is designed in an object-oriented style with a focus on user-friendliness, features, and flexibility. It contains four classes of DIR algorithms, including the newer inverse-consistency algorithms that provide a consistent displacement vector field in both directions. It also contains common ART functions, an integrated graphical user interface, a variety of visualization and image-processing features, dose metric analysis functions, and interface routines. These interface routines make DIRART a powerful complement to the Computational Environment for Radiotherapy Research (CERR) and popular image-processing toolkits such as ITK. Results: DIRART provides a set of image processing/registration algorithms and postprocessing functions to facilitate the development and testing of DIR algorithms. It also offers a good number of options for DIR results visualization, evaluation, and validation. Conclusions: By exchanging data with treatment planning systems via DICOM-RT files and CERR, and by bringing image registration algorithms closer to radiotherapy applications, DIRART is potentially a convenient and flexible platform that may facilitate ART and DIR research.
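
    The inverse-consistency property mentioned in the methods reduces to a simple check: composing the forward and backward displacement fields should map every point back to itself, and the residual is the inverse consistency error. A one-dimensional toy version is sketched below; DIRART itself operates on 3D image volumes in MATLAB, so this illustrates only the concept.

    ```python
    # Toy 1D inverse-consistency check for a pair of displacement fields.
    import numpy as np

    x = np.linspace(0.0, 1.0, 101)            # voxel coordinates
    u_fwd = 0.05 * np.sin(2 * np.pi * x)      # forward displacement field
    u_bwd = -u_fwd                            # crude backward field (only an
                                              # approximate inverse, which is
                                              # exactly what ICE measures)

    def apply_dvf(points, grid, dvf):
        """Deform points by a displacement field sampled on `grid`."""
        return points + np.interp(points, grid, dvf)

    roundtrip = apply_dvf(apply_dvf(x, x, u_fwd), x, u_bwd)
    ice = np.abs(roundtrip - x)               # inverse consistency error
    print(f"max ICE: {ice.max():.5f}, mean ICE: {ice.mean():.5f}")
    ```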

  16. Software safety hazard analysis

    International Nuclear Information System (INIS)

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper

  17. Software for Schenkerian Analysis

    OpenAIRE

    Marsden, Alan

    2011-01-01

    Software developed to automate the process of Schenkerian analysis is described. The current state of the art is that moderately good analyses of small extracts can be generated, but more information is required about the criteria by which analysts make decisions among alternative interpretations in the course of analysis. The software described here allows the procedure of reduction to be examined while in process, allowing decision points, and potentially criteria, to become clear.

  18. ANALYSIS OF DESIGN ELEMENTS IN SKI SUITS

    Directory of Open Access Journals (Sweden)

    Birsen Çileroğlu

    2014-06-01

    The popularity of ski sport in the 19th century necessitated a new perspective on protective skiing clothing against mountain climates and excessive cold. Winter clothing was the basis of ski attire during this period. By the beginning of the 20th century, lining cloth was used to minimize the wind effect. The difference between men's and women's ski attire of the time consisted of a knee-length skirt worn over golf trousers. Subsequent to the First World War, skiing suit models were influenced by the period's uniforms, and producers reflected fashion trends in ski clothing. In conformance with the prevailing trends, ski trousers were designed and produced for women, thus leading to a reduction in gender differences. Increases in ski tourism and the holding of the first Winter Olympics in 1924 resulted in variations in ski attire, development of design characteristics, growth in user numbers, and enlargement of production capacities. Designers emphasized in their collections the combined presence of elegance and practicality in skiing attire. In the 1930s, ski suits influenced by pilots' uniforms included characteristics permitting freedom of motion, and the design elements exhibited changes in terms of style, material and aerodynamics. In time, ski attire showed varying design features distinguishing professionals from amateurs. While protective functionality was the primary consideration for amateurs, for professionals aerodynamic design was also a leading factor. Eventually, the increased differences in design characteristics were exhibited in ski suit collections, world-renowned brands were formed, and production and sales volumes rose significantly. During the 20th century, ski suits influenced by fashion trends acquired unique styles and reached a position of dominance to impact current fashion trends; apart from sports attire, they became a style determinant in the clothing of cold climates. Ski suits

  19. Xmipp 3.0: an improved software suite for image processing in electron microscopy.

    Science.gov (United States)

    de la Rosa-Trevín, J M; Otón, J; Marabini, R; Zaldívar, A; Vargas, J; Carazo, J M; Sorzano, C O S

    2013-11-01

    Xmipp is a specialized software package for image processing in electron microscopy, mainly focused on 3D reconstruction of macromolecules through single-particle analysis. In this article we present Xmipp 3.0, a major release which introduces several improvements and new developments over the previous version. A central improvement is the concept of a project that stores the entire processing workflow from data import to final results. It is now possible to monitor, reproduce and restart all computing tasks as well as graphically explore the complete set of interrelated tasks associated with a given project. Other graphical tools have also been improved, such as data visualization, particle picking and parameter "wizards" that allow the visual selection of some key parameters. Many standard image formats are transparently supported for input/output from all programs. Additionally, results have been standardized, facilitating the interoperation between different Xmipp programs. Finally, as a result of a large code refactoring, the underlying C++ libraries are better suited for future developments and all code has been optimized. Xmipp is an open-source package that is freely available for download from: http://xmipp.cnb.csic.es. PMID:24075951

  20. Design and Development of Ontology Suite for Software Risk Planning, Software Risk Tracking and Software Risk Control

    OpenAIRE

    C. R.R. Robin; G.V. Uma

    2011-01-01

    Problem statement: Ontology as a conceptual courseware structure may work as a mind tool for effective teaching and as a visual navigation interface to the learning objects. Knowledge visualization is defined as the use of visual representations to transfer knowledge between at least two persons. This study presents the design, development and visualization of ontologies for Software Risk Planning, Software Risk Tracking and Software Risk Controlling. Approach: The ontolog...

  1. STING Millennium Suite: integrated software for extensive analyses of 3d structures of proteins and their complexes

    Directory of Open Access Journals (Sweden)

    Yamagishi Michel EB

    2004-08-01

    Background: The integration of many aspects of protein/DNA structure analysis is an important requirement for software products in the general area of structural bioinformatics. In fact, there are too few software packages on the internet which can be described as successful in this respect. We might say that what is still missing is publicly available, web-based software for interactive analysis of the sequence/structure/function of proteins and their complexes with DNA and ligands. Some existing software packages do have a certain level of integration and do offer analysis of several structure-related parameters, however not to the extent generally demanded by a user. Results: We report here on the new Sting Millennium Suite (SMS) version, which is fully accessible (including for local files at the client end) web-based software for molecular structure and sequence/structure/function analysis. The new SMS client version is now operational also on Linux boxes and works with non-public PDB-formatted files (structures not deposited at the RCSB/PDB), eliminating the earlier registration requirement when SMS components were to be used with a user's local files. At the same time the new SMS offers some important additions and improvements, such as a link to ProTherm as well as a significant re-engineering of the SMS component ConSSeq. Also, we have added 3 new SMS mirror sites to the existing network of global SMS servers: Argentina, Japan and Spain. Conclusion: SMS is an already established software package, and many key database and software servers worldwide offer either a link to, or host, the SMS. SMS (Sting Millennium Suite) is web-based, publicly available software developed to aid researchers in their quest for translating information about the structures of macromolecules into knowledge. SMS allows a user to interactively analyze molecular structures, cross-referencing visualized information with correlated information available across the internet. SMS

  2. SPACE: a suite of tools for protein structure prediction and analysis based on complementarity and environment.

    Science.gov (United States)

    Sobolev, Vladimir; Eyal, Eran; Gerzon, Sergey; Potapov, Vladimir; Babor, Mariana; Prilusky, Jaime; Edelman, Marvin

    2005-07-01

    We describe a suite of SPACE tools for the analysis and prediction of structures of biomolecules and their complexes. LPC/CSU software provides a common definition of inter-atomic contacts and complementarity of contacting surfaces to analyze protein structures and complexes. In the current version of LPC/CSU, analyses of water molecules and nucleic acids have been added, together with improved and expanded visualization options using Chime or the Java-based Jmol. The SPACE suite includes servers and programs for: structural analysis of point mutations (MutaProt); side chain modeling based on surface complementarity (SCCOMP); building a crystal environment and analysis of crystal contacts (CryCo); construction and analysis of protein contact maps (CMA); and molecular docking software (LIGIN). The SPACE suite is accessed at http://ligin.weizmann.ac.il/space. PMID:15980496

  3. SAGES: A Suite of Freely-Available Software Tools for Electronic Disease Surveillance in Resource-Limited Settings

    Science.gov (United States)

    Lewis, Sheri L.; Feighner, Brian H.; Loschen, Wayne A.; Wojcik, Richard A.; Skora, Joseph F.; Coberly, Jacqueline S.; Blazes, David L.

    2011-01-01

    Public health surveillance is undergoing a revolution driven by advances in the field of information technology. Many countries have experienced vast improvements in the collection, ingestion, analysis, visualization, and dissemination of public health data. Resource-limited countries have lagged behind due to challenges in information technology infrastructure, public health resources, and the costs of proprietary software. The Suite for Automated Global Electronic bioSurveillance (SAGES) is a collection of modular, flexible, freely-available software tools for electronic disease surveillance in resource-limited settings. One or more SAGES tools may be used in concert with existing surveillance applications or the SAGES tools may be used en masse for an end-to-end biosurveillance capability. This flexibility allows for the development of an inexpensive, customized, and sustainable disease surveillance system. The ability to rapidly assess anomalous disease activity may lead to more efficient use of limited resources and better compliance with World Health Organization International Health Regulations. PMID:21572957

  4. Design and functionalities of the MADOR® software suite for dose-reduction management after DTPA therapy.

    Science.gov (United States)

    Leprince, B; Fritsch, P; Bérard, P; Roméo, P-H

    2016-03-01

    A software suite on the biokinetics of radionuclides and internal dosimetry, intended for the occupational health practitioners of the nuclear industry and for expert opinions, has been developed under Borland C++ Builder™. These computing tools allow physicians to improve the dosimetric follow-up of workers in agreement with French regulations and to manage new internal contaminations by radionuclides such as Pu and/or Am after diethylene triamine penta-acetic acid (DTPA) treatments. In this paper, the concept and functionalities of the first two computing tools of the MADOR® suite are described. Release 0.0 is the forensic application, which allows calculation of the derived recording levels for intake by inhalation or ingestion of the main radioisotopes encountered in the occupational environment. Indeed, these reference values of activity are convenient for rapidly interpreting bioassay measurements and making decisions as part of medical monitoring. Release 1.0 addresses the effect of DTPA treatments on Pu/Am biokinetics and the dose benefit. The forensic results of the MADOR® suite were validated by comparison with reference data. PMID:25999333
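
    As a rough illustration of what a derived recording level involves, the sketch below follows the usual bioassay-interpretation chain: the activity measured in a sample is converted to an intake through a retention/excretion fraction, and the intake to committed dose through a dose coefficient; the DRL is then the measured activity corresponding to the dose recording level. All numerical values are placeholders, not MADOR's regulatory data.

    ```python
    # Hypothetical derived-recording-level calculation; the coefficients are
    # placeholders, not the ICRP values used by MADOR.
    recording_level_sv = 0.1e-3      # recording level: 0.1 mSv committed dose
    e_inh_sv_per_bq = 1.1e-5         # example inhalation dose coefficient [Sv/Bq]
    m_half_interval = 1.5e-4         # retained/excreted fraction at mid-interval

    intake_bq = recording_level_sv / e_inh_sv_per_bq   # intake giving that dose
    drl_bq = intake_bq * m_half_interval               # measurable activity
    print(f"DRL = {drl_bq:.3e} Bq in the bioassay sample")
    ```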

  5. Design and functionalities of the MADOR® software suite for dose-reduction management after DTPA therapy

    International Nuclear Information System (INIS)

    A software suite on the biokinetics of radionuclides and internal dosimetry, intended for the occupational health practitioners of the nuclear industry and for expert opinions, has been developed under Borland C++ Builder™. These computing tools allow physicians to improve the dosimetric follow-up of workers in agreement with French regulations and to manage new internal contaminations by radionuclides such as Pu and/or Am after diethylene triamine penta-acetic acid (DTPA) treatments. In this paper, the concept and functionalities of the first two computing tools of the MADOR® suite are described. Release 0.0 is the forensic application, which allows calculation of the derived recording levels for intake by inhalation or ingestion of the main radioisotopes encountered in the occupational environment. Indeed, these reference values of activity are convenient for rapidly interpreting bioassay measurements and making decisions as part of medical monitoring. Release 1.0 addresses the effect of DTPA treatments on Pu/Am biokinetics and the dose benefit. The forensic results of the MADOR® suite were validated by comparison with reference data. (authors)

  6. Automatic Feature Interaction Analysis in PacoSuite

    Directory of Open Access Journals (Sweden)

    Wim Vanderperren

    2004-10-01

    In this paper, we build upon previous work that aims at recuperating aspect-oriented ideas into component-based software development. In that research, a composition adapter was proposed in order to capture crosscutting concerns in the PacoSuite component-based methodology. A composition adapter is visually applied onto a given component composition and the changes it describes are automatically applied. Stacking multiple composition adapters onto the same component composition can, however, lead to unpredictable and undesired side effects. In this paper, we propose a solution for this issue, widely known as the feature interaction problem. We present a classification of different interaction levels among composition adapters and the algorithms required to verify them. The proposed algorithms are, however, of an exponential nature and depend on both the composition adapters and the component composition as a whole. In order to enhance the performance of our feature interaction analysis, we present a set of theorems that define the interaction levels solely in terms of the properties of the composition adapters themselves.

  7. Kinematic Analysis of Exoskeleton Suit for Human Arm

    Directory of Open Access Journals (Sweden)

    Surachai Panich

    2010-01-01

    Problem statement: Many robotic arms have been developed to provide care to physically disabled people. It is difficult to find robot designs in the literature that articulate such a design procedure. Therefore, it is our hope that the design work shown in this study may serve as a good example of a systematic method for rehabilitation robot design. Approach: The arm exoskeleton suit was developed to increase a human's strength, endurance, or speed, enabling them to perform tasks that they previously could not perform. It should not impede the user's natural motion and should be comfortable and safe to wear and easy to use. Although movement is difficult for them, they usually want to go somewhere by themselves. Results: The kinematics of the exoskeleton suit for the human arm is simulated with MATLAB software. The exoskeleton suit for the human arm consists of one link length, three link twists, two link offsets and three joint angles. Conclusion: This study introduced the kinematics of an exoskeleton suit for the human arm. The exoskeleton suit can serve as an instrument for anyone who needs improved performance; it can increase the strength of humans so that they can lift heavy loads, or help handicapped patients who cannot use their arms.
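
    The arm model described (one link length, three link twists, two link offsets, three joint angles) maps directly onto Denavit-Hartenberg forward kinematics. The sketch below chains three DH transforms; the specific d, a and alpha values are invented for illustration and are not the paper's arm model.

    ```python
    # Forward kinematics from Denavit-Hartenberg parameters for a 3-joint arm;
    # the link values below are illustrative, not the exoskeleton's geometry.
    import numpy as np

    def dh(theta, d, a, alpha):
        """Standard Denavit-Hartenberg homogeneous transform."""
        ct, st = np.cos(theta), np.sin(theta)
        ca, sa = np.cos(alpha), np.sin(alpha)
        return np.array([[ct, -st * ca,  st * sa, a * ct],
                         [st,  ct * ca, -ct * sa, a * st],
                         [0.0,      sa,       ca,      d],
                         [0.0,     0.0,      0.0,    1.0]])

    # (d, a, alpha) per joint: two nonzero offsets, one link length, three twists
    links = [(0.10, 0.00,  np.pi / 2),
             (0.05, 0.30, -np.pi / 2),
             (0.00, 0.00,  np.pi / 2)]

    def end_effector(joint_angles):
        """Position of the last frame's origin in the base frame."""
        T = np.eye(4)
        for theta, (d, a, alpha) in zip(joint_angles, links):
            T = T @ dh(theta, d, a, alpha)
        return T[:3, 3]

    print("hand position:", np.round(end_effector([0.3, -0.6, 0.9]), 4))
    ```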

  8. The Toast++ software suite for forward and inverse modeling in optical tomography.

    Science.gov (United States)

    Schweiger, Martin; Arridge, Simon

    2014-04-01

We present the Toast++ open-source software environment for solving the forward and inverse problems in diffuse optical tomography (DOT). The software suite consists of a set of libraries to simulate near-infrared light propagation in highly scattering media with complex boundaries and heterogeneous internal parameter distributions, based on a finite-element solver. Steady-state, time-domain, and frequency-domain data acquisition systems can be modeled. The forward solver is implemented in C++ and supports performance acceleration with parallelization for shared and distributed memory architectures, as well as computation on graphics processing units. Building on the numerical forward solver, Toast++ contains model-based iterative inverse solvers for reconstructing the volume distribution of absorption and scattering parameters from boundary measurements of light transmission. A range of regularization methods is provided, including the possibility of incorporating prior knowledge of internal structure. The user can either link the Toast++ libraries directly to compile application programs for DOT, or make use of the included MATLAB and Python bindings to generate script-based solutions. This approach allows rapid prototyping and provides a rich toolset in both environments for debugging, testing, and visualization. PMID:24781586
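
    The model-based iterative inverse solvers mentioned above typically repeat a regularized update of the optical parameters until the modelled boundary data match the measurements. The sketch below shows one Tikhonov-regularized Gauss-Newton step in generic NumPy; it illustrates the concept only and is not the Toast++ API.

        import numpy as np

        def gauss_newton_step(J, residual, x, lam):
            """One Tikhonov-regularised Gauss-Newton update.

            J        -- Jacobian of the forward model at x (m x n)
            residual -- measured minus modelled boundary data (m,)
            lam      -- regularisation weight
            """
            H = J.T @ J + lam * np.eye(J.shape[1])   # regularised Hessian approx.
            dx = np.linalg.solve(H, J.T @ residual)  # normal equations
            return x + dx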

  9. ABC Tester - Artificial Bee Colony Based Software Test Suite Optimization Approach

    Directory of Open Access Journals (Sweden)

    D. Jeya Mala

    2009-07-01

In this paper we present a new, non-pheromone-based test suite optimization approach inspired by the behavior of biological bees. Our proposed approach is based on ABC (Artificial Bee Colony) optimization, which is motivated by the intelligent foraging behavior of honey bees. In our proposed system, the sites are the nodes in the Software Under Test (SUT); the artificial bees modify the test cases over time, and each bee's aim is to discover nodes with higher coverage and, finally, the node with the highest usage by the given test case. Since the ABC system combines local search carried out by employed bees with global search managed by onlookers and scouts, we attain a near-global optimum. We investigate whether this new approach outperforms an existing test optimization approach based on Genetic Algorithms (GA) in the task of software test optimization. Taking into account the results of our experiments, we conclude that (i) the proposed approach uses fewer iterations to complete the task; (ii) it is more scalable, i.e., it requires less computation time to complete the task; and finally (iii) it is best at reaching a near-globally-optimal solution.
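
    As a rough sketch of the ABC scheme described above, the toy Python loop below treats a food source as a subset of test cases and its fitness as the node coverage it achieves; employed bees mutate sources and scouts replace exhausted ones (the onlooker phase is omitted for brevity). All names and the size-penalty term are illustrative assumptions, not the paper's implementation.

        import random

        def fitness(suite, coverage_map):
            """Nodes of the SUT covered by the suite, minus a small size penalty."""
            covered = set().union(*(coverage_map[t] for t in suite)) if suite else set()
            return len(covered) - 0.1 * len(suite)

        def neighbour(suite, all_tests):
            """Employed-bee move: toggle one randomly chosen test case."""
            s = set(suite)
            s.symmetric_difference_update({random.choice(all_tests)})
            return frozenset(s)

        def abc_optimise(coverage_map, n_sources=10, limit=5, n_iter=200):
            all_tests = list(coverage_map)
            sources = [frozenset(random.sample(all_tests, 2)) for _ in range(n_sources)]
            trials = [0] * n_sources
            for _ in range(n_iter):
                for i, s in enumerate(sources):
                    cand = neighbour(s, all_tests)
                    if fitness(cand, coverage_map) > fitness(s, coverage_map):
                        sources[i], trials[i] = cand, 0      # better source found
                    else:
                        trials[i] += 1
                    if trials[i] > limit:                    # scout: abandon source
                        sources[i] = frozenset(random.sample(all_tests, 2))
                        trials[i] = 0
            return max(sources, key=lambda s: fitness(s, coverage_map))

        coverage_map = {"t1": {1, 2}, "t2": {2, 3}, "t3": {4}, "t4": {1, 4}}
        print(sorted(abc_optimise(coverage_map)))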

  10. Niche idea: Pandell's Nexus suite of back-office software developed with juniors in mind

    Energy Technology Data Exchange (ETDEWEB)

    Wells, P.

    2007-07-15

    Pandell Technology Corporation has developed a complete suite of back-office software products for the junior oil and gas sector. The Nexus product line that was developed for this niche market includes JVNexus for joint venture financial accounting, AFENexus for expenditure tracking, GeoNexus for land management, and EANexus for economic analysis. Each application can help junior to midsize oil and gas companies capitalize on their internal resources, providing them an affordable way to acquire the services they need to support their business. Clients can acquire the software through a software-as-a-service (SaaS) business model. Microsoft has supported Pandell's efforts to offer clients better products and services through SaaS. The four systems cost between $450 and $750 each with no additional upfront capital expenditures. This article also listed companies that have adopted Nexus products, including Annex Petroleum Inc., Delphi Energy Corporation, and Innova Exploration Limited. Pandell is currently working on a new and improved version of GeoNexus, which will be fully Web-enabled. 1 fig.

  11. The MineTool Software Suite: A Novel Data Mining Palette of Tools for Automated Modeling of Space Physics Data

    Science.gov (United States)

    Sipes, T.; Karimabadi, H.; Roberts, A.

    2009-12-01

We present a new data mining software tool called MineTool for the analysis and modeling of space physics data. MineTool is a graphical user interface that merges two data mining algorithms into an easy-to-use software tool: an algorithm for analysis and modeling of static data [Karimabadi et al., 2007] and MineTool-TS, an algorithm for data mining of time series data [Karimabadi et al., 2009]. By automating the modeling process and model evaluations, MineTool makes data mining and predictive modeling more accessible to non-experts. The software is written entirely in Java and is freeware. By ranking all inputs as predictors of the outcome before constructing a model, MineTool also ensures that only relevant variables are included. The technique aggregates the various stages of model building into a four-step process consisting of (i) data segmentation and sampling, (ii) variable pre-selection and transform generation, (iii) predictive model estimation and validation, and (iv) final model selection. Optimal strategies are chosen for each modeling step. A notable feature of the technique is that the final model is always in closed analytical form rather than the “black box” form characteristic of some other techniques. Having the analytical model enables deciphering how strongly the various variables affect the outcome. The MineTool suite also provides capabilities for preparing data for mining as well as for visualizing the datasets. MineTool has successfully been used to develop models for automated detection of flux transfer events (FTEs) at Earth’s magnetopause in the Cluster spacecraft time series data and for 3D magnetopause modeling. In this presentation, we demonstrate the ease of use of the software through examples, including how it was used in the FTE problem.
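
    MineTool itself is a Java GUI, but the four-step process above maps naturally onto common open-source building blocks. The Python sketch below mimics the pipeline with scikit-learn on synthetic data; it is a conceptual stand-in, not MineTool's algorithm.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.feature_selection import mutual_info_classif
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score, train_test_split

        # (i) data segmentation and sampling
        X, y = make_classification(n_samples=500, n_features=10, random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        # (ii) rank every input as a predictor; keep only the relevant variables
        scores = mutual_info_classif(X_tr, y_tr, random_state=0)
        keep = np.argsort(scores)[-4:]

        # (iii) estimate and validate a candidate model
        model = LogisticRegression(max_iter=1000)
        cv = cross_val_score(model, X_tr[:, keep], y_tr, cv=5)

        # (iv) final model selection; a linear model stays in closed analytical form
        model.fit(X_tr[:, keep], y_tr)
        print(cv.mean(), model.score(X_te[:, keep], y_te))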

  12. Comparative Analysis of MOGA, NSGA-II and MOPSO for Regression Test Suite Optimization

    Directory of Open Access Journals (Sweden)

    Zeeshan Anwar

    2014-01-01

In software engineering, regression testing is a mandatory activity. Whenever a change to an existing system occurs and a new version appears, the unchanged portions need to be regression tested for any resulting undesirable effects. During regression testing, the same test cases are executed repeatedly for the unmodified portion of the software. This activity is an overhead and consumes huge resources and budget. To save time and resources, researchers have proposed various techniques for regression test suite optimization. In this research, regression test suites are minimized using three computational intelligence multi-objective techniques for black-box testing methods: (1) Multi-Objective Genetic Algorithms (MOGA), (2) the Non-Dominated Sorting Genetic Algorithm (NSGA-II), and (3) Multi-Objective Particle Swarm Optimization (MOPSO). These techniques are applied to two published case studies, and their quality is analyzed through experimentation. Four quality metrics are defined to perform this analysis. The results show that MOGA is better at reducing the size, and thus the execution time, of regression test suites than MOPSO and NSGA-II. It was also found that the use of MOGA, NSGA-II, and MOPSO is not safe for regression test suite optimization, because the fault detection rate and requirement coverage are reduced after optimization of the regression test suites.
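
    All three algorithms compared above search for Pareto-optimal test suites, i.e., suites that no other suite beats on every objective at once. The sketch below shows the non-dominated-front extraction at the heart of NSGA-II; each candidate suite is summarized by an illustrative objective tuple (execution time to minimize, with fault detection and requirement coverage negated so that smaller is better).

        def dominates(a, b):
            """True if objective tuple a (to minimise) dominates b."""
            return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

        def pareto_front(solutions):
            """Return the non-dominated suites (the first NSGA-II front)."""
            return [s for s in solutions
                    if not any(dominates(o, s) for o in solutions if o is not s)]

        # (execution_time, -fault_detection, -requirement_coverage), illustrative
        suites = [(120, -0.90, -0.95), (80, -0.70, -0.90), (80, -0.90, -0.95)]
        print(pareto_front(suites))   # -> [(80, -0.90, -0.95)]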

  13. The application of formal software engineering methods to the unattended and remote monitoring software suite at Los Alamos National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Determan, John Clifford [Los Alamos National Laboratory; Longo, Joseph F [Los Alamos National Laboratory; Michel, Kelly D [Los Alamos National Laboratory

    2009-01-01

The Unattended and Remote Monitoring (UNARM) system is a collection of specialized hardware and software used by the International Atomic Energy Agency (IAEA) to institute nuclear safeguards at many nuclear facilities around the world. The hardware consists of detectors, instruments, and networked computers for acquiring various forms of data, including but not limited to radiation data, global position coordinates, camera images, isotopic data, and operator declarations. The software provides two primary functions: the secure and reliable collection of this data from the instruments, and the ability to perform an integrated review and analysis of the disparate data sources. Several years ago the team responsible for maintaining the software portion of the UNARM system began the process of formalizing its operations. These formal operations include a configuration management system, a change control board, an issue tracking system, and extensive formal testing of both functionality and reliability. Functionality is tested with formal test cases chosen to fully represent the data types and methods of analysis that will be commonly encountered. Reliability is tested with iterative, concurrent testing where up to five analyses are executed simultaneously for thousands of cycles. Iterative concurrent testing helps ensure that there are no resource conflicts or leaks when multiple system components are in use simultaneously. The goal of this work is to provide a high-quality, reliable product, commensurate with the criticality of the application. Testing results demonstrating that this goal has been achieved will be presented, along with the impact that introducing a formal software engineering framework has had on the UNARM product.

  14. Space suit bioenergetics: framework and analysis of unsuited and suited activity.

    Science.gov (United States)

    Carr, Christopher E; Newman, Dava J

    2007-11-01

    Metabolic costs limit the duration and intensity of extravehicular activity (EVA), an essential component of future human missions to the Moon and Mars. Energetics Framework: We present a framework for comparison of energetics data across and between studies. This framework, applied to locomotion, differentiates between muscle efficiency and energy recovery, two concepts often confused in the literature. The human run-walk transition in Earth gravity occurs at the point for which energy recovery is approximately the same for walking and running, suggesting a possible role for recovery in gait transitions. Muscular Energetics: Muscle physiology limits the overall efficiency by which chemical energy is converted through metabolism to useful work. Unsuited Locomotion: Walking and running use different methods of energy storage and release. These differences contribute to the relative changes in the metabolic cost of walking and running as gravity is varied, with the metabolic cost of locomoting at a given velocity changing in proportion to gravity for running and less than in proportion for walking. Space Suits: Major factors affecting the energetic cost of suited movement include suit pressurization, gravity, velocity, surface slope, and space suit configuration. Apollo lunar surface EVA traverse metabolic rates, while unexpectedly low, were higher than other activity categories. The Lunar Roving Vehicle facilitated even lower metabolic rates, thus longer duration EVAs. Muscles and tendons act like springs during running; similarly, longitudinal pressure forces in gas pressure space suits allow spring-like storage and release of energy when suits are self-supporting. PMID:18018432
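
    The energy-recovery concept above quantifies how much pendulum-like exchange between kinetic and potential energy reduces the mechanical work of walking. The function below implements the classic percentage-recovery formula (a Cavagna-style formulation); using this particular form is an assumption on our part, and the stride values are invented for illustration.

        def percent_recovery(d_kinetic, d_potential, d_total):
            """Percentage recovery over one stride.

            Each argument is the sum of positive increments (J) of kinetic,
            potential, and total mechanical energy of the body's center of
            mass; this common formulation is assumed, not the paper's exact one.
            """
            return 100.0 * (d_kinetic + d_potential - d_total) / (d_kinetic + d_potential)

        # Illustrative stride: near-complete KE/PE exchange gives high recovery.
        print(percent_recovery(30.0, 28.0, 12.0))   # ~79%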

  15. CCP4 Software Suite: history, evolution, content, challenges and future developments

    Directory of Open Access Journals (Sweden)

    Krissinel, Eugene

    2015-04-01

Collaborative Computational Project Number 4 (CCP4) in Protein Crystallography is a public resource for producing and supporting a world-leading, integrated suite of programs that allows researchers to determine macromolecular structures by X-ray crystallography and other biophysical techniques. CCP4 supports the widest possible researcher community, embracing academic, not-for-profit, and for-profit research. The primary aims of CCP4 include the development and support of cutting-edge approaches to the experimental determination and analysis of protein structure, and their integration into the suite for worldwide dissemination. In addition, CCP4 plays an important role in the education and training of scientists in experimental structural biology. In this paper, we give an overview of CCP4's 35-year history and the technical milestones of its evolution. We also consider how the particular structure of the CCP4 Suite and Collaboration emerged, its main functionality, its current state, and plans for the future.

  16. Distributed and collaborative software analysis

    OpenAIRE

    Ghezzi, G; H.C. Gall

    2010-01-01

Throughout the years software engineers have come up with a myriad of specialized tools and techniques that focus on a certain type of software analysis, such as source code analysis, duplication analysis, co-change analysis, bug prediction, or detection of bug-fixing patterns. However, easy and straightforward synergies between these analyses and tools rarely exist because of their stand-alone nature, their platform dependence, their different input and output formats, and the variety of data...

  17. The Sample Analysis at Mars Investigation and Instrument Suite

    Science.gov (United States)

    Mahaffy, Paul; Webster, Chris R.; Cabane, M.; Conrad, Pamela G.; Coll, Patrice; Atreya, Sushil K.; Arvey, Robert; Barciniak, Michael; Benna, Mehdi; Bleacher, L.; Brinckerhoff, William B.; Eigenbrode, Jennifer L.; Carignan, Daniel; Cascia, Mark; Chalmers, Robert A.; Dworkin, Jason P.; Errigo, Therese; Everson, Paula; Franz, Heather; Farley, Rodger; Feng, Steven; Frazier, Gregory; Freissinet, Caroline; Glavin, Daniel P.; Harpold, Daniel N.

    2012-01-01

The Sample Analysis at Mars (SAM) investigation of the Mars Science Laboratory (MSL) addresses the chemical and isotopic composition of the atmosphere and volatiles extracted from solid samples. The SAM investigation is designed to contribute substantially to the mission goal of quantitatively assessing the habitability of Mars as an essential step in the search for past or present life on Mars. SAM is a 40 kg instrument suite located in the interior of MSL's Curiosity rover. The SAM instruments are a quadrupole mass spectrometer, a tunable laser spectrometer, and a 6-column gas chromatograph, all coupled through solid and gas processing systems to provide complementary information on the same samples. The SAM suite is able to measure a suite of light isotopes and to analyze volatiles directly from the atmosphere or thermally released from solid samples. In addition to measurements of simple inorganic compounds and noble gases, SAM will conduct a sensitive search for organic compounds with either thermal or chemical extraction from sieved samples delivered by the sample processing system on the Curiosity rover's robotic arm.

  18. Data processing software suite SITENNO for coherent X-ray diffraction imaging using the X-ray free-electron laser SACLA

    International Nuclear Information System (INIS)

The software suite SITENNO has been developed for processing diffraction data collected in coherent X-ray diffraction imaging experiments on non-crystalline particles using an X-ray free-electron laser. Coherent X-ray diffraction imaging is a promising technique for visualizing the structures of non-crystalline particles with dimensions of micrometers to sub-micrometers. Recently, X-ray free-electron laser sources have enabled efficient experiments in the ‘diffraction before destruction’ scheme. Diffraction experiments have been conducted at the SPring-8 Angstrom Compact free-electron LAser (SACLA) using the custom-made diffraction apparatus KOTOBUKI-1 and two multiport CCD detectors. In these experiments, tens of thousands of single-shot diffraction patterns can be collected within several hours. Then, diffraction patterns with intensity levels suitable for structural analysis must be found, direct-beam positions in the diffraction patterns determined, diffraction patterns from the two CCD detectors merged, and phase-retrieval calculations for structural analyses performed. The SITENNO software suite has been developed to apply this four-step processing semi-automatically to a huge number of diffraction patterns. Here, details of the algorithms used in the suite are described, and the performance for approximately 9000 diffraction patterns collected from cuboid-shaped copper oxide particles is reported. Using the SITENNO suite, it is possible to conduct experiments with data processing performed immediately after collection, and to characterize the size distribution and internal structures of the non-crystalline particles.
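
    The fourth processing step, phase retrieval, reconstructs the particle's electron density from the measured Fourier modulus alone. The sketch below is a minimal error-reduction loop of the Fienup type, shown for concept only; SITENNO's actual implementation is not published in this abstract.

        import numpy as np

        def error_reduction(fourier_modulus, support, n_iter=200, seed=0):
            """Minimal Fienup-style error-reduction loop: alternate between the
            measured Fourier modulus and a real-space support constraint."""
            rng = np.random.default_rng(seed)
            density = rng.random(fourier_modulus.shape) * support
            for _ in range(n_iter):
                F = np.fft.fft2(density)
                F = fourier_modulus * np.exp(1j * np.angle(F))  # keep measured modulus
                density = np.fft.ifft2(F).real
                density = np.clip(density, 0, None) * support   # positivity + support
            return density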

  19. Software analysis handbook: Software complexity analysis and software reliability estimation and prediction

    Science.gov (United States)

    Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron

    1994-01-01

    This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
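
    As a concrete example of the kind of quantitative structure metric such complexity analysis produces, the sketch below estimates McCabe-style cyclomatic complexity for Python source by counting decision points with the standard ast module; the handbook's own tools target Space Station code and are not reproduced here.

        import ast

        DECISIONS = (ast.If, ast.For, ast.While, ast.BoolOp,
                     ast.ExceptHandler, ast.IfExp)

        def cyclomatic_complexity(source):
            """McCabe-style estimate: 1 + number of decision points.
            Counting each BoolOp once approximates short-circuit branches."""
            tree = ast.parse(source)
            return 1 + sum(isinstance(node, DECISIONS)
                           for node in ast.walk(tree))

        print(cyclomatic_complexity(
            "def f(x):\n    if x > 0 and x < 9:\n        return x\n    return 0"))  # -> 3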

  20. An R package suite for microarray meta-analysis in quality control, differentially expressed gene analysis and pathway enrichment detection

    OpenAIRE

    Wang, Xingbin; Kang, Dongwan D.; Shen, Kui; Song, Chi; Lu, Shuya; Chang, Lun-Ching; Liao, Serena G.; Huo, Zhiguang; Tang, Shaowu; Ding, Ying; Kaminski, Naftali; Sibille, Etienne; Lin, Yan; Li, Jia; Tseng, George C.

    2012-01-01

    Summary: With the rapid advances and prevalence of high-throughput genomic technologies, integrating information of multiple relevant genomic studies has brought new challenges. Microarray meta-analysis has become a frequently used tool in biomedical research. Little effort, however, has been made to develop a systematic pipeline and user-friendly software. In this article, we present MetaOmics, a suite of three R packages MetaQC, MetaDE and MetaPath, for quality control, differentially expre...

  1. Software engineering with analysis patterns

    OpenAIRE

    Geyer-Schulz, Andreas; Hahsler, Michael

    2001-01-01

    The purpose of this article is twofold, first to promote the use of patterns in the analysis phase of the software life-cycle by proposing an outline template for analysis patterns that strongly supports the whole analysis process from the requirements analysis to the analysis model and further on to its transformation into a flexible design. Second we present, as an example, a family of analysis patterns that deal with a series of pressing problems in cooperative work, collaborative informat...

  2. Genex: Data Analysis Software

    Czech Academy of Sciences Publication Activity Database

    Kubista, Mikael; Rusňáková, Vendula; Švec, David; Sjögreen, B.; Tichopád, Aleš

Essex: Caister Academic Press, 2012 - (Filion, M.), pp. 63-84. ISBN 978-1-908230-01-0 Institutional research plan: CEZ:AV0Z50520701 Keywords: qPCR data analysis * real-time PCR * GenEx Subject RIV: EB - Genetics; Molecular Biology www.caister.com

  3. DelPhi: a comprehensive suite for DelPhi software and associated resources

    Directory of Open Access Journals (Sweden)

    Li Lin

    2012-05-01

Background: Accurate modeling of electrostatic potential and the corresponding energies becomes increasingly important for understanding the properties of biological macromolecules and their complexes. However, this is not an easy task due to the irregular shape of biological entities and the presence of water and mobile ions. Results: Here we report a comprehensive suite for the well-known Poisson-Boltzmann solver, DelPhi, enriched with additional features to facilitate DelPhi usage. The suite allows for easy download of both DelPhi executable files and source code along with a makefile for local installations. Users can obtain the DelPhi manual and the parameter files required for the corresponding investigation. Non-experienced researchers can download examples containing all the data necessary to carry out DelPhi runs on a set of selected examples illustrating various DelPhi features and demonstrating DelPhi's accuracy against analytical solutions. Conclusions: The DelPhi suite offers not only the DelPhi executable and source files, examples and parameter files, but also provides links to third-party resources that either utilize DelPhi or provide plugins for it. In addition, users and developers are offered a forum to share ideas, resolve issues, report bugs and seek help with respect to the DelPhi package. The resource is available free of charge for academic users from URL: http://compbio.clemson.edu/DelPhi.php.

  4. Software Design for Smile Analysis

    Science.gov (United States)

    Sodagar, A.; Rafatjoo, R.; Gholami Borujeni, D.; Noroozi, H.; Sarkhosh, A.

    2010-01-01

Introduction: Esthetics and attractiveness of the smile is one of the major demands in contemporary orthodontic treatment. In order to improve a smile design, it is necessary to record a “posed smile” as an intentional, non-pressure, static, natural and reproducible smile. The record then should be analyzed to determine its characteristics. In this study, we intended to design and introduce software to analyze the smile rapidly and precisely in order to produce an attractive smile for the patients. Materials and Methods: For this purpose, a practical study was performed to design the multimedia software “Smile Analysis”, which can receive patients’ photographs and videographs. After giving records to the software, the operator should mark the points and lines which are displayed on the system’s guide and also define the correct scale for each image. Thirty-three variables are measured by the software and displayed on the report page. Reliability of measurements in both image and video was significantly high (α=0.7–1). Results: In order to evaluate intra-operator and inter-operator reliability, five cases were selected randomly. Statistical analysis showed that calculations performed in the smile analysis software were both valid and highly reliable (for both video and photo). Conclusion: The results obtained from smile analysis could be used in diagnosis, treatment planning and evaluation of the treatment progress. PMID:21998792

  5. Software Design for Smile Analysis

    Directory of Open Access Journals (Sweden)

    A. Sarkhosh

    2010-12-01

Introduction: Esthetics and attractiveness of the smile is one of the major demands in contemporary orthodontic treatment. In order to improve a smile design, it is necessary to record a “posed smile” as an intentional, non-pressure, static, natural and reproducible smile. The record then should be analyzed to determine its characteristics. In this study, we intended to design and introduce software to analyze the smile rapidly and precisely in order to produce an attractive smile for the patients. Materials and Methods: For this purpose, a practical study was performed to design the multimedia software “Smile Analysis”, which can receive patients’ photographs and videographs. After giving records to the software, the operator should mark the points and lines which are displayed on the system’s guide and also define the correct scale for each image. Thirty-three variables are measured by the software and displayed on the report page. Reliability of measurements in both image and video was significantly high (α=0.7–1). Results: In order to evaluate intra-operator and inter-operator reliability, five cases were selected randomly. Statistical analysis showed that calculations performed in the smile analysis software were both valid and highly reliable (for both video and photo). Conclusion: The results obtained from smile analysis could be used in diagnosis, treatment planning and evaluation of the treatment progress.

  6. Inertial motion capture system for biomechanical analysis in pressure suits

    Science.gov (United States)

    Di Capua, Massimiliano

A non-invasive system has been developed at the University of Maryland Space System Laboratory with the goal of providing a new capability for quantifying the motion of the human inside a space suit. Based on an array of six microprocessors and eighteen microelectromechanical (MEMS) inertial measurement units (IMUs), the Body Pose Measurement System (BPMS) allows the monitoring of the kinematics of the suit occupant in an unobtrusive, self-contained, lightweight and compact fashion, without requiring any external equipment such as that necessary with modern optical motion capture systems. BPMS measures and stores the accelerations, angular rates and magnetic fields acting upon each IMU, which are mounted on the head, torso, and each segment of each limb. In order to convert the raw data into a more useful form, such as a set of body segment angles quantifying pose and motion, a series of geometrical models and a non-linear complementary filter were implemented. The first portion of this work focuses on assessing system performance, which was measured by comparing the BPMS filtered data against rigid-body angles measured through an external VICON optical motion capture system. This type of system is the industry standard and is used here for independent measurement of body pose angles. By comparing the two sets of data, performance metrics such as BPMS operational conditions, accuracy, and drift were evaluated and correlated against the VICON data. After the system and models were verified and their capabilities and limitations assessed, a series of pressure suit evaluations was conducted. Three different pressure suits were used to identify the relationship between usable range of motion and internal suit pressure. In addition to addressing range of motion, a series of exploration tasks was also performed, recorded, and analysed in order to identify different motion patterns and trajectories as suit pressure is increased and overall suit mobility is reduced.
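
    The non-linear complementary filter mentioned above blends the gyroscope's low-drift short-term rates with the accelerometer's unbiased long-term gravity reference. A minimal single-axis version is sketched below; BPMS's actual eighteen-IMU filter is more elaborate, so the gain and axis convention here are illustrative assumptions.

        import numpy as np

        def complementary_filter(gyro_rate, accel, dt, angle, alpha=0.98):
            """Fuse one tilt axis: integrate the gyro (accurate short term,
            drifts long term) and correct with the accelerometer's gravity
            direction (noisy but unbiased). `accel` is (ay, az) in m/s^2."""
            angle_gyro = angle + gyro_rate * dt          # rad, gyro integration
            angle_acc = np.arctan2(accel[0], accel[1])   # rad, gravity reference
            return alpha * angle_gyro + (1 - alpha) * angle_acc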

  7. Team Software Development for Aerothermodynamic and Aerodynamic Analysis and Design

    Science.gov (United States)

    Alexandrov, N.; Atkins, H. L.; Bibb, K. L.; Biedron, R. T.; Carpenter, M. H.; Gnoffo, P. A.; Hammond, D. P.; Jones, W. T.; Kleb, W. L.; Lee-Rausch, E. M.

    2003-01-01

    A collaborative approach to software development is described. The approach employs the agile development techniques: project retrospectives, Scrum status meetings, and elements of Extreme Programming to efficiently develop a cohesive and extensible software suite. The software product under development is a fluid dynamics simulator for performing aerodynamic and aerothermodynamic analysis and design. The functionality of the software product is achieved both through the merging, with substantial rewrite, of separate legacy codes and the authorship of new routines. Examples of rapid implementation of new functionality demonstrate the benefits obtained with this agile software development process. The appendix contains a discussion of coding issues encountered while porting legacy Fortran 77 code to Fortran 95, software design principles, and a Fortran 95 coding standard.

  8. Development of an e-VLBI Data Transport Software Suite with VDIF

    Science.gov (United States)

    Sekido, Mamoru; Takefuji, Kazuhiro; Kimura, Moritaka; Hobiger, Thomas; Kokado, Kensuke; Nozawa, Kentarou; Kurihara, Shinobu; Shinno, Takuya; Takahashi, Fujinobu

    2010-01-01

We have developed a software library (KVTP-lib) for VLBI data transmission over the network with VDIF (VLBI Data Interchange Format), the newly proposed standard VLBI data format designed for electronic data transfer over networks. The software package keeps the application layer (VDIF frame) and the transmission layer separate, so that each layer can be developed efficiently. The real-time VLBI data transmission tool sudp-send is an application based on the KVTP-lib library. sudp-send captures the VLBI data stream from the VSI-H interface with the K5/VSI PC board and either writes the data to file in standard Linux file format or transmits it over the network using the simple-UDP (SUDP) protocol. Another tool, sudp-recv, receives the data stream from the network and writes the data to file in a specific VLBI format (K5/VSSP, VDIF, or Mark 5B). This software system has been implemented on the Wettzell-Tsukuba baseline; evaluation before operational deployment is under way.
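
    The abstract does not publish the SUDP wire format, but the idea of streaming fixed-size data frames over UDP with a sequence header can be sketched as follows; the 8-byte header and port number are invented for illustration and are not the KVTP-lib protocol.

        import socket
        import struct

        def send_frames(frames, host="127.0.0.1", port=50000):
            """Send each data frame prefixed with a sequence number so the
            receiver can detect loss and reordering (UDP gives no guarantees).
            The header layout here is illustrative, not the KVTP-lib format."""
            sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
            for seq, payload in enumerate(frames):
                header = struct.pack("!Q", seq)      # 8 bytes, network byte order
                sock.sendto(header + payload, (host, port))
            sock.close()

        send_frames([b"\x00" * 1024 for _ in range(10)])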

  9. EXPANDER – an integrative program suite for microarray data analysis

    Directory of Open Access Journals (Sweden)

    Shiloh Yosef

    2005-09-01

Background: Gene expression microarrays are a prominent experimental tool in functional genomics which has opened the opportunity for gaining a global, systems-level understanding of transcriptional networks. Experiments that apply this technology typically generate overwhelming volumes of data, unprecedented in biological research. Therefore the task of mining meaningful biological knowledge out of the raw data is a major challenge in bioinformatics. Especially needed are integrative packages that provide biologist users with an advanced yet easy-to-use set of algorithms, together covering the whole range of steps in microarray data analysis. Results: Here we present the EXPANDER 2.0 (EXPression ANalyzer and DisplayER) software package. EXPANDER 2.0 is an integrative package for the analysis of gene expression data, designed as a 'one-stop shop' tool that implements various data analysis algorithms ranging from the initial steps of normalization and filtering, through clustering and biclustering, to high-level functional enrichment analysis that points to biological processes active in the examined conditions, and to promoter cis-regulatory element analysis that elucidates the transcription factors controlling the observed transcriptional response. EXPANDER is available with pre-compiled functional Gene Ontology (GO) and promoter sequence-derived data files for yeast, worm, fly, rat, mouse and human, supporting high-level analysis applied to data obtained from these six organisms. Conclusion: EXPANDER's integrated capabilities and its built-in support of multiple organisms make it a very powerful tool for the analysis of microarray data. The package is freely available for academic users at http://www.cs.tau.ac.il/~rshamir/expander

  10. ROSETTA3: an object-oriented software suite for the simulation and design of macromolecules.

    Science.gov (United States)

    Leaver-Fay, Andrew; Tyka, Michael; Lewis, Steven M; Lange, Oliver F; Thompson, James; Jacak, Ron; Kaufman, Kristian; Renfrew, P Douglas; Smith, Colin A; Sheffler, Will; Davis, Ian W; Cooper, Seth; Treuille, Adrien; Mandell, Daniel J; Richter, Florian; Ban, Yih-En Andrew; Fleishman, Sarel J; Corn, Jacob E; Kim, David E; Lyskov, Sergey; Berrondo, Monica; Mentzer, Stuart; Popović, Zoran; Havranek, James J; Karanicolas, John; Das, Rhiju; Meiler, Jens; Kortemme, Tanja; Gray, Jeffrey J; Kuhlman, Brian; Baker, David; Bradley, Philip

    2011-01-01

We have recently completed a full re-architecturing of the ROSETTA molecular modeling program, generalizing and expanding its existing functionality. The new architecture enables the rapid prototyping of novel protocols by providing easy-to-use interfaces to powerful tools for molecular modeling. The source code of this re-architecturing has been released as ROSETTA3 and is freely available for academic use. At the time of its release, it contained 470,000 lines of code. Counting currently unpublished protocols at the time of this writing, the source includes 1,285,000 lines. Its rapid growth is a testament to its ease of use. This chapter describes the requirements for our new architecture, justifies the design decisions, sketches out central classes, and highlights a few of the common tasks that the new software can perform. PMID:21187238

  11. Onco-Regulon: an integrated database and software suite for site specific targeting of transcription factors of cancer genes.

    Science.gov (United States)

    Tomar, Navneet; Mishra, Akhilesh; Mrinal, Nirotpal; Jayaram, B

    2016-01-01

Transcription factors (TFs) bind at multiple sites in the genome and regulate the expression of many genes. Regulating TF binding in a gene-specific manner remains a formidable challenge in drug discovery because the same binding motif may be present at multiple locations in the genome. Here, we present Onco-Regulon (http://www.scfbio-iitd.res.in/software/onco/NavSite/index.htm), an integrated database of regulatory motifs of cancer genes combined with Unique Sequence-Predictor (USP), a software suite that identifies unique sequences for each of these regulatory DNA motifs at the specified position in the genome. USP works by extending a given DNA motif in the 5'→3', 3'→5', or both directions, adding one nucleotide at each step, and calculating the frequency of each extended motif in the genome with the Frequency Counter programme. This step is iterated until the frequency of the extended motif becomes unity in the genome. Thus, for each given motif, we get three possible unique sequences. The Closest Sequence Finder program predicts off-target drug binding in the genome. The inclusion of DNA-protein structural information further makes Onco-Regulon a highly informative repository for gene-specific drug development. We believe that Onco-Regulon will help researchers design drugs that bind to an exclusive site in the genome with, theoretically, no off-target effects. Database URL: http://www.scfbio-iitd.res.in/software/onco/NavSite/index.htm. PMID:27515825
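
    The USP iteration described above (extend the motif one nucleotide at a time until its genome frequency drops to one) can be sketched directly. The greedy branch choice and toy string genome below are our own illustrative assumptions; the published tool extends in 5'→3', 3'→5', and both directions over a full genome index.

        def unique_extension(motif, genome, alphabet="ACGT"):
            """Extend `motif` 5'->3' one nucleotide at a time until it occurs
            exactly once in `genome`; greedy choice shrinks frequency fastest."""
            while genome.count(motif) > 1:
                counts = {b: genome.count(motif + b) for b in alphabet}
                live = {b: c for b, c in counts.items() if c > 0}
                if not live:                      # no occurrence can be extended
                    return None
                motif += min(live, key=live.get)
            return motif if genome.count(motif) == 1 else None

        genome = "ATGCGATGCATATGCGT"
        print(unique_extension("ATG", genome))    # -> "ATGCA"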

  12. Acquiring data in real time in Italy from the Antarctic Seismographic Argentinean Italian Network (ASAIN): testing the global capabilities of the EarthWorm and Antelope software suites.

    Science.gov (United States)

    Percy Plasencia Linares, Milton; Russi, Marino; Pesaresi, Damiano; Cravos, Claudio

    2010-05-01

The Italian National Institute for Oceanography and Experimental Geophysics (Istituto Nazionale di Oceanografia e di Geofisica Sperimentale, OGS) runs the Antarctic Seismographic Argentinean Italian Network (ASAIN), comprising 7 seismic stations located in the Scotia Sea region of Antarctica and in Tierra del Fuego, Argentina: data from these stations are transferred in real time to the OGS headquarters in Trieste (Italy) via satellite links provided by the Instituto Antártico Argentino (IAA). Data are collected and archived primarily in Güralp Compressed Format (GCF) through the Scream! software at OGS and IAA, and also transmitted in real time to the Observatories and Research Facilities for European Seismology (ORFEUS). The main real-time seismic data acquisition and processing system of the ASAIN network is based on the EarthWorm 7.3 (open source) software suite installed on a Linux server at the OGS headquarters in Trieste. It runs several software modules for data collection, archiving, and publication on dedicated web servers: wave_serverV, Winston Wave Server, and data analysis and real-time monitoring through the Swarm program. OGS also runs, in close cooperation with the Friuli-Venezia Giulia Civil Defense, the North East (NI) Italy seismic network, making use of the Antelope commercial software suite from BRTT as the main acquisition system. As a test of the global capabilities of the Antelope software suite, we also set up an instance of Antelope acquiring data in real time from both the regional ASAIN seismic network in Antarctica and a subset of the Global Seismic Network (GSN) funded by the Incorporated Research Institutions for Seismology (IRIS). The facilities of the IRIS Data Management System, and specifically the IRIS Data Management Center, were used for real-time access to the waveforms required in this study. The first tests indicated that more than 80% of the earthquakes with magnitude M>5.0 listed in the Preliminary Determination

  13. Spherical Coordinate Systems for Streamlining Suited Mobility Analysis

    Science.gov (United States)

    Benson, Elizabeth; Cowley, Matthew S.; Harvill. Lauren; Rajulu, Sudhakar

    2014-01-01

When describing human motion, biomechanists generally report joint angles in terms of Euler angle rotation sequences. However, there are known limitations in using this method to describe complex motions such as the shoulder joint during a baseball pitch. Euler angle notation uses a series of three rotations about an axis, where each rotation is dependent upon the preceding rotation; as such, the Euler angles need to be regarded as a set to get accurate angle information. Unfortunately, it is often difficult to visualize and understand these complex motion representations. One of our key functions is to help design engineers understand how a human will perform with new designs, and all too often the traditional use of Euler rotations becomes as much a hindrance as a help. It is believed that using a spherical coordinate system will allow ABF personnel to more quickly and easily transmit important mobility data to engineers, in a format that is readily understandable and directly translatable to their design efforts. Objectives: The goal of this project is to establish new analysis and visualization techniques to aid in the examination and comprehension of complex motions. Methods: This project consisted of a series of small sub-projects meant to validate and verify the method before it was implemented in the ABF's data analysis practices. The first stage was a proof of concept, where a mechanical test rig was built and instrumented with an inclinometer so that its angle from horizontal was known. The test rig was tracked in 3D using an optical motion capture system, and its position and orientation were reported in both Euler and spherical reference systems. The rig was meant to simulate flexion/extension, transverse rotation and abduction/adduction of the human shoulder, but without the variability inherent in human motion. In the second phase of the project, the ABF estimated the error inherent in a spherical coordinate system, and evaluated how this error would
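
    The core of the proposed representation is reporting a limb segment's pointing direction as two spherical angles instead of a coupled Euler triple. A minimal conversion is sketched below; the axis convention (z up, azimuth measured from +x in the x-y plane) is one common choice and not necessarily the one the ABF adopted.

        import numpy as np

        def to_spherical(v):
            """Map a limb-segment direction vector to (azimuth, elevation, r)."""
            x, y, z = v
            r = np.sqrt(x * x + y * y + z * z)
            azimuth = np.degrees(np.arctan2(y, x))     # rotation in the x-y plane
            elevation = np.degrees(np.arcsin(z / r))   # angle above the x-y plane
            return azimuth, elevation, r

        print(to_spherical((0.5, 0.5, 0.707)))   # ~45 deg azimuth, ~45 deg elevation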

  14. MAUS: MICE Analysis User Software

    CERN Document Server

    CERN. Geneva

    2012-01-01

The Muon Ionization Cooling Experiment (MICE) has developed the MICE Analysis User Software (MAUS) to simulate and analyse experimental data. It serves as the primary codebase for the experiment, providing online data quality checks and offline batch simulation and reconstruction. The code is structured in a Map-Reduce framework to allow parallelization, whether on a personal machine or in the control room. Various software engineering practices from industry are also used to ensure correct and maintainable physics code, including unit, functional and integration tests, continuous integration and load testing, code reviews, and distributed version control systems. Lastly, there are various small design decisions, like using JSON as the data structure, using SWIG to allow developers to write components in either Python or C++, or using the SCons Python-based build system, that may be of interest to other experiments.

  15. PARENT: A Parallel Software Suite for the Calculation of Configurational Entropy in Biomolecular Systems.

    Science.gov (United States)

    Fleck, Markus; Polyansky, Anton A; Zagrovic, Bojan

    2016-04-12

Accurate estimation of configurational entropy from in silico-generated biomolecular ensembles, e.g., from molecular dynamics (MD) trajectories, depends strongly on exhaustive sampling for physical reasons. This, however, creates a major computational problem for the subsequent estimation of configurational entropy using the Maximum Information Spanning Tree (MIST) or Mutual Information Expansion (MIE) approaches for internal molecular coordinates. In particular, the available software for such estimation exhibits serious limitations when it comes to molecules with hundreds or thousands of atoms, because of its reliance on a serial program architecture. To overcome this problem, we have developed a parallel, hybrid MPI/OpenMP C++ implementation of MIST and MIE, called PARENT, which is particularly optimized for high-performance computing and provides efficient estimation of configurational entropy in different biological processes (e.g., protein-protein interactions). In addition, PARENT also allows for detailed mapping of intramolecular allosteric networks. Here, we benchmark the program on a set of 1-μs-long MD trajectories of 10 different protein complexes and their components, demonstrating robustness and good scalability. A direct comparison between MIST and MIE on the same dataset demonstrates superior convergence behavior for the former approach with respect to total simulation length and configurational-space binning. PMID:26989950
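
    At second order, the Mutual Information Expansion approximates the configurational entropy as the sum of one-dimensional marginal entropies minus all pairwise mutual informations. The NumPy sketch below shows that truncation with simple histogram estimators (units of nats, estimator bias ignored); PARENT's C++ implementation, its MIST variant, and its MPI/OpenMP parallelism are not reproduced here.

        import numpy as np
        from itertools import combinations

        def entropy(samples, bins=30):
            """Marginal entropy of one coordinate from a histogram (nats)."""
            counts, _ = np.histogram(samples, bins=bins)
            p = counts[counts > 0] / len(samples)
            return -np.sum(p * np.log(p))

        def mutual_information(x, y, bins=30):
            """I(x;y) = H(x) + H(y) - H(x,y) from a 2D histogram."""
            pxy, _, _ = np.histogram2d(x, y, bins=bins)
            pxy = pxy[pxy > 0] / len(x)
            return entropy(x, bins) + entropy(y, bins) + np.sum(pxy * np.log(pxy))

        def mie_second_order(coords, bins=30):
            """S ~= sum_i S(i) - sum_{i<j} I(i;j) over internal coordinates."""
            n = coords.shape[1]
            s1 = sum(entropy(coords[:, i], bins) for i in range(n))
            s2 = sum(mutual_information(coords[:, i], coords[:, j], bins)
                     for i, j in combinations(range(n), 2))
            return s1 - s2

        rng = np.random.default_rng(0)
        coords = rng.normal(size=(5000, 3))   # e.g., three internal coordinates
        print(mie_second_order(coords))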

  16. QSD Performa performance monitoring and analysis software

    International Nuclear Information System (INIS)

QSD Performa is a software system that assists plant engineers in monitoring system performance. It collects and analyzes data that is useful in improving the operations and maintenance of power generation facilities. This software system makes important plant performance data available to PC users connected to a Local Area Network. Its use of graphical displays and statistical analyses makes it easier for plant engineers to interpret data, detect adverse trends, and make improvements in plant operations. QSD Performa is designed to make use of information stored in existing databases, using Open Database technology. System engineers can produce composite graphs of plant performance parameters. Analog parameters such as temperature and pressure, and digital data such as starts, stops, and strokes, can be overlaid. Important operating and maintenance events, such as preventative and corrective maintenance actions, can be added to the timeline displays. A suite of statistical tests is included to help uncover adverse trends. Exposure models can be constructed and failure analysis performed on any designated collection of components. The look and feel of the failure data is similar to that of the INPO Nuclear Plant Reliability System. QSD Performa can operate interactively or automatically: the Scheduler can be run at night to automatically collect the most recent data and notify engineers of values that exceed limits.

  17. Web Server Suite for Complex Mixture Analysis by Covariance NMR

    OpenAIRE

    Zhang, Fengli; Robinette, Steve; Bruschweiler-Li, Lei; Brüschweiler, Rafael

    2009-01-01

    Elucidation of the chemical composition of biological samples is a main focus of systems biology and metabolomics. Their comprehensive study requires reliable, efficient, and automatable methods to identify and quantify the underlying metabolites. Because nuclear magnetic resonance (NMR) spectroscopy is a rich source of molecular information, it has a unique potential for this task. Here we present a suite of public web servers (http://spinportal.magnet.fsu.edu), termed COLMAR, that facilitat...

  18. ATIRS package: A program suite for the rovibrational analysis of infrared spectra of asymmetric top molecules

    Science.gov (United States)

    Tasinato, N.; Pietropolli Charmet, A.; Stoppa, P.

    2007-06-01

Nowadays high-resolution infrared spectra can be recorded quite easily, and it has therefore become important to assist the rovibrational analysis, especially the assignment step, which is still fraught with many problems in the presence of perturbation effects. In this article we provide a description of ATIRS, a complete software suite developed to assist in the rotational investigation of vibrational bands of asymmetric top molecules. This package uses Pickett's CALPGM suite for fitting transitions and predicting line positions and is composed of three stand-alone applications: (1) Visual Loomis-Wood, for the assignment of spectral lines based on Loomis-Wood type diagrams; (2) Visual CALPGM, a new graphical interface to Pickett's programs SPFIT and SPCAT; (3) Visual Spectra Simulator, for the simulation of spectra. The graphical interface to the CALPGM suite is developed for asymmetric rotors. The main feature of this application is that it avoids the use of parameter codes, replacing them with the well-known parameter names or symbols. By highlighting the regular transition sequences, Visual Loomis-Wood assists in the assignment of the spectral lines. It visualizes the description of a transition, and assignment can be done simply by mouse-clicking on the diagram; moreover, its display mode lets the user check the experimental spectrum with all the assigned lines and their descriptions shown. Visual Spectra Simulator provides a simple and functional application that, using the calculated frequencies and intensities given by SPCAT, simulates the high-resolution infrared spectrum and compares it with the experimental one. ATIRS, freely available to the spectroscopic community, is designed to be easy to use and presents a standard graphical interface; being based on the CALPGM package, it can handle forbidden transitions and perturbations among many states.

  19. Intercomparison of gamma ray analysis software packages

    International Nuclear Information System (INIS)

    The IAEA undertook an intercomparison exercise to review available software for gamma ray spectra analysis. This document describes the methods used in the intercomparison exercise, characterizes the software packages reviewed and presents the results obtained. Only direct results are given without any recommendation for a particular software or method for gamma ray spectra analysis

  20. Software reliability analysis in probabilistic risk analysis

    International Nuclear Information System (INIS)

Probabilistic Risk Analysis (PRA) is a tool which can reveal shortcomings of NPP design in general. PRA analysts have not had sufficient guiding principles for modelling the malfunctions of particular digital components. Digital I and C systems are mostly analysed simply, and the software reliability estimates are engineering judgments, often lacking a proper justification. The OECD/NEA Working Group RISK's task DIGREL develops a taxonomy of failure modes of digital I and C systems. The EU FP7 project HARMONICS develops a software reliability estimation method based on an analytic approach and a Bayesian belief network. (author)

  1. Software Performs Complex Design Analysis

    Science.gov (United States)

    2008-01-01

Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help gain greater understanding of the structural response of components to loads, stresses and strains, and to predict failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: 1) inadequate shape parameterization algorithms, and 2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to utilize its revolutionary, one-of-a-kind arbitrary shape deformation (ASD) capability, a major advancement in solving these two problems, to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing CFD designers to freely create their own shape parameters, thereby eliminating the restriction of only being able to use the computer-aided design (CAD) parameters. The problem of inadequate algorithms for CFD grid modification is solved by the fact that the new software performs a smooth volumetric deformation. This eliminates the extremely costly process of having to remesh the grid for every desired shape change. The program can perform a design change in a markedly reduced amount of time, a process that would traditionally involve the designer returning to the CAD model to reshape and then remesh the shapes, something that has been known to take hours, days, even weeks or months, depending upon the size of the model.

  2. Software Security Analysis : Managing source code audit

    OpenAIRE

    Persson, Daniel; Baca, Dejan

    2004-01-01

    Software users have become more conscious of security. More people have access to Internet and huge databases of security exploits. To make secure products, software developers must acknowledge this threat and take action. A first step is to perform a software security analysis. The software security analysis was performed using automatic auditing tools. An experimental environment was constructed to check if the findings were exploitable or not. Open source projects were used as reference to...

  3. Human Factors Analysis in Software Engineering

    Institute of Scientific and Technical Information of China (English)

    Xu Ren-zuo; Ma Ruo-feng; Liu Li-na; Xiong Zhong-wei

    2004-01-01

General human factors analysis analyzes human functions, effects and influence in a system. In a narrower sense, it analyzes human influence upon the reliability of a system; it includes traditional human reliability analysis, human error analysis, man-machine interface analysis, human character analysis, and others. Whether a software development project in software engineering succeeds or not is largely determined by human factors. In this paper, we discuss what human factors analysis encompasses, demonstrate the importance of human factors analysis for software engineering through several examples, and finally take a preliminary look at the qualities a practitioner in software engineering should possess.

  4. Reliability analysis of software based safety functions

    International Nuclear Information System (INIS)

The methods applicable to the reliability analysis of software-based safety functions are described in the report. Although the safety functions also include other components, the main emphasis of the report is on the reliability analysis of software. Checklist-type qualitative reliability analysis methods, such as failure mode and effects analysis (FMEA), are described, as well as software fault tree analysis. Safety analysis based on Petri nets is discussed. The most essential concepts and models of quantitative software reliability analysis are described. The most common software metrics and their combined use with software reliability models are discussed. The application of software reliability models in PSA is evaluated; it is observed that recent software reliability models do not directly produce the estimates needed in PSA. As a result of the study, some recommendations and conclusions are drawn: among them are the need for formal methods in the analysis and development of software-based systems, the applicability of qualitative reliability engineering methods in connection with PSA, and the need to make the requirements for software-based systems and their analyses in the regulatory guides more precise. (orig.). (46 refs., 13 figs., 1 tab.)

  5. A Coupled Calculation Suite for Atucha II Operational Transients Analysis

    Directory of Open Access Journals (Sweden)

    Oscar Mazzantini

    2011-01-01

While more than a decade ago reactor and thermal hydraulic calculations were tedious and often needed a lot of approximations and simplifications that forced the designers to take a very conservative approach, computational resources available nowadays allow engineers to cope with increasingly complex problems in a reasonable time. The use of best-estimate calculations provides tools to justify convenient engineering margins, reduces costs, and maximises economic benefits. In this direction, a suite of coupled best-estimate specific calculation codes was developed to analyse the behaviour of the Atucha II nuclear power plant in Argentina. The developed tool includes three-dimensional spatial neutron kinetics, a channel-level model of the core thermal hydraulics with subcooled boiling correlations, a one-dimensional model of the primary and secondary circuits including pumps, steam generators, heat exchangers, and the turbine with all their associated control loops, and a complete simulation of the reactor control, limitation, and protection system working in closed-loop conditions as a faithful representation of the real power plant. In the present paper, a description of the coupling scheme between the codes involved is given, and some examples of their application to Atucha II are shown.

  6. Software Piracy: An Empirical Analysis

    OpenAIRE

    Gomes, Nicolas

    2014-01-01

Chapter 2 summary: As the devices that use software become more available to the masses, the problem of software piracy increases. Recent theoretical works have attempted to model the phenomenon of software piracy; others have tried to describe empirically the determinants that may explain this phenomenon. The empirical literature in the latter case is still in its infancy. This chapter reviews the theoretical literature focusing on three major models: those dealing with diffusion models, with n...

  7. Software architecture analysis of usability

    OpenAIRE

    Folmer, Eelke

    2005-01-01

    One of the qualities that has received increased attention in recent decades is usability. A software product with poor usability is likely to fail in a highly competitive market; therefore software developing organizations are paying more and more attention to ensuring the usability of their software. However, practice shows that product quality (which includes usability among others) is not that high as it could be. Organizations spend a relative large amount of money and effort on fixing u...

  8. Analysis of Software-Engineering-Processes

    OpenAIRE

    Teichmann, Clemens; Schreiber, Andreas

    2013-01-01

The German Aerospace Center (DLR) is one of the biggest software development facilities in Germany. Its employees create complex software using various development processes. To assure high software quality, innovative software engineering methods and tools need to be incorporated. A current problem in the field of computer science is identifying how effective those methods and tools are at ensuring quality. An analysis of the incorporated processes is needed to determine which parts sup...

  9. Automating Risk Analysis of Software Design Models

    OpenAIRE

    Maxime Frydman; Guifré Ruiz; Elisa Heymann; Eduardo César; Barton P. Miller

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security e...

  10. Analysis strategies and software for geodetic VLBI

    OpenAIRE

    Haas, R.

    2004-01-01

    This article describes the analysis strategies and data analysis software currently used for geodetic VLBI. Today's geodetic observing strategies are briefly presented, and the geodetic VLBI observables and data modeling are briefly discussed. A short overview is given of existing geodetic VLBI software packages and the statistical approaches that are applied. Necessary improvements of today's analysis software are described. Some of the future expectations and goals of geodetic VLBI are presented a...

  11. Integrated Methodology for Software Reliability Analysis

    Directory of Open Access Journals (Sweden)

    Marian Pompiliu CRISTESCU

    2012-01-01

    Full Text Available The techniques most used to ensure the safety and reliability of systems are applied together as a whole, and in most cases the software components are overlooked or analyzed too little. The present paper describes the applicability of fault tree analysis to software systems, an analysis known as Software Fault Tree Analysis (SFTA). The fault trees are evaluated using binary decision diagrams, all of these being integrated and used with the help of a Java reliability library.
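
    To make the gate arithmetic behind SFTA concrete, here is a minimal sketch (in Python rather than the paper's Java, and without the binary decision diagrams a real tool would build) that evaluates the top-event probability of a small fault tree with independent basic events; for independent events the closed-form gate formulas agree with the BDD result.

    ```python
    # Illustrative sketch, not the paper's implementation: top-event
    # probability of a software fault tree with independent basic events.
    from math import prod

    def p_and(children):
        """AND gate: the event occurs only if all children occur."""
        return prod(children)

    def p_or(children):
        """OR gate: the event occurs if at least one child occurs."""
        return 1.0 - prod(1.0 - p for p in children)

    # Hypothetical tree: top = OR(AND(input check fails, bad data arrives),
    #                             exception handler crashes)
    p_top = p_or([p_and([1e-3, 5e-2]), 1e-4])
    print(f"top event probability = {p_top:.2e}")
    ```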

  12. Software architecture analysis of usability

    NARCIS (Netherlands)

    Folmer, Eelke

    2005-01-01

    One of the qualities that has received increased attention in recent decades is usability. A software product with poor usability is likely to fail in a highly competitive market; therefore software developing organizations are paying more and more attention to ensuring the usability of their software

  13. Improving Software Quality through Program Analysis

    International Nuclear Information System (INIS)

    In this paper, we present the Program Analysis Framework (PAF) to analyze the software architecture and software modularity of large software packages using techniques from Aspect Mining. The basic idea behind PAF is first to record the call-relationship information among the important elements and then to use different analysis algorithms to find, from this recorded information, the crosscutting concerns which could destroy the modularity of the software. We evaluate our framework by analyzing DATE, the ALICE Data-Acquisition (DAQ) software which handles the data flow from the detector electronics to the permanent storage archiving. The analysis results prove the effectiveness and efficiency of our framework. PAF has pinpointed a number of possible optimizations which could be applied and help maximize the software quality. PAF could also be used for the analysis of other projects written in the C language.
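
    A common aspect-mining heuristic of the kind PAF builds on is to record caller/callee pairs and flag elements with unusually high fan-in as candidate crosscutting concerns. The sketch below is an assumed, simplified rendering of that idea, not PAF itself; the function names and threshold are invented for illustration.

    ```python
    # Assumed sketch of fan-in analysis over recorded call relationships;
    # elements called from many unrelated places are crosscutting candidates.
    from collections import defaultdict

    calls = [  # (caller, callee) pairs, e.g. recorded from a C code base
        ("read_event", "log"), ("decode", "log"), ("store", "log"),
        ("read_event", "decode"), ("decode", "store"), ("store", "flush"),
    ]

    fan_in = defaultdict(set)
    for caller, callee in calls:
        fan_in[callee].add(caller)

    THRESHOLD = 3  # tune to project size
    for fn, callers in sorted(fan_in.items(), key=lambda kv: -len(kv[1])):
        if len(callers) >= THRESHOLD:
            print(f"possible crosscutting concern: {fn}, called from {sorted(callers)}")
    ```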

  14. Safety Analysis of an Evolving Software Architecture

    OpenAIRE

    de Lemos, Rogério

    2000-01-01

    The safety analysis of an evolving software system has to consider the impact that changes might have on the software components, and to provide confidence that the risk is acceptable. If the impact of a change is not thoroughly analysed, accidents can occur as a result of faulty interactions between components, for example. However, the process of safety analysis can be enhanced if appropriate abstractions are provided for modelling and analysing software components and their interactions. I...

  15. Analysis of open source GIS software

    OpenAIRE

    Božnis, Andrius

    2006-01-01

    GIS is one of the most promising spheres of information technology. GIS combines digital image analysis and database systems, which makes it widely applicable and demanding of very high skills. There is a lot of commercial GIS software which is well advertised and whose functionality is fairly well known, while open source software is forgotten. In this diploma work, an analysis is made of the open source GIS software available on the Internet, in the scope of different projects interr...

  16. Hazard Analysis for the Mark III Space Suit Assembly (SSA) Used in One-g Operations

    Science.gov (United States)

    Mitchell, Kate; Ross, Amy; Blanco, Raul; Wood, Art

    2012-01-01

    This Hazard Analysis document encompasses the Mark III Space Suit Assembly (SSA) and associated ancillary equipment. It has been prepared using JSC17773, "Preparing Hazard Analyses for JSC Ground Operation", as a guide. The purpose of this document is to present the potential hazards involved in ground (23 % maximum O2, One-g) operations of the Mark III and associated ancillary support equipment system. The hazards listed in this document are specific to suit operations only; each supporting facility (Bldg. 9, etc.) is responsible for test specific Hazard Analyses. A "hazard" is defined as any condition that has the potential for harming personnel or equipment. This analysis was performed to document the safety aspects associated with manned use of the Mark III for pressurized and unpressurized ambient, ground-based, One-g human testing. The hazards identified herein represent generic hazards inherent to all standard JSC test venues for nominal ground test configurations. Non-standard test venues or test specific configurations may warrant consideration of additional hazards analysis prior to test. The cognizant suit engineer is responsible for the safety of the astronaut/test subject, space suit, and suit support personnel. The test requester, for the test supported by the suit test engineer and suited subject, is responsible for overall safety and any necessary Test Readiness Reviews (TRR).

  17. DSN Data Visualization Suite

    Science.gov (United States)

    Bui, Bach X.; Malhotra, Mark R.; Kim, Richard M.

    2009-01-01

    The DSN Data Visualization Suite is a set of computer programs and reusable Application Programming Interfaces (APIs) that assist in the visualization and analysis of Deep Space Network (DSN) spacecraft-tracking data, which can include predicted and actual values of downlink frequencies, uplink frequencies, and antenna-pointing angles in various formats that can include tables of values and polynomial coefficients. The data can also include lists of antenna-pointing events, lists of antenna-limit events, and schedules of tracking activities. To date, analysis and correlation of these intricately related data before and after tracking have been difficult and time-consuming. The DSN Data Visualization Suite enables operators to quickly diagnose tracking-data problems before, during, and after tracking. The Suite provides interpolation on demand and plotting of DSN tracking data, correlation of all data on a given temporal point, and display of data with color coding configurable by users. The suite thereby enables rapid analysis of the data prior to transmission of the data to DSN control centers. At the control centers, the same suite enables operators to validate the data before committing the data to DSN subsystems. This software is also Web-enabled to afford its capabilities to international space agencies.

  18. Joint optimization of algorithmic suites for EEG analysis.

    Science.gov (United States)

    Santana, Eder; Brockmeier, Austin J; Principe, Jose C

    2014-01-01

    Electroencephalogram (EEG) data analysis algorithms consist of multiple processing steps each with a number of free parameters. A joint optimization methodology can be used as a wrapper to fine-tune these parameters for the patient or application. This approach is inspired by deep learning neural network models, but differs because the processing layers for EEG are heterogeneous with different approaches used for processing space and time. Nonetheless, we treat the processing stages as a neural network and apply backpropagation to jointly optimize the parameters. This approach outperforms previous results on the BCI Competition II - dataset IV; additionally, it outperforms the common spatial patterns (CSP) algorithm on the BCI Competition III dataset IV. In addition, the optimized parameters in the architecture are still interpretable. PMID:25570621
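
    The core idea, treating a heterogeneous pipeline as one differentiable model and backpropagating through every stage, can be shown with a toy two-stage example. This sketch is assumed for illustration only (a spatial filter followed by a temporal filter, jointly fitted by gradient descent); it is far simpler than the paper's EEG architecture.

    ```python
    # Toy sketch: jointly optimize a spatial filter w and a temporal filter h
    # of a two-stage pipeline by gradient descent on a squared-error loss.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 8, 50))           # trials x channels x samples
    w_true, h_true = rng.normal(size=8), rng.normal(size=50)
    y = np.sign(np.einsum("ict,c->it", X, w_true) @ h_true)  # synthetic labels

    w = rng.normal(size=8) * 0.1                # stage 1: spatial weights
    h = rng.normal(size=50) * 0.1               # stage 2: temporal weights
    lr = 1e-3
    for _ in range(500):
        s = np.einsum("ict,c->it", X, w)        # spatial filtering
        err = s @ h - y                         # prediction error
        grad_h = s.T @ err / len(y)             # backprop into stage 2
        grad_w = np.einsum("ict,t,i->c", X, h, err) / len(y)  # ... and stage 1
        w -= lr * grad_w
        h -= lr * grad_h
    print("final loss:", float(np.mean((np.einsum("ict,c->it", X, w) @ h - y) ** 2)))
    ```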

  19. Development of integrated transport analysis suite for LHD plasmas towards transport model validation and increased predictability

    International Nuclear Information System (INIS)

    In this study, the integrated transport analysis suite, TASK3D-a, was developed to enhance the physics understanding and accurate discussion of the Large Helical Device (LHD) experiment toward facilitating transport model validation. Steady-state and dynamic (transient) transport analyses of NBI (neutral-beam-injection)-heated LHD plasmas have been greatly facilitated by this suite. This will increase the predictability of the transport properties of LHD plasmas toward reactor-relevant regimes and reactor-scale plasmas. (author)

  20. WRF model performance analysis for a suite of simulation design

    Science.gov (United States)

    Mohan, Manju; Sati, Ankur Prabhat

    2016-03-01

    At present, scientists successfully use Numerical Weather Prediction (NWP) models to achieve reliable forecasts. Nested domains with varying grid ratios are preferred by the modelling community and have wide applications. The impact of the nesting grid ratio (NGR) on model performance needs systematic analysis and is explored in the present study. WRF is mostly used as a mesoscale model to simulate either extreme events or events of short duration, with statistical model evaluation performed for a correspondingly short period of time; the influence of the simulation period on model performance has therefore been examined for key meteorological parameters. Several earlier works on episodes involve model implementation for longer durations, for which a single simulation is often performed over a continuous stretch. This study scrutinizes the influence on model performance of one single simulation versus several smaller simulations of the same total duration, essentially splitting the run time. In the present study, the surface wind (i.e., wind at 10 meters), temperature, and relative humidity at 2 meters obtained from model simulations are compared with observations. Sensitivity to the nesting grid ratio, to continuous versus smaller split simulations, and to a realistic simulation period is examined. It is found that there is no statistically significant difference in the simulated results on changing the nesting grid ratio, while the smaller time-split schemes (2-day and 4-day schemes compared with 8-day and 16-day continuous runs) improve the results significantly. The impact of an increasing number of observations from different sites on model performance is also scrutinised. Furthermore, a conceptual framework is provided for the optimum time period of simulations needed to have confidence in statistical model evaluation.
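
    The statistical model evaluation referred to above typically reduces to a handful of agreement measures between simulated and observed series. As an assumed illustration (not the paper's code), the sketch below computes mean bias, RMSE, and Willmott's index of agreement for a short, hypothetical 10 m wind-speed sample.

    ```python
    # Assumed sketch of standard model-evaluation statistics for comparing
    # WRF output with observations: mean bias, RMSE, index of agreement.
    import numpy as np

    def evaluate(sim, obs):
        sim, obs = np.asarray(sim, float), np.asarray(obs, float)
        bias = np.mean(sim - obs)
        rmse = np.sqrt(np.mean((sim - obs) ** 2))
        # Willmott's index of agreement: 1.0 means perfect agreement.
        denom = np.sum((np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
        ioa = 1.0 - np.sum((sim - obs) ** 2) / denom
        return bias, rmse, ioa

    sim = [3.2, 4.1, 5.0, 4.4, 3.9]   # hypothetical simulated 10 m wind (m/s)
    obs = [3.0, 4.5, 4.6, 4.9, 3.5]   # corresponding observations (m/s)
    print("bias=%.2f  RMSE=%.2f  IOA=%.2f" % evaluate(sim, obs))
    ```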

  1. Plutonium and uranium isotopic analysis: recent developments of the MGA++ code suite

    Energy Technology Data Exchange (ETDEWEB)

    Buckley, W; Clark, D; Parker, W E; Romine, W; Ruhter, W; Wang, T F

    1999-09-17

    The Lawrence Livermore National Laboratory develops sophisticated gamma-ray analysis codes for isotopic determinations of nuclear materials based on the principles of MultiGroup Analysis (MGA). The MGA methodology has been upgraded and expanded and now comprises a suite of codes known as MGA++. A graphical user interface has also been developed for viewing the data and the fitting procedure. The code suite provides plutonium and uranium isotopic analysis for data collected with high-purity germanium planar and/or coaxial detector systems. The most recent addition to the MGA++ code suite, MGAHI, analyzes Pu data using higher-energy gamma rays (200 keV and higher) and is particularly useful for Pu samples that are enclosed in thick-walled containers. Additionally, the code suite can perform isotopic analysis of uranium spectra collected with cadmium-zinc-telluride (CZT) detectors. We are currently developing new codes which will integrate into the MGA++ suite. These will include Pu isotopic analysis capabilities for data collected with CZT detectors, and U isotopic analysis with high-purity germanium detectors which utilizes only higher-energy gamma rays. Future development of MGA++ will include a capability for isotopic analyses of mixtures of Pu and U.

  2. Numerical methods in software and analysis

    CERN Document Server

    Rice, John R

    1992-01-01

    Numerical Methods, Software, and Analysis, Second Edition introduces science and engineering students to the methods, tools, and ideas of numerical computation. Introductory courses in numerical methods face a fundamental problem-there is too little time to learn too much. This text solves that problem by using high-quality mathematical software. In fact, the objective of the text is to present scientific problem solving using standard mathematical software. This book discusses numerous programs and software packages focusing on the IMSL library (including the PROTRAN system) and ACM Algorithm

  3. Software Design for Smile Analysis

    OpenAIRE

    Sarkhosh, A.; Noroozi, H.; D. Gholami Borujeni; Rafatjoo, R.; Sodagar, A.

    2010-01-01

    Introduction: Esthetics and attractiveness of the smile is one of the major demands in contemporary orthodontic treatment. In order to improve a smile design, it is necessary to record a “posed smile” as an intentional, non-pressured, static, natural and reproducible smile. The record should then be analyzed to determine its characteristics. In this study, we intended to design and introduce software to analyze the smile rapidly and precisely in order to produce an attractive smile for the pat...

  4. GRACAT, Software for grounding and collision analysis

    DEFF Research Database (Denmark)

    Friis-Hansen, Peter; Simonsen, Bo Cerup

    2002-01-01

    From 1998 to 2001 an integrated software package for grounding and collision analysis was developed at the Technical University of Denmark within the ISESO project at the cost of six man-years (0.75M US$). The software provides a toolbox for a multitude of analyses related to collision and grounding accidents. The software consists of three basic analysis modules and one risk mitigation module: 1) frequency, 2) damage, and 3) consequence. These modules can be used individually or in series, and the analyses can be performed in deterministic or probabilistic mode. Finally, in the mitigation module one can analyse a given route, where the result is the probability density functions for the cost of oil outflow in a given area per year for the two vessels. In this paper we describe the basic modelling principles and the capabilities of the software package. The software package can be downloaded for research purposes from...

  5. Software safety analysis practice in installation phase

    International Nuclear Information System (INIS)

    This work performed a software safety analysis in the installation phase of the Lungmen nuclear power plant in Taiwan, in cooperation with the Institute of Nuclear Energy Research and TPC. The US Nuclear Regulatory Commission requires licensees to perform software safety analysis and software verification and validation in each phase of the software development life cycle, per Branch Technical Position 7-14. In this work, 37 safety-grade digital instrumentation and control systems were analyzed by failure mode and effects analysis, as suggested by IEEE standard 7-4.3.2-2003. During the installation phase, skew tests for the safety-grade network and point-to-point tests were performed. The failure mode and effects analysis showed that all single failure modes can be resolved by redundant means. Most of the common mode failures can be resolved by operator manual actions. (Author)

  6. ERP Software Evaluation and Comparative Analysis

    OpenAIRE

    Kalpic, Damir; Fertalj, Kresimir

    2004-01-01

    This paper presents the results of an investigation performed in 2001 under the title Comparative Analysis of Information Systems Software in Croatia. The focus was set on the comparative analysis of domestic and foreign Enterprise Resource Planning (ERP) software, which is present in Croatia. The investigation was performed from the standpoint of ERP applicability, regardless of the development methods and information technology. In other words, the evaluation was performed primarily from th...

  7. MathWeb: a concurrent image analysis tool suite for multispectral data fusion

    Science.gov (United States)

    Achalakul, Tiranee; Haaland, Peter D.; Taylor, Stephen

    1999-03-01

    This paper describes a preliminary approach to the fusion of multi-spectral image data for the analysis of cervical cancer. The long-term goal of this research is to define spectral signatures and automatically detect cancer cell structures. The approach combines a multi-spectral microscope with an image analysis tool suite, MathWeb. The tool suite incorporates a concurrent Principal Component Transform (PCT) that is used to fuse the multi-spectral data. This paper describes the general approach and the concurrent PCT algorithm. The algorithm is evaluated from both the perspective of image quality and performance scalability.
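
    The Principal Component Transform at the heart of the tool suite projects the spectral bands onto their covariance eigenvectors, so the first component concentrates most of the shared variance. A minimal serial sketch of such a fusion step (assumed; MathWeb's version is concurrent and more elaborate) is shown below.

    ```python
    # Sketch of PCT-based fusion: keep the first principal component of the
    # band-to-band covariance as the fused image.
    import numpy as np

    def pct_fuse(bands):
        """bands: array of shape (n_bands, height, width)."""
        n, h, w = bands.shape
        x = bands.reshape(n, -1).astype(float)
        x -= x.mean(axis=1, keepdims=True)          # center each band
        cov = x @ x.T / (x.shape[1] - 1)            # n x n band covariance
        _, eigvecs = np.linalg.eigh(cov)            # ascending eigenvalues
        pc1 = eigvecs[:, -1] @ x                    # first principal component
        return pc1.reshape(h, w)

    fused = pct_fuse(np.random.rand(8, 64, 64))     # 8 hypothetical bands
    print(fused.shape)                              # (64, 64)
    ```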

  8. Software acquisition: a business strategy analysis

    OpenAIRE

    Farbey, B.; Finkelstein, A.

    2001-01-01

    The paper argues that there are new insights to be gained from a strategic analysis of requirements engineering. The paper is motivated by a simple question: what does it take to be a world class software acquirer? The question has relevance for requirements engineers because for many organisations market pressures mean that software is commonly acquired rather than developed from scratch. The paper builds on the work of C. H. Fine (1998) who suggests that product, process and supply chain sh...

  9. A 'Toolbox' Equivalent Process for Safety Analysis Software

    International Nuclear Information System (INIS)

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 (Quality Assurance for Safety-Related Software) identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls that prevent or mitigate potential accidents. The development and maintenance of a collection, or 'toolbox', of multiple-site use, standard solution, Software Quality Assurance (SQA)-compliant safety software is one of the major improvements identified in the associated DOE Implementation Plan (IP). The DOE safety analysis toolbox will contain a set of appropriately quality-assured, configuration-controlled, safety analysis codes, recognized for DOE-broad, safety basis applications. Currently, six widely applied safety analysis computer codes have been designated for toolbox consideration. While the toolbox concept considerably reduces SQA burdens among DOE users of these codes, many users of unique, single-purpose, or single-site software may still have sufficient technical justification to continue use of their computer code of choice, but are thwarted by the multiple-site condition on toolbox candidate software. The process discussed here provides a roadmap for an equivalency argument, i.e., establishing satisfactory SQA credentials for single-site software that can be deemed 'toolbox-equivalent'. The process is based on the model established to meet IP Commitment 4.2.1.2: Establish SQA criteria for the safety analysis 'toolbox' codes. The implementing criteria that establish the set of prescriptive SQA requirements are based on implementation plan/procedures from the Savannah River Site, also incorporating aspects of those from the Waste Isolation Pilot Plant (SNL component) and the Yucca Mountain Project. The major requirements are met with evidence of a software quality assurance plan, software requirements and design documentation, user's instructions, test report, a

  10. Acoustic Emission Analysis Applet (AEAA) Software

    Science.gov (United States)

    Nichols, Charles T.; Roth, Don J.

    2013-01-01

    NASA Glenn Research and NASA White Sands Test Facility have developed software supporting an automated pressure vessel structural health monitoring (SHM) system based on acoustic emissions (AE). The software, referred to as the Acoustic Emission Analysis Applet (AEAA), provides analysts with a tool that can interrogate data collected on Digital Wave Corp. and Physical Acoustics Corp. software using a wide spectrum of powerful filters and charts. This software can be made to work with any data once the data format is known. The applet will compute basic AE statistics, and statistics as a function of time and pressure (see figure). AEAA provides value added beyond the analysis provided by the respective vendors' analysis software. The software can handle data sets of unlimited size. A wide variety of government and commercial applications could benefit from this technology, notably requalification and usage tests for compressed gas and hydrogen-fueled vehicles. Future enhancements will add features similar to a "check engine" light on a vehicle. Once installed, the system will ultimately be used to alert International Space Station crewmembers to critical structural instabilities, but will have little impact to missions otherwise. Diagnostic information could then be transmitted to experienced technicians on the ground in a timely manner to determine whether pressure vessels have been impacted, are structurally unsound, or can be safely used to complete the mission.

  11. A methodology, based on a language's properties, for the selection and validation of a suite of software metrics.

    OpenAIRE

    Bodnar, Roger P. Jr.

    1997-01-01

    Software Engineering has attempted to improve the software development process for over two decades. A primary attempt at this process lies in the arena of measurement. "You can't control what you can't measure" [DEMT82]. This thesis attempts to measure the development of multimedia products. Multimedia languages seem to be the trend of future languages. Problem areas such as Education, Instruction, Training, and Information Systems require that various media allow the achievement of suc...

  12. What characteristics are suited to help choosing traditional or agile project management methods for software development projects?

    OpenAIRE

    Paykina, Ekaterina; Zhou, Li

    2012-01-01

    Nowadays, the nature of projects has changed to be unique, uncertain, ambiguous, complex and innovative. It becomes hard to plan project progress in advance, as deviations from plans and unpredictable changes occur more frequently. This can be specifically observed in the software development industry, which needs to constantly meet customers' rapidly changing requirements. Traditionally, software projects are developed through a plan-driven approach which emphasizes an overall project plan ...

  13. The SPOCA-suite: a software for extraction and tracking of Active Regions and Coronal Holes on EUV images

    CERN Document Server

    Delouille, Véronique; Verbeeck, Cis; de Visscher, Ruben

    2012-01-01

    Precise localisation and characterization of active regions and coronal holes as observed by EUV imagers are crucial for a wide range of solar and helio-physics studies. We describe a segmentation procedure, the SPoCA-suite, that produces catalogs of Active Regions (AR) and Coronal Holes (CH) on SDO-AIA images. The method builds upon our previous work on the 'Spatial Possibilistic Clustering Algorithm' (SPoCA) and substantially improves it in several ways. The SPoCA-suite is applied in near real time on the AIA archive and produces entries into the AR and CH catalogs of the Heliophysics Event Knowledgebase (HEK) every four hours. We give an illustration of the use of SPoCA for determination of the CH filling factors. This report is intended as a reference guide for the users of SPoCA output.

  14. The SPoCA-suite: Software for extraction, characterization, and tracking of active regions and coronal holes on EUV images

    Science.gov (United States)

    Verbeeck, C.; Delouille, V.; Mampaey, B.; De Visscher, R.

    2014-01-01

    Context. Precise localization and characterization of active regions (AR) and coronal holes (CH) as observed by extreme ultraviolet (EUV) imagers are crucial for a wide range of solar and helio-physics studies. Aims: We introduce a set of segmentation procedures (known as the SPoCA-suite) that allows one to retrieve AR and CH properties on EUV images taken from SOHO-EIT, STEREO-EUVI, PROBA2-SWAP, and SDO-AIA. Methods: We build upon our previous work on the Spatial Possibilistic Clustering Algorithm (SPoCA), which we have improved substantially in several ways. Results: We apply our algorithm on the synoptic EIT archive from 1997 to 2011 and decompose this dataset into regions that can clearly be identified as AR, quiet Sun, and CH. An antiphase between AR and CH filling factor is observed, as expected. The SPoCA-suite is next applied to datasets from EUVI, SWAP, and AIA. The time series pertaining to ARs or CHs are presented. Conclusions: The SPoCA-suite enables the extraction of several long time series of AR and CH properties from the data files of EUV imagers and also allows tracking individual ARs or CHs over time. For AIA images, AR and CH catalogs are available in near-real time from the Heliophysics Events Knowledgebase. The full code, which allows processing any EUV images, is available upon request to the authors.
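
    The clustering step underlying SPoCA can be illustrated with plain fuzzy C-means on pixel intensities, a simplified relative of the possibilistic algorithm actually used (which adds preprocessing and spatial constraints not shown here). The data below are synthetic.

    ```python
    # Simplified sketch of intensity-based fuzzy C-means segmentation into
    # three classes (CH, quiet Sun, AR by increasing intensity).
    import numpy as np

    def fuzzy_cmeans(x, c=3, m=2.0, iters=50, seed=0):
        rng = np.random.default_rng(seed)
        centers = rng.choice(x, size=c)
        for _ in range(iters):
            d = np.abs(x[:, None] - centers[None, :]) + 1e-12   # distances
            u = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1)), axis=2)
            centers = (u ** m).T @ x / (u ** m).sum(axis=0)     # weighted means
        return centers, u

    # Synthetic pixel intensities drawn from three overlapping populations.
    pixels = np.concatenate([np.random.gamma(k, 50.0, 1000) for k in (1, 4, 12)])
    centers, memberships = fuzzy_cmeans(pixels)
    print("class centers:", np.sort(centers).round(1))
    ```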

  15. Software abstractions logic, language, and analysis

    CERN Document Server

    Jackson, Daniel

    2011-01-01

    In Software Abstractions Daniel Jackson introduces an approach to software design that draws on traditional formal methods but exploits automated tools to find flaws as early as possible. This approach--which Jackson calls "lightweight formal methods" or "agile modeling"--takes from formal specification the idea of a precise and expressive notation based on a tiny core of simple and robust concepts but replaces conventional analysis based on theorem proving with a fully automated analysis that gives designers immediate feedback. Jackson has developed Alloy, a language that captures the essence of software abstractions simply and succinctly, using a minimal toolkit of mathematical notions. This revised edition updates the text, examples, and appendixes to be fully compatible with the latest version of Alloy (Alloy 4). The designer can use automated analysis not only to correct errors but also to make models that are more precise and elegant. This approach, Jackson says, can rescue designers from "the tarpit of...

  16. Power and performance software analysis and optimization

    CERN Document Server

    Kukunas, Jim

    2015-01-01

    Power and Performance: Software Analysis and Optimization is a guide to solving performance problems in modern Linux systems. Power-efficient chips are no help if the software those chips run on is inefficient. Starting with the necessary architectural background as a foundation, the book demonstrates the proper usage of performance analysis tools in order to pinpoint the cause of performance problems, and includes best practices for handling common performance issues those tools identify. Provides expert perspective from a key member of Intel's optimization team on how processors and memory

  17. DDS-Suite - A Dynamic Data Acquisition, Processing, and Analysis System for Wind Tunnel Testing

    Science.gov (United States)

    Burnside, Jathan J.

    2012-01-01

    Wind tunnels have optimized their steady-state data systems for acquisition and analysis and have even implemented large dynamic-data acquisition systems; however, development of near real-time processing and analysis tools for dynamic data has lagged. DDS-Suite is a set of tools used to acquire, process, and analyze large amounts of dynamic data. Each phase of the testing process (acquisition, processing, and analysis) is handled by separate components, so that bottlenecks in one phase of the process do not affect the others, leading to a robust system. DDS-Suite is capable of acquiring 672 channels of dynamic data at a rate of 275 MB/s. More than 300 channels of the system use 24-bit analog-to-digital cards and are capable of producing data with less than 0.01 of phase difference at 1 kHz. System architecture, design philosophy, and examples of use during NASA Constellation and Fundamental Aerodynamic tests are discussed.

  18. Modelling and Evaluating Software Project Risks with Quantitative Analysis Techniques in Planning Software Development

    OpenAIRE

    Elzamly, Abdelrafe; Hussin, Burairah

    2015-01-01

    Risk is not always avoidable, but it is controllable. The aim of this paper is to present new techniques which use stepwise regression analysis to model and evaluate the risks in planning software development and to reduce risk with software process improvement. The top ten software risk factors in the planning phase of software development and thirty control factors were presented to respondents. This study incorporates a risk management approach and planning software development to mitigate software p...

  19. Objective facial photograph analysis using imaging software.

    Science.gov (United States)

    Pham, Annette M; Tollefson, Travis T

    2010-05-01

    Facial analysis is an integral part of the surgical planning process. Clinical photography has long been an invaluable tool in the surgeon's practice not only for accurate facial analysis but also for enhancing communication between the patient and surgeon, for evaluating postoperative results, for medicolegal documentation, and for educational and teaching opportunities. From 35-mm slide film to the digital technology of today, clinical photography has benefited greatly from technological advances. With the development of computer imaging software, objective facial analysis becomes easier to perform and less time consuming. Thus, while the original purpose of facial analysis remains the same, the process becomes much more efficient and allows for some objectivity. Although clinical judgment and artistry of technique is never compromised, the ability to perform objective facial photograph analysis using imaging software may become the standard in facial plastic surgery practices in the future. PMID:20511080

  20. Software for computerised analysis of cardiotocographic traces.

    Science.gov (United States)

    Romano, M; Bifulco, P; Ruffo, M; Improta, G; Clemente, F; Cesarelli, M

    2016-02-01

    Despite the widespread use of cardiotocography in foetal monitoring, the evaluation of foetal status suffers from considerable inter- and intra-observer variability. In order to overcome the main limitations of visual cardiotocographic assessment, computerised methods to analyse cardiotocographic recordings have recently been developed. In this study, a new software package for automated analysis of foetal heart rate is presented. It provides an automatic procedure for measuring the most relevant parameters derivable from cardiotocographic traces. Simulated and real cardiotocographic traces were analysed to test software reliability. In artificial traces, we simulated a set number of events (accelerations, decelerations and contractions) to be recognised. In the case of real signals, results of the computerised analysis were compared with the visual assessment performed by 18 expert clinicians, and three performance indexes were computed to gain information about the performance of the proposed software. The software showed preliminary performance we judged satisfactory, in that the results matched the requirements completely, as proved by tests on artificial signals in which all simulated events were detected by the software. Performance indexes computed in comparison with the obstetricians' evaluations are, on the contrary, less satisfactory; they yielded the following values of the statistical parameters: sensitivity equal to 93%, positive predictive value equal to 82% and accuracy equal to 77%. Very probably this arises from the high variability of trace annotation carried out by clinicians. PMID:26638805
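
    The three performance indexes quoted above follow directly from the counts of agreements and disagreements between the software's detections and the clinicians' annotations. The sketch below shows the standard definitions; the counts are invented for illustration and are not the study's data.

    ```python
    # Standard definitions of the reported indexes, with illustrative counts.
    def performance(tp, fp, fn, tn):
        sensitivity = tp / (tp + fn)          # fraction of true events detected
        ppv = tp / (tp + fp)                  # fraction of detections that were real
        accuracy = (tp + tn) / (tp + fp + fn + tn)
        return sensitivity, ppv, accuracy

    sens, ppv, acc = performance(tp=93, fp=20, fn=7, tn=40)
    print(f"sensitivity={sens:.0%}  PPV={ppv:.0%}  accuracy={acc:.0%}")
    ```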

  1. Development of Advanced Suite of Deterministic Codes for VHTR Physics Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kang Seog; Cho, J. Y.; Lee, K. H. (and others)

    2007-07-15

    An advanced suite of deterministic codes for VHTR physics analysis has been developed for detailed analysis of current and advanced reactor designs as part of a US-ROK collaborative I-NERI project. The code suite includes the conventional 2-step procedure, in which few-group constants are generated by a transport lattice calculation and the reactor physics analysis is performed by a 3-dimensional diffusion calculation, and a whole-core transport code that can model local heterogeneities directly at the core level. Particular modeling issues in the physics analysis of gas-cooled VHTRs were resolved, including the double heterogeneity of the coated fuel particles, neutron streaming in the coolant channels, a strong core-reflector interaction, and large spectrum shifts due to changes of the surrounding environment, temperature and burnup. The geometry handling capability of the DeCART code was also extended to deal with the hexagonal fuel elements of the VHTR core. The developed code suite was validated and verified by comparing the computational results with those of Monte Carlo calculations for the benchmark problems.

  2. Advanced Software Methods for Physics Analysis

    International Nuclear Information System (INIS)

    Unprecedented data analysis complexity is experienced in modern High Energy Physics experiments. The complexity arises from the growing size of recorded data samples, the large number of data analyses performed by different users in each single experiment, and the level of complexity of each single analysis. For this reason, software for data analysis must meet a very high level of reliability. We present two concrete examples: the first from the BaBar experience with the migration to a new Analysis Model with the definition of a new model for the Event Data Store, the second a toolkit for multivariate statistical and parametric Monte Carlo analysis developed using generic programming

  3. Software for analysis of visual meteor data

    Science.gov (United States)

    Veljković, Kristina; Ivanović, Ilija

    2014-02-01

    In this paper, we will present new software for analysis of IMO data collected from visual observations. The software consists of a package of functions written in the statistical programming language R, as well as a Java application which uses these functions in a user friendly environment. R code contains various filters for selection of data, methods for calculation of Zenithal Hourly Rate (ZHR), solar longitude, population index and graphical representation of ZHR and distribution of observed magnitudes. The Java application allows everyone to use these functions without any knowledge of R. Both R code and the Java application are open source and free with user manuals and examples provided.
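
    The central quantity such packages compute is the Zenithal Hourly Rate. The paper's code is in R; the Python sketch below only illustrates the standard IMO formula ZHR = N * F * r^(6.5 - lm) / (Teff * sin(hR)), with invented observation values.

    ```python
    # Sketch of the standard ZHR computation for one observing interval.
    import math

    def zhr(n_meteors, t_eff_hours, lm, radiant_alt_deg, r=2.5, field_corr=1.0):
        """ZHR = N * F * r**(6.5 - lm) / (Teff * sin(h_radiant))."""
        correction = field_corr * r ** (6.5 - lm)
        return n_meteors * correction / (
            t_eff_hours * math.sin(math.radians(radiant_alt_deg)))

    # One observer: 24 meteors in 1.5 h, limiting magnitude 5.9, radiant at 55 deg.
    print(f"ZHR = {zhr(24, 1.5, 5.9, 55):.0f}")
    ```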

  4. Platform Independent Dynamic Java Virtual Machine Analysis: the Java Grande Forum Benchmark Suite

    OpenAIRE

    Daly, Charles; Horgan, Jane; Power, James; Waldron, John

    2001-01-01

    In this paper we present a platform independent analysis of the dynamic profiles of Java programs when executing on the Java Virtual Machine. The Java programs selected are taken from the Java Grande Forum benchmark suite, and five different Java-to-bytecode compilers are analysed. The results presented describe the dynamic instruction usage frequencies, as well as the sizes of the local variable, parameter and operand stacks during execution on the JVM. These results,...

  5. Towards software analysis as a service

    OpenAIRE

    Ghezzi, G; H.C. Gall

    2008-01-01

    Throughout the years software engineers have come up with a myriad of specialized tools and techniques that focus on a certain type of analysis, such as metrics extraction, evolution tracking, co-change detection, bug prediction, all the way up to social network analysis of team dynamics. However, easy and straightforward synergies between these analyses/tools rarely exist because of their stand-alone nature, their platform dependence, their different input and output formats and the vari...

  6. Improving Software Systems By Flow Control Analysis

    Directory of Open Access Journals (Sweden)

    Piotr Poznanski

    2012-01-01

    Full Text Available Using agile methods during the implementation of a system that meets mission-critical requirements can be a real challenge. Change in a system built of dozens or even hundreds of specialized devices with embedded software requires the cooperation of a large group of engineers. This article presents a solution that supports the parallel work of groups of system analysts and software developers. Applying formal rules to requirements written in natural language enables formal analysis of artifacts that form a bridge between software and system requirements. The formalism and textual form of the requirements allowed the automatic generation of a message flow graph for the (sub)system, called the “big-picture-model”. Flow diagram analysis helped to avoid a large number of defects whose repair cost, in extreme cases, could undermine the legitimacy of agile methods in projects of this scale. Retrospectively, a reduction of technical debt was observed. Continuous analysis of the “big picture model” improves the control of the quality parameters of the software architecture. The article also tries to explain why a commercial platform based on the UML modeling language may not be sufficient in projects of this complexity.

  7. Residence time distribution software analysis. User's manual

    International Nuclear Information System (INIS)

    Radiotracer applications cover a wide range of industrial activities in chemical and metallurgical processes, water treatment, mineral processing, environmental protection and civil engineering. Experiment design, data acquisition, treatment and interpretation are the basic elements of tracer methodology. The application of radiotracers to determine the impulse response, i.e. the residence time distribution (RTD), as well as the technical conditions for conducting experiments in industry and in the environment, creates a need for data processing using special software. Important progress has been made during recent years in the preparation of software programs for data treatment and interpretation. The software package developed for industrial process analysis and diagnosis by the stimulus-response methods contains all the methods for data processing for radiotracer experiments

  8. Automating Risk Analysis of Software Design Models

    Directory of Open Access Journals (Sweden)

    Maxime Frydman

    2014-01-01

    Full Text Available The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  9. Automating risk analysis of software design models.

    Science.gov (United States)

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688
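
    The identification-tree idea can be pictured as a set of predicates walked over the elements of a data-flow diagram, each paired with a mitigation suggestion. The sketch below is a conceptual rendering with invented element attributes and rules, not AutSEC's actual data structures.

    ```python
    # Conceptual sketch: match threat-identification rules against data-flow
    # diagram elements and report the associated mitigations.
    dfd = [
        {"type": "flow", "src": "browser", "dst": "api",
         "encrypted": False, "crosses_trust_boundary": True},
        {"type": "store", "name": "user_db",
         "contains_credentials": True, "hashed": False},
    ]

    rules = [
        (lambda e: e["type"] == "flow" and e.get("crosses_trust_boundary")
                   and not e.get("encrypted"),
         "information disclosure on unencrypted flow", "apply TLS to the channel"),
        (lambda e: e["type"] == "store" and e.get("contains_credentials")
                   and not e.get("hashed"),
         "credential theft from data store", "hash credentials with a salted KDF"),
    ]

    for element in dfd:
        for matches, threat, mitigation in rules:
            if matches(element):
                print(f"{threat} -> mitigation: {mitigation}")
    ```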

  10. Calibration of the Quadrupole Mass Spectrometer of the Sample Analysis at Mars Instrument Suite

    Science.gov (United States)

    Mahaffy, P. R.; Trainer, M. G.; Eigenbrode, J. L.; Franz, H. B.; Stern, J. C.; Harpold, D.; Conrad, P. G.; Raaen, E.; Lyness, E.

    2011-01-01

    The SAM suite of instruments on the "Curiosity" Rover of the Mars Science Laboratory (MSL) is designed to provide chemical and isotopic analysis of organic and inorganic volatiles for both atmospheric and solid samples. The mission of the MSL investigations is to advance beyond the successful search for aqueous transformation in surface environments at Mars toward a quantitative assessment of habitability and preservation through a series of chemical and geological measurements. The SAM suite was delivered in December 2010 (Figure 1) to the Jet Propulsion Laboratory for integration into the Curiosity Rover. We previously outlined the range of SAM solid and gas calibrations implemented or planned, and here we discuss a specific set of calibration experiments to establish the response of the SAM Quadrupole Mass Spectrometer (QMS) to the four most abundant gases in the Martian atmosphere: CO2, N2, Ar, and O2. A full SAM instrument description and calibration report is presently in preparation.

  11. Complexity of software trustworthiness and its dynamical statistical analysis methods

    Institute of Scientific and Technical Information of China (English)

    ZHENG ZhiMing; MA ShiLong; LI Wei; JIANG Xin; WEI Wei; MA LiLi; TANG ShaoTing

    2009-01-01

    Developing trusted software has become an important trend and a natural choice in the development of software technology and applications. At present, the method of measurement and assessment of software trustworthiness cannot guarantee safe and reliable operations of software systems completely and effectively. Based on the dynamical system study, this paper interprets the characteristics of behaviors of software systems and the basic scientific problems of software trustworthiness complexity, analyzes the characteristics of complexity of software trustworthiness, and proposes to study the software trustworthiness measurement in terms of the complexity of software trustworthiness. Using dynamical statistical analysis methods, the paper advances an invariant-measure based assessment method of software trustworthiness by statistical indices, and hereby provides a dynamical criterion for the untrustworthiness of software systems. By an example, the feasibility of the proposed dynamical statistical analysis method in software trustworthiness measurement is demonstrated using numerical simulations and theoretical analysis.

  12. A COMPREHENSIVE REVIEW AND ANALYSIS ON OBJECT-ORIENTED SOFTWARE METRICS IN SOFTWARE MEASUREMENT

    OpenAIRE

    K.P. Srinivasan; Dr. T. Devi

    2014-01-01

    Software development is dynamic and always undergoing major change. Today a huge number of tools and methodologies are available for software development, and software development refers to all activities that go into producing an information system solution. System development activities consist of system analysis, modeling, design, implementation, testing and maintenance. Further, the state of software metrics in software development during the last decade is encouraging, and many r...

  13. Peranso - Light Curve and Period Analysis Software

    CERN Document Server

    Paunzen, E

    2016-01-01

    A time series is a sample of observations of well-defined data points obtained through repeated measurements over a certain time range. The analysis of such data samples has become increasingly important not only in natural science but also in many other fields of research. Peranso offers a complete set of powerful light curve and period analysis functions to work with large astronomical data sets. Substantial attention has been given to ease-of-use and data accuracy, making it one of the most productive time series analysis software available. In this paper, we give an introduction to Peranso and its functionality.

  14. Peranso - Light curve and period analysis software

    Science.gov (United States)

    Paunzen, E.; Vanmunster, T.

    2016-03-01

    A time series is a sample of observations of well-defined data points obtained through repeated measurements over a certain time range. The analysis of such data samples has become increasingly important not only in natural science but also in many other fields of research. Peranso offers a complete set of powerful light curve and period analysis functions to work with large astronomical data sets. Substantial attention has been given to ease-of-use and data accuracy, making it one of the most productive time series analysis software available. In this paper, we give an introduction to Peranso and its functionality.
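
    Period searches of the kind Peranso bundles can be reproduced in outline with a standard Lomb-Scargle periodogram; the sketch below (using astropy, on synthetic data) recovers the period of an irregularly sampled sinusoid. Peranso itself offers this among many other period-search methods.

    ```python
    # Sketch: Lomb-Scargle period search on an irregularly sampled light curve.
    import numpy as np
    from astropy.timeseries import LombScargle

    rng = np.random.default_rng(1)
    t = np.sort(rng.uniform(0.0, 30.0, 200))       # observation times (days)
    period_true = 2.7
    y = 0.3 * np.sin(2 * np.pi * t / period_true) + 0.05 * rng.normal(size=200)

    frequency, power = LombScargle(t, y).autopower()
    best_period = 1.0 / frequency[np.argmax(power)]
    print(f"recovered period: {best_period:.2f} d (true: {period_true} d)")
    ```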

  15. Analysis and design for architecture-based software

    Institute of Scientific and Technical Information of China (English)

    Jia Xiaolin; He Jian; Qin Zheng; Wang Xianghua

    2005-01-01

    The technologies of software architecture are introduced, and the software analysis-and-design process is divided into requirements analysis, software architecture design and system design. Using these technologies, a model of an architecture-centric software analysis and design process (ACSADP) is proposed. Meanwhile, with regard to the completeness, consistency and correctness between the software requirements and design results, the theories of function and process control are applied to ACSADP. Finally, a model of an integrated development environment (IDE) for ACSADP is proposed. It can be demonstrated in practice that the model of ACSADP can aid developers in managing the software process effectively and improve the quality of software analysis and design.

  16. Software analysis in the semantic web

    Science.gov (United States)

    Taylor, Joshua; Hall, Robert T.

    2013-05-01

    Many approaches in software analysis, particularly dynamic malware analysis, benefit greatly from the use of linked data and other Semantic Web technology. In this paper, we describe AIS, Inc.'s Semantic Extractor (SemEx) component from the Malware Analysis and Attribution through Genetic Information (MAAGI) effort, funded under DARPA's Cyber Genome program. The SemEx generates OWL-based semantic models of high and low level behaviors in malware samples from system call traces generated by AIS's introspective hypervisor, IntroVirtTM. Within MAAGI, these semantic models were used by modules that cluster malware samples by functionality, and construct "genealogical" malware lineages. Herein, we describe the design, implementation, and use of the SemEx, as well as the C2DB, an OWL ontology used for representing software behavior and cyber-environments.

  17. A Lexical Analysis of Social Software Literature

    OpenAIRE

    Loay ALTAMIMI

    2013-01-01

    Social software is today more prevalent in organizational contexts, providing new ways of working and giving web users new opportunities for interaction and collaboration. This review aims to gain insight into the extent of the available scholarly and professional literature on these new tools and into interest in this field. The analysis of the 5356 collected articles includes type of publication, year of publication, source, and keywords in the articles' titles and abstracts. The study here adopted a sy...

  18. Digital PIV (DPIV) Software Analysis System

    Science.gov (United States)

    Blackshire, James L.

    1997-01-01

    A software package was developed to provide a Digital PIV (DPIV) capability for NASA LaRC. The system provides an automated image capture, test correlation, and autocorrelation analysis capability for the Kodak Megaplus 1.4 digital camera system for PIV measurements. The package includes three separate programs that, when used together with the PIV data validation algorithm, constitutes a complete DPIV analysis capability. The programs are run on an IBM PC/AT host computer running either Microsoft Windows 3.1 or Windows 95 using a 'quickwin' format that allows simple user interface and output capabilities to the windows environment.

  19. A Lexical Analysis of Social Software Literature

    Directory of Open Access Journals (Sweden)

    Loay ALTAMIMI

    2013-01-01

    Full Text Available Social software is today more prevalent in organizational contexts, providing new ways of working and giving web users new opportunities for interaction and collaboration. This review aims to gain insight into the extent of the available scholarly and professional literature on these new tools and into interest in this field. The analysis of the 5356 collected articles includes type of publication, year of publication, source, and keywords in the articles' titles and abstracts. The study here adopted a systematic approach for the literature review, that is, the principle of Lexical Analysis.

  20. EDA: EXAFS data analysis software package

    Science.gov (United States)

    Kuzmin, A.

    1995-02-01

    The present paper describes the EXAFS data analysis software package, called EDA, originally developed by the author for IBM PC compatible computers. It consists of a set of interactive programs which allow the user to carry out all steps of the EXAFS data analysis procedure. There are two main differences from known packages. First, a significantly improved algorithm is used for atomic-like background removal in the EXAFS extraction procedure. Second, a model-independent derivation of the radial distribution function from EXAFS, based on a maximum-entropy-like algorithm, is available.

  1. R suite for the Reduction and Analysis of UFO Orbit Data

    Science.gov (United States)

    Campbell-Burns, P.; Kacerek, R.

    2016-02-01

    This paper presents work undertaken by UKMON to compile a suite of simple R scripts for the reduction and analysis of meteor data. The application of R in this context is by no means an original idea and there is no doubt that it has been used already in many reports to the IMO. However, we are unaware of any common libraries or shared resources available to the meteor community. By sharing our work we hope to stimulate interest and discussion. Graphs shown in this paper are illustrative and are based on current data from both EDMOND and UKMON.

  2. Analysis of software for modeling atmospheric dispersion

    International Nuclear Information System (INIS)

    During the last few years, a number of software packages for microcomputers have appeared with the aim of simulating the diffusion of atmospheric pollutants. These codes, which simplify the models used for safety analyses of industrial plants, are becoming more useful and are even used for post-accident conditions. The report presents, for the first time and in a critical manner, the principal models available to date. The problem arises in adapting the models to the post-accident interventions demanded. In parallel to this, an analysis of performance was carried out; that is, identifying the need to forecast the most appropriate actions to be performed, bearing in mind the short available time and the lack of information. Because of these difficulties, it is possible to simplify the software, which would not include all the options but could deal with a specific situation. This would enable minimisation of the data to be collected on the site

  3. ISON Data Acquisition and Analysis Software

    Science.gov (United States)

    Kouprianov, Vladimir

    2013-08-01

    Since the first days of the ISON project, its success has been strongly based on using advanced data analysis techniques and their implementation in software. Space debris studies and space surveillance in the optical regime are unique from the point of view of observation techniques and thus impose extremely specific requirements on sensor design and control and on initial data analysis, dictated mostly by the fast apparent motion of the space objects being studied. From the point of view of data acquisition and analysis software, this implies support for sophisticated scheduling, complex tracking, accurate timing, large fields of view, and undersampled CCD images with trailed sources. Here we present the historical outline, major goals, and design concepts of the standard ISON data acquisition and analysis packages, and how they meet these requirements. Among these packages, the most important are: the CHAOS telescope control system (TCS), its recent successor FORTE, and Apex II ‒ a platform for astronomical image analysis with a focus on high-precision astrometry and photometry of fast-moving objects and transient phenomena. Development of these packages is supported by ISON, and they are now responsible for most of the raw data produced by the network. They are installed on nearly all sensors and are available to all participants of the ISON collaboration.

  4. Software development processes and analysis software: a mismatch and a novel framework

    International Nuclear Information System (INIS)

    This paper discusses the salient characteristics of analysis software and the impact of those characteristics on its development. From this discussion, it can be seen that mainstream software development processes, usually characterized as Plan Driven or Agile, are built upon assumptions that are mismatched to the development and maintenance of analysis software. We propose a novel software development framework that would match the process normally observed in the development of analysis software. In the discussion of this framework, we suggest areas of research and directions for future work. (author)

  5. Software Metrics: Some degree of software measurement and analysis

    OpenAIRE

    Rakesh. L; Manoranjan Kumar Singh; Gunaseelan Devaraj

    2010-01-01

    Measurement lies at the heart of many systems that govern our lives. Measurement is essential to our daily life, and measuring has become commonplace and well accepted. Engineering disciplines use methods that are based on models and theories. Methodological improvements alone do not make an engineering discipline. Measurement encourages us to improve our processes and products. This paper examines the realm of software engineering to see why measurement is needed and also sets the scene for ...

  6. Evaluating Data Analysis Software: The Case of TinkerPlots

    Science.gov (United States)

    Fitzallen, Noleine

    2007-01-01

    The ever increasing availability of mathematics education software and internet-based multimedia learning activities presents teachers with the difficult task of deciding which programs are best suited for their students' learning needs. The challenge is for teachers to select pedagogical products that not only promote significant mathematical…

  7. Software Reliability Growth Model with Logistic-Exponential Test-Effort Function and Analysis of Software Release Policy

    OpenAIRE

    Shaik. Mohammad Rafi; Dr.K.Nageswara Rao; Shaheda Akthar

    2010-01-01

    Software reliability is one of the important factors of software quality. Before software is delivered to market, it is thoroughly checked and errors are removed. Every software company wants to develop software that is error free. Software reliability growth models help the software industry to develop software which is error free and reliable. In this paper an analysis is done based on incorporating the logistic-exponential testing-effort into NHPP software reliability growt...
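
    As an illustration of the model class the abstract names, below is a minimal sketch of an NHPP mean value function driven by a cumulative testing-effort function. The logistic-exponential form used for W(t) and all parameter values (a, b, N, lam, kappa) are illustrative assumptions, not the paper's exact formulation.

    ```python
    import numpy as np

    def cumulative_effort(t, N, lam, kappa):
        """Logistic-exponential cumulative testing effort W(t) = N * F(t)."""
        u = np.expm1(lam * t) ** kappa
        return N * u / (1.0 + u)

    def expected_faults(t, a, b, N, lam, kappa):
        """NHPP mean value function m(t) = a * (1 - exp(-b * W(t)))."""
        return a * (1.0 - np.exp(-b * cumulative_effort(t, N, lam, kappa)))

    t = np.linspace(0, 50, 6)   # testing time (e.g., weeks)
    print(expected_faults(t, a=100, b=0.05, N=60, lam=0.1, kappa=1.5))
    ```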

  8. Static analysis of software the abstract interpretation

    CERN Document Server

    Boulanger, Jean-Louis

    2013-01-01

    The existing literature currently available to students and researchers is very general, covering only the formal techniques of static analysis. This book presents real examples of the formal techniques called "abstract interpretation" currently being used in various industrial fields: railway, aeronautics, space, automotive, etc. The purpose of this book is to present students and researchers, in a single book, with the wealth of experience of people who are intrinsically involved in the realization and evaluation of software-based safety critical systems. As the authors are people curr

  9. Development of software for airborne photos analysis

    Science.gov (United States)

    Rudowicz-Nawrocka, J.; Tomczak, R. J.; Nowakowski, K.; Mueller, W.; Kujawa, S.

    2014-04-01

    UAV/UAS systems enable the acquisition of huge amounts of data, such as images, and IT systems are necessary for their storage and analysis. Existing systems do not always allow researchers to perform the operations they wish [1]. The purpose of the research is to automate the process of recognizing objects and phenomena occurring on grasslands. The basis for this work is a large collection of images taken from an oktokopter [2]. For the collection, management and analysis of the image and character data acquired in the course of the research, several computer programs were produced in accordance with the principles of software engineering. The resulting applications differ in functionality and type and were built with a number of popular technologies. This variety of technologies was primarily dictated by their suitability for specific tasks, their availability on different platforms, and the ability to distribute them as open source. The applications presented by the authors, designed to assess the status of grassland based on aerial photography, show the complexity of the issues but at the same time point to further research.

  10. ANALYSIS OF SOFTWARE COST ESTIMATION MODELS

    OpenAIRE

    Tahir Abdullah; Rabia Saleem; Shahbaz Nazeer; Muhammad Usman

    2012-01-01

    Software cost estimation is the process of forecasting the cost of a project in terms of budget, time, and other resources needed to complete a software system; it is a core issue in software project management to estimate the cost of a project before initiating it. Different models have been developed to estimate the cost of software projects over the last several years. Most of these models rely on the analysts' experience, the size of the software project and some other sof...
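
    For orientation, a minimal sketch of one widely known size-based model of the kind such analyses compare, Boehm's Basic COCOMO; the 32 KLOC example input is arbitrary, and real estimation models add many cost drivers not shown here.

    ```python
    def basic_cocomo(kloc, mode="organic"):
        """Basic COCOMO effort (person-months) and schedule (months).
        Coefficients are Boehm's published Basic COCOMO constants."""
        coeff = {
            "organic":       (2.4, 1.05, 2.5, 0.38),
            "semi-detached": (3.0, 1.12, 2.5, 0.35),
            "embedded":      (3.6, 1.20, 2.5, 0.32),
        }
        a, b, c, d = coeff[mode]
        effort = a * kloc ** b        # person-months
        schedule = c * effort ** d    # months
        return effort, schedule

    print(basic_cocomo(32, "organic"))   # e.g., a 32 KLOC project
    ```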

  11. Visual querying and analysis of large software repositories

    NARCIS (Netherlands)

    Voinea, Lucian; Telea, Alexandru

    2009-01-01

    We present a software framework for mining software repositories. Our extensible framework enables the integration of data extraction from repositories with data analysis and interactive visualization. We demonstrate the applicability of the framework by presenting several case studies performed on industry-size software repositories.

  12. Towards a document structure editor for software requirements analysis

    Science.gov (United States)

    Kowalski, Vincent J.; Lekkos, Anthony A.

    1986-01-01

    Of the six or seven phases of the software engineering life cycle, requirements analysis tends to be the least understood and the least formalized. Correspondingly, a scarcity of useful software tools exists to aid in the development of user and system requirements. It is proposed that requirements analysis should culminate in a set of documents similar to those that usually accompany a delivered software product. The design of a software tool, the Document Structure Editor, which facilitates the development of such documentation, is presented.

  13. Method for detecting software anomalies based on recurrence plot analysis

    Directory of Open Access Journals (Sweden)

    Michał Mosdorf

    2012-03-01

    This paper evaluates a method for detecting software anomalies based on recurrence plot analysis of the trace log generated by software execution. The method is based on windowed recurrence quantification analysis for selected measures (e.g., Recurrence Rate, RR, or Determinism, DET). Initial results show that the proposed method is useful in detecting silent software anomalies that do not result in typical crashes (e.g., exceptions).
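
    A minimal sketch of the core computation over a single window, recurrence rate and determinism of a recurrence matrix, assuming a simple scalar series and a fixed threshold; the paper's actual windowing scheme and trace-log features are not specified here.

    ```python
    import numpy as np

    def recurrence_matrix(x, eps):
        """Binary recurrence matrix R[i, j] = 1 if |x_i - x_j| <= eps."""
        return (np.abs(x[:, None] - x[None, :]) <= eps).astype(int)

    def rqa_measures(R, lmin=2):
        """Recurrence Rate (RR) and Determinism (DET) of a recurrence matrix."""
        n = R.shape[0]
        total = R.sum()
        rr = total / n**2
        # count recurrence points lying on diagonal lines of length >= lmin
        det_points = 0
        for k in range(-(n - 1), n):
            run = 0
            for v in list(np.diagonal(R, offset=k)) + [0]:  # sentinel flushes last run
                if v:
                    run += 1
                else:
                    if run >= lmin:
                        det_points += run
                    run = 0
        det = det_points / total if total else 0.0
        return rr, det

    x = np.sin(np.linspace(0, 8 * np.pi, 200))   # stand-in for a trace-log metric
    print(rqa_measures(recurrence_matrix(x, eps=0.1)))
    ```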

  14. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    International Nuclear Information System (INIS)

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time and manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies the elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it usable anywhere the internet is accessible. By switching the underlying nuclide library and the related formulas, the new software can easily be extended to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.

  15. Analysis of Software Product Strategy at TPS

    OpenAIRE

    Oystryk, Gareth

    2010-01-01

    The purpose of this report is to help TPS make strategic decisions about the future of its human services software products. TPS is a privately held company that entered the software publishing industry in 2006 with the intent of selling products and services to the human services software market. However, TPS’ portfolio of products experienced uneven financial performance over the last four years, resulting in the need to reconsider its software product strategy. This report presents finding...

  16. A suite of R packages for web-enabled modeling and analysis of surface waters

    Science.gov (United States)

    Read, J. S.; Winslow, L. A.; Nüst, D.; De Cicco, L.; Walker, J. I.

    2014-12-01

    Researchers often create redundant methods for downloading, manipulating, and analyzing data from online resources. Moreover, the reproducibility of science can be hampered by complicated and voluminous data, lack of time for documentation and long-term maintenance of software, and fear of exposing programming skills. The combination of these factors can encourage unshared one-off programmatic solutions instead of openly provided reusable methods. Federal and academic researchers in the water resources and informatics domains have collaborated to address these issues. The result of this collaboration is a suite of modular R packages that can be used independently or as elements in reproducible analytical workflows. These documented and freely available R packages were designed to fill basic needs for the effective use of water data: the retrieval of time-series and spatial data from web resources (dataRetrieval, geoknife), performing quality assurance and quality control checks of these data with robust statistical methods (sensorQC), the creation of useful data derivatives (including physically- and biologically-relevant indices; GDopp, LakeMetabolizer), and the execution and evaluation of models (glmtools, rLakeAnalyzer). Here, we share details and recommendations for the collaborative coding process, and highlight the benefits of an open-source tool development pattern with a popular programming language in the water resources discipline (such as R). We provide examples of reproducible science driven by large volumes of web-available data using these tools, explore benefits of accessing packages as standardized web processing services (WPS) and present a working platform that allows domain experts to publish scientific algorithms in a service-oriented architecture (WPS4R). We assert that in the era of open data, tools that leverage these data should also be freely shared, transparent, and developed in an open innovation environment.

  17. Image processing and analysis software development

    International Nuclear Information System (INIS)

    The work presented in this project is aimed at developing software, 'IMAGE GALLERY', to investigate various image processing and analysis techniques. The work was divided into two parts, namely image processing techniques and pattern recognition, the latter comprising character and face recognition. Various image enhancement techniques, including negative imaging, contrast stretching, compression of dynamic range, neon, diffuse, emboss, etc., have been studied. Segmentation techniques including point detection, line detection and edge detection have been studied, and some of the smoothing and sharpening filters have been investigated. All these imaging techniques have been implemented in a window-based computer program written in Visual Basic. Neural network techniques based on the Perceptron model have been applied for face and character recognition. (author)
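
    The original program is written in Visual Basic; as a hedged illustration of two of the enhancement techniques mentioned (negative imaging and percentile-based contrast stretching), here is a NumPy sketch. The percentile choices and test image are assumptions for demonstration only.

    ```python
    import numpy as np

    def contrast_stretch(img, lo_pct=2, hi_pct=98):
        """Linearly stretch intensities between two percentiles to [0, 255]."""
        lo, hi = np.percentile(img, [lo_pct, hi_pct])
        out = (img.astype(float) - lo) / max(hi - lo, 1e-9)
        return (np.clip(out, 0, 1) * 255).astype(np.uint8)

    def negative(img):
        """Negative imaging: invert 8-bit intensities."""
        return 255 - img

    img = (np.random.rand(64, 64) * 120 + 60).astype(np.uint8)  # low-contrast test image
    out = contrast_stretch(img)
    print(out.min(), out.max(), negative(out)[0, 0] == 255 - out[0, 0])
    ```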

  18. Mutation Analysis Approach to Develop Reliable Object-Oriented Software

    Directory of Open Access Journals (Sweden)

    Monalisa Sarma

    2014-01-01

    In general, modern programs are large and complex, and it is essential that they be highly reliable in applications. In order to develop highly reliable software, the Java programming language provides a rich set of exceptions and exception handling mechanisms, which are intended to help developers build robust programs. Given a program with exception handling constructs, effective testing must detect whether all possible exceptions are raised and caught. However, complex exception handling constructs make it tedious to trace which exceptions are handled where, and which exceptions are passed on. In this paper, we address this problem and propose a mutation analysis approach to develop reliable object-oriented programs. We have applied a number of mutation operators to create a large set of mutant programs with different types of faults. We then generate test cases and test data to uncover exception-related faults. The test suite so obtained is applied to the mutant programs, measuring the mutation score and hence verifying the effectiveness of the test suite. We have tested our approach on a number of case studies to substantiate the efficacy of the proposed mutation analysis technique.
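
    A minimal sketch of the mutation-score computation the abstract describes. The unit under test, the mutants (here mimicking exception-handling faults) and the test cases are all hypothetical stand-ins, not the paper's operators.

    ```python
    def mutation_score(mutants, test_suite):
        """Fraction of mutants killed by the test suite. A mutant is killed
        when any test observes a different outcome than the original
        program (a wrong value or a raised exception)."""
        killed = 0
        for run_mutant in mutants.values():
            for case, expected in test_suite:
                try:
                    if run_mutant(case) != expected:
                        killed += 1
                        break
                except Exception:          # exception-related fault exposed
                    killed += 1
                    break
        return killed / len(mutants)

    # toy unit under test: guarded division; mutants mimic exception-handling faults
    original = lambda ab: ab[0] / ab[1] if ab[1] else 0.0
    mut_drop_guard = lambda ab: ab[0] / ab[1]            # removes the zero guard
    mut_wrong_default = lambda ab: ab[0] / ab[1] if ab[1] else 1.0

    tests = [(c, original(c)) for c in [(6, 3), (1, 0)]]
    print(mutation_score({"m1": mut_drop_guard, "m2": mut_wrong_default}, tests))
    ```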

  19. Software applications for flux balance analysis.

    Science.gov (United States)

    Lakshmanan, Meiyappan; Koh, Geoffrey; Chung, Bevan K S; Lee, Dong-Yup

    2014-01-01

    Flux balance analysis (FBA) is a widely used computational method for characterizing and engineering intrinsic cellular metabolism. The increasing number of its successful applications and growing popularity are possibly attributable to the availability of specific software tools for FBA. Each tool has its unique features and limitations with respect to operational environment, user-interface and supported analysis algorithms. Presented herein is an in-depth evaluation of currently available FBA applications, focusing mainly on usability, functionality, graphical representation and inter-operability. Overall, most of the applications are able to perform basic features of model creation and FBA simulation. COBRA toolbox, OptFlux and FASIMU are versatile to support advanced in silico algorithms to identify environmental and genetic targets for strain design. SurreyFBA, WEbcoli, Acorn, FAME, GEMSiRV and MetaFluxNet are the distinct tools which provide the user friendly interfaces in model handling. In terms of software architecture, FBA-SimVis and OptFlux have the flexible environments as they enable the plug-in/add-on feature to aid prospective functional extensions. Notably, an increasing trend towards the implementation of more tailored e-services such as central model repository and assistance to collaborative efforts was observed among the web-based applications with the help of advanced web-technologies. Furthermore, most recent applications such as the Model SEED, FAME, MetaFlux and MicrobesFlux have even included several routines to facilitate the reconstruction of genome-scale metabolic models. Finally, a brief discussion on the future directions of FBA applications was made for the benefit of potential tool developers. PMID:23131418
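
    For readers unfamiliar with the underlying computation, FBA reduces to a linear program: maximize an objective flux subject to steady-state mass balance S v = 0 and flux bounds. Below is a toy example solved with SciPy; the three-reaction network and its bounds are invented for illustration and unrelated to any of the reviewed tools.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy network: A -> B -> biomass, with capped uptake of A.
    # Columns: v1 (A uptake), v2 (A -> B), v3 (biomass from B)
    S = np.array([
        [1, -1,  0],   # metabolite A balance
        [0,  1, -1],   # metabolite B balance
    ])
    bounds = [(0, 10), (0, 1000), (0, 1000)]   # uptake capped at 10

    # maximize v3  <=>  minimize -v3, subject to S v = 0
    res = linprog(c=[0, 0, -1], A_eq=S, b_eq=np.zeros(2), bounds=bounds,
                  method="highs")
    print("optimal biomass flux:", -res.fun, "fluxes:", res.x)
    ```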

  20. Software patterns, knowledge maps, and domain analysis

    CERN Document Server

    Fayad, Mohamed E; Hegde, Srikanth GK; Basia, Anshu; Vakil, Ashka

    2014-01-01

    Preface; Acknowledgments; Authors. Introduction: An Overview of Knowledge Maps. Key Concepts: Software Stable Models, Knowledge Maps, Pattern Language, Goals, Capabilities (Enduring Business Themes + Business Objects); The Motivation; The Problem; The Objectives; Overview of Software Stability Concepts; Overview of Knowledge Maps; Pattern Languages versus Knowledge Maps: A Brief Comparison. The Solution: Knowledge Maps Methodology or Concurrent Software Development Model; Why Knowledge Maps?; Research Methodology Undertaken; Research Verification and Validation; The Stratification of This Book; Summary

  1. Analysis on Some of Software Reliability Models

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The software reliability & maintainability evaluation tool SRMET 3.0, developed by the Software Evaluation and Test Center of China Aerospace Mechanical Corporation, is introduced in detail in this paper. SRMET 3.0 is supported by seven software reliability models and four software maintainability models. Numerical characteristics of all those models are studied in depth, and corresponding numerical algorithms for each model are also given.

  2. Visual querying and analysis of large software repositories

    OpenAIRE

    Voinea, Lucian; Telea, Alexandru

    2009-01-01

    We present a software framework for mining software repositories. Our extensible framework enables the integration of data extraction from repositories with data analysis and interactive visualization. We demonstrate the applicability of the framework by presenting several case studies performed on industry-size software repositories. In each study we use the framework to give answers to one or several software engineering questions addressing a specific project. Next, we validate the answers...

  3. A strategic analysis of a software company in transition

    OpenAIRE

    Larson, Marnie

    2005-01-01

    This project provides an in-depth analysis of a small software company attempting to transition business models in an evolving software market. The market for HR/Payroll software solutions is consolidating quickly, and StarGarden Software must decide where its place is in the market and whether it makes sense for the company to continue to go it alone. Options include: utilizing resellers, downsizing, and becoming an acquisition target. StarGarden also has an exciting new product line in devel...

  4. Regression Testing Cost Reduction Suite

    OpenAIRE

    Mohamed Alaa El-Din; Ismail Abd El-Hamid Taha; Hesham El-Deeb

    2014-01-01

    The estimated cost of software maintenance exceeds 70 percent of total software costs [1], and a large portion of these maintenance expenses is devoted to regression testing. Regression testing is an expensive and frequently executed maintenance activity used to revalidate modified software. Any reduction in the cost of regression testing would help to reduce the software maintenance cost. Test suites, once developed, are reused and updated frequently as the software evolves. As a result, some...
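
    The abstract is truncated, but a classic cost-reduction technique in this area is greedy test-suite minimization, a set-cover heuristic that keeps only tests adding new requirement coverage. The sketch below illustrates that general idea under invented test/requirement names; it is not necessarily the suite's actual method.

    ```python
    def greedy_minimize(test_requirements):
        """Greedy test-suite reduction: repeatedly keep the test covering
        the most not-yet-covered requirements (set-cover heuristic)."""
        uncovered = set().union(*test_requirements.values())
        kept = []
        while uncovered:
            best = max(test_requirements,
                       key=lambda t: len(test_requirements[t] & uncovered))
            if not test_requirements[best] & uncovered:
                break                      # remaining requirements are uncoverable
            kept.append(best)
            uncovered -= test_requirements[best]
        return kept

    suite = {
        "t1": {"r1", "r2"},
        "t2": {"r2", "r3", "r4"},
        "t3": {"r4"},
        "t4": {"r1", "r5"},
    }
    print(greedy_minimize(suite))   # a smaller suite still covering r1..r5
    ```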

  5. Change Impact Analysis of Crosscutting in Software Architectural Design

    OpenAIRE

    Berg, van den, W.

    2006-01-01

    Software architectures should be amenable to changes in user requirements and implementation technology. The analysis of the impact of these changes can be based on traceability of architectural design elements. Design elements have dependencies with other software artifacts but also evolve in time. Crosscutting dependencies may have a strong influence on modifiability of software architectures. We present an impact analysis of crosscutting dependencies in architectural design. The analysis i...

  6. MIBPB: a software package for electrostatic analysis.

    Science.gov (United States)

    Chen, Duan; Chen, Zhan; Chen, Changjun; Geng, Weihua; Wei, Guo-Wei

    2011-03-01

    The Poisson-Boltzmann equation (PBE) is an established model for the electrostatic analysis of biomolecules. The development of advanced computational techniques for the solution of the PBE has been an important topic in the past two decades. This article presents a matched interface and boundary (MIB)-based PBE software package, the MIBPB solver, for electrostatic analysis. The MIBPB has a unique feature in that it is the first interface technique-based PBE solver that rigorously enforces the solution and flux continuity conditions at the dielectric interface between the biomolecule and the solvent. For protein molecular surfaces, which may possess troublesome geometrical singularities, the MIB scheme makes the MIBPB by far the only existing PBE solver that is able to deliver second-order convergence, that is, the accuracy increases four times when the mesh size is halved. The MIBPB method is also equipped with a Dirichlet-to-Neumann mapping technique that builds a Green's function approach to analytically resolve the singular charge distribution in biomolecules in order to obtain reliable solutions at meshes as coarse as 1 Å, whereas other traditional PB solvers usually require meshes of 0.25 Å to reach a similar level of reliability. This work further accelerates the rate of convergence of linear equation systems resulting from the MIBPB by using Krylov subspace (KS) techniques. Condition numbers of the MIBPB matrices are significantly reduced by using appropriate KS solver and preconditioner combinations. Both linear and nonlinear PBE solvers in the MIBPB package are tested by protein-solvent solvation energy calculations and analysis of salt effects on protein-protein binding energies, respectively. PMID:20845420
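
    The MIBPB solver's matched interface and boundary scheme is far more sophisticated, but the structure of the discretized problem can be illustrated with a toy 1-D linearized Poisson-Boltzmann solve in a uniform dielectric. All values (domain size, Debye parameter, point charge) are illustrative assumptions, not MIBPB's algorithm.

    ```python
    import numpy as np

    # Minimal 1-D linearized PB sketch:  -phi'' + kappa^2 phi = rho / eps
    n, L = 200, 10.0                # interior grid points, domain length
    h = L / (n + 1)
    kappa2 = 1.0                    # squared inverse Debye length
    eps = 1.0
    rho = np.zeros(n)
    rho[n // 2] = 1.0 / h           # unit point charge at the center

    # tridiagonal system from central differences, zero Dirichlet boundaries
    A = (np.diag(np.full(n, 2.0 / h**2 + kappa2))
         + np.diag(np.full(n - 1, -1.0 / h**2), 1)
         + np.diag(np.full(n - 1, -1.0 / h**2), -1))
    phi = np.linalg.solve(A, rho / eps)
    print("potential at the charge:", phi[n // 2])
    ```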

  7. Testing the global capabilities of the Antelope software suite: fast location and Mb determination of teleseismic events using the ASAIN and GSN seismic networks

    Science.gov (United States)

    Pesaresi, D.; Russi, M.; Plasencia, M.; Cravos, C.

    2009-04-01

    The Italian National Institute for Oceanography and Experimental Geophysics (Istituto Nazionale di Oceanografia e di Geofisica Sperimentale, OGS) runs the Antarctic Seismographic Argentinean Italian Network (ASAIN), comprising five seismic stations located in the Scotia Sea region in Antarctica and in Argentina: data from these stations are transferred in real time to the OGS headquarters in Trieste (Italy) via satellite links. OGS also runs, in close cooperation with the Friuli-Venezia Giulia Civil Defense, the North East (NI) Italy seismic network, using the Antelope commercial software suite from BRTT as the main acquisition system. As a test of the global capabilities of Antelope, we set up an instance of Antelope acquiring data in real time from both the regional ASAIN seismic network in Antarctica and a subset of the Global Seismic Network (GSN) funded by the Incorporated Research Institutions for Seismology (IRIS). The facilities of the IRIS Data Management System, and specifically the IRIS Data Management Center, were used for real-time access to the waveforms required in this study. Preliminary results over a one-month period indicated that about 82% of the earthquakes with magnitude M>5.0 listed in the PDE catalogue of the National Earthquake Information Center (NEIC) of the United States Geological Survey (USGS) were also correctly detected by Antelope, with an average location error of 0.05 degrees and an average body wave magnitude Mb estimation error below 0.1. The average time difference between event origin time and the actual time of event determination by Antelope was about 45': comparison with 20' (the IASPEI91 P-wave travel time for a distance of 180 degrees) plus 25' (the estimated data latency of our test system) indicates that Antelope is a serious candidate for regional and global early warning systems. Updated figures calculated over a longer period of time will be presented and discussed.

  8. Analysis of Test Efficiency during Software Development Process

    CERN Document Server

    Nair, T R Gopalakrishnan; Tiwari, Pranesh Kumar

    2012-01-01

    A prerequisite of any organization is sustained viability in a dynamic and competitive industrial environment. Development of high quality software is therefore an inevitable constraint on any software industry. Defect management being one of the most influential factors in the production of high quality software, it is obligatory for software organizations to orient themselves towards effective defect management. Since the advent of software engineering, testing has been deemed a promising technique of defect management in all IT industries. This paper provides an empirical investigation of several projects through a case study comprising four software companies with various production capabilities. The aim of this investigation is to analyze the efficiency of the test team during the software development process. The study indicates very low test efficiency at the requirements analysis phase and even lower test efficiency at the design phase of software development. Subsequently, the study calls for a str...

  9. Rapid Optical Characterization Suite for in situ Target Analysis of Rock Surfaces Project

    Data.gov (United States)

    National Aeronautics and Space Administration — ROCSTAR is an in situ instrument suite that can accomplish rapid mineral and molecular identification without sample preparation for in situ planetary exploration;...

  10. Managing Software Project Risks (Analysis Phase) with Proposed Fuzzy Regression Analysis Modelling Techniques with Fuzzy Concepts

    OpenAIRE

    Elzamly, Abdelrafe; Hussin, Burairah

    2014-01-01

    The aim of this paper is to propose new mining techniques by which we can study the impact of different risk management techniques and different software risk factors on software analysis development projects. The new mining technique uses fuzzy multiple regression analysis techniques with fuzzy concepts to manage the risks in a software project and mitigate risk with software process improvement. Top ten software risk factors in the analysis phase and thirty risk management techni...

  11. Time Management in the Operating Room: An Analysis of the Dedicated Minimally Invasive Surgery Suite

    OpenAIRE

    Hsiao, Kenneth C.; Machaidze, Zurab; Pattaras, John G.

    2004-01-01

    Background: Dedicated minimally invasive surgery suites are available that contain specialized equipment to facilitate endoscopic surgery. Laparoscopy performed in a general operating room is hampered by the multitude of additional equipment that must be transported into the room. The objective of this study was to compare the preparation times between procedures performed in traditional operating rooms versus dedicated minimally invasive surgery suites to see whether operating room efficienc...

  12. Development of a New VLBI Data Analysis Software

    Science.gov (United States)

    Bolotin, Sergei; Gipson, John M.; MacMillan, Daniel S.

    2010-01-01

    We present an overview of a new VLBI analysis software under development at NASA GSFC. The new software will replace CALC/SOLVE and many related utility programs. It will have the capabilities of the current system as well as incorporate new models and data analysis techniques. In this paper we give a conceptual overview of the new software. We formulate the main goals of the software. The software should be flexible and modular to implement models and estimation techniques that currently exist or will appear in future. On the other hand it should be reliable and possess production quality for processing standard VLBI sessions. Also, it needs to be capable of processing observations from a fully deployed network of VLBI2010 stations in a reasonable time. We describe the software development process and outline the software architecture.

  13. The Combustion Experiment on the Sample Analysis at Mars (SAM) Instrument Suite on the Curiosity Rover

    Science.gov (United States)

    Stern, J. C.; Malespin, C. A.; Eigenbrode, J. L.; Graham, H. V.; Archer, P. D., Jr.; Brunner, A. E.; Freissinet, C.; Franz, H. B.; Fuentes, J.; Glavin, D. P.; Leshin, L. A.; Mahaffy, P. R.; McAdam, A. C.; Ming, D. W.; Navvaro-Gonzales, R.; Niles, P. B.; Steele, A.

    2014-01-01

    The combustion experiment on the Sample Analysis at Mars (SAM) suite on Curiosity will heat a sample of Mars regolith in the presence of oxygen and measure the composition of the evolved gases using quadrupole mass spectrometry (QMS) and tunable laser spectrometry (TLS). QMS will enable detection of combustion products such as CO, CO2, NO, and other oxidized species, while TLS will enable precise measurements of the abundance and carbon isotopic composition (delta(sup 13)C) of the evolved CO2 and the hydrogen isotopic composition (deltaD) of H2O. SAM will perform a two-step combustion to isolate combustible materials below approx. 550 C and above approx. 550 C. The combustion experiment on SAM, if properly designed and executed, has the potential to answer multiple questions regarding the origins of volatiles seen thus far in SAM evolved gas analysis (EGA) on Mars. Constraints imposed by SAM and MSL time and power resources, as well as SAM consumables (oxygen gas), will limit the number of SAM combustion experiments, so it is imperative to design an experiment targeting the most pressing science questions. Low temperature combustion experiments will primarily target the quantification of carbon (and nitrogen) contributed by the SAM wet chemistry reagents MTBSTFA (N-Methyl-N-tert-butyldimethylsilyltrifluoroacetamide) and DMF (Dimethylformamide), which have been identified in the background of blank and sample runs and may adsorb to the sample while the cup is in the Sample Manipulation System (SMS). In addition, differences between the sample and "blank" may yield information regarding the abundance and delta(sup 13)C of bulk (both organic and inorganic) martian carbon. High temperature combustion experiments primarily aim to detect refractory organic matter, if present in Cumberland fines, as well as address the question of quantification and the deltaD value of water evolution associated with hydroxyl hydrogen in clay minerals.

  14. Using the Beopt Automated Residential Simulation Test Suite to Enable Comparative Analysis Between Energy Simulation Engines: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Tabares-Velasco, Paulo Cesar [National Renewable Energy Lab. (NREL), Golden, CO (United States); Maguire, Jeff [National Renewable Energy Lab. (NREL), Golden, CO (United States); Horowitz, Scott [National Renewable Energy Lab. (NREL), Golden, CO (United States); Christensen, Craig [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2014-09-01

    Verification and validation are crucial software quality control procedures to follow when developing and implementing models. This is particularly important because a variety of stakeholders rely on accurate predictions from building simulation programs. This study uses the BEopt Automated Residential Simulation Test Suite (BARTS) to facilitate comparison of two energy simulation engines across various building components and includes building models that isolate the impacts of specific components on annual energy consumption. As a case study, BARTS has been used to identify important discrepancies between the engines for several components of the building models. These discrepancies are caused by differences in the algorithms used by the engines or coding errors.
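
    A minimal sketch of the comparison logic such a test suite enables: run both engines on component-isolating models and flag relative differences in annual energy beyond a tolerance. The model names, energy values and tolerance below are hypothetical, not actual BARTS cases.

    ```python
    def flag_discrepancies(engine_a, engine_b, tol=0.02):
        """Compare annual energy results (kWh) from two simulation engines
        on the same component-isolating models; report relative
        differences above `tol`."""
        flags = {}
        for model in engine_a.keys() & engine_b.keys():
            a, b = engine_a[model], engine_b[model]
            rel = abs(a - b) / max(abs(a), abs(b), 1e-9)
            if rel > tol:
                flags[model] = rel
        return flags

    a = {"walls_only": 4120.0, "windows_only": 5310.0, "infiltration_only": 6050.0}
    b = {"walls_only": 4155.0, "windows_only": 5890.0, "infiltration_only": 6070.0}
    print(flag_discrepancies(a, b))   # windows_only stands out (~10%)
    ```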

  15. Using the BEopt Automated Residential Simulation Test Suite to Enable Comparative Analysis Between Energy Simulation Engines: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Tabares-Velasco, P. C.; Maguire, J.; Horowitz, S.; Christensen, C.

    2014-09-01

    Verification and validation are crucial software quality control procedures when developing and implementing models. This is particularly important as a variety of stakeholders rely on accurate predictions from building simulation programs. This study uses the BEopt Automated Residential Simulation Test Suite (BARTS) to facilitate comparison of two energy simulation engines across various building components and includes models that isolate the impacts of specific building components on annual energy consumption. As a case study, BARTS has been used to identify important discrepancies between the engines for several components of the building models; these discrepancies are caused by differences in the models used by the engines or coding errors.

  16. A proposal for performing software safety hazard analysis

    International Nuclear Information System (INIS)

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper. The method concentrates on finding hazards during the early stages of the software life cycle, using an extension of HAZOP. (author)

  17. A proposal for performing software safety hazard analysis

    International Nuclear Information System (INIS)

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper. The method concentrates on finding hazards during the early stages of the software life cycle, using an extension of HAZOP

  18. Design and Development of a Miniaturized Double Latching Solenoid Valve for the Sample Analysis at Mars Instrument Suite

    Science.gov (United States)

    Smith, James T.

    2008-01-01

    The development of the in-house Miniaturized Double Latching Solenoid Valve, or Microvalve, for the Gas Processing System (GPS) of the Sample Analysis at Mars (SAM) instrument suite is described. The Microvalve is a double latching solenoid valve that actuates a pintle shaft axially to hermetically seal an orifice. The key requirements and the design innovations implemented to meet them are described.

  19. Runtime analysis of search heuristics on software engineering problems

    Institute of Scientific and Technical Information of China (English)

    Per Kristian LEHRE; Xin YAO

    2009-01-01

    Many software engineering tasks can potentially be automated using search heuristics. However, much work is needed in designing and evaluating search heuristics before this approach can be routinely applied to a software engineering problem. Experimental methodology should be complemented with theoretical analysis to achieve this goal. Recently, there have been significant theoretical advances in the runtime analysis of evolutionary algorithms (EAs) and other search heuristics in other problem domains. We suggest that these methods could be transferred and adapted to gain insight into the behaviour of search heuristics on software engineering problems while automating software engineering.
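
    As a concrete example of the kind of result runtime analysis provides: the (1+1) EA optimizes OneMax in expected O(n log n) iterations. The simulation sketch below uses OneMax as a stand-in for a software engineering fitness function; it illustrates the analyzed algorithm, not any result from this paper.

    ```python
    import random

    def one_plus_one_ea(n, seed=0):
        """(1+1) EA on OneMax: flip each bit with probability 1/n,
        keep the offspring if it is not worse."""
        rng = random.Random(seed)
        x = [rng.randint(0, 1) for _ in range(n)]
        steps = 0
        while sum(x) < n:
            y = [b ^ (rng.random() < 1.0 / n) for b in x]
            if sum(y) >= sum(x):
                x = y
            steps += 1
        return steps

    print([one_plus_one_ea(64, seed=s) for s in range(3)])
    ```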

  20. New software for XRF quantitative analysis

    International Nuclear Information System (INIS)

    It is well known that empirical calibration in XRF quantitative analysis requires a relatively large number of standards, even in the simplest case of binary mixtures. For samples containing more than 3 elements, the number of standards needed for calibration suddenly becomes prohibitive and the calibration curve has to be obtained by complicated multidimensional fits. In order to overcome this difficulty, a new XRF analysis software has been developed, based exclusively on the theoretical treatment of photon interactions in the sample. Starting from the theoretical formulas of Shiraiwa and Fujino for primary and secondary fluorescence, modified to take into account the finite sample thickness, the total yield for a characteristic line Xj in a sample can be calculated as a function of its composition w = (w1, ..., wn), where the {wj} are the concentrations of all n constituent elements. A non-linear system can be written for a given sample with unknown composition. Choosing a number of equations equal to the number of identified elements, we obtain a non-linear system that can be solved numerically by Newton's algorithm. When a light element is known to be present in the sample but its lines cannot be seen in the spectrum (e.g., Al, C), the completeness equation Σwi = 1 must be added to the system to account for the true composition. Based on the algorithm sketched above, a set of computer codes has been written, each one specific to one of the three types of excitation sources usually used in XRF: collimated beam from an X-ray tube, collimated isotopic source, and ring-like isotopic source. The ring-source version is completed by a Monte Carlo code for calculating the incidence vs. detection angle weight matrix. A version taking into account the chemical compounds present in the sample has also been written for each type of excitation source. The programs were tested on many samples with known composition and the results were always below 10
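
    A minimal sketch of the solution strategy described: solve the nonlinear yield equations plus the completeness equation for the concentrations with a Newton-type iteration. The forward model below is a crude invented stand-in for the Shiraiwa-Fujino formulas, and all matrix entries and measured intensities are illustrative.

    ```python
    import numpy as np
    from scipy.optimize import fsolve

    # Toy forward model: predicted line yield for each visible element depends
    # nonlinearly on the full composition w (matrix/absorption effects).
    K = np.array([[1.0, 0.3, 0.5],
                  [0.2, 0.8, 0.4]])          # illustrative absorption terms

    def predicted_yields(w):
        att = 1.0 / (1.0 + K @ w)            # crude matrix-absorption factor
        return w[:2] * att                   # yields of the two visible lines

    y_meas = np.array([0.30, 0.25])          # measured (normalized) intensities

    def equations(w):
        # two yield equations + completeness (third element not seen in spectrum)
        return [*(predicted_yields(w) - y_meas), w.sum() - 1.0]

    w0 = np.full(3, 1 / 3)
    w = fsolve(equations, w0)                # Newton-type iteration
    print("estimated concentrations:", w)
    ```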

  1. Fabrication and performance analysis of a DEA cuff designed for dry-suit applications

    Science.gov (United States)

    Ahmadi, S.; Camacho Mattos, A.; Barbazza, A.; Soleimani, M.; Boscariol, P.; Menon, C.

    2013-03-01

    A method for manufacturing a cylindrical dielectric elastomer actuator (DEA) is presented. The cylindrical DEA can be used in fabricating the cuff area of dry-suits where the garment is very tight and wearing the suit is difficult. When electrically actuated, the DEA expands radially and the suit can be worn more comfortably. In order to study the performance of the DEA, a customized testing setup was designed, and silicone-made cuff samples with different material stiffnesses were tested. Analytical and FEM modeling were considered to evaluate the experimental output. The results revealed that although the stiffness of the DEA material has a direct relationship with the radial constrictive pressure caused by mechanically stretching the DEA, it has a minor effect on the actuation pressure. It was also found that stacking multiple layers of the DEA to fabricate a laminated structure enabled the attainment of a desired variation of pressure required for the implementation of an electrically tunable cuff.

  2. Software Piracy in Research: A Moral Analysis.

    Science.gov (United States)

    Santillanes, Gary; Felder, Ryan Marshall

    2015-08-01

    Researchers in virtually every discipline rely on sophisticated proprietary software for their work. However, some researchers are unable to afford the licenses and instead procure the software illegally. We discuss the prohibition of software piracy by intellectual property laws, and argue that the moral basis for the copyright law offers the possibility of cases where software piracy may be morally justified. The ethics codes that scientific institutions abide by are informed by a rule-consequentialist logic: by preserving personal rights to authored works, people able to do so will be incentivized to create. By showing that the law has this rule-consequentialist grounding, we suggest that scientists who blindly adopt their institutional ethics codes will commit themselves to accepting that software piracy could be morally justified, in some cases. We hope that this conclusion will spark debate over important tensions between ethics codes, copyright law, and the underlying moral basis for these regulations. We conclude by offering practical solutions (other than piracy) for researchers. PMID:25005342

  3. ORION Environmental Control and Life Support Systems Suit Loop and Pressure Control Analysis

    Science.gov (United States)

    Eckhardt, Brad; Conger, Bruce; Stambaugh, Imelda C.

    2015-01-01

    Under NASA's ORION Multi-Purpose Crew Vehicle (MPCV) Environmental Control and Life Support System (ECLSS) Project at Johnson Space Center's (JSC), the Crew and Thermal Systems Division has developed performance models of the air system using Thermal Desktop/FloCAD. The Thermal Desktop model includes an Air Revitalization System (ARS Loop), a Suit Loop, a Cabin Loop, and Pressure Control System (PCS) for supplying make-up gas (N2 and O2) to the Cabin and Suit Loop. The ARS and PCS are designed to maintain air quality at acceptable O2, CO2 and humidity levels as well as internal pressures in the vehicle Cabin and during suited operations. This effort required development of a suite of Thermal Desktop Orion ECLSS models to address the need for various simulation capabilities regarding ECLSS performance. An initial highly detailed model of the ARS Loop was developed in order to simulate rapid pressure transients (water hammer effects) within the ARS Loop caused by events such as cycling of the Pressurized Swing Adsorption (PSA) Beds and required high temporal resolution (small time steps) in the model during simulation. A second ECLSS model was developed to simulate events which occur over longer periods of time (over 30 minutes) where O2, CO2 and humidity levels, as well as internal pressures needed to be monitored in the cabin and for suited operations. Stand-alone models of the PCS and the Negative Pressure relief Valve (NPRV) were developed to study thermal effects within the PCS during emergency scenarios (Cabin Leak) and cabin pressurization during vehicle re-entry into Earth's atmosphere. Results from the Orion ECLSS models were used during Orion Delta-PDR (July, 2014) to address Key Design Requirements (KDR's) for Suit Loop operations for multiple mission scenarios.

  4. The decommissioning and demolition of four suites of high active chemical analysis cells at DNPDE

    International Nuclear Information System (INIS)

    The decommissioning and demolition of four laboratory suites of high active cells at DNPDE is described. All four suites had suffered drain leaks of high active liquor into underfloor ducts; the options available at the time and current policy for dealing with the resultant activity deposits are given. The decommissioning procedures are detailed to provide information for future similar exercises. Features to ease demolition of such facilities and to eliminate the possibility of long term activity deposition from drain leaks are highlighted for incorporation in future designs. The waste arisings and radiation doses received during the work are tabulated. (author)

  5. Interoperability between analysis and detailing software for reinforced concrete

    International Nuclear Information System (INIS)

    The paper demonstrates the concept of interoperability between analysis and detailing software by flow-charting the appropriate flow of common data. An application of the proposed data flow is provided to validate the concept. Data is shown to pass in both directions between software applications, i.e., not only from analysis to detailing but also from detailing back to analysis. The full-scale application of the proposed interchange of data is discussed. Conclusions related to potential challenges and rewards associated with developing fully functioning interoperability between analysis and detailing software for reinforced concrete are provided

  6. TEST SUITE GENERATION PROCESS FOR AGENT TESTING

    Directory of Open Access Journals (Sweden)

    HOUHAMDI ZINA

    2011-04-01

    Software agents are a promising technology for today's complex, distributed systems. Methodologies and techniques that address the testing and reliability of multi-agent systems are increasingly in demand, in particular to support automated test case generation and execution. In this paper, we introduce a novel approach for goal-oriented software agent testing. It specifies a testing process that complements the goal-oriented methodology Tropos and reinforces the mutual relationship between goal analysis and testing. Furthermore, it defines a structured and comprehensive agent test suite generation process by providing a systematic way of deriving test cases from goal analysis.

  7. The Einstein Suite: A Web-Based Tool for Rapid and Collaborative Engineering Design and Analysis

    Science.gov (United States)

    Palmer, Richard S.

    1997-01-01

    Taken together, the components of the Einstein Suite provide two revolutionary capabilities: they have the potential to change the way engineering and financial engineering are performed by (1) providing currently unavailable functionality, and (2) providing a 10-100 times improvement over currently available but impractical or costly functionality.

  8. Theoretical and software considerations for nonlinear dynamic analysis

    Science.gov (United States)

    Schmidt, R. J.; Dodds, R. H., Jr.

    1983-01-01

    In the finite element method for structural analysis, it is generally necessary to discretize the structural model into a very large number of elements to accurately evaluate displacements, strains, and stresses. As the complexity of the model increases, the number of degrees of freedom can easily exceed the capacity of present-day software system. Improvements of structural analysis software including more efficient use of existing hardware and improved structural modeling techniques are discussed. One modeling technique that is used successfully in static linear and nonlinear analysis is multilevel substructuring. This research extends the use of multilevel substructure modeling to include dynamic analysis and defines the requirements for a general purpose software system capable of efficient nonlinear dynamic analysis. The multilevel substructuring technique is presented, the analytical formulations and computational procedures for dynamic analysis and nonlinear mechanics are reviewed, and an approach to the design and implementation of a general purpose structural software system is presented.
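
    A minimal sketch of the substructuring idea the abstract builds on: static condensation eliminates a substructure's interior degrees of freedom, leaving a reduced stiffness and load at the boundary (for dynamics, Guyan-type reduction treats the mass matrix similarly). The three-spring chain below is an invented illustration.

    ```python
    import numpy as np

    def condense(K, f, interior, boundary):
        """Static condensation of a substructure:
        K_red = K_bb - K_bi K_ii^{-1} K_ib,  f_red = f_b - K_bi K_ii^{-1} f_i
        """
        Kii = K[np.ix_(interior, interior)]
        Kib = K[np.ix_(interior, boundary)]
        Kbb = K[np.ix_(boundary, boundary)]
        X = np.linalg.solve(Kii, Kib)         # K_ii^{-1} K_ib
        Y = np.linalg.solve(Kii, f[interior]) # K_ii^{-1} f_i
        return Kbb - Kib.T @ X, f[boundary] - Kib.T @ Y

    # 1-D three-spring chain (unit stiffness); the middle node is interior
    K = np.array([[ 2., -1.,  0.],
                  [-1.,  2., -1.],
                  [ 0., -1.,  2.]])
    f = np.array([0., 1., 0.])
    print(condense(K, f, interior=[1], boundary=[0, 2]))
    ```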

  9. Software metrics a guide to planning, analysis, and application

    CERN Document Server

    Pandian, C Ravindranath

    2003-01-01

    Software Metrics: A Guide to Planning, Analysis, and Application simplifies software measurement and explains its value as a pragmatic tool for management. Ideas and techniques presented in this book are derived from best practices. The ideas are field-proven, down to earth, and straightforward, making this volume an invaluable resource for those striving for process improvement.

  10. Integrated analysis software for bulk power system stability

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, T.; Nagao, T.; Takahashi, K. [Central Research Inst. of Electric Power Industry, Tokyo (Japan)

    1994-12-31

    This paper presents three software packages developed by the Central Research Institute of Electric Power Industry (CRIEPI) for bulk power network analysis, together with a user support system that manages, easily and reliably, the large volumes of data these packages require. (author) 3 refs., 7 figs., 2 tabs.

  11. Workforce Planning of Navigation Software Project Based on Competence Analysis

    Directory of Open Access Journals (Sweden)

    Shangfei Xie

    2011-03-01

    This paper introduces a quantitative research method for the personnel configuration of a software project by studying the effects of the overall competence of the developers in a vehicle navigation software project on factors such as project quality. The study shows that the overall competence of the developers is related to the after-submission defect density, productivity and the average delay of software Version 0.99. Further, a quantitative formula for competence is derived on the basis of statistics; meanwhile, according to the research results, an integer programming configuration method for navigation software project personnel based on competence analysis is concluded.
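
    A hedged sketch of the kind of integer program implied: maximize total team competence under a budget and a minimum team size. The developer names, competence scores and costs are invented, and brute-force enumeration stands in for a real ILP solver.

    ```python
    from itertools import combinations

    # candidate developers: (name, competence score, monthly cost)
    devs = [("d1", 7.2, 90), ("d2", 6.1, 70), ("d3", 8.4, 110),
            ("d4", 5.0, 55), ("d5", 6.8, 80)]

    BUDGET, MIN_TEAM = 240, 3

    def best_team(devs):
        """Enumerate feasible teams: maximize total competence subject to
        total cost <= BUDGET and team size >= MIN_TEAM."""
        best, best_score = None, -1.0
        for r in range(MIN_TEAM, len(devs) + 1):
            for team in combinations(devs, r):
                cost = sum(d[2] for d in team)
                score = sum(d[1] for d in team)
                if cost <= BUDGET and score > best_score:
                    best, best_score = team, score
        return [d[0] for d in best], best_score

    print(best_team(devs))
    ```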

  12. Free software for performing physical analysis of systems for digital radiography and mammography

    Energy Technology Data Exchange (ETDEWEB)

    Donini, Bruno; Lanconelli, Nico, E-mail: nico.lanconelli@unibo.it [Alma Mater Studiorum, Department of Physics and Astronomy, University of Bologna, Bologna 40127 (Italy); Rivetti, Stefano [Fisica Medica, Ospedale di Sassuolo S.p.A., Sassuolo 41049 (Italy); Bertolini, Marco [Medical Physics Unit, Azienda Ospedaliera ASMN, Istituto di Ricovero e Cura a Carattere Scientifico, Reggio Emilia 42123 (Italy)

    2014-05-15

    Purpose: In this paper, the authors present a free software for assisting users in achieving the physical characterization of x-ray digital systems and image quality checks. Methods: The program was developed as a plugin of a well-known public-domain suite ImageJ. The software can assist users in calculating various physical parameters such as the response curve (also termed signal transfer property), modulation transfer function (MTF), noise power spectra (NPS), and detective quantum efficiency (DQE). It also includes the computation of some image quality checks: defective pixel analysis, uniformity, dark analysis, and lag. Results: The software was made available in 2009 and has been used during the last couple of years by many users who gave us valuable feedback for improving its usability. It was tested for achieving the physical characterization of several clinical systems for digital radiography and mammography. Various published papers made use of the outcomes of the plugin. Conclusions: This software is potentially beneficial to a variety of users: physicists working in hospitals, staff working in radiological departments, such as medical physicists, physicians, engineers. The plugin, together with a brief user manual, are freely available and can be found online ( http://www.medphys.it/downloads.htm ). With our plugin users can estimate all three most important parameters used for physical characterization (MTF, NPS, and also DQE). The plugin can run on any operating system equipped with ImageJ suite. The authors validated the software by comparing MTF and NPS curves on a common set of images with those obtained with other dedicated programs, achieving a very good agreement.
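
    As a hedged illustration of one of the computations such a plugin performs, here is a NumPy sketch of a 2-D noise power spectrum estimate from a synthetic flat-field image. The ROI size, pixel pitch and simple mean-subtraction detrending are assumptions; the plugin's actual scheme may differ.

    ```python
    import numpy as np

    def nps_2d(flat, roi=64, px=0.1):
        """2-D noise power spectrum from a flat-field image: average the
        squared DFT magnitude of mean-subtracted ROIs, normalized by
        pixel area over ROI size (px in mm gives NPS in mm^2)."""
        h, w = flat.shape
        acc, n = np.zeros((roi, roi)), 0
        for i in range(0, h - roi + 1, roi):
            for j in range(0, w - roi + 1, roi):
                patch = flat[i:i + roi, j:j + roi].astype(float)
                patch -= patch.mean()                 # simple detrending
                acc += np.abs(np.fft.fft2(patch)) ** 2
                n += 1
        return acc / n * (px * px) / (roi * roi)

    flat = np.random.normal(1000, 5, (512, 512))      # synthetic flat field
    print("mean NPS (about variance * pixel area):", nps_2d(flat).mean())
    ```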

  13. Free software for performing physical analysis of systems for digital radiography and mammography

    International Nuclear Information System (INIS)

    Purpose: In this paper, the authors present a free software for assisting users in achieving the physical characterization of x-ray digital systems and image quality checks. Methods: The program was developed as a plugin of a well-known public-domain suite ImageJ. The software can assist users in calculating various physical parameters such as the response curve (also termed signal transfer property), modulation transfer function (MTF), noise power spectra (NPS), and detective quantum efficiency (DQE). It also includes the computation of some image quality checks: defective pixel analysis, uniformity, dark analysis, and lag. Results: The software was made available in 2009 and has been used during the last couple of years by many users who gave us valuable feedback for improving its usability. It was tested for achieving the physical characterization of several clinical systems for digital radiography and mammography. Various published papers made use of the outcomes of the plugin. Conclusions: This software is potentially beneficial to a variety of users: physicists working in hospitals, staff working in radiological departments, such as medical physicists, physicians, engineers. The plugin, together with a brief user manual, are freely available and can be found online ( http://www.medphys.it/downloads.htm ). With our plugin users can estimate all three most important parameters used for physical characterization (MTF, NPS, and also DQE). The plugin can run on any operating system equipped with ImageJ suite. The authors validated the software by comparing MTF and NPS curves on a common set of images with those obtained with other dedicated programs, achieving a very good agreement

  14. Software Requirements Analysis as Fault Predictor

    Science.gov (United States)

    Wallace, Dolores

    2003-01-01

    Waiting until the integration and system test phase to discover errors leads to more costly rework than resolving those same errors earlier in the lifecycle. Costs increase even more significantly once a software system has become operational. We can assess the quality of system requirements, but do little to correlate this information either to system assurance activities or to long-term reliability projections, both of which remain unclear and anecdotal. Extending earlier work on requirements accomplished by the ARM tool, measuring requirements quality information against code complexity and test data for the same system may be used to predict specific software modules containing high-impact or deeply embedded faults now escaping into operational systems. Such knowledge would lead to more effective and efficient test programs. It may enable insight into whether a program should be maintained or started over.

  15. State/event fault trees-A safety analysis model for software-controlled systems

    International Nuclear Information System (INIS)

    Safety models for software-controlled systems should be intuitive, compositional and have the expressive power to model both software and hardware behaviour. Moreover, they should provide quantitative results for failure or hazard probabilities. Fault trees are an accepted and intuitive model for safety analysis, but they are incapable of expressing state dependencies or temporal order of events. We propose to combine fault trees with an explicit State/Event semantics, using a graphical notation that is similar to Statecharts. Our new model, named State/Event Fault Trees (SEFTs), subsumes both deterministic state machines suited to describe software behaviour, and Markov chains that model probabilistic failures, while keeping the visualisation of causal chains known from fault trees. We allow exponentially distributed probabilistic events, deterministic delays, and triggered events. The model provides a component concept, where components are connected by typed ports. Quantitative evaluation is achieved by translating the component models to Deterministic and Stochastic Petri Nets (DSPNs) and using an existing tool for analysis or simulation. This paper, which is an extended version of [Kaiser B, Gramlich C. State-Event-Fault-Trees-a safety analysis model for software controlled systems. Computer safety, reliability, and security. Proceedings of the 23rd international conference, SAFECOMP 2004, Potsdam, Germany, September 21st-24th. Lecture Notes in Computer Science, vol. 3219, 2004.p. 195-209], revisits the model elements and the analysis procedure and provides a small case study of a fire alarm system, completed by an outlook on our tool project ESSaRel

  16. Analysis of Hollinshed watershed using GIS software

    OpenAIRE

    Hipp, Michael.

    1999-01-01

    The objective of this study is to apply GIS and storm water modeling software to develop an accurate hydrologic model of the Hollinshed watershed. Use of GIS will allow the user to quickly change the land use of specific areas within the watershed to determine the hydrologic effects throughout the watershed using the storm water model. Specific objectives were to: (1) develop a GIS database for the Hollinshed watershed; (2) develop an appropriate link/node diagram and correspond...

  17. GWAMA: software for genome-wide association meta-analysis

    Directory of Open Access Journals (Sweden)

    Mägi Reedik

    2010-05-01

    Background: Despite the recent success of genome-wide association studies in identifying novel loci contributing effects to complex human traits, such as type 2 diabetes and obesity, much of the genetic component of variation in these phenotypes remains unexplained. One way to improve power to detect further novel loci is through meta-analysis of studies from the same population, increasing the sample size over any individual study. Although statistical analysis software packages incorporate routines for meta-analysis, they are ill equipped to meet the challenges of the scale and complexity of data generated in genome-wide association studies. Results: We have developed flexible, open-source software for the meta-analysis of genome-wide association studies. The software incorporates a variety of error trapping facilities, and provides a range of meta-analysis summary statistics. The software is distributed with scripts that allow simple formatting of files containing the results of each association study and generate graphical summaries of genome-wide meta-analysis results. Conclusions: The GWAMA (Genome-Wide Association Meta-Analysis) software has been developed to perform meta-analysis of summary statistics generated from genome-wide association studies of dichotomous phenotypes or quantitative traits. Software with source files, documentation and example data files are freely available online at http://www.well.ox.ac.uk/GWAMA.
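
    For orientation, the core fixed-effect inverse-variance statistic underlying such meta-analyses can be sketched in a few lines; GWAMA itself adds error trapping, heterogeneity statistics and file handling not shown here, and the per-study effect sizes below are hypothetical.

    ```python
    import math

    def inverse_variance_meta(betas, ses):
        """Fixed-effect inverse-variance meta-analysis of per-study effect
        sizes: pooled beta, its standard error, and the Z statistic."""
        weights = [1.0 / se**2 for se in ses]
        beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
        se = math.sqrt(1.0 / sum(weights))
        return beta, se, beta / se

    # three hypothetical studies of one SNP
    print(inverse_variance_meta([0.12, 0.08, 0.15], [0.05, 0.04, 0.06]))
    ```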

  18. Analysis and Optimization of a Thyristor Structure Using Backside Schottky Contacts Suited for the High Temperature

    OpenAIRE

    Toulon, Gaëtan; Bourennane , Abdelhakim; Isoird, Karine

    2013-01-01

    International audience In high-current, high-voltage, high-temperature (T > 125 °C) power applications, commercially available conventional silicon thyristors are not suitable because they present a high leakage current. In this context, this paper presents a symmetrical-voltage thyristor structure that exhibits a lower leakage current and a higher breakover voltage compared with the conventional thyristor at T > 125 °C. It is shown through 2-D physical simulations that the replacement...

  19. Modular Open-Source Software for Item Factor Analysis

    Science.gov (United States)

    Pritikin, Joshua N.; Hunter, Micheal D.; Boker, Steven M.

    2015-01-01

    This article introduces an item factor analysis (IFA) module for "OpenMx," a free, open-source, and modular statistical modeling package that runs within the R programming environment on GNU/Linux, Mac OS X, and Microsoft Windows. The IFA module offers a novel model specification language that is well suited to programmatic generation…

  20. Long-term preservation of analysis software environment

    International Nuclear Information System (INIS)

    Long-term preservation of scientific data represents a challenge to experiments, especially regarding the analysis software. Preserving data is not enough; the full software and hardware environment is needed. Virtual machines (VMs) make it possible to preserve hardware “in software”. A complete infrastructure package has been developed for easy deployment and management of VMs, based on the CERN virtual machine (CernVM). Further, an HTTP-based file system, the CernVM file system (CVMFS), is used for the distribution of the software. It is possible to process data with any given software version, and a matching, regenerated VM version. A point-and-click web user interface is being developed for setting up the complete processing chain, including VM and software versions, number and type of processing nodes, and the particular type of analysis and data. This paradigm also allows for distributed cloud-computing on private and public clouds, for both legacy and contemporary experiments.

  1. Synchronized analysis of testbeam data with the Judith software

    International Nuclear Information System (INIS)

    The Judith software performs pixel detector analysis tasks utilizing two different data streams such as those produced by the reference and tested devices typically found in a testbeam. This software addresses and fixes problems arising from the desynchronization of the two simultaneously triggered data streams by detecting missed triggers in either of the streams. The software can perform all tasks required to generate particle tracks using multiple detector planes: it can align the planes, cluster hits and generate tracks from these clusters. This information can then be used to measure the properties of a particle detector with very fine spatial resolution. It was tested at DESY in the Kartel telescope, a silicon tracking detector, with ATLAS Diamond Beam Monitor modules as a device under test. - Highlights: • The Judith software performs analysis of testbeam data. • The software can synchronize individual data streams. • The efficiency of a prototype diamond pixel detector was measured
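
    One plausible way to detect missed triggers, sketched here on hypothetical timestamp lists rather than as Judith's actual algorithm, is to compare inter-trigger gaps between the two streams and, when they disagree, advance only the stream that saw the extra trigger:

        def synchronize(t_ref, t_dut, tol=0.1):
            """Pair triggers from a reference device and a device under
            test (DUT). If the inter-trigger gaps disagree, the stream
            with the smaller gap saw a trigger that the other missed."""
            pairs, i, j = [], 0, 0
            while i < len(t_ref) - 1 and j < len(t_dut) - 1:
                d_ref = t_ref[i + 1] - t_ref[i]
                d_dut = t_dut[j + 1] - t_dut[j]
                if abs(d_ref - d_dut) <= tol * max(d_ref, d_dut):
                    pairs.append((i, j))  # gaps agree: events are in sync
                    i += 1
                    j += 1
                elif d_ref > d_dut:
                    j += 1  # reference missed a trigger: skip one DUT event
                else:
                    i += 1  # DUT missed a trigger: skip one reference event
            return pairs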

  2. GammaLib and ctools. A software framework for the analysis of astronomical gamma-ray data

    Science.gov (United States)

    Knödlseder, J.; Mayer, M.; Deil, C.; Cayrou, J.-B.; Owen, E.; Kelley-Hoskins, N.; Lu, C.-C.; Buehler, R.; Forest, F.; Louge, T.; Siejkowski, H.; Kosack, K.; Gerard, L.; Schulz, A.; Martin, P.; Sanchez, D.; Ohm, S.; Hassan, T.; Brau-Nogué, S.

    2016-08-01

    The field of gamma-ray astronomy has seen important progress during the last decade, yet to date no common software framework has been developed for the scientific analysis of gamma-ray telescope data. We propose to fill this gap by means of the GammaLib software, a generic library that we have developed to support the analysis of gamma-ray event data. GammaLib was written in C++ and all functionality is available in Python through an extension module. Based on this framework we have developed the ctools software package, a suite of software tools that enables flexible workflows to be built for the analysis of Imaging Air Cherenkov Telescope event data. The ctools are inspired by science analysis software available for existing high-energy astronomy instruments, and they follow the modular ftools model developed by the High Energy Astrophysics Science Archive Research Center. The ctools were written in Python and C++, and can be either used from the command line via shell scripts or directly from Python. In this paper we present the GammaLib and ctools software versions 1.0 that were released at the end of 2015. GammaLib and ctools are ready for the science analysis of Imaging Air Cherenkov Telescope event data, and also support the analysis of Fermi-LAT data and the exploitation of the COMPTEL legacy data archive. We propose using ctools as the science tools software for the Cherenkov Telescope Array Observatory.

  3. Software quality studies using analytical metric analysis

    OpenAIRE

    Rodríguez Martínez, Cecilia

    2013-01-01

    Engineering companies currently devote a large amount of resources to detecting and correcting errors in their software code. These errors are generally due to mistakes made by developers when writing the code or its specifications. No tool is capable of detecting all of these errors, and some of them go unnoticed even after the testing process. For this reason, numerous investigations have tried to find indicators in the cod...

  4. New software for statistical analysis of Cambridge Structural Database data

    OpenAIRE

    Sykes, Richard A.; McCabe, Patrick; Allen, Frank H; Battle, Gary M.; Bruno, Ian J.; Wood, Peter A.

    2011-01-01

    A collection of new software tools is presented for the analysis of geometrical, chemical and crystallographic data from the Cambridge Structural Database (CSD). This software supersedes the program Vista. The new functionality is integrated into the program Mercury in order to provide statistical, charting and plotting options alongside three-dimensional structural visualization and analysis. The integration also permits immediate access to other information about specific CSD entries through...

  5. Project Report: Automatic Sequence Processor Software Analysis

    Science.gov (United States)

    Benjamin, Brandon

    2011-01-01

    The Mission Planning and Sequencing (MPS) element of Multi-Mission Ground System and Services (MGSS) provides space missions with multi-purpose software to plan spacecraft activities, sequence spacecraft commands, and then integrate these products and execute them on spacecraft. The Jet Propulsion Laboratory (JPL) is currently flying many missions. The processes for building, integrating, and testing the multi-mission uplink software need to be improved to meet the needs of the missions and the operations teams that command the spacecraft. The Multi-Mission Sequencing Team is responsible for collecting and processing the observations, experiments and engineering activities that are to be performed on a selected spacecraft. The collection of these activities is called a sequence, and ultimately a sequence becomes a sequence of spacecraft commands. The operations teams check the sequence to make sure that no constraints are violated. The workflow process involves sending a program start command, which activates the Automatic Sequence Processor (ASP). The ASP is currently a file-based system that is comprised of scripts written in Perl, C shell and awk. Once this start process is complete, the system checks for errors and aborts if there are any; otherwise the system converts the commands to binary and then sends the resultant information to be radiated to the spacecraft.

  6. New Graphical User Interface for EXAFS analysis with the GNXAS suite of programs

    Science.gov (United States)

    Hatada, Keisuke; Iesari, Fabio; Properzi, Leonardo; Minicucci, M.; di Cicco, Andrea

    2016-05-01

    GNXAS is a suite of programs based on multiple scattering calculations which performs a structural refinement of EXAFS spectra. It can be used for any system although it has been mainly developed to determine the local structure of disordered substances. We developed a user-friendly graphical user interface (GUI) to facilitate use of the codes by using wxPython. The developed GUI and the codes are multiplatform running on Windows, Macintosh and Linux systems, and are free shareware (http://gnxas.unicam.it). In this work we illustrate features and potentials of this newly developed version of GNXAS (w-GNXAS).

  7. Development of software tools for 4-D visualization and quantitative analysis of PHITS simulation results

    International Nuclear Information System (INIS)

    A suite of software tools has been developed to facilitate the development of apparatus using the radiation transport simulation code PHITS, by enabling 4D visualization (3D space and time) and quantitative analysis of so-called dieaway plots. To deliver usable tools as soon as possible, existing software was utilized as much as possible: ParaView will be used for the 4D visualization of the results, whereas the analyses of dieaway plots will be done with the ROOT toolkit using a tool named “diana”. To enable 4D visualization using ParaView, a group of tools (angel2vtk, DispDCAS1, CamPos) has been developed for converting the data format to one which can be read by ParaView and to ease the visualization. (author)

  8. Analytical Tools for Space Suit Design

    Science.gov (United States)

    Aitchison, Lindsay

    2011-01-01

    As indicated by the implementation of multiple small project teams within the agency, NASA is adopting a lean approach to hardware development that emphasizes quick product realization and rapid response to shifting program and agency goals. Over the past two decades, space suit design has been evolutionary in approach, with emphasis on building prototypes and then testing them with the largest practical range of subjects possible. The results of these efforts show continuous improvement but make scaled design and performance predictions almost impossible with limited budgets and little time. Thus, in an effort to start changing the way NASA approaches space suit design and analysis, the Advanced Space Suit group has initiated the development of an integrated design and analysis tool. It is a multi-year, if not decadal, development effort that, when fully implemented, is envisioned to generate analysis of any given space suit architecture or, conversely, predictions of ideal space suit architectures given specific mission parameters. The master tool will exchange information to and from a set of five sub-tool groups in order to generate the desired output. The basic functions of each sub-tool group, the initial relationships between the sub-tools, and a comparison to state-of-the-art software and tools are discussed.

  9. Applications of the BEam Cross section Analysis Software (BECAS)

    DEFF Research Database (Denmark)

    Blasques, José Pedro Albergaria Amaral; Bitsche, Robert; Fedorov, Vladimir;

    2013-01-01

    A newly developed framework is presented for structural design and analysis of long slender beam-like structures, e.g., wind turbine blades. The framework is based on the BEam Cross section Analysis Software – BECAS – a finite element based cross section analysis tool. BECAS is used for the...

  10. Application of Statistical Analysis Software in Food Scientific Modeling

    OpenAIRE

    Miaochao Chen; Kong Xiangsheng; Kan Chen

    2014-01-01

    In food science research, sophisticated statistical analysis problems are often encountered. In this study, methods for the curve regression model and the multiple regression model, both common in food science, were established using the SPSS statistical analysis software. The experimental results show that these methods can be used effectively in statistical modeling for food science.
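
    As a rough illustration of what such an analysis computes, here with numpy in place of SPSS and with invented data, a quadratic curve regression and a two-predictor multiple regression both reduce to a least-squares fit:

        import numpy as np

        # Hypothetical data: sensory score vs sugar (%) and acid (%) content
        sugar = np.array([4.0, 5.0, 6.0, 7.0, 8.0])
        acid = np.array([0.30, 0.28, 0.25, 0.24, 0.20])
        score = np.array([5.1, 6.0, 6.8, 7.4, 7.9])

        # Curve (quadratic) regression of score on sugar alone
        a, b, c = np.polyfit(sugar, score, deg=2)  # score ~ a*sugar^2 + b*sugar + c

        # Multiple linear regression of score on both predictors
        X = np.column_stack([np.ones_like(sugar), sugar, acid])
        coef, res, rank, sv = np.linalg.lstsq(X, score, rcond=None)
        print(coef)  # intercept, sugar slope, acid slope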

  11. Analysis of Variance: What Is Your Statistical Software Actually Doing?

    Science.gov (United States)

    Li, Jian; Lomax, Richard G.

    2011-01-01

    Users assume statistical software packages produce accurate results. In this article, the authors systematically examined Statistical Package for the Social Sciences (SPSS) and Statistical Analysis System (SAS) for 3 analysis of variance (ANOVA) designs, mixed-effects ANOVA, fixed-effects analysis of covariance (ANCOVA), and nested ANOVA. For each…

  12. JEM-X science analysis software

    DEFF Research Database (Denmark)

    Westergaard, Niels Jørgen Stenfeldt; Kretschmar, P.; Oxborrow, Carol Anne;

    2003-01-01

    The science analysis of the data from JEM-X on INTEGRAL is performed through a number of levels including corrections, good time selection, imaging and source finding, spectrum and light-curve extraction. These levels consist of individual executables, and the running of the complete analysis is controlled by a script where parameters for detailed settings are introduced. The end products are FITS files with a format compatible with standard analysis packages such as XSPEC.

  13. Intraprocedural dataflow analysis for software product lines

    DEFF Research Database (Denmark)

    Brabrand, Claus; Ribeiro, Márcio; Tolêdo, Társis;

    2013-01-01

    Since standard dataflow analysis techniques cannot handle software product lines (SPLs) directly, developers must generate and analyze all valid products individually, which is expensive for non-trivial SPLs. In this paper, we demonstrate how to take any standard intraprocedural dataflow analysis and automatically turn it into a feature-sensitive dataflow analysis in five different ways where the last...

  14. Power Analysis Software for Educational Researchers

    Science.gov (United States)

    Peng, Chao-Ying Joanne; Long, Haiying; Abaci, Serdar

    2012-01-01

    Given the importance of statistical power analysis in quantitative research and the repeated emphasis on it by American Educational Research Association/American Psychological Association journals, the authors examined the reporting practice of power analysis by the quantitative studies published in 12 education/psychology journals between 2005…
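
    As an aside, the kind of a priori power computation the article examines is a one-liner in modern statistical software; for example, with the statsmodels Python package (effect size, alpha and power chosen for illustration):

        from statsmodels.stats.power import TTestIndPower

        # Per-group sample size for a two-sample t test with a medium
        # effect (d = 0.5), alpha = .05 and desired power = .80
        n_per_group = TTestIndPower().solve_power(effect_size=0.5,
                                                  alpha=0.05, power=0.8)
        print(round(n_per_group))  # about 64 participants per group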

  15. Phycas: software for Bayesian phylogenetic analysis.

    Science.gov (United States)

    Lewis, Paul O; Holder, Mark T; Swofford, David L

    2015-05-01

    Phycas is open source, freely available Bayesian phylogenetics software written primarily in C++ but with a Python interface. Phycas specializes in Bayesian model selection for nucleotide sequence data, particularly the estimation of marginal likelihoods, central to computing Bayes Factors. Marginal likelihoods can be estimated using newer methods (Thermodynamic Integration and Generalized Steppingstone) that are more accurate than the widely used Harmonic Mean estimator. In addition, Phycas supports two posterior predictive approaches to model selection: Gelfand-Ghosh and Conditional Predictive Ordinates. The General Time Reversible family of substitution models, as well as a codon model, are available, and data can be partitioned with all parameters unlinked except tree topology and edge lengths. Phycas provides for analyses in which the prior on tree topologies allows polytomous trees as well as fully resolved trees, and provides for several choices for edge length priors, including a hierarchical model as well as the recently described compound Dirichlet prior, which helps avoid overly informative induced priors on tree length. PMID:25577605
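
    For context, the harmonic mean estimator that these newer methods improve upon can be written compactly; the sketch below (not Phycas code) estimates the log marginal likelihood from posterior log-likelihood samples, working in log space for numerical stability:

        import numpy as np
        from scipy.special import logsumexp

        def log_harmonic_mean(log_likelihoods):
            """Harmonic-mean estimate of the log marginal likelihood,
            log[ n / sum_i exp(-logL_i) ]; simple but notoriously
            unstable, which motivates steppingstone sampling."""
            logL = np.asarray(log_likelihoods)
            return np.log(len(logL)) - logsumexp(-logL)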

  16. Dispersion analysis of biotoxins using HPAC software

    International Nuclear Information System (INIS)

    Biotoxins are emerging threat agents produced by living organisms: bacteria, plants, or animals. Biotoxins are generally classified as cyanotoxins, hemotoxins, necrotoxins, neurotoxins, and cytotoxins. The potential application of classical biotoxins as weapons of terror has been recognized because of their extreme potency and lethality; their ease of production, transport, and misuse; and the need for prolonged intensive care among affected persons. Recently, emerging biotoxins such as ricin and T-2 mycotoxin have been used clandestinely by either terrorist groups or military combat operations. It is thus highly desirable to have a modeling system to simulate dispersions of biotoxins in a terrorist attack scenario in order to provide prompt technical support and casualty estimation to first responders and military rescuers. The Hazard Prediction and Assessment Capability (HPAC) automated software system provides the means to accurately predict the effects of hazardous material released into the atmosphere and its impact on civilian and military populations. The system uses integrated source terms, high-resolution weather forecasts and atmospheric transport and dispersion analyses to model hazard areas produced by military or terrorist incidents and industrial accidents. We have successfully incorporated the physical, chemical, epidemiological and biological characteristics of a variety of biotoxins into the HPAC system and have conducted numerous analyses for our emergency responders. The health effects caused by these hazards are closely reflected in HPAC output results. (author)

  17. Evolvability Analysis Method for Open Source Software Systems

    OpenAIRE

    Chauhan, Muhammad Aufeef

    2011-01-01

    Software systems evolve over their life span to accommodate changes in order to meet technical and business requirements. Evolution of open source software (OSS) is challenging because of the involvement of a large number of independent teams and developers who modify the systems according to their own requirements. These changes need to be evaluated, as they are incorporated into the system, against the long-term evolvability objectives. This paper presents the analysis...

  18. The software application and classification algorithms for welds radiograms analysis

    Science.gov (United States)

    Sikora, R.; Chady, T.; Baniukiewicz, P.; Grzywacz, B.; Lopato, P.; Misztal, L.; Napierała, L.; Piekarczyk, B.; Pietrusewicz, T.; Psuj, G.

    2013-01-01

    The paper presents a software implementation of an Intelligent System for Radiogram Analysis (ISAR). The system has to support radiologists in weld quality inspection. The image processing part of the software, with a graphical user interface, and the weld classification part are described, together with selected classification results. Classification was based on a few algorithms: an artificial neural network, k-means clustering, a simplified k-means, and rough sets theory.
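
    By way of illustration, the k-means step of such a classifier takes only a few lines with scikit-learn; the defect descriptors below are invented stand-ins for the features extracted from radiogram segments:

        import numpy as np
        from sklearn.cluster import KMeans

        # Hypothetical descriptors: (area, mean grey level, elongation)
        features = np.array([
            [120, 0.80, 1.2], [115, 0.78, 1.1],  # blob-like indications
            [300, 0.55, 6.0], [280, 0.60, 5.5],  # crack-like indications
        ])
        labels = KMeans(n_clusters=2, n_init=10,
                        random_state=0).fit_predict(features)
        print(labels)  # e.g. [0 0 1 1]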

  19. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    International Nuclear Information System (INIS)

    This report describes software safety analysis techniques and engineering guidelines for developing safety-critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the code-and-standards layer and the design-methodology-and-documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures (CMFs), high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity (D-in-D and D) analysis guidelines.

  20. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report describes software safety analysis techniques and engineering guidelines for developing safety-critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the code-and-standards layer and the design-methodology-and-documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures (CMFs), high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity (D-in-D and D) analysis guidelines.

  1. Adapted wavelet analysis from theory to software

    CERN Document Server

    Wickerhauser, Mladen Victor

    1994-01-01

    This detail-oriented text is intended for engineers and applied mathematicians who must write computer programs to perform wavelet and related analysis on real data. It contains an overview of mathematical prerequisites and proceeds to describe hands-on programming techniques to implement special programs for signal analysis and other applications. From the table of contents: - Mathematical Preliminaries - Programming Techniques - The Discrete Fourier Transform - Local Trigonometric Transforms - Quadrature Filters - The Discrete Wavelet Transform - Wavelet Packets - The Best Basis Algorithm - Multidimensional Library Trees - Time-Frequency Analysis - Some Applications - Solutions to Some of the Exercises - List of Symbols - Quadrature Filter Coefficients
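
    As a taste of the techniques the book covers, a discrete wavelet transform and a crude denoising step can be carried out with the PyWavelets package (an illustration independent of the book's own programs):

        import numpy as np
        import pywt  # PyWavelets

        # Noisy test signal
        t = np.linspace(0, 1, 1024)
        signal = np.sin(2 * np.pi * 5 * t) + 0.2 * np.random.randn(t.size)

        # Three-level discrete wavelet transform with a Daubechies-4 filter
        coeffs = pywt.wavedec(signal, 'db4', level=3)

        # Crude denoising: zero the finest detail coefficients, reconstruct
        coeffs[-1] = np.zeros_like(coeffs[-1])
        denoised = pywt.waverec(coeffs, 'db4')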

  2. Study and design of indigenous probabilistic safety analysis software

    International Nuclear Information System (INIS)

    With the rapid development of nuclear power technology and engineering, it is necessary and important to study and develop indigenous professional PSA software for nuclear power plant Living PSA development and Living PSA application. According to Living PSA regulations and technical requirements, NFRisk is designed to be a computer software system for Living PSA model development and maintenance, integrated with model construction tools, qualitative and quantitative analysis tools, post-analysis tools and so on, capable of fast analysis and quantification of large-scale PSA event tree and fault tree models. Meanwhile, NFRisk incorporates a data analysis and management code package, and provides an interface with commercial PSA software, which enables it to support multi-application development. In this paper, the design concept, design scheme and functions of NFRisk are described. (authors)
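
    The elementary quantification step of any fault tree engine combines gate probabilities under an independence assumption; a minimal sketch (not NFRisk code, with invented basic-event probabilities):

        def and_gate(probs):
            """Probability that all independent basic events occur."""
            p = 1.0
            for q in probs:
                p *= q
            return p

        def or_gate(probs):
            """Probability that at least one independent event occurs."""
            p = 1.0
            for q in probs:
                p *= (1.0 - q)
            return 1.0 - p

        # Top event = (pump A fails AND pump B fails) OR signal fails
        p_top = or_gate([and_gate([1e-3, 1e-3]), 1e-5])
        print(p_top)  # about 1.1e-5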

  3. Software Security Analysis : Execution Phase Audit

    OpenAIRE

    Carlsson, Bengt; Baca, Dejan

    2005-01-01

    Code revision of a leading telecom product was performed, combining manual audit and static analysis tools. On average, one exploitable vulnerability was found for every 4000 lines of code. Half of the located threats in the product were buffer overflows, followed by race conditions, misplaced trust, and poor random generators. Static analysis tools were used to speed up the revision process and to integrate security tests into the overall project process. The discussion analyses the effectiveness...

  4. Development of output user interface software to support analysis

    Science.gov (United States)

    Wahanani, Nursinta Adi; Natsir, Khairina; Hartini, Entin

    2014-09-01

    Data processing software packages such as VSOP and MCNPX have been scientifically proven and are complete. The results of VSOP and MCNPX are huge and complex text files. In the analysis process, users need additional processing, such as Microsoft Excel, to show informative results. This research develops user interface software for the output of VSOP and MCNPX. The VSOP program output is used to support neutronic analysis and the MCNPX program output is used to support burn-up analysis. Software development used iterative development methods, which allow for revision and the addition of features according to user needs. Processing time with this software is 500 times faster than with conventional methods using Microsoft Excel. PYTHON is used as the programming language, because Python is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac, Amiga, among others. Values that support neutronic analysis are k-eff, burn-up and the masses of Pu239 and Pu241. Burn-up analysis used the mass inventory values of the actinides (thorium, plutonium, neptunium and uranium). Values are visualized in graphical form to support analysis.
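
    A typical building block of such an interface is a parser that pulls the quantities of interest out of the bulky text output. In the minimal sketch below, the file name and line format are invented for illustration; a real parser must match the actual VSOP or MCNPX output layout:

        import re

        # Hypothetical line format: "burnup step 12 ... k-eff = 1.02345"
        pattern = re.compile(r"burnup step\s+(\d+).*?k-eff\s*=\s*([\d.]+)")

        results = []
        with open("output.txt") as fh:
            for line in fh:
                m = pattern.search(line)
                if m:
                    results.append((int(m.group(1)), float(m.group(2))))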

  5. Development of output user interface software to support analysis

    International Nuclear Information System (INIS)

    Data processing software packages such as VSOP and MCNPX have been scientifically proven and are complete. The results of VSOP and MCNPX are huge and complex text files. In the analysis process, users need additional processing, such as Microsoft Excel, to show informative results. This research develops user interface software for the output of VSOP and MCNPX. The VSOP program output is used to support neutronic analysis and the MCNPX program output is used to support burn-up analysis. Software development used iterative development methods, which allow for revision and the addition of features according to user needs. Processing time with this software is 500 times faster than with conventional methods using Microsoft Excel. PYTHON is used as the programming language, because Python is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac, Amiga, among others. Values that support neutronic analysis are k-eff, burn-up and the masses of Pu239 and Pu241. Burn-up analysis used the mass inventory values of the actinides (thorium, plutonium, neptunium and uranium). Values are visualized in graphical form to support analysis.

  6. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Full text: Following the massive adoption of digital Instrumentation and Control (I and C) systems for nuclear power plants (NPPs), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting an appropriate digital I and C system, and then to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be complementarily incorporated, the SSA combination would be more acceptable. As a result, if proper evaluation criteria are available, the analyst can choose an appropriate technique combination to perform the analysis on the basis of available resources. This research evaluated the currently applicable software safety analysis techniques, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis, and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts to choose the best SSA combination and arrange their own software safety plans. With this proposed method, analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for realizing the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio. However, their disadvantages are completeness and complexity.

  7. Software Quality Attribute Analysis by Architecture Reconstruction (SQUA3RE)

    NARCIS (Netherlands)

    Stormer, C.

    2007-01-01

    Software Quality Attribute Analysis by Architecture Reconstruction (SQUA3RE) is a method that fosters a goal-driven process to evaluate the impact of what-if scenarios on existing systems. The method is partitioned into SQA2 and ARE. The SQA2 part provides the analysis models that can be used for q

  8. Propensity Score Analysis in R: A Software Review

    Science.gov (United States)

    Keller, Bryan; Tipton, Elizabeth

    2016-01-01

    In this article, we review four software packages for implementing propensity score analysis in R: "Matching, MatchIt, PSAgraphics," and "twang." After briefly discussing essential elements for propensity score analysis, we apply each package to a data set from the Early Childhood Longitudinal Study in order to estimate the…

  9. Processing of terabytes of data for seismic noise analysis with the Python codes of the Whisper Suite. (Invited)

    Science.gov (United States)

    Briand, X.; Campillo, M.; Brenguier, F.; Boue, P.; Poli, P.; Roux, P.; Takeda, T.

    2013-12-01

    The Whisper Suite, part of the ERC project Whisper (whisper.obs.ujf-grenoble.fr), is developed with the high-level programming language Python and makes intensive use of the scientific library SciPy and of ObsPy, a library dedicated to the seismological community (www.obspy.org). The Whisper Suite consists of several tools. It provides a flexible way to specify a pipeline of seismogram processing. The user can define his own sequence of treatments, can use the Python libraries he needs and, eventually, can add his processing procedure to the Whisper Suite. Another package is dedicated to the computation of correlations. When dealing with large data sets, computational time becomes a major difficulty, and we devoted a lot of effort to making fast processing of the large data sets produced by present-day dense seismic networks possible. With the Whisper Suite, we currently manage more than 150 TB of data for ambient noise analysis. For the computation of 68 million correlations (daily, 5 Hz, correlation window 3600 s) on a 50-core cluster with a dedicated disk array, the required time is 4 days. With distributed storage (iRODS) and a grid of clusters (best-effort mode), both provided by the University of Grenoble, we currently compute one year of 4-hour correlations for 550 3C stations of the Hi-Net Japanese network in one day (about 350 million individual correlations). Note that the quadratic space complexity can be critical. We have also developed codes for the analysis of the correlations. The Whisper Suite is used to make challenging observations using cross-correlation techniques at various scales in the Earth. We present some examples of applications. Using a global data set of available broadband stations, we discuss the emergence of the complete teleseismic body wave field, including the deep phases used for imaging of the mantle and the core. The giant 2011 Tohoku-oki earthquake and the records of the dense Hi-Net array offer an opportunity to analyze
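
    The elementary operation underlying such pipelines is the cross-correlation of two pre-processed noise records; a minimal numpy sketch follows (the Whisper codes themselves are far more elaborate):

        import numpy as np

        def daily_correlation(trace_a, trace_b, max_lag):
            """Cross-correlate two equally sampled noise records and
            return lags from -max_lag to +max_lag (in samples)."""
            full = np.correlate(trace_a, trace_b, mode="full")
            mid = len(full) // 2  # zero-lag index for equal-length inputs
            return full[mid - max_lag: mid + max_lag + 1]

        # Stacking daily correlations over many days raises the
        # signal-to-noise ratio of the reconstructed Green's function:
        # stack = np.mean([daily_correlation(a, b, 500)
        #                  for a, b in day_pairs], axis=0)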

  10. ctools: Cherenkov Telescope Science Analysis Software

    Science.gov (United States)

    Knödlseder, Jürgen; Mayer, Michael; Deil, Christoph; Buehler, Rolf; Bregeon, Johan; Martin, Pierrick

    2016-01-01

    ctools provides tools for the scientific analysis of Cherenkov Telescope Array (CTA) data. Analysis of data from existing Imaging Air Cherenkov Telescopes (such as H.E.S.S., MAGIC or VERITAS) is also supported, provided that the data and response functions are available in the format defined for CTA. ctools comprises a set of ftools-like binary executables with a command-line interface allowing for interactive step-wise data analysis. A Python module allows control of all executables, and the creation of shell or Python scripts and pipelines is supported. ctools provides cscripts, which are Python scripts complementing the binary executables. Extension of the ctools package by user-defined binary executables or Python scripts is supported. ctools are based on GammaLib (ascl:1110.007).

  11. QSoas: A Versatile Software for Data Analysis.

    Science.gov (United States)

    Fourmond, Vincent

    2016-05-17

    Undoubtedly, the most natural way to confirm a model is to quantitatively verify its predictions. However, this is not done systematically, and one of the reasons for that is the lack of appropriate tools for analyzing data, because the existing tools do not implement the required models or they lack the flexibility required to perform data analysis in a reasonable time. We present QSoas, an open-source, cross-platform data analysis program written to overcome these problems. In addition to standard data analysis procedures and full automation using scripts, QSoas features a very powerful data fitting interface with support for arbitrary functions, differential equation and kinetic system integration, and flexible global fits. QSoas is available from http://www.qsoas.org . PMID:27096413

  12. Software Process Models and Analysis on Failure of Software Development Projects

    OpenAIRE

    Kaur, Rupinder; Sengupta, Jyotsna

    2013-01-01

    The software process model consists of a set of activities undertaken to design, develop and maintain software systems. A variety of software process models have been designed to structure, describe and prescribe the software development process. The software process models play a very important role in software development, so it forms the core of the software product. Software project failure is often devastating to an organization. Schedule slips, buggy releases and missing features can me...

  13. Buying in to bioinformatics: an introduction to commercial sequence analysis software.

    Science.gov (United States)

    Smith, David Roy

    2015-07-01

    Advancements in high-throughput nucleotide sequencing techniques have brought with them state-of-the-art bioinformatics programs and software packages. Given the importance of molecular sequence data in contemporary life science research, these software suites are becoming an essential component of many labs and classrooms, and as such are frequently designed for non-computer specialists and marketed as one-stop bioinformatics toolkits. Although beautifully designed and powerful, user-friendly bioinformatics packages can be expensive and, as more arrive on the market each year, it can be difficult for researchers, teachers and students to choose the right software for their needs, especially if they do not have a bioinformatics background. This review highlights some of the currently available and most popular commercial bioinformatics packages, discussing their prices, usability, features and suitability for teaching. Although several commercial bioinformatics programs are arguably overpriced and overhyped, many are well designed, sophisticated and, in my opinion, worth the investment. Whether you are just beginning your foray into molecular sequence analysis or are an experienced genomicist, I encourage you to explore proprietary software bundles. They have the potential to streamline your research, increase your productivity, energize your classroom and, if anything, add a bit of zest to the often dry detached world of bioinformatics. PMID:25183247

  14. Analysis on testing and operational reliability of software

    Institute of Scientific and Technical Information of China (English)

    ZHAO Jing; LIU Hong-wei; CUI Gang; WANG Hui-qiang

    2008-01-01

    Software reliability was estimated based on NHPP software reliability growth models. Testing reliability and operational reliability may be essentially different. On the basis of analyzing the similarities and differences of the testing phase and the operational phase, using the concepts of operational reliability and testing reliability, different forms of the comparison between the operational failure ratio and the predicted testing failure ratio were conducted, and the mathematical discussion and analysis were performed in detail. Finally, optimal software release was studied using software failure data. The results show that two kinds of conclusions can be derived by applying this method: one is to continue testing to meet the required reliability level of users, and the other is that testing stops when the required operational reliability is met, thus reducing the testing cost.
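
    As a concrete example of the model family involved, the Goel-Okumoto NHPP has mean value function m(t) = a(1 - exp(-b t)) and failure intensity ab exp(-b t). A least-squares fit to cumulative failure counts, a simplification of the likelihood-based fits used in practice, might look like this (invented data):

        import numpy as np
        from scipy.optimize import curve_fit

        def mean_failures(t, a, b):
            """Goel-Okumoto mean value function m(t) = a(1 - exp(-bt))."""
            return a * (1.0 - np.exp(-b * t))

        # Hypothetical cumulative failure counts observed during testing
        t_obs = np.array([10, 20, 30, 40, 50, 60], dtype=float)
        n_obs = np.array([12, 21, 27, 31, 34, 36], dtype=float)

        (a_hat, b_hat), _ = curve_fit(mean_failures, t_obs, n_obs,
                                      p0=(40.0, 0.05))
        intensity = a_hat * b_hat * np.exp(-b_hat * t_obs)  # lambda(t)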

  15. Do You Need ERP? In the Business World, Enterprise Resource Planning Software Keeps Costs down and Productivity up. Should Districts Follow Suit?

    Science.gov (United States)

    Careless, James

    2007-01-01

    Enterprise resource planning (ERP) software does what school leaders have always wanted their computer systems to do: It sees all. By integrating every IT application an organization has--from purchasing and inventory control to payroll--ERPs create a single unified system. Not only does this give IT managers a holistic view to what is happening…

  16. Evaluating software development by analysis of changes - Some data from the Software Engineering Laboratory

    Science.gov (United States)

    Weiss, D. M.; Basili, V. R.

    1985-01-01

    Basili and Weiss (1984) have discussed an approach for obtaining valid data which may be used to evaluate software development methodologies in a production environment. The methodology consists of five elements, including the identification of goals, the determination of questions of interest from the goals, the development of a data collection form, the development of data collection procedures, and the validation and analysis of the data. The current investigation is concerned with the presentation of the results from such an evaluation. The presented data were collected as part of studies reported by Basili et al. (1977). These studies had been conducted by NASA's Software Engineering Laboratory (SEL). Attention is given to an overview of the SEL, the application of the considered methodology, the results of a data analysis, and conclusions about the SEL environment.

  17. The software analysis project for the Office of Human Resources

    Science.gov (United States)

    Tureman, Robert L., Jr.

    1994-01-01

    There were two major sections of the project for the Office of Human Resources (OHR). The first section was to conduct a planning study to analyze software use, with the goal of recommending software purchases and determining whether the need exists for a file server. The second section was analysis and distribution planning for a retirement planning computer program entitled VISION, provided by NASA Headquarters. The software planning study was developed to help OHR analyze the current administrative desktop computing environment and make decisions regarding software acquisition and implementation. There were three major areas addressed by the study: the current environment, new software requirements, and strategies regarding the implementation of a server in the Office. To gather data on the current environment, employees were surveyed and an inventory of computers was produced. The surveys were compiled and analyzed by the ASEE fellow with interpretation help from OHR staff. New software requirements represented a compilation and analysis of the surveyed requests of OHR personnel. Finally, the information on the use of a server represents research done by the ASEE fellow and analysis of survey data to determine software requirements for a server. This included selection of a methodology to estimate the number of copies of each software program required given current use and estimated growth. The report presents the results of the computing survey, a description of the current computing environment, recommendations for changes in the computing environment, current software needs, management advantages of using a server, and management considerations in the implementation of a server. In addition, detailed specifications were presented for the hardware and software recommendations to offer a complete picture to OHR management. The retirement planning computer program available to NASA employees will aid in long-range retirement planning. The intended audience is the NASA civil

  18. Software for Data Analysis Programming with R

    CERN Document Server

    Chambers, John

    2008-01-01

    Although statistical design is one of the oldest branches of statistics, its importance is ever increasing, especially in the face of the data flood that often faces statisticians. It is important to recognize the appropriate design, and to understand how to effectively implement it, being aware that the default settings from a computer package can easily provide an incorrect analysis. The goal of this book is to describe the principles that drive good design, paying attention to both the theoretical background and the problems arising from real experimental situations. Designs are motivated t

  19. Combinatorial Generation of Test Suites

    Science.gov (United States)

    Dvorak, Daniel L.; Barrett, Anthony C.

    2009-01-01

    Testgen is a computer program that generates suites of input and configuration vectors for testing other software or software/hardware systems. As systems become ever more complex, often, there is not enough time to test systems against all possible combinations of inputs and configurations, so test engineers need to be selective in formulating test plans. Testgen helps to satisfy this need: In response to a test-suite-requirement-specification model, it generates a minimal set of test vectors that satisfies all the requirements.
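
    A common selection criterion for such suites is pairwise coverage: every value pair of every two parameters must appear in at least one vector. The greedy toy implementation below is a stand-in for whatever strategy Testgen actually uses:

        from itertools import combinations, product

        def pairwise_suite(parameters):
            """Greedily pick test vectors until every value pair of
            every two parameters is covered."""
            names = list(parameters)
            uncovered = {((i, v1), (j, v2))
                         for i, j in combinations(range(len(names)), 2)
                         for v1 in parameters[names[i]]
                         for v2 in parameters[names[j]]}
            candidates = list(product(*parameters.values()))
            suite = []
            while uncovered:
                best = max(candidates, key=lambda vec: len(
                    {((i, vec[i]), (j, vec[j]))
                     for i, j in combinations(range(len(vec)), 2)}
                    & uncovered))
                suite.append(best)
                uncovered -= {((i, best[i]), (j, best[j]))
                              for i, j in combinations(range(len(best)), 2)}
            return suite

        config = {"os": ["linux", "vxworks"], "cpu": ["x86", "ppc"],
                  "mode": ["nominal", "safe", "science"]}
        print(len(pairwise_suite(config)), "vectors instead of", 2 * 2 * 3)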

  20. Availability Analysis of Application Servers Using Software Rejuvenation and Virtualization

    Institute of Scientific and Technical Information of China (English)

    Thandar Thein; Jong Sou Park

    2009-01-01

    Demands on software reliability and availability have increased tremendously due to the nature of present-day applications. We focus on the software aspect of high availability for application servers, since the unavailability of servers more often originates from software faults than from hardware faults. The software rejuvenation technique has been widely used to avoid the occurrence of unplanned failures, mainly due to the phenomena of software aging or caused by transient failures. In this paper, we first present a new way of using virtual-machine-based software rejuvenation, named VMSR, to offer high availability for application server systems. Second, we model a single physical server which is used to host multiple virtual machines (VMs) with the VMSR framework using stochastic modeling, and evaluate it through both numerical analysis and simulation with the SHARPE (Symbolic Hierarchical Automated Reliability and Performance Evaluator) tool. This VMSR model is very general and can capture application server characteristics, failure behavior, and performability measures. Our results demonstrate that the VMSR approach is a practical way to ensure uninterrupted availability and to optimize performance for aging applications.

  1. The BEPCⅡ Data Production and BESⅢ offline Analysis Software System

    Institute of Scientific and Technical Information of China (English)

    Zepu Mao

    2001-01-01

    The BES detector has operated for about 12 years, and the BES offline data analysis environment has also been developed and upgraded along with developments of the BES hardware and software. The BESⅢ software system will operate for many years, so it should keep up with new developments in software technology; it should be highly flexible, powerful, stable and easy to maintain. The following points should be taken into account: 1) To benefit the collaboration and improve exchanges with international HEP experiments, this system should be set up by adopting or referring to the newest software technology from advanced experiments in the world. 2) It should support hundreds of existing BES software packages and serve both old experts who are familiar with the BESⅡ software and computing environment and new members who are going to benefit from the new system. 3) Most existing BESⅡ packages will be modified or re-designed according to the hardware changes.

  2. RAVEN, a New Software for Dynamic Risk Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cristian Rabiti; Andrea Alfonsi; Joshua Cogliati; Diego Mandelli; Robert Kinoshita

    2014-06-01

    RAVEN is a generic software driver to perform parametric and probabilistic analysis of codes simulating complex systems. Initially developed to provide dynamic risk analysis capabilities to the RELAP-7 code [1], it is currently being generalized with the addition of Application Programming Interfaces (APIs). These interfaces are used to extend RAVEN capabilities to any software, as long as all the parameters that need to be perturbed are accessible by input files or directly via Python interfaces. RAVEN is capable of investigating the system response by probing the input space using Monte Carlo, grid strategies, or Latin Hypercube schemes, but its strength is its focus on system feature discovery, such as limit surfaces separating regions of the input space leading to system failure, using dynamic supervised learning techniques. The paper presents an overview of the software capabilities and their implementation schemes, followed by some application examples.
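
    For example, a Latin Hypercube probing of a two-dimensional input space, of the kind RAVEN performs on a much larger scale, can be sketched with SciPy (the input bounds are invented):

        from scipy.stats import qmc

        # 8 Latin Hypercube points over two uncertain inputs, e.g. a
        # power level in [90, 110] and a valve delay in [0.1, 0.5]
        sampler = qmc.LatinHypercube(d=2, seed=42)
        unit_sample = sampler.random(n=8)  # points in [0, 1)^2
        points = qmc.scale(unit_sample, [90.0, 0.1], [110.0, 0.5])
        # Each row of `points` would drive one run of the simulated system.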

  3. Analysis of Gumbel Model for Software Reliability Using Bayesian Paradigm

    Directory of Open Access Journals (Sweden)

    Raj Kumar

    2012-12-01

    Full Text Available In this paper, we have illustrated the suitability of the Gumbel model for software reliability data. The model parameters are estimated using likelihood-based inferential procedures: classical as well as Bayesian. The quasi Newton-Raphson algorithm is applied to obtain the maximum likelihood estimates and associated probability intervals. The Bayesian estimates of the parameters of the Gumbel model are obtained using the Markov Chain Monte Carlo (MCMC) simulation method in OpenBUGS (established software for Bayesian analysis using Markov Chain Monte Carlo methods). R functions are developed to study the statistical properties, model validation and comparison tools of the model, and the output analysis of MCMC samples generated from OpenBUGS. Details of applying MCMC to parameter estimation for the Gumbel model are elaborated, and a real software reliability data set is considered to illustrate the methods of inference discussed in this paper.
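
    The classical side of such an analysis, maximum likelihood fitting of the Gumbel model, is readily sketched with SciPy (the failure data below are invented; the paper itself uses the quasi Newton-Raphson algorithm and OpenBUGS):

        import numpy as np
        from scipy.stats import gumbel_r

        # Hypothetical times between failures from a software test phase
        data = np.array([12.0, 18.5, 21.0, 25.3, 27.8, 31.2, 36.9, 44.1])

        loc_hat, scale_hat = gumbel_r.fit(data)  # ML estimates
        print(loc_hat, scale_hat)
        print(gumbel_r.sf(30.0, loc_hat, scale_hat))  # reliability R(30)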

  4. First statistical analysis of Geant4 quality software metrics

    Science.gov (United States)

    Ronchieri, Elisabetta; Grazia Pia, Maria; Giacomini, Francesco

    2015-12-01

    Geant4 is a simulation system of particle transport through matter, widely used in several experimental areas from high energy physics and nuclear experiments to medical studies. Some of its applications may involve critical use cases; therefore they would benefit from an objective assessment of the software quality of Geant4. In this paper, we provide a first statistical evaluation of software metrics data related to a set of Geant4 physics packages. The analysis aims at identifying risks for Geant4 maintainability, which would benefit from being addressed at an early stage. The findings of this pilot study set the grounds for further extensions of the analysis to the whole of Geant4 and to other high energy physics software systems.

  5. Control and analysis software for a laser scanning microdensitometer

    Indian Academy of Sciences (India)

    H R Bundel; C P Navathe; P A Naik; P D Gupta

    2006-02-01

    A PC-based control software and data acquisition system has been developed for an existing commercial microdensitometer (Biomed make, model No. SL-2D/1D UV/VIS) to facilitate scanning and analysis of X-ray films. The software is developed in LabVIEW and includes operation of the microdensitometer in 1D and 2D scans and analysis of spatial or spectral data on X-ray films, such as optical density, intensity and wavelength. It provides a user-friendly Graphical User Interface (GUI) to analyse the scanned data and also store the analysed data/image in popular formats, such as data in Excel and images in JPEG. It also has an on-line calibration facility with standard optical density tablets. The control software and data acquisition system is simple, inexpensive and versatile.

  6. The CMS computing, software and analysis challenge

    International Nuclear Information System (INIS)

    The CMS experiment has performed a comprehensive challenge during May 2008 to test the full scope of offline data handling and analysis activities needed for data taking during the first few weeks of LHC collider operations. It constitutes the first full-scale challenge with large statistics under the conditions expected at the start-up of the LHC, including the expected initial mis-alignments and mis-calibrations for each sub-detector, and event signatures and rates typical for low instantaneous luminosity. Particular emphasis has been given to the prompt reconstruction workflows, and to the procedures for the alignment and calibration of each sub-detector. The latter were performed with restricted latency using the same computing infrastructure that will be used for real data, and the resulting calibration and alignment constants were used to re-reconstruct the data at Tier-1 centres. The paper addresses the goals and practical experience from the challenge, as well as the lessons learned in view of LHC data taking.

  7. An Overview of the XGAM Code and Related Software for Gamma-ray Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Younes, W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-11-13

    The XGAM spectrum-fitting code and associated software were developed specifically to analyze the complex gamma-ray spectra that can result from neutron-induced reactions. The XGAM code is designed to fit a spectrum over the entire available gamma-ray energy range as a single entity, in contrast to the more traditional piecewise approaches. This global-fit philosophy enforces background continuity as well as consistency between local and global behavior throughout the spectrum, and in a natural way. This report presents XGAM and the suite of programs built around it with an emphasis on how they fit into an overall analysis methodology for complex gamma-ray data. An application to the analysis of time-dependent delayed gamma-ray yields from 235U fission is shown in order to showcase the codes and how they interact.

  8. Evaluation of peak-fitting software for gamma spectrum analysis

    International Nuclear Information System (INIS)

    In all applications of gamma-ray spectroscopy, one of the most important and delicate parts of the data analysis is the fitting of the gamma-ray spectra, where information such as the number of counts, the position of the centroid and the width, for instance, is associated with each peak of each spectrum. There is a wide choice of computer programs that perform this type of analysis, and the ones most commonly used in routine work are those that automatically locate and fit the peaks. This fit can be made in several different ways; the most common are to fit a Gaussian function to each peak or simply to integrate the area under the peak, but some programs go far beyond that and include several small corrections to the simple Gaussian peak function in order to compensate for secondary effects. In this work, several gamma-ray spectroscopy programs are compared in the task of finding and fitting the gamma-ray peaks in spectra taken with standard sources of 137Cs, 60Co, 133Ba and 152Eu. The results show that all of the automatic programs can be properly used in the task of finding and fitting peaks, with the exception of GammaVision; also, it was possible to verify that the automatic peak-fitting programs performed as well as, and sometimes even better than, a manual peak-fitting program. (author)
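
    The Gaussian-plus-background fit discussed above is easily reproduced for a single peak; a minimal sketch with scipy.optimize on a synthetic spectrum slice (channel numbers and counts invented):

        import numpy as np
        from scipy.optimize import curve_fit

        def peak(x, area, centroid, sigma, b0, b1):
            """Gaussian photopeak on a linear background."""
            gauss = (area / (sigma * np.sqrt(2 * np.pi))
                     * np.exp(-0.5 * ((x - centroid) / sigma) ** 2))
            return gauss + b0 + b1 * x

        # Channels and counts around a single synthetic peak
        x = np.arange(640, 690, dtype=float)
        y = peak(x, 5000, 662, 3.0, 40, -0.02) + np.random.poisson(5, x.size)

        p0 = (4000, 660, 2.5, 30, 0.0)  # initial guesses
        popt, pcov = curve_fit(peak, x, y, p0=p0, sigma=np.sqrt(y + 1))
        area, centroid, fwhm = popt[0], popt[1], 2.3548 * popt[2]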

  9. CMS software architecture. Software framework, services and persistency in high level trigger, reconstruction and analysis

    CERN Document Server

    Innocente, Vincenzo; Stickland, D P

    2001-01-01

    This paper describes the design of a resilient and flexible software architecture that has been developed to satisfy the data processing requirements of a large HEP experiment, CMS, currently being constructed at the LHC machine at CERN. We describe various components of a software framework that allows integration of physics modules and which can be easily adapted for use in different processing environments, both real-time (online trigger) and offline (event reconstruction and analysis). Features such as the mechanisms for scheduling algorithms, configuring the application and managing the dependencies among modules are described in detail. In particular, a major effort has been placed on providing a service for managing persistent data, and the experience of using a commercial ODBMS (Objectivity/DB) is therefore described in detail. (13 refs).

  10. Comparative Analysis and Evaluation of Existing Risk Management Software

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available The focus of this article lies on the specific features of existing software packages for risk management, differentiating three categories. As representative of these categories, we consider the Crystal Ball, Haufe Risikomanager and MIS - Risk Management solutions, outlining their strengths and weaknesses in a comparative analysis.

  11. IDEAL: A Software Package for Analysis of Influence Diagrams

    OpenAIRE

    Srinivas, Sampath; Breese, John S.

    2013-01-01

    IDEAL (Influence Diagram Evaluation and Analysis in Lisp) is a software environment for creation and evaluation of belief networks and influence diagrams. IDEAL is primarily a research tool and provides an implementation of many of the latest developments in belief network and influence diagram evaluation in a unified framework. This paper describes IDEAL and some lessons learned during its development.

  12. Application of software technology to automatic test data analysis

    Science.gov (United States)

    Stagner, J. R.

    1991-01-01

    The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.

  13. Software Product "Equilibrium" for Preparation and Analysis of Aquatic Solutions

    CERN Document Server

    Bontchev, G D; Ivanov, P I; Maslov, O D; Milanov, M V; Dmitriev, S N

    2003-01-01

    Software product "Equilibrium" for preparation and analysis of aquatic solutions is developed. The program allows determining analytical parameters of a solution, such as ionic force and pH. "Equilibrium" is able to calculate the ratio of existing ion forms in the solution, with respect to the hydrolysis and complexation in the presence of one or more ligands.

  14. jPopGen Suite: population genetic analysis of DNA polymorphism from nucleotide sequences with errors

    OpenAIRE

    Liu, Xiaoming

    2012-01-01

    1. Next-generation sequencing (NGS) is being increasingly used in ecological and evolutionary studies. Though promising, NGS is known to be error-prone. Sequencing error can cause significant bias for population genetic analysis of a sequence sample.

  15. FIRE: an open-software suite for real-time 2D/3D image registration for image guided radiotherapy research

    Science.gov (United States)

    Furtado, H.; Gendrin, C.; Spoerk, J.; Steiner, E.; Underwood, T.; Kuenzler, T.; Georg, D.; Birkfellner, W.

    2016-03-01

    Radiotherapy treatments have changed at a tremendously rapid pace. Dose delivered to the tumor has escalated while organs at risk (OARs) are better spared. The impact of moving tumors during dose delivery has become higher due to very steep dose gradients. Intra-fractional tumor motion has to be managed adequately to reduce errors in dose delivery. For tumors with large motion, such as tumors in the lung, tracking is an approach that can reduce position uncertainty. Tumor tracking approaches range from purely image-intensity-based techniques to motion estimation based on surrogate tracking. Research efforts are often based on custom-designed software platforms which take too much time and effort to develop. To address this challenge we have developed an open software platform especially focusing on tumor motion management. FLIRT is a freely available open-source software platform. The core method for tumor tracking is purely intensity-based 2D/3D registration. The platform is written in C++ using the Qt framework for the user interface. The performance-critical methods are implemented on the graphics processor using the CUDA extension. One registration can be as fast as 90 ms (11 Hz). This is suitable for tracking tumors moving due to respiration (~0.3 Hz) or heartbeat (~1 Hz). Apart from focusing on high performance, the platform is designed to be flexible and easy to use. Current use cases range from tracking feasibility studies to patient positioning and method validation. Such a framework has the potential of enabling the research community to rapidly perform patient studies or try new methods.

  16. ReX: A suite of computational tools for the design, visualization, and analysis of chimeric protein libraries.

    Science.gov (United States)

    Huang, Weiliang; Johnston, Wayne A; Boden, Mikael; Gillam, Elizabeth M J

    2016-02-01

    Directed evolution has greatly facilitated protein engineering and provided new insights into protein structure-function relationships. DNA shuffling using restriction enzymes is a particularly simple and cost-effective means of recombinatorial evolution that is well within the capability of most molecular biologists, but tools for the design and analysis of such experiments are limited. Here we introduce a suite of freely available online tools to make the construction and analysis of chimeric libraries readily accessible to the novice. REcut (http://qpmf.rx.umaryland.edu/REcut.html) facilitates the choice of DNA fragmentation strategy, while Xover (http://qpmf.rx.umaryland.edu/Xover.html) analyzes chimeric mutants to reveal recombination patterns and extract quantitative data. PMID:26842355

  17. BIM Software Capability and Interoperability Analysis : An analytical approach toward structural usage of BIM software (S-BIM)

    OpenAIRE

    A. Taher, Ali

    2016-01-01

    This study focused on the structural analysis of BIM models. Different commercial software packages (Autodesk products and Rhinoceros) are presented through the modelling and analysis of different structures with varying complexity, section properties, geometry, and material. Besides the commercial software, different architectural and structural analysis tools are evaluated (Dynamo, Grasshopper, add-on tools, direct link, indirect link via IFC).

  18. Open source data analysis and visualization software for optical engineering

    Science.gov (United States)

    Smith, Greg A.; Lewis, Benjamin J.; Palmer, Michael; Kim, Dae Wook; Loeff, Adrian R.; Burge, James H.

    2012-10-01

    SAGUARO is open-source software developed to simplify data assimilation, analysis, and visualization by providing a single framework for disparate data sources, from raw hardware measurements to optical simulation output. Developed with a user-friendly graphical interface in the MATLAB™ environment, SAGUARO is intended to be easy for the end user in search of useful optical information as well as for the developer wanting to add new modules and functionalities. We present here the flexibility of the SAGUARO software and discuss how it can be applied to the wider optical engineering community.

  19. [Finite Element Analysis of Intravascular Stent Based on ANSYS Software].

    Science.gov (United States)

    Shi, Gengqiang; Song, Xiaobing

    2015-10-01

    This paper adopted UG 8.0 to build the stent and blood vessel models. The models were then imported into the finite element analysis software ANSYS. The simulation results of the ANSYS software showed that after endothelial stent implantation the velocity of the blood was slow and the fluctuation of velocity was small, which meant that the flow was relatively stable. When blood flowed through the endothelial stent, the pressure gradually became smaller, and the range of the pressure was not wide. The endothelial shear stress remained basically unchanged. In general, it can be concluded that the endothelial stents have little impact on the flow of blood and can fully realize their function. PMID:26964302

  20. Strategic Analysis of the Enterprise Mobile Device Management Software Industry

    OpenAIRE

    Shesterin, Dmitry

    2012-01-01

    This paper analyzes the enterprise mobile device management industry and evaluates three strategic alternatives by which an established computer systems management software manufacturing company can enter this industry. The analysis of the three strategic alternatives to build, buy or partner in order to bring to market an enterprise mobile device management product offering delves into an examination of the company’s existing position and performance; conducts an external analysis of the ent...

  1. Non-Imaging Software/Data Analysis Requirements

    Science.gov (United States)

    1984-01-01

    The analysis software needs of the non-imaging planetary data user are discussed. Assumptions as to the nature of the planetary science data centers where the data are physically stored are advanced, the scope of the non-imaging data is outlined, and facilities that users are likely to need to define and access data are identified. Data manipulation and analysis needs and display graphics are discussed.

  2. Spectrum analysis on quality requirements consideration in software design documents

    OpenAIRE

    Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji

    2013-01-01

    Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as source code and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called “spectrum analysi...

  3. GWAMA: software for genome-wide association meta-analysis

    OpenAIRE

    Mägi Reedik; Morris Andrew P

    2010-01-01

    Background: Despite the recent success of genome-wide association studies in identifying novel loci contributing effects to complex human traits, such as type 2 diabetes and obesity, much of the genetic component of variation in these phenotypes remains unexplained. One way to improve power to detect further novel loci is through meta-analysis of studies from the same population, increasing the sample size over any individual study. Although statistical software analysis packages in...
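
    The core computation behind such meta-analysis tools is fixed-effect inverse-variance weighting; a minimal sketch (illustrative only, not GWAMA's code; the per-study effects are invented):

    ```python
    import math

    def fixed_effect_meta(betas, ses):
        """Inverse-variance weighted effect, its standard error, and z-score."""
        weights = [1.0 / se ** 2 for se in ses]
        beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
        se = math.sqrt(1.0 / sum(weights))
        return beta, se, beta / se

    # one SNP, effect estimates from three studies of the same population
    print(fixed_effect_meta([0.12, 0.08, 0.15], [0.05, 0.04, 0.07]))
    ```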

  4. STING Millennium: a web-based suite of programs for comprehensive and simultaneous analysis of protein structure and sequence

    Science.gov (United States)

    Neshich, Goran; Togawa, Roberto C.; Mancini, Adauto L.; Kuser, Paula R.; Yamagishi, Michel E. B.; Pappas, Georgios; Torres, Wellington V.; Campos, Tharsis Fonseca e; Ferreira, Leonardo L.; Luna, Fabio M.; Oliveira, Adilton G.; Miura, Ronald T.; Inoue, Marcus K.; Horita, Luiz G.; de Souza, Dimas F.; Dominiquini, Fabiana; Álvaro, Alexandre; Lima, Cleber S.; Ogawa, Fabio O.; Gomes, Gabriel B.; Palandrani, Juliana F.; dos Santos, Gabriela F.; de Freitas, Esther M.; Mattiuz, Amanda R.; Costa, Ivan C.; de Almeida, Celso L.; Souza, Savio; Baudet, Christian; Higa, Roberto H.

    2003-01-01

    STING Millennium Suite (SMS) is a new web-based suite of programs and databases providing visualization and a complex analysis of molecular sequence and structure for the data deposited at the Protein Data Bank (PDB). SMS operates with a collection of both publicly available data (PDB, HSSP, Prosite) and its own data (contacts, interface contacts, surface accessibility). Biologists find SMS useful because it provides a variety of algorithms and validated data, wrapped up in a user-friendly web interface. Using SMS it is now possible to analyze sequence-to-structure relationships, the quality of the structure, the nature and volume of atomic contacts of intra- and inter-chain type, the relative conservation of amino acids at specific sequence positions based on multiple sequence alignment, indications of folding essential residues (FER) based on the relationship of residue conservation to intra-chain contacts, and Cα–Cα and Cβ–Cβ distance geometry. Specific emphasis in SMS is given to interface forming residues (IFR), amino acids that define the interactive portion of the protein surfaces. SMS may simultaneously display and analyze previously superimposed structures. PDB updates trigger SMS updates in a synchronized fashion. SMS is freely accessible for public data at http://www.cbi.cnptia.embrapa.br, http://mirrors.rcsb.org/SMS and http://trantor.bioc.columbia.edu/SMS. PMID:12824333
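
    One of the listed computations, Cα–Cα distance geometry, reduces to pairwise distances between alpha-carbon coordinates; an illustrative sketch (coordinates invented; SMS itself works on deposited PDB entries):

    ```python
    import numpy as np

    ca = np.array([[12.1, 3.4, 7.8],    # residue 1 C-alpha
                   [13.9, 5.1, 9.2],    # residue 2
                   [16.2, 4.8, 11.0]])  # residue 3

    diff = ca[:, None, :] - ca[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))  # dist[i, j] = |CA_i - CA_j|
    print(dist.round(2))
    ```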

  5. One-Click Data Analysis Software for Science Operations

    Science.gov (United States)

    Navarro, Vicente

    2015-12-01

    One of the important activities of the ESA Science Operations Centre is to provide Data Analysis Software (DAS) to enable users and scientists to process data further to higher levels. During operations and post-operations, the Data Analysis Software is fully maintained and updated for new OS and library releases. Nonetheless, once a mission goes into the "legacy" phase, there are very limited funds and long-term preservation becomes more and more difficult. Building on Virtual Machine (VM), cloud computing and Software as a Service (SaaS) technologies, this project has aimed at providing long-term preservation of Data Analysis Software for the following missions: PIA for ISO (1995), SAS for XMM-Newton (1999), Hipe for Herschel (2009), and EXIA for EXOSAT (1983). The following goals have guided the architecture: support for all operations, post-operations and archive/legacy phases; support for local (user's computer) and cloud environments (ESAC-Cloud, Amazon AWS); support for expert users, requiring full capabilities; and provision of a simple web-based interface. This talk describes the architecture, challenges, results and lessons learnt in this project.

  6. COMPUTER SIMULATION: COMPARATIVE ANALYSIS OF SOFTWARES ARENA® AND PROMODEL®

    Directory of Open Access Journals (Sweden)

    Luiz Enéias Zanetti Cardoso

    2016-04-01

    Full Text Available Computer simulation is not exclusive to the areas of Logistics and Production; implementation takes place within the limits of professionals' technical expertise. Although not widespread at present, its use is projected to rise, given the numerous application possibilities, provided the reality at hand is properly modeled. This article presents a comparative and qualitative analysis of two computer simulation software packages, Arena® 14,000 Student version and ProModel® RunTimeSilve - Demo version, according to the following criteria: desktop, access to commands, ease of developing the model in the software, and accessories. The main features of each simulation package can be seen, as well as the differences between their interfaces; both were confirmed as great tools to support management processes.

  7. Software and codes for analysis of concentrating solar power technologies.

    Energy Technology Data Exchange (ETDEWEB)

    Ho, Clifford Kuofei

    2008-12-01

    This report presents a review and evaluation of software and codes that have been used to support Sandia National Laboratories concentrating solar power (CSP) program. Additional software packages developed by other institutions and companies that can potentially improve Sandia's analysis capabilities in the CSP program are also evaluated. The software and codes are grouped according to specific CSP technologies: power tower systems, linear concentrator systems, and dish/engine systems. A description of each code is presented with regard to each specific CSP technology, along with details regarding availability, maintenance, and references. A summary of all the codes is then presented with recommendations regarding the use and retention of the codes. A description of probabilistic methods for uncertainty and sensitivity analyses of concentrating solar power technologies is also provided.

  8. Development of image acquisition and analysis software for accelerator applications

    International Nuclear Information System (INIS)

    The electron beam profile, beam size and beam position are some of the important parameters in an accelerator. Measurement of these parameters in a quantitative manner allows accelerator operators to optimize other beam and machine parameters. One of the most commonly used devices for measurement of beam profile and beam size in an accelerator is the fluorescent screen beam profile monitor. In the Indus Accelerator Complex at Raja Ramanna Centre for Advanced Technology (RRCAT), fluorescent screen beam profile monitors are installed in the Transport Lines, Booster Synchrotron, and the Indus-1 and Indus-2 rings. Software has been developed in-house for image acquisition and analysis which allows accelerator operators to capture images of the beam. Once an image is acquired, the user can process it offline to find the beam profile and beam position. The software supports various modes of image acquisition and has a built-in function for viewing the beam profile. It allows accelerator operators to create audio video interleave (AVI) files from the acquired images, and a built-in AVI file viewer allows operators to play these files. The software has been installed in the Indus accelerator control room and is now routinely used by the Indus accelerator operation group. This paper presents the various features of the software. (author)
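
    The offline processing step described, beam position and size from a captured screen image, is commonly done with intensity moments; a sketch under that assumption (the abstract does not spell out the RRCAT software's actual algorithm, and the image here is synthetic):

    ```python
    import numpy as np

    y, x = np.mgrid[0:480, 0:640]                        # synthetic camera frame
    image = np.exp(-((x - 300) ** 2 / (2 * 40 ** 2) + (y - 250) ** 2 / (2 * 25 ** 2)))

    total = image.sum()
    cx, cy = (image * x).sum() / total, (image * y).sum() / total  # beam centroid
    sx = np.sqrt((image * (x - cx) ** 2).sum() / total)            # RMS beam sizes
    sy = np.sqrt((image * (y - cy) ** 2).sum() / total)
    print(f"position ({cx:.1f}, {cy:.1f}) px, size ({sx:.1f}, {sy:.1f}) px")
    ```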

  9. GammaLib and ctools: A software framework for the analysis of astronomical gamma-ray data

    CERN Document Server

    Knödlseder, J; Deil, C; Cayrou, J -B; Owen, E; Kelley-Hoskins, N; Lu, C -C; Buehler, R; Forest, F; Louge, T; Siejkowski, H; Kosack, K; Gerard, L; Schulz, A; Martin, P; Sanchez, D; Ohm, S; Hassan, T; Brau-Nogué, S

    2016-01-01

    The field of gamma-ray astronomy has seen important progress during the last decade, yet there exists so far no common software framework for the scientific analysis of gamma-ray telescope data. We propose to fill this gap by means of the GammaLib software, a generic library that we have developed to support the analysis of gamma-ray event data. GammaLib has been written in C++ and all functionality is available in Python through an extension module. On top of this framework we have developed the ctools software package, a suite of software tools that enables building of flexible workflows for the analysis of Imaging Air Cherenkov Telescope event data. The ctools are inspired by science analysis software available for existing high-energy astronomy instruments, and they follow the modular ftools model developed by the High Energy Astrophysics Science Archive Research Center. The ctools have been written in Python and C++, and can be either used from the command line, via shell scripts, or directly from Python...
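
    A sketch of the ftools-style workflow the ctools expose from Python, simulating an observation and then fitting it by maximum likelihood; the parameter names follow the published ctobssim/ctlike interfaces but should be checked against the current documentation and treated as assumptions here:

    ```python
    import ctools

    sim = ctools.ctobssim()                  # simulate an event list
    sim["inmodel"] = "crab.xml"              # source + background model definition
    sim["outevents"] = "events.fits"
    sim["caldb"], sim["irf"] = "prod2", "South_0.5h"   # instrument response
    sim["ra"], sim["dec"], sim["rad"] = 83.63, 22.01, 5.0
    sim["tmin"], sim["tmax"] = 0.0, 1800.0
    sim["emin"], sim["emax"] = 0.1, 100.0    # TeV
    sim.execute()

    like = ctools.ctlike()                   # maximum-likelihood model fit
    like["inobs"] = "events.fits"
    like["inmodel"] = "crab.xml"
    like["outmodel"] = "crab_results.xml"
    like.execute()
    ```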

  10. ROCKETSHIP: a flexible and modular software tool for the planning, processing and analysis of dynamic MRI studies

    International Nuclear Information System (INIS)

    Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is a promising technique to characterize pathology and evaluate treatment response. However, analysis of DCE-MRI data is complex and benefits from concurrent analysis of multiple kinetic models and parameters. Few software tools are currently available that specifically focus on DCE-MRI analysis with multiple kinetic models. Here, we developed ROCKETSHIP, an open-source, flexible and modular software for DCE-MRI analysis. ROCKETSHIP incorporates analyses with multiple kinetic models, including data-driven nested model analysis. ROCKETSHIP was implemented using the MATLAB programming language. Robustness of the software in providing reliable fits using multiple kinetic models is demonstrated using simulated data. Simulations also demonstrate the utility of the data-driven nested model analysis. Applicability of ROCKETSHIP for both preclinical and clinical studies is shown using DCE-MRI studies of the human brain and a murine tumor model. A DCE-MRI software suite was implemented and tested using simulations. Its applicability to both preclinical and clinical datasets is shown. ROCKETSHIP was designed to be easily accessible for the beginner, but flexible enough for changes or additions to be made by the advanced user as well. The availability of a flexible analysis tool will aid future studies using DCE-MRI. A public release of ROCKETSHIP is available at https://github.com/petmri/ROCKETSHIP
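
    The kinetic models referred to include the standard Tofts model, Ct(t) = Ktrans ∫ Cp(τ) exp(-kep (t - τ)) dτ; a numeric sketch of that convolution (ROCKETSHIP itself is MATLAB, and the arterial input function and rates below are invented):

    ```python
    import numpy as np

    def tofts(t, cp, ktrans, kep):
        """Tissue concentration via discrete convolution of the plasma curve."""
        dt = t[1] - t[0]
        kernel = np.exp(-kep * t)
        return ktrans * np.convolve(cp, kernel)[: len(t)] * dt

    t = np.arange(0, 300, 1.0)                   # seconds
    cp = (t / 10.0) * np.exp(-t / 60.0)          # toy arterial input function
    ct = tofts(t, cp, ktrans=0.25 / 60, kep=0.6 / 60)  # rates per second
    ```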

  11. STARS software tool for analysis of reliability and safety

    International Nuclear Information System (INIS)

    This paper reports on the STARS (Software Tool for the Analysis of Reliability and Safety) project, which aims at developing an integrated set of Computer Aided Reliability Analysis tools for the various tasks involved in systems safety and reliability analysis, including hazard identification, qualitative analysis, logic model construction and evaluation. Expert system technology offers the most promising perspective for developing a Computer Aided Reliability Analysis tool. Combined with graphics and analysis capabilities, it can provide a natural, engineering-oriented environment for computer-assisted reliability and safety modelling and analysis. For hazard identification and fault tree construction, a frame/rule-based expert system is used, in which the deductive (goal-driven) reasoning and the heuristics applied during manual fault tree construction are modelled. Expert systems can explain their reasoning, so that the analyst can become aware of why and how results are being obtained. Hence, the learning aspect involved in manual reliability and safety analysis can be maintained and improved
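
    The quantitative core of fault tree evaluation, top-event probability from AND/OR gates with independent basic events, is small enough to sketch (the tree and probabilities are invented; STARS wraps this kind of evaluation in an expert-system front end):

    ```python
    def or_gate(*p):   # P(at least one) = 1 - prod(1 - p_i)
        q = 1.0
        for pi in p:
            q *= 1.0 - pi
        return 1.0 - q

    def and_gate(*p):  # P(all) = prod(p_i), assuming independence
        q = 1.0
        for pi in p:
            q *= pi
        return q

    pump_fails, valve_fails, power_fails = 1e-3, 5e-4, 2e-4
    top = or_gate(and_gate(pump_fails, valve_fails), power_fails)
    print(f"P(top event) = {top:.2e}")
    ```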

  12. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify the relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), an area of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
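
    The underlying operation, flagging abrupt steepening along a DEM-derived longitudinal profile, can be caricatured in a few lines (the profile and threshold are invented; Knickpoint Finder itself runs on ArcGIS):

    ```python
    import numpy as np

    distance = np.linspace(0, 10_000, 101)                   # m along the channel
    elevation = 500 - 0.02 * distance                        # uniform 2% gradient...
    elevation[60:] -= 0.05 * (distance[60:] - distance[60])  # ...then a steeper reach

    slope = np.gradient(elevation, distance)                 # negative downstream
    steepening = np.diff(slope)
    knicks = np.where(steepening < -0.01)[0] + 1             # abrupt slope breaks
    print(distance[knicks])                                  # candidate knickpoints (m)
    ```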

  13. Search for Chemical Biomarkers on Mars Using the Sample Analysis at Mars Instrument Suite on the Mars Science Laboratory

    Science.gov (United States)

    Glavin, D. P.; Conrad, P.; Dworkin, J. P.; Eigenbrode, J.; Mahaffy, P. R.

    2011-01-01

    One key goal for the future exploration of Mars is the search for chemical biomarkers, including complex organic compounds important in life on Earth. The Sample Analysis at Mars (SAM) instrument suite on the Mars Science Laboratory (MSL) will provide the most sensitive measurements of the organic composition of rocks and regolith samples ever carried out in situ on Mars. SAM consists of a gas chromatograph (GC), quadrupole mass spectrometer (QMS), and tunable laser spectrometer to measure volatiles in the atmosphere and released from rock powders heated up to 1000 °C. The measurement of organics in solid samples will be accomplished by three experiments: (1) pyrolysis QMS to identify alkane fragments and simple aromatic compounds; (2) pyrolysis GCMS to separate and identify complex mixtures of larger hydrocarbons; and (3) chemical derivatization and GCMS to extract less volatile compounds, including amino and carboxylic acids, that are not detectable by the other two experiments.

  14. Development of rotating inner ECT probe to be suited for numerical analysis support

    International Nuclear Information System (INIS)

    Ensuring the integrity of heat transfer tubes of pressurized water reactors in nuclear power plants and of tubes in chemical industry plants is a key factor in their safe operation. Accordingly, eddy current testing is used because of its recognized accuracy and high speed, since the testing must be done in a limited period. We developed a rotating inner probe for detecting axial cracks. The probe is composed of a circumferential-winding excitation coil and a pickup coil. The excitation coil of this probe is able to generate a uniform eddy current distribution over a wider area than the detection area of the pickup coil. Therefore, the probe can ignore the influence of the shape of the excitation current source when detecting cracks, and numerical analysis for this probe is handled simply by devising appropriate boundary conditions. From measurement results, the probe is confirmed to have high sensitivity. (author)

  15. NEXT GENERATION ANALYSIS SOFTWARE FOR COMPONENT EVALUATION - Results of Rotational Seismometer Evaluation

    Science.gov (United States)

    Hart, D. M.; Merchant, B. J.; Abbott, R. E.

    2012-12-01

    The Component Evaluation project at Sandia National Laboratories supports the Ground-based Nuclear Explosion Monitoring program by performing testing and evaluation of the components that are used in seismic and infrasound monitoring systems. In order to perform this work, Component Evaluation maintains a testing facility called the FACT (Facility for Acceptance, Calibration, and Testing) site, a variety of test bed equipment, and a suite of software tools for analyzing test data. Recently, Component Evaluation has successfully integrated several improvements to its software analysis tools and test bed equipment that have substantially improved our ability to test and evaluate components. The software tool that is used to analyze test data is called TALENT: Test and AnaLysis EvaluatioN Tool. TALENT is designed to be a single, standard interface to all test configuration, metadata, parameters, waveforms, and results that are generated in the course of testing monitoring systems. It provides traceability by capturing in a relational database everything about a test that is required to reproduce its results. TALENT provides a simple, yet powerful, user interface to quickly acquire, process, and analyze waveform test data. The software tool has also been expanded recently to handle sensors whose output is proportional to rotation angle or rotation rate. As an example of this new processing capability, we show results from testing the new ATA ARS-16 rotational seismometer. The test data were collected at the USGS ASL. Four datasets were processed: 1) 1 Hz with increasing amplitude, 2) 4 Hz with increasing amplitude, 3) 16 Hz with increasing amplitude, and 4) twenty-six discrete frequencies between 0.353 Hz and 64 Hz. The results are compared to manufacturer-supplied data sheets.

  16. Off-line analysis software for the Texas Test Rig

    International Nuclear Information System (INIS)

    Data analysis for the TTR requires integrating a large number of muon chamber technologies, each with different requirements, into a single analysis chain. Many of these technologies come with their own software, which have different conventions; these packages are grafted on. Data are stored on a tape robot, with essential information stored in a database where it may be queried. Operation is done from special-purpose X™ windows designed to facilitate data selection and its subsequent analysis. Program development was done using the Hewlett-Packard Softbench™ product

  17. Data Analysis Software for the ESPRESSO Science Machine

    CERN Document Server

    Cupani, Guido; Cristiani, Stefano; González-Hernández, Jonay; Lovis, Christophe; Sousa, Sérgio; Vanzella, Eros; Di Marcantonio, Paolo; Mégevand, Denis

    2015-01-01

    ESPRESSO is an extremely stable high-resolution spectrograph which is currently being developed for the ESO VLT. With its groundbreaking characteristics it is intended to be a "science machine", i.e., a fully-integrated instrument to directly extract science information from the observations. In particular, ESPRESSO will be the first ESO instrument to be equipped with a dedicated tool for the analysis of data, the Data Analysis Software (DAS), consisting of a number of recipes to analyze both stellar and quasar spectra. Through the new ESO Reflex GUI, the DAS (which will implement new algorithms to analyze quasar spectra) aims to overcome the shortcomings of existing software by providing multiple iteration modes and full interactivity with the data.

  18. Stromatoporoid biometrics using image analysis software: A first order approach

    Science.gov (United States)

    Wolniewicz, Pawel

    2010-04-01

    Strommetric is a new image analysis computer program that performs morphometric measurements of stromatoporoid sponges. The program measures 15 features of skeletal elements (pillars and laminae) visible in both longitudinal and transverse thin sections. The software is implemented in C++, using the Open Computer Vision (OpenCV) library. The image analysis system distinguishes skeletal elements from sparry calcite using Otsu's method for image thresholding. More than 150 photos of thin sections were used as a test set, from which 36,159 measurements were obtained. The software provided about one hundred times more data than the method applied until now. The data obtained are reproducible, even if the work is repeated by different workers. Thus the method makes biometric studies of stromatoporoids objective.
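
    Strommetric's segmentation step is Otsu thresholding through OpenCV; the equivalent call through OpenCV's Python binding (the program itself is C++, and the file name is a placeholder):

    ```python
    import cv2

    img = cv2.imread("thin_section.png", cv2.IMREAD_GRAYSCALE)
    thresh, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # 'binary' masks skeletal elements against sparry calcite; pillar/lamina
    # outlines could then be measured, e.g. via cv2.findContours on this mask.
    ```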

  19. Calibration analysis software for the ATLAS Pixel Detector

    Science.gov (United States)

    Stramaglia, Maria Elena

    2016-07-01

    The calibration of the ATLAS Pixel Detector at LHC fulfils two main purposes: to tune the front-end configuration parameters for establishing the best operational settings and to measure the tuning performance through a subset of scans. An analysis framework has been set up in order to take actions on the detector given the outcome of a calibration scan (e.g. to create a mask for disabling noisy pixels). The software framework to control all aspects of the Pixel Detector scans and analyses is called calibration console. The introduction of a new layer, equipped with new FE-I4 chips, required an update of the console architecture. It now handles scans and scan analyses applied together to chips with different characteristics. An overview of the newly developed calibration analysis software will be presented, together with some preliminary results.

  20. Calibration Analysis Software for the ATLAS Pixel Detector

    CERN Document Server

    Stramaglia, Maria Elena; The ATLAS collaboration

    2015-01-01

    The calibration of the Pixel detector fulfills two main purposes: to tune front-end registers for establishing the best operational settings and to measure the tuning performance through a subset of scans. An analysis framework has been set up in order to take actions on the detector given the outcome of a calibration scan (e.g. to create a mask for disabling noisy pixels). The software framework to control all aspects of the Pixel detector scans and analyses is called Calibration Console. The introduction of a new layer, equipped with new Front End-I4 chips, required an update of the Console architecture. It now handles scans and scan analyses applied together to chips with different characteristics. An overview of the newly developed Calibration Analysis Software will be presented, together with some preliminary results.

  1. Calibration Analysis Software for the ATLAS Pixel Detector

    CERN Document Server

    Stramaglia, Maria Elena; The ATLAS collaboration

    2015-01-01

    The calibration of the ATLAS Pixel detector at LHC fulfils two main purposes: to tune the front-end configuration parameters for establishing the best operational settings and to measure the tuning performance through a subset of scans. An analysis framework has been set up in order to take actions on the detector given the outcome of a calibration scan (e.g. to create a mask for disabling noisy pixels). The software framework to control all aspects of the Pixel detector scans and analyses is called Calibration Console. The introduction of a new layer, equipped with new Front End-I4 chips, required an update of the Console architecture. It now handles scans and scan analyses applied together to chips with different characteristics. An overview of the newly developed Calibration Analysis Software will be presented, together with some preliminary results.

  2. SNS: Analytic Receiver Analysis Software Using Electrical Scattering Matrices

    CERN Document Server

    King, Oliver G

    2010-01-01

    SNS is a MATLAB-based software library written to aid in the design and analysis of receiver architectures. It uses electrical scattering matrices and noise wave vectors to describe receiver architectures of arbitrary topology and complexity. It differs from existing freely-available software mainly in that the scattering matrices used to describe the receiver and its components are analytic rather than numeric. This allows different types of modeling and analysis of receivers to be performed. Non-ideal behavior of receiver components can be parameterized in their scattering matrices. SNS enables the instrument designer to then derive analytic expressions for the signal and noise at the receiver outputs in terms of parameterized component imperfections, and predict their contribution to receiver systematic errors precisely. This can drive the receiver design process by, for instance, allowing the instrument designer to identify which component imperfections contribute most to receiver systematic errors, and h...
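
    The building block of such scattering-matrix analysis is cascading two 2-port networks; a numeric sketch using the standard cascade formulas (SNS itself keeps the matrices analytic in MATLAB; the example networks are invented):

    ```python
    import numpy as np

    def cascade(a, b):
        """Cascade 2-port S-matrices a and b (a's port 2 feeding b's port 1)."""
        d = 1.0 - a[1, 1] * b[0, 0]  # multiple-reflection loop term
        return np.array([
            [a[0, 0] + a[0, 1] * b[0, 0] * a[1, 0] / d, a[0, 1] * b[0, 1] / d],
            [a[1, 0] * b[1, 0] / d, b[1, 1] + b[1, 0] * a[1, 1] * b[0, 1] / d],
        ])

    attenuator = np.array([[0.0, 0.5], [0.5, 0.0]])  # matched 6 dB attenuator
    mismatched = np.array([[0.1, 0.9], [0.9, 0.1]])  # slightly reflective stage
    print(cascade(attenuator, mismatched))
    ```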

  3. Effectiveness of an Automatic Tracking Software in Underwater Motion Analysis

    Directory of Open Access Journals (Sweden)

    Fabrício A. Magalhaes

    2013-12-01

    Full Text Available Tracking of markers placed on anatomical landmarks is a common practice in sports science to perform the kinematic analysis that interests both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of a software package developed for automatic tracking of underwater movements (DVP), based on the Kanade-Lucas-Tomasi feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 markers' positions) were manually tracked to determine the markers' center coordinates. Then, the videos were automatically tracked using DVP and a commercially available software (COM). Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and to correct the position of the cursor when the distance between the calculated marker's coordinate and the reference one was higher than 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, manual interventions were 10.4% lower for DVP (7.4%) than for COM (17.8%). Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing the type of marker rather than the type of exercise, with 9.9% less manual interventions for DVP than for COM. In conclusion, based on these results, the automatic tracking software presented here can be used as a valid and useful tool for underwater motion analysis.
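
    The Kanade-Lucas-Tomasi step at the heart of such trackers is available in OpenCV's pyramidal Lucas-Kanade implementation; a sketch of a single-marker tracking loop (video path, initial point and window parameters are placeholders, not DVP's own):

    ```python
    import cv2
    import numpy as np

    cap = cv2.VideoCapture("underwater_trial.avi")
    ok, prev = cap.read()
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    pts = np.array([[[320.0, 240.0]]], dtype=np.float32)  # operator-clicked marker

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        new_pts, status, err = cv2.calcOpticalFlowPyrLK(
            prev_gray, gray, pts, None, winSize=(21, 21), maxLevel=3)
        if status[0, 0] == 0:
            print("track lost; manual correction needed")  # cf. interventions above
            break
        pts, prev_gray = new_pts, gray
        print(pts.ravel())  # marker center in this frame
    ```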

  4. HistFitter software framework for statistical data analysis

    OpenAIRE

    Baak, M.; Besjes, G. J.; D. Côté; Koutsman, A.; Lorenz, J.; Short, D.

    2014-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton–proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearl...

  5. GATB: a software toolbox for genome assembly and analysis

    OpenAIRE

    Drezen, Erwan; Rizk, Guillaume; Chikhi, Rayan; Deltel, Charles; Lemaitre, Claire; Peterlongo, Pierre; Lavenier, Dominique

    2014-01-01

    The analysis of NGS data remains a time- and space-consuming task. Many efforts have been made to provide efficient data structures for indexing the terabytes of data generated by fast sequencing machines (Suffix Array, Burrows-Wheeler transform, Bloom Filter, etc.). Mapper tools, genome assemblers, SNP callers, etc., make intensive use of these data structures to keep their memory footprint as low as possible. The overall efficiency of NGS software is brought...
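
    One of the structures named, the Bloom filter, fits in a few lines; a miniature k-mer membership filter (sizes and hashing chosen arbitrarily; GATB's real implementation is optimized C++):

    ```python
    import hashlib

    class BloomFilter:
        def __init__(self, m_bits=1 << 20, k_hashes=4):
            self.m, self.k = m_bits, k_hashes
            self.bits = bytearray(m_bits // 8)

        def _positions(self, item):
            for i in range(self.k):
                h = hashlib.sha256(f"{i}:{item}".encode()).digest()
                yield int.from_bytes(h[:8], "big") % self.m

        def add(self, item):
            for p in self._positions(item):
                self.bits[p // 8] |= 1 << (p % 8)

        def __contains__(self, item):  # false positives possible, no false negatives
            return all(self.bits[p // 8] & (1 << (p % 8)) for p in self._positions(item))

    bf = BloomFilter()
    bf.add("ACGTACGTACGTACGTACGTACGTACGTACG")  # a 31-mer
    print("ACGTACGTACGTACGTACGTACGTACGTACG" in bf, "TTTTTTTT" in bf)
    ```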

  6. CAVASS: A Computer-Assisted Visualization and Analysis Software System

    OpenAIRE

    Grevera, George; Udupa, Jayaram; Odhner, Dewey; Zhuge, Ying; Souza, Andre; Iwanaga, Tad; Mishra, Shipra

    2007-01-01

    The Medical Image Processing Group at the University of Pennsylvania has been developing (and distributing with source code) medical image analysis and visualization software systems for a long period of time. Our most recent system, 3DVIEWNIX, was first released in 1993. Since that time, a number of significant advancements have taken place with regard to computer platforms and operating systems, networking capability, the rise of parallel processing standards, and the development of open-so...

  7. Software for a measuring facility for activation analysis

    International Nuclear Information System (INIS)

    A software package has been developed for an Apple PC. The programs are intended to control an automated measuring station for photon activation analysis at GELINA, the linear accelerator of C.B.N.M. at Geel (Belgium). They allow the user to set up a measuring scheme, to execute it under computer control, to accumulate and store 2K spectra using a built-in ADC, and to output the results as listings, plots or evaluated reports

  8. Open source image analysis software toolboxes for microscopic applications

    OpenAIRE

    Dimiter Prodanov

    2013-01-01

    Modern microscopy allows for acquisition of images spanning in different spectral, spatial and temporal dimensions. Once acquired, these frequently huge images need to be condensed into several quantitative statements that can either support or falsify the initial research questions. This process of measurement and analysis cannot be performed nowadays without the use of specialized software toolboxes. These toolboxes make the backbone of a newly defined branch of bioinformatics denoted as bi...

  9. Dependence Analysis of Component Based Software through Assumptions

    OpenAIRE

    Ratneshwer; Tripathi, A. K.

    2011-01-01

    This study presents a quantitative approach for dependency analysis of Component Based Software (CBS) systems. Various types of dependency, in a CBS, have been observed through 'assumptions' and based on these observations some derived dependency relationships are proposed. The proposed dependency relationships are validated theoretically and an example illustration has been shown to demonstrate the proposal. The result of the study suggests that these dependency relationships may prove helpf...

  10. Nucleonica: Web-based Software Tools for Simulations and Analysis

    OpenAIRE

    Magill, Joseph; DREHER Raymond; SOTI Zsolt; LASCHE George

    2012-01-01

    The authors present a description of a new web-based software portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data res...

  11. Designing Embedded Software Analysis System for Telecom Enterprise

    OpenAIRE

    Huang, Rui

    2016-01-01

    In the recent mobile broadband world, the smart and powerful applications have been widely developed for consumer. Obviously, the most convincible instances are Apple’s Mobile Operating System (iOS) and Google’s Mobile Operating System (Android) applications. However, in the traditional industry, such as telecom industry, the mobile applications working as a part of a software analysis system have not yet been widely involved. The emergence of powerful mobile devices and cloud computing platf...

  12. Dependence Analysis of Component Based Software through Assumptions

    Directory of Open Access Journals (Sweden)

    Ratneshwer

    2011-07-01

    Full Text Available This study presents a quantitative approach for dependency analysis of Component Based Software (CBS) systems. Various types of dependency, in a CBS, have been observed through 'assumptions' and based on these observations some derived dependency relationships are proposed. The proposed dependency relationships are validated theoretically and an example illustration has been shown to demonstrate the proposal. The result of the study suggests that these dependency relationships may prove helpful in understanding CBS systems.

  13. Evaluating the Quantitative Capabilities of Metagenomic Analysis Software.

    Science.gov (United States)

    Kerepesi, Csaba; Grolmusz, Vince

    2016-05-01

    DNA sequencing technologies are applied widely and frequently today to describe metagenomes, i.e., microbial communities in environmental or clinical samples, without the need for culturing them. These technologies usually return short (100-300 base-pairs long) DNA reads, and these reads are processed by metagenomic analysis software that assigns phylogenetic composition information to the dataset. Here we evaluate three metagenomic analysis software packages (AmphoraNet, a webserver implementation of AMPHORA2; MG-RAST; and MEGAN5) for their capability of assigning quantitative phylogenetic information to the data, describing the frequency of appearance of the microorganisms of the same taxa in the sample. The difficulty of the task arises from the fact that longer genomes produce more reads from the same organism than shorter genomes, and some packages assign higher frequencies to species with longer genomes than to those with shorter ones. This phenomenon is called the "genome length bias." Dozens of complex artificial metagenome benchmarks can be found in the literature. Because of the complexity of those benchmarks, it is usually difficult to judge the resistance of a metagenomic software package to this "genome length bias." Therefore, we have made a simple benchmark for the evaluation of the "taxon-counting" in a metagenomic sample: we have taken the same number of copies of three full bacterial genomes of different lengths, broken them up randomly into short reads of average length 150 bp, and mixed the reads, creating our simple benchmark. Because of its simplicity, the benchmark is not supposed to serve as a mock metagenome, but if a software package fails on that simple task, it will surely fail on most real metagenomes. We applied the three software packages to the benchmark. The ideal quantitative solution would assign the same proportion to the three bacterial taxa. We have found that AMPHORA2/AmphoraNet gave the most accurate results and the other two software were under
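
    The described benchmark is easy to reproduce in miniature: equal copy numbers of genomes of unequal length, shredded into ~150 bp reads and mixed (toy genomes and uniform read starts assumed; the published benchmark used three real bacterial genomes):

    ```python
    import random
    from collections import Counter

    rng = random.Random(42)
    genomes = {"shortG": "ACGT" * 500,     # 2 kb
               "midG":   "ACGT" * 2500,    # 10 kb
               "longG":  "ACGT" * 12500}   # 50 kb

    copies, read_len, reads = 100, 150, []
    for name, seq in genomes.items():
        n_reads = copies * len(seq) // read_len  # longer genome -> more reads
        for _ in range(n_reads):
            start = rng.randrange(len(seq) - read_len)
            reads.append((name, seq[start:start + read_len]))
    rng.shuffle(reads)

    # the genome length bias under test: equal taxon abundance, unequal read counts
    print(Counter(name for name, _ in reads))
    ```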

  14. Orion Relative Navigation Flight Software Analysis and Design

    Science.gov (United States)

    D'Souza, Chris; Christian, John; Zanetti, Renato

    2011-01-01

    The Orion Relative Navigation System has sought to take advantage of the latest developments in sensor and algorithm technology while living under the constraints of mass, power, volume, and throughput. In particular, the only sensor specifically designed for relative navigation is the Vision Navigation System (VNS), a lidar-based sensor. But the system also uses the Star Trackers, GPS (when available) and IMUs that are part of the overall Orion navigation sensor suite to produce a relative state accurate enough to dock with the ISS. The Orion Relative Navigation System has significantly matured as the program has evolved from the design phase to the flight software implementation phase. With the development of the VNS and the STORRM flight test of the Orion relative navigation hardware, much of the performance of the system will be characterized before the first flight. However, challenges abound, not the least of which is the elimination of the RF range and range-rate system, along with the development of the FSW in the Matlab/Simulink/Stateflow environment. This paper will address the features of and rationale for the Orion relative navigation design, the performance of the FSW in a 6-DOF environment, and the initial results of the hardware performance from the STORRM flight.

  15. SNANA: A Public Software Package for Supernova Analysis

    CERN Document Server

    Kessler, Richard; Cinabro, David; Dilday, Benjamin; Frieman, Joshua A; Jha, Saurabh; Kuhlmann, Stephen; Miknaitis, Gajus; Sako, Masao; Taylor, Matt; Vanderplas, Jake

    2009-01-01

    We describe a general analysis package for supernova (SN) light curves, called SNANA, that contains a simulation, light curve fitter, and cosmology fitter. The software is designed with the primary goal of using SNe Ia as distance indicators for the determination of cosmological parameters, but it can also be used to study efficiencies for analyses of SN rates, estimate contamination from non-Ia SNe, and optimize future surveys. Several SN models are available within the same software architecture, allowing technical features such as K-corrections to be consistently used among multiple models, and thus making it easier to make detailed comparisons between models. New and improved light-curve models can be easily added. The software works with arbitrary surveys and telescopes and has already been used by several collaborations, leading to more robust and easy-to-use code. This software is not intended as a final product release, but rather it is designed to undergo continual improvements from the community as ...

  16. SNANA: A Public Software Package for Supernova Analysis

    Science.gov (United States)

    Kessler, Richard; Bernstein, Joseph P.; Cinabro, David; Dilday, Benjamin; Frieman, Joshua A.; Jha, Saurabh; Kuhlmann, Stephen; Miknaitis, Gajus; Sako, Masao; Taylor, Matt; VanderPlas, Jake

    2010-10-01

    SNANA is a general analysis package for supernova (SN) light curves that contains a simulation, light curve fitter, and cosmology fitter. The software is designed with the primary goal of using SNe Ia as distance indicators for the determination of cosmological parameters, but it can also be used to study efficiencies for analyses of SN rates, estimate contamination from non-Ia SNe, and optimize future surveys. Several SN models are available within the same software architecture, allowing technical features such as K-corrections to be consistently used among multiple models, and thus making it easier to make detailed comparisons between models. New and improved light-curve models can be easily added. The software works with arbitrary surveys and telescopes and has already been used by several collaborations, leading to more robust and easy-to-use code. This software is not intended as a final product release, but rather it is designed to undergo continual improvements from the community as more is learned about SNe.
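
    For context (not SNANA code): the SN Ia standardization such cosmology fits rest on is the Tripp relation, mu = mB - M + alpha*x1 - beta*c; a sketch with nuisance-parameter values typical of the literature:

    ```python
    def distance_modulus(mB, x1, c, alpha=0.14, beta=3.1, M=-19.3):
        """Standardized SN Ia distance modulus from light-curve fit parameters."""
        return mB - M + alpha * x1 - beta * c

    # one invented supernova: peak B-band magnitude, stretch, and color
    print(round(distance_modulus(mB=23.8, x1=0.5, c=-0.05), 2))  # ~43.3
    ```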

  17. Open-source data analysis and visualization software platform: SAGUARO

    Science.gov (United States)

    Kim, Dae Wook; Lewis, Benjamin J.; Burge, James H.

    2011-09-01

    Optical engineering projects often require massive data processing with many steps in the course of design, simulation, fabrication, metrology, and evaluation. A MATLAB™-based data processing platform has been developed to provide a standard way to manipulate and visualize various types of data that are created from optical measurement equipment. The operation of this software platform via a graphical user interface is easy and powerful. Data processing is performed by running modules that use a prescribed format for sharing data. Complex operations are performed by stringing modules together using macros. While numerous modules have been developed to allow data processing without the need to write software, the greatest power of the platform is provided by its flexibility. A developer's toolkit is provided to allow development and customization of modules, and the program allows a real-time interface with the standard MATLAB environment. This software, developed by the Large Optics Fabrication and Testing group at the University of Arizona, is now publicly available. We present the capabilities of the software and provide some demonstrations of its use for data analysis and visualization. Furthermore, we demonstrate the flexibility of the platform for solving new problems.

  18. Nuclear analysis software. Pt. 1: Spectrum transfer and reformatting (SPEDAC)

    International Nuclear Information System (INIS)

    GANAAS (Gamma, Activity, and Neutron Activation Analysis System) is one of the family of software packages developed under the auspices of the International Atomic Energy Agency. Primarily, the package was intended to support the IAEA Technical Assistance and Cooperation projects in developing countries. However, it is open domain software that can be copied and used by anybody, except for commercial purposes. All the nuclear analysis software provided by the IAEA has the same design philosophy and a similar structure. The intention was to provide the user with maximum flexibility, at the same time with a simple and logical organization that requires minimum digging through the manuals. GANAAS is a modular system. It consists of several programmes that can be installed on the hard disk as they are needed. Obviously, some parts of the system are required in all cases. Those are installed at the beginning, without consulting the operator. GANAAS offers the opportunity to expand and improve the system. Gamma spectrum evaluation programmes using different fitting algorithms can be added to GANAAS, under the condition that the format of their input and output files corresponds to the rules of GANAAS. The same applies to the quantitative analysis parts of the programme

  19. Sense and nonsense of pathway analysis software in proteomics.

    Science.gov (United States)

    Müller, Thorsten; Schrötter, Andreas; Loosse, Christina; Helling, Stefan; Stephan, Christian; Ahrens, Maike; Uszkoreit, Julian; Eisenacher, Martin; Meyer, Helmut E; Marcus, Katrin

    2011-12-01

    New developments in proteomics enable scientists to examine hundreds to thousands of proteins in parallel. Quantitative proteomics allows the comparison of different proteomes of cells, tissues, or body fluids with each other. Analyzing and especially organizing these data sets is often a Herculean task. Pathway Analysis software tools aim to take over this task based on present knowledge. Companies promise that their algorithms help to understand the significance of scientist's data, but the benefit remains questionable, and a fundamental systematic evaluation of the potential of such tools has not been performed until now. Here, we tested the commercial Ingenuity Pathway Analysis tool as well as the freely available software STRING using a well-defined study design in regard to the applicability and value of their results for proteome studies. It was our goal to cover a wide range of scientific issues by simulating different established pathways including mitochondrial apoptosis, tau phosphorylation, and Insulin-, App-, and Wnt-signaling. Next to a general assessment and comparison of the pathway analysis tools, we provide recommendations for users as well as for software developers to improve the added value of a pathway study implementation in proteomic pipelines. PMID:21978018

  20. Easily extensible unix software for spectral analysis, display, modification, and synthesis of musical sounds

    Science.gov (United States)

    Beauchamp, James W.

    2002-11-01

    Software has been developed which enables users to perform time-varying spectral analysis of individual musical tones or successions of them and to perform further processing of the data. The package, called sndan, is freely available in source code, uses EPS graphics for display, and is written in ANSI C for ease of code modification and extension. Two analyzers, a fixed-filter-bank phase vocoder ("pvan") and a frequency-tracking analyzer ("mqan"), constitute the analysis front end of the package. While pvan's output consists of continuous amplitudes and frequencies of harmonics, mqan produces disjoint "tracks." However, another program extracts a fundamental frequency and separates harmonics from the tracks, resulting in a continuous harmonic output. "monan" is a program used to display harmonic data in a variety of formats, perform various spectral modifications, and perform additive resynthesis of the harmonic partials, including possible pitch-shifting and time-scaling. Sounds can also be synthesized according to a musical score using a companion synthesis language, Music 4C. Several other programs in the sndan suite can be used for specialized tasks, such as signal display and editing. Applications of the software include producing specialized sounds for music compositions or psychoacoustic experiments or as a basis for developing new synthesis algorithms.
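
    A toy analogue of the fixed-filter-bank analysis pvan performs: tracking the amplitude of each harmonic of a known fundamental with a short-time DFT (window sizes, f0 and the test tone are invented; sndan itself is ANSI C):

    ```python
    import numpy as np

    sr, f0, n_harm = 44100, 440.0, 8
    t = np.arange(sr) / sr
    tone = sum((0.5 / k) * np.sin(2 * np.pi * k * f0 * t) for k in range(1, n_harm + 1))

    hop, win = 512, 2048
    window = np.hanning(win)
    freqs = np.fft.rfftfreq(win, 1 / sr)
    starts = range(0, len(tone) - win, hop)
    amps = np.empty((len(starts), n_harm))
    for i, s in enumerate(starts):
        spectrum = np.fft.rfft(tone[s:s + win] * window)
        for k in range(1, n_harm + 1):
            bin_k = np.argmin(np.abs(freqs - k * f0))   # nearest bin to harmonic k
            amps[i, k - 1] = 2 * np.abs(spectrum[bin_k]) / window.sum()

    print(amps[len(starts) // 2].round(3))  # roughly 0.5/k, up to scalloping error
    ```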

  1. IMMAN: free software for information theory-based chemometric analysis.

    Science.gov (United States)

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis named IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty, are incorporated into the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
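
    One of the supervised criteria listed, information gain, computed from scratch for a tiny discretized feature/class pair (the data are invented; this illustrates only the quantity, not IMMAN's implementation):

    ```python
    from collections import Counter
    from math import log2

    def entropy(labels):
        n = len(labels)
        return -sum(c / n * log2(c / n) for c in Counter(labels).values())

    def information_gain(feature, labels):
        """H(class) - H(class | discretized feature value)."""
        n, cond = len(labels), 0.0
        for value in set(feature):
            subset = [l for f, l in zip(feature, labels) if f == value]
            cond += len(subset) / n * entropy(subset)
        return entropy(labels) - cond

    feature = ["low", "low", "high", "high", "mid", "mid"]
    labels  = ["inactive", "inactive", "active", "active", "active", "inactive"]
    print(round(information_gain(feature, labels), 3))  # -> 0.667
    ```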

  2. An analysis of related software cycles among organizations, people and the software industry

    OpenAIRE

    Adams, Brady.

    2008-01-01

    There is a need to understand cycles associated with software upgrades as they affect people, organizations and the software industry. This thesis intends to explore the moderating factors of these three distinct and disjointed cycles and propose courses of action towards mitigating various issues and problems inherent in the software upgrade process. This thesis will acknowledge that three related but disjointed cycles are common in many software upgrade ventures in today's organization...

  3. Text analysis software to help learners write in French

    OpenAIRE

    Audras, Isabelle; Ganascia, Jean-Gabriel

    2006-01-01

    New text analysis software developed thanks to research in areas such as Machine Learning and Natural Language Processing is also useful in language theory and research. Littératron is a new data-processing tool for automatic syntactic pattern extraction that was designed at the LIP6 laboratory by Jean-Gabriel Ganascia. By syntactic pattern we mean an association of coherent linguistic units. More exactly, the inputs of Littératron are syntactic analysis trees, provided by a linear text analy...

  4. Uses of software in digital image analysis: a forensic report

    Science.gov (United States)

    Sharma, Mukesh; Jha, Shailendra

    2010-02-01

    Forensic image analysis requires expertise to interpret the content of an image or the image itself in legal matters. Major sub-disciplines of forensic image analysis with law enforcement applications include photogrammetry, photographic comparison, content analysis and image authentication. Its applications in forensic science range from documenting crime scenes to enhancing faint or indistinct patterns such as partial fingerprints. The process of forensic image analysis can involve several different tasks, regardless of the type of image analysis performed. Through this paper the authors have tried to explain these tasks, which are described in three categories: Image Compression, Image Enhancement & Restoration and Measurement Extraction, with the help of examples like signature comparison, counterfeit currency comparison and foot-wear sole impression using the software Canvas and Corel Draw.

  5. A METHOD FOR SELECTING SOFTWARE FOR DYNAMIC EVENT ANALYSIS I: PROBLEM SELECTION

    Energy Technology Data Exchange (ETDEWEB)

    J. M. Lacy; S. R. Novascone; W. D. Richins; T. K. Larson

    2007-08-01

    New nuclear power reactor designs will require resistance to a variety of possible malevolent attacks, as well as traditional dynamic accident scenarios. The design/analysis team may be faced with a broad range of phenomena including air and ground blasts, high-velocity penetrators or shaped charges, and vehicle or aircraft impacts. With a host of software tools available to address these high-energy events, the analysis team must evaluate and select the software most appropriate for their particular set of problems. The accuracy of the selected software should then be validated with respect to the phenomena governing the interaction of the threat and structure. In this paper, we present a method for systematically comparing current high-energy physics codes for specific applications in new reactor design. Several codes are available for the study of blast, impact, and other shock phenomena. Historically, these packages were developed to study specific phenomena such as explosives performance, penetrator/target interaction, or accidental impacts. As developers generalize the capabilities of their software, legacy biases and assumptions can remain that could affect the applicability of the code to other processes and phenomena. R&D institutions generally adopt one or two software packages and use them almost exclusively, performing benchmarks on a single-problem basis. At the Idaho National Laboratory (INL), new comparative information was desired to permit researchers to select the best code for a particular application by matching its characteristics to the physics, materials, and rate scale (or scales) representing the problem at hand. A study was undertaken to investigate the comparative characteristics of a group of shock and high-strain rate physics codes including ABAQUS, LS-DYNA, CTH, ALEGRA, ALE-3D, and RADIOSS. A series of benchmark problems were identified to exercise the features and capabilities of the subject software. To be useful, benchmark problems

  6. DiaSuite: a Tool Suite To Develop Sense/Compute/Control Applications

    OpenAIRE

    Bertran, Benjamin; Bruneau, Julien; Cassou, Damien; Loriant, Nicolas; Balland, Emilie; Consel, Charles

    2014-01-01

    We present DiaSuite, a tool suite that uses a software design approach to drive the development process. DiaSuite focuses on a specific domain, namely Sense/Compute/Control (SCC) applications. It comprises a domain-specific design language, a compiler producing a Java programming framework, a 2D-renderer to simulate an application, and a deployment framework. We have validated our tool suite on a variety of concrete applications in areas including telecommunications, building automation, robo...

  7. Open source software and crowdsourcing for energy analysis

    International Nuclear Information System (INIS)

    Informed energy decision making requires effective software, high-quality input data, and a suitably trained user community. Developing these resources can be expensive and time consuming. Even when data and tools are intended for public re-use they often come with technical, legal, economic and social barriers that make them difficult to adopt, adapt and combine for use in new contexts. We focus on the promise of open, publicly accessible software and data as well as crowdsourcing techniques to develop robust energy analysis tools that can deliver crucial, policy-relevant insight, particularly in developing countries, where planning resources are highly constrained and where the need to adapt these resources and methods to the local context is high. We survey existing research, which argues that these techniques can produce high-quality results, and also explore the potential role that linked, open data can play in both supporting the modelling process and in enhancing public engagement with energy issues. - Highlights: ► We focus on the promise of open, publicly accessible software and data. ► These emerging techniques can produce high-quality results for energy analysis. ► Developing economies require new techniques for energy planning.

  8. A software architectural framework specification for neutron activation analysis

    International Nuclear Information System (INIS)

    Neutron Activation Analysis (NAA) is a sensitive multi-element nuclear analytical technique that has been routinely applied by research reactor (RR) facilities to environmental, nutritional, health-related, geological and geochemical studies. As RR facilities face calls to increase their research output and impact within existing or shrinking budgets, automation of NAA offers a possible solution. However, automation has many challenges, not the least of which is the lack of system architecture standards establishing acceptable mechanisms for the various hardware/software and software/software interactions among data acquisition systems, specialised hardware such as sample changers and sample loaders, and data processing modules. This lack of standardization often results in automation hardware and software being incompatible with the existing system components of a facility looking to automate its NAA operations, limiting the availability of automation to the few RR facilities with adequate budgets or in-house engineering resources. What is needed is a modern open system architecture for NAA that provides the required set of functionalities. This paper describes such an 'architectural framework' (OpenNAA), and portions of a reference implementation. As an example of the benefits, calculations indicate that applying this architecture to the compilation and QA steps associated with the analysis of 35 elements in 140 samples, with 14 SRMs, can reduce the time required by over 80%. The adoption of open standards in the nuclear industry has been very successful over the years in promoting interchangeability and maximising the lifetime and output of nuclear measurement systems. OpenNAA will provide similar benefits within the NAA application space, safeguarding user investments in their current systems while providing a solid path for future development. (author)

  9. Comparison of two software versions for assessment of body-composition analysis by DXA

    DEFF Research Database (Denmark)

    Vozarova, B; Wang, J; Weyer, C;

    2001-01-01

    To compare two software versions provided by the Lunar Co. for assessment of body-composition analysis by DXA.

  10. Software fault tree analysis of an automated control system device written in Ada

    OpenAIRE

    Winter, Mathias William.

    1995-01-01

    Software Fault Tree Analysis (SFTA) is a technique used to analyze software for faults that could lead to hazardous conditions in systems which contain software components. Previous thesis works have developed three Ada-based, semi-automated software analysis tools, the Automated Code Translation Tool (ACm) an Ada statement template generator, the Fault Tree Editor (Fm) a graphical fault tree editor, and the Fault Isolator (Fl) an automated software fault tree isolator. These previous works d...

  11. UPVapor: Cofrentes nuclear power plant production results analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Curiel, M. [Logistica y Acondicionamientos Industriales SAU, Sorolla Center, local 10, Av. de las Cortes Valencianas No. 58, 46015 Valencia (Spain); Palomo, M. J. [ISIRYM, Universidad Politecnica de Valencia, Camino de Vera s/n, Valencia (Spain); Baraza, A. [Iberdrola Generacion S. A., Central Nuclear Cofrentes, Carretera Almansa Requena s/n, 04662 Cofrentes, Valencia (Spain); Vaquer, J., E-mail: m.curiel@lainsa.co [TITANIA Servicios Tecnologicos SL, Sorolla Center, local 10, Av. de las Cortes Valencianas No. 58, 46015 Valencia (Spain)

    2010-10-15

    UPVapor software version 02 has been developed for the Data Analysis Department of the Cofrentes nuclear power plant (Spain). It is a graphical analysis environment that gives users access to all the plant variables registered in the process computer system (SIEC). To this end, UPVapor provides many advanced graphic tools that simplify the work, as well as a friendly, easy-to-use environment with many configuration possibilities. Plant variables are classified in the same way as in the SIEC computer, and their values are taken from it through the Iberdrola network. UPVapor can generate two different types of graphics: evolution graphs and X-Y graphs. The former analyse the evolution of up to twenty plant variables over a user-defined time period, based on historic plant files. Many tools are available: cursors, graphic configuration, moving averages, visualization of invalid data, and more. Moreover, a particular analysis configuration can be saved as a preselection, making it possible to load a preselection directly and quickly monitor a group of preselected plant variables. In X-Y graphs, the value of one variable can be analysed against another variable over a defined time. Optionally, users can filter the data by restricting a variable to a certain range, with up to five programmable filters. Like the other graph type, the X-Y graph has many configuration, saving and printing options. With UPVapor, data analysts can save valuable time in their daily work and, as the software is easy to use, other users can perform their own analyses without asking the analysts to develop them. Moreover, it can be used from any work centre with access to the network. (Author)

  12. UPVapor: Cofrentes nuclear power plant production results analysis software

    International Nuclear Information System (INIS)

    UPVapor software version 02 has been developed for the Data Analysis Department of the Cofrentes nuclear power plant (Spain). It is a graphical analysis environment that gives users access to all the plant variables registered in the process computer system (SIEC). To this end, UPVapor provides many advanced graphic tools that simplify the work, as well as a friendly, easy-to-use environment with many configuration possibilities. Plant variables are classified in the same way as in the SIEC computer, and their values are taken from it through the Iberdrola network. UPVapor can generate two different types of graphics: evolution graphs and X-Y graphs. The former analyse the evolution of up to twenty plant variables over a user-defined time period, based on historic plant files. Many tools are available: cursors, graphic configuration, moving averages, visualization of invalid data, and more. Moreover, a particular analysis configuration can be saved as a preselection, making it possible to load a preselection directly and quickly monitor a group of preselected plant variables. In X-Y graphs, the value of one variable can be analysed against another variable over a defined time. Optionally, users can filter the data by restricting a variable to a certain range, with up to five programmable filters. Like the other graph type, the X-Y graph has many configuration, saving and printing options. With UPVapor, data analysts can save valuable time in their daily work and, as the software is easy to use, other users can perform their own analyses without asking the analysts to develop them. Moreover, it can be used from any work centre with access to the network. (Author)

  13. PET-Tool: a software suite for comprehensive processing and managing of Paired-End diTag (PET) sequence data

    Directory of Open Access Journals (Sweden)

    Wei Chia-Lin

    2006-08-01

    Background: We recently developed the Paired-End diTag (PET) strategy for efficient characterization of mammalian transcriptomes and genomes. The paired-end nature of short PET sequences derived from long DNA fragments raised a new set of bioinformatics challenges, including how to extract PETs from raw sequence reads and how to map PETs to reference genome sequences correctly yet efficiently. To accommodate and streamline data analysis of the large volumes of PET sequences generated in each PET experiment, an automated PET data processing pipeline is desirable. Results: We designed an integrated computational package, PET-Tool, to automatically process PET sequences and map them to genome sequences. The Tool was implemented as a web-based application composed of four modules: the Extractor module for PET extraction; the Examiner module for analytic evaluation of PET sequence quality; the Mapper module for locating PET sequences in the genome sequences; and the ProjectManager module for data organization. The performance of PET-Tool was evaluated through the analysis of 2.7 million PET sequences. It was demonstrated that PET-Tool is accurate and efficient in extracting PET sequences and removing artifacts from large-volume datasets. Using optimized mapping criteria, over 70% of quality PET sequences were mapped specifically to the genome sequences. On a 2.4 GHz Linux machine, it takes approximately six hours to process one million PETs from extraction to mapping. Conclusion: The speed, accuracy, and comprehensiveness of PET-Tool have proved it an important and useful component in PET experiments, and it can be extended to accommodate other related analyses of paired-end sequences. The Tool also provides user-friendly functions for data quality checks and a system for multi-layer data management.

  14. HZAR: hybrid zone analysis using an R software package.

    Science.gov (United States)

    Derryberry, Elizabeth P; Derryberry, Graham E; Maley, James M; Brumfield, Robb T

    2014-05-01

    We present a new software package (HZAR) that provides functions for fitting molecular genetic and morphological data from hybrid zones to classic equilibrium cline models using the Metropolis-Hastings Markov chain Monte Carlo (MCMC) algorithm. The software applies likelihood functions appropriate for different types of data, including diploid and haploid genetic markers and quantitative morphological traits. The modular design allows flexibility in fitting cline models of varying complexity. To facilitate hypothesis testing, an autofit function is included that allows automated model selection from a set of nested cline models. Cline parameter values, such as cline centre and cline width, are estimated and may be compared statistically across clines. The package is written in the R language and is available through the Comprehensive R Archive Network (CRAN; http://cran.r-project.org/). Here, we describe HZAR and demonstrate its use with a sample data set from a well-studied hybrid zone in western Panama between white-collared (Manacus candei) and golden-collared manakins (M. vitellinus). Comparisons of our results with previously published results for this hybrid zone validate the HZAR software. We extend analysis of this hybrid zone by fitting additional models to molecular data where appropriate. PMID:24373504
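
    For illustration, the following Python sketch (not the HZAR code, which is written in R) fits the classic two-parameter sigmoid cline model to synthetic allele counts with a random-walk Metropolis sampler; the parameterization with maximum slope at the centre set by the width is one common convention:

        import numpy as np

        def cline(x, center, width, p_min=0.0, p_max=1.0):
            """Sigmoid cline: allele frequency as a function of distance x."""
            return p_min + (p_max - p_min) / (1.0 + np.exp(-4.0 * (x - center) / width))

        def log_likelihood(theta, x, n_alleles, n_observed):
            """Binomial likelihood of the observed allele counts under the cline."""
            center, width = theta
            if width <= 0:
                return -np.inf
            p = np.clip(cline(x, center, width), 1e-9, 1 - 1e-9)
            return np.sum(n_observed * np.log(p) + (n_alleles - n_observed) * np.log(1 - p))

        def metropolis(x, n_alleles, n_observed, steps=20000, scale=(5.0, 5.0)):
            """Random-walk Metropolis sampling of (centre, width)."""
            rng = np.random.default_rng(1)
            theta = np.array([np.median(x), np.ptp(x) / 2])
            ll = log_likelihood(theta, x, n_alleles, n_observed)
            chain = []
            for _ in range(steps):
                prop = theta + rng.normal(0, scale)
                ll_prop = log_likelihood(prop, x, n_alleles, n_observed)
                if np.log(rng.uniform()) < ll_prop - ll:   # accept/reject step
                    theta, ll = prop, ll_prop
                chain.append(theta.copy())
            return np.array(chain)

        # Synthetic transect: 15 sites, 40 alleles scored per site.
        sites = np.linspace(0, 500, 15)
        rng = np.random.default_rng(2)
        counts = rng.binomial(40, cline(sites, center=250, width=80))
        chain = metropolis(sites, 40, counts)
        print(chain[10000:].mean(axis=0))   # posterior means near (250, 80)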

  15. Analysis of Performance of Stereoscopic-Vision Software

    Science.gov (United States)

    Kim, Won; Ansar, Adnan; Steele, Robert; Steinke, Robert

    2007-01-01

    A team of JPL researchers has analyzed stereoscopic vision software and produced a document describing its performance. This software is of the type used in maneuvering exploratory robotic vehicles on Martian terrain. The software in question utilizes correlations between portions of the images recorded by two electronic cameras to compute stereoscopic disparities, which, in conjunction with camera models, are used in computing distances to terrain points to be included in constructing a three-dimensional model of the terrain. The analysis included effects of correlation-window size, a pyramidal image down-sampling scheme, vertical misalignment, focus, maximum disparity, stereo baseline, and range ripples. Contributions of sub-pixel interpolation, vertical misalignment, and foreshortening to stereo correlation error were examined theoretically and experimentally. It was found that camera-calibration inaccuracy contributes to both down-range and cross-range error but stereo correlation error affects only the down-range error. Experimental data for quantifying the stereo disparity error were obtained by use of reflective metrological targets taped to corners of bricks placed at known positions relative to the cameras. For the particular 1,024-by-768-pixel cameras of the system analyzed, the standard deviation of the down-range disparity error was found to be 0.32 pixel.
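
    The relationship between disparity error and down-range error can be made concrete in a few lines of Python. The focal length and baseline below are illustrative assumptions; the 0.32-pixel disparity error is the value reported above:

        import math

        # For a pinhole stereo pair, range Z = f * B / d, so a disparity error
        # dd maps to a range error dZ ~= (Z**2 / (f * B)) * dd.
        f_px = 1000.0      # focal length in pixels (assumed)
        B = 0.3            # stereo baseline in metres (assumed)
        sigma_d = 0.32     # disparity error in pixels, as reported in the study

        for Z in (2.0, 5.0, 10.0):
            sigma_z = (Z ** 2 / (f_px * B)) * sigma_d
            print(f"Z = {Z:4.1f} m  ->  down-range sigma ~ {sigma_z:.3f} m")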

  16. A software tool for 3D dose verification and analysis

    Science.gov (United States)

    Sa'd, M. Al; Graham, J.; Liney, G. P.

    2013-06-01

    The main recent developments in radiotherapy have focused on improved treatment techniques in order to generate further significant improvements in patient prognosis. There is now an internationally recognised need to improve 3D verification of highly conformal radiotherapy treatments. This is because of the very high dose gradients used in modern treatment techniques, which can result in a small error in the spatial dose distribution leading to a serious complication. In order to gain the full benefits of using 3D dosimetric technologies (such as gel dosimetry), it is vital to use 3D evaluation methods and algorithms. We present in this paper a software solution that provides a comprehensive 3D dose evaluation and analysis. The software is applied to gel dosimetry, which is based on magnetic resonance imaging (MRI) as a read-out method. The software can also be used to compare any two dose distributions, such as two distributions planned using different methods of treatment planning systems, or different dose calculation algorithms.

  17. A software tool for 3D dose verification and analysis

    International Nuclear Information System (INIS)

    The main recent developments in radiotherapy have focused on improved treatment techniques in order to generate further significant improvements in patient prognosis. There is now an internationally recognised need to improve 3D verification of highly conformal radiotherapy treatments. This is because of the very high dose gradients used in modern treatment techniques, which can result in a small error in the spatial dose distribution leading to a serious complication. In order to gain the full benefits of using 3D dosimetric technologies (such as gel dosimetry), it is vital to use 3D evaluation methods and algorithms. We present in this paper a software solution that provides a comprehensive 3D dose evaluation and analysis. The software is applied to gel dosimetry, which is based on magnetic resonance imaging (MRI) as a read-out method. The software can also be used to compare any two dose distributions, such as two distributions planned using different methods of treatment planning systems, or different dose calculation algorithms.
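
    The papers above do not specify the comparison algorithm, but a widely used metric for comparing two 3D dose distributions is the gamma index, which combines a dose-difference criterion with a distance-to-agreement criterion. A brute-force Python sketch with illustrative 3%/3 mm criteria (np.roll wraps at the volume edges, a simplification a production tool would handle explicitly):

        import numpy as np

        def gamma_index(ref, eval_, spacing, dd=0.03, dta=3.0, search_mm=6.0):
            """Global 3D gamma analysis between two dose grids.
            dd: dose-difference criterion as a fraction of the max reference dose.
            dta: distance-to-agreement criterion in mm."""
            dd_abs = dd * ref.max()
            r = [int(search_mm // s) for s in spacing]
            gamma2 = np.full(ref.shape, np.inf)
            for i in range(-r[0], r[0] + 1):
                for j in range(-r[1], r[1] + 1):
                    for k in range(-r[2], r[2] + 1):
                        shifted = np.roll(eval_, (i, j, k), axis=(0, 1, 2))
                        dist2 = ((i * spacing[0]) ** 2 + (j * spacing[1]) ** 2
                                 + (k * spacing[2]) ** 2)
                        g2 = (shifted - ref) ** 2 / dd_abs ** 2 + dist2 / dta ** 2
                        gamma2 = np.minimum(gamma2, g2)
            return np.sqrt(gamma2)

        # Toy volumes: a measured (gel) grid vs. a slightly offset planned grid.
        ref = np.random.default_rng(0).random((20, 20, 20))
        ev = ref + 0.01
        g = gamma_index(ref, ev, spacing=(2.0, 2.0, 2.0))
        print("pass rate:", (g <= 1).mean())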

  18. Metadata database and data analysis software for the ground-based upper atmospheric data developed by the IUGONET project

    Science.gov (United States)

    Hayashi, H.; Tanaka, Y.; Hori, T.; Koyama, Y.; Shinbori, A.; Abe, S.; Kagitani, M.; Kouno, T.; Yoshida, D.; Ueno, S.; Kaneda, N.; Yoneda, M.; Tadokoro, H.; Motoba, T.; Umemura, N.; Iugonet Project Team

    2011-12-01

    The Inter-university Upper atmosphere Global Observation NETwork (IUGONET) is a Japanese inter-university project by the National Institute of Polar Research (NIPR), Tohoku University, Nagoya University, Kyoto University, and Kyushu University to build a database of metadata for ground-based observations of the upper atmosphere. The IUGONET institutes/universities have been collecting various types of data with radars, magnetometers, photometers, radio telescopes, helioscopes, etc. at various locations all over the world and at various altitude layers from the Earth's surface to the Sun. The metadata database will be of great help to researchers in efficiently finding and obtaining these observational data spread over the institutes/universities. This should also facilitate synthetic analysis of multi-disciplinary data, which will lead to new types of research in the upper atmosphere. The project has also been developing software to help researchers download, visualize, and analyze the data provided by the IUGONET institutes/universities. The metadata database system is built on the platform of DSpace, an open source software package for digital repositories. The data analysis software is written in the IDL language with the TDAS (THEMIS Data Analysis Software suite) library. These products have just been released for beta testing.

  19. Development of RCM analysis software for Korean nuclear power plants

    International Nuclear Information System (INIS)

    A software package called the KEPCO RCM workstation (KRCM) has been developed to optimize the maintenance strategies of Korean nuclear power plants. The program modules of the KRCM were designed in a manner that combines EPRI methodologies and KEPRI analysis techniques. The KRCM is being applied to three pilot systems: the chemical and volume control system, main steam system, and compressed air system of Yonggwang Units 1 and 2. In addition, the KRCM can be utilized as a tool to meet part of the requirements of the maintenance rule (MR) imposed by the U.S. NRC

  20. Fault tree analysis of software at Ontario Hydro

    International Nuclear Information System (INIS)

    The fault tree technique has been used by Ontario Hydro to effectively review and verify safety critical systems at its nuclear generating stations (NGS). Recent efforts, on the Shutdown Systems at Darlington NGS and the protective fuel-handling software at Bruce NGS A, have shown the fault tree technique to be a valuable tool for uncovering latent conditional errors and facilitating recommendations to increase system fault-tolerance. The experiences of the Bruce NGS A analysis are presented here as a vehicle to illustrate the practical advantages and limitations of the fault tree technique
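
    As background, the quantitative side of fault tree analysis reduces to combining independent basic-event probabilities through AND/OR gates. A minimal Python sketch with illustrative event names and probabilities (not taken from the Bruce NGS A analysis):

        from math import prod

        def p_and(*ps):
            """AND gate: all inputs must fail."""
            return prod(ps)

        def p_or(*ps):
            """OR gate: at least one input fails."""
            return 1 - prod(1 - p for p in ps)

        p_sensor = 1e-3      # sensor fails to detect the fault (assumed)
        p_logic = 5e-4       # software logic error (assumed)
        p_actuator = 2e-3    # actuator fails on demand (assumed)

        # Hypothetical top event: protection fails if detection fails,
        # OR if both the logic and actuator paths fail together.
        p_top = p_or(p_sensor, p_and(p_logic, p_actuator))
        print(f"Top-event probability: {p_top:.2e}")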

  1. PROTEINCHALLENGE: Crowd sourcing in proteomics analysis and software development

    DEFF Research Database (Denmark)

    Martin, Sarah F.; Falkenberg, Heiner; Dyrlund, Thomas Franck;

    2013-01-01

    ... including arguments for community-wide open source software development and “big data” compatible solutions for the future. For the meantime, we have laid out ten top tips for data processing. With these at hand, a first large-scale proteomics analysis hopefully becomes less daunting to navigate ... workflows, with the aim of setting a community-driven gold standard for data handling, reporting and sharing. This article is part of a Special Issue entitled: New Horizons and Applications for Proteomics [EuPA 2012].

  2. Development of RCM analysis software for Korean nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Ho; Choi, Kwang Hee; Jeong, Hyeong Jong [Korea Electric Power Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    A software package called the KEPCO RCM workstation (KRCM) has been developed to optimize the maintenance strategies of Korean nuclear power plants. The program modules of the KRCM were designed in a manner that combines EPRI methodologies and KEPRI analysis techniques. The KRCM is being applied to three pilot systems: the chemical and volume control system, main steam system, and compressed air system of Yonggwang Units 1 and 2. In addition, the KRCM can be utilized as a tool to meet part of the requirements of the maintenance rule (MR) imposed by the U.S. NRC. 3 refs., 4 figs. (Author)

  3. Development of a software for INAA analysis automation

    International Nuclear Information System (INIS)

    In this work, software has been developed to automate the post-counting tasks in comparative INAA, aiming to be more flexible than the available options by integrating with some of the routines currently in use in the IPEN Activation Analysis Laboratory and allowing the user to choose between a fully automatic analysis and an Excel-oriented one. The software makes use of the Genie 2000 data importing and analysis routines and stores each 'energy-counts-uncertainty' table as a separate ASCII file that can be used later if required by the analyst. Moreover, it generates an Excel-compatible CSV (comma-separated values) file with only the relevant results of the analyses for each sample or comparator, as well as the results of the concentration calculations and of four different statistical tools (unweighted average, weighted average, normalized residuals and the Rajeval technique), allowing the analyst to double-check the results. Finally, a 'summary' CSV file is also produced, with the final concentration results obtained for each element in each sample. (author)
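
    Of the four statistical tools mentioned, the weighted average is the easiest to show compactly. A Python sketch of the inverse-variance weighted mean and its uncertainty (the data are illustrative, not IPEN results):

        import numpy as np

        def weighted_average(values, sigmas):
            """Inverse-variance weighted mean and its standard uncertainty:
            xbar = sum(x_i / s_i^2) / sum(1 / s_i^2),
            sigma(xbar) = (sum(1 / s_i^2)) ** -0.5."""
            w = 1.0 / np.asarray(sigmas) ** 2
            xbar = np.sum(w * np.asarray(values)) / np.sum(w)
            return xbar, np.sqrt(1.0 / np.sum(w))

        # Concentrations (mg/kg) of one element from replicate analyses.
        conc = [12.1, 11.8, 12.5, 12.0]
        unc = [0.3, 0.4, 0.5, 0.3]
        print("unweighted:", np.mean(conc))
        print("weighted:  ", weighted_average(conc, unc))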

  4. Integrating software architectures for distributed simulations and simulation analysis communities.

    Energy Technology Data Exchange (ETDEWEB)

    Goldsby, Michael E.; Fellig, Daniel; Linebarger, John Michael; Moore, Patrick Curtis; Sa, Timothy J.; Hawley, Marilyn F.

    2005-10-01

    The one-year Software Architecture LDRD (No. 79819) was a cross-site effort between Sandia California and Sandia New Mexico. The purpose of this research was to further develop and demonstrate integrating software architecture frameworks for distributed simulation and distributed collaboration in the homeland security domain. The integrated frameworks were initially developed through the Weapons of Mass Destruction Decision Analysis Center (WMD-DAC), sited at SNL/CA, and the National Infrastructure Simulation & Analysis Center (NISAC), sited at SNL/NM. The primary deliverable was a demonstration of both a federation of distributed simulations and a federation of distributed collaborative simulation analysis communities in the context of the same integrated scenario, which was the release of smallpox in San Diego, California. To our knowledge this was the first time such a combination of federations under a single scenario has ever been demonstrated. A secondary deliverable was the creation of the standalone GroupMeld™ collaboration client, which uses the GroupMeld™ synchronous collaboration framework. In addition, a small pilot experiment that used both integrating frameworks allowed a greater range of crisis management options to be performed and evaluated than would have been possible without the use of the frameworks.

  5. Tool Support for Parametric Analysis of Large Software Simulation Systems

    Science.gov (United States)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
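
    The combination of Monte Carlo sampling with n-factor combinatorial coverage can be sketched in a few lines of Python. The greedy patch-up strategy below is an illustrative stand-in for the tool's actual generator, and the parameter names are hypothetical:

        import itertools
        import random

        # Parameter values for a hypothetical simulation.
        params = {
            "mass_kg":      [400, 500, 600, 700],
            "thrust_N":     [10, 20, 40],
            "sensor_noise": [0.01, 0.05, 0.10],
            "dt_s":         [0.1, 0.5],
        }
        names = list(params)
        random.seed(0)

        # Start from a Monte Carlo sample of the parameter space ...
        cases = [{n: random.choice(params[n]) for n in names} for _ in range(20)]

        # ... then patch in any 2-factor (pairwise) combination not yet covered,
        # so every value pair of every parameter pair appears in some test case.
        for p1, p2 in itertools.combinations(names, 2):
            for v1, v2 in itertools.product(params[p1], params[p2]):
                if not any(c[p1] == v1 and c[p2] == v2 for c in cases):
                    case = {n: random.choice(params[n]) for n in names}
                    case[p1], case[p2] = v1, v2
                    cases.append(case)

        exhaustive = len(list(itertools.product(*params.values())))
        print(f"{len(cases)} cases cover all pairs vs {exhaustive} exhaustive")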

  6. Stress Analysis Of Lpg Cylinder Using Ansys Software

    Directory of Open Access Journals (Sweden)

    Suhas A.Rewatkar

    2013-01-01

    The robot hand model was analyzed using dedicated FEM analysis software. The model was exported to the FEM processor, i.e. ANSYS, where the geometry was updated and the structure meshed using 3D elements. Finite element analysis is a method to computationally model reality in a mathematical form to better understand a highly complex problem. In the real world, everything that occurs results from the interaction between atoms (and sub-particles of those atoms), billions and billions and billions of them. If we were to simulate the world in a computer, we would have to simulate this interaction based on the simple laws of physics. However, no computer can process the near-infinite number of atoms in objects, so instead we model 'finite' groups of them.

  7. Comparative analysis of results between CASMO, MCNP and Serpent for a suite of Benchmark problems on BWR reactors

    International Nuclear Information System (INIS)

    In this paper, the CASMO-4, MCNP6 and Serpent codes are compared on a suite of benchmark problems for BWR reactors. The benchmark consists of two different geometries: a BWR fuel pin cell and a BWR fuel assembly. To facilitate the study of reactor physics, the nuclear characteristics of the fuel pin are provided in detail, such as the burnup dependence, the reactivity of selected nuclides, etc. For the fuel assembly, the results presented are the infinite multiplication factor at different burnup steps and different void conditions. The analysis of this set of benchmark problems provides comprehensive test problems for the next generation of BWR fuels with extended burnup. It is important to note that the purpose of this comparison is to validate the methodologies used in modeling different operating conditions, as would apply to any other BWR assembly; the results will lie within a range with some uncertainty, regardless of the code used. The Escuela Superior de Fisica y Matematicas of the Instituto Politecnico Nacional (IPN, Mexico) has accumulated some experience in using Serpent, owing to the potential of this code over commercial codes such as CASMO and MCNP. The results obtained for the infinite multiplication factor are encouraging and motivate continuing these studies with the generation of the cross sections (XS) of a core, so that in a next step a corresponding nuclear data library can be constructed for use by the codes developed as part of the development project of the Mexican nuclear reactor analysis platform AZTLAN. (Author)

  8. Data-Interpolating Variational Analysis (DIVA) software : recent development and application

    Science.gov (United States)

    Watelet, Sylvain; Barth, Alexander; Troupin, Charles; Ouberdous, Mohamed; Beckers, Jean-Marie

    2014-05-01

    The Data-Interpolating Variational Analysis (DIVA) software is a tool designed to reconstruct a continuous field from discrete measurements. The method is based on a numerical implementation of the Variational Inverse Model (VIM), which minimizes a cost function so that the analyzed field fits the data sets best. The problem is solved efficiently using a finite-element method. This statistical method is particularly suited to dealing with irregularly spaced observations, producing outputs on a regular grid. Initially created to work in two dimensions, the software is now able to handle 3D and even 4D analyses, in order to easily produce ocean climatologies. These analyses can easily be improved by taking advantage of DIVA's ability to take topographic and dynamic constraints into account (coastal relief, prevailing winds impacting the advection, ...). In DIVA, we assume errors on measurements are not correlated: we do not consider the effect of correlated observation errors on the analysis and therefore use a diagonal observation error covariance matrix. However, oceanographic data sets are generally clustered in space and time, thus introducing some correlation between observations. In order to determine the impact of this approximation and provide strategies to mitigate its effects, we conducted several synthetic experiments with known correlation structure. Overall, the best results were obtained with a variant of the covariance inflation method. Finally, a new application of DIVA to satellite altimetry data will be presented: these data have particular space and time distributions, as they consist of repeated tracks (~10-35 days) of measurements less than 10 km apart along a given track. The tools designed to determine the analysis parameters were adapted to these specificities. Moreover, different weights were applied to measurements in order to
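
    The general form of the VIM cost function minimized by DIVA can be written as follows (a sketch consistent with the description above; the precise penalty terms and coefficients are implementation details):

        J[\varphi] = \sum_{i=1}^{N} \mu_i \left[ d_i - \varphi(\mathbf{x}_i) \right]^2
                   + \int_D \left( \alpha_2\, \nabla\nabla\varphi : \nabla\nabla\varphi
                   + \alpha_1\, \nabla\varphi \cdot \nabla\varphi
                   + \alpha_0\, \varphi^2 \right) \, dD

    where the first term penalizes misfit to the N observations d_i (with weights mu_i) and the integral penalizes non-smoothness of the reconstructed field phi over the domain D.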

  9. Analysis of Test Efficiency during Software Development Process

    OpenAIRE

    Nair, T. R. Gopalakrishnan; Suma, V.; Tiwari, Pranesh Kumar

    2012-01-01

    One of the prerequisites of any organization is unvarying sustainability in the dynamic and competitive industrial environment. The development of high-quality software is therefore an inevitable constraint on any software industry. Defect management being one of the most influential factors in the production of high-quality software, it is obligatory for software organizations to orient themselves towards effective defect management. Since the time of software evolution, testing has been deemed a...

  10. Evaluation of Peak-Fitting Software for Gamma Spectrum Analysis

    CERN Document Server

    Zahn, Guilherme S; Moralles, Maurício

    2015-01-01

    In all applications of gamma-ray spectroscopy, one of the most important and delicate parts of the data analysis is the fitting of the gamma-ray spectra, where information such as the number of counts, the position of the centroid and the width is associated with each peak of each spectrum. There is a wide choice of computer programs that perform this type of analysis, and the ones most commonly used in routine work automatically locate and fit the peaks. The fit can be made in several different ways: the most common are to fit a Gaussian function to each peak or simply to integrate the area under the peak, but some software goes far beyond that and includes several small corrections to the simple Gaussian peak function in order to compensate for secondary effects. In this work several gamma-ray spectroscopy programs are compared in the task of finding and fitting the gamma-ray peaks in spectra taken with standard sources of $^{137}$Cs, $^{60}$Co, $^{133}$Ba and $^{152}$Eu. The results...
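
    As a reference point for what such programs do, here is a Python sketch fitting a single Gaussian photopeak on a linear background to a synthetic spectrum region (one of the simple fit models described above; real packages add tailing and step corrections):

        import numpy as np
        from scipy.optimize import curve_fit

        def peak(x, area, centroid, fwhm, a, b):
            """Gaussian photopeak on a linear background a + b*x."""
            sigma = fwhm / 2.355
            g = area / (sigma * np.sqrt(2 * np.pi)) \
                * np.exp(-(x - centroid) ** 2 / (2 * sigma ** 2))
            return g + a + b * x

        # Synthetic spectrum region around a single peak (Poisson counting noise).
        rng = np.random.default_rng(3)
        ch = np.arange(480, 520)
        truth = peak(ch, area=5000, centroid=500, fwhm=4.0, a=50, b=-0.02)
        counts = rng.poisson(truth)

        popt, pcov = curve_fit(peak, ch, counts, p0=[4000, 499, 5, 40, 0],
                               sigma=np.sqrt(np.maximum(counts, 1)))
        area, centroid, fwhm = popt[:3]
        print(f"area={area:.0f}  centroid={centroid:.2f}  fwhm={fwhm:.2f}")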

  11. HistFitter software framework for statistical data analysis

    CERN Document Server

    Baak, M; Cote, D; Koutsman, A; Lorenz, J; Short, D

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fitted to data and interpreted with statistical tests. A key innovation of HistFitter is its design, which is rooted in core analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its very fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with mu...

  12. Detection and Quantification of Nitrogen Compounds in Martian Solid Samples by the Sample Analysis at Mars (SAM) Instrument Suite

    Science.gov (United States)

    Stern, Jennifer C.; Navarro-Gonzalez, Rafael; Freissinet, Caroline; McKay, Christopher P.; Archer, Paul Douglas; Buch, Arnaud; Eigenbrode, Jennifer L.; Franz, Heather; Glavin, Daniel Patrick; Ming, Douglas W.; Steele, Andrew; Szopa, Cyril; Wray, James J.; Conrad, Pamela Gales; Mahaffy, Paul R.

    2013-01-01

    The Sample Analysis at Mars (SAM) instrument suite on the Mars Science Laboratory (MSL) Curiosity Rover detected both reduced and oxidized nitrogen-bearing compounds during the pyrolysis of surface materials from three sites at Gale Crater. Preliminary detections of nitrogen species include NO, HCN, ClCN, CH3CN, and TFMA (trifluoro-N-methyl-acetamide). On Earth, nitrogen is a crucial bio-element, and nitrogen availability controls productivity in many environments. Nitrogen has also recently been detected in the form of CN in inclusions in the Martian meteorite Tissint, and isotopically heavy nitrogen (delta N-15 of approximately +100 per mille) has been measured during stepped combustion experiments in several SNC meteorites. The detection of nitrogen-bearing compounds in Martian regolith would have important implications for the habitability of ancient Mars. However, confirmation of indigenous Martian nitrogen-bearing compounds will require ruling out their formation from the terrestrial derivatization reagents (e.g. N-methyl-N-tert-butyldimethylsilyl-trifluoroacetamide, MTBSTFA, and dimethylformamide, DMF) carried for SAM's wet chemistry experiment, which contribute to the SAM background. The nitrogen species we detect in the SAM solid sample analyses can also be produced during laboratory pyrolysis experiments where these reagents are heated in the presence of perchlorate, a compound that has also been identified by SAM in Mars solid samples. However, this does not preclude a Martian origin for some of these compounds, which are present in nanomolar concentrations in SAM evolved gas analyses. Analysis of SAM data and laboratory breadboard tests are underway to determine whether nitrogen species are present at higher concentrations than can be accounted for by maximum estimates of nitrogen contribution from MTBSTFA and DMF. In addition, methods are currently being developed to use GC Column 6 (functionally similar to a commercial Q-Bond column) to separate and identify

  13. Global Optimization and Broadband Analysis Software for Interstellar Chemistry (GOBASIC)

    Science.gov (United States)

    Rad, Mary L.; Zou, Luyao; Sanders, James L.; Widicus Weaver, Susanna L.

    2016-01-01

    Context. Broadband receivers that operate at millimeter and submillimeter frequencies necessitate the development of new tools for spectral analysis and interpretation. Simultaneous, global, multimolecule, multicomponent analysis is necessary to accurately determine the physical and chemical conditions from line-rich spectra that arise from sources like hot cores. Aims: We aim to provide a robust and efficient automated analysis program to meet the challenges presented by the large spectral datasets produced by radio telescopes. Methods: We have written a program in the MATLAB numerical computing environment for simultaneous global analysis of broadband line surveys. The Global Optimization and Broadband Analysis Software for Interstellar Chemistry (GOBASIC) program uses the simplifying assumption of local thermodynamic equilibrium (LTE) for spectral analysis to determine molecular column density, temperature, and velocity information. Results: GOBASIC achieves simultaneous, multimolecule, multicomponent fitting for broadband spectra. The number of components that can be analyzed at once is only limited by the available computational resources. Analysis of subsequent sets of molecules or components is performed iteratively while taking the previous fits into account. All features of a given molecule across the entire window are fitted at once, which is preferable to the rotation diagram approach because global analysis is less sensitive to blended features and noise features in the spectra. In addition, the fitting method used in GOBASIC is insensitive to the initial conditions chosen, the fitting is automated, and fitting can be performed in a parallel computing environment. These features make GOBASIC a valuable improvement over previously available LTE analysis methods. A copy of the software is available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/585/A23
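
    The LTE assumption makes the forward model simple: each catalogued transition contributes a Doppler-shifted Gaussian whose strength follows a Boltzmann factor. A schematic Python sketch of that model (units and constants simplified; this is not GOBASIC's MATLAB implementation, and the catalogue numbers are illustrative):

        import numpy as np

        def lte_spectrum(freq_axis, lines, N, T, v_lsr, dv, Q):
            """Optically thin LTE model: each line is a Gaussian whose integrated
            strength scales as N/Q(T) * g_u * A_ul * exp(-E_u / T).
            `lines` holds (rest_freq_GHz, g_u, A_ul, E_u_K) tuples from a catalogue."""
            c = 299792.458  # speed of light in km/s
            spec = np.zeros_like(freq_axis)
            for f0, g_u, A_ul, E_u in lines:
                f_shift = f0 * (1 - v_lsr / c)          # Doppler-shifted line centre
                sigma = f0 * (dv / c) / 2.355           # FWHM dv (km/s) -> GHz
                amp = N / Q * g_u * A_ul * np.exp(-E_u / T)
                spec += amp * np.exp(-(freq_axis - f_shift) ** 2 / (2 * sigma ** 2))
            return spec

        freqs = np.linspace(230.0, 231.0, 2000)         # GHz
        catalogue = [(230.3, 21, 6.9e-7, 33.0), (230.8, 15, 2.5e-7, 97.0)]
        model = lte_spectrum(freqs, catalogue, N=1e15, T=150.0, v_lsr=5.0, dv=3.0, Q=500.0)
        print(model.max())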

  14. Fault tree synthesis for software design analysis of PLC based safety-critical systems

    Energy Technology Data Exchange (ETDEWEB)

    Koo, S. R.; Cho, C. H. [Corporate R and D Inst., Doosan Heavy Industries and Construction Co., Ltd., 39-3, Seongbok-Dong, Yongin-Si, Gyeonggi-Do 449-795 (Korea, Republic of); Seong, P. H. [Dept. of Nuclear and Quantum Engineering, Korea Advanced Inst. of Science and Technology, 373-3 Guseong-dong, Yuseong-gu, Daejeon, 305-701 (Korea, Republic of)

    2006-07-01

    As software verification and validation should be performed in the development of PLC-based safety-critical systems, software safety analysis is also considered throughout the entire software life cycle. In this paper, we propose a technique for software safety analysis in the design phase. Among the various software hazard analysis techniques, fault tree analysis is the most widely used for the safety analysis of nuclear power plant systems. Fault tree analysis also has the most intuitive notation and makes both qualitative and quantitative analyses possible. To analyze the design phase more effectively, we propose a technique of fault tree synthesis, along with a universal fault tree template for the architecture modules of nuclear software. Consequently, the safety of software can be analyzed on the basis of fault tree synthesis. (authors)

  15. Fault tree synthesis for software design analysis of PLC based safety-critical systems

    International Nuclear Information System (INIS)

    As software verification and validation should be performed in the development of PLC-based safety-critical systems, software safety analysis is also considered throughout the entire software life cycle. In this paper, we propose a technique for software safety analysis in the design phase. Among the various software hazard analysis techniques, fault tree analysis is the most widely used for the safety analysis of nuclear power plant systems. Fault tree analysis also has the most intuitive notation and makes both qualitative and quantitative analyses possible. To analyze the design phase more effectively, we propose a technique of fault tree synthesis, along with a universal fault tree template for the architecture modules of nuclear software. Consequently, the safety of software can be analyzed on the basis of fault tree synthesis. (authors)

  16. ATLAS tile calorimeter cesium calibration control and analysis software

    International Nuclear Information System (INIS)

    An online control system to calibrate and monitor the ATLAS barrel hadronic calorimeter (TileCal) with a movable radioactive source, driven by liquid flow, is described. To read out and control the system, online software has been developed using ATLAS TDAQ components such as DVS (Diagnostic and Verification System) to verify the hardware before running, IS (Information Server) for data and status exchange between networked computers, and DDC (DCS-to-DAQ Connection) to connect to the PVSS-based slow control systems of the Tile Calorimeter, high voltage and low voltage. A system of scripting facilities, based on the Python language, handles all the calibration and monitoring processes, from the hardware level to final data storage, including various abnormal situations. A Qt-based graphical user interface displays the status of the calibration system during the cesium source scan. The software for analysis of the detector response using online data is discussed. The performance of the system and first experience from the ATLAS pit are presented

  17. Acceso abierto y software libre [Open access and free software]

    OpenAIRE

    Manuel Alejandro Echeverría

    2014-01-01

    The present paper attempts to summarize what open access, open data and open access repositories are, and how free software is applied to libraries at various levels, as well as the advantages and disadvantages of free software in both business and domestic use. It also emphasizes the benefits of free and alternative web browsers. Finally, an analysis of the disadvantages of Canonical's Ubuntu operating system (for home use) and of The Document Foundation's LibreOffice office suite is carried out.

  18. Software for 3D diagnostic image reconstruction and analysis

    International Nuclear Information System (INIS)

    Recent advances in computer technologies have opened new frontiers in medical diagnostics. Interesting possibilities are the use of three-dimensional (3D) imaging and the combination of images from different modalities. Software prepared in our laboratories devoted to 3D image reconstruction and analysis from computed tomography and ultrasonography is presented. In developing our software it was assumed that it should be applicable in standard medical practice, i.e. it should work effectively with a PC. An additional feature is the possibility of combining 3D images from different modalities. The reconstruction and data processing can be conducted using a standard PC, so low investment costs result in the introduction of advanced and useful diagnostic possibilities. The program was tested on a PC using DICOM data from computed tomography and TIFF files obtained from a 3D ultrasound system. The results of the anthropomorphic phantom and patient data were taken into consideration. A new approach was used to achieve spatial correlation of two independently obtained 3D images. The method relies on the use of four pairs of markers within the regions under consideration. The user selects the markers manually and the computer calculates the transformations necessary for coupling the images. The main software feature is the possibility of 3D image reconstruction from a series of two-dimensional (2D) images. The reconstructed 3D image can be: (1) viewed with the most popular methods of 3D image viewing, (2) filtered and processed to improve image quality, (3) analyzed quantitatively (geometrical measurements), and (4) coupled with another, independently acquired 3D image. The reconstructed and processed 3D image can be stored at every stage of image processing. The overall software performance was good considering the relatively low costs of the hardware used and the huge data sets processed. The program can be freely used and tested (source code and program available at
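
    The four-marker coupling step described above corresponds to a standard least-squares rigid registration problem, solvable in closed form via SVD (the Kabsch algorithm); whether the program uses exactly this method is not stated. A Python sketch with illustrative marker coordinates:

        import numpy as np

        def rigid_transform(A, B):
            """Least-squares rigid registration (Kabsch): find R, t such that
            R @ A_i + t ~ B_i for four or more paired marker positions."""
            cA, cB = A.mean(axis=0), B.mean(axis=0)
            H = (A - cA).T @ (B - cB)
            U, _, Vt = np.linalg.svd(H)
            d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflection
            R = Vt.T @ np.diag([1, 1, d]) @ U.T
            t = cB - R @ cA
            return R, t

        # Four marker pairs picked in the CT volume (A) and the 3D US volume (B).
        A = np.array([[0, 0, 0], [50, 0, 0], [0, 60, 0], [0, 0, 40]], float)
        theta = np.deg2rad(20)
        R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                           [np.sin(theta),  np.cos(theta), 0],
                           [0, 0, 1]])
        B = A @ R_true.T + np.array([10.0, -5.0, 2.5])

        R, t = rigid_transform(A, B)
        print(np.allclose(A @ R.T + t, B))   # True: the volumes can now be fused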

  19. Further Development of Tieto Software Product Quality Analysis System

    OpenAIRE

    Moisio, Teemu

    2012-01-01

    The definition of software quality and how one experiences quality is a multifaceted matter, usually totally dependent on the user group that observes the quality from different perspectives. A common way to analyse a software product's quality is to measure its characteristics, such as usability, reliability, efficiency, expandability, testability and maintainability. Many processes have been developed for analysing software product quality. Using these processes and acting ac...

  20. Benchmarking library and application software with Data Envelopment Analysis

    OpenAIRE

    Χατζηγεωργίου, Αλέξανδρος; Στειακάκης, Εμμανουήλ

    2011-01-01

    Library software is generally believed to be well-structured and to follow certain design guidelines, owing to the need for continuous evolution and the stability of the respective APIs. We perform an empirical study to investigate whether the design of open-source library software is actually superior to that of application software. By analyzing certain design principles and heuristics that are considered important for API design, we extract a set of software metrics that are expected to reflect the ...
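
    As an illustration of the method, one classic DEA model (the input-oriented CCR formulation) is a small linear program per unit. A Python sketch with scipy, using toy metrics as inputs and outputs (illustrative numbers, not the study's actual metric set):

        import numpy as np
        from scipy.optimize import linprog

        def ccr_efficiency(X, Y, j0):
            """Input-oriented CCR DEA efficiency of unit j0, given input matrix X
            (units x inputs) and output matrix Y (units x outputs):
            min theta  s.t.  X^T l <= theta * x_j0,  Y^T l >= y_j0,  l >= 0."""
            n, m = X.shape
            s = Y.shape[1]
            c = np.zeros(n + 1)                 # decision vector z = [theta, l_1..l_n]
            c[0] = 1.0
            A_in = np.hstack([-X[j0:j0 + 1].T, X.T])       # X^T l - theta*x_j0 <= 0
            A_out = np.hstack([np.zeros((s, 1)), -Y.T])    # -Y^T l <= -y_j0
            A_ub = np.vstack([A_in, A_out])
            b_ub = np.concatenate([np.zeros(m), -Y[j0]])
            bounds = [(None, None)] + [(0, None)] * n      # theta free, l >= 0
            res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
            return res.fun

        # Toy data: units are software systems, inputs = [LOC, coupling],
        # output = [features delivered].
        X = np.array([[100, 8], [120, 6], [90, 9], [150, 12]], float)
        Y = np.array([[20], [25], [18], [22]], float)
        for j in range(len(X)):
            print(f"unit {j}: efficiency = {ccr_efficiency(X, Y, j):.3f}")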

  1. Visual data mining and analysis of software repositories

    NARCIS (Netherlands)

    Voinea, Lucian; Telea, Alexandru

    2007-01-01

    In this article we describe an ongoing effort to integrate information visualization techniques into the process of configuration management for software systems. Our focus is to help software engineers manage the evolution of large and complex software systems by offering them effective and efficie

  2. Music Education Suites

    Science.gov (United States)

    Kemp, Wayne

    2009-01-01

    This publication describes options for designing and equipping middle and high school music education suites, and suggests ways of gaining community support for including full service music suites in new and renovated school facilities. In addition to basic music suites, and practice rooms, other options detailed include: (1) small ensemble…

  3. Integrated Software Environment for Pressurized Thermal Shock Analysis

    International Nuclear Information System (INIS)

    The present paper describes the main features, and an application to a real Nuclear Power Plant (NPP), of an Integrated Software Environment (in the following referred to as the platform) developed at the University of Pisa (UNIPI) to perform Pressurized Thermal Shock (PTS) analysis. The platform is written in Java for portability and implements all the steps foreseen in the methodology developed at UNIPI for the deterministic analysis of PTS scenarios. The methodology starts with the thermal-hydraulic analysis of the NPP with a system code (such as Relap5-3D or Cathare2) during a selected transient scenario. The results so obtained are then processed to provide boundary conditions for the next step, a CFD calculation. Once the system pressure and the RPV wall temperature are known, the stresses inside the RPV wall can be calculated by means of a Finite Element (FE) code. The last step of the methodology is the Fracture Mechanics (FM) analysis, using weight functions, aimed at evaluating the stress intensity factor (KI) at the crack tip, to be compared with the critical stress intensity factor KIc. The platform automates all the steps foreseen in the methodology once the user specifies a number of boundary conditions at the beginning of the simulation.

  4. Development of the free-space optical communications analysis software

    Science.gov (United States)

    Jeganathan, Muthu; Mecherle, G. Stephen; Lesh, James R.

    1998-05-01

    The Free-space Optical Communication Analysis Software (FOCAS) was developed at the Jet Propulsion Laboratory (JPL) to provide mission planners, systems engineers and communications engineers with an easy-to-use tool to analyze direct-detection optical communication links. The FOCAS program is implemented in Microsoft Excel, giving it all the power and flexibility built into the spreadsheet. An easy-to-use interface to the spreadsheet, developed using Visual Basic for Applications (VBA), allows easy input of data and parameters. A host of pre-defined components allows an analyst to configure a link without having to know the details of the components. FOCAS replaces the over-a-decade-old FORTRAN program called OPTI, widely used previously at JPL. This paper describes the features and capabilities of the Excel-spreadsheet-based FOCAS program.
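
    The core of such a tool is a link budget: transmit power plus gains minus losses, summed in decibels. A Python sketch with illustrative values (FOCAS's component database and exact loss terms are not reproduced here):

        import math

        tx_power_dbm = 10 * math.log10(500)        # 500 mW laser -> dBm (assumed)
        tx_gain_db = 110.0                         # transmit telescope gain (assumed)
        rx_gain_db = 115.0                         # receive telescope gain (assumed)
        pointing_loss_db = -2.0                    # assumed
        optics_loss_db = -3.0                      # assumed

        wavelength = 1.064e-6                      # m
        range_m = 4.0e8                            # roughly lunar distance

        # Free-space (geometric) loss: 20*log10(lambda / (4*pi*R)).
        space_loss_db = 20 * math.log10(wavelength / (4 * math.pi * range_m))

        rx_power_dbm = (tx_power_dbm + tx_gain_db + pointing_loss_db
                        + optics_loss_db + space_loss_db + rx_gain_db)
        print(f"space loss = {space_loss_db:.1f} dB, "
              f"received power = {rx_power_dbm:.1f} dBm")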

  5. Comparison among various software for spectral analysis using germanium detectors

    International Nuclear Information System (INIS)

    Gamma spectrometry systems consist of equipment connected to microprocessors with various spectral data acquisition systems. They are generally supplied together with application software packages offering features such as peak search, peak area determination and associated uncertainties. Using two different Ge detectors and analysis methods based on an efficiency curve and on peak-to-peak comparison, in addition to the usual spectrometry practice, the experimental set-up was assembled in order to obtain data that allow the performance of the codes to be followed by means of comparisons. In this work, a comparative study of the codes most employed in laboratories is presented, with the purpose of checking, under certain conditions, their response with respect to the radioactive standards most widely used to calibrate gamma-ray measurements. (author)

  6. The Database and Data Analysis Software of Radiation Monitoring System

    International Nuclear Information System (INIS)

    Shanghai Synchrotron Radiation Facility (SSRF for short) is a third-generation light source being built in China, comprising a 150 MeV injector, a 3.5 GeV booster, a 3.5 GeV storage ring and a number of beamline stations. The data are fetched by the monitoring computer from collecting modules in the front end and saved in a MySQL database on the managing computer. The data analysis software is coded in Python, a scripting language, to query, summarize and plot the data of a given monitoring channel during a given period and export them to an external file. In addition, warning events can be queried separately. The website for historical and real-time data query and plotting is coded in PHP. (authors)
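
    A minimal Python sketch of the query-and-plot cycle described above. The real system stores data in MySQL; sqlite3 is used here as a stand-in so the sketch is self-contained, and the table monitor_data(channel_id, ts, value) is a hypothetical schema, not the actual SSRF one:

        import sqlite3
        import matplotlib.pyplot as plt

        # Build a tiny in-memory stand-in for the monitoring database.
        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE monitor_data (channel_id INTEGER, ts TEXT, value REAL)")
        conn.executemany(
            "INSERT INTO monitor_data VALUES (?, ?, ?)",
            [(42, f"2010-01-{d:02d}", 0.10 + 0.01 * d) for d in range(1, 31)],
        )

        # Query one channel over a period, ordered by time.
        rows = conn.execute(
            "SELECT ts, value FROM monitor_data "
            "WHERE channel_id = ? AND ts BETWEEN ? AND ? ORDER BY ts",
            (42, "2010-01-01", "2010-01-31"),
        ).fetchall()
        conn.close()

        times, doses = zip(*rows)
        plt.plot(range(len(times)), doses)
        plt.xticks(range(0, len(times), 7), times[::7], rotation=45)
        plt.ylabel("dose rate (illustrative units)")
        plt.tight_layout()
        plt.savefig("channel_42_jan.png")   # export, as the software does to a file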

  7. Knowledge-based requirements analysis for automating software development

    Science.gov (United States)

    Markosian, Lawrence Z.

    1988-01-01

    We present a new software development paradigm that automates the derivation of implementations from requirements. In this paradigm, informally stated requirements are expressed in a domain-specific requirements specification language. This language is machine-understandable, and requirements expressed in it are captured in a knowledge base. Once the requirements are captured, more detailed specifications and eventually implementations are derived by the system using transformational synthesis. A key characteristic of the process is that the required human intervention is in the form of providing problem- and domain-specific engineering knowledge, not in writing detailed implementations. We describe a prototype system that applies the paradigm in the realm of communication engineering: the prototype automatically generates implementations of buffers following analysis of the requirements on each buffer.

  8. CancellationTools: All-in-one software for administration and analysis of cancellation tasks.

    Science.gov (United States)

    Dalmaijer, Edwin S; Van der Stigchel, Stefan; Nijboer, Tanja C W; Cornelissen, Tim H W; Husain, Masud

    2015-12-01

    In a cancellation task, a participant is required to search for and cross out ("cancel") targets, which are usually embedded among distractor stimuli. The number of cancelled targets and their location can be used to diagnose the neglect syndrome after stroke. In addition, the organization of search provides a potentially useful way to measure executive control over multitarget search. Although many useful cancellation measures have been introduced, most fail to make their way into research studies and clinical practice due to the practical difficulty of acquiring such parameters from traditional pen-and-paper measures. Here we present new, open-source software that is freely available to all. It allows researchers and clinicians to flexibly administer computerized cancellation tasks using stimuli of their choice, and to directly analyze the data in a convenient manner. The automated analysis suite provides output that includes almost all of the currently existing measures, as well as several new ones introduced here. All tasks can be performed using either a computer mouse or a touchscreen as an input device, and an online version of the task runtime is available for tablet devices. A summary of the results is produced in a single A4-sized PDF document, including high quality data visualizations. For research purposes, batch analysis of large datasets is possible. In sum, CancellationTools allows users to employ a flexible, computerized cancellation task, which provides extensive benefits and ease of use. PMID:25381020

  9. An analysis of containment, surveillance, and authentication for software

    Energy Technology Data Exchange (ETDEWEB)

    Hunteman, W.J.

    1995-09-01

    The "Protection, Containment, Surveillance, and Authentication (PCSA)" software was evaluated. The purpose of the evaluation was to determine the effectiveness of PCSA software in meeting the goals of a cost-effective, non-intrusive method for software authentication. Conclusions from the evaluation include: (1) commercially available software should be evaluated for the ability to provide the necessary control of access to and use of a PC; (2) the proposed software authentication methodology is not a cost-effective solution for the Agency and should be abandoned.

  10. Development of Spectrometer Software for Electromagnetic Radiation Measurement and Analysis

    International Nuclear Information System (INIS)

    This software was developed in LabVIEW for use with the StellarNet Spectrometer system. The StellarNet Spectrometer is supplied with the SpectraWiz operating software, which can measure spectral data for real-time spectroscopy. The LabVIEW software accesses real-time data through the SpectraWiz dynamic link library, which serves as the hardware interface, and acquires the amplitude at every electromagnetic wavelength at periodic intervals. In addition to hardware interfacing, the user-interface capabilities of the software include plotting of spectral data in various modes, including scope, absorbance, transmission and irradiance modes. This software can be used for research and development in the application, utilization and safety of electromagnetic radiation, especially solar, laser and ultraviolet radiation. Its off-line capabilities are almost unlimited owing to the mathematical and signal processing functions available in the LabVIEW add-on library. (author)

  11. Study of gamma ray analysis software's. Application to activation analysis of geological samples

    International Nuclear Information System (INIS)

    A comparative evaluation of the gamma-ray analysis software VISPECT was performed against two commercial gamma-ray analysis software packages, OMNIGAM (EG and G Ortec) and SAMPO 90 (Canberra). For this evaluation, artificial gamma-ray spectra were created, presenting peaks of different intensities located at four different regions of the spectrum. Multiplet peaks with equal and different intensities, but with different channel separations, were also created. The results obtained showed a good performance of VISPECT in detecting and analysing single and multiplet peaks of different intensities in the gamma-ray spectrum. Neutron activation analysis of the geological reference material GS-N (IWG-GIT) and of the granite G-94, used in a Proficiency Testing Trial of Analytical Geochemistry Laboratories, was also performed, in order to evaluate the VISPECT software in the analysis of real samples. The results obtained using VISPECT were as good as or better than those obtained using the other programs. (author)

  12. Software for neutron activation analysis at reactor IBR-2, FLNP, JINR

    CERN Document Server

    Zlokazov, V B

    2004-01-01

    A Delphi program suite, developed for processing gamma-spectra of induced activity of nuclei obtained from neutron activation measurements at the reactor IBR-2, FLNP, JINR, is reported. This suite contains components intended for carrying out all the operations of the analysis cycle, starting with a data acquisition program for gamma-spectrometers, Gamma (written in C++ Builder), and including Delphi programs for the subsequent steps of the analysis. (6 refs).

  13. The Application and Extension of Backward Software Analysis

    CERN Document Server

    Perisic, Aleksandar

    2010-01-01

    Backward software analysis is a method that emanates from executing a program backwards: instead of taking input data and following the execution path, we start from the output data and, by executing the program backwards command by command, analyze the data that could lead to the current output. The changed perspective forces a developer to think about the program in a new way. It can be applied as a thorough procedure or as a casual method. This method offers many advantages in testing and in algorithm and system analysis. For example, in testing the advantage is obvious if the set of output data is smaller than the set of possible inputs. For some programs or algorithms, we know the output data more precisely, so backward analysis can help in reducing the number of test cases or even in strict verification of an algorithm. The difficulty lies in the fact that we need types of data that no programming language currently supports, so we need additional effort to understand how this method works, or what effort we need to ...
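
    As an illustration of the idea (a toy example, not the paper's procedure), consider running the two-command program y = x * x; z = y + 1 backwards from an observed output: each command is undone in turn, yielding the set of inputs that could have produced that output.

```python
# Toy illustration of backward analysis: enumerate the inputs that could
# have produced an observed output of the program "y = x * x; z = y + 1",
# undoing one command at a time.
def backward_candidates(z_observed):
    # Undo the last command: z = y + 1  =>  y = z - 1
    y = z_observed - 1
    # Undo the first command: y = x * x  =>  x in {+sqrt(y), -sqrt(y)}
    if y < 0:
        return []            # no input can lead to this output
    root = y ** 0.5
    return [root, -root]

print(backward_candidates(10))   # [3.0, -3.0]
print(backward_candidates(0))    # [] -- no valid input exists
```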

  14. A practical approach to handling the uncertainty analysis in gamma spectroscopy with the software's Gamma Vision and Genie

    International Nuclear Information System (INIS)

    Full text: The national Swedish network of laboratories in emergency response and preparedness should provide fast and reliable measurements. That is why results should be given with a measure of their quality, which is the measurement uncertainty, as stated in several international standards. Many gamma spectroscopy software packages contain advanced algorithms for calculation of the activity and its measurement uncertainty, and even include elements of quality assurance and quality control. Despite that, not all sources of uncertainty are always taken into account. The two most used analysis software packages in the Swedish network of laboratories in emergency response are Gamma Vision from Ortec and Genie (with and without APEX) from Canberra. The purpose of this paper is to present two groups of practical evaluations of uncertainty components for the same kind of gamma-spectroscopy analysis, one that would suit Gamma Vision users and another for Genie users, including the LabSOCS tool. The main idea is to exploit the software capabilities as much as possible and to semi-manually add the contribution of uncertainty sources that are not taken into account. The reports from both software packages are modified so as to reflect the contribution of all sources of uncertainty in the reported relative combined uncertainty. The examples of gamma spectroscopy analysis are for samples of the same matrix and the different geometries foreseen in the context of emergency response by the Swedish emergency network. Together with the evaluation of the uncertainty components, a review of the uncertainty propagation and the assumptions made in each of the software packages is presented. (author)
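
    The semi-manual step described above amounts to combining the software-reported relative uncertainty with the missing components in quadrature, as prescribed by the GUM for independent sources. A minimal sketch, with placeholder component values that are not from the paper:

```python
# Combine the software-reported relative uncertainty with components it
# does not account for, in quadrature (independent sources, per the GUM).
# All numerical values below are placeholders, not from the paper.
import math

u_software = 0.032      # relative combined uncertainty reported by the software
extra_components = {
    "sample geometry":    0.020,
    "density correction": 0.015,
    "nuclear data":       0.010,
}

u_total = math.sqrt(u_software**2 + sum(u**2 for u in extra_components.values()))
print(f"reported: {u_software:.1%}, with extra components: {u_total:.1%}")
```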

  15. An analysis software of tritium distribution in food and environmental water in China

    International Nuclear Information System (INIS)

    Objective: The purpose of developing this analysis software for the distribution of tritium in food and environmental water is to collect tritium monitoring data and to analyze the data automatically, statistically and graphically, as well as to study and share the data. Methods: Based on the data obtained previously, the analysis software was written in VC++.NET. The software first transfers data from Excel into a database, and it includes a data-append function so that operators can easily add new monitoring data. Results: Once the monitoring data, saved as Excel files by the original researchers, are turned into a database, they can be accessed easily. The software provides a tool for analyzing the distribution of tritium. Conclusion: This software is a first attempt at data analysis of tritium levels in food and environmental water in China. Data archiving, searching and analysis become easy and direct with the software. (authors)
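
    A minimal sketch of the Excel-to-database transfer and the data-append function, written here in Python with SQLite rather than the authors' VC++.NET; file, table and column names are hypothetical:

```python
# Sketch of the Excel-to-database transfer and the data-append function,
# using Python and SQLite instead of the authors' VC++.NET implementation.
# File, table and column names are hypothetical.
import sqlite3
import pandas as pd

def append_monitoring_data(xlsx_path, db_path="tritium.db"):
    df = pd.read_excel(xlsx_path)   # columns e.g. region, sample, date, tritium_bq_l
    with sqlite3.connect(db_path) as conn:
        df.to_sql("tritium_levels", conn, if_exists="append", index=False)

append_monitoring_data("survey_2004.xlsx")

# Distribution analysis: mean tritium level per region
with sqlite3.connect("tritium.db") as conn:
    stats = pd.read_sql("SELECT region, AVG(tritium_bq_l) AS mean_bq_l "
                        "FROM tritium_levels GROUP BY region", conn)
print(stats)
```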

  16. Demon voltammetry and analysis software: Analysis of cocaine-induced alterations in dopamine signaling using multiple kinetic measures

    Science.gov (United States)

    Yorgason, Jordan T.; España, Rodrigo A.; Jones, Sara R.

    2011-01-01

    The fast sampling rates of fast scan cyclic voltammetry make it a favorable method for measuring changes in brain monoamine release and uptake kinetics in slice, anesthetized, and freely moving preparations. The most common analysis technique for evaluating changes in dopamine signaling uses well-established Michaelis-Menten kinetic methods that can accurately model dopamine release and uptake parameters across multiple experimental conditions. Nevertheless, over the years, many researchers have turned to other measures to estimate changes in dopamine release and uptake, yet to our knowledge no systematic comparison amongst these measures has been conducted. To address this lack of uniformity in kinetic analyses, we have created the Demon Voltammetry and Analysis software suite, which is freely available to academic and non-profit institutions. Here we present an explanation of the Demon Acquisition and Analysis features, and demonstrate its utility for acquiring voltammetric data under in vitro, in vivo anesthetized, and freely moving conditions. Additionally, the software was used to compare the sensitivity of multiple kinetic measures of release and uptake to cocaine-induced changes in electrically evoked dopamine efflux in nucleus accumbens core slices. Specifically, we examined and compared tau, full width at half height, half-life, T20, T80, slope, peak height, calibrated peak dopamine concentration, and area under the curve to the well-characterized Michaelis-Menten parameters, dopamine per pulse, maximal uptake rate, and apparent affinity. Based on observed results we recommend tau for measuring dopamine uptake and calibrated peak dopamine concentration for measuring dopamine release. PMID:21392532
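
    Two of the compared measures, calibrated peak dopamine concentration and full width at half height, are simple to compute from an evoked transient. A sketch under assumed values (the calibration factor, sampling rate and synthetic trace below are illustrative, not Demon's actual implementation):

```python
# Compute calibrated peak dopamine concentration and full width at half
# height (FWHH) from a single evoked transient. Calibration factor and
# sampling rate are placeholders; assumes a unimodal transient.
import numpy as np

def peak_and_fwhh(trace_nA, sample_rate_hz=10.0, nA_per_uM=50.0):
    peak_idx = int(np.argmax(trace_nA))
    peak_uM = trace_nA[peak_idx] / nA_per_uM     # calibrated peak [DA], uM
    half = trace_nA[peak_idx] / 2.0
    above = np.where(trace_nA >= half)[0]        # samples at or above half height
    fwhh_s = (above[-1] - above[0]) / sample_rate_hz
    return peak_uM, fwhh_s

# synthetic evoked transient: fast release followed by exponential uptake
t = np.arange(0, 10, 0.1)
trace = 100.0 * (1 - np.exp(-t / 0.3)) * np.exp(-t / 1.5)
print(peak_and_fwhh(trace))
```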

  17. RVA. 3-D Visualization and Analysis Software to Support Management of Oil and Gas Resources

    Energy Technology Data Exchange (ETDEWEB)

    Keefer, Donald A. [Univ. of Illinois, Champaign, IL (United States); Shaffer, Eric G. [Univ. of Illinois, Champaign, IL (United States); Storsved, Brynne [Univ. of Illinois, Champaign, IL (United States); Vanmoer, Mark [Univ. of Illinois, Champaign, IL (United States); Angrave, Lawrence [Univ. of Illinois, Champaign, IL (United States); Damico, James R. [Univ. of Illinois, Champaign, IL (United States); Grigsby, Nathan [Univ. of Illinois, Champaign, IL (United States)

    2015-12-01

    A free software application, RVA, has been developed as a plugin to the US DOE-funded ParaView visualization package, to support the visualization and analysis of complex reservoirs being managed using multi-fluid EOR techniques. RVA, for Reservoir Visualization and Analysis, was developed as an open-source plugin to the 64-bit Windows version of ParaView 3.14. RVA was developed at the University of Illinois at Urbana-Champaign, with contributions from the Illinois State Geological Survey, the Department of Computer Science and the National Center for Supercomputing Applications. RVA was designed to utilize and enhance the state-of-the-art visualization capabilities within ParaView, readily allowing joint visualization of geologic framework and reservoir fluid simulation model results. Particular emphasis was placed on enabling visualization and analysis of simulation results highlighting multiple fluid phases, multiple properties for each fluid phase (including flow lines), multiple geologic models and multiple time steps. Additional advanced functionality was provided through the development of custom code to implement data mining capabilities. The built-in functionality of ParaView provides the capacity to process and visualize data sets ranging from small models on local desktop systems to extremely large models created and stored on remote supercomputers. The RVA plugin that we developed and the associated User Manual provide improved functionality through new software tools, and instruction in the use of ParaView-RVA, targeted to petroleum engineers and geologists in industry and research. The RVA web site (http://rva.cs.illinois.edu) provides an overview of functions, and the development web site (https://github.com/shaffer1/RVA) provides ready access to the source code, compiled binaries, user manual, and a suite of demonstration data sets. Key functionality has been included to support a range of reservoir visualization and analysis needs, including

  18. Software selection based on analysis and forecasting methods, practised in 1C

    Science.gov (United States)

    Vazhdaev, A. N.; Chernysheva, T. Y.; Lisacheva, E. I.

    2015-09-01

    The research focuses on the built-in mechanisms of the “1C: Enterprise 8” platform for data analysis and forecasting. It is important to evaluate and select proper software to develop effective strategies for customer relationship management in terms of sales, as well as for the implementation and further maintenance of software. The research data allow the creation of new forecast models to schedule further software distribution.

  19. Proceedings Fourth International Workshop on Testing, Analysis and Verification of Web Software

    CERN Document Server

    Salaün, Gwen; Hallé, Sylvain; 10.4204/EPTCS.35

    2010-01-01

    This volume contains the papers presented at the fourth international workshop on Testing, Analysis and Verification of Web Software, which was associated with the 25th IEEE/ACM International Conference on Automated Software Engineering (ASE 2010). The collection of papers includes research on formal specification, model-checking, testing, and debugging of Web software.

  20. Spacelab user implementation assessment study. (Software requirements analysis). Volume 2: Technical report

    Science.gov (United States)

    1976-01-01

    The engineering analyses and evaluation studies conducted for the Software Requirements Analysis are discussed. Included are the development of the study data base, synthesis of implementation approaches for software required by both mandatory onboard computer services and command/control functions, and identification and implementation of software for ground processing activities.

  1. Software selection based on analysis and forecasting methods, practised in 1C

    OpenAIRE

    Vazhdaev, Andrey Nikolaevich; Chernysheva, Tatiana Yurievna; Lisacheva, E. I.

    2015-01-01

    The research focuses on the built-in mechanisms of the "1C: Enterprise 8" platform for data analysis and forecasting. It is important to evaluate and select proper software to develop effective strategies for customer relationship management in terms of sales, as well as for the implementation and further maintenance of software. The research data allow the creation of new forecast models to schedule further software distribution.

  2. Software requirements definition Shipping Cask Analysis System (SCANS)

    International Nuclear Information System (INIS)

    The US Nuclear Regulatory Commission (NRC) staff reviews the technical adequacy of applications for certification of designs of shipping casks for spent nuclear fuel. In order to confirm an acceptable design, the NRC staff may perform independent calculations. The current NRC procedure for confirming cask design analyses is laborious and tedious. Most of the work is currently done by hand or through the use of a remote computer network. The time required to certify a cask can be long. The review process may vary somewhat with the engineer doing the reviewing. Similarly, the documentation on the results of the review can also vary with the reviewer. To increase the efficiency of this certification process, LLNL was requested to design and write an integrated set of user-oriented, interactive computer programs for a personal microcomputer. The system is known as the NRC Shipping Cask Analysis System (SCANS). The computer codes and the software system supporting these codes are being developed and maintained for the NRC by LLNL. The objective of this system is generally to lessen the time and effort needed to review an application. Additionally, an objective of the system is to assure standardized methods and documentation of the confirmatory analyses used in the review of these cask designs. A software system should be designed based on NRC-defined requirements contained in a requirements document. The requirements document is a statement of a project's wants and needs as the users and implementers jointly understand them. The requirements document states the desired end products (i.e. WHAT's) of the project, not HOW the project provides them. This document describes the wants and needs for the SCANS system. 1 fig., 3 tabs

  3. Software for quantitative analysis of radiotherapy: overview, requirement analysis and design solutions.

    Science.gov (United States)

    Zhang, Lanlan; Hub, Martina; Mang, Sarah; Thieke, Christian; Nix, Oliver; Karger, Christian P; Floca, Ralf O

    2013-06-01

    Radiotherapy is a fast-developing discipline which plays a major role in cancer care. Quantitative analysis of radiotherapy data can improve the success of the treatment and support the prediction of outcome. In this paper, we first identify functional, conceptual and general requirements on a software system for quantitative analysis of radiotherapy. Further, we present an overview of existing radiotherapy analysis software tools and check them against the stated requirements. As none of them could meet all of the demands presented herein, we analyzed possible conceptual problems and present software design solutions and recommendations to meet the stated requirements (e.g. algorithmic decoupling via a dose iterator pattern; analysis database design). As a proof of concept we developed a software library, "RTToolbox", following the presented design principles. The RTToolbox is available as an open-source library and has already been tested in a larger-scale software system for different use cases. These examples demonstrate the benefit of the presented design principles. PMID:23523366
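
    The "algorithmic decoupling via a dose iterator pattern" mentioned above can be illustrated with a minimal sketch: analysis algorithms consume dose values through an iterator interface and remain independent of how the dose distribution is stored. The class and function names are illustrative, not the RTToolbox API:

```python
# Minimal sketch of the dose-iterator idea: analysis algorithms consume
# dose values through an iterator and stay decoupled from the storage
# layout of the dose distribution. Names are illustrative only.
from typing import Iterator

class VoxelDose:
    """Dose of one voxel plus the voxel volume inside the structure."""
    def __init__(self, dose_gy: float, volume_cm3: float):
        self.dose_gy = dose_gy
        self.volume_cm3 = volume_cm3

def mean_dose(doses: Iterator[VoxelDose]) -> float:
    """Volume-weighted mean dose; works for any dose source."""
    total_dose = total_vol = 0.0
    for v in doses:
        total_dose += v.dose_gy * v.volume_cm3
        total_vol += v.volume_cm3
    return total_dose / total_vol

# Any storage layout just has to yield VoxelDose items:
grid = [VoxelDose(1.8, 0.027), VoxelDose(2.0, 0.027), VoxelDose(2.1, 0.013)]
print(mean_dose(iter(grid)))
```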

  4. Control software analysis, Part I Open-loop properties

    CERN Document Server

    Feron, Eric

    2008-01-01

    As the digital world enters further into everyday life, questions are raised about the increasing challenges brought by the interaction of real-time software with physical devices. Many accidents and incidents encountered in areas as diverse as medical systems, transportation systems or weapon systems are ultimately attributed to "software failures". Since real-time software that interacts with physical systems might as well be called control software, the long litany of accidents due to real-time software failures might be taken as an equally long list of opportunities for control systems engineering. In this paper, we are interested only in run-time errors in those pieces of software that are a direct implementation of control system specifications: For well-defined and well-understood control architectures such as those present in standard textbooks on digital control systems, the current state of theoretical computer science is well-equipped enough to address and analyze control algorithms. It appears tha...

  5. Improving systems software security through program analysis and instrumentation

    OpenAIRE

    Kuznetsov, Volodymyr

    2016-01-01

    Security and reliability bugs are prevalent in systems software. Systems code is often written in low-level languages like C/C++, which offer many benefits but also delegate memory management and type safety to programmers. This invites bugs that cause crashes or can be exploited by attackers to take control of the program. This thesis presents techniques to detect and fix security and reliability issues in systems software without burdening the software developers. First, we present code-po...

  6. Contracts in Offshore Software Development: An Empirical Analysis

    OpenAIRE

    Anandasivam Gopal; Konduru Sivaramakrishnan; Krishnan, M. S.; Tridas Mukhopadhyay

    2003-01-01

    We study the determinants of contract choice in offshore software development projects and examine how the choice of contract and other factors in the project affect project profits accruing to the software vendor. Using data collected on 93 offshore projects from a leading Indian software vendor, we provide evidence that specific vendor-, client-, and project-related characteristics such as requirement uncertainty, project team size, and resource shortage significantly explain contract choic...

  7. Space Suit Joint Torque Testing

    Science.gov (United States)

    Valish, Dana J.

    2011-01-01

    In 2009 and early 2010, a test was performed to quantify the torque required to manipulate joints in several existing operational and prototype space suits in an effort to develop joint torque requirements appropriate for a new Constellation Program space suit system. The same test method was levied on the Constellation space suit contractors to verify that their suit design meets the requirements. However, because the original test was set up and conducted by a single test operator there was some question as to whether this method was repeatable enough to be considered a standard verification method for Constellation or other future space suits. In order to validate the method itself, a representative subset of the previous test was repeated, using the same information that would be available to space suit contractors, but set up and conducted by someone not familiar with the previous test. The resultant data was compared using graphical and statistical analysis and a variance in torque values for some of the tested joints was apparent. Potential variables that could have affected the data were identified and re-testing was conducted in an attempt to eliminate these variables. The results of the retest will be used to determine if further testing and modification is necessary before the method can be validated.

  8. Scientific Visualization Using the Flow Analysis Software Toolkit (FAST)

    Science.gov (United States)

    Bancroft, Gordon V.; Kelaita, Paul G.; Mccabe, R. Kevin; Merritt, Fergus J.; Plessel, Todd C.; Sandstrom, Timothy A.; West, John T.

    1993-01-01

    Over the past few years the Flow Analysis Software Toolkit (FAST) has matured into a useful tool for visualizing and analyzing scientific data on high-performance graphics workstations. Originally designed for visualizing the results of fluid dynamics research, FAST has demonstrated its flexibility by being used in several other areas of scientific research. These research areas include earth and space sciences, acid rain and ozone modelling, and automotive design, just to name a few. This paper describes the current status of FAST, including the basic concepts, architecture, existing functionality and features, and some of the known applications for which FAST is being used. A few of the applications, by both NASA and non-NASA agencies, are outlined in more detail. Described in the outlines are the goals of each visualization project, the techniques or 'tricks' used to produce the desired results, and custom modifications to FAST, if any, done to further enhance the analysis. Some of the future directions for FAST are also described.

  9. CAVASS: a computer-assisted visualization and analysis software system.

    Science.gov (United States)

    Grevera, George; Udupa, Jayaram; Odhner, Dewey; Zhuge, Ying; Souza, Andre; Iwanaga, Tad; Mishra, Shipra

    2007-11-01

    The Medical Image Processing Group at the University of Pennsylvania has been developing (and distributing with source code) medical image analysis and visualization software systems for a long period of time. Our most recent system, 3DVIEWNIX, was first released in 1993. Since that time, a number of significant advancements have taken place with regard to computer platforms and operating systems, networking capability, the rise of parallel processing standards, and the development of open-source toolkits. The development of CAVASS by our group is the next generation of 3DVIEWNIX. CAVASS will be freely available and open source, and it is integrated with toolkits such as the Insight Toolkit and the Visualization Toolkit. CAVASS runs on Windows, Unix, Linux, and Mac but shares a single code base. Rather than requiring expensive multiprocessor systems, it seamlessly provides for parallel processing via inexpensive clusters of workstations for the more time-consuming algorithms. Most importantly, CAVASS is directed at the visualization, processing, and analysis of 3-dimensional and higher-dimensional medical imagery, so support for Digital Imaging and Communications in Medicine (DICOM) data and the efficient implementation of algorithms are given paramount importance. PMID:17786517

  10. Development and applications of Kramers-Kronig PEELS analysis software

    International Nuclear Information System (INIS)

    A Kramers-Kronig analysis program was developed as a custom function for the GATAN parallel electron energy loss spectroscopy (PEELS) software package EL/P. When used with a JEOL 4000EX high-resolution transmission electron microscope, this program makes it possible to measure the dielectric functions of materials with an energy resolution of approximately 1.4 eV. The imaginary part of the dielectric function is particularly useful, since it allows the magnitude of the band gap to be determined for relatively wide-gap materials. More importantly, changes in the gap may be monitored at high spatial resolution when used in conjunction with the HRTEM images. The principles of the method are described and applications are presented for Type-1a gem-quality diamond, before and after neutron irradiation. The former shows a band gap of about 5.8 eV, as expected, whereas for the latter the gap appears to be effectively collapsed. The core-loss spectra confirm that Type-1a diamond has pure sp3 tetrahedral bonding, whereas the neutron-irradiated diamond has mixed sp2/sp3 bonding. Analysis of the low-loss spectra for the neutron-irradiated specimen yielded a density of 1.6 g/cm3, approximately half that of diamond. 10 refs., 2 figs

  11. HistFitter software framework for statistical data analysis

    International Nuclear Information System (INIS)

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface. (orig.)

  12. HistFitter software framework for statistical data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Baak, M. [CERN, Geneva (Switzerland); Besjes, G.J. [Radboud University Nijmegen, Nijmegen (Netherlands); Nikhef, Amsterdam (Netherlands); Cote, D. [University of Texas, Arlington (United States); Koutsman, A. [TRIUMF, Vancouver (Canada); Lorenz, J. [Ludwig-Maximilians-Universitaet Muenchen, Munich (Germany); Excellence Cluster Universe, Garching (Germany); Short, D. [University of Oxford, Oxford (United Kingdom)

    2015-04-15

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface. (orig.)

  13. HistFitter software framework for statistical data analysis

    Science.gov (United States)

    Baak, M.; Besjes, G. J.; Côté, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-04-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface.

  14. Pattern recognition software and techniques for biological image analysis.

    Directory of Open Access Journals (Sweden)

    Lior Shamir

    Full Text Available The increasing prevalence of automated image acquisition systems is enabling new types of microscopy experiments that generate large image datasets. However, there is a perceived lack of robust image analysis systems required to process these diverse datasets. Most automated image analysis systems are tailored for specific types of microscopy, contrast methods, probes, and even cell types. This imposes significant constraints on experimental design, limiting their application to the narrow set of imaging methods for which they were designed. One of the approaches to address these limitations is pattern recognition, which was originally developed for remote sensing, and is increasingly being applied to the biology domain. This approach relies on training a computer to recognize patterns in images rather than developing algorithms or tuning parameters for specific image processing tasks. The generality of this approach promises to enable data mining in extensive image repositories, and provide objective and quantitative imaging assays for routine use. Here, we provide a brief overview of the technologies behind pattern recognition and its use in computer vision for biological and biomedical imaging. We list available software tools that can be used by biologists and suggest practical experimental considerations to make the best use of pattern recognition techniques for imaging assays.
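
    The approach described above — training a classifier on feature vectors extracted from labelled images rather than hand-tuning an image-processing pipeline — can be sketched generically with scikit-learn, using its bundled digits data as a stand-in for microscopy images; this is not any specific tool from the overview:

```python
# Generic sketch of the pattern-recognition approach: train a classifier
# on labelled image feature vectors instead of hand-tuning an image
# pipeline. The digits dataset stands in for microscopy images.
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

images, labels = load_digits(return_X_y=True)    # each row: flattened 8x8 image
X_train, X_test, y_train, y_test = train_test_split(
    images, labels, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)                        # "training the computer"
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```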

  15. Software design analysis technique for the development of PLC-based safety-critical systems

    Energy Technology Data Exchange (ETDEWEB)

    Koo, Seo Ryong; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejeon (Korea, Republic of)

    2005-11-15

    To develop and implement a safety-critical system, the requirements of the system must be analyzed thoroughly during the phases of the software development life cycle, because a single error in the requirements can generate serious software faults. In this study, a nuclear FBD-style design specification and analysis (NuFDS) approach is proposed for PLC-based safety-critical systems. The NuFDS approach is presented in a straightforward manner for the effective and formal specification and analysis of software designs. Accordingly, the proposed NuFDS approach comprises one technique for specifying the software design and another for analyzing it.

  16. Research on Application of Enhanced Neural Networks in Software Risk Analysis

    Institute of Scientific and Technical Information of China (English)

    Zhenbang Rong; Juhua Chen; Mei Liu; Yong Hu

    2006-01-01

    This paper puts forward a risk analysis model for software projects using enhanced neural networks. The data for analysis are acquired through questionnaires from real software projects. To address the multicollinearity in software risks, the method of principal components analysis is adopted in the model to enhance network stability. To address the uncertainty of the neural network structure and of the initial weights, genetic algorithms are employed. The experimental results reveal that the precision of software risk analysis can be improved by using the enhanced neural network model.
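
    A sketch of the PCA-then-network idea, in which principal components remove multicollinearity among risk factors before the network is trained. The genetic-algorithm weight initialization used in the paper is omitted here, and the data are random placeholders:

```python
# Sketch of PCA-then-network: principal components remove multicollinearity
# among risk factors before training. The paper's genetic-algorithm weight
# initialization is omitted; the data below are synthetic placeholders.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 12))          # 12 risk-factor scores per project
X[:, 6:] = X[:, :6] + 0.1 * rng.normal(size=(120, 6))  # induce collinearity
y = X[:, :3].sum(axis=1) + 0.1 * rng.normal(size=120)  # synthetic risk level

model = make_pipeline(PCA(n_components=6),
                      MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000,
                                   random_state=0))
model.fit(X, y)
print("training R^2:", model.score(X, y))
```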

  17. TweezPal - Optical tweezers analysis and calibration software

    Science.gov (United States)

    Osterman, Natan

    2010-11-01

    Optical tweezers, a powerful tool for optical trapping, micromanipulation and force transduction, have in recent years become a standard technique commonly used in many research laboratories and university courses. Knowledge about the optical force acting on a trapped object can be gained only after a calibration procedure, which has to be performed (by an expert) for each type of trapped object. In this paper we present TweezPal, a user-friendly, standalone Windows software tool for optical tweezers analysis and calibration. Using TweezPal, the procedure can be performed in a matter of minutes even by non-expert users. The calibration is based on the Brownian motion of a particle trapped in a stationary optical trap, which is monitored using video or photodiode detection. The particle trajectory is imported into the software, which instantly calculates the position histogram, trapping potential, stiffness and anisotropy. Program summary: Program title: TweezPal. Catalogue identifier: AEGR_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGR_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 44 891. No. of bytes in distributed program, including test data, etc.: 792 653. Distribution format: tar.gz. Programming language: Borland Delphi. Computer: Any PC running Microsoft Windows. Operating system: Windows 95, 98, 2000, XP, Vista, 7. RAM: 12 Mbytes. Classification: 3, 4.14, 18, 23. Nature of problem: Quick, robust and user-friendly calibration and analysis of optical tweezers. The optical trap is calibrated from the trajectory of a trapped particle undergoing Brownian motion in a stationary optical trap (input data) using two methods. Solution method: Elimination of the experimental drift in position data. Direct calculation of the trap stiffness from the positional
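
    One standard way to calibrate a trap from such a Brownian trajectory is the equipartition theorem, which gives the stiffness per axis as k = kB*T / var(x). The sketch below uses a synthetic trajectory and does not claim to reproduce TweezPal's two specific methods:

```python
# Trap-stiffness calibration from a Brownian trajectory via equipartition:
# k = kB * T / var(x) per axis. Trajectory below is synthetic; this is a
# generic illustration, not TweezPal's implementation.
import numpy as np

kB = 1.380649e-23            # Boltzmann constant, J/K
T = 293.15                   # room temperature, K

# synthetic drift-free trajectory of a trapped bead, positions in metres
rng = np.random.default_rng(1)
x = rng.normal(scale=25e-9, size=100_000)     # ~25 nm positional spread

x = x - x.mean()             # remove residual offset/drift
k = kB * T / np.var(x)       # trap stiffness, N/m
print(f"stiffness: {k * 1e6:.2f} pN/um")      # 1 N/m = 1e6 pN/um
```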

  18. Improving Software Reliability Forecasting

    NARCIS (Netherlands)

    Burtsy, Bernard; Albeanu, Grigore; Boros, Dragos N.; Popentiu, Florin; Nicola, Victor

    1997-01-01

    This work investigates some methods for software reliability forecasting. A supermodel is presented as a suitable tool for the prediction of reliability in software project development. Also, time series forecasting for cumulative interfailure time is proposed and illustrated.

  19. XWAS: A Software Toolset for Genetic Data Analysis and Association Studies of the X Chromosome.

    Science.gov (United States)

    Gao, Feng; Chang, Diana; Biddanda, Arjun; Ma, Li; Guo, Yingjie; Zhou, Zilu; Keinan, Alon

    2015-01-01

    XWAS is a new software suite for the analysis of the X chromosome in association studies and similar genetic studies. The X chromosome plays an important role in human disease and traits of many species, especially those with sexually dimorphic characteristics. Special attention needs to be given to its analysis due to the unique inheritance pattern, which leads to analytical complications that have resulted in the majority of genome-wide association studies (GWAS) either not considering X or mishandling it with toolsets that had been designed for non-sex chromosomes. We hence developed XWAS to fill the need for tools that are specially designed for analysis of X. Following extensive, stringent, and X-specific quality control, XWAS offers an array of statistical tests of association, including: 1) the standard test between a SNP (single nucleotide polymorphism) and disease risk, including after first stratifying individuals by sex, 2) a test for a differential effect of a SNP on disease between males and females, 3) motivated by X-inactivation, a test for higher variance of a trait in heterozygous females as compared with homozygous females, and 4) for all tests, a version that allows for combining evidence from all SNPs across a gene. We applied the toolset analysis pipeline to 16 GWAS datasets of immune-related disorders and 7 risk factors of coronary artery disease, and discovered several new X-linked genetic associations. XWAS will provide the tools and incentive for others to incorporate the X chromosome into GWAS and similar studies in any species with an XX/XY system, hence enabling discoveries of novel loci implicated in many diseases and in their sexual dimorphism. PMID:26268243

  20. Analysis of Software Delivery Process Shortcomings and Architectural Pitfalls

    OpenAIRE

    Patwardhan, Amol

    2016-01-01

    This paper highlights the common pitfalls of overcomplicating the software architecture, development and delivery process by examining two enterprise-level web application products built using the Microsoft .NET framework. The aim of this paper is to identify, discuss and analyze architectural, development and deployment issues and to learn lessons using real-world examples from the chosen software products as case studies.

  1. Multi-criteria decision analysis methods and software

    CERN Document Server

    Ishizaka, Alessio

    2013-01-01

    This book presents an introduction to MCDA followed by more detailed chapters about each of the leading methods used in this field. Comparison of methods and software is also featured to enable readers to choose the most appropriate method needed in their research. Worked examples as well as the software featured in the book are available on an accompanying website.

  2. IFDOTMETER : A New Software Application for Automated Immunofluorescence Analysis

    NARCIS (Netherlands)

    Rodriguez-Arribas, Mario; Pizarro-Estrella, Elisa; Gomez-Sanchez, Ruben; Yakhine-Diop, S. M. S.; Gragera-Hidalgo, Antonio; Cristo, Alejandro; Bravo-San Pedro, Jose M.; Gonzalez-Polo, Rosa A.; Fuentes, Jose M.

    2016-01-01

    Most laboratories interested in autophagy use different imaging software for managing and analyzing heterogeneous parameters in immunofluorescence experiments (e.g., LC3-puncta quantification and determination of the number and size of lysosomes). One solution would be software that works on a user'

  3. Prototype Software for Automated Structural Analysis of Systems

    DEFF Research Database (Denmark)

    Jørgensen, A.; Izadi-Zamanabadi, Roozbeh; Kristensen, M.

    2004-01-01

    In this paper we present a prototype software tool that is developed to analyse the structural model of automated systems in order to identify redundant information that is hence utilized for Fault Detection and Isolation (FDI) purposes. The dedicated algorithms in this software tool use a tri...

  4. A pattern framework for software quality assessment and tradeoff analysis

    NARCIS (Netherlands)

    Folmer, Eelke; Boscht, Jan

    2007-01-01

    The earliest design decisions often have a significant impact on software quality and are the most costly to revoke. One of the challenges in architecture design is to reduce the frequency of retrofit problems in software designs; not being able to improve the quality of a system cost effectively, a

  5. The PRISM Benchmark Suite

    OpenAIRE

    Kwiatkowsa, Marta; Norman, Gethin; Parker, David

    2012-01-01

    We present the PRISM benchmark suite: a collection of probabilistic models and property specifications, designed to facilitate testing, benchmarking and comparisons of probabilistic verification tools and implementations.

  6. Instant Spring Tool Suite

    CERN Document Server

    Chiang, Geoff

    2013-01-01

    Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks. A tutorial guide that walks you through how to use the features of Spring Tool Suite using well defined sections for the different parts of Spring.Instant Spring Tool Suite is for novice to intermediate Java developers looking to get a head-start in enterprise application development using Spring Tool Suite and the Spring framework. If you are looking for a guide for effective application development using Spring Tool Suite, then this book is for you.

  7. Software para análise quantitativa da deglutição Swallowing quantitative analysis software

    Directory of Open Access Journals (Sweden)

    André Augusto Spadotto

    2008-02-01

    Full Text Available OBJECTIVE: To present software that allows a detailed analysis of swallowing dynamics. MATERIALS AND METHODS: The sample included ten stroke patients (six male and four female), with a mean age of 57.6 years. Swallowing videofluoroscopy was performed and the images were digitized on a microcomputer for subsequent analysis of the pharyngeal transit time, measured with both a chronometer and the software. RESULTS: The average pharyngeal swallowing transit time differed between the two measurement methods (chronometer and software). CONCLUSION: This software is a tool for the analysis of swallowing time and speed parameters, allowing a better understanding of swallowing dynamics, with benefits both for the clinical approach to patients with oropharyngeal dysphagia and for scientific research purposes.

  8. Software safety analysis on the model specified by NuSCR and SMV input language at requirements phase of software development life cycle using SMV

    International Nuclear Information System (INIS)

    The safety-critical software process is composed of a development process, a verification and validation (V and V) process, and a safety analysis process. The safety analysis process has often been treated as an additional process and is not found in a conventional software process. However, software safety analysis (SSA) is required if software is applied to a safety system, and the SSA shall be performed independently for the safety software throughout the software development life cycle (SDLC). Of all the phases in software development, requirements engineering is generally considered to play the most critical role in determining the overall software quality. NASA data demonstrate that nearly 75% of failures found in operational software were caused by errors in the requirements. The verification process in the requirements phase checks the correctness of the software requirements specification, and the safety analysis process analyzes the safety-related properties in detail. In this paper, a method for safety analysis at the requirements phase of the software development life cycle using the symbolic model verifier (SMV) is proposed. Hazards are discovered by hazard analysis and, in order to use SMV for the safety analysis, the safety-related properties are expressed in computation tree logic (CTL).
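
    For illustration, safety-related properties of the kind checked with SMV are typically stated in CTL as invariance (AG) and inevitability (AF) formulas. The signal names below are hypothetical, not taken from the paper:

```latex
% Typical safety-related CTL properties of the kind checked with SMV.
% Signal names (trip_condition, trip_output, blocked) are hypothetical:
% "whenever the trip condition holds, a trip output eventually follows,
%  and the trip output is never active while blocked."
\mathrm{AG}\,\bigl(\mathit{trip\_condition} \rightarrow \mathrm{AF}\,\mathit{trip\_output}\bigr)
\qquad
\mathrm{AG}\,\neg\bigl(\mathit{trip\_output} \wedge \mathit{blocked}\bigr)
```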

  9. Design aspects of the Alpha Repository. V. Suite selection and cost analysis of excavation/hauling systems

    International Nuclear Information System (INIS)

    The various types of haulage and excavation equipment that may be suitable for use in the development and excavation of the Alpha repository are described, with discussion of the advantages, disadvantages, expected costs, availability, and special features of each. The various equipment suites are delineated, and the costs of mining and transportation of the salt are presented and discussed. Individual manufacturers contacted and equipment considered are listed. Most of the equipment is "off-the-shelf"; however, some manufacturers that do custom work were contacted because of their expertise in salt mining equipment. The costs of custom equipment are comparable to those for standard equipment.

  10. Suite of Benchmark Tests to Conduct Mesh-Convergence Analysis of Nonlinear and Non-constant Coefficient Transport Codes

    Science.gov (United States)

    Zamani, K.; Bombardelli, F. A.

    2014-12-01

    Verification of geophysics codes is imperative to avoid serious academic as well as practical consequences. In cases where access to a given source code is not possible, the Method of Manufactured Solutions (MMS) cannot be employed in code verification. In contrast, employing the Method of Exact Solutions (MES) has several practical advantages. In this research, we first provide four new one-dimensional analytical solutions designed for code verification; these solutions are able to uncover particular imperfections of advection-diffusion-reaction solvers, such as in nonlinear advection, diffusion or source terms, as well as non-constant-coefficient equations. After that, we provide a solution of Burgers' equation in a novel setup. The proposed solutions satisfy the continuity of mass for the ambient flow, which is a crucial factor for coupled hydrodynamics-transport solvers. We then use the derived analytical solutions for code verification. To clarify gray-literature issues in the verification of transport codes, we designed a comprehensive test suite to uncover any imperfection in transport solvers via a hierarchical increase in the level of test complexity. The test suite includes hundreds of unit tests and system tests to check individual portions of the code. The checks start with a simple case of unidirectional advection, proceed to bidirectional advection and tidal flow, and build up to nonlinear cases. We design tests to check nonlinearity in velocity, dispersivity and reactions. The concealing effect of scales (Peclet and Damkohler numbers) on the mesh-convergence study and appropriate remedies are also discussed. For cases in which appropriate benchmarks for a mesh-convergence study are not available, we utilize symmetry. Auxiliary subroutines for automation of the test suite and report generation were designed. All in all, the test package is not only a robust tool for code verification but it also provides comprehensive
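
    The core computation of such a mesh-convergence study is the observed order of accuracy, obtained from errors against the exact (MES) solution on successively refined grids: p = log(e_coarse/e_fine)/log(r) for refinement ratio r. A sketch with placeholder error values:

```python
# Observed order of accuracy from errors against an exact (MES) solution
# on successively refined grids: p = log(e_coarse/e_fine) / log(r).
# The error values below are placeholders.
import math

errors = [4.0e-2, 1.1e-2, 2.9e-3, 7.4e-4]  # L2 errors, grid spacing halved each time
r = 2.0                                    # refinement ratio

for e_coarse, e_fine in zip(errors, errors[1:]):
    p = math.log(e_coarse / e_fine) / math.log(r)
    print(f"observed order: {p:.2f}")      # should approach the formal order
```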

  11. Comparative analysis of results between CASMO, MCNP and Serpent for a suite of Benchmark problems on BWR reactors; Analisis comparativo de resultados entre CASMO, MCNP y SERPENT para una suite de problemas Benchmark en reactores BWR

    Energy Technology Data Exchange (ETDEWEB)

    Xolocostli M, J. V.; Vargas E, S.; Gomez T, A. M. [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico); Reyes F, M. del C.; Del Valle G, E., E-mail: vicente.xolocostli@inin.gob.mx [IPN, Escuela Superior de Fisica y Matematicas, UP - Adolfo Lopez Mateos, Edif. 9, 07738 Mexico D. F. (Mexico)

    2014-10-15

    In this paper a comparison is made between the CASMO-4, MCNP6 and Serpent codes in analyzing a suite of benchmark problems for BWR reactors. The benchmark problem consists of two different geometries: a BWR pin cell and a BWR fuel assembly. To facilitate the study of reactor physics in the fuel pin, its nuclear characteristics are provided in detail, such as the burnup dependence, the reactivity of selected nuclides, etc. With respect to the fuel assembly, the presented results concern the infinite multiplication factor for different burnup steps and different void conditions. The analysis of this set of benchmark problems provides comprehensive test problems for the next generation of BWR fuels with highly extended burnup. It is important to note that the purpose of this comparison is to validate the methodologies used in modeling different operating conditions, should another BWR assembly be the case. The results will lie within a range of some uncertainty that does not depend on the code used. The Escuela Superior de Fisica y Matematicas of the Instituto Politecnico Nacional (IPN, Mexico) has accumulated some experience in using Serpent, due to the potential of this code over other commercial codes such as CASMO and MCNP. The results obtained for the infinite multiplication factor are encouraging and motivate continuing the studies with the generation of the cross sections (XS) of a core so that, as a next step, a corresponding nuclear data library can be constructed for use by codes developed as part of the development project of the Mexican nuclear reactor analysis platform, AZTLAN. (Author)

  12. Automatic test suite evolution

    OpenAIRE

    Mirzaaghaei, Mehdi; Pezzè, Mauro

    2013-01-01

    Software testing is one of the most common approaches to verifying software systems. Despite the many automated techniques proposed in the literature, test cases are often generated manually. When a software system evolves during development and maintenance to accommodate requirement changes, bug fixes, or functionality extensions, test cases may become obsolete, and software developers need to evolve them to verify the new version of the software system. Due to time pressure and effort requir...

  13. Development of the software dead time methodology for the 4πβ-γ software coincidence system analysis program

    International Nuclear Information System (INIS)

    The Laboratorio de Metrologia Nuclear (LMN, Nuclear Metrology Laboratory) at IPEN-CNEN/SP, Sao Paulo, Brazil, developed a new Software Coincidence System (SCS) for 4πβ-γ radioisotope standardization. SCS is composed of the data acquisition hardware, for the coincidence data recording, and the coincidence data analysis program, which calculates the radioactive activity of the target sample. Due to the intrinsic signal sampling characteristics of the hardware, multiple undesired data recordings occur from a single saturated pulse; pulse pileup also leads to bad data recordings. As the beta counting rates are much greater than the gamma ones, owing to the high beta detection efficiencies of the 4π geometry, the beta counts increase significantly because of multiple pulse recordings, resulting in a corresponding increase in the calculated activity value. To minimize the effect of such bad recordings, a software dead time value was introduced in the coincidence analysis program, under development at LMN, discarding multiple recordings due to pulse pileup or saturation. This work presents the methodology developed to determine the optimal software dead time value for attaining more accurate results, and discusses the results, pointing to possibilities for software improvement. (author)
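
    The essential effect of such a software dead time can be sketched as a non-paralyzable filter over the recorded event timestamps: after each accepted event, anything recorded within the dead time tau is discarded as a multiple recording. The timestamps are hypothetical, and whether LMN's filter is paralyzable or non-paralyzable is not stated in the abstract:

```python
# Sketch of a non-paralyzable software dead time: after each accepted
# event, events recorded within tau are discarded, suppressing multiple
# recordings from a single saturated or piled-up pulse. Timestamps are
# hypothetical; LMN's actual filter behaviour is not specified here.
def apply_software_dead_time(timestamps_us, tau_us):
    accepted = []
    last = float("-inf")
    for t in sorted(timestamps_us):
        if t - last >= tau_us:
            accepted.append(t)
            last = t
        # events inside the dead-time window are multiple recordings: drop
    return accepted

events = [10.0, 10.2, 10.3, 55.0, 55.1, 120.0]        # one pulse, several records
print(apply_software_dead_time(events, tau_us=2.0))   # [10.0, 55.0, 120.0]
```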

  14. Software Aging Analysis of Web Server Using Neural Networks

    Directory of Open Access Journals (Sweden)

    G.Sumathi

    2012-05-01

    Full Text Available Software aging is a phenomenon that refers to progressive performance degradation, transient failures or even crashes in long-running software systems such as web servers. It mainly occurs due to the deterioration of operating system resources, fragmentation and numerical error accumulation. A primitive method to fight software aging is software rejuvenation: a proactive fault management technique aimed at cleaning up the system's internal state to prevent the occurrence of more severe crash failures in the future. It involves occasionally stopping the running software, cleaning its internal state and restarting it. An optimized schedule for performing software rejuvenation has to be derived in advance, because a long-running application cannot be brought down arbitrarily, as this may waste resources. This paper proposes a method to derive an accurate and optimized schedule for rejuvenation of a web server (Apache) by using a Radial Basis Function (RBF) based feed-forward neural network, a variant of artificial neural networks (ANN). Aging indicators are obtained through an experimental setup involving the Apache web server and clients, and act as input to the neural network model. This method is better than existing ones because the use of RBFs leads to better accuracy and faster convergence.

  15. Software Aging Analysis of Web Server Using Neural Networks

    Directory of Open Access Journals (Sweden)

    G.Sumathi

    2012-06-01

    Full Text Available Software aging is a phenomenon that refers to progressive performance degradation, transient failures or even crashes in long-running software systems such as web servers. It mainly occurs due to the deterioration of operating system resources, fragmentation and numerical error accumulation. A primitive method to fight software aging is software rejuvenation: a proactive fault management technique aimed at cleaning up the system's internal state to prevent the occurrence of more severe crash failures in the future. It involves occasionally stopping the running software, cleaning its internal state and restarting it. An optimized schedule for performing software rejuvenation has to be derived in advance, because a long-running application cannot be brought down arbitrarily, as this may waste resources. This paper proposes a method to derive an accurate and optimized schedule for rejuvenation of a web server (Apache) by using a Radial Basis Function (RBF) based feed-forward neural network, a variant of artificial neural networks (ANN). Aging indicators are obtained through an experimental setup involving the Apache web server and clients, and act as input to the neural network model. This method is better than existing ones because the use of RBFs leads to better accuracy and faster convergence.

  16. Development of a gamma ray spectrometry software for neutron activation analysis using the open source concept

    International Nuclear Information System (INIS)

    This study developed specific software, named SAANI (Instrumental Neutron Activation Analysis Software), for gamma-ray spectrum analysis by researchers of the Neutron Activation Laboratory (LAN). The LAN laboratory of the Institute for Research and Nuclear Energy (IPEN-CNEN/SP) uses a multielemental analytical technique based on irradiating a sample in a flux of neutrons from a nuclear reactor, which induces radioactivity; the sample is then placed in a gamma-ray spectrometer to obtain its spectrum. In keeping with the free-software philosophy, this software will replace the existing VISPECT/VERSION 2 package. The new software's main features are a friendlier interface, an easier standardization procedure for LAN staff and researchers, support for plug-in technology, multi-platform operation and free code. The software was developed using the Python programming language, the Trolltech Qt graphics library and some of its scientific extensions. Preliminary results obtained with SAANI were compared to those from the existing software and were considered good, although some accuracy errors appeared during implementation. SAANI has been installed on selected computers for routine analysis in order to verify its robustness, accuracy and usability. (author)

  17. Hydraulic network analysis with the software package NESEI

    International Nuclear Information System (INIS)

    The software package NESEI allows steady-state and time-step-history hydraulic analysis of complex water supply networks. It runs on a PC under MS-DOS, WINDOWS or WIN-OS/2. A menu-guided user interface with a context-specific help system means that DOS expertise is not required. Contrary to the topological restrictions imposed by the Hardy-Cross method, our hydraulic program can deal with networks of any topology. Our mathematical algorithm (the successive over-relaxation method) uses a numerical table representation of the Moody diagram instead of a specific analytical formulation and is therefore not limited to a certain flow regime. Armatures such as valves, throttles, backflow-preventing flaps, forward and backward pressure regulators and flow control valves may be simulated. Memory requirements and computing time increase approximately linearly with the number of nodes: a maximum of some 3000 nodes may be simulated with 640 kB RAM, and over 16000 nodes with 8 MB. Operating modes for steady-state computation, time-step operation histories with reservoir accounting and auto-dimensioning of pipe diameters are available. For the digitization of networks, data management and graphical presentation, commercial products such as GIS-ARC/INFO, GISCAD and GISVIEW or AUTOCAD may be used; data exchange with our package is performed via text and DBASE files. Rapid generation of instructive graphical result presentations via menu selections permits high productivity in the hydraulic analysis and optimisation of supply systems. Auxiliary utility programs also allow small stand-alone installations without GIS or CAD support. A water pressure logger, which may be connected at hydrants for extended periods, was developed to allow economic calibration of a hydraulic network model by field measurements. It is microprocessor-controlled, programmable via a menu form by PC and allows selective pre-event recording according to selectable triggering
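
    The table-lookup idea is easy to picture: instead of, say, an analytic friction formula, the friction factor comes from interpolating a digitised Moody diagram, so one solver covers all flow regimes. A minimal sketch in Python, with an illustrative made-up grid rather than the NESEI table:

      import numpy as np

      # Illustrative Moody-diagram grid points (made up, for one roughness value).
      reynolds_grid = np.array([1e3, 4e3, 1e4, 1e5, 1e6, 1e7])
      friction_grid = np.array([0.064, 0.040, 0.031, 0.018, 0.012, 0.008])

      def friction_factor(reynolds):
          # Interpolate in log10(Re), where the Moody curves are close to piecewise linear.
          return np.interp(np.log10(reynolds), np.log10(reynolds_grid), friction_grid)

      # Darcy-Weisbach head loss of a single pipe: h = f (L/D) v^2 / (2g).
      def head_loss(flow_m3s, length_m, diameter_m, nu=1e-6, g=9.81):
          area = np.pi * diameter_m ** 2 / 4
          v = flow_m3s / area
          re = v * diameter_m / nu
          return friction_factor(re) * (length_m / diameter_m) * v ** 2 / (2 * g)

      print(f"head loss: {head_loss(0.01, length_m=100.0, diameter_m=0.1):.3f} m")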

  18. A Survey on Test Suite Reduction Frameworks and Tools

    OpenAIRE

    Khan, Saif Ur Rehman; Lee, Sai Peck; Ahmad, Raja Wasim; Akhunzada, Adnan; Chang, Victor

    2016-01-01

    Software testing is a widely accepted practice that ensures the quality of a System Under Test (SUT). However, the gradual increase in test suite size demands a high portion of the testing budget and time. Test Suite Reduction (TSR) is considered a potential approach to dealing with the test-suite-size problem. Moreover, complete automation support is highly recommended for software testing to adequately meet the challenges of a resource-constrained testing environment. Several TSR frameworks an...

  19. Problem of Office Suite Training at the University

    OpenAIRE

    Natalia A. Nastashchuk; Svetlana S. Litvinova; Tatiana S. Moshkareva

    2013-01-01

    The paper considers the problem of office suite applications training, caused by the rapid change of their versions, the variety of software developers and the rapid development of software and hardware platforms. The content of office suite applications training, based on the system of office suite notions, their basic functionality and standards of information technology development (OpenDocument Format Standard, ISO 26300-200X), is presented.

  20. DEVELOPMENT OF EDUCATIONAL SOFTWARE FOR STRESS ANALYSIS OF AN AIRCRAFT WING

    Directory of Open Access Journals (Sweden)

    TAZKERA SADEQ

    2012-06-01

    Full Text Available Stress analysis software with a MATLAB graphical user interface (GUI) has been developed. The software can be used to estimate the load on a wing and to compute the stresses at any point along the span of the wing of a given aircraft. The generalized formulation allows stress analysis even for a multi-spar (multi-cell) wing. The software is expected to be a useful tool for effective teaching and learning in courses on aircraft structures and aircraft structural design.

  1. The risks analysis like a practice of secure software development : A revision of models and methodologies

    OpenAIRE

    Carrillo Verdún, José; Gasca Hurtado, Gloria; Tovar Caro, Edmundo; Vega Zepeda, Vianca

    2006-01-01

    This document presents and analyzes risk analysis across the whole software development life cycle, framed as one of the recommended practices for secure software development. It presents and compares a set of risk analysis methodologies and strategies, taking as criteria classifications proposed by different authors and the objectives those methodologies pursue, in order to orient them toward evaluation criteria for secure software development.

  2. Opportunities and Challenges Applying Functional Data Analysis to the Study of Open Source Software Evolution

    OpenAIRE

    Stewart, Katherine J.; Darcy, David P.; Daniel, Sherae L.

    2006-01-01

    This paper explores the application of functional data analysis (FDA) as a means to study the dynamics of software evolution in the open source context. Several challenges in analyzing the data from software projects are discussed, an approach to overcoming those challenges is described, and preliminary results from the analysis of a sample of open source software (OSS) projects are provided. The results demonstrate the utility of FDA for uncovering and categorizing multiple distinct patterns...

  3. Experimental analysis of specification language impact on NPP software diversity

    International Nuclear Information System (INIS)

    When redundancy and diversity are applied in an NPP digital computer system, diversification of the system software may be a critical point for the dependability of the entire system. As a means of enhancing software diversity, specification language diversity is suggested in this study. We set up a simple hypothesis about the impact of the specification language on common errors, and performed an experiment based on an NPP protection system application. The experimental results showed that this hypothesis could be justified and that specification language diversity is effective in overcoming the software common-mode failure problem

  4. Validation suite for MCNP

    Energy Technology Data Exchange (ETDEWEB)

    Mosteller, R. D. (Russell D.)

    2002-01-01

    Two validation suites, one for criticality and another for radiation shielding, have been defined and tested for the MCNP Monte Carlo code. All of the cases in the validation suites are based on experiments, so that calculated and measured results can be compared in a meaningful way. The cases in the validation suites are described, and results from those cases are discussed. For several years, the distribution package for the MCNP Monte Carlo code [1] has included an installation test suite to verify that MCNP has been installed correctly. However, the cases in that suite were constructed primarily to test options within the code and to execute quickly. Consequently, they do not produce well-converged answers, and many of them are physically unrealistic. To remedy these deficiencies, validation suites are being defined and tested for specific types of applications. All of the cases in the validation suites are based on benchmark experiments, so the measured results are reliable and quantifiable, and calculated results can be compared with them meaningfully. Currently, validation suites exist for criticality and radiation-shielding applications.
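
    The kind of comparison such a suite enables is simple to sketch: calculated-to-experimental (C/E) ratios per benchmark case, flagged against the measurement uncertainty. A Python illustration with made-up numbers, not actual MCNP results:

      # Made-up benchmark entries: (calculated k-eff, measured k-eff, measurement sigma).
      cases = {
          "bare-sphere-a": (0.9987, 1.0000, 0.0010),
          "reflected-b":   (1.0031, 1.0000, 0.0030),
          "lattice-c":     (0.9952, 1.0000, 0.0020),
      }
      for name, (calc, expt, sigma) in cases.items():
          n_sigma = abs(calc - expt) / sigma          # deviation in measurement sigmas
          flag = "ok" if n_sigma <= 3 else "INVESTIGATE"
          print(f"{name:14s} C/E = {calc / expt:.4f}  ({n_sigma:.1f} sigma)  {flag}")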

  5. Suite versus composite statistics

    Science.gov (United States)

    Balsillie, J.H.; Tanner, W.F.

    1999-01-01

    Suite and composite methodologies, two statistically valid approaches for producing descriptive statistical measures, are investigated for sample groups representing a probability distribution in which each sample is itself a probability distribution. Suite and composite means (first-moment measures) are always equivalent. Composite standard deviations (second-moment measures) are always larger than suite standard deviations. Suite and composite values for higher-moment measures have more complex relationships; they are very seldom equivalent and normally yield statistically significant but different results. Multiple samples are preferable to single samples (including composites) because they permit the investigator to examine sample-to-sample variability. These and other relationships for suite and composite probability distribution analyses are investigated and reported using granulometric data.
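
    A small numerical illustration (Python; the data are synthetic, not the paper's granulometric samples) makes the first two relationships concrete: with equal-sized samples the suite mean (the mean of the per-sample means) equals the composite mean (the mean of the pooled observations), while the composite standard deviation exceeds the suite standard deviation because it also absorbs sample-to-sample shifts in the mean.

      import numpy as np

      rng = np.random.default_rng(0)
      # Three equal-sized samples with different means (synthetic data).
      samples = [rng.normal(loc=mu, scale=1.0, size=200) for mu in (9.0, 10.0, 11.0)]

      suite_mean = np.mean([s.mean() for s in samples])  # average of per-sample means
      suite_std = np.mean([s.std() for s in samples])    # average of per-sample stds

      pooled = np.concatenate(samples)                   # composite: pool everything
      print(f"means: suite={suite_mean:.3f} composite={pooled.mean():.3f}")  # equal
      print(f"stds:  suite={suite_std:.3f}  composite={pooled.std():.3f}")   # composite larger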

  6. A Software Risk Analysis Model Using Bayesian Belief Network

    Institute of Scientific and Technical Information of China (English)

    Yong Hu; Juhua Chen; Mei Liu; Yang Yun; Junbiao Tang

    2006-01-01

    The uncertainty during software project development often brings huge risks to contractors and clients. If we can find an effective method to predict the cost and quality of software projects, based on facts such as the project's character and the two sides' cooperating capability at the beginning of the project, we can reduce the risk. A Bayesian Belief Network (BBN) is a good tool for analyzing uncertain consequences, but it is difficult to produce a precise network structure and conditional probability table. In this paper, we build the network structure by the Delphi method for conditional probability table learning, and continuously update the probability table and the nodes' confidence levels according to application cases, which gives the evaluation network learning abilities and lets it evaluate the software development risk of an organization more accurately. The paper also introduces the EM algorithm, which enhances the ability to handle hidden nodes caused by variant software projects.

  7. BEANS - a software package for distributed Big Data analysis

    CERN Document Server

    Hypki, Arkadiusz

    2016-01-01

    BEANS is a new web-based, easy to install and maintain tool for storing and analysing massive amounts of data in a distributed way. It provides a clear interface for querying, filtering, aggregating, and plotting data from an arbitrary number of datasets. Its main purpose is to simplify the process of storing, examining and finding new relations in so-called Big Data. BEANS was created in answer to the growing need of the astronomical community for a versatile tool to store, analyse and compare complex astrophysical numerical simulations with observations (e.g. simulations of the Galaxy or star clusters with the Gaia archive). However, the software was built in a general form and is ready for use in any other research field or open source software project.

  8. Trends in applied econometrics software development 1985-2008, an analysis of Journal of Applied Econometrics research articles, software reviews, data and code

    OpenAIRE

    Ooms, M.

    2008-01-01

    Trends in software development for applied econometrics emerge from an analysis of the research articles and software reviews of the Journal of Applied Econometrics, which has appeared since 1986. The data and code archive of the journal provides more specific information on software use for applied econometrics since 1995. GAUSS, Stata, MATLAB and Ox have been the most important software packages since 2001. I compare these higher-level programming languages and R in somewhat more detail. An increasing numbe...

  9. Army-NASA aircrew/aircraft integration program: Phase 4 A(3)I Man-Machine Integration Design and Analysis System (MIDAS) software detailed design document

    Science.gov (United States)

    Banda, Carolyn; Bushnell, David; Chen, Scott; Chiu, Alex; Constantine, Betsy; Murray, Jerry; Neukom, Christian; Prevost, Michael; Shankar, Renuka; Staveland, Lowell

    1991-01-01

    The Man-Machine Integration Design and Analysis System (MIDAS) is an integrated suite of software components that constitutes a prototype workstation to aid designers in applying human factors principles to the design of complex human-machine systems. MIDAS is intended to be used at the very early stages of conceptual design to provide an environment wherein designers can use computational representations of the crew station and operator, instead of hardware simulators and man-in-the-loop studies, to discover problems and ask 'what if' questions regarding the projected mission, equipment, and environment. This document is the Software Product Specification for MIDAS. Introductory descriptions of the processing requirements, hardware/software environment, structure, I/O, and control are given in the main body of the document for the overall MIDAS system, with detailed discussion of the individual modules included in Annexes A-J.

  10. A Comparative Analysis of Software Engineering with Knowledge Engineering

    OpenAIRE

    J. F. Vijay; C. Manoharan

    2010-01-01

    Problem statement: Software engineering is not only a technical discipline of its own. It is also a problem domain where technologies coming from other disciplines are relevant and can play an important role. One important example is knowledge engineering, a term that we use in the broad sense to encompass artificial intelligence, computational intelligence, knowledge bases, data mining and machine learning. We see a number of typical software development issues that can benefit from these di...

  11. Analysis of Whole Transcriptome Sequencing Data: Workflow and Software.

    Science.gov (United States)

    Yang, In Seok; Kim, Sangwoo

    2015-12-01

    RNA is a polymeric molecule implicated in various biological processes, such as the coding, decoding, regulation, and expression of genes. Numerous studies have examined RNA features using whole transcriptome sequencing (RNA-seq) approaches. RNA-seq is a powerful technique for characterizing and quantifying the transcriptome and accelerates the development of bioinformatics software. In this review, we introduce routine RNA-seq workflow together with related software, focusing particularly on transcriptome reconstruction and expression quantification. PMID:26865842

  12. Learning from Experience in Software Development: A Multilevel Analysis

    OpenAIRE

    Wai Fong Boh; Slaughter, Sandra A.; J. Alberto Espinosa

    2007-01-01

    This study examines whether individuals, groups, and organizational units learn from experience in software development and whether this learning improves productivity. Although prior research has found the existence of learning curves in manufacturing and service industries, it is not clear whether learning curves also apply to knowledge work like software development. We evaluate the relative productivity impacts from accumulating specialized experience in a system, diversified experience i...

  13. Analysis of Topology Poisoning Attacks in Software-Defined Networking

    OpenAIRE

    Thanh Bui, Tien

    2015-01-01

    Software-defined networking (SDN) is an emerging architecture with a great potential to foster the development of modern networks. By separating the control plane from the network devices and centralizing it at a software-based controller, SDN provides network-wide visibility and flexible programmability to network administrators. However, the security aspects of SDN are not yet fully understood. For example, while SDN is resistant to some topology poisoning attacks in which the attacker misl...

  14. Certification of CFD heat transfer software for turbine blade analysis

    Science.gov (United States)

    Jordan, William A.

    2004-01-01

    Accurate modeling of heat transfer effects is a critical component of the work of the Turbine Branch of the Turbomachinery and Propulsion Systems Division. Being able to adequately predict and model heat flux, coolant flows, and peak temperatures is necessary for the analysis of high-pressure turbine blades. To that end, the primary goal of my internship this summer will be to certify the reliability of the CFD program GlennHT for the purpose of turbine blade heat transfer analysis. GlennHT is currently in use by the engineers in the Turbine Branch, who use the FORTRAN 77 version of the code for analysis. The program, however, has been updated to a FORTRAN 90 version, which is more robust than the older code. In order for the new code to be distributed for use, its reliability must first be certified. Over the course of my internship I will create and run test cases using the FORTRAN 90 version of GlennHT and compare the results to older cases which are known to be accurate. If the results of the new code match those of the sample cases, then the newer version will be one step closer to certification for distribution. In order to complete these tasks, it will first be necessary to become familiar with operating a number of other programs. Among them are GridPro, which is used to create a grid mesh around a blade geometry, and FieldView, whose purpose is to graphically display the results from the GlennHT program. Once enough familiarity is established with these programs to render them useful, the work of creating and running test scenarios will begin. The work is additionally complicated by a transition in computer hardware. Most of the working computers in the Turbine Branch are Silicon Graphics machines, which will soon be replaced by Linux PCs. My project is one of the first to make use of the new PCs. The change in system architecture, however, has created several software-related issues which have greatly increased the time and effort investments required by the project

  15. Evaluation of Distribution Analysis Software for DER Applications

    Energy Technology Data Exchange (ETDEWEB)

    Staunton, RH

    2003-01-23

    unstoppable. In response, energy providers will be forced to both fully acknowledge the trend and plan for accommodating DER [3]. With bureaucratic barriers [4], lack of time/resources, tariffs, etc. still seen in certain regions of the country, changes still need to be made. Given continued technical advances in DER, the time is fast approaching when the industry, nation-wide, must not only accept DER freely but also provide or review in-depth technical assessments of how DER should be integrated into and managed throughout the distribution system. Characterization studies are needed to fully understand how both the utility system and DER devices themselves will respond to all reasonable events (e.g., grid disturbances, faults, rapid growth, diverse and multiple DER systems, large reactive loads). Some of this work has already begun as it relates to operation and control of DER [5] and microturbine performance characterization [6,7]. One of the most urgently needed tools that can provide these types of analyses is a distribution network analysis program in combination with models for various DER. Together, they can be used for (1) analyzing DER placement in distribution networks and (2) helping to ensure that adequate transmission reliability is maintained. Surveys of the market show products that represent a partial match to these needs; specifically, software that has been developed to plan electrical distribution systems and analyze reliability (in a near total absence of DER). The first part of this study (Sections 2 and 3 of the report) looks at a number of these software programs and provides both summary descriptions and comparisons. The second part of this study (Section 4 of the report) considers the suitability of these analysis tools for DER studies. It considers steady state modeling and assessment work performed by ORNL using one commercially available tool on feeder data provided by a southern utility. Appendix A provides a technical report on the results of

  16. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Directory of Open Access Journals (Sweden)

    Pakarinen Jyri

    2010-01-01

    Full Text Available Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model can be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although separate algorithms for nonlinear distortion analysis do exist, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in the MATLAB language, but the analysis tool can also run as a standalone program, so the user does not need to have MATLAB installed in order to perform the analysis.
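
    One of the simplest such distortion measures is easy to sketch. The Python snippet below (an assumed illustration, not part of the toolkit) drives a stand-in nonlinearity with a sine and computes total harmonic distortion (THD) from the FFT magnitudes at the harmonic bins:

      import numpy as np

      fs, f0, n = 48000, 1000, 48000               # sample rate, test tone (Hz), length
      t = np.arange(n) / fs
      y = np.tanh(3 * np.sin(2 * np.pi * f0 * t))  # hypothetical distorting "device"

      spectrum = np.abs(np.fft.rfft(y * np.hanning(n)))
      bin0 = round(f0 * n / fs)                    # bin of the fundamental (exact here)
      harmonics = [spectrum[k * bin0] for k in range(2, 10)]
      thd = np.sqrt(np.sum(np.square(harmonics))) / spectrum[bin0]
      print(f"THD = {100 * thd:.2f} %")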

  17. Progress in Addressing DNFSB Recommendation 2002-1 Issues: Improving Accident Analysis Software Applications

    Energy Technology Data Exchange (ETDEWEB)

    VINCENT, ANDREW

    2005-04-25

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 (''Quality Assurance for Safety-Related Software'') identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls to prevent or mitigate potential accidents. Over the last year, DOE has begun several processes and programs as part of the Implementation Plan commitments, and in particular, has made significant progress in addressing several sets of issues particularly important in the application of software for performing hazard and accident analysis. The work discussed here demonstrates that through these actions, Software Quality Assurance (SQA) guidance and software tools are available that can be used to improve resulting safety analysis. Specifically, five of the primary actions corresponding to the commitments made in the Implementation Plan to Recommendation 2002-1 are identified and discussed in this paper. Included are the web-based DOE SQA Knowledge Portal and the Central Registry, guidance and gap analysis reports, electronic bulletin board and discussion forum, and a DOE safety software guide. These SQA products can benefit DOE safety contractors in the development of hazard and accident analysis by precluding inappropriate software applications and utilizing best practices when incorporating software results to safety basis documentation. The improvement actions discussed here mark a beginning to establishing stronger, standard-compliant programs, practices, and processes in SQA among safety software users, managers, and reviewers throughout the DOE Complex. Additional effort is needed, however, particularly in: (1) processes to add new software applications to the DOE Safety Software Toolbox; (2) improving the effectiveness of software issue communication; and (3) promoting a safety software quality assurance culture.

  18. Progress in Addressing DNFSB Recommendation 2002-1 Issues: Improving Accident Analysis Software Applications

    International Nuclear Information System (INIS)

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 (''Quality Assurance for Safety-Related Software'') identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls to prevent or mitigate potential accidents. Over the last year, DOE has begun several processes and programs as part of the Implementation Plan commitments, and in particular, has made significant progress in addressing several sets of issues particularly important in the application of software for performing hazard and accident analysis. The work discussed here demonstrates that through these actions, Software Quality Assurance (SQA) guidance and software tools are available that can be used to improve resulting safety analysis. Specifically, five of the primary actions corresponding to the commitments made in the Implementation Plan to Recommendation 2002-1 are identified and discussed in this paper. Included are the web-based DOE SQA Knowledge Portal and the Central Registry, guidance and gap analysis reports, electronic bulletin board and discussion forum, and a DOE safety software guide. These SQA products can benefit DOE safety contractors in the development of hazard and accident analysis by precluding inappropriate software applications and utilizing best practices when incorporating software results to safety basis documentation. The improvement actions discussed here mark a beginning to establishing stronger, standard-compliant programs, practices, and processes in SQA among safety software users, managers, and reviewers throughout the DOE Complex. Additional effort is needed, however, particularly in: (1) processes to add new software applications to the DOE Safety Software Toolbox; (2) improving the effectiveness of software issue communication; and (3) promoting a safety software quality assurance culture

  19. The NEPLAN software package a universal tool for electric power systems analysis

    CERN Document Server

    Kahle, K

    2002-01-01

    The NEPLAN software package has been used by CERN's Electric Power Systems Group since 1997. The software is designed for the calculation of short-circuit currents, load flow, motor start, dynamic stability, harmonic analysis and harmonic filter design. This paper describes the main features of the software package and their application to CERN's electric power systems. The implemented models of CERN's power systems are described in detail. Particular focus is given to fault calculations, harmonic analysis and filter design. Based on this software package and the CERN power network model, several recommendations are given.

  20. Astronomical Video Suites

    Science.gov (United States)

    Francisco Salgado, Jose

    2010-01-01

    Astronomer and visual artist Jose Francisco Salgado has directed two astronomical video suites to accompany live performances of classical music works. The suites feature awe-inspiring images, historical illustrations, and visualizations produced by NASA, ESA, and the Adler Planetarium. By the end of 2009, his video suites Gustav Holst's The Planets and Astronomical Pictures at an Exhibition will have been presented more than 40 times in over 10 countries. Lately Salgado, an avid photographer, has been experimenting with high dynamic range imaging, time-lapse, infrared, and fisheye photography, as well as with stereoscopic photography and video to enhance his multimedia works.

  1. The Social Construction of the Software Operation

    DEFF Research Database (Denmark)

    Frederiksen, Helle Damborg; Rose, Jeremy

    2003-01-01

    challenge the underlying social practice of the software operation, the metrics program reinforced it by adopting the same underlying values. Our conclusion is that, under these circumstances, metrics programs are unlikely to result in radical changes to the software operation, and are best suited to small......In a large software company in Denmark, much effort was expended capturing metrics about the company’s software operation. The purpose of the metrics program was to change and improve the software operation. Writing software can be understood as a socially constructed practice, which can be...... analyzed using structuration theory. This structurational analysis showed that the company’s software operation followed an easily recognizable and widely understood pattern. The software operation was organized in terms of development projects leading to applications that then needed maintenance, and...

  2. Spectral analysis of aeromagnetic profiles for depth estimation principles, software, and practical application

    Science.gov (United States)

    Sadek, H.S.; Rashad, S.M.; Blank, H.R.

    1984-01-01

    Fourier spectral analysis in recent years has become a widely utilized tool for the processing and interpretation of potential field data. It is particularly well suited to analysis of aeromagnetic maps and profiles, where coverage commonly is of broad scope and statistical treatment is appropriate.
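
    The core of the spectral depth method fits in a few lines. For an ensemble of magnetic sources at depth h, the radially averaged power spectrum falls off roughly as P(k) ~ exp(-2hk), so the log spectrum is linear in wavenumber k with slope -2h. A Python sketch (assumed illustration with a synthetic profile, not the authors' program):

      import numpy as np

      def spectral_depth(profile, dx):
          """Estimate source depth from the slope of the log power spectrum."""
          n = len(profile)
          power = np.abs(np.fft.rfft(profile - profile.mean())) ** 2
          k = 2 * np.pi * np.fft.rfftfreq(n, d=dx)     # rad per unit distance
          band = slice(1, n // 8)                      # assumed low-wavenumber fit band
          slope, _ = np.polyfit(k[band], np.log(power[band]), 1)
          return -slope / 2.0                          # depth, same units as dx

      # Synthetic anomaly whose spectrum decays like a source roughly 800 m deep.
      x = np.arange(0.0, 10000.0, 50.0)                # 50 m station spacing
      anomaly = 100.0 / (1.0 + ((x - 5000.0) / 800.0) ** 2)
      print(f"estimated depth ~ {spectral_depth(anomaly, 50.0):.0f} m")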

  3. Risk Analysis and Decision-Making Software Package (1997 Version) User Manual

    Energy Technology Data Exchange (ETDEWEB)

    Chung, F.T.H.

    1999-02-11

    This manual provides instructions for using the U.S. Department of Energy's (DOE) risk analysis and decision-making software (1997 version), developed at BDM Petroleum Technologies by BDM-Oklahoma, Inc. for DOE under contract No. DE-AC22-94PC91008. The software provides petroleum producers with a simple, handy tool for exploration and production risk analysis and decision-making. It collects useful risk analysis tools in one package so that users do not have to use several programs separately. The software is simple to use, yet provides many functions. The 1997 version of the package includes the following tools: (1) investment risk (gambler's ruin) analysis; (2) Monte Carlo simulation; (3) best fit for distribution functions; (4) sample and rank correlation; (5) enhanced oil recovery method screening; and (6) artificial neural network. This software package is subject to change; suggestions and comments from users are welcome and will be considered for future modifications and enhancements. Please check the opening screen of the software for current contact information. In the future, more tools will be added to this package. This manual includes instructions on how to use the software but does not attempt to fully explain the theory and algorithms used to create it.
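
    The first tool in the list has a compact Monte Carlo formulation. The sketch below (Python; the dollar figures and stopping rule are hypothetical, and this is not the DOE code) estimates the probability that an exploration program goes broke before reaching a comfortable capital level:

      import random

      def ruin_probability(budget, well_cost, p_success, payoff, trials=20_000):
          """Fraction of simulated programs that exhaust their capital."""
          ruined = 0
          for _ in range(trials):
              capital = budget
              while 0 < capital < 10 * budget:   # stop when broke or "safe" (assumed rule)
                  capital -= well_cost           # drill a well
                  if random.random() < p_success:
                      capital += payoff          # discovery
              if capital <= 0:
                  ruined += 1
          return ruined / trials

      # Hypothetical figures: $10M budget, $2M wells, 20% success rate, $15M payoff.
      print(f"P(ruin) ~ {ruin_probability(10, 2, 0.20, 15):.3f}")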

  4. Design and validation of Segment - freely available software for cardiovascular image analysis

    Science.gov (United States)

    2010-01-01

    Background Commercially available software for cardiovascular image analysis often has limited functionality and frequently lacks the careful validation that is required for clinical studies. We have already implemented a cardiovascular image analysis software package and released it as freeware for the research community. However, it was distributed as a stand-alone application and other researchers could not extend it by writing their own custom image analysis algorithms. We believe that the work required to make a clinically applicable prototype can be reduced by making the software extensible, so that researchers can develop their own modules or improvements. Such an initiative might then serve as a bridge between image analysis research and cardiovascular research. The aim of this article is therefore to present the design and validation of a cardiovascular image analysis software package (Segment) and to announce its release in a source code format. Results Segment can be used for image analysis in magnetic resonance imaging (MRI), computed tomography (CT), single photon emission computed tomography (SPECT) and positron emission tomography (PET). Some of its main features include loading of DICOM images from all major scanner vendors, simultaneous display of multiple image stacks and plane intersections, automated segmentation of the left ventricle, quantification of MRI flow, tools for manual and general object segmentation, quantitative regional wall motion analysis, myocardial viability analysis and image fusion tools. Here we present an overview of the validation results and validation procedures for the functionality of the software. We describe a technique to ensure continued accuracy and validity of the software by implementing and using a test script that tests the functionality of the software and validates the output. The software has been made freely available for research purposes in a source code format on the project home page http

  5. Design and validation of Segment - freely available software for cardiovascular image analysis

    International Nuclear Information System (INIS)

    Commercially available software for cardiovascular image analysis often has limited functionality and frequently lacks the careful validation that is required for clinical studies. We have already implemented a cardiovascular image analysis software package and released it as freeware for the research community. However, it was distributed as a stand-alone application and other researchers could not extend it by writing their own custom image analysis algorithms. We believe that the work required to make a clinically applicable prototype can be reduced by making the software extensible, so that researchers can develop their own modules or improvements. Such an initiative might then serve as a bridge between image analysis research and cardiovascular research. The aim of this article is therefore to present the design and validation of a cardiovascular image analysis software package (Segment) and to announce its release in a source code format. Segment can be used for image analysis in magnetic resonance imaging (MRI), computed tomography (CT), single photon emission computed tomography (SPECT) and positron emission tomography (PET). Some of its main features include loading of DICOM images from all major scanner vendors, simultaneous display of multiple image stacks and plane intersections, automated segmentation of the left ventricle, quantification of MRI flow, tools for manual and general object segmentation, quantitative regional wall motion analysis, myocardial viability analysis and image fusion tools. Here we present an overview of the validation results and validation procedures for the functionality of the software. We describe a technique to ensure continued accuracy and validity of the software by implementing and using a test script that tests the functionality of the software and validates the output. The software has been made freely available for research purposes in a source code format on the project home page (http://segment.heiberg.se). Segment

  6. The Effects of Development Team Skill on Software Product Quality

    Science.gov (United States)

    Beaver, Justin M.; Schiavone, Guy A.

    2006-01-01

    This paper provides an analysis of the effect of the skill/experience of the software development team on the quality of the final software product. A method for the assessment of software development team skill and experience is proposed, derived from a workforce management tool currently in use by the National Aeronautics and Space Administration. Using data from 26 small-scale software development projects, the team skill measures are correlated to 5 software product quality metrics from the ISO/IEC 9126 Software Engineering Product Quality standard. In the analysis of the results, development team skill is found to be a significant factor in the adequacy of the design and implementation. In addition, the results imply that inexperienced software developers are tasked with responsibilities ill-suited to their skill level, and thus have a significant adverse effect on the quality of the software product. Keywords: software quality, development skill, software metrics

  7. EDL Sensor Suite Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Optical Air Data Systems (OADS) L.L.C. proposes a LIDAR based remote measurement sensor suite capable of satisfying a significant number of the desired sensing...

  8. Why Software Piracy Rates Differ – A Theoretical Analysis

    OpenAIRE

    Sougata Poddar

    2005-01-01

    The pervasiveness of the illegal copying of software is a worldwide phenomenon. However, the level of piracy across various markets as well as across various countries varies a great deal. In this paper, we develop a theoretical model to explain this feature. In this model, the software firm undertakes costly deterrence activity in the form of R&D to stop piracy. In our model existence (or non-existence) of piracy comes out endogenously. We show that piracy survives in the market when the inc...

  9. The MEME Suite

    OpenAIRE

    Bailey, Timothy L; Johnson, James,; Grant, Charles E.; Noble, William S.

    2015-01-01

    The MEME Suite is a powerful, integrated set of web-based tools for studying sequence motifs in proteins, DNA and RNA. Such motifs encode many biological functions, and their detection and characterization is important in the study of molecular interactions in the cell, including the regulation of gene expression. Since the previous description of the MEME Suite in the 2009 Nucleic Acids Research Web Server Issue, we have added six new tools. Here we describe the capabilities of all the tools...

  10. Comparative analysis of methods for testing software of radio-electronic equipment

    OpenAIRE

    G. A. Mirskikh; Yu. Yu. Reutskaya

    2011-01-01

    The concepts of quality and reliability of software products that form part of radio-electronic equipment are analysed. Basic testing methods for the software products used in the design of hardware-software systems to ensure quality and reliability are given. We consider testing in accordance with the "black box" and "white box" methodologies, bottom-up and top-down integration testing, as well as various modifications of these methods. Effici...

  11. Algebraic software analysis and embedded simulation of a driving robot

    NARCIS (Netherlands)

    Merkx, L.L.F.; Cuijpers, P.J.L.; Duringhof, H.M.

    2007-01-01

    At TNO Automotive the Generic Driving Actuator (GDA) is developed. The GDA is a device capable of driving a vehicle fully automatically using the same interface as a human driver does. In this paper, the design of the GDA is discussed. The software and hardware of the GDA and its effect on vehicle be

  12. Reference Management Software: A Comparative Analysis of Four Products

    Science.gov (United States)

    Gilmour, Ron; Cobus-Kuo, Laura

    2011-01-01

    Reference management (RM) software is widely used by researchers in the health and natural sciences. Librarians are often called upon to provide support for these products. The present study compares four prominent RMs: CiteULike, RefWorks, Mendeley, and Zotero, in terms of features offered and the accuracy of the bibliographies that they…

  13. Graph based communication analysis for hardware/software codesign

    DEFF Research Database (Denmark)

    Knudsen, Peter Voigt; Madsen, Jan

    1999-01-01

    In this paper we present a coarse grain CDFG (Control/Data Flow Graph) model suitable for hardware/software partitioning of single processes and demonstrate how it is necessary to perform various transformations on the graph structure before partitioning in order to achieve a structure that allows...

  14. Program spectra analysis in embedded software: a case study

    NARCIS (Netherlands)

    Abreu, R.; Zoeteweij, P.; Van Gemund, A.J.C.

    2006-01-01

    Because of constraints imposed by the market, embedded software in consumer electronics is almost inevitably shipped with faults and the goal is just to reduce the inherent unreliability to an acceptable level before a product has to be released. Automatic fault diagnosis is a valuable tool to captu

  15. A Comparative Analysis of Software Engineering with Knowledge Engineering

    Directory of Open Access Journals (Sweden)

    J. F. Vijay

    2010-01-01

    Full Text Available Problem statement: Software engineering is not only a technical discipline of its own. It is also a problem domain where technologies coming from other disciplines are relevant and can play an important role. One important example is knowledge engineering, a term that we use in the broad sense to encompass artificial intelligence, computational intelligence, knowledge bases, data mining and machine learning. We see a number of typical software development issues that can benefit from these disciplines and, for the sake of clarifying the discussion, we have divided them into four categories: (1) planning, monitoring and quality control of projects; (2) quality and process improvement of software organizations; (3) decision-making support; (4) automation. Approach: First, the planning, monitoring and quality control of software development was typically based (unless it is entirely ad hoc) on past project data and/or expert opinion. Results: Several techniques coming from machine learning, computational intelligence and knowledge-based systems have been shown to be useful in this context. Second, software organizations are inherently learning organizations that need to improve, based on experience and project feedback, the way they develop software in changing and volatile environments. Large amounts of data, numerous documents and other forms of information are typically gathered on projects. The question then becomes how to enable the intelligent storage and use of such information in future projects. Third, during the course of a project, software engineers and managers have to face important, complex decisions. They need decision models to support them, especially when project pressure is intense. Techniques originally developed for building risk models based on expert elicitation or optimization heuristics can play a key role in such a context. The last category of applications concerns automation. Many automation problems, such as test data

  16. Designing and developing of data evaluation and analysis software applied to gamma-ray spectrometry

    International Nuclear Information System (INIS)

    This study designed and developed software for gamma-ray spectral data evaluation and analysis suitable for a variety of gamma-ray spectrometry systems. The software is written in Visual C++ and designed to run under the Microsoft Windows operating system. It covers all the necessary steps for evaluating and analysing the collected spectral data, including peak search, energy calibration, gross and net peak area calculation, peak centroid determination and peak width calculation for the derived gamma-ray peaks. The software offers the ability to report qualitative and quantitative results. The analysis includes: peak position identification (qualitative analysis) and calculation of its characteristics; net peak area calculation by subtracting the background; radioactivity estimation (quantitative analysis) using the comparison method for gamma peaks from any radioisotopes present during counting; radioactivity estimation (quantitative analysis) after efficiency calibration; calculation of counting uncertainties; and estimation of the limit of detection (LOD). (author)
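
    For the background-subtraction and comparison steps, a minimal sketch (Python; illustrative only, not this software) shows the usual approach: estimate a linear background under the peak from flanking regions, take the net area as gross minus background, and obtain the activity by comparison against a standard measured in the same geometry:

      import numpy as np

      def net_peak_area(counts, lo, hi, bg_width=3):
          """Net area of the peak in channels [lo, hi), trapezoidal background."""
          left = counts[lo - bg_width:lo].mean()     # mean continuum left of peak
          right = counts[hi:hi + bg_width].mean()    # mean continuum right of peak
          gross = counts[lo:hi].sum()
          background = 0.5 * (left + right) * (hi - lo)
          net = gross - background
          return net, np.sqrt(gross + background)    # simplified counting uncertainty

      # Hypothetical spectrum: flat continuum plus one peak at channels 500-509.
      rng = np.random.default_rng(2)
      spectrum = rng.poisson(50, 1024)
      spectrum[500:510] += rng.poisson(400, 10)
      net, sigma = net_peak_area(spectrum, 500, 510)
      print(f"net area = {net:.0f} +/- {sigma:.0f} counts")
      # Comparison method: A_sample = A_standard * net_sample / net_standard,
      # assuming identical geometry, counting time and decay corrections.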

  17. Software design specification and analysis (NuFDS) approach for safety-critical software based on programmable logic controller (PLC)

    International Nuclear Information System (INIS)

    This paper introduces a software design specification and analysis technique for safety-critical systems based on a Programmable Logic Controller (PLC). Among the software development phases, the design phase plays the important role of connecting the requirements phase to the implementation phase by translating problem requirements into software structures. In this work, the Nuclear FBD-style Design Specification and analysis (NuFDS) approach is proposed. The NuFDS approach for nuclear Instrumentation and Control (I and C) software is presented in a straightforward manner. It consists of four major specifications: Database, Software Architecture, System Behavior, and PLC Hardware Configuration. Additionally, correctness, completeness, consistency, and traceability check techniques are suggested for formal design analysis in the NuFDS approach. For tool support, we are developing the NuSDS tool, which is based on the NuFDS approach and intended especially for software design specification in the nuclear field.

  18. Design and validation of Segment - freely available software for cardiovascular image analysis

    OpenAIRE

    Engblom Henrik; Carlsson Marcus; Ugander Martin; Sjögren Jane; Heiberg Einar; Arheden Håkan

    2010-01-01

    Abstract Background Commercially available software for cardiovascular image analysis often has limited functionality and frequently lacks the careful validation that is required for clinical studies. We have already implemented a cardiovascular image analysis software package and released it as freeware for the research community. However, it was distributed as a stand-alone application and other researchers could not extend it by writing their own custom image analysis algorithms. We believ...

  19. FASEA: A FPGA Acquisition System and Software Event Analysis for liquid scintillation counting

    International Nuclear Information System (INIS)

    The FASEA (FPGA-based Acquisition and Software Event Analysis) system has been developed to replace the MAC3 for coincidence pulse processing. The system uses a National Instruments Virtex-5 FPGA card (PXI-7842R) for data acquisition and purpose-developed data analysis software. Initial comparisons to the MAC3 unit, based on measurements of 89Sr and 3H, are included, confirming that the system is able to accurately emulate the behaviour of the MAC3 unit.
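
    The coincidence arithmetic behind such systems is compact. In the sketch below (Python; a simplified illustration, not the FASEA implementation), each gamma event is matched to its nearest beta event, pairs closer than the resolving time count as coincidences, and the activity follows from the classic relation N0 = (N_beta * N_gamma) / (N_coinc * t_live), ignoring background, dead time and accidental coincidences:

      import numpy as np

      def coincidence_activity(beta_t, gamma_t, resolving_time, live_time):
          """beta_t, gamma_t: sorted arrays of event times (s); returns Bq."""
          idx = np.searchsorted(beta_t, gamma_t)
          # distance from each gamma to the beta event just before and just after it
          before = np.abs(gamma_t - beta_t[np.clip(idx - 1, 0, len(beta_t) - 1)])
          after = np.abs(beta_t[np.clip(idx, 0, len(beta_t) - 1)] - gamma_t)
          n_coinc = np.count_nonzero(np.minimum(before, after) <= resolving_time)
          return len(beta_t) * len(gamma_t) / (n_coinc * live_time)

      # Hypothetical check: 1 kBq source, 60 s, beta efficiency 0.9, gamma 0.2.
      rng = np.random.default_rng(1)
      decays = np.sort(rng.uniform(0.0, 60.0, 60_000))
      beta = decays[rng.random(decays.size) < 0.9]
      gamma = decays[rng.random(decays.size) < 0.2]
      print(f"activity ~ {coincidence_activity(beta, gamma, 1e-6, 60.0):.0f} Bq")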

  20. Comparative Performance Analysis of Machine Learning Techniques for Software Bug Detection

    OpenAIRE

    Saiqa Aleem; Luiz Fernando Capretz; Faheem Ahmed

    2015-01-01

    Machine learning techniques can be used to analyse data from different perspectives and enable developers to retrieve useful information. Machine learning techniques are proven to be useful for software bug prediction. In this paper, a comparative performance analysis of different machine learning techniques is explored for software bug prediction on publicly available data sets. Results showed most of the mac...

  1. On The Human, Organizational, and Technical Aspects of Software Development and Analysis

    Science.gov (United States)

    Damaševičius, Robertas

    Information systems are designed, constructed, and used by people. Therefore, a software design process is not purely a technical task, but a complex psycho-socio-technical process embedded within organizational, cultural, and social structures. These structures influence the behavior and products of the programmer's work such as source code and documentation. This chapter (1) discusses the non-technical (organizational, social, cultural, and psychological) aspects of software development reflected in program source code; (2) presents a taxonomy of the social disciplines of computer science; and (3) discusses the socio-technical software analysis methods for discovering the human, organizational, and technical aspects embedded within software development artifacts.

  2. A Methodology to Evaluate Object oriented Software Systems Using Change Requirement Traceability Based on Impact Analysis

    Directory of Open Access Journals (Sweden)

    Sunil T. D

    2014-06-01

    Full Text Available It is a well-known fact that software maintenance plays a major role in the software development life cycle. As object-oriented programming has become the standard, it is very important to understand the problems of maintaining object-oriented software systems. This paper aims at evaluating object-oriented software systems through a change-requirement-traceability-based impact analysis methodology for non-functional requirements, using functional requirements. The major issues relate to change impact algorithms and the inheritance of functionality.

  3. Analysis and design of software ecosystem architectures – towards the 4S telemedicine ecosystem

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Hansen, Klaus Marius; Kyng, Morten;

    2014-01-01

    application stove-pipes that inhibit the adoption of telemedical solutions. To what extent can a software ecosystem approach to telemedicine alleviate this? Objective: In this article, we define the concept of software ecosystem architecture as the structure(s) of a software ecosystem comprising elements, relations among them, and properties of both. Our objective is to show how this concept can be used i) in the analysis of existing software ecosystems and ii) in the design of new software ecosystems. Method: We performed a mixed-method study that consisted of a case study and an experiment. For i), we performed a descriptive, revelatory case study of the Danish telemedicine ecosystem, and for ii), we experimentally designed, implemented, and evaluated the architecture of 4S. Results: We contribute in three areas. First, we define the software ecosystem architecture concept that captures organization...

  4. The IFPUG guide to IT and software measurement

    CERN Document Server

    IFPUG

    2012-01-01

    The widespread deployment of millions of current and emerging software applications has placed software economic studies among the most critical of any form of business analysis. Unfortunately, a lack of an integrated suite of metrics makes software economic analysis extremely difficult. The International Function Point Users Group (IFPUG), a nonprofit and member-governed organization, has become the recognized leader in promoting the effective management of application software development and maintenance activities. The IFPUG Guide to IT and Software Measurement brings together 52 leading so

  5. A Framework for Modelling and Analysis of Software Systems Scalability

    OpenAIRE

    Duboc, L.; Rosenblum, D. S.; Wicks, T.

    2006-01-01

    Scalability is a widely-used term in scientific papers, technical magazines and software descriptions. Its use in the most varied contexts contribute to a general confusion about what the term really means. This lack of consensus is a potential source of problems, as assumptions are made in the face of a scalability claim. A clearer and widely-accepted understanding of scalability is required to restore the usefulness of the term. This research investigates commonly found definitions of scala...

  6. Specification and analysis of requirements negotiation strategy in software ecosystems

    OpenAIRE

    Fricker, S

    2009-01-01

    The development of software products and systems generally requires collaboration of many individuals, groups, and organizations that form an ecosystem of interdependent stakeholders. The way the interests and expectations of such stakeholders are communicated is critical for whether they are heard, hence whether the stakeholders are successful in influencing future solutions to meet their needs. This paper proposes a model based on negotiation and network theory for analyzing and designing f...

  7. Software for muscle fibre type classification and analysis

    Czech Academy of Sciences Publication Activity Database

    Karen, Petr; Števanec, M.; Smerdu, V.; Cvetko, E.; Kubínová, Lucie; Eržen, I.

    2009-01-01

    Vol. 53, No. 2 (2009), pp. 87-95. ISSN 1121-760X R&D Projects: GA MŠk(CZ) LC06063; GA MŠk(CZ) MEB090910 Institutional research plan: CEZ:AV0Z50110509 Keywords: muscle fiber types * myosin heavy chain isoforms * image processing Subject RIV: JC - Computer Hardware; Software Impact factor: 0.886, year: 2009

  8. Rapid software development : ANALYSIS OF AGILE METHODS FOR APP STARTUPS

    OpenAIRE

    Wahlqvist, Daniel

    2014-01-01

    This thesis is focused on software development using so-called Agile methods. The scope of the research is startup companies creating consumer apps. The thesis work was performed at a Swedish app startup, Storypic/Accelit AB. An overview of current research on Agile methods is given. A qualitative case study was undertaken in four parts: 1. observing the team; 2. testing business hypotheses; 3. interviews with the team; and 4. user feedback. Analyzing the findings, some conclusions are drawn: An ag...

  9. Critical analysis of interactive media with software affordances

    OpenAIRE

    Curinga, Matthew X.

    2014-01-01

    There is a long-standing and unsettled debate surrounding the ways that technology influences society. There is strong scholarship supporting the social-construction perspective, arguing that the effects of technology are wholly socially and politically determined. This paper argues that the social constructivist position needs to be expanded if it is to be useful for more than observing the ways technologies are designed and used. We need to develop better ways to talk about software, compute...

  10. Environmental Volatility, Development Decisions, and Software Volatility: A Longitudinal Analysis

    OpenAIRE

    Evelyn J. Barry; Kemerer, Chris F.; Slaughter, Sandra A.

    2006-01-01

    Although product development research often focuses on activities prior to product launch, for long-lived, adaptable products like software, development can continue over the entire product life cycle. For managers of these products the challenges are to predict when and how much the products will change and to understand how their development decisions influence the timing and magnitude of future change activities. We develop a two-stage model that relates environmental volatility to product...

  11. Risk Analysis and Mitigation Plan in Software Development

    OpenAIRE

    Dr. Sheel Ghule

    2014-01-01

    Because of its complex nature, software development often encounters many unanticipated problems, resulting in projects falling behind on deadlines and releases, exceeding budgets, and producing sub-standard products. Although these problems cannot be totally eliminated, they can be controlled by applying a risk management plan, which helps to deal with problems before they occur. Organisations that implement a risk management plan have control over the overall management of the p...

  12. Software for analysis and manipulation of genetic linkage data.

    OpenAIRE

    Weaver, R; Helms, C; Mishra, S. K.; Donis-Keller, H

    1992-01-01

    We present eight computer programs written in the C programming language that are designed to analyze genotypic data and to support existing software used to construct genetic linkage maps. Although each program has a unique purpose, they all share the common goals of affording a greater understanding of genetic linkage data and of automating tasks to make computers more effective tools for map building. The PIC/HET and FAMINFO programs automate calculation of relevant quantities such as hete...

  13. The khmer software package: enabling efficient nucleotide sequence analysis.

    Science.gov (United States)

    Crusoe, Michael R; Alameldin, Hussien F; Awad, Sherine; Boucher, Elmar; Caldwell, Adam; Cartwright, Reed; Charbonneau, Amanda; Constantinides, Bede; Edvenson, Greg; Fay, Scott; Fenton, Jacob; Fenzl, Thomas; Fish, Jordan; Garcia-Gutierrez, Leonor; Garland, Phillip; Gluck, Jonathan; González, Iván; Guermond, Sarah; Guo, Jiarong; Gupta, Aditi; Herr, Joshua R; Howe, Adina; Hyer, Alex; Härpfer, Andreas; Irber, Luiz; Kidd, Rhys; Lin, David; Lippi, Justin; Mansour, Tamer; McA'Nulty, Pamela; McDonald, Eric; Mizzi, Jessica; Murray, Kevin D; Nahum, Joshua R; Nanlohy, Kaben; Nederbragt, Alexander Johan; Ortiz-Zuazaga, Humberto; Ory, Jeramia; Pell, Jason; Pepe-Ranney, Charles; Russ, Zachary N; Schwarz, Erich; Scott, Camille; Seaman, Josiah; Sievert, Scott; Simpson, Jared; Skennerton, Connor T; Spencer, James; Srinivasan, Ramakrishnan; Standage, Daniel; Stapleton, James A; Steinman, Susan R; Stein, Joe; Taylor, Benjamin; Trimble, Will; Wiencko, Heather L; Wright, Michael; Wyss, Brian; Zhang, Qingpeng; Zyme, En; Brown, C Titus

    2015-01-01

    The khmer package is a freely available software library for working efficiently with fixed length DNA words, or k-mers. khmer provides implementations of a probabilistic k-mer counting data structure, a compressible De Bruijn graph representation, De Bruijn graph partitioning, and digital normalization. khmer is implemented in C++ and Python, and is freely available under the BSD license at  https://github.com/dib-lab/khmer/. PMID:26535114
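
    As a rough illustration of the probabilistic k-mer counting idea that khmer implements (in C++ and at much larger scale), here is a minimal count-min style sketch in Python; the hashing scheme and table sizes are illustrative assumptions, not khmer's API.

    # Minimal count-min sketch for k-mer counting (illustrative, not khmer).
    import hashlib

    class CountMinKmerCounter:
        def __init__(self, k, num_tables=4, table_size=10007):
            self.k = k
            self.num_tables = num_tables
            self.table_size = table_size
            self.tables = [[0] * table_size for _ in range(num_tables)]

        def _hashes(self, kmer):
            # One salted hash per table (assumed scheme; khmer uses its own).
            for i in range(self.num_tables):
                h = hashlib.md5(f"{i}:{kmer}".encode()).hexdigest()
                yield i, int(h, 16) % self.table_size

        def add(self, kmer):
            for i, idx in self._hashes(kmer):
                self.tables[i][idx] += 1

        def count(self, kmer):
            # Count-min property: the minimum over tables upper-bounds the true count.
            return min(self.tables[i][idx] for i, idx in self._hashes(kmer))

        def consume(self, sequence):
            for j in range(len(sequence) - self.k + 1):
                self.add(sequence[j:j + self.k])

    counter = CountMinKmerCounter(k=4)
    counter.consume("ACGTACGTACGT")
    print(counter.count("ACGT"))   # approximate count, never below the true count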

  14. Space Telecommunications Radio System Software Architecture Concepts and Analysis

    Science.gov (United States)

    Handler, Louis M.; Hall, Charles S.; Briones, Janette C.; Blaser, Tammy M.

    2008-01-01

    The Space Telecommunications Radio System (STRS) project investigated various Software Defined Radio (SDR) architectures for Space. An STRS architecture has been selected that separates the STRS operating environment from its various waveforms and also abstracts any specialized hardware to limit its effect on the operating environment. The design supports software evolution where new functionality is incorporated into the radio. Radio hardware functionality has been moving from hardware based ASICs into firmware and software based processors such as FPGAs, DSPs and General Purpose Processors (GPPs). Use cases capture the requirements of a system by describing how the system should interact with the users or other systems (the actors) to achieve a specific goal. The Unified Modeling Language (UML) is used to illustrate the Use Cases in a variety of ways. The Top Level Use Case diagram shows groupings of the use cases and how the actors are involved. The state diagrams depict the various states that a system or object may be in and the transitions between those states. The sequence diagrams show the main flow of activity as described in the use cases.

  15. Waste management facility accident analysis (WASTE ACC) system: software for analysis of waste management alternatives

    International Nuclear Information System (INIS)

    This paper describes the Waste Management Facility Accident Analysis (WASTE_ACC) software, which was developed at Argonne National Laboratory (ANL) to support the US Department of Energy's (DOE's) Waste Management (WM) Programmatic Environmental Impact Statement (PEIS). WASTE_ACC is a decision support and database system that is compatible with Microsoft® Windows™. It assesses potential atmospheric releases from accidents at waste management facilities. The software provides the user with an easy-to-use tool to determine the risk-dominant accident sequences for the many possible combinations of process technologies, waste and facility types, and alternative cases described in the WM PEIS. In addition, its structure will allow additional alternative cases and assumptions to be tested as part of the future DOE programmatic decision-making process. The WASTE_ACC system demonstrates one approach to performing a generic, systemwide evaluation of accident risks at waste management facilities. The advantages of WASTE_ACC are threefold. First, the software gets waste volume and radiological profile data that were used to perform other WM PEIS-related analyses directly from the WASTE_MGMT system. Second, the system allows for a consistent analysis across all sites and waste streams, which enables decision makers to understand more fully the trade-offs among various policy options and scenarios. Third, the system is easy to operate; even complex scenario runs are completed within minutes

  16. Software component quality evaluation

    Science.gov (United States)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  17. A COMPARISON OF STEPWISE AND FUZZY MULTIPLE REGRESSION ANALYSIS TECHNIQUES FOR MANAGING SOFTWARE PROJECT RISKS: ANALYSIS PHASE

    OpenAIRE

    Abdelrafe Elzamly; Burairah Hussin

    2014-01-01

    Risk is not always avoidable, but it is controllable. The aim of this study is to identify whether these techniques are effective in reducing software failure. This motivates the authors to continue the effort to enrich the management of software project risks by considering mining and quantitative approaches with large data sets. In this study, two new techniques are introduced, namely stepwise multiple regression analysis and fuzzy multiple regression, to manage software risks. Two evaluation proc...

  18. Integrating Multi-Vendor Software Analysis into the Lifecycle for Reliability, Productivity, and Performance Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of the proposed work is to create new ways to manage, visualize, and share data produced by multiple software analysis tools, and to create a framework for...

  19. Application of software quality assurance methods in validation and maintenance of reactor analysis computer codes

    International Nuclear Information System (INIS)

    Various computer codes employed at Israel Electricity Company for preliminary reactor design analysis and fuel cycle scoping calculations have often been subject to program source modifications. Although most changes were due to computer or operating system compatibility problems, a number of significant modifications were due to model improvement and enhancements of algorithm efficiency and accuracy. With growing acceptance of software quality assurance requirements and methods, a program of implementing extensive testing of modified software has been adopted within the regular maintenance activities. In this work a survey has been performed of various software quality assurance methods for software testing, which belong mainly to the two major categories of implementation ('white box') and specification-based ('black box') testing. The results of this survey exhibit a clear preference for specification-based testing. In particular, the equivalence class partitioning method and the boundary value method have been selected as especially suitable functional methods for testing reactor analysis codes. A separate study of software quality assurance methods and techniques has been performed in this work with the objective of establishing appropriate pre-test software specification methods. Two methods of software analysis and specification have been selected as the most suitable for this purpose: the data flow diagram method has been shown to be particularly valuable for functional/procedural software specification, while entity-relationship diagrams have proved efficient for specifying the software data/information domain. The feasibility of these two methods has been analyzed in particular for software uncertainty analysis and overall code accuracy estimation. (author). 14 refs

  20. "Leagile??? software development: an experience report analysis of the application of lean approaches in agile software development

    OpenAIRE

    Wang, Xiaofeng; Conboy, Kieran; Cawley, Oisín

    2012-01-01

    In recent years there has been a noticeable shift in attention from those who use agile software development toward lean software development, often labelled as a shift "from agile to lean". However, the reality may not be as simple or linear as this label implies. To provide a better understanding of lean software development approaches and how they are applied in agile software development, we have examined 30 experience reports published in past agile software...

  1. Software tools for the analysis of video meteors emission spectra

    Science.gov (United States)

    Madiedo, J. M.; Toscano, F. M.; Trigo-Rodriguez, J. M.

    2011-10-01

    One of the goals of the SPanish Meteor Network (SPMN) is the study of the chemical composition of meteoroids by analyzing the emission spectra resulting from the ablation of these particles of interplanetary matter in the atmosphere. With this aim, some of the CCD video devices we employ to observe the night sky are endowed with holographic diffraction gratings, and a continuous monitoring of meteor activity is performed. We have recently developed new software to analyze these spectra. A description of this computer program is given, and some of the results obtained so far are presented here.
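
    A core step in any such program is locating emission lines in a calibrated spectrum. The following minimal Python sketch illustrates that step on synthetic data; it is not the SPMN software, and the line wavelengths, widths and detection thresholds are illustrative assumptions only.

    # Locate emission lines in a synthetic meteor spectrum (illustrative sketch).
    import numpy as np
    from scipy.signal import find_peaks

    wavelength = np.linspace(350, 700, 2000)   # nm
    spectrum = np.random.default_rng(2).normal(0, 0.02, wavelength.size)
    # Inject three Gaussian lines near common meteor emissions (Ca I, Mg I, Na I).
    for center, height in [(422.7, 1.0), (518.4, 0.8), (589.0, 1.2)]:
        spectrum += height * np.exp(-((wavelength - center) ** 2) / 0.5)

    peaks, _ = find_peaks(spectrum, height=0.3, prominence=0.2)
    for idx in peaks:
        print(f"line at {wavelength[idx]:.1f} nm, intensity {spectrum[idx]:.2f}")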

  2. An Integrated Suite of Tools to support Human Factors Engineering

    Energy Technology Data Exchange (ETDEWEB)

    Jacques V Hugo

    2001-08-01

    Human Factors Engineering (HFE) work for the nuclear industry imposes special demands on the practitioner in terms of the scope, complexity and safety requirements for humans in nuclear installations. Unfortunately HFE lags behind other engineering disciplines in the development and use of modern, powerful tools for the full range of analysis and design processes. HFE does not appear to be an attractive market for software and hardware developers and as a result, HFE practitioners usually have to rely on inefficient general-purpose tools like standard office software, or they have to use expensive special-purpose tools that offer only part of the solution they require and which also do not easily integrate with other tools. There have been attempts to develop generic software tools to support the HFE analyst and also to achieve some order and consistency in format and presentation. However, in spite of many years of development, very few tools have emerged that have achieved these goals. This would suggest the need for special tools, but existing commercial products have been found inadequate and to date not a single tool has been developed that adequately supports the special requirements of HFE work for the nuclear industry. This paper describes an integrated suite of generic as well as purpose-built tools that facilitate information solicitation, issues tracking, work domain analysis, functional requirements analysis, function allocation, operational sequence analysis, task analysis and development of HSI design requirements. In combination, this suite of tools supports the analytical as well as the representational aspects of key HFE activities primarily for new NPPs, including capturing information from subject matter experts and various source documents directly into the appropriate tool and then linking, analyzing and extending that information further to represent detailed functional and task information, and ultimately HSI design requirements. The paper

  3. Development of tools for safety analysis of control software in advanced reactors

    International Nuclear Information System (INIS)

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described

  4. Development of tools for safety analysis of control software in advanced reactors

    Energy Technology Data Exchange (ETDEWEB)

    Guarro, S.; Yau, M.; Motamed, M. [Advanced Systems Concepts Associates, El Segundo, CA (United States)

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described.

  5. An Application of Intelligent Data Analysis Techniques to a Large Software Engineering Dataset

    Science.gov (United States)

    Cain, James; Counsell, Steve; Swift, Stephen; Tucker, Allan

    Within the development of large software systems, there is significant value in being able to predict changes. If we can predict the likely changes that a system will undergo, then we can estimate likely developer effort and allocate resources appropriately. Within object oriented software development, these changes are often identified as refactorings. Very few studies have explored the prediction of refactorings on a wide scale. Within this paper we aim to do just this, through applying intelligent data analysis techniques to a uniquely large and comprehensive software engineering time series dataset. Our analysis shows extremely promising results, allowing us to predict the occurrence of future large changes.

  6. A unified approach to feature-centric analysis of object-oriented software

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    Feature-centric comprehension of software is a prerequisite to incorporating modifications requested by users during software evolution and maintenance. However, feature-centric understanding of large object-oriented programs is difficult to achieve due to size, complexity and the implicit character of mappings between features and source code. In this paper, we address these issues through our unified approach to feature-centric analysis of object-oriented software. Our approach supports discovery of feature-code traceability links and their analysis from three perspectives and at three levels... Featureous supports program comprehension by means of concrete cognitive design elements.

  7. Novell ZENworks Suite 7

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    On August 22, Novell released the latest ZENworks Suite 7, with more powerful functionality and more stable performance. Liu Changchun, senior technical support engineer at Novell, suggested: "Use Novell ZENworks Suite to standardize the management of your network."

  8. Analysis of a hardware and software fault tolerant processor for critical applications

    Science.gov (United States)

    Dugan, Joanne B.

    1993-01-01

    Computer systems for critical applications must be designed to tolerate software faults as well as hardware faults. A unified approach to tolerating hardware and software faults is characterized by classifying faults in terms of duration (transient or permanent) rather than source (hardware or software). Errors arising from transient faults can be handled through masking or voting, but errors arising from permanent faults require system reconfiguration to bypass the failed component. Most errors which are caused by software faults can be considered transient, in that they are input-dependent. Software faults are triggered by a particular set of inputs. Quantitative dependability analysis of systems which exhibit a unified approach to fault tolerance can be performed by a hierarchical combination of fault tree and Markov models. A methodology for analyzing hardware and software fault tolerant systems is applied to the analysis of a hypothetical system, loosely based on the Fault Tolerant Parallel Processor. The models consider both transient and permanent faults, hardware and software faults, independent and related software faults, automatic recovery, and reconfiguration.
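
    A minimal sketch of the kind of Markov-level calculation such a hierarchical analysis rests on is shown below; the state space, fault rates and coverage factor are made-up assumptions, not the paper's FTPP model.

    # Small continuous-time Markov chain for a triplicated processor where
    # transient faults are masked by voting and covered permanent faults
    # trigger reconfiguration to a duplex (illustrative sketch).
    import numpy as np
    from scipy.linalg import expm

    lam_perm = 1e-4    # permanent fault rate per hour (assumed)
    lam_trans = 1e-3   # transient fault rate per hour (assumed)
    c = 0.999          # coverage: probability a fault is handled successfully

    # States: 0 = triplex, 1 = duplex (one unit reconfigured out), 2 = failed
    Q = np.zeros((3, 3))
    Q[0, 1] = 3 * lam_perm * c                        # covered permanent fault
    Q[0, 2] = 3 * (lam_perm + lam_trans) * (1 - c)    # uncovered fault: system loss
    Q[1, 2] = 2 * lam_perm                            # second permanent fault
    for i in range(3):
        Q[i, i] = -Q[i].sum()                         # generator rows sum to zero

    p0 = np.array([1.0, 0.0, 0.0])
    for t in (10, 100, 1000):                         # mission times in hours
        p = p0 @ expm(Q * t)
        print(f"t={t:5d} h  reliability={1 - p[2]:.6f}")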

  9. BASTILLE - Better Analysis Software to Treat ILL Experiments - a unified, unifying approach to data reduction and analysis

    International Nuclear Information System (INIS)

    Data reduction and analysis is a key component in the production of scientific results. If this component, like any other in the chain, is weak, the final output is compromised. The current situation for data reduction and analysis may be regarded as adequate, but it is variable, depending on the instrument, and should be improved. In particular the delivery of new and upgraded instruments in Millennium Phase I and those proposed for Phase II will bring new demands and challenges for software development. Failure to meet these challenges will hamper the exploitation of higher data rates and the delivery of new science. The proposed project is to provide a single, underpinning software infrastructure for data analysis, which would ensure: 1) a clear vision of software provision at ILL; 2) a clear role for the 'Computing for Science' Group (CS) in maintaining and developing the infrastructure and the codes; 3) a well-defined framework for recruiting and training CS staff; 4) ease and efficiency of development within a common, well-defined software environment; 5) safeguarding of key, existing software; and 6) ease of communication with other software like instrument control software to allow real-time data analysis and experiment control, or software from other institutes or sources

  10. Microstructural analysis of quartz grains in Vasyugan suite sandstones of layer Ui1-21 in Kazanskoe deposit

    International Nuclear Information System (INIS)

    Microstructural analysis of quartz grains in sandstones revealed preferred directions which define and influence porosity and permeability anisotropy in oil and gas reservoirs. In this research, we investigated the Upper Jurassic sandstone reservoir sediments from 14 wells in the Kazanskoe field. The authors studied the orientation of elongated quartz grains, intergranular fractures within grains, and the pore space in oriented thin sections of sandstones. The analysis of elongated quartz grains in the bedding plane showed three main types of preferred directions in quartz grain orientation along different axes in sandstone reservoirs. The obtained results allow identifying the variability of facies and the dynamic depositional environment of the Upper Jurassic sandstone formation. Subsequently, these results can be used in field modeling, as well as in pattern optimization of injection and production wells

  11. A suite of Gateway® cloning vectors for high-throughput genetic analysis in Saccharomyces cerevisiae

    OpenAIRE

    Alberti, Simon; Gitler, Aaron D.; Lindquist, Susan

    2007-01-01

    In the post-genomic era, academic and biotechnological research is increasingly shifting its attention from single proteins to the analysis of complex protein networks. This change in experimental design requires the use of simple and experimentally tractable organisms, such as the unicellular eukaryote Saccharomyces cerevisiae, and a range of new high-throughput techniques. The Gateway® system has emerged as a powerful high-throughput cloning method that allows for the in vitro recombination...

  12. Learning from examples - Generation and evaluation of decision trees for software resource analysis

    Science.gov (United States)

    Selby, Richard W.; Porter, Adam A.

    1988-01-01

    A general solution method for the automatic generation of decision (or classification) trees is investigated. The approach is to provide insights through in-depth empirical characterization and evaluation of decision trees for software resource data analysis. The trees identify classes of objects (software modules) that had high development effort. Sixteen software systems ranging from 3,000 to 112,000 source lines were selected for analysis from a NASA production environment. The collection and analysis of 74 attributes (or metrics), for over 4,700 objects, captured information about the development effort, faults, changes, design style, and implementation style. A total of 9,600 decision trees were automatically generated and evaluated. The trees correctly identified 79.3 percent of the software modules that had high development effort or faults, and the trees generated from the best parameter combinations correctly identified 88.4 percent of the modules on the average.
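
    The technique itself is easy to reproduce with modern libraries. The sketch below trains a decision tree to flag high-effort modules from module metrics; the data is synthetic (not the NASA dataset) and the chosen metrics are assumptions.

    # Decision tree over synthetic software-module metrics (illustrative sketch).
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)
    n = 1000
    X = np.column_stack([
        rng.integers(50, 5000, n),   # source lines per module (assumed metric)
        rng.integers(1, 50, n),      # cyclomatic complexity (assumed metric)
        rng.integers(0, 30, n),      # number of changes (assumed metric)
    ])
    # Label a module "high effort" when size and churn are both large.
    y = ((X[:, 0] > 2000) & (X[:, 2] > 10)).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
    print(f"holdout accuracy: {tree.score(X_te, y_te):.3f}")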

  13. A COMPARISON OF STEPWISE AND FUZZY MULTIPLE REGRESSION ANALYSIS TECHNIQUES FOR MANAGING SOFTWARE PROJECT RISKS: ANALYSIS PHASE

    Directory of Open Access Journals (Sweden)

    Abdelrafe Elzamly

    2014-01-01

    Risk is not always avoidable, but it is controllable. The aim of this study is to identify whether these techniques are effective in reducing software failure. This motivates the authors to continue the effort to enrich the management of software project risks by considering mining and quantitative approaches with large data sets. In this study, two new techniques are introduced, namely stepwise multiple regression analysis and fuzzy multiple regression, to manage software risks. Two evaluation procedures, MMRE and Pred(25), are used to compare the accuracy of the techniques. The model's accuracy improves slightly with stepwise multiple regression rather than fuzzy multiple regression. This study will guide software managers in applying software risk management practices with real-world software development organisations and in verifying the effectiveness of the new techniques and approaches on a software project. The study has been conducted on a group of software projects using a survey questionnaire. It is hoped that this will enable software managers to improve their decisions and increase the probability of software project success.
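
    For concreteness, the two evaluation procedures named above can be computed as follows. This is a minimal Python sketch with made-up numbers, not the study's data: MMRE is the mean magnitude of relative error, and Pred(25) is the fraction of predictions whose relative error is within 25%.

    # MMRE and Pred(25) evaluation metrics (illustrative sketch).
    def mmre(actual, predicted):
        return sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual)

    def pred(actual, predicted, level=0.25):
        hits = sum(abs(a - p) / a <= level for a, p in zip(actual, predicted))
        return hits / len(actual)

    actual = [120, 80, 200, 150]      # observed values (made up)
    predicted = [110, 95, 210, 100]   # model outputs (made up)
    print(f"MMRE = {mmre(actual, predicted):.3f}")
    print(f"Pred(25) = {pred(actual, predicted):.2f}")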

  14. Experimental analysis of specification language diversity impact on NPP software diversity

    International Nuclear Information System (INIS)

    In order to increase computer system reliability, software fault tolerance methods have been adopted in some safety critical systems, including NPPs. Prevention of software common mode failure is a crucial problem in software fault tolerance, but no effective method for this problem has been found yet. In our research, to find an effective method for preventing software common mode failure, the impact of specification language diversity on NPP software diversity was examined experimentally. Three specification languages were used to compose three requirements specifications, and programmers produced twelve product codes from the specifications. From the product code analysis, using fault diversity criteria, we concluded that the diverse specification language method would enhance program diversity through diversification of requirements specification imperfections

  15. Analysis and recommendations for a reliable programming of software based safety systems

    International Nuclear Information System (INIS)

    The present paper summarizes the results of several studies performed for the development of high reliability software on i486 microprocessors, towards its utilization in control and safety systems for nuclear power plants. The work is based on software programmed in C language. Several recommendations oriented to high reliability software are analyzed, relating the requirements on the high level language to their influence at the assembler level. Several metrics are implemented that allow for the quantification of the results achieved. New metrics were developed and others were adapted in order to obtain more efficient indexes for the software description. Such metrics are helpful for visualizing how well the software under development conforms to the quality rules in use. A specific program developed to assist the reliability analyst with this quantification is also presented in the paper. It performs the analysis of an executable program written in C language, disassembling it and evaluating its internal structures. (author)

  16. Performance Analysis of Software to Hardware Task Migration in Codesign

    CERN Document Server

    Sebai, Dorsaf; Bennour, Imed

    2010-01-01

    The complexity of multimedia applications, in terms of intensity of computation and heterogeneity of treated data, led designers to embark them on multiprocessor systems on chip. The complexity of these systems on one hand, and the expectations of consumers on the other, complicate the designers' job of conceiving and supplying robust and successful systems within the shortest deadlines. They have to explore the different solutions of the design space and estimate their performances in order to deduce the solution that respects their design constraints. In this context, we propose the modeling of one of the design space's possible solutions: software to hardware task migration. This modeling exploits synchronous dataflow graphs to take into account the different migration impacts and estimate their performances in terms of throughput.

  17. Advanced space system analysis software. Technical, user, and programmer guide

    Science.gov (United States)

    Farrell, C. E.; Zimbelman, H. F.

    1981-01-01

    The LASS computer program provides a tool for interactive preliminary and conceptual design of LSS. Eight program modules were developed, including four automated model geometry generators, an associated mass properties module, an appendage synthesizer module, an rf analysis module, and an orbital transfer analysis module. The existing rigid body controls analysis module was modified to permit analysis of effects of solar pressure on orbital performance. A description of each module, user instructions, and programmer information are included.

  18. Comparing Results from Constant Comparative and Computer Software Methods: A Reflection about Qualitative Data Analysis

    Science.gov (United States)

    Putten, Jim Vander; Nolen, Amanda L.

    2010-01-01

    This study compared qualitative research results obtained by manual constant comparative analysis with results obtained by computer software analysis of the same data. An investigation of issues of trustworthiness and accuracy ensued. Results indicated that the inductive constant comparative data analysis generated 51 codes and two coding levels…

  19. Computer Software for Design, Analysis and Control of Fluid Power Systems

    DEFF Research Database (Denmark)

    Conrad, Finn; Sørensen, Torben; Grahl-Madsen, Mads

    1999-01-01

    This Deliverable presents contributions from SWING's Task 2.3, Analysis of available software solutions. The Deliverable focuses on the results from this analysis, keeping in mind the task objective: to carry out a thorough analysis of the state-of-the-art solutions for fluid power systems modelling...

  20. Improving the Accuracy of Software-Based Energy Analysis for Residential Buildings (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Polly, B.

    2011-09-01

    This presentation describes the basic components of software-based energy analysis for residential buildings, explores the concepts of 'error' and 'accuracy' when analysis predictions are compared to measured data, and explains how NREL is working to continuously improve the accuracy of energy analysis methods.

  1. A Parametric Model of Shoulder Articulation for Virtual Assessment of Space Suit Fit

    Science.gov (United States)

    Kim, K. Han; Young, Karen S.; Bernal, Yaritza; Boppana, Abhishektha; Vu, Linh Q.; Benson, Elizabeth A.; Jarvis, Sarah; Rajulu, Sudhakar L.

    2016-01-01

    Shoulder injury is one of the most severe risks that have the potential to impair crewmembers' performance and health in long duration space flight. Overall, 64% of crewmembers experience shoulder pain after extra-vehicular training in a space suit, and 14% of symptomatic crewmembers require surgical repair (Williams & Johnson, 2003). Suboptimal suit fit, in particular at the shoulder region, has been identified as one of the predominant risk factors. However, traditional suit fit assessments and laser scans represent only a single person's data, and thus may not be generalized across wide variations of body shapes and poses. The aim of this work is to develop a software tool based on a statistical analysis of a large dataset of crewmember body shapes. This tool can accurately predict the skin deformation and shape variations for any body size and shoulder pose for a target population, from which the geometry can be exported and evaluated against suit models in commercial CAD software. A preliminary software tool was developed by statistically analyzing 150 body shapes matched with body dimension ranges specified in the Human-Systems Integration Requirements of NASA ("baseline model"). Further, the baseline model was incorporated with shoulder joint articulation ("articulation model"), using additional subjects scanned in a variety of shoulder poses across a pre-specified range of motion. Scan data was cleaned and aligned using body landmarks. The skin deformation patterns were dimensionally reduced and the co-variation with shoulder angles was analyzed. A software tool is currently in development and will be presented in the final proceeding. This tool would allow suit engineers to parametrically generate body shapes in strategically targeted anthropometry dimensions and shoulder poses. This would also enable virtual fit assessments, with which the contact volume and clearance between the suit and body surface can be predictively quantified at reduced time and
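
    The statistical core of such a tool can be sketched as PCA over aligned scan vertices, followed by a regression of shape-mode weights on anthropometric predictors so that new shapes can be generated parametrically. The following Python sketch uses random stand-in data; the dimensions, predictors and mode count are illustrative assumptions, not NASA's model.

    # PCA shape model plus weight regression on predictors (illustrative sketch).
    import numpy as np

    rng = np.random.default_rng(1)
    n_subjects, n_vertices = 150, 500
    scans = rng.normal(size=(n_subjects, n_vertices * 3))   # flattened x,y,z

    mean_shape = scans.mean(axis=0)
    centered = scans - mean_shape
    # SVD yields the principal shape modes and per-subject mode weights.
    U, S, Vt = np.linalg.svd(centered, full_matrices=False)
    weights = U * S            # n_subjects x n_modes
    modes = Vt                 # n_modes x (n_vertices * 3)

    # Regress weights on predictors (e.g. stature, shoulder angle), then predict
    # a new shape from new predictor values.
    predictors = rng.normal(size=(n_subjects, 2))
    coef, *_ = np.linalg.lstsq(predictors, weights[:, :10], rcond=None)
    new_subject = np.array([[0.5, -1.0]])
    new_shape = mean_shape + (new_subject @ coef) @ modes[:10]
    print(new_shape.shape)     # reconstructed vertex coordinates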

  2. Software for analysis and manipulation of genetic linkage data.

    Science.gov (United States)

    Weaver, R; Helms, C; Mishra, S K; Donis-Keller, H

    1992-06-01

    We present eight computer programs written in the C programming language that are designed to analyze genotypic data and to support existing software used to construct genetic linkage maps. Although each program has a unique purpose, they all share the common goals of affording a greater understanding of genetic linkage data and of automating tasks to make computers more effective tools for map building. The PIC/HET and FAMINFO programs automate calculation of relevant quantities such as heterozygosity, PIC, allele frequencies, and informativeness of markers and pedigrees. PREINPUT simplifies data submissions to the Centre d'Etude du Polymorphisme Humain (CEPH) data base by creating a file with genotype assignments that CEPH's INPUT program would otherwise require to be input manually. INHERIT is a program written specifically for mapping the X chromosome: by assigning a dummy allele to males, in the nonpseudoautosomal region, it eliminates falsely perceived noninheritances in the data set. The remaining four programs complement the previously published genetic linkage mapping software CRI-MAP and LINKAGE. TWOTABLE produces a more readable format for the output of CRI-MAP two-point calculations; UNMERGE is the converse to CRI-MAP's merge option; and GENLINK and LINKGEN automatically convert between the genotypic data file formats required by these packages. All eight applications read input from the same types of data files that are used by CRI-MAP and LINKAGE. Their use has simplified the management of data, has increased knowledge of the content of information in pedigrees, and has reduced the amount of time needed to construct genetic linkage maps of chromosomes. PMID:1598906
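
    The quantities that PIC/HET automates are simple functions of allele frequencies. Below is a minimal Python sketch using the standard Botstein et al. formulas for expected heterozygosity and polymorphism information content; it is not the C programs themselves, and the example frequencies are made up.

    # Heterozygosity and PIC from allele frequencies (illustrative sketch).
    def heterozygosity(freqs):
        # Expected heterozygosity: 1 - sum of squared allele frequencies.
        return 1.0 - sum(p * p for p in freqs)

    def pic(freqs):
        # PIC = heterozygosity minus a correction over allele pairs.
        het = heterozygosity(freqs)
        correction = sum(
            2 * freqs[i] ** 2 * freqs[j] ** 2
            for i in range(len(freqs))
            for j in range(i + 1, len(freqs))
        )
        return het - correction

    freqs = [0.4, 0.3, 0.2, 0.1]   # example allele frequencies at one marker
    print(f"HET = {heterozygosity(freqs):.4f}")
    print(f"PIC = {pic(freqs):.4f}")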

  3. A Review of CEFA Software: Comprehensive Exploratory Factor Analysis Program

    Science.gov (United States)

    Lee, Soon-Mook

    2010-01-01

    CEFA 3.02 (Browne, Cudeck, Tateneni, & Mels, 2008) is a factor analysis computer program designed to perform exploratory factor analysis. It provides the main properties that are needed for exploratory factor analysis, namely a variety of factoring methods employing eight different discrepancy functions to be minimized to yield initial solutions, a…

  4. Oilpatch software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, Ricardo; Dutta, Ashok; Smith, Maurice; Chandler, Graham

    2011-07-15

    In the oil and gas industry, new software is often developed to improve productivity or processes. Some software developments are presented: Adapx developed Capturx, which digitizes data collected on paper with digital pens and sends it to the office. For offshore purposes, Oceanic Consulting Corporation developed a package including a suite of marine simulation software. Beyond Compliance developed a compliance system to make regulatory compliance easier by streamlining data collection, processing and management. Zantek Information Technology developed an accounting package that integrates all core business functions. Reality Mobile created a platform where employees can share live video over a secured network using devices they already have. DataShare has developed an online tool, OPERA, to prepare emergency response plans, and they also do mapping work. All six of these software products help oil and gas companies meet regulatory compliance or facilitate communication.

  5. Detection and Quantification of Nitrogen Compounds in the First Drilled Martian Solid Samples by the Sample Analysis at Mars (SAM) Instrument Suite on the Mars Science Laboratory (MSL)

    Science.gov (United States)

    Stern, J. C.; Navarro-Gonzales, R.; Freissinet, C.; McKay, C. P.; Archer, P. D., Jr.; Buch, A.; Brunner, A. E.; Coll, P.; Eigenbrode, J. L.; Franz, H. B.; Glavin, D. P.; McAdam, A. C.; Ming, D.; Steele, A.; Sutter, B.; Szopa, C.; Wray, J. J.; Conrad, P.; Mahaffy, P. R.

    2014-01-01

    The Sample Analysis at Mars (SAM) instrument suite on the Mars Science Laboratory (MSL) Curiosity Rover detected both reduced and oxidized nitrogen-bearing compounds during the pyrolysis of surface materials at Yellowknife Bay in Gale Crater. Preliminary detections of nitrogen species include NO, HCN, ClCN, CH3CN, and TFMA (trifluoro-N-methyl-acetamide). Confirmation of indigenous Martian N-bearing compounds requires quantifying N contribution from the terrestrial derivatization reagents (e.g. N-methyl-N-tertbutyldimethylsilyltrifluoroacetamide, MTBSTFA and dimethylformamide, DMF) carried for SAM's wet chemistry experiment that contribute to the SAM background. Nitrogen species detected in the SAM solid sample analyses can also be produced during laboratory pyrolysis experiments where these reagents are heated in the presence of perchlorate, a compound that has also been identified by SAM in Mars solid samples.

  6. Design and validation of Segment - freely available software for cardiovascular image analysis

    Directory of Open Access Journals (Sweden)

    Engblom Henrik

    2010-01-01

    Background: Commercially available software for cardiovascular image analysis often has limited functionality and frequently lacks the careful validation that is required for clinical studies. We have already implemented a cardiovascular image analysis software package and released it as freeware for the research community. However, it was distributed as a stand-alone application and other researchers could not extend it by writing their own custom image analysis algorithms. We believe that the work required to make a clinically applicable prototype can be reduced by making the software extensible, so that researchers can develop their own modules or improvements. Such an initiative might then serve as a bridge between image analysis research and cardiovascular research. The aim of this article is therefore to present the design and validation of a cardiovascular image analysis software package (Segment) and to announce its release in a source code format. Results: Segment can be used for image analysis in magnetic resonance imaging (MRI), computed tomography (CT), single photon emission computed tomography (SPECT) and positron emission tomography (PET). Some of its main features include loading of DICOM images from all major scanner vendors, simultaneous display of multiple image stacks and plane intersections, automated segmentation of the left ventricle, quantification of MRI flow, tools for manual and general object segmentation, quantitative regional wall motion analysis, myocardial viability analysis and image fusion tools. Here we present an overview of the validation results and validation procedures for the functionality of the software. We describe a technique to ensure continued accuracy and validity of the software by implementing and using a test script that tests the functionality of the software and validates the output. The software has been made freely available for research purposes in a source code format on the project home

  7. Development of a gamma ray spectrometry software for neutron activation analysis using the open source concept

    International Nuclear Information System (INIS)

    In this work, new software - SAANI (Instrumental Neutron Activation Analysis Software) - was developed and used for gamma ray spectra analysis in the Neutron Activation Laboratory (LAN) of the Nuclear and Energetic Research Institute (IPEN-CNEN/SP). The software was developed to completely replace the old one - VISPECT. Besides the visual improvement in the user interface, the new software allows the standardization of several procedures which are currently done in different ways by each researcher, avoiding intermediate steps in the calculations. By using a modern programming language - Python - together with the graphical library Qt (by Trolltech), both multi-platform, the new software is able to run on Windows, Linux and other platforms. In addition, the new software has been designed to be extensible through plug-ins. In order to achieve the proposed initial scope, that is, to completely replace the old software, SAANI has undergone several different kinds of tests, using spectra from certified reference materials, standards and common spectra already analyzed by other software or used in international inter-comparisons. The results obtained by SAANI in all tests were considered very good. Some small discrepancies were found and, after careful search and analysis, their source was identified as an accuracy bug in the old software. Usability and robustness tests were conducted by installing SAANI on several laboratory computers and following them during daily use. The results of these tests also indicated that SAANI is ready to be used by all researchers at LAN-IPEN. (author)

  8. Requirement analysis of the safety-critical software implementation for the nuclear power plant

    International Nuclear Information System (INIS)

    The safety-critical software shall be implemented under strict regulation and standards, along with hardware qualification. In general, safety-critical software has been implemented using function block language (FBL) and structured languages like C in real projects. Software design shall comply with such characteristics as modularity, simplicity, minimal use of subroutines, and exclusion of interrupt logic. To meet these prerequisites, we used a computer-aided software engineering (CASE) tool to substantiate the requirements traceability matrix, which had previously been developed manually using word processors or spreadsheets. A coding standard and manual have been developed to ensure the quality of the software development process in terms of readability, consistency, and maintainability, in compliance with NUREG/CR-6463. System-level preliminary hazard analysis (PHA) is performed by analyzing the preliminary safety analysis report (PSAR) and the FMEA document. The modularity concept is effectively implemented for the overall module configurations and functions using the RTP software development tool. The response time imposed by the deterministic structure of the safety-critical software was measured

  9. STEM - software test and evaluation methods: fault detection using static analysis techniques

    International Nuclear Information System (INIS)

    STEM is a software reliability project with the objective of evaluating a number of fault detection and fault estimation methods which can be applied to high integrity software. This Report gives some interim results of applying both manual and computer-based static analysis techniques, in particular SPADE, to an early CERL version of the PODS software containing known faults. The main results of this study are as follows: the scope for thorough verification is determined by the quality of the design documentation; documentation defects become especially apparent when verification is attempted. For well-defined software, the thoroughness of SPADE-assisted verification for detecting a large class of faults was successfully demonstrated. For imprecisely-defined software (not recommended for high-integrity systems) the use of tools such as SPADE is difficult and inappropriate. Analysis and verification tools are helpful through their reliability and thoroughness. However, they are designed to assist, not replace, a human in validating software. Manual inspection can still reveal errors (such as errors in specification and errors of transcription of system constants) which current tools cannot detect. There is a need for tools to automatically detect typographical errors in system constants, for example by reporting outliers to patterns. To obtain the maximum benefit from advanced tools, they should be applied during software development (when verification problems can be detected and corrected) rather than retrospectively. (author)

  10. Systematic Analysis Method of Shear-Wave Splitting:SAM Software System

    Institute of Scientific and Technical Information of China (English)

    Gao Yuan; Liu Xiqiang; Liang Wei; Hao Ping

    2004-01-01

    In order to make more effective use of the data from regional digital seismograph networks and to promote the study of shear-wave splitting and its application to earthquake stress forecasting, the SAM software system, i.e., the software for the systematic analysis method of shear-wave splitting, has been developed. This paper introduces the design aims, system structure, function and characteristics of the SAM software system and shows some graphical interfaces for data input and result output. Finally, it gives a preliminary discussion of the study of shear-wave splitting and its application to earthquake forecasting.
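
    The measurement at the heart of such a system can be sketched as a grid search over the fast-polarization angle and delay time of two horizontal components. This minimal Python sketch is illustrative only (not the SAM code); the cross-correlation criterion and the synthetic waveform are simplifying assumptions.

    # Grid-search shear-wave splitting measurement (illustrative sketch).
    import numpy as np

    def splitting_parameters(north, east, dt, max_delay=0.5):
        best = (0.0, 0.0, -np.inf)
        max_shift = int(max_delay / dt)
        for angle in np.radians(np.arange(0, 180, 2)):
            # Rotate into trial fast/slow coordinates.
            fast = north * np.cos(angle) + east * np.sin(angle)
            slow = -north * np.sin(angle) + east * np.cos(angle)
            for shift in range(1, max_shift):
                # Shift the slow trace back and measure waveform similarity.
                cc = np.corrcoef(fast[:-shift], slow[shift:])[0, 1]
                if cc > best[2]:
                    best = (np.degrees(angle), shift * dt, cc)
        return best   # (fast angle in degrees, delay in s, correlation)

    # Synthetic split wave: slow component lags the fast one by 0.2 s,
    # fast axis 30 degrees from north.
    dt, t = 0.01, np.arange(0, 4, 0.01)
    fast = np.exp(-((t - 1.0) ** 2) / 0.02)
    slow = np.exp(-((t - 1.2) ** 2) / 0.02)
    north = fast * np.cos(np.radians(30)) - slow * np.sin(np.radians(30))
    east = fast * np.sin(np.radians(30)) + slow * np.cos(np.radians(30))
    print(splitting_parameters(north, east, dt))   # approx (30.0, 0.2, ~1.0)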

  11. Featureous: infrastructure for feature-centric analysis of object-oriented software

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    The decentralized nature of collaborations between objects in object-oriented software makes it difficult to understand how user-observable program features are implemented and how their implementations relate to each other. It is worthwhile to improve this situation, since feature-centric program understanding and modification are essential during software evolution and maintenance. In this paper, we present an infrastructure built on top of the NetBeans IDE called Featureous that allows for rapid construction of tools for feature-centric analysis of object-oriented software. Our infrastructure...

  12. FEATUREOUS: AN INTEGRATED ENVIRONMENT FOR FEATURE-CENTRIC ANALYSIS AND MODIFICATION OF OBJECT-ORIENTED SOFTWARE

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2011-01-01

    The decentralized nature of collaborations between objects in object-oriented software makes it difficult to understand the implementations of user-observable program features and their respective interdependencies. As feature-centric program understanding and modification are essential during software maintenance and evolution, this situation needs to change. In this paper, we present Featureous, an integrated development environment built on top of the NetBeans IDE that facilitates feature-centric analysis of object-oriented software. Our integrated development environment encompasses a...

  13. Surveillance Analysis Computer System (SACS): Software requirements specification (SRS). Revision 2

    International Nuclear Information System (INIS)

    This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) database, an Impact Level 3Q system. SACS stores information on tank temperatures, surface levels, and interstitial liquid levels. This information is retrieved by the customer through a PC-based interface and is then available to a number of other software tools. The software requirements specification (SRS) describes the system requirements for the SACS Project, and follows the Standard Engineering Practices (WHC-CM-6-1), Software Practices (WHC-CM-3-10) and Quality Assurance (WHC-CM-4-2, QR 19.0) policies

  14. The R software fundamentals of programming and statistical analysis

    CERN Document Server

    Lafaye de Micheaux, Pierre; Liquet, Benoit

    2013-01-01

    The contents of The R Software are presented so as to be both comprehensive and easy for the reader to use. Besides its application as a self-learning text, this book can support lectures on R at any level from beginner to advanced. This book can serve as a textbook on R for beginners as well as more advanced users, working on Windows, macOS or Linux OSes. The first part of the book deals with the heart of the R language and its fundamental concepts, including data organization, import and export, various manipulations, documentation, plots, programming and maintenance. The last chapter in this part deals with object-oriented programming as well as interfacing R with C/C++ or Fortran, and contains a section on debugging techniques. This is followed by the second part of the book, which provides detailed explanations on how to perform many standard statistical analyses, mainly in the Biostatistics field. Topics from mathematical and statistical settings that are included are matrix operations, integration, o...

  15. Development of high performance casting analysis software by coupled parallel computation

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Up to now, much casting analysis software has been developed to provide new access to real casting processes. This includes melt flow analysis, heat transfer analysis for solidification calculation, mechanical property prediction and microstructure prediction. These trials were successful in obtaining results close to real situations, so CAE technologies became indispensable for designing or developing new casting processes. But in manufacturing fields, CAE technologies are not used so frequently because of the difficulty of using the software or insufficient computing performance. To introduce CAE technologies to the manufacturing field, high performance analysis is essential to shorten the gap between product design time and prototyping time. Software code optimization can be helpful, but it is not enough, because the codes developed by software experts are already optimized enough. As an alternative proposal for high performance computation, parallel computation technologies are eagerly being applied to CAE technologies to shorten analysis time. In this research, SMP (Shared Memory Processing) and MPI (Message Passing Interface) (1) methods for parallelization were applied to the commercial software "Z-Cast" to calculate casting processes. In the code parallelization process, network stabilization and core optimization were also carried out under the Microsoft Windows platform, and their performance and results were compared with those of normal linear analysis codes.
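
    The parallelization idea can be illustrated independently of Z-Cast with a domain decomposition of a simple solidification-style heat diffusion step, with Python's multiprocessing standing in for SMP threads. The grid, material constants and worker count below are illustrative assumptions.

    # Parallel explicit heat-diffusion step by domain decomposition (sketch).
    import numpy as np
    from multiprocessing import Pool

    ALPHA, DT, DX = 1e-5, 0.01, 0.01   # diffusivity, time step, spacing (assumed)

    def step_chunk(chunk):
        # Each chunk carries one ghost cell per side; return the interior update.
        return chunk[1:-1] + ALPHA * DT / DX**2 * (chunk[2:] - 2*chunk[1:-1] + chunk[:-2])

    def parallel_step(temp, n_workers=4):
        bounds = np.array_split(np.arange(1, temp.size - 1), n_workers)
        chunks = [temp[b[0] - 1 : b[-1] + 2] for b in bounds]
        with Pool(n_workers) as pool:
            parts = pool.map(step_chunk, chunks)
        new = temp.copy()
        new[1:-1] = np.concatenate(parts)
        return new

    if __name__ == "__main__":
        temp = np.full(1000, 1600.0)   # molten metal temperature, degrees C
        temp[0] = temp[-1] = 20.0      # mold walls held cold
        for _ in range(100):
            temp = parallel_step(temp)
        print(temp[::200])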

  16. Flexible Global Software Development (GSD): Antecedents of Success in Requirements Analysis

    OpenAIRE

    Vanita Yadav; Monica Adya; Varadharajan Sridhar; Dhruv Nath

    2009-01-01

    Globalization of software development has resulted in a rapid shift away from the traditional collocated, on-site development model, to the offshoring model. Emerging trends indicate an increasing interest in offshoring even in early phases like requirements analysis. Additionally, the flexibility offered by the agile development approach makes it attractive for adaptation in globally distributed software work. A question of significance then is what impacts the success of offshoring earlier ...

  17. Software Requirement Analysis Enhancements by Prioritizing Requirement Attributes Using Rank Based Agents

    OpenAIRE

    Ashok Kumar; Vinay Goyal

    2011-01-01

    This paper proposes a new technique in the domain of agent-oriented software engineering. Agents work in autonomous environments and can respond to agent triggers. Agents can be very useful in the requirement analysis phase of the software development process, where they can react to requirement triggers and produce aligned notations to identify the best possible design solution from existing designs. Agents help in the design generation process, which includes the use of Artificial intelligen...

  18. Spectral graph theory analysis of software-defined networks to improve performance and security

    OpenAIRE

    Parker, Thomas C.

    2015-01-01

    Software-defined networks are revolutionizing networking by providing unprecedented visibility into and control over data communication networks. The focus of this work is to develop a method to extract network features, develop a closed-loop control framework for a software-defined network, and build a test bed to validate the proposed scheme. The method developed to extract the network features is called the dual-basis analysis, which is based on the eigendecomposition of a weighted graph t...
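
    While the thesis's dual-basis method is not reproduced here, the underlying operation, eigendecomposition of a weighted network graph, can be sketched in a few lines; the example topology and link weights are made up.

    # Laplacian eigendecomposition of a weighted network graph (illustrative sketch).
    import numpy as np

    # Weighted adjacency matrix for a small 6-node network (made-up weights):
    # two well-connected clusters joined by two weak links.
    W = np.array([
        [0, 5, 4, 0, 0, 0],
        [5, 0, 6, 1, 0, 0],
        [4, 6, 0, 0, 1, 0],
        [0, 1, 0, 0, 7, 5],
        [0, 0, 1, 7, 0, 6],
        [0, 0, 0, 5, 6, 0],
    ], dtype=float)

    L = np.diag(W.sum(axis=1)) - W      # graph Laplacian
    vals, vecs = np.linalg.eigh(L)      # eigendecomposition (L is symmetric)
    fiedler = vecs[:, 1]                # eigenvector of 2nd-smallest eigenvalue
    print("algebraic connectivity:", round(vals[1], 3))
    print("partition:", (fiedler > 0).astype(int))   # separates the two clusters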

  19. Software Vulnerability Analysis

    Institute of Scientific and Technical Information of China (English)

    李新明; 李艺; 徐晓梅; 韩存兵

    2003-01-01

    Software vulnerability is the root reason that causes computer system security problems. Analyzing vulnerability based on the essence of software vulnerability is a new research topic. This paper analyzes the main definitions and taxonomies of vulnerability, studies vulnerability databases and tools for vulnerability analysis and detection, and gives details about what causes the most common vulnerabilities in the LINUX/UNIX operating systems.

  20. Automated Source Code Analysis to Identify and Remove Software Security Vulnerabilities: Case Studies on Java Programs

    OpenAIRE

    2013-01-01

    The high-level contribution of this paper is to illustrate the development of generic solution strategies to remove software security vulnerabilities that could be identified using automated tools for source code analysis on software programs (developed in Java). We use the Source Code Analyzer and Audit Workbench automated tools, developed by HP Fortify Inc., for our testing purposes. We present case studies involving a file writer program embedded with features for password validation, and ...
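
    In the same spirit, though far simpler than a commercial analyzer such as Fortify, and in Python rather than Java, the following minimal sketch shows the kind of pattern an automated source code analysis can flag: a comparison of a password-like variable against a string literal. Both the rule and the scanned snippet are illustrative assumptions.

    # Tiny AST-based scan for hard-coded credential comparisons (illustrative sketch).
    import ast

    SOURCE = '''
    def login(user, password):
        if password == "hunter2":
            return True
        return False
    '''

    class HardcodedSecretVisitor(ast.NodeVisitor):
        def visit_Compare(self, node):
            # Collect variable names and string literals inside the comparison.
            names = [n.id.lower() for n in ast.walk(node) if isinstance(n, ast.Name)]
            literals = [c for c in ast.walk(node)
                        if isinstance(c, ast.Constant) and isinstance(c.value, str)]
            if literals and any("password" in n or "secret" in n for n in names):
                print(f"line {node.lineno}: possible hard-coded credential")
            self.generic_visit(node)

    import textwrap
    HardcodedSecretVisitor().visit(ast.parse(textwrap.dedent(SOURCE)))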