WorldWideScience

Sample records for analysis software suite

  1. The ESA's Space Trajectory Analysis software suite

    Science.gov (United States)

    Ortega, Guillermo

    The European Space Agency (ESA) initiated in 2005 an internal activity to develop an open source software suite involving university science departments and research institutions all over the world. This project is called "Space Trajectory Analysis", or STA. This article describes the birth of STA and its present configuration. One of STA's aims is to promote the exchange of technical ideas and to raise knowledge and competence in the areas of applied mathematics, space engineering, and informatics at the university level. Conceived as a research and education tool to support the analysis phase of a space mission, STA is able to visualize a wide range of space trajectories, including ascent, re-entry, descent and landing trajectories, orbits around planets and moons, interplanetary trajectories, and rendezvous trajectories. The article explains that the STA project is an original idea of the Technical Directorate of ESA. It was born in August 2005 to provide a framework for astrodynamics research at the university level. As research and education software applicable to academia, a number of universities support this development by joining ESA in leading it. The ESA-university partnership is expressed in the STA Steering Board: together with ESA, each university has a chair on the board whose tasks are to develop, control, promote, maintain, and expand the software suite. The article describes that STA provides calculations in the fields of spacecraft tracking, attitude analysis, coverage and visibility analysis, orbit determination, and the position and velocity of solar system bodies. STA implements the concept of a "space scenario" composed of solar system bodies, spacecraft, ground stations, pads, etc. It is able to propagate the orbit of a spacecraft using the orbital propagators included in the suite. STA is able to compute communication links between objects of a scenario (coverage, line of sight), and to represent the trajectory computations and
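
    Illustrative sketch: the abstract notes that STA embeds orbital propagators to propagate spacecraft orbits. The minimal two-body propagator below, using a fixed-step RK4 integrator, is a hedged illustration of what such a propagator does; it is not STA code (STA is a Qt/C++ application) and all names and values are invented.

    ```python
    # Illustrative sketch only: a minimal two-body propagator of the kind a
    # trajectory-analysis suite embeds (names and values are hypothetical).
    import numpy as np

    MU_EARTH = 398600.4418  # km^3/s^2, Earth's gravitational parameter

    def two_body_rates(state):
        """Time derivative of [x, y, z, vx, vy, vz] (km, km/s) for the two-body problem."""
        r = state[:3]
        accel = -MU_EARTH * r / np.linalg.norm(r) ** 3
        return np.concatenate((state[3:], accel))

    def propagate(state0, dt, n_steps):
        """Fixed-step RK4 propagation; returns the state history."""
        states = [np.asarray(state0, dtype=float)]
        for _ in range(n_steps):
            s = states[-1]
            k1 = two_body_rates(s)
            k2 = two_body_rates(s + 0.5 * dt * k1)
            k3 = two_body_rates(s + 0.5 * dt * k2)
            k4 = two_body_rates(s + dt * k3)
            states.append(s + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4))
        return np.array(states)

    # Example: roughly one revolution of a ~500 km circular low Earth orbit
    orbit = propagate([6878.0, 0.0, 0.0, 0.0, 7.6127, 0.0], dt=10.0, n_steps=570)
    print(orbit[-1, :3])  # position (km) after ~95 minutes
    ```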

  2. Orbit Software Suite

    Science.gov (United States)

    Osgood, Cathy; Williams, Kevin; Gentry, Philip; Brownfield, Dana; Hallstrom, John; Stuit, Tim

    2012-01-01

    Orbit Software Suite is used to support a variety of NASA/DM mission planning and analysis activities on the IPS platform. The suite of Orbit software tools (Orbit Design and Orbit Dynamics) resides on IPS/Linux workstations and is used to perform mission design and analysis tasks corresponding to trajectory/launch window, rendezvous, and proximity operations flight segments. A list of tools in Orbit Software Suite represents tool versions established during or after the Equipment Rehost-3 Project.

  3. eXtended CASA Line Analysis Software Suite (XCLASS)

    CERN Document Server

    Möller, T; Schilke, P

    2015-01-01

    The eXtended CASA Line Analysis Software Suite (XCLASS) is a toolbox for the Common Astronomy Software Applications package (CASA) containing new functions for modeling interferometric and single-dish data. Among the tools is the myXCLASS program, which calculates synthetic spectra by solving the radiative transfer equation for an isothermal object in one dimension, taking finite source size and dust attenuation into account as well. Molecular data required by the myXCLASS program are taken from an embedded SQLite3 database containing entries from the Cologne Database for Molecular Spectroscopy (CDMS) and JPL, accessed via the Virtual Atomic and Molecular Data Center (VAMDC) portal. Additionally, the toolbox provides an interface to the model optimizer package Modeling and Analysis Generic Interface for eXternal numerical codes (MAGIX), which helps to find the best description of observational data using myXCLASS (or another external model program), i.e., finding the parameter set that most closely reproduces t...
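
    Hedged sketch: myXCLASS solves the radiative transfer equation for an isothermal object in one dimension. The snippet below illustrates that computation for a single line, assuming the standard isothermal expression T_B = [J(T_ex) - J(T_bg)](1 - exp(-tau)) with a Gaussian opacity profile; the parameters are invented and no CDMS/JPL data or dust terms are included, so this is not the myXCLASS implementation.

    ```python
    # Minimal sketch of an isothermal, one-dimensional radiative transfer
    # calculation for a single line with a Gaussian opacity profile
    # (hypothetical parameters; no dust term, not the myXCLASS implementation).
    import numpy as np

    H = 6.62607015e-34   # Planck constant, J s
    K_B = 1.380649e-23   # Boltzmann constant, J/K

    def j_nu(temp, freq):
        """Planck-equivalent brightness temperature J_nu(T) in K."""
        x = H * freq / K_B
        return x / (np.exp(x / temp) - 1.0)

    def synthetic_line(freq, freq0, t_ex, t_bg, tau0, fwhm, filling=1.0):
        """Brightness temperature of one isothermal line against the background."""
        sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
        tau = tau0 * np.exp(-0.5 * ((freq - freq0) / sigma) ** 2)
        return filling * (j_nu(t_ex, freq) - j_nu(t_bg, freq)) * (1.0 - np.exp(-tau))

    freq = np.linspace(230.530e9, 230.546e9, 400)            # Hz, around CO(2-1)
    spectrum = synthetic_line(freq, freq0=230.538e9, t_ex=50.0, t_bg=2.73,
                              tau0=1.5, fwhm=2.0e6)
    print(spectrum.max())   # peak brightness temperature, K
    ```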

  4. Design and Implementation of Convex Analysis of Mixtures Software Suite

    OpenAIRE

    Meng, Fan

    2012-01-01

    Various convex analysis of mixtures (CAM) based algorithms have been developed to address real-world blind source separation (BSS) problems and have been shown to perform well in previous papers. This thesis reports the implementation of a comprehensive software package, CAM-Java, which contains three different CAM-based algorithms: CAM compartment modeling (CAM-CM), CAM non-negative independent component analysis (CAM-nICA), and CAM non-negative well-grounded component analysis (CAM-nWCA). The imp...

  5. Navigation/Prop Software Suite

    Science.gov (United States)

    Bruchmiller, Tomas; Tran, Sanh; Lee, Mathew; Bucker, Scott; Bupane, Catherine; Bennett, Charles; Cantu, Sergio; Kwong, Ping; Propst, Carolyn

    2012-01-01

    Navigation (Nav)/Prop software is used to support shuttle mission analysis, production, and some operations tasks. The Nav/Prop suite containing configuration items (CIs) resides on IPS/Linux workstations. It includes lifecycle documents and data files used for shuttle navigation and propellant analysis for all flight segments. The suite also includes trajectory server, archive server, and RAT software residing on MCC/Linux workstations. Navigation/Prop represents tool versions established during or after IPS Equipment Rehost-3 or after the MCC Rehost.

  6. A Comprehensive Software Suite for the Analysis of cDNAs

    Institute of Scientific and Technical Information of China (English)

    Kazuharu Arakawa; Haruo Suzuki; Kosuke Fujishima; Kenji Fujimoto; Sho Ueda; Motomu Matsui; Masaru Tomita

    2005-01-01

    We have developed a comprehensive software suite for bioinformatics research on cDNAs; it is aimed at rapid characterization of the features of genes and the proteins they encode. Methods implemented include the detection of translation initiation and termination signals, statistical analysis of codon usage, comparative study of amino acid composition, comparative modeling of the structures of product proteins, prediction of alternative splice forms, and metabolic pathway reconstruction. The software package is freely available under the GNU General Public License at http://www.g-language.org/data/cdna/.
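
    Illustration: one of the listed methods, statistical analysis of codon usage, reduces to counting codon frequencies over an in-frame coding sequence. The sketch below shows only that counting step; it is not the G-language implementation and the example sequence is made up.

    ```python
    # Minimal sketch of codon-usage counting, one of the analyses such a suite
    # provides (illustrative only; not the G-language implementation).
    from collections import Counter

    def codon_usage(cds):
        """Relative codon frequencies for an in-frame coding sequence."""
        cds = cds.upper()
        codons = [cds[i:i + 3] for i in range(0, len(cds) - len(cds) % 3, 3)]
        counts = Counter(codons)
        total = sum(counts.values())
        return {codon: n / total for codon, n in sorted(counts.items())}

    # Hypothetical short CDS, for illustration only
    print(codon_usage("ATGGCTGCAGCTAAAGCGTAA"))
    ```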

  7. IMAGE Software Suite

    Science.gov (United States)

    Gallagher, Dennis L.; Rose, M. Franklin (Technical Monitor)

    2000-01-01

    The IMAGE Mission is generating a truly unique set of magnetospheric measurements through a first-of-its-kind complement of remote, global observations. These data are being distributed in the Universal Data Format (UDF), which consists of data, calibration, and documentation. This is an open dataset, available to all by request to the National Space Science Data Center (NSSDC) at NASA Goddard Space Flight Center. Browse data, consisting of summary observations, are also available through the NSSDC in the Common Data Format (CDF), along with graphic representations of the browse data. Access to the browse data can be achieved through the NSSDC CDAWeb services or by use of NSSDC-provided software tools. This presentation documents the software tools, provided by the IMAGE team, for use in viewing and analyzing the UDF telemetry data. Like the IMAGE data, these tools are openly available. What these tools can do, how they can be obtained, and how they are expected to evolve will be discussed.

  8. Developing a Comprehensive Software Suite for Advanced Reactor Performance and Safety Analysis

    International Nuclear Information System (INIS)

    This paper provides an introduction to the reactor analysis capabilities of the nuclear power reactor simulation tools that are being developed as part of the US Department of Energy’s Nuclear Energy Advanced Modeling and Simulation (NEAMS) Toolkit. The NEAMS Toolkit is an integrated suite of multiphysics simulation tools that leverage high performance computing to reduce uncertainty in the prediction of the performance and safety of advanced reactor and fuel designs. The toolkit effort is composed of two major components, the fuels product line, which provides tools for fuel performance analysis, and the reactor product line, which provides tools for reactor performance and safety analysis. This paper presents an overview of the NEAMS reactor product line development effort. (author)

  9. HDBStat!: A platform-independent software suite for statistical analysis of high dimensional biology data

    Directory of Open Access Journals (Sweden)

    Brand Jacob PL

    2005-04-01

    Background: Many efforts in microarray data analysis are focused on providing tools and methods for the qualitative analysis of microarray data. HDBStat! (High-Dimensional Biology-Statistics) is a software package designed for the analysis of high-dimensional biology data such as microarray data. It was initially developed for the analysis of microarray gene expression data, but it can also be used for some applications in proteomics and other aspects of genomics. HDBStat! provides statisticians and biologists a flexible and easy-to-use interface to analyze complex microarray data using a variety of methods for data preprocessing, quality control analysis and hypothesis testing. Results: Results generated from data preprocessing methods, quality control analysis and hypothesis testing methods are output in the form of Excel CSV tables, graphs and an HTML report summarizing the data analysis. Conclusion: HDBStat! is platform-independent software that is freely available to academic institutions and non-profit organizations. It can be downloaded from our website http://www.soph.uab.edu/ssg_content.asp?id=1164.
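
    Hedged sketch of the kind of hypothesis-testing step such a package performs: a per-feature two-group t-test followed by Benjamini-Hochberg correction, written out as a CSV table. The data, column names and output file are invented and this is not HDBStat!'s actual code or output format.

    ```python
    # Hedged sketch: per-gene two-group t-tests plus Benjamini-Hochberg FDR,
    # written to a CSV table (toy data; invented column names, not HDBStat! output).
    import csv
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    expr = rng.normal(size=(500, 10))      # 500 genes x 10 arrays (toy data)
    expr[:25, 5:] += 1.5                   # spike in 25 differential genes
    groups = np.array([0] * 5 + [1] * 5)

    t, p = stats.ttest_ind(expr[:, groups == 0], expr[:, groups == 1], axis=1)

    # Benjamini-Hochberg adjusted p-values (step-up procedure)
    order = np.argsort(p)
    scaled = p[order] * len(p) / (np.arange(len(p)) + 1)
    adjusted = np.minimum.accumulate(scaled[::-1])[::-1]
    q = np.empty_like(p)
    q[order] = np.clip(adjusted, 0.0, 1.0)

    with open("ttest_results.csv", "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["gene", "t", "p", "q"])
        for i in range(len(p)):
            writer.writerow([f"g{i}", t[i], p[i], q[i]])
    ```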

  10. The Software Architecture of the Upgraded ESA DRAMA Software Suite

    Science.gov (United States)

    Kebschull, Christopher; Flegel, Sven; Gelhaus, Johannes; Mockel, Marek; Braun, Vitali; Radtke, Jonas; Wiedemann, Carsten; Vorsmann, Peter; Sanchez-Ortiz, Noelia; Krag, Holger

    2013-08-01

    In the beginnings of man's space flight activities there was the belief that space is so big that everybody could use it without any repercussions. However, during the last six decades the increasing use of Earth's orbits has led to a rapid growth in the space debris environment, which has a big influence on current and future space missions. For this reason ESA issued the "Requirements on Space Debris Mitigation for ESA Projects" [1] in 2008, which apply to all ESA missions henceforth. The DRAMA (Debris Risk Assessment and Mitigation Analysis) software suite had been developed to support the planning of space missions to comply with these requirements. During the last year the DRAMA software suite has been upgraded under ESA contract by TUBS and DEIMOS to include additional tools and increase the performance of existing ones. This paper describes the overall software architecture of the ESA DRAMA software suite. Specifically, the new graphical user interface, which manages the five main tools ARES (Assessment of Risk Event Statistics), MIDAS (MASTER-based Impact Flux and Damage Assessment Software), OSCAR (Orbital Spacecraft Active Removal), CROC (Cross Section of Complex Bodies) and SARA (Re-entry Survival and Risk Analysis), is discussed. The advancements are highlighted, as well as the challenges that arise from the integration of the five tool interfaces. A framework had been developed at the ILR and was used for MASTER-2009 and PROOF-2009. The Java-based GUI framework enables cross-platform deployment, and its underlying model-view-presenter (MVP) software pattern meets strict design requirements necessary to ensure a robust and reliable method of operation in an environment where the GUI is separated from the processing back-end. While the GUI framework evolved with each project, allowing an increasing degree of integration of services like validators for input fields, it has also increased in complexity. The paper will conclude with an outlook on

  11. ORBS, ORCS, OACS, a Software Suite for Data Reduction and Analysis of the Hyperspectral Imagers SITELLE and SpIOMM

    Science.gov (United States)

    Martin, T.; Drissen, L.; Joncas, G.

    2015-09-01

    SITELLE (installed in 2015 at the Canada-France-Hawaii Telescope) and SpIOMM (a prototype attached to the Observatoire du Mont-Mégantic) are the first Imaging Fourier Transform Spectrometers (IFTS) capable of obtaining a hyperspectral data cube which samples a 12-arcminute field of view into four million visible spectra. The result of each observation is made up of two interferometric data cubes which need to be merged, corrected, transformed and calibrated in order to get a spectral cube of the observed region ready to be analysed. ORBS is a fully automatic data reduction software that has been entirely designed for this purpose. The data size (up to 68 GB for the larger science cases) and the computational needs have been challenging, and the highly parallelized object-oriented architecture of ORBS reflects the solutions adopted, which made it possible to process 68 GB of raw data in less than 11 hours using 8 cores and 22.6 GB of RAM. It is based on a core framework (ORB) that has been designed to support the whole software suite for data analysis (ORCS and OACS), data simulation (ORUS) and data acquisition (IRIS). They all aim to provide a strong basis for the creation and development of specialized analysis modules that could benefit the scientific community working with SITELLE and SpIOMM.
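
    Toy illustration: the core of an IFTS reduction pipeline is turning each pixel's interferogram into a spectrum with a Fourier transform. The sketch below shows only that step on a simulated single-line interferogram; ORBS itself additionally performs merging, phase correction, apodization and calibration, and all values here are invented.

    ```python
    # Toy sketch of the central IFTS reduction step: Fourier-transforming one
    # pixel's interferogram into a spectrum (invented values; ORBS adds merging,
    # phase correction, apodization and calibration on top of this).
    import numpy as np

    n_steps = 512
    step_opd = 2.5e-7                        # optical path difference step, m
    opd = np.arange(n_steps) * step_opd

    # Simulated interferogram of a single emission line at 15000 cm^-1 (~667 nm)
    line_wavenumber = 1.5e6                  # m^-1
    interferogram = 1.0 + 0.5 * np.cos(2.0 * np.pi * line_wavenumber * opd)

    spectrum = np.abs(np.fft.rfft(interferogram - interferogram.mean()))
    wavenumbers = np.fft.rfftfreq(n_steps, d=step_opd)   # m^-1

    print(wavenumbers[np.argmax(spectrum)])  # recovered line position, ~1.5e6 m^-1
    ```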

  12. Analysis of Array-CGH Data Using the R and Bioconductor Software Suite

    Directory of Open Access Journals (Sweden)

    Winfried A. Hofmann

    2009-01-01

    Full Text Available Background. Array-based comparative genomic hybridization (array-CGH is an emerging high-resolution and high-throughput molecular genetic technique that allows genome-wide screening for chromosome alterations. DNA copy number alterations (CNAs are a hallmark of somatic mutations in tumor genomes and congenital abnormalities that lead to diseases such as mental retardation. However, accurate identification of amplified or deleted regions requires a sequence of different computational analysis steps of the microarray data. Results. We have developed a user-friendly and versatile tool for the normalization, visualization, breakpoint detection, and comparative analysis of array-CGH data which allows the accurate and sensitive detection of CNAs. Conclusion. The implemented option for the determination of minimal altered regions (MARs from a series of tumor samples is a step forward in the identification of new tumor suppressor genes or oncogenes.

  13. CASS—CFEL-ASG software suite

    Science.gov (United States)

    Foucar, Lutz; Barty, Anton; Coppola, Nicola; Hartmann, Robert; Holl, Peter; Hoppe, Uwe; Kassemeyer, Stephan; Kimmel, Nils; Küpper, Jochen; Scholz, Mirko; Techert, Simone; White, Thomas A.; Strüder, Lothar; Ullrich, Joachim

    2012-10-01

    The Max Planck Advanced Study Group (ASG) at the Center for Free Electron Laser Science (CFEL) has created the CFEL-ASG Software Suite CASS to view, process and analyse multi-parameter experimental data acquired at Free Electron Lasers (FELs) using the CFEL-ASG Multi Purpose (CAMP) instrument Strüder et al. (2010) [6]. The software is based on a modular design so that it can be adjusted to accommodate the needs of all the various experiments that are conducted with the CAMP instrument. In fact, this allows the use of the software in all experiments where multiple detectors are involved. One of the key aspects of CASS is that it can be used either 'on-line', using a live data stream from the free-electron laser facility's data acquisition system to guide the experiment, or 'off-line', on data acquired from a previous experiment which has been saved to file. Program summary Program title: CASS Catalogue identifier: AEMP_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEMP_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public Licence, version 3 No. of lines in distributed program, including test data, etc.: 167073 No. of bytes in distributed program, including test data, etc.: 1065056 Distribution format: tar.gz Programming language: C++. Computer: Intel x86-64. Operating system: GNU/Linux (for information about restrictions see outlook). RAM: >8 GB Classification: 2.3, 3, 15, 16.4. External routines: Qt-Framework[1], SOAP[2], (optional HDF5[3], VIGRA[4], ROOT[5], QWT[6]) Nature of problem: Analysis and visualisation of scientific data acquired at Free-Electron Lasers Solution method: Generalise data access and storage so that a variety of small programming pieces can be linked to form a complex analysis chain. Unusual features: Complex analysis chains can be built without recompiling the program Additional comments: An updated extensive documentation of CASS is available
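
    Rough sketch of the stated solution method ("generalise data access and storage so that a variety of small programming pieces can be linked to form a complex analysis chain"): small processors registered by name and chained at run time, so the chain can change without recompilation. CASS itself is C++/Qt; the class and function names below are invented.

    ```python
    # Rough sketch of a run-time-configurable analysis chain in the spirit of the
    # design described above (CASS itself is C++/Qt; names here are invented).
    from typing import Callable, Dict, List

    Processor = Callable[[dict], dict]

    def subtract_baseline(event: dict) -> dict:
        event["trace"] = [v - min(event["trace"]) for v in event["trace"]]
        return event

    def integrate_trace(event: dict) -> dict:
        event["pulse_area"] = sum(event["trace"])
        return event

    REGISTRY: Dict[str, Processor] = {
        "baseline": subtract_baseline,
        "integral": integrate_trace,
    }

    def run_chain(event: dict, chain: List[str]) -> dict:
        for name in chain:          # the chain would come from a config, not code
            event = REGISTRY[name](event)
        return event

    print(run_chain({"trace": [3, 4, 9, 5, 3]}, ["baseline", "integral"]))
    ```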

  14. The BTeV Software Tutorial Suite

    Energy Technology Data Exchange (ETDEWEB)

    Robert K. Kutschke

    2004-02-20

    The BTeV Collaboration is starting to develop its C++ based offline software suite, an integral part of which is a series of tutorials. These tutorials are targeted at a diverse audience, including new graduate students, experienced physicists with little or no C++ experience, those with just enough C++ to be dangerous, and experts who need only an overview of the available tools. The tutorials must teach both C++ in general and the BTeV-specific tools in particular. Finally, they must teach physicists how to find and use the detailed documentation. This report will review the status of the BTeV experiment, give an overview of the plans for and the state of the software, and will then describe the plans for the tutorial suite.

  15. Spinal Test Suites for Software Product Lines

    OpenAIRE

    Beohar, Harsh; Mousavi, MR Mohammad Reza

    2014-01-01

    A major challenge in testing software product lines is efficiency. In particular, testing a product line should take less effort than testing each and every product individually. We address this issue in the context of input-output conformance testing, which is a formal theory of model-based testing. We extend the notion of conformance testing on input-output featured transition systems with the novel concept of spinal test suites. We show how this concept dispenses with retesting the common ...

  16. A metrics suite for coupling measurement of software architecture

    Institute of Scientific and Technical Information of China (English)

    KONG Qing-yan; LUN Li-jun; ZHAO Jia-hua; WANG Yi-he

    2009-01-01

    To better evaluate the quality of software architecture, a metrics suite is proposed to measure the coupling of software architecture models, in which CBC is used to measure the coupling between components, CBCC is used to measure the coupling of transferring messages between components, CBCCT is used to measure the coupling of the software architecture, WCBCC is used to measure the coupling of transferring messages with weight between components, and WCBCCT is used to measure the coupling of message transmission with weight in the whole software architecture. The proposed algorithm for the coupling metrics is applied to the design of server software architecture. Analysis of an example validates the feasibility of this metrics suite.
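
    Illustrative sketch only: the abstract names the metrics but not their formulas, so the toy computation below uses hypothetical definitions (CBC as the number of connectors between a component pair, WCBCCT as the weighted sum of all message connectors) purely to convey the idea of coupling metrics over a component-and-connector model.

    ```python
    # Illustrative only: hypothetical CBC/WCBCCT-style computations over a toy
    # component-and-connector model (the paper's exact definitions are not given
    # in the abstract).
    from itertools import combinations

    # connectors: (source component, target component, message weight)
    connectors = [
        ("UI", "Service", 2.0),
        ("Service", "Storage", 1.0),
        ("UI", "Storage", 0.5),
        ("Service", "Storage", 1.0),
    ]

    def cbc(a, b):
        """Coupling between two components: number of connectors linking them."""
        return sum(1 for s, t, _ in connectors if {s, t} == {a, b})

    def wcbcct():
        """Weighted coupling over the whole architecture: sum of message weights."""
        return sum(w for _, _, w in connectors)

    components = {"UI", "Service", "Storage"}
    for a, b in combinations(sorted(components), 2):
        print(f"CBC({a}, {b}) = {cbc(a, b)}")
    print("WCBCCT =", wcbcct())
    ```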

  17. Engineering Software Suite Validates System Design

    Science.gov (United States)

    2007-01-01

    EDAptive Computing Inc.'s (ECI) EDAstar engineering software tool suite, created to capture and validate system design requirements, was significantly funded by NASA's Ames Research Center through five Small Business Innovation Research (SBIR) contracts. These programs specifically developed Syscape, used to capture executable specifications of multi-disciplinary systems, and VectorGen, used to automatically generate tests to ensure system implementations meet specifications. According to the company, the VectorGen tests considerably reduce the time and effort required to validate implementation of components, thereby ensuring their safe and reliable operation. EDASHIELD, an additional product offering from ECI, can be used to diagnose, predict, and correct errors after a system has been deployed using EDAstar-created models. Initial commercialization for EDAstar included application by a large prime contractor in a military setting, and customers include various branches within the U.S. Department of Defense, industry giants like the Lockheed Martin Corporation, Science Applications International Corporation, and Ball Aerospace and Technologies Corporation, as well as NASA's Langley and Glenn Research Centers.

  18. Strengthening Software Authentication with the ROSE Software Suite

    International Nuclear Information System (INIS)

    Many recent nonproliferation and arms control software projects include a software authentication regime. These include U.S. Government-sponsored projects both in the United States and in the Russian Federation (RF). This trend toward requiring software authentication is only accelerating. Demonstrating assurance that software performs as expected without hidden "backdoors" is crucial to a project's success. In this context, "authentication" is defined as determining that a software package performs only its intended purpose and performs said purpose correctly and reliably over the planned duration of an agreement. In addition to visual inspections by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and not extensible. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool has to be based on a complete language compiler. ROSE is precisely such a compiler infrastructure developed within the Department of Energy (DOE) and targeted at the optimization of scientific applications and user-defined libraries within large-scale applications (typically applications of a million lines of code). ROSE is a robust, source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C and C++ (handling the full C, C99, C++ languages and with current collaborations to support Fortran90). We propose to extend ROSE to address a number of security-specific requirements, and apply it to software authentication for nonproliferation and arms control projects

  19. CAMEO (Computer-Aided Management of Emergency Operations) Software Suite

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — CAMEO is the umbrella name for a system of software applications used widely to plan for and respond to chemical emergencies. All of the programs in the suite work...

  20. BioImage Suite: An integrated medical image analysis suite: An update

    OpenAIRE

    Papademetris, Xenophon; Jackowski, Marcel P; Rajeevan, Nallakkandi; DiStasio, Marcello; Okuda, Hirohito; Constable, R. Todd; Staib, Lawrence H.

    2006-01-01

    BioImage Suite is an NIH-supported medical image analysis software suite developed at Yale. It leverages both the Visualization Toolkit (VTK) and the Insight Toolkit (ITK), and it includes many additional algorithms for image analysis, especially in the areas of segmentation, registration, diffusion-weighted image processing and fMRI analysis. BioImage Suite has a user-friendly interface developed in the Tcl scripting language. A final beta version is freely available for download.

  1. Extending and Enhancing SAS (Static Analysis Suite)

    CERN Document Server

    Ho, David

    2016-01-01

    The Static Analysis Suite (SAS) is an open-source software package used to perform static analysis on C and C++ code, helping to ensure safety, readability and maintainability. In this Summer Student project, SAS was enhanced to improve ease of use and user customisation. A straightforward method of integrating static analysis into a project at compilation time was provided using the automated build tool CMake. The process of adding checkers to the suite was streamlined and simplified by developing an automatic code generator. To make SAS more suitable for continuous integration, a reporting mechanism summarising results was added. This suitability has been demonstrated by the inclusion of SAS in the Future Circular Collider Software nightly build system. Scalability of the improved package was demonstrated by using the tool to analyse the ROOT code base.

  2. Approaches in highly parameterized inversion—PEST++ Version 3, a Parameter ESTimation and uncertainty analysis software suite optimized for large environmental models

    Science.gov (United States)

    Welter, David E.; White, Jeremy T.; Hunt, Randall J.; Doherty, John E.

    2015-09-18

    The PEST++ Version 1 object-oriented parameter estimation code is here extended to Version 3 to incorporate additional algorithms and tools to further improve support for large and complex environmental modeling problems. PEST++ Version 3 includes the Gauss-Marquardt-Levenberg (GML) algorithm for nonlinear parameter estimation, Tikhonov regularization, integrated linear-based uncertainty quantification, options of integrated TCP/IP based parallel run management or external independent run management by use of a Version 2 update of the GENIE Version 1 software code, and utilities for global sensitivity analyses. The Version 3 code design is consistent with PEST++ Version 1 and continues to be designed to lower the barriers of entry for users as well as developers while providing efficient and optimized algorithms capable of accommodating large, highly parameterized inverse problems. As such, this effort continues the original focus of (1) implementing the most popular and powerful features of the PEST software suite in a fashion that is easy for novice or experienced modelers to use and (2) developing a software framework that is easy to extend.
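
    Hedged sketch: the Gauss-Marquardt-Levenberg step at the heart of such a code solves a damped, weighted normal-equations system for the parameter upgrade, and Tikhonov regularization adds a pull toward preferred parameter values. The snippet below shows one upgrade iteration in the textbook formulation; it is not PEST++'s implementation, which adds line search, SVD options and parallel run management.

    ```python
    # Sketch of one Gauss-Marquardt-Levenberg upgrade with simple zeroth-order
    # Tikhonov regularization, in the textbook formulation (not PEST++ itself).
    import numpy as np

    def gml_step(jac, residuals, weights, lam, p_current, p_preferred, mu):
        """
        jac:         (n_obs, n_par) Jacobian of simulated observations
        residuals:   (n_obs,) observed minus simulated values
        weights:     (n_obs,) observation weights
        lam:         Marquardt lambda (damping)
        p_preferred: preferred parameter values for Tikhonov regularization
        mu:          regularization weight
        """
        q = np.diag(weights)
        jtqj = jac.T @ q @ jac
        lhs = jtqj + lam * np.diag(np.diag(jtqj)) + mu * np.eye(jac.shape[1])
        rhs = jac.T @ q @ residuals + mu * (p_preferred - p_current)
        return np.linalg.solve(lhs, rhs)     # parameter upgrade vector

    # Toy example: 3 observations, 2 parameters (all numbers invented)
    J = np.array([[1.0, 0.5], [0.2, 1.0], [0.8, 0.3]])
    dp = gml_step(J, residuals=np.array([0.3, -0.1, 0.2]),
                  weights=np.array([1.0, 1.0, 2.0]), lam=0.1,
                  p_current=np.array([1.0, 2.0]),
                  p_preferred=np.array([1.0, 1.8]), mu=0.5)
    print(dp)
    ```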

  3. Approaches in highly parameterized inversion—PEST++ Version 3, a Parameter ESTimation and uncertainty analysis software suite optimized for large environmental models

    Science.gov (United States)

    Welter, David E.; White, Jeremy T.; Hunt, Randall J.; Doherty, John E.

    2015-01-01

    The PEST++ Version 1 object-oriented parameter estimation code is here extended to Version 3 to incorporate additional algorithms and tools to further improve support for large and complex environmental modeling problems. PEST++ Version 3 includes the Gauss-Marquardt-Levenberg (GML) algorithm for nonlinear parameter estimation, Tikhonov regularization, integrated linear-based uncertainty quantification, options of integrated TCP/IP based parallel run management or external independent run management by use of a Version 2 update of the GENIE Version 1 software code, and utilities for global sensitivity analyses. The Version 3 code design is consistent with PEST++ Version 1 and continues to be designed to lower the barriers of entry for users as well as developers while providing efficient and optimized algorithms capable of accommodating large, highly parameterized inverse problems. As such, this effort continues the original focus of (1) implementing the most popular and powerful features of the PEST software suite in a fashion that is easy for novice or experienced modelers to use and (2) developing a software framework that is easy to extend.

  4. Recent advances in the CRANK software suite for experimental phasing

    International Nuclear Information System (INIS)

    Recent developments in the CRANK software suite for experimental phasing have led to many more structures being built automatically. For its first release in 2004, CRANK was shown to effectively detect and phase anomalous scatterers from single-wavelength anomalous diffraction data. Since then, CRANK has been significantly improved and many more structures can be built automatically with single- or multiple-wavelength anomalous diffraction or single isomorphous replacement with anomalous scattering data. Here, the new algorithms that have led to these substantial improvements are discussed, and CRANK's performance on over 100 real data sets is shown. The latest version of CRANK is freely available for download at http://www.bfsc.leidenuniv.nl/software/crank/ and from CCP4 (http://www.ccp4.ac.uk/)

  5. User Guide for the STAYSL PNNL Suite of Software Tools

    Energy Technology Data Exchange (ETDEWEB)

    Greenwood, Lawrence R.; Johnson, Christian D.

    2013-02-27

    The STAYSL PNNL software suite provides a set of tools for working with neutron activation rates measured in a nuclear fission reactor, an accelerator-based neutron source, or any neutron field to determine the neutron flux spectrum through a generalized least-squares approach. This process is referred to as neutron spectral adjustment since the preferred approach is to use measured data to adjust neutron spectra provided by neutron physics calculations. The input data consist of the reaction rates based on measured activities, an initial estimate of the neutron flux spectrum, neutron activation cross sections and their associated uncertainties (covariances), and relevant correction factors. The output consists of the adjusted neutron flux spectrum and associated covariance matrix, which is useful for neutron dosimetry and radiation damage calculations.
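
    Hedged sketch of a generalized least-squares spectral adjustment of the kind described: a prior group spectrum and its covariance are updated with measured reaction rates through the activation cross sections. This is the standard linear GLS update; STAYSL PNNL's actual treatment of group structures, cross-section covariances and correction factors is considerably richer, and the numbers below are invented.

    ```python
    # Compact sketch of a linear generalized least-squares spectral adjustment
    # in its standard textbook form (toy numbers; STAYSL PNNL's real treatment
    # of group structures, covariances and corrections is far richer).
    import numpy as np

    def gls_adjust(phi0, cov_phi, xs, rates, cov_rates):
        """
        phi0:      (n_grp,) prior group flux spectrum
        cov_phi:   (n_grp, n_grp) prior flux covariance
        xs:        (n_rx, n_grp) groupwise activation cross sections
        rates:     (n_rx,) measured reaction rates
        cov_rates: (n_rx, n_rx) measurement covariance
        Returns the adjusted spectrum and its covariance.
        """
        predicted = xs @ phi0
        gain = cov_phi @ xs.T @ np.linalg.inv(xs @ cov_phi @ xs.T + cov_rates)
        phi_adj = phi0 + gain @ (rates - predicted)
        cov_adj = cov_phi - gain @ xs @ cov_phi
        return phi_adj, cov_adj

    # Toy 3-group spectrum adjusted with 2 measured reactions
    phi, cov = gls_adjust(phi0=np.array([1.0, 2.0, 0.5]),
                          cov_phi=np.diag([0.04, 0.16, 0.01]),
                          xs=np.array([[0.1, 0.3, 0.9], [0.8, 0.2, 0.05]]),
                          rates=np.array([1.2, 1.1]),
                          cov_rates=np.diag([0.01, 0.01]))
    print(phi)
    ```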

  6. SUIT

    DEFF Research Database (Denmark)

    Algreen-Ussing, Gregers; Wedebrunn, Ola

    2003-01-01

    A leaflet on the SUIT project published by the European Commission. The leaflet explains in brief the results of the SUIT project: cultural values in environmental matters, and the assessment of projects and their impact on the environment.

  7. Assessment and Comparison of Fuzzy Based Test Suite Prioritization Method for GUI Based Software

    OpenAIRE

    Neha Chaudhary; O.P. Sangwan

    2016-01-01

    The testing of event-driven software plays a significant role in improving overall software quality. Due to the event-driven nature of GUI-based software, many test cases are generated, and it is difficult to identify the test cases whose fault-revealing capability is high. To identify those test cases, test suite prioritization is done. Various test suite prioritization methods exist for GUI-based software in the literature. Prioritization methods improve the rate of fault detection. In our previous work we ...

  8. Test Suite Reduction for Regression Testing of Simple Interactions between Two Software Modules

    OpenAIRE

    Dmitry, Kichigin

    2007-01-01

    This paper presents a new test suite reduction technique for regression testing of simple interactions between two software modules. The idea of the technique is to build models of the interactions between two modules and use those models for test suite reduction. Interaction models are built using sequences of interface functions invoked during software execution.

  9. SOFAS: Software Analysis Services

    OpenAIRE

    Ghezzi, G

    2010-01-01

    We propose a distributed and collaborative software analysis platform to enable seamless interoperability of software analysis tools across platform, geographical and organizational boundaries. In particular, we devise software analysis tools as services that can be accessed and composed over the Internet. These distributed services shall be widely accessible through a software analysis broker where organizations and research groups can register and share their tools. To enable (semi)-automat...

  10. Design and Development of Ontology Suite for Software Risk Planning, Software Risk Tracking and Software Risk Control

    Directory of Open Access Journals (Sweden)

    C. R.R. Robin

    2011-01-01

    Problem statement: Ontology as a conceptual courseware structure may work as a mind tool for effective teaching and as a visual navigation interface to the learning objects. Knowledge visualization is defined as the use of visual representations to transfer knowledge between at least two persons. This study presents the design, development and visualization of ontologies for Software Risk Planning, Software Risk Tracking and Software Risk Controlling. Approach: The ontologies are developed using the Protégé tool, an effective ontology editor, and are represented in the formal knowledge representation language OWL. In order to increase the richness of the knowledge available in the ontologies, its semantic representation is presented using an ontology document generator. Finally, the ontologies are effectively visualised using OntoViz. Results: The ontologies represent the domain knowledge of Software Risk Planning, Software Risk Tracking and Software Risk Controlling, respectively, and were developed with the intention of using them as a knowledge base for effective knowledge representation, knowledge management and e-learning applications. The constructed ontologies are evaluated using quantitative and qualitative analysis. Conclusion: Since the average reuse ratio is 0.95, the developed ontologies are highly cohesive. Comparison of the concepts and properties used in the ontologies showed that the developed ontologies are concept-oriented. Both the quantitative and qualitative analyses indicate that the developed ontologies are ready to use for applications such as e-learning and knowledge management.

  11. Multi Objective Test Suite Reduction for GUI Based Software Using NSGA-II

    Directory of Open Access Journals (Sweden)

    Neha Chaudhary

    2016-08-01

    Regression testing is performed to ensure that modified code does not have any unintended side effects on the software. If regression testing is performed with the retest-all method, it is a very time-consuming testing activity. Therefore, test suite reduction methods are used to reduce the size of the original test suite. The objective of test suite reduction is to remove those test cases which are redundant or less important in their fault-revealing capability. Test suite reduction can only be used when time is too critical to run all test cases and selective testing must be done. Various methods exist in the literature for test suite reduction of traditional software. Most of these methods are based on single-objective optimization. In the case of multi-objective optimization of a test suite, researchers usually assign different weight values to different objectives and combine them into a single objective. However, in test suite reduction multiple Pareto-optimal solutions are present, and it is difficult to select one test case over another. Since GUI-based software is our concern, very few reduction techniques exist and none of them consider multi-objective based reduction. In this work we propose a new test suite reduction technique based on two objectives: event weight and the number of faults identified by a test case. We evaluated our results for two different applications and achieved a 20% reduction in test suite size for both applications. For the Terp Paint 3.0 application 15.6% of fault-revealing capability is compromised, and for Notepad 11.1% of fault-revealing capability is reduced.
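
    Small sketch of the Pareto-dominance relation that NSGA-II-style reduction sorts on, with the two objectives named above (total event weight and number of faults revealed), both treated as maximized. The candidate suites and scores are invented; this is not the paper's implementation.

    ```python
    # Small sketch of the Pareto-dominance filter underlying two-objective test
    # suite reduction (candidate suites and scores are invented; both objectives
    # are treated as maximized here).
    def dominates(a, b):
        """True if a is at least as good as b on both objectives and strictly
        better on at least one."""
        return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

    def pareto_front(candidates):
        """Return the non-dominated candidates: {name: (event_weight, faults)}."""
        return {name: score for name, score in candidates.items()
                if not any(dominates(other, score)
                           for o, other in candidates.items() if o != name)}

    suites = {
        "S1": (12.0, 5),   # (total event weight, faults revealed)
        "S2": (9.0, 5),
        "S3": (14.0, 3),
        "S4": (8.0, 2),
    }
    print(pareto_front(suites))   # S1 and S3 survive; S2 and S4 are dominated
    ```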

  12. Rietveld analysis software for J-PARC

    International Nuclear Information System (INIS)

    A new analysis software suite, Z-Code, is under development for powder diffraction data analyses in the Materials and Life Science Facility (MLF) of the Japan Proton Accelerator Research Complex (J-PARC). This software suite comprises data processing, data analyses, graphical user interface and visualization software. As a part of Z-Code, a Rietveld analysis program for neutron (TOF and angle dispersive) and X-ray data, Z-Rietveld, has been developed. Here we report the basic traits and some significant features of Z-Rietveld.

  13. Software Suite to Support In-Flight Characterization of Remote Sensing Systems

    Science.gov (United States)

    Stanley, Thomas; Holekamp, Kara; Gasser, Gerald; Tabor, Wes; Vaughan, Ronald; Ryan, Robert; Pagnutti, Mary; Blonski, Slawomir; Kenton, Ross

    2014-01-01

    A characterization software suite was developed to facilitate NASA's in-flight characterization of commercial remote sensing systems. Characterization of aerial and satellite systems requires knowledge of ground characteristics, or ground truth. This information is typically obtained with instruments taking measurements prior to or during a remote sensing system overpass. Acquired ground-truth data, which can consist of hundreds of measurements with different data formats, must be processed before it can be used in the characterization. Accurate in-flight characterization of remote sensing systems relies on multiple field data acquisitions that are efficiently processed, with minimal error. To address the need for timely, reproducible ground-truth data, a characterization software suite was developed to automate the data processing methods. The characterization software suite is engineering code, requiring some prior knowledge and expertise to run. The suite consists of component scripts for each of the three main in-flight characterization types: radiometric, geometric, and spatial. The component scripts for the radiometric characterization operate primarily by reading the raw data acquired by the field instruments, combining it with other applicable information, and then reducing it to a format that is appropriate for input into MODTRAN (MODerate resolution atmospheric TRANsmission), an Air Force Research Laboratory-developed radiative transport code used to predict at-sensor measurements. The geometric scripts operate by comparing identified target locations from the remote sensing image to known target locations, producing circular error statistics defined by the Federal Geographic Data Committee Standards. The spatial scripts analyze a target edge within the image, and produce estimates of Relative Edge Response and the value of the Modulation Transfer Function at the Nyquist frequency. The software suite enables rapid, efficient, automated processing of
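
    Illustration of the geometric part described above: radial errors between image-derived and surveyed target locations summarized as a circular error statistic, assuming the common CE90 definition (90th percentile of radial error). The coordinates are invented, and the FGDC standard prescribes the exact estimator used in practice.

    ```python
    # Small sketch of a geometric-accuracy summary: radial errors between
    # image-derived and surveyed target positions reported as CE90 (invented
    # coordinates; the FGDC standard defines the estimator used in practice).
    import numpy as np

    surveyed = np.array([[100.0, 200.0], [340.0, 560.0], [720.0, 90.0], [410.0, 415.0]])
    measured = np.array([[101.2, 198.9], [338.7, 561.0], [721.5, 91.1], [409.4, 413.8]])

    radial_error = np.hypot(*(measured - surveyed).T)   # metres, per target
    ce90 = np.percentile(radial_error, 90)

    print("radial errors:", np.round(radial_error, 2))
    print("CE90:", round(float(ce90), 2), "m")
    ```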

  14. A new methane control and prediction software suite for longwall mines

    Science.gov (United States)

    Dougherty, Heather N.; Özgen Karacan, C.

    2011-09-01

    This paper presents technical and application aspects of a new software suite, MCP (Methane Control and Prediction), developed for addressing some of the methane and methane control issues in longwall coal mines. The software suite consists of dynamic link library (DLL) extensions to MS-Access™, written in C++. In order to create the DLLs, various statistical and mathematical approaches and prediction and classification artificial neural network (ANN) methods were used. The current version of the MCP suite (version 1.3) discussed in this paper has four separate modules that (a) predict the dynamic elastic properties of coal-measure rocks, (b) predict ventilation emissions from longwall mines, (c) determine the type of degasification system that needs to be utilized for given situations and (d) assess the production performance of gob gas ventholes that are used to extract methane from longwall gobs. These modules can be used with data from basic logs, mining, longwall panel, productivity, and coal bed characteristics. The application of these modules, separately or in combination, to methane capture and control problems will help improve the safety of mines. The software suite's version 1.3 is discussed in this paper. Currently, its new version 2.0 is available and can be downloaded from http://www.cdc.gov/niosh/mining/products/product180.htm free of charge. The models discussed in this paper can be found under "ancillary models" and under "methane prediction models" for specific U.S. conditions in the new version.

  15. Software safety hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lawrence, J.D. [Lawrence Livermore National Lab., CA (United States)

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper.

  16. Data analysis and graphing in an introductory physics laboratory: spreadsheet versus statistics suite

    OpenAIRE

    Peterlin, Primoz

    2010-01-01

    Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared analyzing data collected in three selected experiments taken from an introductory physics laboratory, which include a linear dependence, a non-linear dependence, and a histogram. The merits of each method are compared.

  17. Data Analysis and Graphing in an Introductory Physics Laboratory: Spreadsheet versus Statistics Suite

    Science.gov (United States)

    Peterlin, Primoz

    2010-01-01

    Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared analysing data collected in three selected experiments taken from an introductory physics laboratory, which include a linear dependence, a nonlinear dependence and a histogram. The merits of each method are compared. (Contains 7…

  18. Xmipp 3.0: an improved software suite for image processing in electron microscopy.

    Science.gov (United States)

    de la Rosa-Trevín, J M; Otón, J; Marabini, R; Zaldívar, A; Vargas, J; Carazo, J M; Sorzano, C O S

    2013-11-01

    Xmipp is a specialized software package for image processing in electron microscopy, mainly focused on 3D reconstruction of macromolecules through single-particle analysis. In this article we present Xmipp 3.0, a major release which introduces several improvements and new developments over the previous version. A central improvement is the concept of a project that stores the entire processing workflow from data import to final results. It is now possible to monitor, reproduce and restart all computing tasks as well as graphically explore the complete set of interrelated tasks associated with a given project. Other graphical tools have also been improved, such as data visualization, particle picking and parameter "wizards" that allow the visual selection of some key parameters. Many standard image formats are transparently supported for input/output from all programs. Additionally, results have been standardized, facilitating the interoperation between different Xmipp programs. Finally, as a result of a large code refactoring, the underlying C++ libraries are better suited for future developments and all code has been optimized. Xmipp is an open-source package that is freely available for download from: http://xmipp.cnb.csic.es.

  19. ANALYSIS OF DESIGN ELEMENTS IN SKI SUITS

    Directory of Open Access Journals (Sweden)

    Birsen Çileroğlu

    2014-06-01

    The popularity of ski sport in the 19th century necessitated a new perspective on protective skiing clothing against mountain climates and excessive cold. Winter clothing was the basis of ski attire during this period. By the beginning of the 20th century, lining cloth was used to minimize the wind effect. The difference between the men's and women's ski attire of the time consisted of a knee-length skirt worn over golf trousers. Subsequent to the First World War, skiing suit models were influenced by the period's uniforms, and producers reflected fashion trends in ski clothing. In conformance with the prevailing trends, ski trousers were designed and produced for women, thus leading to a reduction in gender differences. Increases in ski tourism and the holding of the first Winter Olympics in 1924 resulted in variations in ski attire, development of design characteristics, growth in user numbers, and enlargement of production capacities. Designers emphasized in their collections the combined presence of elegance and practicality in skiing attire. In the 1930s, ski suits influenced by pilots' uniforms included characteristics permitting freedom of motion, and the design elements exhibited changes in terms of style, material and aerodynamics. In time, ski attire showed varying design features distinguishing professionals from amateurs. While protective functionality was the primary consideration for amateurs, for professionals aerodynamic design was also a leading factor. Eventually, the increased differences in design characteristics were exhibited in ski suit collections, world-renowned brands were formed, and production and sales volumes rose significantly. During the 20th century, ski suits influenced by fashion trends acquired unique styles and reached a position of dominance that impacts current fashion trends; apart from sports attire they became a style determinant in the clothing of cold climates. Ski suits

  20. Design and functionalities of the MADOR® software suite for dose-reduction management after DTPA therapy.

    Science.gov (United States)

    Leprince, B; Fritsch, P; Bérard, P; Roméo, P-H

    2016-03-01

    A software suite on biokinetics of radionuclides and internal dosimetry intended for the occupational health practitioners of nuclear industry and for expert opinions has been developed under Borland C++ Builder™. These computing tools allow physicians to improve the dosimetric follow-up of workers in agreement with the French regulations and to manage new internal contaminations by radionuclides such as Pu and/or Am after diethylene triamine penta-acetic acid treatments. In this paper, the concept and functionalities of the first two computing tools of this MADOR® suite are described. The release 0.0 is the forensic application, which allows calculating the derived recording levels for intake by inhalation or ingestion of the main radioisotopes encountered in occupational environment. Indeed, these reference values of activity are convenient to interpret rapidly the bioassay measurements and make decisions as part of medical monitoring. The release 1.0 addresses the effect of DTPA treatments on Pu/Am biokinetics and the dose benefit. The forensic results of the MADOR® suite were validated by comparison with reference data.

  1. Design and functionalities of the MADOR® software suite for dose-reduction management after DTPA therapy

    International Nuclear Information System (INIS)

    A software suite on biokinetics of radionuclides and internal dosimetry intended for the occupational health practitioners of nuclear industry and for expert opinions has been developed under Borland C++ Builder™. These computing tools allow physicians to improve the dosimetric follow-up of workers in agreement with the French regulations and to manage new internal contaminations by radionuclides such as Pu and/or Am after diethylene triamine penta-acetic acid treatments. In this paper, the concept and functionalities of the first two computing tools of this MADOR® suite are described. The release 0.0 is the forensic application, which allows calculating the derived recording levels for intake by inhalation or ingestion of the main radioisotopes encountered in occupational environment. Indeed, these reference values of activity are convenient to interpret rapidly the bioassay measurements and make decisions as part of medical monitoring. The release 1.0 addresses the effect of DTPA treatments on Pu/Am biokinetics and the dose benefit. The forensic results of the MADOR® suite were validated by comparison with reference data. (authors)

  2. A Suite of Tools for ROC Analysis of Spatial Models

    Directory of Open Access Journals (Sweden)

    Hermann Rodrigues

    2013-09-01

    The Receiver Operating Characteristic (ROC) is widely used for assessing the performance of classification algorithms. In GIScience, ROC has been applied to assess models aimed at predicting events, such as land use/cover change (LUCC), species distribution and disease risk. However, GIS software packages offer few statistical tests and guidance tools for ROC analysis and interpretation. This paper presents a suite of GIS tools designed to facilitate ROC curve analysis for GIS users by applying proper statistical tests and analysis procedures. The tools are freely available as models and submodels of the Dinamica EGO freeware. The tools give the ROC curve, the area under the curve (AUC), partial AUC, lower and upper AUCs, the confidence interval of the AUC, the density of events in probability bins, and tests to evaluate the difference between the AUCs of two models. We present first the procedures and statistical tests implemented in Dinamica EGO, then the application of the tools to assess LUCC and species distribution models. Finally, we interpret and discuss the ROC-related statistics resulting from various case studies.
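
    Self-contained sketch of the two central quantities such tools report, the ROC curve and its AUC, computed from a ranked list of predicted scores against observed binary events. The simulated scores are invented; the Dinamica EGO tools operate on probability and event maps and add partial AUCs, confidence intervals and comparison tests.

    ```python
    # Self-contained sketch of an ROC curve and trapezoidal AUC computed from
    # predicted scores and observed 0/1 events (simulated data).
    import numpy as np

    def roc_curve(scores, events):
        """False-positive and true-positive rates over all score thresholds."""
        order = np.argsort(-scores)                  # descending by predicted score
        events = np.asarray(events)[order]
        tp = np.concatenate(([0], np.cumsum(events)))
        fp = np.concatenate(([0], np.cumsum(1 - events)))
        return fp / fp[-1], tp / tp[-1]

    def auc(fpr, tpr):
        """Area under the ROC curve by the trapezoidal rule."""
        return float(np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2.0))

    rng = np.random.default_rng(1)
    events = rng.integers(0, 2, size=200)            # observed change / no change
    scores = events * 0.3 + rng.random(200)          # informative but noisy scores
    fpr, tpr = roc_curve(scores, events)
    print("AUC =", round(auc(fpr, tpr), 3))
    ```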

  3. Automatic Feature Interaction Analysis in PacoSuite

    Directory of Open Access Journals (Sweden)

    Wim Vanderperren

    2004-10-01

    In this paper, we build upon previous work that aims at recuperating aspect-oriented ideas into component-based software development. In that research, a composition adapter was proposed in order to capture crosscutting concerns in the PacoSuite component-based methodology. A composition adapter is visually applied onto a given component composition and the changes it describes are automatically applied. Stacking multiple composition adapters onto the same component composition can however lead to unpredictable and undesired side-effects. In this paper, we propose a solution for this issue, widely known as the feature interaction problem. We present a classification of different interaction levels among composition adapters and the algorithms required to verify them. The proposed algorithms are however of exponential nature and depend on both the composition adapters and the component composition as a whole. In order to enhance the performance of our feature interaction analysis, we present a set of theorems that define the interaction levels solely in terms of the properties of the composition adapters themselves.

  4. Kinematic Analysis of Exoskeleton Suit for Human Arm

    Directory of Open Access Journals (Sweden)

    Surachai Panich

    2010-01-01

    Problem statement: There are many robotic arms developed for providing care to physically disabled people. It is difficult to find robot designs in the literature that articulate such a procedure. Therefore, it is our hope that the design work shown in this study may serve as a good example of a systematic method for rehabilitation robot design. Approach: The arm exoskeleton suit was developed to increase human strength, endurance, or speed, enabling users to perform tasks that they previously could not perform. It should not impede the user's natural motion and should be comfortable and safe to wear and easy to use. Although movement is difficult for them, they usually want to go somewhere by themselves. Results: The kinematics of the exoskeleton suit for the human arm is simulated in MATLAB. The exoskeleton suit of the human arm consists of one link length, three link twists, two link offsets and three joint angles. Conclusion: This study introduced the kinematics of an exoskeleton suit for the human arm. The exoskeleton suit can be used as an instrument for anyone who needs to improve human performance. It will increase the strength of humans so that they can lift heavy loads, or help handicapped patients who cannot use their arm.
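
    Sketch: the arm model described (one link length, three link twists, two link offsets and three joint angles) is a Denavit-Hartenberg parameterization, and the forward kinematics is a product of per-joint homogeneous transforms. The DH values below are made up for illustration and are not the paper's parameters.

    ```python
    # Sketch of Denavit-Hartenberg forward kinematics for a 3-joint arm model
    # (made-up DH values; not the paper's actual parameters).
    import numpy as np

    def dh_transform(theta, d, a, alpha):
        """Homogeneous transform for one DH row (joint angle, offset, length, twist)."""
        ct, st = np.cos(theta), np.sin(theta)
        ca, sa = np.cos(alpha), np.sin(alpha)
        return np.array([[ct, -st * ca,  st * sa, a * ct],
                         [st,  ct * ca, -ct * sa, a * st],
                         [0.0,      sa,       ca,      d],
                         [0.0,     0.0,      0.0,    1.0]])

    def forward_kinematics(joint_angles, dh_rows):
        """Chain the per-joint transforms; returns the end-effector pose."""
        pose = np.eye(4)
        for theta, (d, a, alpha) in zip(joint_angles, dh_rows):
            pose = pose @ dh_transform(theta, d, a, alpha)
        return pose

    # Hypothetical 3-DOF arm: (offset d [m], link length a [m], twist alpha [rad])
    dh_rows = [(0.10, 0.0, np.pi / 2), (0.0, 0.30, 0.0), (0.0, 0.25, 0.0)]
    pose = forward_kinematics([0.2, 0.4, -0.3], dh_rows)
    print(pose[:3, 3])    # end-effector position
    ```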

  5. The Toast++ software suite for forward and inverse modeling in optical tomography.

    Science.gov (United States)

    Schweiger, Martin; Arridge, Simon

    2014-04-01

    We present the Toast++ open-source software environment for solving the forward and inverse problems in diffuse optical tomography (DOT). The software suite consists of a set of libraries to simulate near-infrared light propagation in highly scattering media with complex boundaries and heterogeneous internal parameter distribution, based on a finite-element solver. Steady-state, time- and frequency-domain data acquisition systems can be modeled. The forward solver is implemented in C++ and supports performance acceleration with parallelization for shared and distributed memory architectures, as well as graphics processing computation. Building on the numerical forward solver, Toast++ contains model-based iterative inverse solvers for reconstructing the volume distribution of absorption and scattering parameters from boundary measurements of light transmission. A range of regularization methods are provided, including the possibility of incorporating prior knowledge of internal structure. The user can link to the Toast++ libraries either directly to compile application programs for DOT, or make use of the included MATLAB and PYTHON bindings to generate script-based solutions. This approach allows rapid prototyping and provides a rich toolset in both environments for debugging, testing, and visualization. PMID:24781586

  6. ABC Tester - Artificial Bee Colony Based Software Test Suite Optimization Approach

    Directory of Open Access Journals (Sweden)

    D. Jeya Mala

    2009-07-01

    In this paper we present a new, non-pheromone-based test suite optimization approach inspired by the behavior of biological bees. Our proposed approach is based on ABC (Artificial Bee Colony) Optimization, which is motivated by the intelligent behavior of honey bees. In our proposed system, the sites are the nodes in the Software under Test (SUT); the artificial bees modify the test cases with time, and the bees' aim is to discover the places of nodes with higher coverage and finally the one with the highest usage by the given test case. Since the ABC system combines local search methods carried out by employed bees with global search methods managed by onlookers and scouts, we attain near-global optima. We investigate whether this new approach outperforms an existing test optimization approach based on Genetic Algorithms (GA) in the task of software test optimization. Taking into account the results of our experiments, we conclude that (i) the proposed approach uses fewer iterations to complete the task; (ii) it is more scalable, i.e., it requires less computation time to complete the task; and finally (iii) our approach is best at achieving a near-global optimal solution.
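
    Much-simplified sketch of an artificial-bee-colony style search over test suites with node coverage as fitness: employed bees perturb candidate suites, the best sources are kept, and a scout replaces the worst. The coverage data, parameters and neighbourhood move are invented; this is not the paper's ABC Tester implementation.

    ```python
    # Much-simplified sketch of an artificial-bee-colony style search over test
    # suites, with node coverage as fitness (all data and parameters invented;
    # not the paper's ABC Tester algorithm).
    import random

    random.seed(3)

    # Which SUT nodes each test case covers (hypothetical)
    coverage = {
        "t1": {1, 2, 3}, "t2": {3, 4}, "t3": {5, 6, 7},
        "t4": {2, 7, 8}, "t5": {1, 8, 9}, "t6": {9, 10},
    }
    tests = list(coverage)

    def fitness(suite):
        covered = set().union(*(coverage[t] for t in suite)) if suite else set()
        return len(covered) - 0.1 * len(suite)   # reward coverage, penalize size

    def neighbour(suite):
        """Employed-bee move: toggle one test in or out of the suite."""
        return suite ^ {random.choice(tests)}

    sources = [set(random.sample(tests, 3)) for _ in range(5)]  # initial food sources
    for _ in range(200):                                        # foraging cycles
        # employed bees: local search around each food source
        sources = [max((s, neighbour(s)), key=fitness) for s in sources]
        # onlookers favour fitter sources; a scout replaces the worst one
        sources.sort(key=fitness, reverse=True)
        sources[-1] = neighbour(sources[0])

    best = max(sources, key=fitness)
    print(sorted(best), fitness(best))
    ```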

  7. Niche idea : Pandell's Nexus suite of back-office software developed with juniors in mind

    Energy Technology Data Exchange (ETDEWEB)

    Wells, P.

    2007-07-15

    Pandell Technology Corporation has developed a complete suite of back-office software products for the junior oil and gas sector. The Nexus product line that was developed for this niche market includes JVNexus for joint venture financial accounting, AFENexus for expenditure tracking, GeoNexus for land management, and EANexus for economic analysis. Each application can help junior to midsize oil and gas companies capitalize on their internal resources, providing them an affordable way to acquire the services they need to support their business. Clients can acquire the software through a software-as-a-service (SaaS) business model. Microsoft has supported Pandell's efforts to offer clients better products and services through SaaS. The four systems cost between $450 and $750 each with no additional upfront capital expenditures. This article also listed companies that have adopted Nexus products, including Annex Petroleum Inc., Delphi Energy Corporation, and Innova Exploration Limited. Pandell is currently working on a new and improved version of GeoNexus, which will be fully Web-enabled. 1 fig.

  8. The PyRosetta Toolkit: a graphical user interface for the Rosetta software suite.

    Directory of Open Access Journals (Sweden)

    Jared Adolf-Bryfogle

    Full Text Available The Rosetta Molecular Modeling suite is a command-line-only collection of applications that enable high-resolution modeling and design of proteins and other molecules. Although extremely useful, Rosetta can be difficult to learn for scientists with little computational or programming experience. To that end, we have created a Graphical User Interface (GUI) for Rosetta, called the PyRosetta Toolkit, for creating and running protocols in Rosetta for common molecular modeling and protein design tasks and for analyzing the results of Rosetta calculations. The program is highly extensible so that developers can add new protocols and analysis tools to the PyRosetta Toolkit GUI.

  9. The application of formal software engineering methods to the unattended and remote monitoring software suite at Los Alamos National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Determan, John Clifford [Los Alamos National Laboratory; Longo, Joseph F [Los Alamos National Laboratory; Michel, Kelly D [Los Alamos National Laboratory

    2009-01-01

    The Unattended and Remote Monitoring (UNARM) system is a collection of specialized hardware and software used by the International Atomic Energy Agency (IAEA) to institute nuclear safeguards at many nuclear facilities around the world. The hardware consists of detectors, instruments, and networked computers for acquiring various forms of data, including but not limited to radiation data, global position coordinates, camera images, isotopic data, and operator declarations. The software provides two primary functions: the secure and reliable collection of this data from the instruments and the ability to perform an integrated review and analysis of the disparate data sources. Several years ago the team responsible for maintaining the software portion of the UNARM system began the process of formalizing its operations. These formal operations include a configuration management system, a change control board, an issue tracking system, and extensive formal testing, for both functionality and reliability. Functionality is tested with formal test cases chosen to fully represent the data types and methods of analysis that will be commonly encountered. Reliability is tested with iterative, concurrent testing where up to five analyses are executed simultaneously for thousands of cycles. Iterative concurrent testing helps ensure that there are no resource conflicts or leaks when multiple system components are in use simultaneously. The goal of this work is to provide a high quality, reliable product, commensurate with the criticality of the application. Testing results will be presented that demonstrate that this goal has been achieved and the impact of the introduction of a formal software engineering framework to the UNARM product will be presented.
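
    The iterative, concurrent testing regime described above can be illustrated with a short, generic Python sketch based on the standard concurrent.futures module; run_analysis and the case identifiers are hypothetical placeholders, and this is not the UNARM test harness itself.

    # Hedged sketch of iterative, concurrent reliability testing: up to five
    # analyses run simultaneously, repeated for many cycles.
    from concurrent.futures import ThreadPoolExecutor, as_completed

    def run_analysis(case_id):
        # placeholder for one data-review analysis; returns True on success
        return True

    def iterative_concurrent_test(cases, cycles=1000, workers=5):
        failures = 0
        for cycle in range(cycles):
            with ThreadPoolExecutor(max_workers=workers) as pool:
                futures = {pool.submit(run_analysis, c): c for c in cases}
                for fut in as_completed(futures):
                    if not fut.result():
                        failures += 1
                        print(f"cycle {cycle}: analysis {futures[fut]} failed")
        return failures

    Whether threads or separate processes are the right vehicle depends on where the resource contention being probed actually lives; the sketch only conveys the repeated, overlapping execution pattern.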

  10. Comparative Analysis of MOGA, NSGA-II and MOPSO for Regression Test Suite Optimization

    Directory of Open Access Journals (Sweden)

    Zeeshan Anwar

    2014-01-01

    Full Text Available In software engineering, regression testing is a mandatory activity. Whenever a change in an existing system occurs and a new version appears, the unchanged portions need to be regression tested for any resulting undesirable effects. During the process of regression testing, the same test cases are executed repeatedly for the un-modified portion of the software. This activity is an overhead and consumes huge resources and budget. To save time and resources, researchers have proposed various techniques for regression test suite optimization. In this research, regression test suites are minimized using three computational-intelligence multi-objective techniques for black-box testing methods: 1) Multi-Objective Genetic Algorithms (MOGA), 2) the Non-Dominated Sorting Genetic Algorithm (NSGA-II), and 3) Multi-Objective Particle Swarm Optimization (MOPSO). These techniques are applied to two published case studies and, through experimentation, the quality of the techniques is analyzed. Four quality metrics are defined to perform this analysis. The results of the research show that MOGA is better at reducing the size, and thus the execution time, of the regression test suites than MOPSO and NSGA-II. It was also found that the use of MOGA, NSGA-II and MOPSO is not safe for regression test suite optimization, because the fault detection rate and requirement coverage are reduced after optimization of the regression test suites.
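
    As a rough illustration of what such optimizers operate on, the sketch below evaluates a candidate reduced suite against objectives resembling those discussed in the abstract (suite size, execution time, faults detected, requirements covered) and tests Pareto dominance. It is a Python sketch under our own assumptions, not the MOGA/NSGA-II/MOPSO implementations compared in the paper; all dictionary inputs are hypothetical.

    # Illustrative multi-objective evaluation of a candidate reduced test suite.
    def evaluate(candidate, exec_time, faults_found, reqs_covered):
        """candidate: iterable of test-case ids; dicts map id -> time or sets."""
        total_time = sum(exec_time[t] for t in candidate)
        faults, reqs = set(), set()
        for t in candidate:
            faults |= faults_found[t]
            reqs |= reqs_covered[t]
        # minimize size and time, maximize faults detected and requirements covered
        return (len(candidate), total_time, -len(faults), -len(reqs))

    def dominates(a, b):
        """Pareto dominance on minimization objectives."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))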

  11. Distributed and collaborative software analysis

    OpenAIRE

    Ghezzi, G; H.C. Gall

    2010-01-01

    Throughout the years software engineers have come up with a myriad of specialized tools and techniques that focus on a certain type of software analysis such as source code analysis, duplication analysis, co-change analysis, bug prediction, or detection of bug fixing patterns. However, easy and straightforward synergies between these analyses and tools rarely exist because of their stand-alone nature, their platform dependence, their different input and output formats and the variety of data...

  12. Space suit bioenergetics: framework and analysis of unsuited and suited activity.

    Science.gov (United States)

    Carr, Christopher E; Newman, Dava J

    2007-11-01

    Metabolic costs limit the duration and intensity of extravehicular activity (EVA), an essential component of future human missions to the Moon and Mars. Energetics Framework: We present a framework for comparison of energetics data across and between studies. This framework, applied to locomotion, differentiates between muscle efficiency and energy recovery, two concepts often confused in the literature. The human run-walk transition in Earth gravity occurs at the point for which energy recovery is approximately the same for walking and running, suggesting a possible role for recovery in gait transitions. Muscular Energetics: Muscle physiology limits the overall efficiency by which chemical energy is converted through metabolism to useful work. Unsuited Locomotion: Walking and running use different methods of energy storage and release. These differences contribute to the relative changes in the metabolic cost of walking and running as gravity is varied, with the metabolic cost of locomoting at a given velocity changing in proportion to gravity for running and less than in proportion for walking. Space Suits: Major factors affecting the energetic cost of suited movement include suit pressurization, gravity, velocity, surface slope, and space suit configuration. Apollo lunar surface EVA traverse metabolic rates, while unexpectedly low, were higher than other activity categories. The Lunar Roving Vehicle facilitated even lower metabolic rates, thus longer duration EVAs. Muscles and tendons act like springs during running; similarly, longitudinal pressure forces in gas pressure space suits allow spring-like storage and release of energy when suits are self-supporting. PMID:18018432
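
    The distinction drawn above between muscle efficiency and energy recovery can be made concrete with the percent-recovery expression commonly used in locomotion studies (a Cavagna-style definition quoted here only as an illustration, not taken from the article):

        R = 100 \times \frac{W_f + W_v - W_{ext}}{W_f + W_v}

    where W_f and W_v are the positive work required to accelerate the centre of mass forward and to raise it, and W_ext is the positive external work actually performed. R approaches 100% for ideal pendulum-like exchange, as in walking, whereas in running the corresponding storage and release is elastic, in muscles and tendons, as noted above.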

  13. Data processing software suite SITENNO for coherent X-ray diffraction imaging using the X-ray free-electron laser SACLA

    International Nuclear Information System (INIS)

    The software suite SITENNO is developed for processing diffraction data collected in coherent X-ray diffraction imaging experiments of non-crystalline particles using an X-ray free-electron laser. Coherent X-ray diffraction imaging is a promising technique for visualizing the structures of non-crystalline particles with dimensions of micrometers to sub-micrometers. Recently, X-ray free-electron laser sources have enabled efficient experiments in the ‘diffraction before destruction’ scheme. Diffraction experiments have been conducted at SPring-8 Angstrom Compact free-electron LAser (SACLA) using the custom-made diffraction apparatus KOTOBUKI-1 and two multiport CCD detectors. In the experiments, tens of thousands of single-shot diffraction patterns can be collected within several hours. Then, diffraction patterns with significant levels of intensity suitable for structural analysis must be found, direct-beam positions in diffraction patterns determined, diffraction patterns from the two CCD detectors merged, and phase-retrieval calculations for structural analyses performed. A software suite named SITENNO has been developed to semi-automatically apply the four-step processing to a huge number of diffraction data. Here, details of the algorithm used in the suite are described and the performance for approximately 9000 diffraction patterns collected from cuboid-shaped copper oxide particles is reported. Using the SITENNO suite, it is possible to conduct experiments with data processing immediately after the data collection, and to characterize the size distribution and internal structures of the non-crystalline particles.
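
    The final phase-retrieval step can be illustrated with a generic error-reduction iteration of the Gerchberg-Saxton/Fienup type; the Python sketch below is a conceptual illustration under our own assumptions, not the SITENNO implementation.

    # Generic error-reduction phase-retrieval iteration (not the SITENNO code).
    import numpy as np

    def error_reduction_step(density, measured_amplitude, support):
        """density, measured_amplitude: arrays of equal shape; support: boolean mask."""
        F = np.fft.fftn(density)
        F = measured_amplitude * np.exp(1j * np.angle(F))   # impose measured Fourier moduli
        rho = np.fft.ifftn(F).real
        rho[~support] = 0.0                                  # real-space support constraint
        rho[rho < 0] = 0.0                                   # positivity
        return rho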

  14. The Sample Analysis at Mars Investigation and Instrument Suite

    Science.gov (United States)

    Mahaffy, Paul; Webster, Chris R.; Cabane, M.; Conrad, Pamela G.; Coll, Patrice; Atreya, Sushil K.; Arvey, Robert; Barciniak, Michael; Benna, Mehdi; Bleacher, L.; Brinckerhoff, William B.; Eigenbrode, Jennifer L.; Carignan, Daniel; Cascia, Mark; Chalmers, Robert A.; Dworkin, Jason P.; Errigo, Therese; Everson, Paula; Franz, Heather; Farley, Rodger; Feng, Steven; Frazier, Gregory; Freissinet, Caroline; Glavin, Daniel P.; Harpold, Daniel N.

    2012-01-01

    The Sample Analysis at Mars (SAM) investigation of the Mars Science Laboratory (MSL) addresses the chemical and isotopic composition of the atmosphere and volatiles extracted from solid samples. The SAM investigation is designed to contribute substantially to the mission goal of quantitatively assessing the habitability of Mars as an essential step in the search for past or present life on Mars. SAM is a 40 kg instrument suite located in the interior of MSL's Curiosity rover. The SAM instruments are a quadrupole mass spectrometer, a tunable laser spectrometer, and a 6-column gas chromatograph, all coupled through solid and gas processing systems to provide complementary information on the same samples. The SAM suite is able to measure a suite of light isotopes and to analyze volatiles directly from the atmosphere or thermally released from solid samples. In addition to measurements of simple inorganic compounds and noble gases, SAM will conduct a sensitive search for organic compounds with either thermal or chemical extraction from sieved samples delivered by the sample processing system on the Curiosity rover's robotic arm.

  15. The PyRosetta Toolkit: A Graphical User Interface for the Rosetta Software Suite

    OpenAIRE

    Jared Adolf-Bryfogle; Dunbrack, Roland L.

    2013-01-01

    The Rosetta Molecular Modeling suite is a command-line-only collection of applications that enable high-resolution modeling and design of proteins and other molecules. Although extremely useful, Rosetta can be difficult to learn for scientists with little computational or programming experience. To that end, we have created a Graphical User Interface (GUI) for Rosetta, called the PyRosetta Toolkit, for creating and running protocols in Rosetta for common molecular modeling and protein design ...

  16. DelPhi: a comprehensive suite for DelPhi software and associated resources

    Directory of Open Access Journals (Sweden)

    Li Lin

    2012-05-01

    Full Text Available Abstract Background Accurate modeling of electrostatic potential and corresponding energies becomes increasingly important for understanding properties of biological macromolecules and their complexes. However, this is not an easy task due to the irregular shape of biological entities and the presence of water and mobile ions. Results Here we report a comprehensive suite for the well-known Poisson-Boltzmann solver, DelPhi, enriched with additional features to facilitate DelPhi usage. The suite allows for easy download of both DelPhi executable files and source code along with a makefile for local installations. The users can obtain the DelPhi manual and parameter files required for the corresponding investigation. Non-experienced researchers can download examples containing all necessary data to carry out DelPhi runs on a set of selected examples illustrating various DelPhi features and demonstrating DelPhi’s accuracy against analytical solutions. Conclusions The DelPhi suite offers not only the DelPhi executable and source files, examples and parameter files, but also provides links to third party developed resources either utilizing DelPhi or providing plugins for DelPhi. In addition, the users and developers are offered a forum to share ideas, resolve issues, report bugs and seek help with respect to the DelPhi package. The resource is available free of charge for academic users from URL: http://compbio.clemson.edu/DelPhi.php.
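
    For reference, the equation DelPhi solves, the nonlinear Poisson-Boltzmann equation, can be written schematically (in one common dimensionless convention; the placement of constants depends on the unit system and is our assumption, not quoted from the abstract) as

        \nabla \cdot \left[ \epsilon(\mathbf{r}) \, \nabla \phi(\mathbf{r}) \right] - \kappa^2(\mathbf{r}) \, \sinh\!\big(\phi(\mathbf{r})\big) = -4\pi \rho_f(\mathbf{r})

    where \phi is the reduced electrostatic potential, \epsilon(\mathbf{r}) the position-dependent dielectric, \kappa(\mathbf{r}) the ionic screening function (nonzero only in the ion-accessible region), and \rho_f(\mathbf{r}) the fixed solute charge density.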

  17. Software Design for Smile Analysis

    Directory of Open Access Journals (Sweden)

    A. Sarkhosh

    2010-12-01

    Full Text Available Introduction: Esthetics and attractiveness of the smile is one of the major demands in contemporary orthodontic treatment. In order to improve a smile design, it is necessary to record a “posed smile” as an intentional, non-pressure, static, natural and reproducible smile. The record then should be analyzed to determine its characteristics. In this study, we intended to design and introduce software to analyze the smile rapidly and precisely in order to produce an attractive smile for the patients. Materials and Methods: For this purpose, a practical study was performed to design the multimedia software “Smile Analysis”, which can receive patients’ photographs and videographs. After giving records to the software, the operator should mark the points and lines which are displayed on the system’s guide and also define the correct scale for each image. Thirty-three variables are measured by the software and displayed on the report page. Reliability of measurements in both image and video was significantly high (= 0.7-1). Results: In order to evaluate intra-operator and inter-operator reliability, five cases were selected randomly. Statistical analysis showed that calculations performed in the smile analysis software were both valid and highly reliable (for both video and photo). Conclusion: The results obtained from smile analysis could be used in diagnosis, treatment planning and evaluation of the treatment progress.

  18. Cyber-physical systems software development: way of working and tool suite

    NARCIS (Netherlands)

    Bezemer, Maarten Matthijs

    2013-01-01

    Designing embedded control software for modern cyber-physical systems becomes more and more difficult, because of the increasing amount and complexity of their requirements. The regular requirements are extended with modern requirements, for example, to get a general purpose cyber-physical system ca

  19. Development of an e-VLBI Data Transport Software Suite with VDIF

    Science.gov (United States)

    Sekido, Mamoru; Takefuji, Kazuhiro; Kimura, Moritaka; Hobiger, Thomas; Kokado, Kensuke; Nozawa, Kentarou; Kurihara, Shinobu; Shinno, Takuya; Takahashi, Fujinobu

    2010-01-01

    We have developed a software library (KVTP-lib) for VLBI data transmission over the network with the VDIF (VLBI Data Interchange Format), which is the newly proposed standard VLBI data format designed for electronic data transfer over the network. The software package keeps the application layer (VDIF frame) and the transmission layer separate, so that each layer can be developed efficiently. The real-time VLBI data transmission tool sudp-send is an application tool based on the KVTP-lib library. sudp-send captures the VLBI data stream from the VSI-H interface with the K5/VSI PC-board and writes the data to file in standard Linux file format or transmits it to the network using the simple-UDP (SUDP) protocol. Another tool, sudp-recv, receives the data stream from the network and writes the data to file in a specific VLBI format (K5/VSSP, VDIF, or Mark 5B). This software system has been implemented on the Wettzell-Tsukuba baseline; evaluation before operational employment is under way.
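
    The sender/receiver pattern described above can be illustrated with generic Python datagram sockets; this is emphatically not the KVTP-lib or sudp-send/sudp-recv code, and the address, port and frame size below are arbitrary assumptions.

    # Generic UDP send/receive pattern, illustrating (in spirit only) what
    # sudp-send and sudp-recv do.  Address, port and frame size are placeholders.
    import socket

    ADDR = ("127.0.0.1", 50000)
    FRAME = 1024  # bytes per datagram (placeholder, not the VDIF frame size)

    def send_file(path):
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        with open(path, "rb") as f:
            while chunk := f.read(FRAME):
                sock.sendto(chunk, ADDR)

    def receive_to_file(path, n_frames):
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(ADDR)
        with open(path, "wb") as f:
            for _ in range(n_frames):
                data, _ = sock.recvfrom(FRAME)
                f.write(data)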

  20. Team Software Development for Aerothermodynamic and Aerodynamic Analysis and Design

    Science.gov (United States)

    Alexandrov, N.; Atkins, H. L.; Bibb, K. L.; Biedron, R. T.; Carpenter, M. H.; Gnoffo, P. A.; Hammond, D. P.; Jones, W. T.; Kleb, W. L.; Lee-Rausch, E. M.

    2003-01-01

    A collaborative approach to software development is described. The approach employs the agile development techniques: project retrospectives, Scrum status meetings, and elements of Extreme Programming to efficiently develop a cohesive and extensible software suite. The software product under development is a fluid dynamics simulator for performing aerodynamic and aerothermodynamic analysis and design. The functionality of the software product is achieved both through the merging, with substantial rewrite, of separate legacy codes and the authorship of new routines. Examples of rapid implementation of new functionality demonstrate the benefits obtained with this agile software development process. The appendix contains a discussion of coding issues encountered while porting legacy Fortran 77 code to Fortran 95, software design principles, and a Fortran 95 coding standard.

  1. Inertial motion capture system for biomechanical analysis in pressure suits

    Science.gov (United States)

    Di Capua, Massimiliano

    A non-invasive system has been developed at the University of Maryland Space Systems Laboratory with the goal of providing a new capability for quantifying the motion of the human inside a space suit. Based on an array of six microprocessors and eighteen microelectromechanical (MEMS) inertial measurement units (IMUs), the Body Pose Measurement System (BPMS) allows the monitoring of the kinematics of the suit occupant in an unobtrusive, self-contained, lightweight and compact fashion, without requiring any external equipment such as those necessary with modern optical motion capture systems. BPMS measures and stores the accelerations, angular rates and magnetic fields acting upon each IMU, which are mounted on the head, torso, and each segment of each limb. In order to convert the raw data into a more useful form, such as a set of body segment angles quantifying pose and motion, a series of geometrical models and a non-linear complementary filter were implemented. The first portion of this work focuses on assessing system performance, which was measured by comparing the BPMS filtered data against rigid body angles measured through an external VICON optical motion capture system. This type of system is the industry standard, and is used here for independent measurement of body pose angles. By comparing the two sets of data, performance metrics such as BPMS system operational conditions, accuracy, and drift were evaluated and correlated against VICON data. After the system and models were verified and their capabilities and limitations assessed, a series of pressure suit evaluations were conducted. Three different pressure suits were used to identify the relationship between usable range of motion and internal suit pressure. In addition to addressing range of motion, a series of exploration tasks were also performed, recorded, and analysed in order to identify different motion patterns and trajectories as suit pressure is increased and overall suit mobility is reduced.
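
    The role of the filter can be conveyed with a deliberately simplified one-axis complementary filter; the BPMS filter is a non-linear, full 3-D orientation filter, so the Python sketch below only illustrates the underlying idea of blending integrated gyroscope rates with gravity-referenced accelerometer tilt.

    # Simplified 1-D complementary filter (illustrative; not the BPMS filter).
    import math

    def complementary_filter(angle, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
        """angle in radians, gyro_rate in rad/s, accelerations in m/s^2."""
        angle_gyro = angle + gyro_rate * dt         # integrated rate: smooth but drifts
        angle_accel = math.atan2(accel_x, accel_z)  # gravity-referenced tilt: noisy but unbiased
        return alpha * angle_gyro + (1 - alpha) * angle_accel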

  2. Onco-Regulon: an integrated database and software suite for site specific targeting of transcription factors of cancer genes.

    Science.gov (United States)

    Tomar, Navneet; Mishra, Akhilesh; Mrinal, Nirotpal; Jayaram, B

    2016-01-01

    Transcription factors (TFs) bind at multiple sites in the genome and regulate expression of many genes. Regulating TF binding in a gene specific manner remains a formidable challenge in drug discovery because the same binding motif may be present at multiple locations in the genome. Here, we present Onco-Regulon (http://www.scfbio-iitd.res.in/software/onco/NavSite/index.htm), an integrated database of regulatory motifs of cancer genes clubbed with Unique Sequence-Predictor (USP), a software suite that identifies unique sequences for each of these regulatory DNA motifs at the specified position in the genome. USP works by extending a given DNA motif in the 5'→3', 3'→5', or both directions by adding one nucleotide at each step, and calculates the frequency of each extended motif in the genome with the Frequency Counter programme. This step is iterated till the frequency of the extended motif becomes unity in the genome. Thus, for each given motif, we get three possible unique sequences. The Closest Sequence Finder program predicts off-target drug binding in the genome. Inclusion of DNA-protein structural information further makes Onco-Regulon a highly informative repository for gene specific drug development. We believe that Onco-Regulon will help researchers to design drugs which will bind to an exclusive site in the genome with, theoretically, no off-target effects. Database URL: http://www.scfbio-iitd.res.in/software/onco/NavSite/index.htm. PMID:27515825
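
    A minimal sketch of the extension idea follows, assuming a plain string genome and extension in the 3' direction only; a real tool such as USP would index the genome, handle both strands, and extend in either or both directions.

    # Sketch: grow a motif one nucleotide at a time until it occurs exactly
    # once in the genome (plain string counting, 3' extension only).
    def count_occurrences(genome, motif):
        count, start = 0, genome.find(motif)
        while start != -1:
            count += 1
            start = genome.find(motif, start + 1)
        return count

    def extend_until_unique(genome, motif, position):
        """position: index of the motif's first base (5' end) in the genome."""
        end = position + len(motif)
        while count_occurrences(genome, motif) > 1 and end < len(genome):
            motif += genome[end]      # append the next downstream nucleotide
            end += 1
        return motif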

  3. OARDAS stray radiation analysis software

    Science.gov (United States)

    Rock, David F.

    1999-09-01

    OARDAS (Off-Axis Rejection Design Analysis Software) is a Raytheon in-house code designed to aid in stray light analysis. The code development started in 1982, and by 1986 the program was fully operational. Since that time, the work has continued--not with the goal of creating a marketable product, but with a focus on creating a powerful, user- friendly, highly graphical tool that makes stray light analysis as easy and efficient as possible. The goal has been to optimize the analysis process, with a clear emphasis on designing an interface between computer and user that allows each to do what he does best. The code evolution has resulted in a number of analysis features that are unique to the industry. This paper looks at a variety of stray light analysis problems that the analyst is typically faced with and shows how they are approached using OARDAS.

  4. EXPANDER – an integrative program suite for microarray data analysis

    Directory of Open Access Journals (Sweden)

    Shiloh Yosef

    2005-09-01

    Full Text Available Abstract Background Gene expression microarrays are a prominent experimental tool in functional genomics which has opened the opportunity for gaining global, systems-level understanding of transcriptional networks. Experiments that apply this technology typically generate overwhelming volumes of data, unprecedented in biological research. Therefore the task of mining meaningful biological knowledge out of the raw data is a major challenge in bioinformatics. Of special need are integrative packages that provide biologist users with an advanced yet easy-to-use set of algorithms, together covering the whole range of steps in microarray data analysis. Results Here we present the EXPANDER 2.0 (EXPression ANalyzer and DisplayER) software package. EXPANDER 2.0 is an integrative package for the analysis of gene expression data, designed as a 'one-stop shop' tool that implements various data analysis algorithms ranging from the initial steps of normalization and filtering, through clustering and biclustering, to high-level functional enrichment analysis that points to biological processes that are active in the examined conditions, and to promoter cis-regulatory elements analysis that elucidates transcription factors that control the observed transcriptional response. EXPANDER is available with pre-compiled functional Gene Ontology (GO) and promoter sequence-derived data files for yeast, worm, fly, rat, mouse and human, supporting high-level analysis applied to data obtained from these six organisms. Conclusion EXPANDER's integrated capabilities and its built-in support of multiple organisms make it a very powerful tool for analysis of microarray data. The package is freely available for academic users at http://www.cs.tau.ac.il/~rshamir/expander

  5. Spherical Coordinate Systems for Streamlining Suited Mobility Analysis

    Science.gov (United States)

    Benson, Elizabeth; Cowley, Matthew S.; Harvill, Lauren; Rajulu, Sudhakar

    2014-01-01

    When describing human motion, biomechanists generally report joint angles in terms of Euler angle rotation sequences. However, there are known limitations in using this method to describe complex motions such as the shoulder joint during a baseball pitch. Euler angle notation uses a series of three rotations about an axis where each rotation is dependent upon the preceding rotation. As such, the Euler angles need to be regarded as a set to get accurate angle information. Unfortunately, it is often difficult to visualize and understand these complex motion representations. One of our key functions is to help design engineers understand how a human will perform with new designs and all too often traditional use of Euler rotations becomes as much of a hindrance as a help. It is believed that using a spherical coordinate system will allow ABF personnel to more quickly and easily transmit important mobility data to engineers, in a format that is readily understandable and directly translatable to their design efforts. Objectives: The goal of this project is to establish new analysis and visualization techniques to aid in the examination and comprehension of complex motions. Methods: This project consisted of a series of small sub-projects, meant to validate and verify the method before it was implemented in the ABF's data analysis practices. The first stage was a proof of concept, where a mechanical test rig was built and instrumented with an inclinometer, so that its angle from horizontal was known. The test rig was tracked in 3D using an optical motion capture system, and its position and orientation were reported in both Euler and spherical reference systems. The rig was meant to simulate flexion/extension, transverse rotation and abduction/adduction of the human shoulder, but without the variability inherent in human motion. In the second phase of the project, the ABF estimated the error inherent in a spherical coordinate system, and evaluated how this error would
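
    As a small illustration of the alternative representation, the sketch below converts a limb "pointing" vector into spherical coordinates (azimuth and elevation); the axis convention is an assumption made for illustration and is not taken from the abstract.

    # Sketch: describe a limb direction by azimuth/elevation instead of an
    # Euler-angle triple (z taken as vertical, x as forward; an assumption).
    import math

    def to_spherical(x, y, z):
        r = math.sqrt(x*x + y*y + z*z)
        azimuth = math.degrees(math.atan2(y, x))    # rotation about the vertical axis
        elevation = math.degrees(math.asin(z / r))  # angle above the horizontal plane
        return r, azimuth, elevation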

  6. Intercomparison of gamma ray analysis software packages

    International Nuclear Information System (INIS)

    The IAEA undertook an intercomparison exercise to review available software for gamma ray spectra analysis. This document describes the methods used in the intercomparison exercise, characterizes the software packages reviewed and presents the results obtained. Only direct results are given without any recommendation for a particular software or method for gamma ray spectra analysis

  7. Web Server Suite for Complex Mixture Analysis by Covariance NMR

    OpenAIRE

    Zhang, Fengli; Robinette, Steve; Bruschweiler-Li, Lei; Brüschweiler, Rafael

    2009-01-01

    Elucidation of the chemical composition of biological samples is a main focus of systems biology and metabolomics. Their comprehensive study requires reliable, efficient, and automatable methods to identify and quantify the underlying metabolites. Because nuclear magnetic resonance (NMR) spectroscopy is a rich source of molecular information, it has a unique potential for this task. Here we present a suite of public web servers (http://spinportal.magnet.fsu.edu), termed COLMAR, that facilitat...

  8. Software reliability analysis in probabilistic risk analysis

    International Nuclear Information System (INIS)

    Probabilistic Risk Analysis (PRA) is a tool which can reveal shortcomings of the NPP design in general. PRA analysts have not had sufficient guiding principles in modelling particular digital components malfunctions. Digital I and C systems are mostly analysed simply and the software reliability estimates are engineering judgments often lacking a proper justification. The OECD/NEA Working Group RISK's task DIGREL develops a taxonomy of failure modes of digital I and C systems. The EU FP7 project HARMONICS develops software reliability estimation method based on an analytic approach and Bayesian belief network. (author)

  9. Software Security Analysis : Managing source code audit

    OpenAIRE

    Persson, Daniel; Baca, Dejan

    2004-01-01

    Software users have become more conscious of security. More people have access to the Internet and to huge databases of security exploits. To make secure products, software developers must acknowledge this threat and take action. A first step is to perform a software security analysis. The software security analysis was performed using automatic auditing tools. An experimental environment was constructed to check if the findings were exploitable or not. Open source projects were used as reference to...

  10. Human Factors Analysis in Software Engineering

    Institute of Scientific and Technical Information of China (English)

    Xu Ren-zuo; Ma Ruo-feng; Liu Li-na; Xiong Zhong-wei

    2004-01-01

    General human factors analysis examines human functions, effects and influence in a system. In a narrower sense, it analyzes human influence upon the reliability of a system; it includes traditional human reliability analysis, human error analysis, man-machine interface analysis, human character analysis, and others. Whether a software development project in software engineering succeeds or not is completely determined by human factors. In this paper, we discuss the scope of human factors, demonstrate the importance of human factors analysis for software engineering by listing some instances, and finally take a preliminary look at the mentality that a practitioner in software engineering should possess.

  11. Integrating security analysis and safeguards software engineering

    Energy Technology Data Exchange (ETDEWEB)

    Spencer, D.D.; Axline, R.M.

    1989-01-01

    These initiatives will work together to provide more secure safeguards software, as well as other critical systems software. The resulting design tools and methodologies, the evolving guidelines for software security, and the adversary-resistant software components will be applied to the software design at each stage to increase the design's inherent security and to make the design easier to analyze. The resident hardware monitor or other architectural innovations will provide complementary additions to the design to remove some of the burden of security from the software. The security analysis process, supported by new analysis methodologies and tools, will be applied to the software design as it evolves in an attempt to identify and remove vulnerabilities at the earliest possible point in the safeguards system life cycle. The result should be better and more verifiably secure software systems.

  12. Reliability analysis of software based safety functions

    International Nuclear Information System (INIS)

    The methods applicable in the reliability analysis of software-based safety functions are described in the report. Although the safety functions also include other components, the main emphasis in the report is on the reliability analysis of software. Checklist-type qualitative reliability analysis methods, such as failure mode and effects analysis (FMEA), are described, as well as software fault tree analysis. Safety analysis based on Petri nets is discussed. The most essential concepts and models of quantitative software reliability analysis are described. The most common software metrics and their combined use with software reliability models are discussed. The application of software reliability models in PSA is evaluated; it is observed that the recent software reliability models do not directly produce the estimates needed in PSA. As a result of the study, some recommendations and conclusions are drawn. Among these are the need for formal methods in the analysis and development of software-based systems, the applicability of qualitative reliability engineering methods in connection with PSA, and the need to make the requirements for software-based systems and their analyses in the regulatory guides more precise. (orig.). (46 refs., 13 figs., 1 tab.)
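
    As one example of the kind of quantitative model referred to above, the Goel-Okumoto non-homogeneous Poisson process model is sketched below; it is offered as a generic illustration, not as a method prescribed by the report, and the parameters a and b would have to be fitted to observed failure data.

    # Goel-Okumoto NHPP software-reliability model (generic illustration).
    # a = expected total number of faults, b = fault detection rate.
    import math

    def expected_faults(t, a, b):
        """Mean value function m(t) = a * (1 - exp(-b * t))."""
        return a * (1.0 - math.exp(-b * t))

    def failure_intensity(t, a, b):
        """lambda(t) = dm/dt = a * b * exp(-b * t)."""
        return a * b * math.exp(-b * t)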

  13. Petri net modeling and software safety analysis: methodology for an embedded military application.

    OpenAIRE

    Lewis, Alan D.

    1988-01-01

    Approved for public release; distribution is unlimited. This thesis investigates the feasibility of software safety analysis using Petri net modeling and an automated suite of Petri Net UTilities (P-NUT) developed at UC Irvine. We briefly introduce software safety concepts, Petri nets, reachability theory, and the use of P-NUT. We then develop a methodology to combine these ideas for efficient and effective preliminary safety analysis of a real-time, embedded software, ...
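
    A minimal sketch of the reachability computation that such tool suites automate is given below; the Petri net encoding (dictionaries mapping places to token counts) is our own simplification and is unrelated to P-NUT's input formats.

    # Breadth-first reachability analysis of a small Petri net (illustrative).
    from collections import deque

    def enabled(marking, pre):
        return all(marking.get(p, 0) >= n for p, n in pre.items())

    def fire(marking, pre, post):
        m = dict(marking)
        for p, n in pre.items():
            m[p] = m.get(p, 0) - n
        for p, n in post.items():
            m[p] = m.get(p, 0) + n
        return m

    def reachable_markings(initial, transitions):
        """transitions: list of (pre, post) pairs, each a dict place -> token count."""
        key = lambda m: tuple(sorted((p, n) for p, n in m.items() if n))
        seen = {key(initial)}
        queue = deque([initial])
        while queue:
            m = queue.popleft()
            for pre, post in transitions:
                if enabled(m, pre):
                    nxt = fire(m, pre, post)
                    if key(nxt) not in seen:
                        seen.add(key(nxt))
                        queue.append(nxt)
        return seen

    Safety questions, such as whether a hazardous marking is ever reachable, then reduce to membership tests on the returned set, which is the essence of reachability-based analysis.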

  14. Linear Analysis and Verification Suite for Edge Turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Myra, J R; Umansky, M

    2008-04-24

    The edge and scrape-off-layer region of a tokamak plasma is subject to well known resistive and ideal instabilities that are driven by various curvature- and sheath-related mechanisms. While the boundary plasma is typically strongly turbulent in experiments, it is useful to have computational tools that can analyze the linear eigenmode structure, predict quantitative trends in growth rates, and elucidate the underlying drive mechanisms. Furthermore, measurement of the linear growth rate of unstable modes emerging from a known, established equilibrium configuration provides one of the few quantitative ways of rigorously benchmarking large-scale plasma turbulence codes with each other and with a universal standard. In this report, a suite of codes that can describe linearized, nonlocal (e.g. separatrix-spanning) modes in axisymmetric (realistic divertor), toroidal geometry is discussed. Examples of several benchmark comparisons are given, and future development plans for a new eigenvalue edge code are presented.

  15. Analysis strategies and software for geodetic VLBI

    OpenAIRE

    Haas, R.

    2004-01-01

    This article describes currently used analysis strategies and data analysis software for geodetic VLBI. Today's geodetic observing strategies are briefly presented, and the geodetic VLBI observables and data modeling are briefly discussed. A short overview is given on existing geodetic VLBI software packages and the statistical approaches that are applied. Necessary improvements of today's analysis software are described. Some of the future expectations and goals of geodetic VLBI are presented a...

  16. Integrated Methodology for Software Reliability Analysis

    Directory of Open Access Journals (Sweden)

    Marian Pompiliu CRISTESCU

    2012-01-01

    Full Text Available The most widely used techniques for ensuring the safety and reliability of systems are applied together as a whole, and in most cases the software components are overlooked or analyzed too little. The present paper describes the applicability of fault tree analysis to software systems, an analysis known as Software Fault Tree Analysis (SFTA); the fault trees are evaluated using binary decision diagrams, and all of these are integrated and used with the help of a Java reliability library.
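
    For orientation, the quantity that such an analysis ultimately produces can be illustrated with a tiny sketch that computes the top-event probability of an AND/OR fault tree with independent basic events; production tools do this with binary decision diagrams, so the direct recursion below is only an illustrative stand-in.

    # Top-event probability of an AND/OR fault tree with independent basic events.
    def tree_probability(node, basic_probs):
        """node: ('basic', name) or ('and', [children]) or ('or', [children])."""
        kind, payload = node
        if kind == 'basic':
            return basic_probs[payload]
        child_ps = [tree_probability(c, basic_probs) for c in payload]
        if kind == 'and':
            p = 1.0
            for q in child_ps:
                p *= q
            return p
        p = 1.0                      # 'or' gate: 1 - product of complements
        for q in child_ps:
            p *= (1.0 - q)
        return 1.0 - p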

  17. Software architecture analysis of usability

    NARCIS (Netherlands)

    Folmer, Eelke

    2005-01-01

    One of the qualities that has received increased attention in recent decades is usability. A software product with poor usability is likely to fail in a highly competitive market; therefore software developing organizations are paying more and more attention to ensuring the usability of their softwa

  18. Analysis strategies and software for geodetic VLBI

    CERN Document Server

    Haas, R

    2004-01-01

    This article describes currently used analysis strategies and data analysis software for geodetic VLBI. Today's geodetic observing strategies are briefly presented, and the geodetic VLBI observables and data modeling are briefly discussed. A short overview is given on existing geodetic VLBI software packages and the statistical approaches that are applied. Necessary improvements of today's analysis software are described. Some of the future expectations and goals of geodetic VLBI are presented and the corresponding consequences for the VLBI technique are explained. This includes consequences in terms of technical development and corresponding improvements in data modeling and analysis software.

  19. A Coupled Calculation Suite for Atucha II Operational Transients Analysis

    Directory of Open Access Journals (Sweden)

    Oscar Mazzantini

    2011-01-01

    Full Text Available While more than a decade ago reactor and thermal hydraulic calculations were tedious and often needed a lot of approximations and simplifications that forced the designers to take a very conservative approach, computational resources available nowadays allow engineers to cope with increasingly complex problems in a reasonable time. The use of best-estimate calculations provides tools to justify convenient engineering margins, reduces costs, and maximises economic benefits. In this direction, a suite of coupled best-estimate specific calculation codes was developed to analyse the behaviour of the Atucha II nuclear power plant in Argentina. The developed tool includes three-dimensional spatial neutron kinetics, a channel-level model of the core thermal hydraulics with subcooled boiling correlations, a one-dimensional model of the primary and secondary circuits including pumps, steam generators, heat exchangers, and the turbine with all their associated control loops, and a complete simulation of the reactor control, limitation, and protection system working in closed-loop conditions as a faithful representation of the real power plant. In the present paper, a description of the coupling scheme between the codes involved is given, and some examples of their application to Atucha II are shown.

  20. Analysis of open source GIS software

    OpenAIRE

    Božnis, Andrius

    2006-01-01

    GIS is one of the most promising spheres of information technology. GIS combines digital image analysis and database systems, which makes it widely applicable but also highly demanding in terms of skills. There is a lot of commercial GIS software which is well advertised and whose functionality is fairly well known, while open source software tends to be overlooked. In this diploma work, an analysis is made of the open source GIS software available on the Internet, in the scope of different projects interr...

  1. Safety Analysis of an Evolving Software Architecture

    OpenAIRE

    de Lemos, Rogério

    2000-01-01

    The safety analysis of an evolving software system has to consider the impact that changes might have on the software components, and to provide confidence that the risk is acceptable. If the impact of a change is not thoroughly analysed, accidents can occur as a result of faulty interactions between components, for example. However, the process of safety analysis can be enhanced if appropriate abstractions are provided for modelling and analysing software components and their interactions. I...

  2. Hazard Analysis for the Mark III Space Suit Assembly (SSA) Used in One-g Operations

    Science.gov (United States)

    Mitchell, Kate; Ross, Amy; Blanco, Raul; Wood, Art

    2012-01-01

    This Hazard Analysis document encompasses the Mark III Space Suit Assembly (SSA) and associated ancillary equipment. It has been prepared using JSC17773, "Preparing Hazard Analyses for JSC Ground Operation", as a guide. The purpose of this document is to present the potential hazards involved in ground (23 % maximum O2, One-g) operations of the Mark III and associated ancillary support equipment system. The hazards listed in this document are specific to suit operations only; each supporting facility (Bldg. 9, etc.) is responsible for test specific Hazard Analyses. A "hazard" is defined as any condition that has the potential for harming personnel or equipment. This analysis was performed to document the safety aspects associated with manned use of the Mark III for pressurized and unpressurized ambient, ground-based, One-g human testing. The hazards identified herein represent generic hazards inherent to all standard JSC test venues for nominal ground test configurations. Non-standard test venues or test specific configurations may warrant consideration of additional hazards analysis prior to test. The cognizant suit engineer is responsible for the safety of the astronaut/test subject, space suit, and suit support personnel. The test requester, for the test supported by the suit test engineer and suited subject, is responsible for overall safety and any necessary Test Readiness Reviews (TRR).

  3. DSN Data Visualization Suite

    Science.gov (United States)

    Bui, Bach X.; Malhotra, Mark R.; Kim, Richard M.

    2009-01-01

    The DSN Data Visualization Suite is a set of computer programs and reusable Application Programming Interfaces (APIs) that assist in the visualization and analysis of Deep Space Network (DSN) spacecraft-tracking data, which can include predicted and actual values of downlink frequencies, uplink frequencies, and antenna-pointing angles in various formats that can include tables of values and polynomial coefficients. The data can also include lists of antenna-pointing events, lists of antenna- limit events, and schedules of tracking activities. To date, analysis and correlation of these intricately related data before and after tracking have been difficult and time-consuming. The DSN Data Visualization Suite enables operators to quickly diagnose tracking-data problems before, during, and after tracking. The Suite provides interpolation on demand and plotting of DSN tracking data, correlation of all data on a given temporal point, and display of data with color coding configurable by users. The suite thereby enables rapid analysis of the data prior to transmission of the data to DSN control centers. At the control centers, the same suite enables operators to validate the data before committing the data to DSN subsystems. This software is also Web-enabled to afford its capabilities to international space agencies.

  4. Numerical methods in software and analysis

    CERN Document Server

    Rice, John R

    1992-01-01

    Numerical Methods, Software, and Analysis, Second Edition introduces science and engineering students to the methods, tools, and ideas of numerical computation. Introductory courses in numerical methods face a fundamental problem: there is too little time to learn too much. This text solves that problem by using high-quality mathematical software. In fact, the objective of the text is to present scientific problem solving using standard mathematical software. This book discusses numerous programs and software packages focusing on the IMSL library (including the PROTRAN system) and ACM Algorithm

  5. GRACAT, Software for grounding and collision analysis

    DEFF Research Database (Denmark)

    Friis-Hansen, Peter; Simonsen, Bo Cerup

    2002-01-01

    From 1998 to 2001 an integrated software package for grounding and collision analysis was developed at the Technical University of Denmark within the ISESO project at the cost of six man years (0.75M US$). The software provides a toolbox for a multitude of analyses related to collision...... route where the result is the probability density functions for the cost of oil outflow in a given area per year for the two vessels. In this paper we describe the basic modelling principles and the capabilities of the software package. The software package can be downloaded for research purposes from...

  6. Development of integrated transport analysis suite for LHD plasmas towards transport model validation and increased predictability

    International Nuclear Information System (INIS)

    In this study, the integrated transport analysis suite, TASK3D-a, was developed to enhance the physics understanding and accurate discussion of the Large Helical Device (LHD) experiment toward facilitating transport model validation. Steady-state and dynamic (transient) transport analyses of NBI (neutral-beam-injection)-heated LHD plasmas have been greatly facilitated by this suite. This will increase the predictability of the transport properties of LHD plasmas toward reactor-relevant regimes and reactor-scale plasmas. (author)

  7. Software safety analysis practice in installation phase

    Energy Technology Data Exchange (ETDEWEB)

    Huang, H. W.; Chen, M. H.; Shyu, S. S., E-mail: hwhwang@iner.gov.t [Institute of Nuclear Energy Research, No. 1000 Wenhua Road, Chiaan Village, Longtan Township, 32546 Taoyuan County, Taiwan (China)

    2010-10-15

    This work performed a software safety analysis in the installation phase of the Lungmen nuclear power plant in Taiwan, in cooperation between the Institute of Nuclear Energy Research and TPC. The US Nuclear Regulatory Commission requires licensees to perform software safety analysis and software verification and validation in each phase of the software development life cycle, per Branch Technical Position 7-14. In this work, 37 safety-grade digital instrumentation and control systems were analyzed by failure mode and effects analysis, as suggested by IEEE standard 7-4.3.2-2003. During the installation phase, skew tests for the safety-grade network and point-to-point tests were performed. The failure mode and effects analysis showed that all the single failure modes can be resolved by redundant means. Most of the common mode failures can be resolved by operator manual actions. (Author)

  8. A methodology, based on a language's properties, for the selection and validation of a suite of software metrics.

    OpenAIRE

    Bodnar, Roger P. Jr.

    1997-01-01

    Software Engineering has attempted to improve the software development process for over two decades. A primary attempt at this process lies in the arena of measurement. "You can't control what you can't measure" [DEMT82]. This thesis attempts to measure the development of multimedia products. Multimedia languages seem to be the trend of future languages. Problem areas such as Education, Instruction, Training, and Information Systems require that various media allow the achievement of suc...

  9. Software acquisition: a business strategy analysis

    OpenAIRE

    Farbey, B.; Finkelstein, A.

    2001-01-01

    The paper argues that there are new insights to be gained from a strategic analysis of requirements engineering. The paper is motivated by a simple question: what does it take to be a world class software acquirer? The question has relevance for requirements engineers because for many organisations market pressures mean that software is commonly acquired rather than developed from scratch. The paper builds on the work of C. H. Fine (1998) who suggests that product, process and supply chain sh...

  10. MathWeb: a concurrent image analysis tool suite for multispectral data fusion

    Science.gov (United States)

    Achalakul, Tiranee; Haaland, Peter D.; Taylor, Stephen

    1999-03-01

    This paper describes a preliminary approach to the fusion of multi-spectral image data for the analysis of cervical cancer. The long-term goal of this research is to define spectral signatures and automatically detect cancer cell structures. The approach combines a multi-spectral microscope with an image analysis tool suite, MathWeb. The tool suite incorporates a concurrent Principal Component Transform (PCT) that is used to fuse the multi-spectral data. This paper describes the general approach and the concurrent PCT algorithm. The algorithm is evaluated from both the perspective of image quality and performance scalability.
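
    The fusion step can be illustrated with a plain NumPy principal component transform that projects each pixel's band vector onto the first principal component; this sketch ignores the concurrency aspect of MathWeb and is not its implementation.

    # Principal component transform (PCT) fusion of multispectral bands.
    import numpy as np

    def pct_fuse(cube):
        """cube: array of shape (bands, rows, cols); returns one fused band."""
        bands, rows, cols = cube.shape
        X = cube.reshape(bands, -1).astype(float)
        X -= X.mean(axis=1, keepdims=True)   # remove per-band means
        eigvals, eigvecs = np.linalg.eigh(np.cov(X))
        pc1 = eigvecs[:, -1]                 # eigenvector of the largest eigenvalue
        return (pc1 @ X).reshape(rows, cols)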

  11. Acoustic Emission Analysis Applet (AEAA) Software

    Science.gov (United States)

    Nichols, Charles T.; Roth, Don J.

    2013-01-01

    NASA Glenn Research and NASA White Sands Test Facility have developed software supporting an automated pressure vessel structural health monitoring (SHM) system based on acoustic emissions (AE). The software, referred to as the Acoustic Emission Analysis Applet (AEAA), provides analysts with a tool that can interrogate data collected on Digital Wave Corp. and Physical Acoustics Corp. software using a wide spectrum of powerful filters and charts. This software can be made to work with any data once the data format is known. The applet will compute basic AE statistics, and statistics as a function of time and pressure (see figure). AEAA provides value added beyond the analysis provided by the respective vendors' analysis software. The software can handle data sets of unlimited size. A wide variety of government and commercial applications could benefit from this technology, notably requalification and usage tests for compressed gas and hydrogen-fueled vehicles. Future enhancements will add features similar to a "check engine" light on a vehicle. Once installed, the system will ultimately be used to alert International Space Station crewmembers to critical structural instabilities, but will have little impact to missions otherwise. Diagnostic information could then be transmitted to experienced technicians on the ground in a timely manner to determine whether pressure vessels have been impacted, are structurally unsound, or can be safely used to complete the mission.

  12. A 'Toolbox' Equivalent Process for Safety Analysis Software

    International Nuclear Information System (INIS)

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 (Quality Assurance for Safety-Related Software) identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls that prevent or mitigate potential accidents. The development and maintenance of a collection, or 'toolbox', of multiple-site use, standard solution, Software Quality Assurance (SQA)-compliant safety software is one of the major improvements identified in the associated DOE Implementation Plan (IP). The DOE safety analysis toolbox will contain a set of appropriately quality-assured, configuration-controlled, safety analysis codes, recognized for DOE-broad, safety basis applications. Currently, six widely applied safety analysis computer codes have been designated for toolbox consideration. While the toolbox concept considerably reduces SQA burdens among DOE users of these codes, many users of unique, single-purpose, or single-site software may still have sufficient technical justification to continue use of their computer code of choice, but are thwarted by the multiple-site condition on toolbox candidate software. The process discussed here provides a roadmap for an equivalency argument, i.e., establishing satisfactory SQA credentials for single-site software that can be deemed ''toolbox-equivalent''. The process is based on the model established to meet IP Commitment 4.2.1.2: Establish SQA criteria for the safety analysis ''toolbox'' codes. Implementing criteria that establish the set of prescriptive SQA requirements are based on implementation plan/procedures from the Savannah River Site, also incorporating aspects of those from the Waste Isolation Pilot Plant (SNL component) and the Yucca Mountain Project. The major requirements are met with evidence of a software quality assurance plan, software requirements and design documentation, user's instructions, test report, a

  13. Software abstractions logic, language, and analysis

    CERN Document Server

    Jackson, Daniel

    2011-01-01

    In Software Abstractions Daniel Jackson introduces an approach to software design that draws on traditional formal methods but exploits automated tools to find flaws as early as possible. This approach--which Jackson calls "lightweight formal methods" or "agile modeling"--takes from formal specification the idea of a precise and expressive notation based on a tiny core of simple and robust concepts but replaces conventional analysis based on theorem proving with a fully automated analysis that gives designers immediate feedback. Jackson has developed Alloy, a language that captures the essence of software abstractions simply and succinctly, using a minimal toolkit of mathematical notions. This revised edition updates the text, examples, and appendixes to be fully compatible with the latest version of Alloy (Alloy 4). The designer can use automated analysis not only to correct errors but also to make models that are more precise and elegant. This approach, Jackson says, can rescue designers from "the tarpit of...

  14. Power and performance software analysis and optimization

    CERN Document Server

    Kukunas, Jim

    2015-01-01

    Power and Performance: Software Analysis and Optimization is a guide to solving performance problems in modern Linux systems. Power-efficient chips are no help if the software those chips run on is inefficient. Starting with the necessary architectural background as a foundation, the book demonstrates the proper usage of performance analysis tools in order to pinpoint the cause of performance problems, and includes best practices for handling common performance issues those tools identify. Provides expert perspective from a key member of Intel's optimization team on how processors and memory

  15. Apollo/Skylab suit program management systems study. Volume 2: Cost analysis

    Science.gov (United States)

    1974-01-01

    The business management methods employed in the performance of the Apollo-Skylab Suit Program are studied. The data accumulated over the span of the contract as well as the methods used to accumulate the data are examined. Management methods associated with the monitoring and control of resources applied towards the performance of the contract are also studied and recommended upon. The primary objective is the compilation, analysis, and presentation of historical cost performance criteria. Cost data are depicted for all phases of the Apollo-Skylab program in common, meaningful terms, whereby the data may be applicable to future suit program planning efforts.

  16. Objective facial photograph analysis using imaging software.

    Science.gov (United States)

    Pham, Annette M; Tollefson, Travis T

    2010-05-01

    Facial analysis is an integral part of the surgical planning process. Clinical photography has long been an invaluable tool in the surgeon's practice not only for accurate facial analysis but also for enhancing communication between the patient and surgeon, for evaluating postoperative results, for medicolegal documentation, and for educational and teaching opportunities. From 35-mm slide film to the digital technology of today, clinical photography has benefited greatly from technological advances. With the development of computer imaging software, objective facial analysis becomes easier to perform and less time consuming. Thus, while the original purpose of facial analysis remains the same, the process becomes much more efficient and allows for some objectivity. Although clinical judgment and artistry of technique is never compromised, the ability to perform objective facial photograph analysis using imaging software may become the standard in facial plastic surgery practices in the future. PMID:20511080

  17. The emerging Web 2.0 social software: an enabling suite of sociable technologies in health and health care education.

    Science.gov (United States)

    Kamel Boulos, Maged N; Wheeler, Steve

    2007-03-01

    Web 2.0 sociable technologies and social software are presented as enablers in health and health care, for organizations, clinicians, patients and laypersons. They include social networking services, collaborative filtering, social bookmarking, folksonomies, social search engines, file sharing and tagging, mashups, instant messaging, and online multi-player games. The more popular Web 2.0 applications in education, namely wikis, blogs and podcasts, are but the tip of the social software iceberg. Web 2.0 technologies represent a quite revolutionary way of managing and repurposing/remixing online information and knowledge repositories, including clinical and research information, in comparison with the traditional Web 1.0 model. The paper also offers a glimpse of future software, touching on Web 3.0 (the Semantic Web) and how it could be combined with Web 2.0 to produce the ultimate architecture of participation. Although the tools presented in this review look very promising and potentially fit for purpose in many health care applications and scenarios, careful thinking, testing and evaluation research are still needed in order to establish 'best practice models' for leveraging these emerging technologies to boost our teaching and learning productivity, foster stronger 'communities of practice', and support continuing medical education/professional development (CME/CPD) and patient education.

  18. Software for computerised analysis of cardiotocographic traces.

    Science.gov (United States)

    Romano, M; Bifulco, P; Ruffo, M; Improta, G; Clemente, F; Cesarelli, M

    2016-02-01

    Despite the widespread use of cardiotocography in foetal monitoring, the evaluation of foetal status suffers from considerable inter- and intra-observer variability. In order to overcome the main limitations of visual cardiotocographic assessment, computerised methods to analyse cardiotocographic recordings have recently been developed. In this study, new software for automated analysis of foetal heart rate is presented. It provides an automatic procedure for measuring the most relevant parameters derivable from cardiotocographic traces. Simulated and real cardiotocographic traces were analysed to test software reliability. In artificial traces, we simulated a set number of events (accelerations, decelerations and contractions) to be recognised. For real signals, the results of the computerised analysis were instead compared with the visual assessment performed by 18 expert clinicians, and three performance indexes were computed to characterise the performance of the proposed software. The software showed preliminary performance we judged satisfactory, in that the results fully matched the requirements, as proved by the tests on artificial signals in which all simulated events were detected by the software. The performance indexes computed against the obstetricians' evaluations are, on the contrary, less satisfactory: sensitivity was 93%, positive predictive value 82% and accuracy 77%. This most probably arises from the high variability of trace annotation among clinicians. PMID:26638805
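
    Purely as an illustration of the performance indexes quoted above (this is not code from the cited study), the following Python sketch computes sensitivity, positive predictive value and accuracy from hypothetical confusion-matrix counts obtained by comparing automatic event detections with clinicians' annotations.

      # Illustrative sketch: performance indexes for automated CTG event detection,
      # computed from hypothetical confusion-matrix counts (tp, fp, fn, tn).
      def performance_indexes(tp, fp, fn, tn):
          """Return (sensitivity, positive predictive value, accuracy)."""
          sensitivity = tp / (tp + fn)            # fraction of annotated events detected
          ppv = tp / (tp + fp)                    # fraction of detections that are true events
          accuracy = (tp + tn) / (tp + fp + fn + tn)
          return sensitivity, ppv, accuracy

      se, ppv, acc = performance_indexes(tp=93, fp=20, fn=7, tn=35)   # hypothetical counts
      print(f"sensitivity={se:.0%}  PPV={ppv:.0%}  accuracy={acc:.0%}")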

  20. The Pragmatic Analysis on Offensive Words-With Suits as a Case

    Institute of Scientific and Technical Information of China (English)

    高盈盈; 李卉艳

    2016-01-01

    Research on offensive words has focused mainly on people in symmetric power contexts and has ignored dynamic contexts, which has left the mechanisms behind the realization forms of offensive words without systematic analysis. Based on an analysis of Suits, this paper finds that power plays a significant role in constraining people's offensive words, and that their realization forms vary across different power contexts.

  1. Advanced Software Methods for Physics Analysis

    International Nuclear Information System (INIS)

    Unprecedented data analysis complexity is experienced in modern High Energy Physics experiments. The complexity arises from the growing size of recorded data samples, the large number of data analyses performed by different users in each single experiment, and the level of complexity of each single analysis. For this reason, the requirements on software for data analysis impose a very high level of reliability. We present two concrete examples: the first is the BaBar experience with the migration to a new Analysis Model, including the definition of a new model for the Event Data Store; the second is a toolkit for multivariate statistical and parametric Monte Carlo analysis developed using generic programming.

  2. Software for analysis of visual meteor data

    Science.gov (United States)

    Veljković, Kristina; Ivanović, Ilija

    2014-02-01

    In this paper, we present new software for the analysis of IMO data collected from visual observations. The software consists of a package of functions written in the statistical programming language R, together with a Java application that uses these functions in a user-friendly environment. The R code contains various filters for the selection of data, methods for the calculation of the Zenithal Hourly Rate (ZHR), solar longitude and population index, and graphical representation of the ZHR and of the distribution of observed magnitudes. The Java application allows everyone to use these functions without any knowledge of R. Both the R code and the Java application are open source and free, with user manuals and examples provided.
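
    As a hedged illustration of the kind of calculation those R functions perform (the package's own code is not reproduced here), the Python sketch below applies the standard zenithal hourly rate formula to a single observing interval; all input values are hypothetical.

      import math

      def zhr(n_meteors, t_eff_hours, lm, radiant_alt_deg, r=2.5, f=1.0):
          # Zenithal Hourly Rate for one observing interval:
          #   n_meteors        meteors counted in the interval
          #   t_eff_hours      effective observing time in hours
          #   lm               stellar limiting magnitude of the observer
          #   radiant_alt_deg  radiant altitude in degrees
          #   r                population index of the shower
          #   f                field obstruction correction (1.0 = unobstructed)
          correction = f * r ** (6.5 - lm) / math.sin(math.radians(radiant_alt_deg))
          return n_meteors / t_eff_hours * correction

      print(zhr(n_meteors=12, t_eff_hours=1.5, lm=5.9, radiant_alt_deg=42.0))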

  3. Towards software analysis as a service

    OpenAIRE

    Ghezzi, G; H.C. Gall

    2008-01-01

    Throughout the years software engineers have come up with a myriad of specialized tools and techniques that focus on a certain type of analysis, such as metrics extraction, evolution tracking, co-change detection, bug prediction, all the way up to social network analysis of team dynamics. However, easy and straightforward synergies between these analyses/tools rarely exist because of their stand-alone nature, their platform dependence, their different input and output formats and the vari...

  4. Intraprocedural dataflow analysis for software product lines

    DEFF Research Database (Denmark)

    Brabrand, Claus; Ribeiro, Márcio; Tolêdo, Társis;

    2013-01-01

    Software product lines (SPLs) developed using annotative approaches such as conditional compilation come with an inherent risk of constructing erroneous products. For this reason, it is essential to be able to analyze such SPLs. However, as dataflow analysis techniques are not able to deal with SP...... and memory characteristics on five qualitatively different SPLs. On our benchmarks, the combined analysis strategy is up to almost eight times faster than the brute-force approach....

  5. Image analysis software and sample preparation demands

    Science.gov (United States)

    Roth, Karl n.; Wenzelides, Knut; Wolf, Guenter; Hufnagl, Peter

    1990-11-01

    Image analysis offers the opportunity to analyse many processes in medicine, biology and engineering in a quantitative manner. Experience shows that it is only by awareness of preparation methods and attention to software design that full benefit can be reaped from a picture processing system in the fields of cytology and histology. Some examples of special stains for automated analysis are given here, and the effectiveness of commercially available software packages is investigated. The application of picture processing and the development of related special hardware and software have been increasing in recent years. As PC-based picture processing systems can be purchased at reasonable cost, more and more users are confronted with these problems. Experience shows that the quality of commercially available software packages differs, and the requirements on sample preparation needed for successful problem solutions are often underestimated. But as always, sample preparation is still the key to success in automated image analysis of cells and tissues. Hence, a problem solution requires permanent interaction between sample preparation methods and algorithm development.

  6. Improving Software Systems By Flow Control Analysis

    Directory of Open Access Journals (Sweden)

    Piotr Poznanski

    2012-01-01

    Full Text Available Using agile methods during the implementation of a system that meets mission-critical requirements can be a real challenge. Changing a system built of dozens or even hundreds of specialized devices with embedded software requires the cooperation of a large group of engineers. This article presents a solution that supports the parallel work of groups of system analysts and software developers. Applying formal rules to requirements written in natural language enables formal analysis of artifacts that form a bridge between software and system requirements. The formalism and textual form of the requirements allowed the automatic generation of a message flow graph for the (sub)system, called the “big-picture model”. Flow diagram analysis helped to avoid a large number of defects whose repair cost, in extreme cases, could undermine the legitimacy of agile methods in projects of this scale. Retrospectively, a reduction of technical debt was observed. Continuous analysis of the “big-picture model” improves the control of the quality parameters of the software architecture. The article also tries to explain why a commercial platform based on the UML modeling language may not be sufficient in projects of this complexity.
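
    A minimal sketch, not the authors' implementation, of one check that such a “big-picture model” makes possible: build a message flow graph from (sender, receiver) pairs extracted from formalized requirements and flag subsystems that are unreachable from the entry point. All component names are hypothetical.

      from collections import defaultdict, deque

      # Hypothetical message flows extracted from formalized requirements
      flows = [("Operator", "Gateway"), ("Gateway", "Controller"),
               ("Controller", "DriveUnit"), ("Diagnostics", "Logger")]

      graph, nodes = defaultdict(set), set()
      for sender, receiver in flows:
          graph[sender].add(receiver)
          nodes.update((sender, receiver))

      # Breadth-first search from the system entry point
      reachable, queue = {"Operator"}, deque(["Operator"])
      while queue:
          for nxt in graph[queue.popleft()]:
              if nxt not in reachable:
                  reachable.add(nxt)
                  queue.append(nxt)

      print("Unreachable from entry point:", sorted(nodes - reachable))
      # -> ['Diagnostics', 'Logger']: a potential requirements defect to review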

  7. Residence time distribution software analysis. User's manual

    International Nuclear Information System (INIS)

    Radiotracer applications cover a wide range of industrial activities in chemical and metallurgical processes, water treatment, mineral processing, environmental protection and civil engineering. Experiment design, data acquisition, treatment and interpretation are the basic elements of tracer methodology. The application of radiotracers to determine the impulse response, i.e. the residence time distribution (RTD), as well as the technical conditions for conducting experiments in industry and in the environment, creates a need for data processing using special software. Important progress has been made during recent years in the preparation of software programs for data treatment and interpretation. The software package developed for industrial process analysis and diagnosis by stimulus-response methods contains all the methods for data processing of radiotracer experiments.
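
    As a hedged illustration of the basic data treatment such software performs (this is not the package itself), the sketch below normalizes a measured tracer response C(t) into the residence time distribution E(t) and computes the mean residence time and variance; the tracer data are synthetic.

      import numpy as np

      def integrate(y, x):
          # Trapezoidal integration, kept explicit for portability
          return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

      # Synthetic detector response to an impulse (Dirac) tracer injection
      t = np.linspace(0.0, 120.0, 241)                 # time, s
      c = np.exp(-(t - 40.0) ** 2 / (2 * 12.0 ** 2))   # counts, arbitrary units

      e = c / integrate(c, t)                  # E(t), normalized to unit area
      mrt = integrate(t * e, t)                # mean residence time (first moment)
      var = integrate((t - mrt) ** 2 * e, t)   # variance (second central moment)
      print(f"mean residence time = {mrt:.1f} s, variance = {var:.1f} s^2")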

  8. DDS-Suite - A Dynamic Data Acquisition, Processing, and Analysis System for Wind Tunnel Testing

    Science.gov (United States)

    Burnside, Jathan J.

    2012-01-01

    Wind tunnels have optimized their steady-state data systems for acquisition and analysis and have even implemented large dynamic-data acquisition systems; however, development of near real-time processing and analysis tools for dynamic data has lagged. DDS-Suite is a set of tools used to acquire, process, and analyze large amounts of dynamic data. Each phase of the testing process (acquisition, processing, and analysis) is handled by separate components so that bottlenecks in one phase of the process do not affect the others, leading to a robust system. DDS-Suite is capable of acquiring 672 channels of dynamic data at a rate of 275 MB/s. More than 300 channels of the system use 24-bit analog-to-digital cards and are capable of producing data with less than 0.01 of phase difference at 1 kHz. System architecture, design philosophy, and examples of use during NASA Constellation and Fundamental Aerodynamic tests are discussed.

  9. Automating Risk Analysis of Software Design Models

    Directory of Open Access Journals (Sweden)

    Maxime Frydman

    2014-01-01

    Full Text Available The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  10. Development of Advanced Suite of Deterministic Codes for VHTR Physics Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kang Seog; Cho, J. Y.; Lee, K. H. (and others)

    2007-07-15

    An advanced suite of deterministic codes for VHTR physics analysis has been developed for detailed analysis of current and advanced reactor designs as part of a US-ROK collaborative I-NERI project. The code suite includes the conventional two-step procedure, in which few-group constants are generated by a transport lattice calculation and the reactor physics analysis is performed by a three-dimensional diffusion calculation, as well as a whole-core transport code that can model local heterogeneities directly at the core level. Particular modeling issues in the physics analysis of gas-cooled VHTRs were resolved, including the double heterogeneity of the coated fuel particles, neutron streaming in the coolant channels, a strong core-reflector interaction, and large spectrum shifts due to changes of the surrounding environment, temperature and burnup. The geometry handling capability of the DeCART code was also extended to deal with the hexagonal fuel elements of the VHTR core. The developed code suites were validated and verified by comparing the computational results with those of Monte Carlo calculations for the benchmark problems.

  11. Complexity of software trustworthiness and its dynamical statistical analysis methods

    Institute of Scientific and Technical Information of China (English)

    ZHENG ZhiMing; MA ShiLong; LI Wei; JIANG Xin; WEI Wei; MA LiLi; TANG ShaoTing

    2009-01-01

    Developing trusted software has become an important trend and a natural choice in the development of software technology and applications. At present, methods for the measurement and assessment of software trustworthiness cannot guarantee safe and reliable operation of software systems completely and effectively. Based on the study of dynamical systems, this paper interprets the characteristics of the behaviour of software systems and the basic scientific problems of software trustworthiness complexity, analyses the characteristics of the complexity of software trustworthiness, and proposes to study software trustworthiness measurement in terms of the complexity of software trustworthiness. Using dynamical statistical analysis methods, the paper advances an invariant-measure-based assessment method of software trustworthiness by statistical indices, and thereby provides a dynamical criterion for the untrustworthiness of software systems. By an example, the feasibility of the proposed dynamical statistical analysis method in software trustworthiness measurement is demonstrated using numerical simulations and theoretical analysis.

  12. Platform Independent Dynamic Java Virtual Machine Analysis: the Java Grande Forum Benchmark Suite

    OpenAIRE

    Daly, Charles; Horgan, Jane; Power, James; Waldron, John

    2001-01-01

    In this paper we present a platform independent analysis of the dynamic profiles of Java programs when executing on the Java Virtual Machine. The Java programs selected are taken from the Java Grande Forum benchmark suite, and five different Java-to-bytecode compilers are analysed. The results presented describe the dynamic instruction usage frequencies, as well as the sizes of the local variable, parameter and operand stacks during execution on the JVM. These results,...

  13. Software analysis in the semantic web

    Science.gov (United States)

    Taylor, Joshua; Hall, Robert T.

    2013-05-01

    Many approaches in software analysis, particularly dynamic malware analysis, benefit greatly from the use of linked data and other Semantic Web technology. In this paper, we describe AIS, Inc.'s Semantic Extractor (SemEx) component from the Malware Analysis and Attribution through Genetic Information (MAAGI) effort, funded under DARPA's Cyber Genome program. The SemEx generates OWL-based semantic models of high- and low-level behaviors in malware samples from system call traces generated by AIS's introspective hypervisor, IntroVirtTM. Within MAAGI, these semantic models were used by modules that cluster malware samples by functionality and construct "genealogical" malware lineages. Herein, we describe the design, implementation, and use of the SemEx, as well as the C2DB, an OWL ontology used for representing software behavior and cyber-environments.

  14. Analysis and design for architecture-based software

    Institute of Scientific and Technical Information of China (English)

    Jia Xiaolin; He Jian; Qin Zheng; Wang Xianghua

    2005-01-01

    The technologies of software architecture are introduced, and the software analysis-and-design process is divided into requirement analysis, software architecture design and system design. Using these technologies, a model of an architecture-centric software analysis and design process (ACSADP) is proposed. Meanwhile, with regard to the completeness, consistency and correctness between the software requirements and design results, the theories of function and process control are applied to ACSADP. Finally, a model of an integrated development environment (IDE) for ACSADP is proposed. Practice demonstrates that the ACSADP model can aid developers in managing the software process effectively and improve the quality of software analysis and design.

  15. A Lexical Analysis of Social Software Literature

    OpenAIRE

    Loay ALTAMIMI

    2013-01-01

    Social software are today more prevalent in organizational context, providing new ways for work and giving web users new opportunities for interaction and collaboration. This review aims to gain insight into the extent of available scholarly and professional literature on these new tools and into interests in this field. The analysis of the 5356 collected articles includes type of publication, year of publication, source, keywords in articles' titles and abstracts. The study here adopted a sy...

  16. Towards an Analysis of Daylighting Simulation Software

    Directory of Open Access Journals (Sweden)

    Juan J. Sendra

    2011-06-01

    Full Text Available The aim of this article was to assess some of the main lighting software programs habitually used in architecture, subjecting them to a series of trials and analyzing the light distribution obtained in situations with different orientations, dates and geometry. The analysis examines Lightscape 3.2, Desktop Radiance 2.0, Lumen Micro 7.5, Ecotect 5.5 and Dialux 4.4.

  17. Towards an Analysis of Daylighting Simulation Software

    OpenAIRE

    Sendra, Juan J.; Jaime Navarro; Ignacio Acosta

    2011-01-01

    The aim of this article was to assess some of the main lighting software programs habitually used in architecture, subjecting them to a series of trials and analyzing the light distribution obtained in situations with different orientations, dates and geometry. The analysis examines Lightscape 3.2, Desktop Radiance 2.0, Lumen Micro 7.5, Ecotect 5.5 and Dialux 4.4.

  18. Calibration of the Quadrupole Mass Spectrometer of the Sample Analysis at Mars Instrument Suite

    Science.gov (United States)

    Mahaffy, P. R.; Trainer, M. G.; Eigenbrode, J. L.; Franz, H. B.; Stern, J. C.; Harpold, D.; Conrad, P. G.; Raaen, E.; Lyness, E.

    2011-01-01

    The SAM suite of instruments on the "Curiosity" Rover of the Mars Science Laboratory (MSL) is designed to provide chemical and isotopic analysis of organic and inorganic volatiles for both atmospheric and solid samples. The mission of the MSL investigations is to advance beyond the successful search for aqueous transformation in surface environments at Mars toward a quantitative assessment of habitability and preservation through a series of chemical and geological measurements. The SAM suite was delivered in December 2010 (Figure 1) to the Jet Propulsion Laboratory for integration into the Curiosity Rover. We previously outlined the range of SAM solid and gas calibrations implemented or planned, and here we discuss a specific set of calibration experiments to establish the response of the SAM Quadrupole Mass Spectrometer (QMS) to the four most abundant gases in the Martian atmosphere: CO2, N2, Ar, and O2. A full SAM instrument description and calibration report is presently in preparation.

  19. Software Metrics: Some degree of software measurement and analysis

    OpenAIRE

    Rakesh. L; Manoranjan Kumar Singh; Gunaseelan Devaraj

    2010-01-01

    Measurement lies at the heart of many systems that govern our lives. Measurement is essential to our daily life; measuring has become commonplace and well accepted. Engineering disciplines use methods that are based on models and theories. Methodological improvements alone do not make an engineering discipline. Measurement encourages us to improve our processes and products. This paper examines the realm of software engineering to see why measurement is needed and also sets the scene for ...

  20. Digital PIV (DPIV) Software Analysis System

    Science.gov (United States)

    Blackshire, James L.

    1997-01-01

    A software package was developed to provide a Digital PIV (DPIV) capability for NASA LaRC. The system provides an automated image capture, test correlation, and autocorrelation analysis capability for the Kodak Megaplus 1.4 digital camera system for PIV measurements. The package includes three separate programs that, when used together with the PIV data validation algorithm, constitutes a complete DPIV analysis capability. The programs are run on an IBM PC/AT host computer running either Microsoft Windows 3.1 or Windows 95 using a 'quickwin' format that allows simple user interface and output capabilities to the windows environment.
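
    A simplified sketch, not the NASA LaRC package, of the core correlation step in DPIV: the particle displacement between two interrogation windows is estimated from the location of the peak of their FFT-based cross-correlation. The interrogation windows here are synthetic.

      import numpy as np

      def displacement(window_a, window_b):
          # Integer-pixel displacement estimate via FFT-based cross-correlation
          a = window_a - window_a.mean()
          b = window_b - window_b.mean()
          corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
          corr = np.fft.fftshift(corr)
          peak = np.unravel_index(np.argmax(corr), corr.shape)
          center = np.array(corr.shape) // 2
          return np.array(peak) - center        # (dy, dx) in pixels

      # Synthetic test: shift a random particle image by (3, 5) pixels
      rng = np.random.default_rng(0)
      img = rng.random((32, 32))
      print(displacement(img, np.roll(img, shift=(3, 5), axis=(0, 1))))   # ~ [3 5]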

  1. A Lexical Analysis of Social Software Literature

    Directory of Open Access Journals (Sweden)

    Loay ALTAMIMI

    2013-01-01

    Full Text Available Social software are today more prevalent in organizational context, providing new ways for work and giving web users new opportunities for interaction and collaboration. This review aims to gain insight into the extent of available scholarly and professional literature on these new tools and into interests in this field. The analysis of the 5356 collected articles includes type of publication, year of publication, source, keywords in articles' titles and abstracts. The study here adopted a systematic approach for the literature review, that is, the principle of Lexical Analysis.

  2. EDA: EXAFS data analysis software package

    Science.gov (United States)

    Kuzmin, A.

    1995-02-01

    The present paper describes the EXAFS data analysis software package, called EDA, originally developed by the author for IBM PC compatible computers. It consists of a set of interactive programs which allow the user to carry out all steps of the EXAFS data analysis procedure. There are two main differences from known packages. First, a significantly improved algorithm is used for atomic-like background removal in the EXAFS extraction procedure. Second, a model-independent derivation of the radial distribution function from EXAFS, based on a maximum-entropy-like algorithm, is available.
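
    A heavily simplified sketch of the standard EXAFS extraction step such packages implement (the EDA algorithms themselves are not reproduced): remove a smooth atomic-like background mu0(E) from the absorption coefficient and convert the energy above the edge to the photoelectron wavenumber k. The absorption data are synthetic, and a low-order polynomial stands in for the package's improved background-removal algorithm.

      import numpy as np

      e0 = 8979.0                                        # hypothetical edge energy, eV
      energy = np.linspace(e0 + 20.0, e0 + 800.0, 400)   # post-edge region
      k = 0.5123 * np.sqrt(energy - e0)                  # wavenumber in 1/Angstrom
      mu = 1.0 + 1e-4 * (energy - e0) + 0.02 * np.sin(4.0 * k)   # synthetic mu(E)

      # Atomic-like background mu0(E): a smooth low-order polynomial fit
      mu0 = np.polyval(np.polyfit(energy, mu, deg=3), energy)

      chi = (mu - mu0) / mu0            # normalized EXAFS oscillations chi(k)
      print(f"k = {k[0]:.2f}..{k[-1]:.2f} 1/A, max |chi| = {np.abs(chi).max():.4f}")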

  3. Analysis of software for modeling atmospheric dispersion

    International Nuclear Information System (INIS)

    During the last few years, a number of software packages for microcomputers have appeared with the aim of simulating the diffusion of atmospheric pollutants. These codes, which simplify the models used for safety analyses of industrial plants, are becoming more useful and are even used for post-accidental conditions. The report presents, for the first time and in a critical manner, the principal models available to date. The problem lies in adapting the models to the post-accidental interventions demanded of them. In parallel to this action, a performance analysis was carried out, that is, identifying the need to forecast the most appropriate actions to be performed, bearing in mind the short time available and the lack of information. Because of these difficulties, it is possible to simplify the software so that it does not include all the options but can deal with a specific situation. This would enable minimisation of the data to be collected on the site.

  4. ISON Data Acquisition and Analysis Software

    Science.gov (United States)

    Kouprianov, Vladimir

    2013-08-01

    Since the first days of the ISON project, its success has been strongly based on the use of advanced data analysis techniques and their implementation in software. Space debris studies and space surveillance in the optical band are very specific with respect to observation techniques and thus impose extremely particular requirements on sensor design and control and on initial data analysis, dictated mostly by the fast apparent motion of the space objects being studied. From the point of view of data acquisition and analysis software, this implies support for sophisticated scheduling, complex tracking, accurate timing, large fields of view, and undersampled CCD images with trailed sources. Here we present the historical outline, major goals and design concepts of the standard ISON data acquisition and analysis packages, and how they meet these requirements. Among these packages, the most important are: the CHAOS telescope control system (TCS), its recent successor FORTE, and Apex II ‒ a platform for astronomical image analysis with a focus on high-precision astrometry and photometry of fast-moving objects and transient phenomena. Development of these packages is supported by ISON, and they are now responsible for most of the raw data produced by the network. They are installed on nearly all sensors and are available to all participants of the ISON collaboration.

  5. Software Reliability Growth Model with Logistic-Exponential Test-Effort Function and Analysis of Software Release Policy

    OpenAIRE

    Shaik. Mohammad Rafi; Dr.K.Nageswara Rao; Shaheda Akthar

    2010-01-01

    Software reliability is one of the important factors of software quality. Before software is delivered to market, it is thoroughly checked and errors are removed. Every software company wants to develop software that is error-free. Software reliability growth models help the software industry to develop software which is error-free and reliable. In this paper an analysis is done based on incorporating the logistic-exponential testing-effort into NHPP Software reliability growt...

  6. Static analysis of software the abstract interpretation

    CERN Document Server

    Boulanger, Jean-Louis

    2013-01-01

    The existing literature currently available to students and researchers is very general, covering only the formal techniques of static analysis. This book presents real examples of the formal techniques called "abstract interpretation" currently being used in various industrial fields: railway, aeronautics, space, automotive, etc. The purpose of this book is to present students and researchers, in a single book, with the wealth of experience of people who are intrinsically involved in the realization and evaluation of software-based safety-critical systems. As the authors are people curr

  7. R suite for the Reduction and Analysis of UFO Orbit Data

    Science.gov (United States)

    Campbell-Burns, P.; Kacerek, R.

    2016-02-01

    This paper presents work undertaken by UKMON to compile a suite of simple R scripts for the reduction and analysis of meteor data. The application of R in this context is by no means an original idea and there is no doubt that it has been used already in many reports to the IMO. However, we are unaware of any common libraries or shared resources available to the meteor community. By sharing our work we hope to stimulate interest and discussion. Graphs shown in this paper are illustrative and are based on current data from both EDMOND and UKMON.

  8. Development of software for airborne photos analysis

    Science.gov (United States)

    Rudowicz-Nawrocka, J.; Tomczak, R. J.; Nowakowski, K.; Mueller, W.; Kujawa, S.

    2014-04-01

    Systems of the UAV/UAS type enable the acquisition of huge amounts of data, such as images. IT systems are necessary for their storage and analysis, and existing systems do not always allow researchers to perform the operations they wish to [1]. The purpose of the research is to automate the process of recognizing objects and phenomena occurring on grasslands. The basis for this work is a large collection of images taken from an oktokopter [2]. For the collection, management and analysis of the image and character data acquired in the course of the research, several computer programs have been produced in accordance with the principles of software engineering. The resulting software differs in functionality and type. The applications were made using a number of popular technologies. The choice of so many technologies was primarily dictated by the possibilities of their use for specific tasks, their availability on different platforms and the ability to distribute them as open source. The applications presented by the authors, designed to assess the status of grassland based on aerial photography, show the complexity of the issues but at the same time point to further research.

  9. STAR: Software Toolkit for Analysis Research

    Energy Technology Data Exchange (ETDEWEB)

    Doak, J.E.; Prommel, J.M.; Whiteson, R.; Hoffbauer, B.L.; Thomas, T.R. [Los Alamos National Lab., NM (United States); Helman, P. [New Mexico Univ., Albuquerque, NM (United States). Dept. of Computer Science

    1993-08-01

    Analyzing vast quantities of data from diverse information sources is an increasingly important element for nonproliferation and arms control analysis. Much of the work in this area has used human analysts to assimilate, integrate, and interpret complex information gathered from various sources. With the advent of fast computers, we now have the capability to automate this process, thereby shifting this burden away from humans. In addition, there now exist huge data storage capabilities which have made it possible to formulate large integrated databases comprising many terabytes of information spanning a variety of subjects. We are currently designing a Software Toolkit for Analysis Research (STAR) to address these issues. The goal of STAR is to produce a research tool that facilitates the development and interchange of algorithms for locating phenomena of interest to nonproliferation and arms control experts. One major component deals with the preparation of information. The ability to manage and effectively transform raw data into a meaningful form is a prerequisite for analysis by any methodology. The relevant information to be analyzed can be unstructured text, structured data, signals, or images. Text can be numerical and/or character, stored in raw data files, databases, streams of bytes, or compressed into bits, in formats ranging from fixed, to character-delimited, to a count followed by content. The data can be analyzed in real-time or batch mode. Once the data are preprocessed, different analysis techniques can be applied. Some are built using expert knowledge. Others are trained using data collected over a period of time. Currently, we are considering three classes of analyzers for use in our software toolkit: (1) traditional machine learning techniques, (2) purely statistical systems, and (3) expert systems.

  10. ANALYSIS OF SOFTWARE COST ESTIMATION MODELS

    OpenAIRE

    Tahir Abdullah; Rabia Saleem; Shahbaz Nazeer; Muhammad Usman

    2012-01-01

    Software cost estimation is a process of forecasting the cost of a project in terms of budget, time, and other resources needed to complete a software system, and it is a core issue in software project management to estimate the cost of a project before initiating the software project. Different models have been developed to estimate the cost of software projects over the last several years. Most of these models rely on the analysts' experience, the size of the software project and some other sof...

  11. Visual querying and analysis of large software repositories

    NARCIS (Netherlands)

    Voinea, Lucian; Telea, Alexandru

    2009-01-01

    We present a software framework for mining software repositories. Our extensible framework enables the integration of data extraction from repositories with data analysis and interactive visualization. We demonstrate the applicability of the framework by presenting several case studies performed on

  12. Software Speeds Up Analysis of Breast Cancer Risk

    Science.gov (United States)

    Source: https://medlineplus.gov/news/fullstory_161117.html. THURSDAY, Sept. 22, 2016 (HealthDay News) -- Software that quickly analyzes mammograms and patient history to ...

  13. Analysis of Software Product Strategy at TPS

    OpenAIRE

    Oystryk, Gareth

    2010-01-01

    The purpose of this report is to help TPS make strategic decisions about the future of its human services software products. TPS is a privately held company that entered the software publishing industry in 2006 with the intent of selling products and services to the human services software market. However, TPS’ portfolio of products experienced uneven financial performance over the last four years, resulting in the need to reconsider its software product strategy. This report presents finding...

  14. Method for detecting software anomalies based on recurrence plot analysis

    Directory of Open Access Journals (Sweden)

    Michał Mosdorf

    2012-03-01

    Full Text Available This paper evaluates a method for detecting software anomalies based on recurrence plot analysis of the trace log generated by software execution. The described method for detecting software anomalies is based on windowed recurrence quantification analysis for selected measures (e.g. recurrence rate - RR or determinism - DET). Initial results show that the proposed method is useful in detecting silent software anomalies that do not result in typical crashes (e.g. exceptions).
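
    As an illustration of the recurrence measures named above (not the authors' implementation), the sketch below computes the recurrence rate (RR) and a simple determinism (DET) estimate for one window of a time series; the embedding parameters, threshold and series are hypothetical, and the main diagonal is kept for simplicity.

      import numpy as np

      def recurrence_measures(x, dim=3, delay=1, eps=0.2, l_min=2):
          # Time-delay embedding of the window
          n = len(x) - (dim - 1) * delay
          emb = np.column_stack([x[i * delay:i * delay + n] for i in range(dim)])
          # Recurrence matrix: 1 where embedded states are closer than eps
          dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
          rec = (dist < eps).astype(int)
          rr = rec.mean()                      # recurrence rate
          # DET: fraction of recurrence points on diagonal lines of length >= l_min
          diag_points = 0
          for off in range(-(n - 1), n):
              run = 0
              for v in np.diagonal(rec, offset=off):
                  if v:
                      run += 1
                  else:
                      if run >= l_min:
                          diag_points += run
                      run = 0
              if run >= l_min:
                  diag_points += run
          det = diag_points / rec.sum() if rec.sum() else 0.0
          return rr, det

      x = np.sin(np.linspace(0.0, 20.0, 200))
      x += 0.05 * np.random.default_rng(1).standard_normal(200)
      print(recurrence_measures(x))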

  15. Image processing and analysis software development

    International Nuclear Information System (INIS)

    The work presented in this project is aimed at developing software, 'IMAGE GALLERY', to investigate various image processing and analysis techniques. The work was divided into two parts, namely the image processing techniques and pattern recognition, the latter comprising character and face recognition. Various image enhancement techniques, including negative imaging, contrast stretching, compression of dynamic range, neon, diffuse, emboss etc., have been studied. Segmentation techniques, including point detection, line detection and edge detection, have been studied. Some of the smoothing and sharpening filters have also been investigated. All these imaging techniques have been implemented in a window-based computer program written in Visual Basic. Neural network techniques based on the perceptron model have been applied for face and character recognition. (author)

  16. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    International Nuclear Information System (INIS)

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time and manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies the elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it possible to use it anywhere the internet is accessible. By switching the underlying nuclide library and the related formulas, the new software can easily be expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.

  17. Analysis of Empirical Software Effort Estimation Models

    CERN Document Server

    Basha, Saleem

    2010-01-01

    Reliable effort estimation remains an ongoing challenge for software engineers. Accurate effort estimation is the state of the art of software engineering; effort estimation of software is the preliminary phase between the client and the business enterprise. The relationship between the client and the business enterprise begins with the estimation of the software. The credibility of the client to the business enterprise increases with accurate estimation. Effort estimation often requires generalizing from a small number of historical projects. Generalization from such limited experience is an inherently under-constrained problem. Accurate estimation is a complex process because it can be visualized as software effort prediction, and as the term indicates, a prediction never becomes an actual value. This work follows the basics of the empirical software effort estimation models. The goal of this paper is to study empirical software effort estimation. The primary conclusion is that no single technique is best for all sit...
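
    As a concrete example of one widely known empirical effort model of the kind this paper surveys (not the paper's own code), the sketch below evaluates the Basic COCOMO equations, which predict effort in person-months and schedule in months from size in thousands of lines of code (KLOC).

      # Basic COCOMO coefficients (a, b, c, d) per project mode
      MODES = {
          "organic":       (2.4, 1.05, 2.5, 0.38),
          "semi-detached": (3.0, 1.12, 2.5, 0.35),
          "embedded":      (3.6, 1.20, 2.5, 0.32),
      }

      def basic_cocomo(kloc, mode="organic"):
          # Effort = a * KLOC^b person-months; schedule = c * Effort^d months
          a, b, c, d = MODES[mode]
          effort = a * kloc ** b
          return effort, c * effort ** d

      effort, months = basic_cocomo(32.0, "semi-detached")   # hypothetical 32 KLOC project
      print(f"effort = {effort:.1f} person-months, schedule = {months:.1f} months")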

  18. Mutation Analysis Approach to Develop Reliable Object-Oriented Software

    Directory of Open Access Journals (Sweden)

    Monalisa Sarma

    2014-01-01

    Full Text Available In general, modern programs are large and complex, and it is essential that they be highly reliable in applications. In order to develop highly reliable software, the Java programming language provides a rich set of exceptions and exception handling mechanisms. Exception handling mechanisms are intended to help developers build robust programs. Given a program with exception handling constructs, for effective testing we need to detect whether all possible exceptions are raised and caught or not. However, complex exception handling constructs make it tedious to trace which exceptions are handled and where, and which exceptions are passed on. In this paper, we address this problem and propose a mutation analysis approach to develop reliable object-oriented programs. We have applied a number of mutation operators to create a large set of mutant programs with different types of faults. We then generate test cases and test data to uncover exception-related faults. The test suite so obtained is applied to the mutant programs, measuring the mutation score and hence verifying whether the test suite is effective or not. We have tested our approach with a number of case studies to substantiate the efficacy of the proposed mutation analysis technique.
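
    A toy sketch of the mutation-analysis idea, applied here to Python rather than the Java programs of the cited work: a single mutation operator swaps '+' for '-', the mutant is executed against a test case, and the mutation score is the fraction of generated mutants that the test suite kills.

      import ast

      class SwapAddSub(ast.NodeTransformer):
          # Mutation operator: replace '+' with '-' in binary expressions
          def visit_BinOp(self, node):
              self.generic_visit(node)
              if isinstance(node.op, ast.Add):
                  node.op = ast.Sub()
              return node

      source = "def total(a, b):\n    return a + b\n"

      # Generate the mutant and load it into an isolated namespace
      tree = SwapAddSub().visit(ast.parse(source))
      ast.fix_missing_locations(tree)
      mutant_ns = {}
      exec(compile(tree, "<mutant>", "exec"), mutant_ns)

      killed = mutant_ns["total"](2, 3) != 5       # the test expects 2 + 3 == 5
      mutation_score = int(killed) / 1             # killed mutants / generated mutants
      print(f"mutant killed: {killed}, mutation score: {mutation_score:.0%}")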

  19. Software patterns, knowledge maps, and domain analysis

    CERN Document Server

    Fayad, Mohamed E; Hegde, Srikanth GK; Basia, Anshu; Vakil, Ashka

    2014-01-01

    Contents: Preface; Acknowledgments; Authors; INTRODUCTION: An Overview of Knowledge Maps; Introduction: Key Concepts - Software Stable Models, Knowledge Maps, Pattern Language, Goals, Capabilities (Enduring Business Themes + Business Objects); The Motivation; The Problem; The Objectives; Overview of Software Stability Concepts; Overview of Knowledge Maps; Pattern Languages versus Knowledge Maps: A Brief Comparison; The Solution; Knowledge Maps Methodology or Concurrent Software Development Model; Why Knowledge Maps?; Research Methodology Undertaken; Research Verification and Validation; The Stratification of This Book; Summary

  20. Analysis on Some of Software Reliability Models

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Software reliability & maintainability evaluation tool (SRMET 3.0), developed by the Software Evaluation and Test Center of China Aerospace Mechanical Corporation, is introduced in detail in this paper. SRMET 3.0 is supported by seven software reliability models and four software maintainability models. The numerical characteristics of all these models are studied in depth in this paper, and the corresponding numerical algorithms for each model are also given.

  1. A suite of R packages for web-enabled modeling and analysis of surface waters

    Science.gov (United States)

    Read, J. S.; Winslow, L. A.; Nüst, D.; De Cicco, L.; Walker, J. I.

    2014-12-01

    Researchers often create redundant methods for downloading, manipulating, and analyzing data from online resources. Moreover, the reproducibility of science can be hampered by complicated and voluminous data, lack of time for documentation and long-term maintenance of software, and fear of exposing programming skills. The combination of these factors can encourage unshared one-off programmatic solutions instead of openly provided reusable methods. Federal and academic researchers in the water resources and informatics domains have collaborated to address these issues. The result of this collaboration is a suite of modular R packages that can be used independently or as elements in reproducible analytical workflows. These documented and freely available R packages were designed to fill basic needs for the effective use of water data: the retrieval of time-series and spatial data from web resources (dataRetrieval, geoknife), performing quality assurance and quality control checks of these data with robust statistical methods (sensorQC), the creation of useful data derivatives (including physically- and biologically-relevant indices; GDopp, LakeMetabolizer), and the execution and evaluation of models (glmtools, rLakeAnalyzer). Here, we share details and recommendations for the collaborative coding process, and highlight the benefits of an open-source tool development pattern with a popular programming language in the water resources discipline (such as R). We provide examples of reproducible science driven by large volumes of web-available data using these tools, explore benefits of accessing packages as standardized web processing services (WPS) and present a working platform that allows domain experts to publish scientific algorithms in a service-oriented architecture (WPS4R). We assert that in the era of open data, tools that leverage these data should also be freely shared, transparent, and developed in an open innovation environment.

  2. Visual querying and analysis of large software repositories

    OpenAIRE

    Voinea, Lucian; Telea, Alexandru

    2009-01-01

    We present a software framework for mining software repositories. Our extensible framework enables the integration of data extraction from repositories with data analysis and interactive visualization. We demonstrate the applicability of the framework by presenting several case studies performed on industry-size software repositories. In each study we use the framework to give answers to one or several software engineering questions addressing a specific project. Next, we validate the answers...

  3. A strategic analysis of a software company in transition

    OpenAIRE

    Larson, Marnie

    2005-01-01

    This project provides an in-depth analysis of a small software company attempting to transition between business models in an evolving software market. The market for HR/Payroll software solutions is consolidating quickly, and StarGarden Software must decide where its place in the market is and whether it makes sense for the company to continue to go it alone. Options include: utilizing resellers, downsizing, and becoming an acquisition target. StarGarden also has an exciting new product line in devel...

  4. Change Impact Analysis of Crosscutting in Software Architectural Design

    OpenAIRE

    Berg, van den, W.

    2006-01-01

    Software architectures should be amenable to changes in user requirements and implementation technology. The analysis of the impact of these changes can be based on traceability of architectural design elements. Design elements have dependencies with other software artifacts but also evolve in time. Crosscutting dependencies may have a strong influence on modifiability of software architectures. We present an impact analysis of crosscutting dependencies in architectural design. The analysis i...

  5. Regression Testing Cost Reduction Suite

    OpenAIRE

    Mohamed Alaa El-Din; Ismail Abd El-Hamid Taha; Hesham El-Deeb

    2014-01-01

    The estimated cost of software maintenance exceeds 70 percent of total software costs [1], and a large portion of this maintenance expense is devoted to regression testing. Regression testing is an expensive and frequently executed maintenance activity used to revalidate modified software. Any reduction in the cost of regression testing would help to reduce the overall software maintenance cost. Test suites, once developed, are reused and updated frequently as the software evolves. As a result, some...

  6. User manual for freight transportation analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Terziev, M.N.; Wilson, L.B.

    1976-12-01

    Under sponsorship of the Federal Energy Administration, The Center for Transportation Studies at M.I.T. developed and tested a methodology for analysis of the impacts of various government and carrier policies on the demand for freight transportation. The purpose of this document is to familiarize the reader with the computer programs included in this methodology. The purpose of the computer software developed for this project is threefold. First, programs are used to calculate the cost of each of the transport alternatives available for the purchase of a given commodity by a receiver in a given industrial sector. Furthermore, these programs identify the least-cost alternative, and thus provide a forecasting capability at the disaggregate level. Given a description of the population of receivers in the destination city, a second group of programs applies the costing and forecasting programs to each receiver in a sample drawn from the population. The disaggregate forecasts are summed to produce an aggregate forecast of modal tonnages for the given origin/destination city-pair. Finally, a third group of programs computes fuel consumed in transportation from the aggregate modal tonnages. These three groups of programs were placed under the control of a master routine which coordinates the input and output of data.
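
    A hedged sketch, not the M.I.T. programs, of the first two steps described above: compute a cost for each transport alternative available to a receiver, pick the least-cost alternative, and aggregate the chosen modal tonnages over a sample of receivers before estimating fuel use. All costs, tonnages and fuel intensities are hypothetical.

      from collections import defaultdict

      # Hypothetical cost per ton of each alternative (mode) seen by each receiver
      receivers = [
          {"tons": 120.0, "costs": {"rail": 14.0, "truck": 18.5, "barge": 12.0}},
          {"tons":  40.0, "costs": {"rail": 16.0, "truck": 15.0}},
          {"tons":  75.0, "costs": {"rail": 13.5, "truck": 17.0, "barge": 14.5}},
      ]

      modal_tonnage = defaultdict(float)
      for r in receivers:
          best_mode = min(r["costs"], key=r["costs"].get)    # least-cost alternative
          modal_tonnage[best_mode] += r["tons"]

      # Hypothetical fuel intensities (gallons per ton) to convert tonnage to fuel use
      fuel_per_ton = {"rail": 0.9, "truck": 3.2, "barge": 0.5}
      fuel = sum(fuel_per_ton[m] * t for m, t in modal_tonnage.items())
      print(dict(modal_tonnage), f"fuel ~ {fuel:.0f} gallons")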

  7. Analysis of Test Efficiency during Software Development Process

    CERN Document Server

    Nair, T R Gopalakrishnan; Tiwari, Pranesh Kumar

    2012-01-01

    One of the prerequisites of any organization is unvarying sustainability in a dynamic and competitive industrial environment. Development of high-quality software is therefore an inevitable constraint of any software industry. Defect management being one of the factors that most strongly influence the production of high-quality software, it is obligatory for software organizations to orient themselves towards effective defect management. Since the early days of software evolution, testing has been deemed a promising technique for defect management in all IT industries. This paper provides an empirical investigation of several projects through a case study comprising four software companies with various production capabilities. The aim of this investigation is to analyze the efficiency of the test team during the software development process. The study indicates very low test efficiency at the requirements analysis phase and even lower test efficiency at the design phase of software development. Subsequently, the study calls for a str...

  8. Development of a New VLBI Data Analysis Software

    Science.gov (United States)

    Bolotin, Sergei; Gipson, John M.; MacMillan, Daniel S.

    2010-01-01

    We present an overview of a new VLBI analysis software under development at NASA GSFC. The new software will replace CALC/SOLVE and many related utility programs. It will have the capabilities of the current system as well as incorporate new models and data analysis techniques. In this paper we give a conceptual overview of the new software. We formulate the main goals of the software. The software should be flexible and modular to implement models and estimation techniques that currently exist or will appear in future. On the other hand it should be reliable and possess production quality for processing standard VLBI sessions. Also, it needs to be capable of processing observations from a fully deployed network of VLBI2010 stations in a reasonable time. We describe the software development process and outline the software architecture.

  9. Spacecraft Trajectory Analysis and Mission Planning Simulation (STAMPS) Software

    Science.gov (United States)

    Puckett, Nancy; Pettinger, Kris; Hallstrom,John; Brownfield, Dana; Blinn, Eric; Williams, Frank; Wiuff, Kelli; McCarty, Steve; Ramirez, Daniel; Lamotte, Nicole; Vu, Tuan

    2014-01-01

    STAMPS simulates either three- or six-degree-of-freedom cases for all spacecraft flight phases using translated HAL flight software or generic GN&C models. Single or multiple trajectories can be simulated for use in optimization and dispersion analysis. It includes math models for the vehicle and environment, and currently features a "C" version of shuttle onboard flight software. The STAMPS software is used for mission planning and analysis within ascent/descent, rendezvous, proximity operations, and navigation flight design areas.

  10. Managing Software Project Risks (Analysis Phase) with Proposed Fuzzy Regression Analysis Modelling Techniques with Fuzzy Concepts

    OpenAIRE

    Elzamly, Abdelrafe; Hussin, Burairah

    2014-01-01

    The aim of this paper is to propose new mining techniques by which we can study the impact of different risk management techniques and different software risk factors on software analysis development projects. The new mining technique uses fuzzy multiple regression analysis techniques with fuzzy concepts to manage the risks in a software project and to mitigate risk through software process improvement. Top ten software risk factors in the analysis phase and thirty risk management techni...

  11. Rapid Optical Characterization Suite for in situ Target Analysis of Rock Surfaces Project

    Data.gov (United States)

    National Aeronautics and Space Administration — ROCSTAR is an in situ instrument suite that can accomplish rapid mineral and molecular identification without sample preparation for in situ planetary exploration;...

  12. New software for XRF quantitative analysis

    International Nuclear Information System (INIS)

    It is well known that XRF quantitative analysis with empirical calibrations requires a relatively large number of standards, even in the simplest case of binary mixtures. For samples containing more than three elements, the number of standards needed for calibration suddenly becomes prohibitive and the calibration curve has to be obtained by complicated multidimensional fits. In order to overcome this difficulty, new XRF analysis software has been developed, based exclusively on the theoretical treatment of photon interactions in the sample. Starting from the theoretical formulas of Shiraiwa and Fujino for primary and secondary fluorescence, modified to take into account the finite sample thickness, the total yield for a characteristic line Xj in a sample can be calculated as a function of its composition w vector = (w1,...,wn), where {wj} are the concentrations of all n constituent elements. A non-linear system can be written for a given sample with unknown composition. Choosing a number of equations equal to the number of identified elements, we obtain a non-linear system that can be solved numerically by Newton's algorithm. When a light element is known to be present in the sample and its lines cannot be seen in the spectrum (e.g. Al, C, etc.), the completeness equation Σwi = 1 must be added to the system to take into account the true composition. Based on the algorithm sketched above, a set of computer codes has been written, each one specific to one of the three types of excitation sources usually used in XRF: a collimated beam from an X-ray tube, a collimated isotopic source and a ring-like isotopic source. The ring-source version is complemented by a Monte Carlo code for calculating the incidence vs. detection angle weight matrix. Also, a version taking into account the chemical form of the compounds present in the sample has been written for each type of excitation source. The programs were tested on many samples with known composition and the results were always below 10
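
    A structural sketch of the numerical step described above, with a placeholder in place of the Shiraiwa-Fujino yield formulas (this is not the actual code): solve the non-linear system relating measured line intensities to concentrations, with the completeness equation replacing one intensity equation when a light element is invisible in the spectrum. All intensities and the yield function are hypothetical.

      import numpy as np
      from scipy.optimize import fsolve

      def predicted_yield(w, j):
          # Placeholder for the theoretical yield of line X_j as a function of the
          # composition vector w; the real code evaluates the Shiraiwa-Fujino
          # formulas with finite-thickness corrections.
          matrix_factor = 1.0 + 0.3 * w[(j + 1) % len(w)]    # fake inter-element effect
          return 100.0 * w[j] / matrix_factor

      measured = np.array([41.0, 28.0])   # hypothetical intensities of 2 visible lines
      n_elements = 3                      # a third, light element gives no visible line

      def equations(w):
          eqs = [measured[j] - predicted_yield(w, j) for j in range(len(measured))]
          eqs.append(np.sum(w) - 1.0)     # completeness equation for the unseen element
          return eqs

      w0 = np.full(n_elements, 1.0 / n_elements)     # initial guess
      print("estimated concentrations:", fsolve(equations, w0))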

  13. Runtime analysis of search heuristics on software engineering problems

    Institute of Scientific and Technical Information of China (English)

    Per Kristian LEHRE; Xin YAO

    2009-01-01

    Many software engineering tasks can potentially be automated using search heuristics. However, much work is needed in designing and evaluating search heuristics before this approach can be routinely applied to a software engineering problem. Experimental methodology should be complemented with theoretical analysis to achieve this goal. Recently, there have been significant theoretical advances in the runtime analysis of evolutionary algorithms (EAs) and other search heuristics in other problem domains. We suggest that these methods could be transferred and adapted to gain insight into the behaviour of search heuristics on software engineering problems while automating software engineering.

  14. Using the Beopt Automated Residential Simulation Test Suite to Enable Comparative Analysis Between Energy Simulation Engines: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Tabares-Velasco, Paulo Cesar [National Renewable Energy Lab. (NREL), Golden, CO (United States); Maguire, Jeff [National Renewable Energy Lab. (NREL), Golden, CO (United States); Horowitz, Scott [National Renewable Energy Lab. (NREL), Golden, CO (United States); Christensen, Craig [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2014-09-01

    Verification and validation are crucial software quality control procedures to follow when developing and implementing models. This is particularly important because a variety of stakeholders rely on accurate predictions from building simulation programs. This study uses the BEopt Automated Residential Simulation Test Suite (BARTS) to facilitate comparison of two energy simulation engines across various building components and includes building models that isolate the impacts of specific components on annual energy consumption. As a case study, BARTS has been used to identify important discrepancies between the engines for several components of the building models. These discrepancies are caused by differences in the algorithms used by the engines or coding errors.

  15. Using the BEopt Automated Residential Simulation Test Suite to Enable Comparative Analysis Between Energy Simulation Engines: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Tabares-Velasco, P. C.; Maguire, J.; Horowitz, S.; Christensen, C.

    2014-09-01

    Verification and validation are crucial software quality control procedures when developing and implementing models. This is particularly important as a variety of stakeholders rely on accurate predictions from building simulation programs. This study uses the BEopt Automated Residential Simulation Test Suite (BARTS) to facilitate comparison of two energy simulation engines across various building components and includes models that isolate the impacts of specific building components on annual energy consumption. As a case study, BARTS has been used to identify important discrepancies between the engines for several components of the building models; these discrepancies are caused by differences in the models used by the engines or coding errors.

  16. The Combustion Experiment on the Sample Analysis at Mars (SAM) Instrument Suite on the Curiosity Rover

    Science.gov (United States)

    Stern, J. C.; Malespin, C. A.; Eigenbrode, J. L.; Graham, H. V.; Archer, P. D., Jr.; Brunner, A. E.; Freissinet, C.; Franz, H. B.; Fuentes, J.; Glavin, D. P.; Leshin, L. A.; Mahaffy, P. R.; McAdam, A. C.; Ming, D. W.; Navvaro-Gonzales, R.; Niles, P. B.; Steele, A.

    2014-01-01

    The combustion experiment on the Sample Analysis at Mars (SAM) suite on Curiosity will heat a sample of Mars regolith in the presence of oxygen and measure composition of the evolved gases using quadrupole mass spectrometry (QMS) and tunable laser spectrometry (TLS). QMS will enable detection of combustion products such as CO, CO2, NO, and other oxidized species, while TLS will enable precise measurements of the abundance and carbon isotopic composition (delta(sup 13)C) of the evolved CO2 and hydrogen isotopic composition (deltaD) of H2O. SAM will perform a two-step combustion to isolate combustible materials below approx. 550 C and above approx. 550 C. The combustion experiment on SAM, if properly designed and executed, has the potential to answer multiple questions regarding the origins of volatiles seen thus far in SAM evolved gas analysis (EGA) on Mars. Constraints imposed by SAM and MSL time and power resources, as well as SAM consumables (oxygen gas), will limit the number of SAM combustion experiments, so it is imperative to design an experiment targeting the most pressing science questions. Low temperature combustion experiments will primarily target the quantification of carbon (and nitrogen) contributed by the SAM wet chemistry reagents MTBSTFA (N-Methyl-N-tert-butyldimethylsilyltrifluoroacetamide) and DMF (Dimethylformamide), which have been identified in the background of blank and sample runs and may adsorb to the sample while the cup is in the Sample Manipulation System (SMS). In addition, differences between the sample and "blank" may yield information regarding abundance and delta(sup 13)C of bulk (both organic and inorganic) martian carbon. High temperature combustion experiments primarily aim to detect refractory organic matter, if present in Cumberland fines, as well as address the question of quantification and deltaD value of water evolution associated with hydroxyl hydrogen in clay minerals.

  17. Design and Development of a Miniaturized Double Latching Solenoid Valve for the Sample Analysis at Mars Instrument Suite

    Science.gov (United States)

    Smith, James T.

    2008-01-01

    The development of the in-house Miniaturized Double Latching Solenoid Valve, or Microvalve, for the Gas Processing System (GPS) of the Sample Analysis at Mars (SAM) instrument suite is described. The Microvalve is a double latching solenoid valve that actuates a pintle shaft axially to hermetically seal an orifice. The key requirements and the design innovations implemented to meet them are described.

  18. Theoretical and software considerations for nonlinear dynamic analysis

    Science.gov (United States)

    Schmidt, R. J.; Dodds, R. H., Jr.

    1983-01-01

    In the finite element method for structural analysis, it is generally necessary to discretize the structural model into a very large number of elements to accurately evaluate displacements, strains, and stresses. As the complexity of the model increases, the number of degrees of freedom can easily exceed the capacity of present-day software systems. Improvements of structural analysis software including more efficient use of existing hardware and improved structural modeling techniques are discussed. One modeling technique that is used successfully in static linear and nonlinear analysis is multilevel substructuring. This research extends the use of multilevel substructure modeling to include dynamic analysis and defines the requirements for a general purpose software system capable of efficient nonlinear dynamic analysis. The multilevel substructuring technique is presented, the analytical formulations and computational procedures for dynamic analysis and nonlinear mechanics are reviewed, and an approach to the design and implementation of a general purpose structural software system is presented.
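
    As an illustration of the substructuring idea, the sketch below performs static condensation (Guyan reduction) of one substructure's interior degrees of freedom onto its boundary. This is only the linear static building block under simplified assumptions, not the dynamic formulation developed in the paper; the dynamic case must also reduce the mass matrix.

    # Static condensation of a substructure: interior DOFs (i) are eliminated
    # so that only boundary DOFs (b) couple to the rest of the model.
    import numpy as np

    def condense(K, f, interior, boundary):
        Kii = K[np.ix_(interior, interior)]
        Kib = K[np.ix_(interior, boundary)]
        Kbi = K[np.ix_(boundary, interior)]
        Kbb = K[np.ix_(boundary, boundary)]
        fi, fb = f[interior], f[boundary]
        K_red = Kbb - Kbi @ np.linalg.solve(Kii, Kib)   # condensed stiffness
        f_red = fb - Kbi @ np.linalg.solve(Kii, fi)     # condensed load
        return K_red, f_red

    In a multilevel scheme, the condensed substructures are themselves assembled and condensed again, level by level, until only the top-level boundary unknowns remain.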

  19. Fabrication and performance analysis of a DEA cuff designed for dry-suit applications

    Science.gov (United States)

    Ahmadi, S.; Camacho Mattos, A.; Barbazza, A.; Soleimani, M.; Boscariol, P.; Menon, C.

    2013-03-01

    A method for manufacturing a cylindrical dielectric elastomer actuator (DEA) is presented. The cylindrical DEA can be used in fabricating the cuff area of dry-suits where the garment is very tight and wearing the suit is difficult. When electrically actuated, the DEA expands radially and the suit can be worn more comfortably. In order to study the performance of the DEA, a customized testing setup was designed, and silicone-made cuff samples with different material stiffnesses were tested. Analytical and FEM modeling were considered to evaluate the experimental output. The results revealed that although the stiffness of the DEA material has a direct relationship with the radial constrictive pressure caused by mechanically stretching the DEA, it has a minor effect on the actuation pressure. It was also found that stacking multiple layers of the DEA to fabricate a laminated structure enabled the attainment of a desired variation of pressure required for the implementation of an electrically tunable cuff.

  20. Software metrics a guide to planning, analysis, and application

    CERN Document Server

    Pandian, C Ravindranath

    2003-01-01

    Software Metrics: A Guide to Planning, Analysis, and Application simplifies software measurement and explains its value as a pragmatic tool for management. Ideas and techniques presented in this book are derived from best practices. The ideas are field-proven, down to earth, and straightforward, making this volume an invaluable resource for those striving for process improvement.

  1. Integrated analysis software for bulk power system stability

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, T.; Nagao, T.; Takahashi, K. [Central Research Inst. of Electric Power Industry, Tokyo (Japan)

    1994-12-31

    This paper presents three software packages developed in-house by the Central Research Institute of Electric Power Industry (CRIEPI) for bulk power network analysis, together with the user support system that arranges, easily and with high reliability, the large amounts of data these packages require. (author) 3 refs., 7 figs., 2 tabs.

  2. TEST SUITE GENERATION PROCESS FOR AGENT TESTING

    Directory of Open Access Journals (Sweden)

    HOUHAMDI ZINA

    2011-04-01

    Full Text Available Software agents are a promising technology for today's complex, distributed systems. Methodologies and techniques that address testing and reliability of multi-agent systems are increasingly demanded, in particular to support automated test case generation and execution. In this paper, we introduce a novel approach for goal-oriented software agent testing. It specifies a testing process that complements the goal-oriented methodology Tropos and reinforces the mutual relationship between goal analysis and testing. Furthermore, it defines a structured and comprehensive agent test suite generation process by providing a systematic way of deriving test cases from goal analysis.

  3. The decommissioning and demolition of four suites of high active chemical analysis cells at DNPDE

    International Nuclear Information System (INIS)

    The decommissioning and demolition of four laboratory suites of high active cells at DNPDE is described. All four suites had suffered drain leaks of high active liquor into underfloor ducts; the options available at the time and current policy for dealing with the resultant activity deposits are given. The decommissioning procedures are detailed to provide information for future similar exercises. Features to ease demolition of such facilities and to eliminate the possibility of long term activity deposition from drain leaks are highlighted for incorporation in future designs. The waste arisings and radiation doses received during the work are tabulated. (author)

  4. Analysis of Hollinshed watershed using GIS software

    OpenAIRE

    Hipp, Michael.

    1999-01-01

    The objective of this study is to apply GIS and storm water modeling software to develop an accurate hydrologic model of the Hollinshed watershed. Use of GIS will allow the user to quickly change the land use of specific areas within the watershed to determine the hydrologic effects throughout the watershed using the storm water model. Specific objectives were to: (1) develop a GIS database for the Hollinshed watershed; (2) develop an appropriate link/node diagram and correspond...

  5. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Science.gov (United States)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.

  6. The development of automated behavior analysis software

    Science.gov (United States)

    Jaana, Yuki; Prima, Oky Dicky A.; Imabuchi, Takashi; Ito, Hisayoshi; Hosogoe, Kumiko

    2015-03-01

    The measurement of behavior for participants in a conversation scene involves verbal and nonverbal communication. Measurement validity may vary across observers because of factors such as human error, poorly designed measurement systems, and inadequate observer training. Although some systems have been introduced in previous studies to measure such behaviors automatically, these systems prevent participants from talking in a natural way. In this study, we propose a software application program to automatically analyze behaviors of the participants, including utterances, facial expressions (happy or neutral), head nods, and poses, using only a single omnidirectional camera. The camera is small enough to be embedded into a table to allow participants to have spontaneous conversation. The proposed software utilizes facial feature tracking based on a constrained local model to observe the changes of the facial features captured by the camera, and the Japanese female facial expression database to recognize expressions. Our experiment results show that there are significant correlations between measurements made by the observers and by the software.

  7. Free software for performing physical analysis of systems for digital radiography and mammography

    Energy Technology Data Exchange (ETDEWEB)

    Donini, Bruno; Lanconelli, Nico, E-mail: nico.lanconelli@unibo.it [Alma Mater Studiorum, Department of Physics and Astronomy, University of Bologna, Bologna 40127 (Italy); Rivetti, Stefano [Fisica Medica, Ospedale di Sassuolo S.p.A., Sassuolo 41049 (Italy); Bertolini, Marco [Medical Physics Unit, Azienda Ospedaliera ASMN, Istituto di Ricovero e Cura a Carattere Scientifico, Reggio Emilia 42123 (Italy)

    2014-05-15

    Purpose: In this paper, the authors present free software for assisting users in achieving the physical characterization of x-ray digital systems and image quality checks. Methods: The program was developed as a plugin for the well-known public-domain suite ImageJ. The software can assist users in calculating various physical parameters such as the response curve (also termed signal transfer property), modulation transfer function (MTF), noise power spectra (NPS), and detective quantum efficiency (DQE). It also includes the computation of some image quality checks: defective pixel analysis, uniformity, dark analysis, and lag. Results: The software was made available in 2009 and has been used during the last couple of years by many users who gave us valuable feedback for improving its usability. It was tested for achieving the physical characterization of several clinical systems for digital radiography and mammography. Various published papers made use of the outcomes of the plugin. Conclusions: This software is potentially beneficial to a variety of users: physicists working in hospitals and staff working in radiological departments, such as medical physicists, physicians, and engineers. The plugin, together with a brief user manual, is freely available and can be found online ( http://www.medphys.it/downloads.htm ). With our plugin users can estimate all three most important parameters used for physical characterization (MTF, NPS, and also DQE). The plugin can run on any operating system equipped with the ImageJ suite. The authors validated the software by comparing MTF and NPS curves on a common set of images with those obtained with other dedicated programs, achieving very good agreement.
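
    As a rough illustration of how such parameters relate (not the plugin's actual implementation), the sketch below estimates a 2-D NPS from flat-field regions of interest and combines an MTF curve with the normalized NPS into a DQE, assuming the pixel pitch px (in mm) and the photon fluence q (photons/mm^2) are known from the exposure conditions.

    # Simplified NPS and DQE estimation for a linearized flat-field image.
    import numpy as np

    def nps_2d(flat, roi=256, px=0.1):
        """Average periodogram of non-overlapping ROIs of a detrended flat field."""
        h, w = flat.shape
        acc, n = np.zeros((roi, roi)), 0
        for r in range(0, h - roi + 1, roi):
            for c in range(0, w - roi + 1, roi):
                block = flat[r:r+roi, c:c+roi]
                block = block - block.mean()            # simple detrending
                acc += np.abs(np.fft.fft2(block)) ** 2
                n += 1
        return acc / n * (px * px) / (roi * roi)        # NPS in signal^2 * mm^2

    def dqe(mtf, nnps, q):
        """DQE(f) = MTF(f)^2 / (q * NNPS(f)), with NNPS = NPS / mean_signal^2."""
        return mtf ** 2 / (q * nnps)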

  8. GWAMA: software for genome-wide association meta-analysis

    Directory of Open Access Journals (Sweden)

    Mägi Reedik

    2010-05-01

    Full Text Available Abstract Background Despite the recent success of genome-wide association studies in identifying novel loci contributing effects to complex human traits, such as type 2 diabetes and obesity, much of the genetic component of variation in these phenotypes remains unexplained. One way to improve power to detect further novel loci is through meta-analysis of studies from the same population, increasing the sample size over any individual study. Although statistical analysis software packages incorporate routines for meta-analysis, they are ill-equipped to meet the challenges of the scale and complexity of data generated in genome-wide association studies. Results We have developed flexible, open-source software for the meta-analysis of genome-wide association studies. The software incorporates a variety of error trapping facilities, and provides a range of meta-analysis summary statistics. The software is distributed with scripts that allow simple formatting of files containing the results of each association study and generate graphical summaries of genome-wide meta-analysis results. Conclusions The GWAMA (Genome-Wide Association Meta-Analysis) software has been developed to perform meta-analysis of summary statistics generated from genome-wide association studies of dichotomous phenotypes or quantitative traits. Software with source files, documentation and example data files are freely available online at http://www.well.ox.ac.uk/GWAMA.
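
    The per-variant core of such a meta-analysis can be illustrated with a fixed-effects inverse-variance combination of per-study effect estimates. This is a simplified sketch under those assumptions, not GWAMA's actual code, which also provides further summary statistics and error trapping.

    # Fixed-effects inverse-variance meta-analysis for one variant.
    import numpy as np
    from scipy.stats import norm

    def ivw_meta(betas, ses):
        betas, ses = np.asarray(betas), np.asarray(ses)
        w = 1.0 / ses ** 2                       # inverse-variance weights
        beta = np.sum(w * betas) / np.sum(w)     # pooled effect size
        se = np.sqrt(1.0 / np.sum(w))            # pooled standard error
        p = 2.0 * norm.sf(abs(beta / se))        # two-sided p-value
        return beta, se, p

    print(ivw_meta([0.12, 0.08, 0.15], [0.05, 0.04, 0.06]))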

  9. The Einstein Suite: A Web-Based Tool for Rapid and Collaborative Engineering Design and Analysis

    Science.gov (United States)

    Palmer, Richard S.

    1997-01-01

    Taken together the components of the Einstein Suite provide two revolutionary capabilities - they have the potential to change the way engineering and financial engineering are performed by: (1) providing currently unavailable functionality, and (2) providing a 10-100 times improvement over currently available but impractical or costly functionality.

  10. Modular Open-Source Software for Item Factor Analysis

    Science.gov (United States)

    Pritikin, Joshua N.; Hunter, Micheal D.; Boker, Steven M.

    2015-01-01

    This article introduces an item factor analysis (IFA) module for "OpenMx," a free, open-source, and modular statistical modeling package that runs within the R programming environment on GNU/Linux, Mac OS X, and Microsoft Windows. The IFA module offers a novel model specification language that is well suited to programmatic generation…

  11. Software quality studies using analytical metric analysis

    OpenAIRE

    Rodríguez Martínez, Cecilia

    2013-01-01

    Engineering companies currently devote a large amount of resources to the detection and correction of errors in their software code. These errors are generally due to mistakes made by developers when they write the code or its specifications. No tool is capable of detecting all of these errors, and some of them go unnoticed after the testing process. For this reason, numerous investigations have tried to find indicators in the code...

  12. Project Report: Automatic Sequence Processor Software Analysis

    Science.gov (United States)

    Benjamin, Brandon

    2011-01-01

    The Mission Planning and Sequencing (MPS) element of Multi-Mission Ground System and Services (MGSS) provides space missions with multi-purpose software to plan spacecraft activities, sequence spacecraft commands, and then integrate these products and execute them on spacecraft. Jet Propulsion Laboratory (JPL) is currently flying many missions. The processes for building, integrating, and testing the multi-mission uplink software need to be improved to meet the needs of the missions and the operations teams that command the spacecraft. The Multi-Mission Sequencing Team is responsible for collecting and processing the observations, experiments and engineering activities that are to be performed on a selected spacecraft. The collection of these activities is called a sequence and ultimately a sequence becomes a sequence of spacecraft commands. The operations teams check the sequence to make sure that no constraints are violated. The workflow process involves sending a program start command, which activates the Automatic Sequence Processor (ASP). The ASP is currently a file-based system that is comprised of scripts written in Perl, C shell and awk. Once this start process is complete, the system checks for errors and aborts if there are any; otherwise the system converts the commands to binary, and then sends the resultant information to be radiated to the spacecraft.

  13. GammaLib and ctools. A software framework for the analysis of astronomical gamma-ray data

    Science.gov (United States)

    Knödlseder, J.; Mayer, M.; Deil, C.; Cayrou, J.-B.; Owen, E.; Kelley-Hoskins, N.; Lu, C.-C.; Buehler, R.; Forest, F.; Louge, T.; Siejkowski, H.; Kosack, K.; Gerard, L.; Schulz, A.; Martin, P.; Sanchez, D.; Ohm, S.; Hassan, T.; Brau-Nogué, S.

    2016-08-01

    The field of gamma-ray astronomy has seen important progress during the last decade, yet to date no common software framework has been developed for the scientific analysis of gamma-ray telescope data. We propose to fill this gap by means of the GammaLib software, a generic library that we have developed to support the analysis of gamma-ray event data. GammaLib was written in C++ and all functionality is available in Python through an extension module. Based on this framework we have developed the ctools software package, a suite of software tools that enables flexible workflows to be built for the analysis of Imaging Air Cherenkov Telescope event data. The ctools are inspired by science analysis software available for existing high-energy astronomy instruments, and they follow the modular ftools model developed by the High Energy Astrophysics Science Archive Research Center. The ctools were written in Python and C++, and can be either used from the command line via shell scripts or directly from Python. In this paper we present the GammaLib and ctools software versions 1.0 that were released at the end of 2015. GammaLib and ctools are ready for the science analysis of Imaging Air Cherenkov Telescope event data, and also support the analysis of Fermi-LAT data and the exploitation of the COMPTEL legacy data archive. We propose using ctools as the science tools software for the Cherenkov Telescope Array Observatory.

  14. Analysis and Optimization of a Thyristor Structure Using Backside Schottky Contacts Suited for the High Temperature

    OpenAIRE

    Toulon, Gaëtan; Bourennane , Abdelhakim; Isoird, Karine

    2013-01-01

    In high current, high voltage, high temperature (T > 125 °C) power applications, commercially available conventional silicon thyristors are not suitable because they present a high leakage current. In this context, this paper presents a high-symmetrical (voltage) thyristor structure that presents a lower leakage current and higher breakover voltage compared with the conventional thyristor at T > 125 °C. It is shown through 2-D physical simulations that the replacement...

  15. Development of software tools for 4-D visualization and quantitative analysis of PHITS simulation results

    International Nuclear Information System (INIS)

    A suite of software tools has been developed to facilitate the development of apparatus using the radiation transport simulation code PHITS by enabling 4D visualization (3D space and time) and quantitative analysis of so-called dieaway plots. To deliver usable tools as soon as possible, existing software was utilized as much as possible; ParaView will be used for the 4D visualization of the results, whereas the analyses of dieaway plots will be done with the ROOT toolkit through a tool named "diana". To enable 4D visualization using ParaView, a group of tools (angel2vtk, DispDCAS1, CamPos) has been developed for the conversion of the data format to one which can be read by ParaView and to ease the visualization. (author)

  16. Applications of the BEam Cross section Analysis Software (BECAS)

    DEFF Research Database (Denmark)

    Blasques, José Pedro Albergaria Amaral; Bitsche, Robert; Fedorov, Vladimir;

    2013-01-01

    A newly developed framework is presented for structural design and analysis of long slender beam-like structures, e.g., wind turbine blades. The framework is based on the BEam Cross section Analysis Software – BECAS – a finite element based cross section analysis tool. BECAS is used...

  17. Analysis of Variance: What Is Your Statistical Software Actually Doing?

    Science.gov (United States)

    Li, Jian; Lomax, Richard G.

    2011-01-01

    Users assume statistical software packages produce accurate results. In this article, the authors systematically examined Statistical Package for the Social Sciences (SPSS) and Statistical Analysis System (SAS) for 3 analysis of variance (ANOVA) designs, mixed-effects ANOVA, fixed-effects analysis of covariance (ANCOVA), and nested ANOVA. For each…

  18. JEM-X science analysis software

    DEFF Research Database (Denmark)

    Westergaard, Niels Jørgen Stenfeldt; Kretschmar, P.; Oxborrow, Carol Anne;

    2003-01-01

    The science analysis of the data from JEM-X on INTEGRAL is performed through a number of levels including corrections, good time selection, imaging and source finding, spectrum and light-curve extraction. These levels consist of individual executables, and the running of the complete analysis is controlled by a script where parameters for detailed settings are introduced. The end products are FITS files with a format compatible with standard analysis packages such as XSPEC.

  19. Analytical Tools for Space Suit Design

    Science.gov (United States)

    Aitchison, Lindsay

    2011-01-01

    As indicated by the implementation of multiple small project teams within the agency, NASA is adopting a lean approach to hardware development that emphasizes quick product realization and rapid response to shifting program and agency goals. Over the past two decades, space suit design has been evolutionary in approach with emphasis on building prototypes then testing with the largest practical range of subjects possible. The results of these efforts show continuous improvement but make scaled design and performance predictions almost impossible with limited budgets and little time. Thus, in an effort to start changing the way NASA approaches space suit design and analysis, the Advanced Space Suit group has initiated the development of an integrated design and analysis tool. It is a multi-year, if not decadal, development effort that, when fully implemented, is envisioned to generate analysis of any given space suit architecture or, conversely, predictions of ideal space suit architectures given specific mission parameters. The master tool will exchange information to and from a set of five sub-tool groups in order to generate the desired output. The basic functions of each sub-tool group, the initial relationships between the sub-tools, and a comparison to state of the art software and tools are discussed.

  20. Phycas: software for Bayesian phylogenetic analysis.

    Science.gov (United States)

    Lewis, Paul O; Holder, Mark T; Swofford, David L

    2015-05-01

    Phycas is open source, freely available Bayesian phylogenetics software written primarily in C++ but with a Python interface. Phycas specializes in Bayesian model selection for nucleotide sequence data, particularly the estimation of marginal likelihoods, central to computing Bayes Factors. Marginal likelihoods can be estimated using newer methods (Thermodynamic Integration and Generalized Steppingstone) that are more accurate than the widely used Harmonic Mean estimator. In addition, Phycas supports two posterior predictive approaches to model selection: Gelfand-Ghosh and Conditional Predictive Ordinates. The General Time Reversible family of substitution models, as well as a codon model, are available, and data can be partitioned with all parameters unlinked except tree topology and edge lengths. Phycas provides for analyses in which the prior on tree topologies allows polytomous trees as well as fully resolved trees, and provides for several choices for edge length priors, including a hierarchical model as well as the recently described compound Dirichlet prior, which helps avoid overly informative induced priors on tree length. PMID:25577605
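
    As an aside on the estimators mentioned above, the harmonic-mean estimator of the marginal likelihood (the widely used but less accurate option the abstract refers to) can be sketched in a few lines from the per-sample log-likelihoods of a posterior MCMC run. This is an illustration only, not Phycas code; the steppingstone and thermodynamic-integration methods it supports require additional power-posterior runs.

    # Harmonic-mean estimate of the log marginal likelihood, done in log space
    # for numerical stability.
    import numpy as np
    from scipy.special import logsumexp

    def log_marginal_harmonic_mean(log_likes):
        """log p(D) ~ log N - logsumexp(-log p(D|theta_i)) over posterior samples."""
        log_likes = np.asarray(log_likes)
        return np.log(len(log_likes)) - logsumexp(-log_likes)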

  1. Dispersion analysis of biotoxins using HPAC software

    International Nuclear Information System (INIS)

    Biotoxins are emerging threat agents produced by living organisms: bacteria, plants, or animals. Biotoxins are generally classified as cyanotoxins, hemotoxins, necrotoxins, neurotoxins, and cytotoxins. The application of classical biotoxins as weapons of terror has been realized because of their extreme potency and lethality; ease of production, transport, and misuse; and the need for prolonged intensive care among affected persons. Recently, emerging biotoxins such as ricin and T-2 mycotoxin have been used clandestinely, either by terrorist groups or in military combat operations. It is thus highly desirable to have a modeling system to simulate dispersions of biotoxins in a terrorist attack scenario in order to provide prompt technical support and casualty estimation to first responders and military rescuers. The Hazard Prediction and Assessment Capability (HPAC) automated software system provides the means to accurately predict the effects of hazardous material released into the atmosphere and its impact on civilian and military populations. The system uses integrated source terms, high-resolution weather forecasts and atmospheric transport and dispersion analyses to model hazard areas produced by military or terrorist incidents and industrial accidents. We have successfully incorporated physical, chemical, epidemiological and biological characteristics of a variety of biotoxins into the HPAC system and have conducted numerous analyses for our emergency responders. The health effects caused by these hazards are closely reflected in HPAC output results. (author)

  2. Power Analysis Software for Educational Researchers

    Science.gov (United States)

    Peng, Chao-Ying Joanne; Long, Haiying; Abaci, Serdar

    2012-01-01

    Given the importance of statistical power analysis in quantitative research and the repeated emphasis on it by American Educational Research Association/American Psychological Association journals, the authors examined the reporting practice of power analysis by the quantitative studies published in 12 education/psychology journals between 2005…

  3. The RUMBA software: tools for neuroimaging data analysis.

    Science.gov (United States)

    Bly, Benjamin Martin; Rebbechi, Donovan; Hanson, Stephen Jose; Grasso, Giorgio

    2004-01-01

    The enormous scale and complexity of data sets in functional neuroimaging makes it crucial to have well-designed and flexible software for image processing, modeling, and statistical analysis. At present, researchers must choose between general purpose scientific computing environments (e.g., Splus and Matlab), and specialized human brain mapping packages that implement particular analysis strategies (e.g., AFNI, SPM, VoxBo, FSL or FIASCO). For the vast majority of users in Human Brain Mapping and Cognitive Neuroscience, general purpose computing environments provide an insufficient framework for a complex data-analysis regime. On the other hand, the operational particulars of more specialized neuroimaging analysis packages are difficult or impossible to modify and provide little transparency or flexibility to the user for approaches other than massively multiple comparisons based on inferential statistics derived from linear models. In order to address these problems, we have developed open-source software that allows a wide array of data analysis procedures. The RUMBA software includes programming tools that simplify the development of novel methods, and accommodates data in several standard image formats. A scripting interface, along with programming libraries, defines a number of useful analytic procedures, and provides an interface to data analysis procedures. The software also supports a graphical functional programming environment for implementing data analysis streams based on modular functional components. With these features, the RUMBA software provides researchers programmability, reusability, modular analysis tools, novel data analysis streams, and an analysis environment in which multiple approaches can be contrasted and compared. The RUMBA software retains the flexibility of general scientific computing environments while adding a framework in which both experts and novices can develop and adapt neuroimaging-specific analyses.

  4. New Graphical User Interface for EXAFS analysis with the GNXAS suite of programs

    Science.gov (United States)

    Hatada, Keisuke; Iesari, Fabio; Properzi, Leonardo; Minicucci, M.; di Cicco, Andrea

    2016-05-01

    GNXAS is a suite of programs based on multiple scattering calculations which performs a structural refinement of EXAFS spectra. It can be used for any system although it has been mainly developed to determine the local structure of disordered substances. We developed a user-friendly graphical user interface (GUI) to facilitate use of the codes by using wxPython. The developed GUI and the codes are multiplatform running on Windows, Macintosh and Linux systems, and are free shareware (http://gnxas.unicam.it). In this work we illustrate features and potentials of this newly developed version of GNXAS (w-GNXAS).

  5. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    International Nuclear Information System (INIS)

    This report has described the software safety analysis techniques and the engineering guidelines for developing safety critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the code and standards layer and the design methodology and documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures, high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase the plant reliability, we have provided D-in-D and D analysis guidelines.

  6. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report has described the software safety analysis techniques and the engineering guidelines for developing safety critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the code and standards layer and the design methodology and documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures, high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase the plant reliability, we have provided D-in-D and D analysis guidelines.

  7. Adapted wavelet analysis from theory to software

    CERN Document Server

    Wickerhauser, Mladen Victor

    1994-01-01

    This detail-oriented text is intended for engineers and applied mathematicians who must write computer programs to perform wavelet and related analysis on real data. It contains an overview of mathematical prerequisites and proceeds to describe hands-on programming techniques to implement special programs for signal analysis and other applications. From the table of contents: - Mathematical Preliminaries - Programming Techniques - The Discrete Fourier Transform - Local Trigonometric Transforms - Quadrature Filters - The Discrete Wavelet Transform - Wavelet Packets - The Best Basis Algorithm - Multidimensional Library Trees - Time-Frequency Analysis - Some Applications - Solutions to Some of the Exercises - List of Symbols - Quadrature Filter Coefficients

  8. Evolvability Analysis Method for Open Source Software Systems

    OpenAIRE

    Chauhan, Muhammad Aufeef

    2011-01-01

    Software systems evolve over their life span to accommodate changes in order to meet technical and business requirements. Evolution of open source software (OSS) is challenging because of the involvement of a large number of independent teams and developers who make modifications in the systems according to their own requirements. These changes need to be evaluated against long-term evolvability objectives as they are incorporated into the system. This paper presents the analysis...

  9. Software Security Analysis : Execution Phase Audit

    OpenAIRE

    Carlsson, Bengt; Baca, Dejan

    2005-01-01

    Code revision of a leading telecom product was performed, combining manual audit and static analysis tools. On average, one exploitable vulnerability was found for every 4000 lines of code. Half of the located threats in the product were buffer overflows followed by race condition, misplaced trust, and poor random generators. Static analysis tools were used to speed up the revision process and to integrate security tests into the overall project process. The discussion analyses the effectiven...

  10. Software Process Models and Analysis on Failure of Software Development Projects

    OpenAIRE

    Kaur, Rupinder; Sengupta, Jyotsna

    2013-01-01

    The software process model consists of a set of activities undertaken to design, develop and maintain software systems. A variety of software process models have been designed to structure, describe and prescribe the software development process. The software process models play a very important role in software development, so it forms the core of the software product. Software project failure is often devastating to an organization. Schedule slips, buggy releases and missing features can me...

  11. Study and design of indigenous probabilistic safety analysis software

    International Nuclear Information System (INIS)

    With the rapid development of nuclear power technology and engineering, it is necessary and important to study and develop indigenous professional PSA software for nuclear power plant Living PSA development and Living PSA application. According to Living PSA regulations and technical requirements, NFRisk is designed and expected to be a computer software system for Living PSA model development and maintenance, integrated with model construction tools, qualitative and quantitative analysis tools, post-analysis tools and so on, capable of fast analysis and quantification of large-scale PSA event tree and fault tree models. Meanwhile, NFRisk incorporates a data analysis and management code package, and provides an interface with commercial PSA software, which enables it to be extended to multi-application development. In this paper, the design concept, design scheme and functions of NFRisk are described. (authors)

  12. Development of output user interface software to support analysis

    Science.gov (United States)

    Wahanani, Nursinta Adi; Natsir, Khairina; Hartini, Entin

    2014-09-01

    Data processing software packages such as VSOP and MCNPX are scientifically proven and complete. The outputs of VSOP and MCNPX are huge, complex text files. In the analysis process, users need additional processing, for example with Microsoft Excel, to present the results informatively. This research develops user interface software for the output of VSOP and MCNPX. The VSOP program output is used to support neutronic analysis and the MCNPX program output is used to support burn-up analysis. Software development used iterative development methods, which allow for revision and addition of features according to user needs. Processing time with this software is 500 times faster than with conventional methods using Microsoft Excel. PYTHON is used as the programming language, because Python is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac, Amiga, among others. Values that support neutronic analysis are k-eff, burn-up and the masses of Pu-239 and Pu-241. Burn-up analysis used the mass inventory values of actinides (thorium, plutonium, neptunium and uranium). Values are visualized graphically to support the analysis.

  13. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Full text: Following the massive adoption of digital Instrumentation and Control (I and C) systems for nuclear power plants (NPPs), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting an appropriate digital I and C system, and then to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be incorporated so that they complement one another, the resulting SSA combination is more acceptable. Consequently, if proper evaluation criteria are available, the analyst can choose an appropriate technique combination to perform the analysis on the basis of available resources. This research evaluated the software safety analysis techniques applicable nowadays, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis, and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts to choose the best SSA combination and arrange their own software safety plan. With this proposed method, analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive because of the difficulty of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for understanding the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio. However, their disadvantages are the completeness and complexity

  14. Software Quality Attribute Analysis by Architecture Reconstruction (SQUA3RE)

    NARCIS (Netherlands)

    Stormer, C.

    2007-01-01

    Software Quality Attribute Analysis by Architecture Reconstruction (SQUA3RE) is a method that fosters a goal-driven process to evaluate the impact of what-if scenarios on existing systems. The method is partitioned into SQA2 and ARE. The SQA2 part provides the analysis models that can be used for q

  15. Propensity Score Analysis in R: A Software Review

    Science.gov (United States)

    Keller, Bryan; Tipton, Elizabeth

    2016-01-01

    In this article, we review four software packages for implementing propensity score analysis in R: "Matching, MatchIt, PSAgraphics," and "twang." After briefly discussing essential elements for propensity score analysis, we apply each package to a data set from the Early Childhood Longitudinal Study in order to estimate the…

  16. Do You Need ERP? In the Business World, Enterprise Resource Planning Software Keeps Costs down and Productivity up. Should Districts Follow Suit?

    Science.gov (United States)

    Careless, James

    2007-01-01

    Enterprise resource planning (ERP) software does what school leaders have always wanted their computer systems to do: It sees all. By integrating every IT application an organization has--from purchasing and inventory control to payroll--ERPs create a single unified system. Not only does this give IT managers a holistic view to what is happening…

  17. Analysis on testing and operational reliability of software

    Institute of Scientific and Technical Information of China (English)

    ZHAO Jing; LIU Hong-wei; CUI Gang; WANG Hui-qiang

    2008-01-01

    Software reliability was estimated based on NHPP software reliability growth models. Testing reliability and operational reliability may be essentially different. On the basis of analyzing the similarities and differences between the testing phase and the operational phase, and using the concepts of operational reliability and testing reliability, different forms of the comparison between the operational failure ratio and the predicted testing failure ratio were conducted, and the mathematical discussion and analysis were performed in detail. Finally, optimal software release was studied using software failure data. The results show that two kinds of conclusions can be derived by applying this method: one is to continue testing to meet the required reliability level of the users, and the other is that testing stops when the required operational reliability is met, so that testing cost can be reduced.
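
    As an illustration of the NHPP model family mentioned above, the sketch below uses the Goel-Okumoto mean value function to compute the conditional reliability over a mission time after a period of testing. The choice of this particular model and the parameter values are assumptions made for the example, not taken from the paper.

    # Goel-Okumoto NHPP: a = expected total number of faults, b = per-fault
    # detection rate; m(t) is the expected number of failures by time t.
    import math

    def mean_failures(t, a, b):
        return a * (1.0 - math.exp(-b * t))

    def reliability(x, t, a, b):
        """R(x|t): probability of no failure in (t, t+x] after testing until t."""
        return math.exp(-(mean_failures(t + x, a, b) - mean_failures(t, a, b)))

    a, b = 120.0, 0.02                   # hypothetical fitted parameters
    print(reliability(x=10.0, t=200.0, a=a, b=b))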

  18. WHIPPET: a collaborative software environment for medical image processing and analysis

    Science.gov (United States)

    Hu, Yangqiu; Haynor, David R.; Maravilla, Kenneth R.

    2007-03-01

    While there are many publicly available software packages for medical image processing, making them available to end users in clinical and research labs remains non-trivial. An even more challenging task is to mix these packages to form pipelines that meet specific needs seamlessly, because each piece of software usually has its own input/output formats, parameter sets, and so on. To address these issues, we are building WHIPPET (Washington Heterogeneous Image Processing Pipeline EnvironmenT), a collaborative platform for integrating image analysis tools from different sources. The central idea is to develop a set of Python scripts which glue the different packages together and make it possible to connect them in processing pipelines. To achieve this, an analysis is carried out for each candidate package for WHIPPET, describing input/output formats, parameters, ROI description methods, scripting and extensibility and classifying its compatibility with other WHIPPET components as image file level, scripting level, function extension level, or source code level. We then identify components that can be connected in a pipeline directly via image format conversion. We set up a TWiki server for web-based collaboration so that component analysis and task request can be performed online, as well as project tracking, knowledge base management, and technical support. Currently WHIPPET includes the FSL, MIPAV, FreeSurfer, BrainSuite, Measure, DTIQuery, and 3D Slicer software packages, and is expanding. Users have identified several needed task modules and we report on their implementation.
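
    A toy illustration of the glue-script idea described above: each external package is wrapped behind a common file-in/file-out stage so that stages can be chained into a pipeline, with format conversion as the connecting step. The tool names and flags below are hypothetical, not those of FSL, FreeSurfer, or the other packages named in the abstract.

    # Chain external command-line tools through temporary files.
    import os
    import subprocess
    import tempfile

    def run_stage(cmd_template, infile, suffix):
        fd, outfile = tempfile.mkstemp(suffix=suffix)
        os.close(fd)
        # {inp}/{out} are filled in from the stage's input and output paths
        subprocess.run(cmd_template.format(inp=infile, out=outfile),
                       shell=True, check=True)
        return outfile

    def pipeline(infile):
        nii = run_stage("convert_tool {inp} {out}", infile, ".nii")          # 1) format conversion
        brain = run_stage("strip_tool --in {inp} --out {out}", nii, ".nii")  # 2) skull stripping
        return run_stage("smooth_tool {inp} {out} --fwhm 6", brain, ".nii")  # 3) smoothing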

  19. Software for Data Analysis Programming with R

    CERN Document Server

    Chambers, John

    2008-01-01

    Although statistical design is one of the oldest branches of statistics, its importance is ever increasing, especially in the face of the data flood that often faces statisticians. It is important to recognize the appropriate design, and to understand how to effectively implement it, being aware that the default settings from a computer package can easily provide an incorrect analysis. The goal of this book is to describe the principles that drive good design, paying attention to both the theoretical background and the problems arising from real experimental situations. Designs are motivated t

  20. The software analysis project for the Office of Human Resources

    Science.gov (United States)

    Tureman, Robert L., Jr.

    1994-01-01

    There were two major sections of the project for the Office of Human Resources (OHR). The first section was to conduct a planning study to analyze software use with the goal of recommending software purchases and determining whether the need exists for a file server. The second section was analysis and distribution planning for a retirement planning computer program entitled VISION provided by NASA Headquarters. The software planning study was developed to help OHR analyze the current administrative desktop computing environment and make decisions regarding software acquisition and implementation. There were three major areas addressed by the study: the current environment, new software requirements, and strategies regarding the implementation of a server in the Office. To gather data on the current environment, employees were surveyed and an inventory of computers was produced. The surveys were compiled and analyzed by the ASEE fellow with interpretation help from OHR staff. New software requirements represented a compilation and analysis of the surveyed requests of OHR personnel. Finally, the information on the use of a server represents research done by the ASEE fellow and analysis of survey data to determine software requirements for a server. This included selection of a methodology to estimate the number of copies of each software program required given current use and estimated growth. The report presents the results of the computing survey, a description of the current computing environment, recommendations for changes in the computing environment, current software needs, management advantages of using a server, and management considerations in the implementation of a server. In addition, detailed specifications were presented for the hardware and software recommendations to offer a complete picture to OHR management. The retirement planning computer program available to NASA employees will aid in long-range retirement planning. The intended audience is the NASA civil

  1. Processing of terabytes of data for seismic noise analysis with the Python codes of the Whisper Suite. (Invited)

    Science.gov (United States)

    Briand, X.; Campillo, M.; Brenguier, F.; Boue, P.; Poli, P.; Roux, P.; Takeda, T.

    2013-12-01

    The Whisper Suite, as part of the ERC project Whisper (whisper.obs.ujf-grenoble.fr), is developed with the high-level programming language Python and makes intensive use of the scientific library Scipy and of Obspy, which is dedicated to the seismological community (www.obspy.org). The Whisper Suite consists of several tools. It provides a flexible way to specify a pipeline of seismogram processing: the user can define his own sequence of treatments, can use the Python libraries he needs, and can eventually add his processing procedure to the Whisper Suite. Another package is dedicated to the computation of correlations. When dealing with large data sets, computational time becomes a major difficulty, and we devoted considerable effort to making fast processing possible for the large data sets produced by present-day dense seismic networks. With the Whisper Suite, we currently manage more than 150 TB of data for ambient noise analysis. For the computation of 68 million correlations (daily, 5 Hz, correlation window 3600 s) on a 50-core cluster with a dedicated disk array, the required time is 4 days. With distributed storage (iRODS) and a grid of clusters (best-effort mode), both provided by the University of Grenoble, we currently compute one year of 4-hour correlations for 550 3C stations of the Japanese Hi-Net network in one day (about 350 million individual correlations). Note that the quadratic space complexity can be critical. We also developed codes for the analysis of the correlations. The Whisper Suite is used to make challenging observations using cross-correlation techniques at various scales in the Earth. We present some examples of applications. Using a global data set of available broadband stations, we discuss the emergence of the complete teleseismic body-wave wavefield, including the deep phases used for imaging of the mantle and the core. The giant 2011 Tohoku-oki earthquake and the records of the dense Hi-Net array offer an opportunity to analyze
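
    A minimal numpy sketch of the basic operation behind such processing is shown below: cross-correlating two synchronous records window by window and stacking linearly. The real Whisper pipeline of course adds instrument correction, filtering, spectral whitening and other normalization steps, so this is only an illustration of the core computation.

    # Windowed cross-correlation and linear stacking of two continuous records.
    import numpy as np

    def cross_correlate(a, b, max_lag):
        """Zero-padded FFT cross-correlation, returned for lags -max_lag..+max_lag."""
        nfft = 2 * len(a)
        A, B = np.fft.rfft(a, nfft), np.fft.rfft(b, nfft)
        cc = np.fft.irfft(A * np.conj(B), nfft)
        return np.concatenate((cc[-max_lag:], cc[:max_lag + 1]))

    def stack_daily(trace1, trace2, win, max_lag):
        """Correlate fixed-length windows (e.g. one hour of samples) and stack."""
        stack = np.zeros(2 * max_lag + 1)
        nwin = 0
        for i in range(0, min(len(trace1), len(trace2)) - win + 1, win):
            w1 = trace1[i:i+win] - np.mean(trace1[i:i+win])
            w2 = trace2[i:i+win] - np.mean(trace2[i:i+win])
            stack += cross_correlate(w1, w2, max_lag)
            nwin += 1
        return stack / max(nwin, 1)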

  2. The BEPCⅡ Data Production and BESⅢ offline Analysis Software System

    Institute of Scientific and Technical Information of China (English)

    ZepuMAO

    2001-01-01

    The BES detector has operated for about 12 years, and the BES offline data analysis environment has been developed and upgraded along with the developments of the BES hardware and software. The BESⅢ software system will operate for many years, so it should keep up with developments in new software technology and be highly flexible, powerful, stable and easy to maintain. The following points should be taken into account: 1) To benefit the collaboration and improve exchanges with international HEP experiments, this system should be set up by adopting or referring to the newest software technology from advanced experiments in the world. 2) It should support hundreds of the existing BES software packages and serve both old experts who are familiar with the BESII software and computing environment and new members who are going to benefit from the new system. 3) Most of the existing BESII packages will be modified or re-designed according to the hardware changes.

  3. Availability Analysis of Application Servers Using Software Rejuvenation and Virtualization

    Institute of Scientific and Technical Information of China (English)

    Thandar Thein; Jong Sou Park

    2009-01-01

    Demands on software reliability and availability have increased tremendously due to the nature of present day applications. We focus on the software aspect of high availability for application servers, since the unavailability of servers more often originates from software faults rather than hardware faults. The software rejuvenation technique has been widely used to avoid the occurrence of unplanned failures, mainly due to the phenomena of software aging or caused by transient failures. In this paper, first we present a new way of using virtual machine based software rejuvenation, named VMSR, to offer high availability for application server systems. Second, we model a single physical server which is used to host multiple virtual machines (VMs) with the VMSR framework using stochastic modeling and evaluate it through both numerical analysis and SHARPE (Symbolic Hierarchical Automated Reliability and Performance Evaluator) tool simulation. This VMSR model is very general and can capture application server characteristics, failure behavior, and performability measures. Our results demonstrate that the VMSR approach is a practical way to ensure uninterrupted availability and to optimize performance for aging applications.

  4. RAVEN, a New Software for Dynamic Risk Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cristian Rabiti; Andrea Alfonsi; Joshua Cogliati; Diego Mandelli; Robert Kinoshita

    2014-06-01

    RAVEN is a generic software driver to perform parametric and probabilistic analysis of codes simulating complex systems. Initially developed to provide dynamic risk analysis capabilities to the RELAP-7 code [1], it is currently being generalized with the addition of Application Programming Interfaces (APIs). These interfaces are used to extend RAVEN capabilities to any software, as long as all the parameters that need to be perturbed are accessible through input files or directly via Python interfaces. RAVEN is capable of investigating the system response by probing the input space using Monte Carlo, grid, or Latin Hypercube sampling schemes, but its strength is its focus on system feature discovery, such as limit surfaces separating regions of the input space that lead to system failure, using dynamic supervised learning techniques. The paper presents an overview of the software capabilities and their implementation schemes, followed by some application examples.
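
    As an illustration of one of the sampling strategies mentioned above (not RAVEN code), the sketch below draws a Latin Hypercube sample on the unit hypercube: each input dimension is split into n equal-probability strata, one point is drawn per stratum, and the strata are randomly paired across dimensions. Mapping to physical input distributions would be done through their inverse CDFs.

    # Simple Latin Hypercube sampler on [0, 1)^d.
    import numpy as np

    def latin_hypercube(n, d, seed=None):
        rng = np.random.default_rng(seed)
        # one point per stratum in every dimension
        u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
        # decouple the dimensions by permuting each column independently
        for j in range(d):
            u[:, j] = u[rng.permutation(n), j]
        return u

    samples = latin_hypercube(10, 3, seed=42)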

  5. Analysis of Gumbel Model for Software Reliability Using Bayesian Paradigm

    Directory of Open Access Journals (Sweden)

    Raj Kumar

    2012-12-01

    Full Text Available In this paper, we have illustrated the suitability of the Gumbel model for software reliability data. The model parameters are estimated using likelihood based inferential procedures: classical as well as Bayesian. The quasi Newton-Raphson algorithm is applied to obtain the maximum likelihood estimates and associated probability intervals. The Bayesian estimates of the parameters of the Gumbel model are obtained using the Markov Chain Monte Carlo (MCMC) simulation method in OpenBUGS (established software for Bayesian analysis using Markov Chain Monte Carlo methods). R functions are developed to study the statistical properties, model validation and comparison tools of the model, and the output analysis of MCMC samples generated from OpenBUGS. Details of applying MCMC to parameter estimation for the Gumbel model are elaborated, and a real software reliability data set is considered to illustrate the methods of inference discussed in this paper.
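
    The classical (maximum-likelihood) side of such an analysis can be sketched in a few lines with SciPy rather than the R/OpenBUGS tooling the paper uses; the failure-time data below are invented, and scipy.stats.gumbel_r stands in for the quasi Newton-Raphson routine described above.

```python
# Gumbel fit to made-up inter-failure times, plus a quick goodness-of-fit check.
import numpy as np
from scipy import stats

failure_times = np.array([12.0, 25.0, 31.0, 47.0, 58.0, 73.0, 91.0, 118.0, 146.0, 180.0])

# stats.gumbel_r.fit returns maximum-likelihood estimates (location mu, scale sigma).
mu_hat, sigma_hat = stats.gumbel_r.fit(failure_times)
print(f"MLE: mu = {mu_hat:.2f}, sigma = {sigma_hat:.2f}")

# Kolmogorov-Smirnov statistic as a simple model-validation check.
ks = stats.kstest(failure_times, "gumbel_r", args=(mu_hat, sigma_hat))
print(f"KS statistic = {ks.statistic:.3f}, p-value = {ks.pvalue:.3f}")
```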

  6. Control and analysis software for a laser scanning microdensitometer

    Indian Academy of Sciences (India)

    H R Bundel; C P Navathe; P A Naik; P D Gupta

    2006-02-01

    PC-based control software and a data acquisition system have been developed for an existing commercial microdensitometer (Biomed make, model No. SL-2D/1D UV/VIS) to facilitate scanning and analysis of X-ray films. The software, developed in LabVIEW, covers operation of the microdensitometer in 1D and 2D scans and analysis of spatial or spectral data on X-ray films, such as optical density, intensity and wavelength. It provides a user-friendly graphical user interface (GUI) to analyse the scanned data and to store the analysed data/images in popular formats, such as data in Excel and images in JPEG. It also has an on-line calibration facility using standard optical density tablets. The control software and data acquisition system are simple, inexpensive and versatile.

  7. FIRE: an open-software suite for real-time 2D/3D image registration for image guided radiotherapy research

    Science.gov (United States)

    Furtado, H.; Gendrin, C.; Spoerk, J.; Steiner, E.; Underwood, T.; Kuenzler, T.; Georg, D.; Birkfellner, W.

    2016-03-01

    Radiotherapy treatments have changed at a tremendously rapid pace. Dose delivered to the tumor has escalated while organs at risk (OARs) are better spared. The impact of moving tumors during dose delivery has become higher due to very steep dose gradients. Intra-fractional tumor motion has to be managed adequately to reduce errors in dose delivery. For tumors with large motion, such as tumors in the lung, tracking is an approach that can reduce position uncertainty. Tumor tracking approaches range from purely image intensity based techniques to motion estimation based on surrogate tracking. Research efforts are often based on custom designed software platforms which take too much time and effort to develop. To address this challenge we have developed an open software platform especially focusing on tumor motion management. FLIRT is a freely available open-source software platform. The core method for tumor tracking is purely intensity based 2D/3D registration. The platform is written in C++ using the Qt framework for the user interface. The performance critical methods are implemented on the graphics processor using the CUDA extension. One registration can be as fast as 90 ms (11 Hz). This is suitable for tracking tumors moving due to respiration (~0.3 Hz) or heartbeat (~1 Hz). Apart from focusing on high performance, the platform is designed to be flexible and easy to use. Current use cases include tracking feasibility studies, patient positioning and method validation. Such a framework has the potential of enabling the research community to rapidly perform patient studies or try new methods.

  8. Combinatorial Generation of Test Suites

    Science.gov (United States)

    Dvorak, Daniel L.; Barrett, Anthony C.

    2009-01-01

    Testgen is a computer program that generates suites of input and configuration vectors for testing other software or software/hardware systems. As systems become ever more complex, often, there is not enough time to test systems against all possible combinations of inputs and configurations, so test engineers need to be selective in formulating test plans. Testgen helps to satisfy this need: In response to a test-suite-requirement-specification model, it generates a minimal set of test vectors that satisfies all the requirements.
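
    A minimal sketch of the combinatorial idea behind such a generator follows: a greedy pass over the full Cartesian product that keeps only vectors covering at least one not-yet-covered parameter-value pair. The parameters and values are hypothetical, and the algorithm is a generic illustration rather than NASA's Testgen implementation.

```python
# Greedy pairwise (2-way) test-vector selection over hypothetical parameters.
from itertools import combinations, product

parameters = {
    "os": ["linux", "vxworks"],
    "cpu_mode": ["nominal", "degraded"],
    "bus": ["1553", "spacewire", "ethernet"],
}

names = list(parameters)
# All parameter-value pairs that must be covered at least once.
uncovered = {
    ((a, va), (b, vb))
    for a, b in combinations(names, 2)
    for va in parameters[a]
    for vb in parameters[b]
}

suite = []
for vector in product(*parameters.values()):
    assignment = dict(zip(names, vector))
    covered_here = {
        pair for pair in uncovered
        if all(assignment[n] == v for n, v in pair)
    }
    if covered_here:               # keep the vector only if it covers something new
        suite.append(assignment)
        uncovered -= covered_here

print(f"{len(suite)} vectors cover all pairs")
for test_vector in suite:
    print(test_vector)
```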

  9. An Overview of the XGAM Code and Related Software for Gamma-ray Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Younes, W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-11-13

    The XGAM spectrum-fitting code and associated software were developed specifically to analyze the complex gamma-ray spectra that can result from neutron-induced reactions. The XGAM code is designed to fit a spectrum over the entire available gamma-ray energy range as a single entity, in contrast to the more traditional piecewise approaches. This global-fit philosophy enforces background continuity as well as consistency between local and global behavior throughout the spectrum, and in a natural way. This report presents XGAM and the suite of programs built around it with an emphasis on how they fit into an overall analysis methodology for complex gamma-ray data. An application to the analysis of time-dependent delayed gamma-ray yields from 235U fission is shown in order to showcase the codes and how they interact.
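
    A much-simplified illustration of the global-fit idea follows: several Gaussian peaks plus one smooth continuum are fitted over the whole spectrum in a single least-squares problem rather than piecewise. The synthetic spectrum, peak parameters, and the use of scipy.optimize.curve_fit are all assumptions of this sketch, not the XGAM code.

```python
# Global fit of a synthetic gamma-ray spectrum: one continuum + two Gaussian peaks.
import numpy as np
from scipy.optimize import curve_fit

def spectrum_model(E, b0, b1, *peaks):
    """Linear continuum plus N Gaussian peaks; peaks = (area, centroid, sigma) * N."""
    y = b0 + b1 * E
    for area, centroid, sigma in zip(peaks[0::3], peaks[1::3], peaks[2::3]):
        y += area * np.exp(-0.5 * ((E - centroid) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return y

rng = np.random.default_rng(0)
E = np.linspace(100.0, 1500.0, 1400)                      # keV
truth = spectrum_model(E, 50.0, -0.02, 4000.0, 511.0, 2.0, 2500.0, 1173.2, 2.5)
counts = rng.poisson(truth).astype(float)

p0 = [40.0, 0.0, 3000.0, 510.0, 2.0, 2000.0, 1170.0, 2.0]  # rough initial guesses
popt, _ = curve_fit(spectrum_model, E, counts, p0=p0, sigma=np.sqrt(counts + 1.0))
print("fitted centroids (keV):", popt[3], popt[6])
```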

  10. Software Product "Equilibrium" for Preparation and Analysis of Aquatic Solutions

    CERN Document Server

    Bontchev, G D; Ivanov, P I; Maslov, O D; Milanov, M V; Dmitriev, S N

    2003-01-01

    Software product "Equilibrium" for preparation and analysis of aquatic solutions is developed. The program allows determining analytical parameters of a solution, such as ionic force and pH. "Equilibrium" is able to calculate the ratio of existing ion forms in the solution, with respect to the hydrolysis and complexation in the presence of one or more ligands.

  11. WinDAM C earthen embankment internal erosion analysis software

    Science.gov (United States)

    Two primary causes of dam failure are overtopping and internal erosion. For the purpose of evaluating dam safety for existing earthen embankment dams and proposed earthen embankment dams, Windows Dam Analysis Modules C (WinDAM C) software will simulate either internal erosion or erosion resulting f...

  12. PROTEINCHALLENGE: Crowd sourcing in proteomics analysis and software development

    DEFF Research Database (Denmark)

    Martin, Sarah F.; Falkenberg, Heiner; Dyrlund, Thomas Franck;

    2013-01-01

    , including arguments for community-wide open source software development and “big data” compatible solutions for the future. For the meantime, we have laid out ten top tips for data processing. With these at hand, a first large-scale proteomics analysis hopefully becomes less daunting to navigate...

  13. Suomi National Polar-Orbiting Partnership Visible Infrared Imaging Radiometer Suite polarization sensitivity analysis.

    Science.gov (United States)

    Sun, Junqiang; Xiong, Xiaoxiong; Waluschka, Eugene; Wang, Menghua

    2016-09-20

    The Visible Infrared Imaging Radiometer Suite (VIIRS) is one of five instruments onboard the Suomi National Polar-Orbiting Partnership (SNPP) satellite that launched from Vandenberg Air Force Base, California, on October 28, 2011. It is a whiskbroom radiometer that provides ±56.28° scans of the Earth view. It has 22 bands, among which 14 are reflective solar bands (RSBs). The RSBs cover a wavelength range from 410 to 2250 nm. The RSBs of a remote sensor are usually sensitive to the polarization of incident light. For VIIRS, it is specified that the polarization factor should be smaller than 3% for 410 and 862 nm bands and 2.5% for other RSBs for the scan angle within ±45°. Several polarization sensitivity tests were performed prelaunch for SNPP VIIRS. The first few tests either had large uncertainty or were less reliable, while the last one was believed to provide the more accurate information about the polarization property of the instrument. In this paper, the measured data in the last polarization sensitivity test are analyzed, and the polarization factors and phase angles are derived from the measurements for all the RSBs. The derived polarization factors and phase angles are band, detector, and scan angle dependent. For near-infrared bands, they also depend on the half-angle mirror side. Nevertheless, the derived polarization factors are all within the specification, although the strong detector dependence of the polarization parameters was not expected. Compared to the Moderate Resolution Imaging Spectroradiometer on both Aqua and Terra satellites, the polarization effect on VIIRS RSB is much smaller. PMID:27661594
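
    A hedged sketch of how a polarization factor and phase angle can be extracted from such a test follows: the detector response versus polarizer angle is modelled as gain·(1 + a·cos(2(θ − φ))) and the amplitude a and phase φ are fitted. The functional form is the standard description of linear polarization sensitivity, but the measurement values below are synthetic, not SNPP VIIRS test data.

```python
# Fit a polarization factor and phase angle to a synthetic response curve.
import numpy as np
from scipy.optimize import curve_fit

theta_deg = np.arange(0.0, 360.0, 15.0)
theta = np.radians(theta_deg)

def response(theta, gain, a, phi):
    return gain * (1.0 + a * np.cos(2.0 * (theta - phi)))

rng = np.random.default_rng(4)
measured = response(theta, 1.0, 0.02, np.radians(30.0)) + rng.normal(0.0, 0.001, theta.size)

popt, _ = curve_fit(response, theta, measured, p0=[1.0, 0.01, 0.0])
gain, a, phi = popt
if a < 0:                      # fold a negative amplitude into a 90 degree phase shift
    a, phi = -a, phi + np.pi / 2.0
print(f"polarization factor = {a * 100:.2f} %, phase angle = {np.degrees(phi) % 180.0:.1f} deg")
```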

  14. Open source data analysis and visualization software for optical engineering

    Science.gov (United States)

    Smith, Greg A.; Lewis, Benjamin J.; Palmer, Michael; Kim, Dae Wook; Loeff, Adrian R.; Burge, James H.

    2012-10-01

    SAGUARO is open-source software developed to simplify data assimilation, analysis, and visualization by providing a single framework for disparate data sources, from raw hardware measurements to optical simulation output. Developed with a user-friendly graphical interface in the MATLAB™ environment, SAGUARO is intended to be easy for the end-user in search of useful optical information as well as for the developer wanting to add new modules and functionalities. We present here the flexibility of the SAGUARO software and discuss how it can be applied to the wider optical engineering community.

  15. Strategic Analysis of the Enterprise Mobile Device Management Software Industry

    OpenAIRE

    Shesterin, Dmitry

    2012-01-01

    This paper analyzes the enterprise mobile device management industry and evaluates three strategic alternatives by which an established computer systems management software manufacturing company can enter this industry. The analysis of the three strategic alternatives to build, buy or partner in order to bring to market an enterprise mobile device management product offering delves into an examination of the company’s existing position and performance; conducts an external analysis of the ent...

  16. GWAMA: software for genome-wide association meta-analysis

    OpenAIRE

    Mägi Reedik; Morris Andrew P

    2010-01-01

    Abstract Background Despite the recent success of genome-wide association studies in identifying novel loci contributing effects to complex human traits, such as type 2 diabetes and obesity, much of the genetic component of variation in these phenotypes remains unexplained. One way of improving power to detect further novel loci is through meta-analysis of studies from the same population, increasing the sample size over any individual study. Although statistical software analysis packages in...
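
    The core fixed-effect (inverse-variance) step behind such a meta-analysis can be sketched directly; the per-study effect sizes and standard errors below are invented, and this is a generic illustration rather than GWAMA's implementation.

```python
# Fixed-effect inverse-variance meta-analysis of one SNP across three studies.
import numpy as np
from scipy.stats import norm

# Effect estimates (e.g. log odds ratios) and standard errors (invented values).
beta = np.array([0.12, 0.08, 0.15])
se = np.array([0.05, 0.04, 0.07])

w = 1.0 / se ** 2                         # inverse-variance weights
beta_meta = np.sum(w * beta) / np.sum(w)
se_meta = np.sqrt(1.0 / np.sum(w))
z = beta_meta / se_meta
p_value = 2.0 * norm.sf(abs(z))

print(f"meta-analysis beta = {beta_meta:.3f}, SE = {se_meta:.3f}, p = {p_value:.2e}")
```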

  17. One-Click Data Analysis Software for Science Operations

    Science.gov (United States)

    Navarro, Vicente

    2015-12-01

    One of the important activities of the ESA Science Operations Centre is to provide Data Analysis Software (DAS) to enable users and scientists to process data further to higher levels. During operations and post-operations, Data Analysis Software (DAS) is fully maintained and updated for new OS and library releases. Nonetheless, once a mission goes into the "legacy" phase, funds become very limited and long-term preservation becomes more and more difficult. Building on Virtual Machine (VM), cloud computing and Software as a Service (SaaS) technologies, this project has aimed at providing long-term preservation of Data Analysis Software for the following missions: - PIA for ISO (1995) - SAS for XMM-Newton (1999) - Hipe for Herschel (2009) - EXIA for EXOSAT (1983) The following goals have guided the architecture: - Support for all operations, post-operations and archive/legacy phases. - Support for local (user's computer) and cloud environments (ESAC-Cloud, Amazon - AWS). - Support for expert users, requiring full capabilities. - Provision of a simple web-based interface. This talk describes the architecture, challenges, results and lessons learnt in this project.

  18. COMPUTER SIMULATION: COMPARATIVE ANALYSIS OF SOFTWARES ARENA® AND PROMODEL®

    Directory of Open Access Journals (Sweden)

    Luiz Enéias Zanetti Cardoso

    2016-04-01

    Full Text Available Computer simulation is not exclusive to the areas of Logistics and Production; its implementation takes place within the limits of the technical expertise of professionals. Although not yet widespread, its use is projected to rise, given the numerous application possibilities when the reality at hand is properly modelled. This article presents a comparative, qualitative analysis of two computer simulation packages, Arena® 14,000 Student version and ProModel® RunTimeSilve version - Demo, according to the following criteria: desktop, access to commands, ease of developing the model in the software, and accessories. The main features of each simulation package can be seen, as well as the differences between their interfaces; both were confirmed as great tools to support management processes.

  19. Software and codes for analysis of concentrating solar power technologies.

    Energy Technology Data Exchange (ETDEWEB)

    Ho, Clifford Kuofei

    2008-12-01

    This report presents a review and evaluation of software and codes that have been used to support Sandia National Laboratories concentrating solar power (CSP) program. Additional software packages developed by other institutions and companies that can potentially improve Sandia's analysis capabilities in the CSP program are also evaluated. The software and codes are grouped according to specific CSP technologies: power tower systems, linear concentrator systems, and dish/engine systems. A description of each code is presented with regard to each specific CSP technology, along with details regarding availability, maintenance, and references. A summary of all the codes is then presented with recommendations regarding the use and retention of the codes. A description of probabilistic methods for uncertainty and sensitivity analyses of concentrating solar power technologies is also provided.

  20. Simulated spectra for QA/QC of spectral analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Jackman, K. R. (Kevin R.); Biegalski, S. R.

    2004-01-01

    Monte Carlo simulated spectra have been developed to test the peak analysis algorithms of several spectral analysis software packages. Using MCNP 5, generic sample spectra were generated in order to perform ANSI N42.14 standard spectral tests on Canberra Genie-2000, Ortec GammaVision, and UniSampo. The reference spectra were generated in MCNP 5 using an F8 (pulse height) tally with a detector model of an actual germanium detector used in counting. The detector model matches the detector resolution, energy calibration, and efficiency. The simulated spectra have been found to be useful in testing the reliability and performance of spectral analysis programs, and the detector model used was found to be useful in testing the performance of modern spectral analysis software tools. The software packages were analyzed and found to be in compliance with the ANSI N42.14 tests of the peak-search and peak-fitting algorithms. This method of using simulated spectra can be used to perform the ANSI N42.14 tests on the reliability and performance of spectral analysis programs in the absence of standard radioactive materials.
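
    The underlying QA idea can be illustrated without MCNP: build a synthetic pulse-height spectrum with known peak positions, then check that a peak-search routine recovers them. In the sketch below, scipy.signal.find_peaks stands in for a commercial package's peak-search algorithm, and the continuum, peak positions, and prominence threshold are all assumptions.

```python
# Synthetic spectrum with known peaks, used to exercise a stand-in peak search.
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(1)
channels = np.arange(4096)
true_peaks = [661, 1173, 1332]            # channel positions of the simulated lines

spectrum = 200.0 * np.exp(-channels / 1500.0)        # smooth continuum
for ch in true_peaks:
    spectrum += 800.0 * np.exp(-0.5 * ((channels - ch) / 3.0) ** 2)
counts = rng.poisson(spectrum)

smoothed = np.convolve(counts, np.ones(5) / 5.0, mode="same")
found, _ = find_peaks(smoothed, prominence=100.0)
print("true peak channels :", true_peaks)
print("found peak channels:", found.tolist())
```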

  1. jPopGen Suite: population genetic analysis of DNA polymorphism from nucleotide sequences with errors

    OpenAIRE

    Liu, Xiaoming

    2012-01-01

    1. Next-generation sequencing (NGS) is being increasingly used in ecological and evolutionary studies. Though promising, NGS is known to be error-prone. Sequencing error can cause significant bias for population genetic analysis of a sequence sample.

  2. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify the relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), an area of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
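
    A heavily simplified illustration of knickpoint detection follows: on a single synthetic longitudinal profile, a knickpoint shows up as a local maximum in the change of channel gradient. The real tool operates on a DEM inside ArcGIS; the profile, the 15 m step, and the detection rule below are assumptions of this sketch.

```python
# Detect a slope break (knickpoint) on a synthetic stream longitudinal profile.
import numpy as np

distance = np.linspace(0.0, 10000.0, 501)             # metres downstream
elevation = 800.0 - 0.05 * distance                   # smooth background slope
elevation -= 15.0 / (1.0 + np.exp(-(distance - 4000.0) / 50.0))   # 15 m step = knickpoint

slope = -np.gradient(elevation, distance)             # channel gradient (positive downhill)
slope_change = np.abs(np.gradient(slope, distance))   # where the gradient changes fastest

candidate = int(np.argmax(slope_change))
print(f"knickpoint detected near {distance[candidate]:.0f} m downstream")
```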

  3. GammaLib and ctools: A software framework for the analysis of astronomical gamma-ray data

    CERN Document Server

    Knödlseder, J; Deil, C; Cayrou, J -B; Owen, E; Kelley-Hoskins, N; Lu, C -C; Buehler, R; Forest, F; Louge, T; Siejkowski, H; Kosack, K; Gerard, L; Schulz, A; Martin, P; Sanchez, D; Ohm, S; Hassan, T; Brau-Nogué, S

    2016-01-01

    The field of gamma-ray astronomy has seen important progress during the last decade, yet there exists so far no common software framework for the scientific analysis of gamma-ray telescope data. We propose to fill this gap by means of the GammaLib software, a generic library that we have developed to support the analysis of gamma-ray event data. GammaLib has been written in C++ and all functionality is available in Python through an extension module. On top of this framework we have developed the ctools software package, a suite of software tools that enables building of flexible workflows for the analysis of Imaging Air Cherenkov Telescope event data. The ctools are inspired by science analysis software available for existing high-energy astronomy instruments, and they follow the modular ftools model developed by the High Energy Astrophysics Science Archive Research Center. The ctools have been written in Python and C++, and can be either used from the command line, via shell scripts, or directly from Python...

  4. STARS software tool for analysis of reliability and safety

    International Nuclear Information System (INIS)

    This paper reports on the STARS (Software Tool for the Analysis of Reliability and Safety) project, which aims at developing an integrated set of computer-aided reliability analysis tools for the various tasks involved in system safety and reliability analysis, including hazard identification, qualitative analysis, and logic model construction and evaluation. Expert system technology offers the most promising perspective for developing a computer-aided reliability analysis tool. Combined with graphics and analysis capabilities, it can provide a natural, engineering-oriented environment for computer-assisted reliability and safety modelling and analysis. For hazard identification and fault tree construction, a frame/rule-based expert system is used, in which the deductive (goal-driven) reasoning and the heuristics applied during manual fault tree construction are modelled. Expert systems can explain their reasoning, so the analyst can become aware of why and how the results are being obtained. Hence, the learning aspect involved in manual reliability and safety analysis can be maintained and improved.

  5. ROCKETSHIP: a flexible and modular software tool for the planning, processing and analysis of dynamic MRI studies

    International Nuclear Information System (INIS)

    Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is a promising technique to characterize pathology and evaluate treatment response. However, analysis of DCE-MRI data is complex and benefits from concurrent analysis of multiple kinetic models and parameters. Few software tools are currently available that specifically focus on DCE-MRI analysis with multiple kinetic models. Here, we developed ROCKETSHIP, an open-source, flexible and modular software package for DCE-MRI analysis. ROCKETSHIP incorporates analyses with multiple kinetic models, including data-driven nested model analysis. ROCKETSHIP was implemented using the MATLAB programming language. Robustness of the software in providing reliable fits with multiple kinetic models is demonstrated using simulated data. Simulations also demonstrate the utility of the data-driven nested model analysis. Applicability of ROCKETSHIP to both preclinical and clinical studies is shown using DCE-MRI studies of the human brain and a murine tumor model. A DCE-MRI software suite was implemented and tested using simulations. Its applicability to both preclinical and clinical datasets is shown. ROCKETSHIP was designed to be easily accessible for the beginner, but flexible enough for changes or additions to be made by the advanced user as well. The availability of a flexible analysis tool will aid future studies using DCE-MRI. A public release of ROCKETSHIP is available at https://github.com/petmri/ROCKETSHIP.
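
    One of the kinetic models such a package typically supports, the standard Tofts model, can be sketched with SciPy instead of the actual MATLAB code; the arterial input function, tissue curve, and noise level below are synthetic placeholders.

```python
# Fit Ktrans and ve of the standard Tofts model to a synthetic tissue curve.
import numpy as np
from scipy.optimize import curve_fit

t = np.linspace(0.0, 5.0, 150)                       # minutes
aif = 5.0 * (np.exp(-0.5 * t) - np.exp(-3.0 * t))    # toy arterial input function (mM)
dt = t[1] - t[0]

def tofts(t, ktrans, ve):
    """C_t(t) = Ktrans * integral( Cp(tau) * exp(-Ktrans/ve * (t - tau)) dtau )."""
    kernel = np.exp(-(ktrans / ve) * t)
    return ktrans * np.convolve(aif, kernel)[: t.size] * dt

# Synthetic tissue curve with known parameters plus noise.
rng = np.random.default_rng(2)
ct = tofts(t, 0.25, 0.30) + rng.normal(0.0, 0.01, t.size)

popt, _ = curve_fit(tofts, t, ct, p0=[0.1, 0.2], bounds=([1e-3, 1e-3], [2.0, 1.0]))
print(f"Ktrans = {popt[0]:.3f} /min, ve = {popt[1]:.3f}")
```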

  6. STING Millennium: a web-based suite of programs for comprehensive and simultaneous analysis of protein structure and sequence

    Science.gov (United States)

    Neshich, Goran; Togawa, Roberto C.; Mancini, Adauto L.; Kuser, Paula R.; Yamagishi, Michel E. B.; Pappas, Georgios; Torres, Wellington V.; Campos, Tharsis Fonseca e; Ferreira, Leonardo L.; Luna, Fabio M.; Oliveira, Adilton G.; Miura, Ronald T.; Inoue, Marcus K.; Horita, Luiz G.; de Souza, Dimas F.; Dominiquini, Fabiana; Álvaro, Alexandre; Lima, Cleber S.; Ogawa, Fabio O.; Gomes, Gabriel B.; Palandrani, Juliana F.; dos Santos, Gabriela F.; de Freitas, Esther M.; Mattiuz, Amanda R.; Costa, Ivan C.; de Almeida, Celso L.; Souza, Savio; Baudet, Christian; Higa, Roberto H.

    2003-01-01

    STING Millennium Suite (SMS) is a new web-based suite of programs and databases providing visualization and a complex analysis of molecular sequence and structure for the data deposited at the Protein Data Bank (PDB). SMS operates with a collection of both publicly available data (PDB, HSSP, Prosite) and its own data (contacts, interface contacts, surface accessibility). Biologists find SMS useful because it provides a variety of algorithms and validated data, wrapped-up in a user friendly web interface. Using SMS it is now possible to analyze sequence to structure relationships, the quality of the structure, nature and volume of atomic contacts of intra and inter chain type, relative conservation of amino acids at the specific sequence position based on multiple sequence alignment, indications of folding essential residue (FER) based on the relationship of the residue conservation to the intra-chain contacts and Cα–Cα and Cβ–Cβ distance geometry. Specific emphasis in SMS is given to interface forming residues (IFR)—amino acids that define the interactive portion of the protein surfaces. SMS may simultaneously display and analyze previously superimposed structures. PDB updates trigger SMS updates in a synchronized fashion. SMS is freely accessible for public data at http://www.cbi.cnptia.embrapa.br, http://mirrors.rcsb.org/SMS and http://trantor.bioc.columbia.edu/SMS. PMID:12824333

  7. SMV model-based safety analysis of software requirements

    Energy Technology Data Exchange (ETDEWEB)

    Koh, Kwang Yong [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of); Seong, Poong Hyun [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of)], E-mail: phseong@kaist.ac.kr

    2009-02-15

    Fault tree analysis (FTA) is one of the most frequently applied safety analysis techniques when developing safety-critical industrial systems such as software-based emergency shutdown systems of nuclear power plants and has been used for safety analysis of software requirements in the nuclear industry. However, the conventional method for safety analysis of software requirements has several problems in terms of correctness and efficiency; the fault tree generated from natural language specifications may contain flaws or errors, while the manual work of safety verification is very labor-intensive and time-consuming. In this paper, we propose a new approach to resolve the problems of the conventional method; we generate a fault tree from a symbolic model verifier (SMV) model, not from natural language specifications, and verify safety properties automatically, not manually, with the SMV model checker. To demonstrate the feasibility of this approach, we applied it to shutdown system 2 (SDS2) of the Wolsong nuclear power plant (NPP). In spite of subtle ambiguities present in the approach, the results of this case study demonstrate its overall feasibility and effectiveness.

  8. Model Based Analysis and Test Generation for Flight Software

    Science.gov (United States)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  9. Stromatoporoid biometrics using image analysis software: A first order approach

    Science.gov (United States)

    Wolniewicz, Pawel

    2010-04-01

    Strommetric is a new image analysis computer program that performs morphometric measurements of stromatoporoid sponges. The program measures 15 features of skeletal elements (pillars and laminae) visible in both longitudinal and transverse thin sections. The software is implemented in C++, using the Open Computer Vision (OpenCV) library. The image analysis system distinguishes skeletal elements from sparry calcite using Otsu's method for image thresholding. More than 150 photos of thin sections were used as a test set, from which 36,159 measurements were obtained. The software provided about one hundred times more data than the current method applied until now. The data obtained are reproducible, even if the work is repeated by different workers. Thus the method makes the biometric studies of stromatoporoids objective.
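
    The segmentation step described above can be sketched with OpenCV directly, since Otsu's method is available as a built-in thresholding flag; the file name is a placeholder, and the follow-up area-fraction measurement is an assumption of this sketch, not Strommetric's output.

```python
# Otsu thresholding of a thin-section image to separate two intensity phases.
import cv2
import numpy as np

image = cv2.imread("thin_section.png", cv2.IMREAD_GRAYSCALE)   # placeholder file name
if image is None:
    raise FileNotFoundError("expected an 8-bit grayscale thin-section image")

# Otsu's method picks the threshold that minimises intra-class intensity variance.
threshold, binary = cv2.threshold(image, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Simple follow-up measurement, assuming skeletal elements are the brighter phase.
skeletal_fraction = np.count_nonzero(binary) / binary.size
print(f"Otsu threshold = {threshold:.0f}, bright-phase area fraction = {skeletal_fraction:.2%}")
```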

  10. TOM software toolbox: acquisition and analysis for electron tomography.

    Science.gov (United States)

    Nickell, Stephan; Förster, Friedrich; Linaroudis, Alexandros; Net, William Del; Beck, Florian; Hegerl, Reiner; Baumeister, Wolfgang; Plitzko, Jürgen M

    2005-03-01

    Automated data acquisition procedures have changed the perspectives of electron tomography (ET) in a profound manner. Elaborate data acquisition schemes with autotuning functions minimize exposure of the specimen to the electron beam and sophisticated image analysis routines retrieve a maximum of information from noisy data sets. "TOM software toolbox" integrates established algorithms and new concepts tailored to the special needs of low dose ET. It provides a user-friendly unified platform for all processing steps: acquisition, alignment, reconstruction, and analysis. Designed as a collection of computational procedures it is a complete software solution within a highly flexible framework. TOM represents a new way of working with the electron microscope and can serve as the basis for future high-throughput applications.

  11. Calibration Analysis Software for the ATLAS Pixel Detector

    CERN Document Server

    Stramaglia, Maria Elena; The ATLAS collaboration

    2015-01-01

    The calibration of the Pixel detector fulfills two main purposes: to tune front-end registers for establishing the best operational settings and to measure the tuning performance through a subset of scans. An analysis framework has been set up in order to take actions on the detector given the outcome of a calibration scan (e.g. to create a mask for disabling noisy pixels). The software framework to control all aspects of the Pixel detector scans and analyses is called Calibration Console. The introduction of a new layer, equipped with new Front End-I4 chips, required an update of the Console architecture. It now handles scans and scan analyses applied together to chips with different characteristics. An overview of the newly developed Calibration Analysis Software will be presented, together with some preliminary results.

  12. Calibration Analysis Software for the ATLAS Pixel Detector

    CERN Document Server

    Stramaglia, Maria Elena; The ATLAS collaboration

    2015-01-01

    The calibration of the ATLAS Pixel detector at LHC fulfils two main purposes: to tune the front-end configuration parameters for establishing the best operational settings and to measure the tuning performance through a subset of scans. An analysis framework has been set up in order to take actions on the detector given the outcome of a calibration scan (e.g. to create a mask for disabling noisy pixels). The software framework to control all aspects of the Pixel detector scans and analyses is called Calibration Console. The introduction of a new layer, equipped with new Front End-I4 chips, required an update of the Console architecture. It now handles scans and scan analyses applied together to chips with different characteristics. An overview of the newly developed Calibration Analysis Software will be presented, together with some preliminary results.

  13. Calibration analysis software for the ATLAS Pixel Detector

    Science.gov (United States)

    Stramaglia, Maria Elena

    2016-07-01

    The calibration of the ATLAS Pixel Detector at LHC fulfils two main purposes: to tune the front-end configuration parameters for establishing the best operational settings and to measure the tuning performance through a subset of scans. An analysis framework has been set up in order to take actions on the detector given the outcome of a calibration scan (e.g. to create a mask for disabling noisy pixels). The software framework to control all aspects of the Pixel Detector scans and analyses is called calibration console. The introduction of a new layer, equipped with new FE-I4 chips, required an update of the console architecture. It now handles scans and scan analyses applied together to chips with different characteristics. An overview of the newly developed calibration analysis software will be presented, together with some preliminary results.

  14. HistFitter software framework for statistical data analysis

    OpenAIRE

    Baak, M.; Besjes, G. J.; D. Côté; Koutsman, A.; Lorenz, J.; Short, D.

    2014-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton–proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearl...

  15. GATB: a software toolbox for genome assembly and analysis

    OpenAIRE

    Drezen, Erwan; Rizk, Guillaume; Chikhi, Rayan; Deltel, Charles; Lemaitre, Claire; Peterlongo, Pierre; Lavenier, Dominique

    2014-01-01

    The analysis of NGS data remains a time- and space-consuming task. Many efforts have been made to provide efficient data structures for indexing the terabytes of data generated by fast sequencing machines (suffix array, Burrows-Wheeler transform, Bloom filter, etc.). Mapper tools, genome assemblers, SNP callers, etc., make intensive use of these data structures to keep their memory footprint as low as possible. The overall efficiency of NGS software is brought...

  16. Software for a measuring facility for activation analysis

    International Nuclear Information System (INIS)

    A software package has been developed for an Apple PC. The programs are intended to control an automated measuring station for photon activation analysis at GELINA, the linear accelerator of C.B.N.M. at Geel (Belgium). They allow the user to set up a measuring scheme, execute it under computer control, accumulate and store 2K spectra using a built-in ADC, and output the results as listings, plots or evaluated reports.

  17. CAVASS: A Computer-Assisted Visualization and Analysis Software System

    OpenAIRE

    Grevera, George; Udupa, Jayaram; Odhner, Dewey; Zhuge, Ying; Souza, Andre; Iwanaga, Tad; Mishra, Shipra

    2007-01-01

    The Medical Image Processing Group at the University of Pennsylvania has been developing (and distributing with source code) medical image analysis and visualization software systems for a long period of time. Our most recent system, 3DVIEWNIX, was first released in 1993. Since that time, a number of significant advancements have taken place with regard to computer platforms and operating systems, networking capability, the rise of parallel processing standards, and the development of open-so...

  18. Nucleonica: Web-based Software Tools for Simulations and Analysis

    OpenAIRE

    Magill, Joseph; DREHER Raymond; SOTI Zsolt; LASCHE George

    2012-01-01

    The authors present a description of a new web-based software portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data res...

  19. Open-source data analysis and visualization software platform: SAGUARO

    Science.gov (United States)

    Kim, Dae Wook; Lewis, Benjamin J.; Burge, James H.

    2011-09-01

    Optical engineering projects often require massive data processing with many steps in the course of design, simulation, fabrication, metrology, and evaluation. A MATLAB™-based data processing platform has been developed to provide a standard way to manipulate and visualize various types of data that are created from optical measurement equipment. The operation of this software platform via a graphical user interface is easy and powerful. Data processing is performed by running modules that use a prescribed format for sharing data. Complex operations are performed by stringing modules together using macros. While numerous modules have been developed to allow data processing without the need to write software, the greatest power of the platform is provided by its flexibility. A developer's toolkit is provided to allow development and customization of modules, and the program allows a real-time interface with the standard MATLAB environment. This software, developed by the Large Optics Fabrication and Testing group at the University of Arizona, is now publicly available. We present the capabilities of the software and provide some demonstrations of its use for data analysis and visualization. Furthermore, we demonstrate the flexibility of the platform for solving new problems.

  20. SIMA: Python software for analysis of dynamic fluorescence imaging data

    Directory of Open Access Journals (Sweden)

    Patrick eKaifosh

    2014-09-01

    Full Text Available Fluorescence imaging is a powerful method for monitoring dynamic signals in the nervous system. However, analysis of dynamic fluorescence imaging data remains burdensome, in part due to the shortage of available software tools. To address this need, we have developed SIMA, an open source Python package that facilitates common analysis tasks related to fluorescence imaging. Functionality of this package includes correction of motion artifacts occurring during in vivo imaging with laser-scanning microscopy, segmentation of imaged fields into regions of interest (ROIs), and extraction of signals from the segmented ROIs. We have also developed a graphical user interface (GUI) for manual editing of the automatically segmented ROIs and automated registration of ROIs across multiple imaging datasets. This software has been designed with flexibility in mind to allow for future extension with different analysis methods and potential integration with other packages. Software, documentation, and source code for the SIMA package and ROI Buddy GUI are freely available at http://www.losonczylab.org/sima/.

  1. Nuclear analysis software. Pt. 1: Spectrum transfer and reformatting (SPEDAC)

    International Nuclear Information System (INIS)

    GANAAS (Gamma, Activity, and Neutron Activation Analysis System) is one in the family of software packages developed under the auspices of the International Atomic Energy Agency. Primarily, the package was intended to support the IAEA Technical Assistance and Cooperation projects in developing countries. However, it is open domain software that can be copied and used by anybody, except for commercial purposes. All the nuclear analysis software provided by the IAEA has the same design philosophy and a similar structure. The intention was to provide the user with maximum flexibility, combined with a simple and logical organization that requires minimal digging through the manuals. GANAAS is a modular system. It consists of several programmes that can be installed on the hard disk as they are needed. Obviously, some parts of the system are required in all cases. Those are installed at the beginning, without consulting the operator. GANAAS offers the opportunity to expand and improve the system. Gamma spectrum evaluation programmes using different fitting algorithms can be added to GANAAS, under the condition that the format of their input and output files corresponds to the rules of GANAAS. The same applies to the quantitative analysis parts of the programme.

  2. An analysis of related software cycles among organizations, people and the software industry

    OpenAIRE

    Adams, Brady.

    2008-01-01

    There is a need to understand the cycles associated with software upgrades as they affect people, organizations and the software industry. This thesis intends to explore the moderating factors of these three distinct and disjointed cycles and propose courses of action towards mitigating various issues and problems inherent in the software upgrade process. This thesis will acknowledge that three related but disjointed cycles are common in many software upgrade ventures in today's organization...

  3. Search for Chemical Biomarkers on Mars Using the Sample Analysis at Mars Instrument Suite on the Mars Science Laboratory

    Science.gov (United States)

    Glavin, D. P.; Conrad, P.; Dworkin, J. P.; Eigenbrode, J.; Mahaffy, P. R.

    2011-01-01

    One key goal for the future exploration of Mars is the search for chemical biomarkers, including complex organic compounds important in life on Earth. The Sample Analysis at Mars (SAM) instrument suite on the Mars Science Laboratory (MSL) will provide the most sensitive measurements of the organic composition of rocks and regolith samples ever carried out in situ on Mars. SAM consists of a gas chromatograph (GC), quadrupole mass spectrometer (QMS), and tunable laser spectrometer to measure volatiles in the atmosphere and released from rock powders heated up to 1000 °C. The measurement of organics in solid samples will be accomplished by three experiments: (1) pyrolysis QMS to identify alkane fragments and simple aromatic compounds; (2) pyrolysis GCMS to separate and identify complex mixtures of larger hydrocarbons; and (3) chemical derivatization and GCMS to extract less volatile compounds, including amino and carboxylic acids, that are not detectable by the other two experiments.

  4. IMMAN: free software for information theory-based chemometric analysis.

    Science.gov (United States)

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis denominated IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty are incorporated to the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
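
    Two of the supervised criteria named above, information gain and symmetrical uncertainty, are standard enough to sketch directly; the toy descriptor and class label below are invented and stand in for an already-discretised molecular descriptor, not the Arcene benchmark used in the paper.

```python
# Information gain and symmetrical uncertainty for one discretised feature.
import numpy as np
from collections import Counter

def entropy(values):
    counts = np.array(list(Counter(values).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(feature, target):
    h_t = entropy(target)
    h_t_given_f = 0.0
    n = len(target)
    for v in set(feature):
        idx = [i for i, f in enumerate(feature) if f == v]
        h_t_given_f += len(idx) / n * entropy([target[i] for i in idx])
    return h_t - h_t_given_f

def symmetrical_uncertainty(feature, target):
    return 2.0 * information_gain(feature, target) / (entropy(feature) + entropy(target))

feature = ["low", "low", "high", "high", "mid", "mid", "high", "low"]
target = [0, 0, 1, 1, 0, 1, 1, 0]
print(f"information gain        = {information_gain(feature, target):.3f}")
print(f"symmetrical uncertainty = {symmetrical_uncertainty(feature, target):.3f}")
```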

  6. Analysis of signal acquisition in GPS receiver software

    Directory of Open Access Journals (Sweden)

    Vlada S. Sokolović

    2011-01-01

    Full Text Available This paper presents a critical analysis of the signal processing flow carried out in GPS receiver software, which serves as a basis for a critical comparison of different signal processing architectures within the GPS receiver. Increased flexibility and reduced commercial costs of GPS devices, including mobile devices, can be achieved by using software defined radio (SDR) technology. The SDR application can be realized by replacing certain hardware components in a GPS receiver. Signal processing in the SDR is implemented using a programmable DSP (Digital Signal Processing) or FPGA (Field Programmable Gate Array) circuit, which allows a simple change of digital signal processing algorithms and a simple change of the receiver parameters. The starting point of the research is the signal generated on the satellite, the structure of which is shown in the paper. Based on the GPS signal structure, a receiver is realized with the task of extracting the appropriate signal from the spectrum and detecting it. Based on the collected navigation data, the receiver calculates the position of the end user. The signal coming from the satellite may be at the carrier frequency L1 or L2. Since the SPS is used in the civilian service, all the tests shown in this work were performed on the L1 signal. The signal arriving at the receiver is generated with spread spectrum technology and lies below the noise level. Such signals often interfere with signals from the environment, which makes it difficult for a receiver to perform proper detection and signal processing. Therefore, signal processing technology is continually being improved, aiming at more accurate and faster signal processing. All tests were carried out on a signal acquired from the satellite using the SE4110 input circuit used for filtering, amplification and signal selection. The samples of the received signal were forwarded to a computer for data post processing, i. e
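
    The acquisition stage described above is commonly implemented as FFT-based circular correlation of the received samples with a local C/A-code replica over a grid of Doppler bins. The sketch below illustrates that idea with a synthetic received signal and a random stand-in PRN sequence; the sampling rate, noise level, and search grid are assumptions, and a real receiver would use samples from a front end such as the SE4110.

```python
# FFT-based code-phase / Doppler search on a synthetic GPS-like signal.
import numpy as np

rng = np.random.default_rng(3)
fs = 2.046e6                      # sampling rate (Hz): 2 samples per chip (assumed)
n = 2046                          # one code period: 1023 chips x 2 samples per chip
code = rng.choice([-1.0, 1.0], size=1023)            # stand-in PRN sequence
replica = np.repeat(code, 2)                         # local replica at 2 samples/chip

# Build a synthetic received signal: delayed code, Doppler shift, heavy noise.
true_delay, true_doppler = 713, 2500.0
t = np.arange(n) / fs
received = (np.roll(replica, true_delay) * np.exp(2j * np.pi * true_doppler * t)
            + 2.0 * (rng.normal(size=n) + 1j * rng.normal(size=n)))

best = (0.0, None, None)
replica_fft_conj = np.conj(np.fft.fft(replica))
for doppler in np.arange(-5000.0, 5001.0, 500.0):    # 500 Hz Doppler search bins
    wiped = received * np.exp(-2j * np.pi * doppler * t)   # Doppler wipe-off
    corr = np.abs(np.fft.ifft(np.fft.fft(wiped) * replica_fft_conj))
    if corr.max() > best[0]:
        best = (corr.max(), int(corr.argmax()), doppler)

print(f"code phase = {best[1]} samples, Doppler = {best[2]:+.0f} Hz")
```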

  7. A METHOD FOR SELECTING SOFTWARE FOR DYNAMIC EVENT ANALYSIS I: PROBLEM SELECTION

    Energy Technology Data Exchange (ETDEWEB)

    J. M. Lacy; S. R. Novascone; W. D. Richins; T. K. Larson

    2007-08-01

    New nuclear power reactor designs will require resistance to a variety of possible malevolent attacks, as well as traditional dynamic accident scenarios. The design/analysis team may be faced with a broad range of phenomena including air and ground blasts, high-velocity penetrators or shaped charges, and vehicle or aircraft impacts. With a host of software tools available to address these high-energy events, the analysis team must evaluate and select the software most appropriate for their particular set of problems. The accuracy of the selected software should then be validated with respect to the phenomena governing the interaction of the threat and structure. In this paper, we present a method for systematically comparing current high-energy physics codes for specific applications in new reactor design. Several codes are available for the study of blast, impact, and other shock phenomena. Historically, these packages were developed to study specific phenomena such as explosives performance, penetrator/target interaction, or accidental impacts. As developers generalize the capabilities of their software, legacy biases and assumptions can remain that could affect the applicability of the code to other processes and phenomena. R&D institutions generally adopt one or two software packages and use them almost exclusively, performing benchmarks on a single-problem basis. At the Idaho National Laboratory (INL), new comparative information was desired to permit researchers to select the best code for a particular application by matching its characteristics to the physics, materials, and rate scale (or scales) representing the problem at hand. A study was undertaken to investigate the comparative characteristics of a group of shock and high-strain rate physics codes including ABAQUS, LS-DYNA, CTH, ALEGRA, ALE-3D, and RADIOSS. A series of benchmark problems were identified to exercise the features and capabilities of the subject software. To be useful, benchmark problems

  8. A software architectural framework specification for neutron activation analysis

    International Nuclear Information System (INIS)

    Neutron Activation Analysis (NAA) is a sensitive multi-element nuclear analytical technique that has been routinely applied by research reactor (RR) facilities to environmental, nutritional, health related, geological and geochemical studies. As RR facilities face calls to increase their research output and impact, with existing or reducing budgets, automation of NAA offers a possible solution. However, automation has many challenges, not the least of which is a lack of system architecture standards to establish acceptable mechanisms for the various hardware/software and software/software interactions among data acquisition systems, specialised hardware such as sample changers, sample loaders, and data processing modules. This lack of standardization often results in automation hardware and software being incompatible with existing system components, in a facility looking to automate its NAA operations. This limits the availability of automation to a few RR facilities with adequate budgets or in-house engineering resources. What is needed is a modern open system architecture for NAA that provides the required set of functionalities. This paper describes such an 'architectural framework' (OpenNAA), and portions of a reference implementation. As an example of the benefits, calculations indicate that applying this architecture to the compilation and QA steps associated with the analysis of 35 elements in 140 samples, with 14 SRMs, can reduce the time required by over 80%. The adoption of open standards in the nuclear industry has been very successful over the years in promoting interchangeability and maximising the lifetime and output of nuclear measurement systems. OpenNAA will provide similar benefits within the NAA application space, safeguarding user investments in their current system, while providing a solid path for development into the future. (author)

  9. DiaSuite: a Tool Suite To Develop Sense/Compute/Control Applications

    OpenAIRE

    Bertran, Benjamin; Bruneau, Julien; Cassou, Damien; Loriant, Nicolas; Balland, Emilie; Consel, Charles

    2014-01-01

    We present DiaSuite, a tool suite that uses a software design approach to drive the development process. DiaSuite focuses on a specific domain, namely Sense/Compute/Control (SCC) applications. It comprises a domain-specific design language, a compiler producing a Java programming framework, a 2D-renderer to simulate an application, and a deployment framework. We have validated our tool suite on a variety of concrete applications in areas including telecommunications, building automation, robo...

  10. Software fault tree analysis of an automated control system device written in Ada

    OpenAIRE

    Winter, Mathias William.

    1995-01-01

    Software Fault Tree Analysis (SFTA) is a technique used to analyze software for faults that could lead to hazardous conditions in systems which contain software components. Previous thesis works have developed three Ada-based, semi-automated software analysis tools: the Automated Code Translation Tool (ACm), an Ada statement template generator; the Fault Tree Editor (Fm), a graphical fault tree editor; and the Fault Isolator (Fl), an automated software fault tree isolator. These previous works d...

  11. Comparison of two software versions for assessment of body-composition analysis by DXA

    DEFF Research Database (Denmark)

    Vozarova, B; Wang, J; Weyer, C;

    2001-01-01

    To compare two software versions provided by the Lunar Corporation for assessment of body composition analysis by DXA.

  12. BABELOMICS: a suite of web tools for functional annotation and analysis of groups of genes in high-throughput experiments.

    Science.gov (United States)

    Al-Shahrour, Fátima; Minguez, Pablo; Vaquerizas, Juan M; Conde, Lucía; Dopazo, Joaquín

    2005-07-01

    We present Babelomics, a complete suite of web tools for the functional analysis of groups of genes in high-throughput experiments, which includes the use of information on Gene Ontology terms, interpro motifs, KEGG pathways, Swiss-Prot keywords, analysis of predicted transcription factor binding sites, chromosomal positions and presence in tissues with determined histological characteristics, through five integrated modules: FatiGO (fast assignment and transference of information), FatiWise, transcription factor association test, GenomeGO and tissues mining tool, respectively. Additionally, another module, FatiScan, provides a new procedure that integrates biological information in combination with experimental results in order to find groups of genes with modest but coordinate significant differential behaviour. FatiScan is highly sensitive and is capable of finding significant asymmetries in the distribution of genes of common function across a list of ordered genes even if these asymmetries were not extreme. The strong multiple-testing nature of the contrasts made by the tools is taken into account. All the tools are integrated in the gene expression analysis package GEPAS. Babelomics is the natural evolution of our tool FatiGO (which analysed almost 22,000 experiments during the last year) to include more sources on information and new modes of using it. Babelomics can be found at http://www.babelomics.org.
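
    The kind of test behind FatiGO-style over-representation analysis can be sketched with a single Fisher's exact test on a 2×2 table (gene list versus background, annotated versus not annotated with a term); the counts below are invented, and in practice the result would still need the multiple-testing correction the abstract mentions.

```python
# Over-representation of one functional term in a gene list versus the background.
from scipy.stats import fisher_exact

# 2x2 contingency table: rows = in gene list / in background,
#                        cols = annotated with term / not annotated.
in_list_with_term, in_list_without_term = 40, 160
background_with_term, background_without_term = 300, 19500

odds_ratio, p_value = fisher_exact(
    [[in_list_with_term, in_list_without_term],
     [background_with_term, background_without_term]],
    alternative="greater",
)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2e}")
```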

  13. PET-Tool: a software suite for comprehensive processing and managing of Paired-End diTag (PET) sequence data

    Directory of Open Access Journals (Sweden)

    Wei Chia-Lin

    2006-08-01

    Full Text Available Abstract Background We recently developed the Paired End diTag (PET) strategy for efficient characterization of mammalian transcriptomes and genomes. The paired end nature of short PET sequences derived from long DNA fragments raised a new set of bioinformatics challenges, including how to extract PETs from raw sequence reads, and how to correctly yet efficiently map PETs to reference genome sequences. To accommodate and streamline data analysis of the large volume of PET sequences generated from each PET experiment, an automated PET data processing pipeline is desirable. Results We designed an integrated computation program package, PET-Tool, to automatically process PET sequences and map them to the genome sequences. The Tool was implemented as a web-based application composed of four modules: the Extractor module for PET extraction; the Examiner module for analytic evaluation of PET sequence quality; the Mapper module for locating PET sequences in the genome sequences; and the ProjectManager module for data organization. The performance of PET-Tool was evaluated through the analyses of 2.7 million PET sequences. It was demonstrated that PET-Tool is accurate and efficient in extracting PET sequences and removing artifacts from large volume datasets. Using optimized mapping criteria, over 70% of quality PET sequences were mapped specifically to the genome sequences. With a 2.4 GHz LINUX machine, it takes approximately six hours to process one million PETs from extraction to mapping. Conclusion The speed, accuracy, and comprehensiveness have proved that PET-Tool is an important and useful component in PET experiments, and can be extended to accommodate other related analyses of paired-end sequences. The Tool also provides user-friendly functions for data quality checks and a system for multi-layer data management.

  14. UPVapor: Cofrentes nuclear power plant production results analysis software

    International Nuclear Information System (INIS)

    UPVapor software version 02 has been developed for the Cofrentes nuclear power plant Data Analysis Department (Spain). It is a graphical analysis environment in which users have available all the plant variables registered in the process computer system (SIEC). To this end, UPVapor offers many advanced graphic tools for ease of work, as well as a friendly environment that is easy to use and highly configurable. Plant variables are classified in the same way as in the SIEC computer, and their values are taken from it through the Iberdrola network. UPVapor can generate two different types of graphics: evolution graphs and X-Y graphs. The first type analyses the evolution of up to twenty plant variables in a user-defined time period according to historic plant files. Many tools are available: cursors, graphic configuration, moving averages, visualization of invalid data, and more. Moreover, a particular analysis configuration can be saved as a preselection, making it possible to load the preselection directly and quickly monitor a group of preselected plant variables. In X-Y graphs, it is possible to analyse the value of one variable against another over a defined time. As an option, users can filter the data according to a certain range of a variable, with the possibility of programming up to five filters. Like the other graph type, the X-Y graph has many configuration, saving and printing options. With UPVapor, data analysts can save valuable time during daily work and, as it is easy to use, it allows other users to perform their own analyses without asking the analysts to develop them. Besides, it can be used from any work centre with access to the network. (Author)

  15. UPVapor: Cofrentes nuclear power plant production results analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Curiel, M. [Logistica y Acondicionamientos Industriales SAU, Sorolla Center, local 10, Av. de las Cortes Valencianas No. 58, 46015 Valencia (Spain); Palomo, M. J. [ISIRYM, Universidad Politecnica de Valencia, Camino de Vera s/n, Valencia (Spain); Baraza, A. [Iberdrola Generacion S. A., Central Nuclear Cofrentes, Carretera Almansa Requena s/n, 04662 Cofrentes, Valencia (Spain); Vaquer, J., E-mail: m.curiel@lainsa.co [TITANIA Servicios Tecnologicos SL, Sorolla Center, local 10, Av. de las Cortes Valencianas No. 58, 46015 Valencia (Spain)

    2010-10-15

    UPVapor software version 02 has been developed for the Cofrentes nuclear power plant Data Analysis Department (Spain). It is a graphical analysis environment in which users have available all the plant variables registered in the process computer system (SIEC). To this end, UPVapor offers many advanced graphic tools for ease of work, as well as a friendly environment that is easy to use and highly configurable. Plant variables are classified in the same way as in the SIEC computer, and their values are taken from it through the Iberdrola network. UPVapor can generate two different types of graphics: evolution graphs and X-Y graphs. The first type analyses the evolution of up to twenty plant variables in a user-defined time period according to historic plant files. Many tools are available: cursors, graphic configuration, moving averages, visualization of invalid data, and more. Moreover, a particular analysis configuration can be saved as a preselection, making it possible to load the preselection directly and quickly monitor a group of preselected plant variables. In X-Y graphs, it is possible to analyse the value of one variable against another over a defined time. As an option, users can filter the data according to a certain range of a variable, with the possibility of programming up to five filters. Like the other graph type, the X-Y graph has many configuration, saving and printing options. With UPVapor, data analysts can save valuable time during daily work and, as it is easy to use, it allows other users to perform their own analyses without asking the analysts to develop them. Besides, it can be used from any work centre with access to the network. (Author)

  16. Feature-Oriented Nonfunctional Requirement Analysis for Software Product Line

    Institute of Scientific and Technical Information of China (English)

    Xin Peng; Seok-Won Lee; Wen-Yun Zhao

    2009-01-01

    Domain analysis in software product line (SPL) development provides a basis for core assets design and implementation by a systematic and comprehensive commonality/variability analysis. In feature-oriented SPL methods, products of the domain analysis are domain feature models and corresponding feature decision models to facilitate application-oriented customization. As in requirement analysis for a single system, the domain analysis in the SPL development should consider both functional and nonfunctional domain requirements. However, the nonfunctional requirements (NFRs) are often neglected in the existing domain analysis methods. In this paper, we propose a context-based method of the NFR analysis for the SPL development. In the method, NFRs are materialized by connecting nonfunctional goals with real-world context, thus NFR elicitation and variability analysis can be performed by context analysis for the whole domain with the assistance of NFR templates and NFR graphs. After the variability analysis, our method integrates both functional and nonfunctional perspectives by incorporating the nonfunctional goals and operationalizations into an initial functional feature model. NFR-related constraints are also elicited and integrated. Finally, a decision model with both functional and nonfunctional perspectives is constructed to facilitate application-oriented feature model customization. A computer-aided grading system (CAGS) product line is employed to demonstrate the method throughout the paper.

  17. Analysis of Performance of Stereoscopic-Vision Software

    Science.gov (United States)

    Kim, Won; Ansar, Adnan; Steele, Robert; Steinke, Robert

    2007-01-01

    A team of JPL researchers has analyzed stereoscopic vision software and produced a document describing its performance. This software is of the type used in maneuvering exploratory robotic vehicles on Martian terrain. The software in question utilizes correlations between portions of the images recorded by two electronic cameras to compute stereoscopic disparities, which, in conjunction with camera models, are used in computing distances to terrain points to be included in constructing a three-dimensional model of the terrain. The analysis included effects of correlation-window size, a pyramidal image down-sampling scheme, vertical misalignment, focus, maximum disparity, stereo baseline, and range ripples. Contributions of sub-pixel interpolation, vertical misalignment, and foreshortening to stereo correlation error were examined theoretically and experimentally. It was found that camera-calibration inaccuracy contributes to both down-range and cross-range error but stereo correlation error affects only the down-range error. Experimental data for quantifying the stereo disparity error were obtained by use of reflective metrological targets taped to corners of bricks placed at known positions relative to the cameras. For the particular 1,024-by-768-pixel cameras of the system analyzed, the standard deviation of the down-range disparity error was found to be 0.32 pixel.
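
    For orientation, the down-range figures quoted above follow from standard rectified-stereo geometry. The sketch below (Python) shows how a disparity and its 0.32-pixel error translate into range and range error; the focal length and baseline are assumed example values, not taken from the JPL report.

```python
# A minimal sketch (not the JPL code) of how disparity maps to down-range
# distance for a rectified stereo pair; focal_length_px, baseline_m and the
# example disparity are assumptions chosen only for illustration.
def range_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Down-range distance for a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

f_px, b_m = 1250.0, 0.20                                  # assumed camera parameters
z = range_from_disparity(50.0, f_px, b_m)                 # 5.0 m for a 50-pixel disparity
# propagate the 0.32-pixel disparity error: dZ ~= Z^2 / (f*B) * dd
dz = (z ** 2) / (f_px * b_m) * 0.32
print(f"range {z:.2f} m, down-range error approx {dz:.3f} m")
```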

  18. Development of RCM analysis software for Korean nuclear power plants

    International Nuclear Information System (INIS)

    A software package called the KEPCO RCM workstation (KRCM) has been developed to optimize the maintenance strategies of Korean nuclear power plants. The program modules of the KRCM were designed in a manner that combines EPRI methodologies and KEPRI analysis techniques. The KRCM is being applied to three pilot systems: the chemical and volume control system, the main steam system, and the compressed air system of Yonggwang Units 1 and 2. In addition, the KRCM can be utilized as a tool to meet a part of the requirements of the maintenance rule (MR) imposed by the U.S. NRC.

  19. Fault tree analysis of software at Ontario Hydro

    International Nuclear Information System (INIS)

    The fault tree technique has been used by Ontario Hydro to effectively review and verify safety critical systems at its nuclear generating stations (NGS). Recent efforts, on the Shutdown Systems at Darlington NGS and the protective fuel-handling software at Bruce NGS A, have shown the fault tree technique to be a valuable tool for uncovering latent conditional errors and facilitating recommendations to increase system fault-tolerance. The experiences of the Bruce NGS A analysis are presented here as a vehicle to illustrate the practical advantages and limitations of the fault tree technique

  20. Development of software for the thermohydraulic analysis of air coolers

    Directory of Open Access Journals (Sweden)

    Šerbanović Slobodan P.

    2003-01-01

    Air coolers consume much more energy than other heat exchangers due to the large fan power required. This is an additional reason to establish reliable methods for the rational design and thermohydraulic analysis of these devices. The optimal values of the outlet temperature and air flow rate are of particular importance. The paper presents a methodology for the thermohydraulic calculation of air cooler performance, which is incorporated in the "Air Cooler" software module. The module covers two options: cooling and/or condensation of process fluids by ambient air. The calculated results can be presented in various ways, i.e., in tabular and graphical form.

  1. Development of RCM analysis software for Korean nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Ho; Choi, Kwang Hee; Jeong, Hyeong Jong [Korea Electric Power Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    A software package called the KEPCO RCM workstation (KRCM) has been developed to optimize the maintenance strategies of Korean nuclear power plants. The program modules of the KRCM were designed in a manner that combines EPRI methodologies and KEPRI analysis techniques. The KRCM is being applied to three pilot systems: the chemical and volume control system, the main steam system, and the compressed air system of Yonggwang Units 1 and 2. In addition, the KRCM can be utilized as a tool to meet a part of the requirements of the maintenance rule (MR) imposed by the U.S. NRC. 3 refs., 4 figs. (Author)

  2. Tool Support for Parametric Analysis of Large Software Simulation Systems

    Science.gov (United States)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
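
    As a rough illustration of the kind of case generation described above (not the NASA tool itself), the sketch below combines exhaustive 2-factor coverage of a few discrete parameter levels with Monte Carlo draws for the remaining parameters; all parameter names and ranges are invented for the example.

```python
# Sketch of pairwise (2-factor) combinatorial cases plus Monte Carlo fill-in.
# Parameter names, levels and ranges are hypothetical placeholders.
import itertools
import random

levels = {"mass_kg": [80, 100, 120], "thrust_N": [5, 10], "gain": [0.1, 0.5, 1.0]}
continuous = {"sensor_noise": (0.0, 0.05)}   # uniform ranges sampled by Monte Carlo

def pairwise_cases(levels, continuous, seed=0):
    rng = random.Random(seed)
    for p1, p2 in itertools.combinations(levels, 2):
        for v1, v2 in itertools.product(levels[p1], levels[p2]):
            case = {k: rng.choice(v) for k, v in levels.items()}   # random fill
            case.update({p1: v1, p2: v2})                          # forced pair
            case.update({k: rng.uniform(*r) for k, r in continuous.items()})
            yield case

for case in pairwise_cases(levels, continuous):
    pass  # run the simulation on `case` here and log outputs for later clustering
```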

  3. Integrating software architectures for distributed simulations and simulation analysis communities.

    Energy Technology Data Exchange (ETDEWEB)

    Goldsby, Michael E.; Fellig, Daniel; Linebarger, John Michael; Moore, Patrick Curtis; Sa, Timothy J.; Hawley, Marilyn F.

    2005-10-01

    The one-year Software Architecture LDRD (No.79819) was a cross-site effort between Sandia California and Sandia New Mexico. The purpose of this research was to further develop and demonstrate integrating software architecture frameworks for distributed simulation and distributed collaboration in the homeland security domain. The integrated frameworks were initially developed through the Weapons of Mass Destruction Decision Analysis Center (WMD-DAC), sited at SNL/CA, and the National Infrastructure Simulation & Analysis Center (NISAC), sited at SNL/NM. The primary deliverable was a demonstration of both a federation of distributed simulations and a federation of distributed collaborative simulation analysis communities in the context of the same integrated scenario, which was the release of smallpox in San Diego, California. To our knowledge this was the first time such a combination of federations under a single scenario has ever been demonstrated. A secondary deliverable was the creation of the standalone GroupMeld™ collaboration client, which uses the GroupMeld™ synchronous collaboration framework. In addition, a small pilot experiment that used both integrating frameworks allowed a greater range of crisis management options to be performed and evaluated than would have been possible without the use of the frameworks.

  4. Stress Analysis of LPG Cylinder Using ANSYS Software

    Directory of Open Access Journals (Sweden)

    Suhas A.Rewatkar

    2013-01-01

    The robot hand was analyzed using dedicated software for FEM analysis. The model was exported to the FEM processor, i.e. ANSYS, where the geometry was updated and the structure meshed using 3D elements. Finite element analysis is a method to computationally model reality in a mathematical form to better understand a highly complex problem. In the real world, everything that occurs results from the interaction between atoms (and sub-particles of those atoms): billions and billions and billions of them. If we were to simulate the world in a computer, we would have to simulate this interaction based on the simple laws of physics. However, no computer can process the near-infinite number of atoms in objects, so instead we model 'finite' groups of them.

  5. Analysis of Test Efficiency during Software Development Process

    OpenAIRE

    Nair, T. R. Gopalakrishnan; Suma, V.; Tiwari, Pranesh Kumar

    2012-01-01

    One of the prerequisites of any organization is unvarying sustainability in the dynamic and competitive industrial environment. Development of high-quality software is therefore an inevitable constraint of any software industry. Defect management being one of the most influential factors in the production of high-quality software, it is obligatory for software organizations to orient themselves towards effective defect management. Since the time of software evolution, testing is deemed a...

  6. Evaluation of Peak-Fitting Software for Gamma Spectrum Analysis

    CERN Document Server

    Zahn, Guilherme S; Moralles, Maurício

    2015-01-01

    In all applications of gamma-ray spectroscopy, one of the most important and delicate parts of the data analysis is the fitting of the gamma-ray spectra, where information such as the number of counts, the position of the centroid and the width, for instance, is associated with each peak of each spectrum. There is a huge choice of computer programs that perform this type of analysis, and the most commonly used in routine work are the ones that automatically locate and fit the peaks; this fit can be made in several different ways -- the most common ways are to fit a Gaussian function to each peak or simply to integrate the area under the peak, but some programs go far beyond this and include several small corrections to the simple Gaussian peak function, in order to compensate for secondary effects. In this work several gamma-ray spectroscopy programs are compared in the task of finding and fitting the gamma-ray peaks in spectra taken with standard sources of $^{137}$Cs, $^{60}$Co, $^{133}$Ba and $^{152}$Eu. The results...
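
    The simplest of the fitting approaches compared above, a plain Gaussian plus constant background fitted to a single peak, can be sketched in a few lines of Python; the synthetic spectrum and the scipy-based fit below are illustrative only and are not taken from any of the evaluated programs.

```python
# Minimal Gaussian-plus-background peak fit on a synthetic spectrum.
# The channel axis, peak parameters and background level are invented.
import numpy as np
from scipy.optimize import curve_fit

def gauss(x, area, centroid, sigma, bkg):
    return area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-(x - centroid) ** 2 / (2 * sigma ** 2)) + bkg

channels = np.arange(600, 725)
true_params = (5000.0, 661.7, 3.0, 20.0)            # area, centroid, sigma, background
rng = np.random.default_rng(1)
counts = rng.poisson(gauss(channels, *true_params))  # Poisson counting noise

popt, pcov = curve_fit(gauss, channels, counts, p0=(counts.sum(), 662.0, 2.0, 10.0))
area, centroid, sigma, bkg = popt
print(f"centroid {centroid:.2f} ch, FWHM {2.355 * sigma:.2f} ch, net area {area:.0f}")
```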

  7. HistFitter software framework for statistical data analysis

    CERN Document Server

    Baak, M; Cote, D; Koutsman, A; Lorenz, J; Short, D

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fitted to data and interpreted with statistical tests. A key innovation of HistFitter is its design, which is rooted in core analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its very fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with mu...

  8. Comparative analysis of results between CASMO, MCNP and Serpent for a suite of Benchmark problems on BWR reactors

    International Nuclear Information System (INIS)

    In this paper a comparison is made between the CASMO-4, MCNP6 and Serpent codes in analyzing a suite of benchmark problems for BWR reactors. The benchmark problem consists of two different geometries: a BWR fuel pin cell and a BWR fuel assembly. To facilitate the study of reactor physics, the nuclear characteristics of the fuel pin are provided in detail, such as the burnup dependence, the reactivity of selected nuclides, etc. With respect to the fuel assembly, the presented results concern the infinite multiplication factor for different burnup steps and different void conditions. The analysis of this set of benchmark problems provides comprehensive test problems for the next generation of BWR fuels with extended burnup. It is important to note that the purpose of this comparison is to validate the modeling methodologies for different operating conditions, should another BWR assembly be considered. The results will lie within a range with some uncertainty that does not depend on the code used. The Escuela Superior de Fisica y Matematicas of the Instituto Politecnico Nacional (IPN) (Mexico) has accumulated some experience in using Serpent, owing to the potential of this code over other commercial codes such as CASMO and MCNP. The obtained results for the infinite multiplication factor are encouraging and motivate continuing the studies with the generation of the cross sections (XS) of a core, so that in a next step a corresponding nuclear data library can be constructed and used by codes developed as part of the development project of the Mexican Analysis Platform of Nuclear Reactors AZTLAN. (Author)

  9. ATLAS tile calorimeter cesium calibration control and analysis software

    International Nuclear Information System (INIS)

    An online control system to calibrate and monitor ATLAS Barrel hadronic calorimeter (TileCal) with a movable radioactive source, driven by liquid flow, is described. To read out and control the system an online software has been developed, using ATLAS TDAQ components like DVS (Diagnostic and Verification System) to verify the hardware before running, IS (Information Server) for data and status exchange between networked computers, and other components like DDC (DCS to DAQ Connection), to connect to PVSS-based slow control systems of Tile Calorimeter, high voltage and low voltage. A system of scripting facilities, based on Python language, is used to handle all the calibration and monitoring processes from hardware perspective to final data storage, including various abnormal situations. A QT based graphical user interface to display the status of the calibration system during the cesium source scan is described. The software for analysis of the detector response, using online data, is discussed. Performance of the system and first experience from the ATLAS pit are presented

  10. Visual data mining and analysis of software repositories

    NARCIS (Netherlands)

    Voinea, Lucian; Telea, Alexandru

    2007-01-01

    In this article we describe an ongoing effort to integrate information visualization techniques into the process of configuration management for software systems. Our focus is to help software engineers manage the evolution of large and complex software systems by offering them effective and efficie

  11. Integrated Software Environment for Pressurized Thermal Shock Analysis

    Directory of Open Access Journals (Sweden)

    Dino Araneo

    2011-01-01

    The present paper describes the main features and an application to a real Nuclear Power Plant (NPP) of an Integrated Software Environment (in the following referred to as “platform”) developed at the University of Pisa (UNIPI) to perform Pressurized Thermal Shock (PTS) analysis. The platform is written in Java for portability and it implements all the steps foreseen in the methodology developed at UNIPI for the deterministic analysis of PTS scenarios. The methodology starts with the thermal-hydraulic analysis of the NPP with a system code (such as Relap5-3D or Cathare2) during a selected transient scenario. The results so obtained are then processed to provide boundary conditions for the next step, that is, a CFD calculation. Once the system pressure and the RPV wall temperature are known, the stresses inside the RPV wall can be calculated by means of a Finite Element (FE) code. The last step of the methodology is the Fracture Mechanics (FM) analysis, using weight functions, aimed at evaluating the stress intensity factor (KI) at the crack tip to be compared with the critical stress intensity factor KIc. The platform automates all these steps foreseen in the methodology once the user specifies a number of boundary conditions at the beginning of the simulation.

  12. Development of the free-space optical communications analysis software

    Science.gov (United States)

    Jeganathan, Muthu; Mecherle, G. Stephen; Lesh, James R.

    1998-05-01

    The Free-space Optical Communication Analysis Software (FOCAS) was developed at the Jet Propulsion Laboratory (JPL) to provide mission planners, systems engineers and communications engineers with an easy to use tool to analyze direct-detection optical communication links. The FOCAS program, implemented in Microsoft Excel, gives it all the power and flexibility built into the spreadsheet. An easy-to-use interface, developed using Visual Basic for Applications (VBA), to the spreadsheet allows easy input of data and parameters. A host of pre-defined components allow an analyst to configure a link without having to know the details of the components. FOCAS replaces the over-a-decade-old FORTRAN program called OPTI widely used previously at JPL. This paper describes the features and capabilities of the Excel-spreadsheet-based FOCAS program.
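
    For readers unfamiliar with what such a link-analysis tool computes, the sketch below shows a bare-bones direct-detection link budget in Python; every value (laser power, apertures, losses, range) is an assumed example rather than a FOCAS default, and a real analysis adds detector, pointing and background terms.

```python
# Back-of-the-envelope free-space optical link budget; all inputs are assumed.
import math

def aperture_gain_db(diameter_m, wavelength_m):
    """Ideal telescope gain G ~ (pi * D / lambda)^2, expressed in dB."""
    return 20 * math.log10(math.pi * diameter_m / wavelength_m)

def received_power_dbm(p_tx_w, gain_tx_db, gain_rx_db, losses_db, wavelength_m, range_m):
    """Friis-style budget: Prx = Ptx + Gtx + Grx - Lpath - Lother (all in dB)."""
    p_tx_dbm = 10 * math.log10(p_tx_w * 1e3)
    free_space_loss_db = 20 * math.log10(4 * math.pi * range_m / wavelength_m)
    return p_tx_dbm + gain_tx_db + gain_rx_db - free_space_loss_db - losses_db

lam = 1064e-9                                   # assumed laser wavelength
prx = received_power_dbm(
    p_tx_w=1.0,
    gain_tx_db=aperture_gain_db(0.1, lam),      # 10 cm transmit aperture (assumed)
    gain_rx_db=aperture_gain_db(1.0, lam),      # 1 m ground receiver (assumed)
    losses_db=10.0,                             # optics, pointing, atmosphere (assumed)
    wavelength_m=lam,
    range_m=4.0e11)                             # arbitrary deep-space range
print(f"received power: {prx:.1f} dBm")
```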

  13. CancellationTools: All-in-one software for administration and analysis of cancellation tasks.

    Science.gov (United States)

    Dalmaijer, Edwin S; Van der Stigchel, Stefan; Nijboer, Tanja C W; Cornelissen, Tim H W; Husain, Masud

    2015-12-01

    In a cancellation task, a participant is required to search for and cross out ("cancel") targets, which are usually embedded among distractor stimuli. The number of cancelled targets and their location can be used to diagnose the neglect syndrome after stroke. In addition, the organization of search provides a potentially useful way to measure executive control over multitarget search. Although many useful cancellation measures have been introduced, most fail to make their way into research studies and clinical practice due to the practical difficulty of acquiring such parameters from traditional pen-and-paper measures. Here we present new, open-source software that is freely available to all. It allows researchers and clinicians to flexibly administer computerized cancellation tasks using stimuli of their choice, and to directly analyze the data in a convenient manner. The automated analysis suite provides output that includes almost all of the currently existing measures, as well as several new ones introduced here. All tasks can be performed using either a computer mouse or a touchscreen as an input device, and an online version of the task runtime is available for tablet devices. A summary of the results is produced in a single A4-sized PDF document, including high quality data visualizations. For research purposes, batch analysis of large datasets is possible. In sum, CancellationTools allows users to employ a flexible, computerized cancellation task, which provides extensive benefits and ease of use. PMID:25381020

  14. CancellationTools: All-in-one software for administration and analysis of cancellation tasks.

    Science.gov (United States)

    Dalmaijer, Edwin S; Van der Stigchel, Stefan; Nijboer, Tanja C W; Cornelissen, Tim H W; Husain, Masud

    2015-12-01

    In a cancellation task, a participant is required to search for and cross out ("cancel") targets, which are usually embedded among distractor stimuli. The number of cancelled targets and their location can be used to diagnose the neglect syndrome after stroke. In addition, the organization of search provides a potentially useful way to measure executive control over multitarget search. Although many useful cancellation measures have been introduced, most fail to make their way into research studies and clinical practice due to the practical difficulty of acquiring such parameters from traditional pen-and-paper measures. Here we present new, open-source software that is freely available to all. It allows researchers and clinicians to flexibly administer computerized cancellation tasks using stimuli of their choice, and to directly analyze the data in a convenient manner. The automated analysis suite provides output that includes almost all of the currently existing measures, as well as several new ones introduced here. All tasks can be performed using either a computer mouse or a touchscreen as an input device, and an online version of the task runtime is available for tablet devices. A summary of the results is produced in a single A4-sized PDF document, including high quality data visualizations. For research purposes, batch analysis of large datasets is possible. In sum, CancellationTools allows users to employ a flexible, computerized cancellation task, which provides extensive benefits and ease of use.

  15. Detection and Quantification of Nitrogen Compounds in Martian Solid Samples by the Sample Analysis at Mars (SAM) Instrument Suite

    Science.gov (United States)

    Stern, Jennifer C.; Navarro-Gonzalez, Rafael; Freissinet, Caroline; McKay, Christopher P.; Archer, Paul Douglas; Buch, Arnaud; Eigenbrode, Jennifer L.; Franz, Heather; Glavin, Daniel Patrick; Ming, Douglas W.; Steele, Andrew; Szopa, Cyril; Wray, James J.; Conrad, Pamela Gales; Mahaffy, Paul R.

    2013-01-01

    The Sample Analysis at Mars (SAM) instrument suite on the Mars Science Laboratory (MSL) Curiosity Rover detected both reduced and oxidized nitrogen-bearing compounds during the pyrolysis of surface materials from three sites at Gale Crater. Preliminary detections of nitrogen species include NO, HCN, ClCN, CH3CN, and TFMA (trifluoro-N-methyl-acetamide). On Earth, nitrogen is a crucial bio-element, and nitrogen availability controls productivity in many environments. Nitrogen has also recently been detected in the form of CN in inclusions in the Martian meteorite Tissint, and isotopically heavy nitrogen (delta N-15 approx. +100 per mil) has been measured during stepped combustion experiments in several SNC meteorites. The detection of nitrogen-bearing compounds in Martian regolith would have important implications for the habitability of ancient Mars. However, confirmation of indigenous Martian nitrogen-bearing compounds will require ruling out their formation from the terrestrial derivatization reagents (e.g. N-methyl-N-tert-butyldimethylsilyl-trifluoroacetamide, MTBSTFA, and dimethylformamide, DMF) carried for SAM's wet chemistry experiment that contribute to the SAM background. The nitrogen species we detect in the SAM solid sample analyses can also be produced during laboratory pyrolysis experiments where these reagents are heated in the presence of perchlorate, a compound that has also been identified by SAM in Mars solid samples. However, this does not preclude a Martian origin for some of these compounds, which are present in nanomolar concentrations in SAM evolved gas analyses. Analysis of SAM data and laboratory breadboard tests are underway to determine whether nitrogen species are present at higher concentrations than can be accounted for by maximum estimates of nitrogen contribution from MTBSTFA and DMF. In addition, methods are currently being developed to use GC Column 6 (functionally similar to a commercial Q-Bond column) to separate and identify

  16. Music Education Suites

    Science.gov (United States)

    Kemp, Wayne

    2009-01-01

    This publication describes options for designing and equipping middle and high school music education suites, and suggests ways of gaining community support for including full service music suites in new and renovated school facilities. In addition to basic music suites, and practice rooms, other options detailed include: (1) small ensemble…

  17. The Application and Extension of Backward Software Analysis

    CERN Document Server

    Perisic, Aleksandar

    2010-01-01

    The backward software analysis is a method that emanates from executing a program backwards - instead of taking input data and following the execution path, we start from the output data and, by executing the program backwards command by command, analyze the data that could lead to the current output. The changed perspective forces a developer to think in a new way about the program. It can be applied as a thorough procedure or as a casual method. With this method, we have many advantages in testing, algorithm and system analysis. For example, in testing the advantage is obvious if the set of output data is smaller than the set of possible inputs. For some programs or algorithms, we know the output data more precisely, so this backward analysis can help in reducing the number of test cases or even in strict verification of an algorithm. The difficulty lies in the fact that we need types of data that no programming language currently supports, so we need additional effort to understand how this method works, or what effort we need to ...
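
    A toy example (not taken from the paper) of this change of perspective: instead of running a two-statement program forward on candidate inputs, each statement is inverted so that the input consistent with an observed output is reconstructed.

```python
# Toy illustration of backward analysis: invert each statement, last to first,
# to recover the inputs that could have produced an observed output.
def forward(x):
    y = x * 2        # statement 1
    z = y + 3        # statement 2
    return z

def backward(z_observed):
    y = z_observed - 3   # invert statement 2
    x = y / 2            # invert statement 1
    return x

assert forward(backward(11)) == 11   # only x = 4 can yield the output 11
```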

  18. Software for neutron activation analysis at reactor IBR-2, FLNP, JINR

    CERN Document Server

    Zlokazov, V B

    2004-01-01

    A Delphi program suite, developed for processing gamma-spectra of induced activity of nuclei obtained from the neutron activation measurements at the reactor IBR-2, FLNP, JINR, is reported. This suite contains components intended for carrying out all the operations of the analysis cycle, starting with a data acquisition program for gamma-spectrometers, Gamma (written in C++ Builder), and including Delphi programs for the subsequent steps of the analysis. (6 refs).

  19. The PROOF benchmark suite measuring PROOF performance

    Science.gov (United States)

    Ryu, S.; Ganis, G.

    2012-06-01

    The PROOF benchmark suite is a new utility suite of PROOF to measure performance and scalability. The primary goal of the benchmark suite is to determine optimal configuration parameters for a set of machines to be used as PROOF cluster. The suite measures the performance of the cluster for a set of standard tasks as a function of the number of effective processes. Cluster administrators can use the suite to measure the performance of the cluster and find optimal configuration parameters. PROOF developers can also utilize the suite to help them measure, identify problems and improve their software. In this paper, the new tool is explained in detail and use cases are presented to illustrate the new tool.

  20. A practical approach to handling the uncertainty analysis in gamma spectroscopy with the software packages Gamma Vision and Genie

    International Nuclear Information System (INIS)

    The national Swedish network of laboratories in emergency response and preparedness should provide fast and reliable measurements. That is why these results should be given with a measure of their quality, which is the measurement uncertainty, as has been stated in several international standards. Many gamma spectroscopy software packages contain advanced algorithms for calculation of the activity and its measurement uncertainty. They even include elements of quality assurance and quality control. Despite that, not all sources of uncertainty are always taken into account. The two most used analysis software packages in the Swedish network of laboratories in emergency response are Gamma Vision from Ortec and Genie (with and without APEX) from Canberra. The purpose of this paper is to present two groups of practical evaluations of uncertainty components for the same kind of gamma-spectroscopy analysis, one that would suit Gamma Vision users and another for Genie users, including the LabSOCS tool. The main idea is to profit as much as possible from the software capabilities and to semi-manually add the contribution of uncertainty sources that are not taken into account. The reports from both software packages are modified so as to reflect the contribution of all sources of uncertainty in the reported relative combined uncertainty. The examples of gamma spectroscopy analysis are for samples of the same matrix and the different geometries foreseen in the context of emergency response by the Swedish emergency network. Together with the evaluation of the uncertainty components, a review of the uncertainty propagation and the assumptions made in each of the software packages is presented. (author)
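
    The "semi-manual" combination step described above amounts to adding the missing components in quadrature to the software-reported relative uncertainty. A minimal sketch follows, with purely illustrative component values that are not those used by the Swedish network.

```python
# Root-sum-square combination of the software-reported relative uncertainty
# with extra components it does not account for; all numbers are illustrative.
import math

def combined_relative_uncertainty(reported_rel, extra_rel_components):
    return math.sqrt(reported_rel ** 2 + sum(u ** 2 for u in extra_rel_components))

u_software = 0.04              # 4 % from the Gamma Vision / Genie report (example)
u_extra = [0.03, 0.02, 0.01]   # e.g. geometry, density correction, sampling (assumed)
print(f"combined relative uncertainty: {combined_relative_uncertainty(u_software, u_extra):.3f}")
```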

  1. Control software analysis, Part I Open-loop properties

    CERN Document Server

    Feron, Eric

    2008-01-01

    As the digital world enters further into everyday life, questions are raised about the increasing challenges brought by the interaction of real-time software with physical devices. Many accidents and incidents encountered in areas as diverse as medical systems, transportation systems or weapon systems are ultimately attributed to "software failures". Since real-time software that interacts with physical systems might as well be called control software, the long litany of accidents due to real-time software failures might be taken as an equally long list of opportunities for control systems engineering. In this paper, we are interested only in run-time errors in those pieces of software that are a direct implementation of control system specifications: For well-defined and well-understood control architectures such as those present in standard textbooks on digital control systems, the current state of theoretical computer science is well-equipped enough to address and analyze control algorithms. It appears tha...

  2. Software selection based on analysis and forecasting methods, practised in 1C

    Science.gov (United States)

    Vazhdaev, A. N.; Chernysheva, T. Y.; Lisacheva, E. I.

    2015-09-01

    The research focuses on the built-in mechanisms of the “1C: Enterprise 8” platform for data analysis and forecasting. It is important to evaluate and select proper software to develop effective strategies for customer relationship management in terms of sales, as well as for implementation and further maintenance of software. The research data allow creating new forecast models to schedule further software distribution.

  3. Software selection based on analysis and forecasting methods, practised in 1C

    OpenAIRE

    Vazhdaev, Andrey Nikolaevich; Chernysheva, Tatiana Yurievna; Lisacheva, E. I.

    2015-01-01

    The research focuses on the built-in mechanisms of the "1C: Enterprise 8" platform for data analysis and forecasting. It is important to evaluate and select proper software to develop effective strategies for customer relationship management in terms of sales, as well as for implementation and further maintenance of software. The research data allow creating new forecast models to schedule further software distribution.

  4. Spacelab user implementation assessment study. (Software requirements analysis). Volume 2: Technical report

    Science.gov (United States)

    1976-01-01

    The engineering analyses and evaluation studies conducted for the Software Requirements Analysis are discussed. Included are the development of the study data base, synthesis of implementation approaches for software required by both mandatory onboard computer services and command/control functions, and identification and implementation of software for ground processing activities.

  5. Proceedings Fourth International Workshop on Testing, Analysis and Verification of Web Software

    CERN Document Server

    Salaün, Gwen; Hallé, Sylvain; 10.4204/EPTCS.35

    2010-01-01

    This volume contains the papers presented at the fourth international workshop on Testing, Analysis and Verification of Web Software, which was associated with the 25th IEEE/ACM International Conference on Automated Software Engineering (ASE 2010). The collection of papers includes research on formal specification, model-checking, testing, and debugging of Web software.

  6. RVA. 3-D Visualization and Analysis Software to Support Management of Oil and Gas Resources

    Energy Technology Data Exchange (ETDEWEB)

    Keefer, Donald A. [Univ. of Illinois, Champaign, IL (United States); Shaffer, Eric G. [Univ. of Illinois, Champaign, IL (United States); Storsved, Brynne [Univ. of Illinois, Champaign, IL (United States); Vanmoer, Mark [Univ. of Illinois, Champaign, IL (United States); Angrave, Lawrence [Univ. of Illinois, Champaign, IL (United States); Damico, James R. [Univ. of Illinois, Champaign, IL (United States); Grigsby, Nathan [Univ. of Illinois, Champaign, IL (United States)

    2015-12-01

    A free software application, RVA, has been developed as a plugin to the US DOE-funded ParaView visualization package, to provide support in the visualization and analysis of complex reservoirs being managed using multi-fluid EOR techniques. RVA, for Reservoir Visualization and Analysis, was developed as an open-source plugin to the 64 bit Windows version of ParaView 3.14. RVA was developed at the University of Illinois at Urbana-Champaign, with contributions from the Illinois State Geological Survey, Department of Computer Science and National Center for Supercomputing Applications. RVA was designed to utilize and enhance the state-of-the-art visualization capabilities within ParaView, readily allowing joint visualization of geologic framework and reservoir fluid simulation model results. Particular emphasis was placed on enabling visualization and analysis of simulation results highlighting multiple fluid phases, multiple properties for each fluid phase (including flow lines), multiple geologic models and multiple time steps. Additional advanced functionality was provided through the development of custom code to implement data mining capabilities. The built-in functionality of ParaView provides the capacity to process and visualize data sets ranging from small models on local desktop systems to extremely large models created and stored on remote supercomputers. The RVA plugin that we developed and the associated User Manual provide improved functionality through new software tools, and instruction in the use of ParaView-RVA, targeted to petroleum engineers and geologists in industry and research. The RVA web site (http://rva.cs.illinois.edu) provides an overview of functions, and the development web site (https://github.com/shaffer1/RVA) provides ready access to the source code, compiled binaries, user manual, and a suite of demonstration data sets. Key functionality has been included to support a range of reservoirs visualization and analysis needs, including

  7. First-order feasibility analysis of a space suit radiator concept based on estimation of water mass sublimation using Apollo mission data

    Science.gov (United States)

    Metts, Jonathan G.; Klaus, David M.

    2012-01-01

    Thermal control of a space suit during extravehicular activity (EVA) is typically accomplished by sublimating water to provide system cooling. Spacecraft, on the other hand, primarily rely on radiators to dissipate heat. Integrating a radiator into a space suit has been proposed as an alternative design that does not require mass consumption for heat transfer. While providing cooling without water loss offers potential benefits for EVA application, it is not currently practical to rely on a directional, fixed-emissivity radiator to maintain thermal equilibrium of a spacesuit where the radiator orientation, environmental temperature, and crew member metabolic heat load fluctuate unpredictably. One approach that might make this feasible, however, is the use of electrochromic devices that are capable of infrared emissivity modulation and can be actively controlled across the entire suit surface to regulate net heat flux for the system. Integrating these devices onto the irregular, compliant space suit material requires that they be fabricated on a flexible substrate, such as Kapton film. An initial assessment of whether or not this candidate technology presents a feasible design option was conducted by first characterizing the mass of water loss from sublimation that could theoretically be saved if an electrochromic suit radiator was employed for thermal control. This is particularly important for lunar surface exploration, where the expense of transporting water from Earth is excessive, but the technology is potentially beneficial for other space missions as well. In order to define a baseline for this analysis by comparison to actual data, historical documents from the Apollo missions were mined for comprehensive, detailed metabolic data from each lunar surface outing, and related data from NASA's more recent "Advanced Lunar Walkback" tests were also analyzed. This metabolic database was then used to validate estimates for sublimator water consumption during surface
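
    A back-of-the-envelope version of the trade studied in the paper can be written down directly from the sublimation enthalpy of water and the Stefan-Boltzmann law; the numbers below (metabolic load, EVA duration, temperatures, emissivity) are assumed round values, not the Apollo-derived figures used by the authors.

```python
# Rough comparison of sublimator water use versus required radiator area.
# All inputs are assumed example values for illustration only.
SIGMA = 5.670e-8            # Stefan-Boltzmann constant, W m^-2 K^-4
H_SUB_WATER = 2.83e6        # latent heat of sublimation of ice, J kg^-1

def sublimator_water_kg(heat_load_w, duration_s):
    """Mass of water sublimated to reject the given heat load."""
    return heat_load_w * duration_s / H_SUB_WATER

def radiator_area_m2(heat_load_w, emissivity, t_surface_k, t_sink_k):
    """Radiating area needed to reject the same load to a cold sink."""
    return heat_load_w / (emissivity * SIGMA * (t_surface_k ** 4 - t_sink_k ** 4))

q_w, eva_s = 350.0, 7 * 3600.0                  # 350 W over a 7-hour EVA (assumed)
print(f"water sublimated: {sublimator_water_kg(q_w, eva_s):.2f} kg")
print(f"radiator area at emissivity 0.9: {radiator_area_m2(q_w, 0.9, 300.0, 200.0):.2f} m^2")
```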

  8. Contracts in Offshore Software Development: An Empirical Analysis

    OpenAIRE

    Anandasivam Gopal; Konduru Sivaramakrishnan; Krishnan, M. S.; Tridas Mukhopadhyay

    2003-01-01

    We study the determinants of contract choice in offshore software development projects and examine how the choice of contract and other factors in the project affect project profits accruing to the software vendor. Using data collected on 93 offshore projects from a leading Indian software vendor, we provide evidence that specific vendor-, client-, and project-related characteristics such as requirement uncertainty, project team size, and resource shortage significantly explain contract choic...

  9. Improving systems software security through program analysis and instrumentation

    OpenAIRE

    Kuznetsov, Volodymyr

    2016-01-01

    Security and reliability bugs are prevalent in systems software. Systems code is often written in low-level languages like C/C++, which offer many benefits but also delegate memory management and type safety to programmers. This invites bugs that cause crashes or can be exploited by attackers to take control of the program. This thesis presents techniques to detect and fix security and reliability issues in systems software without burdening the software developers. First, we present code-po...

  10. How qualitative data analysis software may support the qualitative analysis process

    NARCIS (Netherlands)

    Peters, V.A.M.; Wester, F.P.J.

    2007-01-01

    The last decades have shown large progress in the elaboration of procedures for qualitative data analysis and in the development of computer programs to support this kind of analysis. We believe, however, that the link between methodology and computer software tools is too loose, especially for a no

  11. Improving Software Reliability Forecasting

    NARCIS (Netherlands)

    Burtsy, Bernard; Albeanu, Grigore; Boros, Dragos N.; Popentiu, Florin; Nicola, Victor

    1997-01-01

    This work investigates some methods for software reliability forecasting. A supermodel is presented as a suitable tool for prediction of reliability in software project development. Also, time series forecasting for cumulative interfailure time is proposed and illustrated.

  12. Thermal performance analysis of an antigravity suit

    Institute of Scientific and Technical Information of China (English)

    邱义芬; 李艳杰; 任兆生

    2009-01-01

    To study the thermal performance of an antigravity suit, a thermal system simulation model was built comprising the human body, the antigravity suit and the environment. The suit was divided into 15 segments, each segment having a different thermal resistance and gas permeability. The heat and mass transfer processes of the system were analyzed according to the suit characteristics. The model can simulate human body skin temperature, core temperature and sweat rate. The combined index of heat stress (CIHS) was calculated from these parameters to evaluate the human body heat load. Experiments were designed to validate the model calculation results; the differences between the experimental and calculated results are small. Finally, the influences of the suit thermal resistance and moisture permeability index on the body heat load were analyzed with the model built above.

  13. Pattern recognition software and techniques for biological image analysis.

    Directory of Open Access Journals (Sweden)

    Lior Shamir

    The increasing prevalence of automated image acquisition systems is enabling new types of microscopy experiments that generate large image datasets. However, there is a perceived lack of robust image analysis systems required to process these diverse datasets. Most automated image analysis systems are tailored for specific types of microscopy, contrast methods, probes, and even cell types. This imposes significant constraints on experimental design, limiting their application to the narrow set of imaging methods for which they were designed. One of the approaches to address these limitations is pattern recognition, which was originally developed for remote sensing, and is increasingly being applied to the biology domain. This approach relies on training a computer to recognize patterns in images rather than developing algorithms or tuning parameters for specific image processing tasks. The generality of this approach promises to enable data mining in extensive image repositories, and provide objective and quantitative imaging assays for routine use. Here, we provide a brief overview of the technologies behind pattern recognition and its use in computer vision for biological and biomedical imaging. We list available software tools that can be used by biologists and suggest practical experimental considerations to make the best use of pattern recognition techniques for imaging assays.

  14. HistFitter software framework for statistical data analysis

    Science.gov (United States)

    Baak, M.; Besjes, G. J.; Côté, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-04-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface.

  15. Scientific Visualization Using the Flow Analysis Software Toolkit (FAST)

    Science.gov (United States)

    Bancroft, Gordon V.; Kelaita, Paul G.; Mccabe, R. Kevin; Merritt, Fergus J.; Plessel, Todd C.; Sandstrom, Timothy A.; West, John T.

    1993-01-01

    Over the past few years the Flow Analysis Software Toolkit (FAST) has matured into a useful tool for visualizing and analyzing scientific data on high-performance graphics workstations. Originally designed for visualizing the results of fluid dynamics research, FAST has demonstrated its flexibility by being used in several other areas of scientific research. These research areas include earth and space sciences, acid rain and ozone modelling, and automotive design, just to name a few. This paper describes the current status of FAST, including the basic concepts, architecture, existing functionality and features, and some of the known applications for which FAST is being used. A few of the applications, by both NASA and non-NASA agencies, are outlined in more detail. Described in these outlines are the goals of each visualization project, the techniques or 'tricks' used to produce the desired results, and custom modifications to FAST, if any, done to further enhance the analysis. Some of the future directions for FAST are also described.

  16. CAVASS: a computer-assisted visualization and analysis software system.

    Science.gov (United States)

    Grevera, George; Udupa, Jayaram; Odhner, Dewey; Zhuge, Ying; Souza, Andre; Iwanaga, Tad; Mishra, Shipra

    2007-11-01

    The Medical Image Processing Group at the University of Pennsylvania has been developing (and distributing with source code) medical image analysis and visualization software systems for a long period of time. Our most recent system, 3DVIEWNIX, was first released in 1993. Since that time, a number of significant advancements have taken place with regard to computer platforms and operating systems, networking capability, the rise of parallel processing standards, and the development of open-source toolkits. The development of CAVASS by our group is the next generation of 3DVIEWNIX. CAVASS will be freely available and open source, and it is integrated with toolkits such as Insight Toolkit and Visualization Toolkit. CAVASS runs on Windows, Unix, Linux, and Mac but shares a single code base. Rather than requiring expensive multiprocessor systems, it seamlessly provides for parallel processing via inexpensive clusters of work stations for more time-consuming algorithms. Most importantly, CAVASS is directed at the visualization, processing, and analysis of 3-dimensional and higher-dimensional medical imagery, so support for digital imaging and communication in medicine data and the efficient implementation of algorithms is given paramount importance. PMID:17786517

  17. Research and Development of Statistical Analysis Software System of Maize Seedling Experiment

    Directory of Open Access Journals (Sweden)

    Hui Cao

    2014-03-01

    In this study, software engineering methods were used to develop a software system for the statistics and analysis of maize seedling experiments. During development, a B/S (browser/server) structure software design method was used and a set of statistical indicators for maize seedling evaluation was established. The experimental results indicated that this software system performs quality statistics and analysis for maize seedlings very well. The development of this software system explored a new method for the screening of maize seedlings.

  18. TweezPal - Optical tweezers analysis and calibration software

    Science.gov (United States)

    Osterman, Natan

    2010-11-01

    Optical tweezers, a powerful tool for optical trapping, micromanipulation and force transduction, have in recent years become a standard technique commonly used in many research laboratories and university courses. Knowledge about the optical force acting on a trapped object can be gained only after a calibration procedure which has to be performed (by an expert) for each type of trapped object. In this paper we present TweezPal, a user-friendly, standalone Windows software tool for optical tweezers analysis and calibration. Using TweezPal, the procedure can be performed in a matter of minutes even by non-expert users. The calibration is based on the Brownian motion of a particle trapped in a stationary optical trap, which is being monitored using video or photodiode detection. The particle trajectory is imported into the software which instantly calculates position histogram, trapping potential, stiffness and anisotropy.
    Program summary
    Program title: TweezPal
    Catalogue identifier: AEGR_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGR_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 44 891
    No. of bytes in distributed program, including test data, etc.: 792 653
    Distribution format: tar.gz
    Programming language: Borland Delphi
    Computer: Any PC running Microsoft Windows
    Operating system: Windows 95, 98, 2000, XP, Vista, 7
    RAM: 12 Mbytes
    Classification: 3, 4.14, 18, 23
    Nature of problem: Quick, robust and user-friendly calibration and analysis of optical tweezers. The optical trap is calibrated from the trajectory of a trapped particle undergoing Brownian motion in a stationary optical trap (input data) using two methods.
    Solution method: Elimination of the experimental drift in position data. Direct calculation of the trap stiffness from the positional
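
    One common way to obtain the trap stiffness from such a Brownian trajectory is the equipartition method, k = kB*T / var(x). The short sketch below demonstrates the idea on a synthetic position trace; it is only an illustration of the principle, not TweezPal code.

```python
# Equipartition estimate of trap stiffness from a position trace (illustrative).
import numpy as np

KB = 1.380649e-23          # Boltzmann constant, J/K

def trap_stiffness(position_m, temperature_k=298.0):
    x = np.asarray(position_m)
    x = x - x.mean()       # remove offset/drift before taking the variance
    return KB * temperature_k / x.var()

rng = np.random.default_rng(0)
k_true = 1e-5                                                   # N/m, assumed
x_sim = rng.normal(0.0, np.sqrt(KB * 298.0 / k_true), size=100_000)
print(f"recovered stiffness: {trap_stiffness(x_sim):.2e} N/m")  # approx 1e-5
```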

  19. Swallowing quantitative analysis software

    Directory of Open Access Journals (Sweden)

    André Augusto Spadotto

    2008-02-01

    OBJECTIVE: The present paper is aimed at introducing a software tool to allow a detailed analysis of the swallowing dynamics. MATERIALS AND METHODS: The sample included ten (six male and four female) stroke patients, with a mean age of 57.6 years. Swallowing videofluoroscopy was performed and images were digitized for posterior analysis of the pharyngeal transit time with the aid of a chronometer and the software. RESULTS: Differences were observed in the average pharyngeal swallowing transit time as a result of measurements with the chronometer and the software. CONCLUSION: This software is a useful tool for the analysis of parameters such as swallowing time and speed, allowing a better understanding of the swallowing dynamics, both in the clinical approach of patients with oropharyngeal dysphagia and for scientific research purposes.

  20. A Flexible Method for Producing F.E.M. Analysis of Bone Using Open-Source Software

    Science.gov (United States)

    Boppana, Abhishektha; Sefcik, Ryan; Meyers, Jerry G.; Lewandowski, Beth E.

    2016-01-01

    This project, performed in support of the NASA GRC Space Academy summer program, sought to develop an open-source workflow methodology that segmented medical image data, created a 3D model from the segmented data, and prepared the model for finite-element analysis. In an initial step, a technological survey evaluated the performance of various existing open-source software packages that claim to perform these tasks. However, the survey concluded that no single package exhibited the wide array of functionality required for the potential NASA application in the area of bone, muscle and biofluidic studies. As a result, a series of Python scripts provided the bridging mechanism to address the shortcomings of the available open-source tools. The VTK library provided the quickest and most effective means of segmenting regions of interest from the medical images; it allowed the export of a 3D model by using the marching cubes algorithm to build a surface mesh. Developing the model domain from this extracted information required the surface mesh to be processed in the open-source packages Blender and Gmsh. The Preview program of the FEBio suite proved sufficient for volume-filling the model with an unstructured mesh and preparing boundary specifications for finite element analysis. To fully enable FEM modeling, an in-house Python script assigned material properties on an element-by-element basis by performing a weighted interpolation of the voxel intensities of the parent medical image against published relations between image intensity and material properties, such as ash density. A graphical user interface combined the Python scripts and other software into a user-friendly interface. The work using Python scripts provides a potential alternative to expensive commercial software and inadequate, limited open-source freeware programs for the creation of 3D computational models. More work
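
    The element-by-element material assignment described above boils down to interpolating each element's mean voxel intensity on a published intensity-to-density curve, then converting density to an elastic modulus. A hedged sketch of that step, with made-up calibration points and power-law coefficients standing in for the literature values the authors refer to:

```python
import numpy as np

# Hypothetical calibration: image intensity vs. ash density (g/cm^3)
calib_intensity = np.array([0.0, 400.0, 800.0, 1200.0])
calib_ash_density = np.array([0.05, 0.45, 0.85, 1.25])

# Mean voxel intensity sampled inside each finite element (hypothetical values)
element_intensity = np.array([150.0, 620.0, 990.0])

# Weighted (linear) interpolation onto the calibration curve, one value per element
element_density = np.interp(element_intensity, calib_intensity, calib_ash_density)

# Density-to-modulus power law E = a * rho**b, with illustrative coefficients
a, b = 10500.0, 2.29   # MPa
element_E = a * element_density ** b
for rho, E in zip(element_density, element_E):
    print(f"ash density {rho:.2f} g/cm^3 -> E ~ {E:.0f} MPa")
```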

  1. Analysis of Software Delivery Process Shortcomings and Architectural Pitfalls

    OpenAIRE

    Patwardhan, Amol

    2016-01-01

    This paper highlights the common pitfalls of overcomplicating the software architecture, development and delivery process by examining two enterprise level web application products built using Microsoft.Net framework. The aim of this paper is to identify, discuss and analyze architectural, development and deployment issues and learn lessons using real world examples from the chosen software products as case studies.

  2. Multi-criteria decision analysis methods and software

    CERN Document Server

    Ishizaka, Alessio

    2013-01-01

    This book presents an introduction to MCDA followed by more detailed chapters about each of the leading methods used in this field. Comparison of methods and software is also featured to enable readers to choose the most appropriate method needed in their research. Worked examples as well as the software featured in the book are available on an accompanying website.

  3. IFDOTMETER : A New Software Application for Automated Immunofluorescence Analysis

    NARCIS (Netherlands)

    Rodriguez-Arribas, Mario; Pizarro-Estrella, Elisa; Gomez-Sanchez, Ruben; Yakhine-Diop, S. M. S.; Gragera-Hidalgo, Antonio; Cristo, Alejandro; Bravo-San Pedro, Jose M.; Gonzalez-Polo, Rosa A.; Fuentes, Jose M.

    2016-01-01

    Most laboratories interested in autophagy use different imaging software for managing and analyzing heterogeneous parameters in immunofluorescence experiments (e.g., LC3-puncta quantification and determination of the number and size of lysosomes). One solution would be software that works on a user'

  4. A pattern framework for software quality assessment and tradeoff analysis

    NARCIS (Netherlands)

    Folmer, Eelke; Boscht, Jan

    2007-01-01

    The earliest design decisions often have a significant impact on software quality and are the most costly to revoke. One of the challenges in architecture design is to reduce the frequency of retrofit problems in software designs; not being able to improve the quality of a system cost effectively, a

  5. Prototype Software for Automated Structural Analysis of Systems

    DEFF Research Database (Denmark)

    Jørgensen, A.; Izadi-Zamanabadi, Roozbeh; Kristensen, M.

    2004-01-01

    In this paper we present a prototype software tool that is developed to analyse the structural model of automated systems in order to identify redundant information that is hence utilized for Fault detection and Isolation (FDI) purposes. The dedicated algorithms in this software tool use a tri...

  6. Research on Application of Enhanced Neural Networks in Software Risk Analysis

    Institute of Scientific and Technical Information of China (English)

    Zhenbang Rong; Juhua Chen; Mei Liu; Yong Hu

    2006-01-01

    This paper puts forward a risk analysis model for software projects using enhanced neural networks. The data for analysis are acquired through questionnaires from real software projects. To address the multicollinearity in software risks, the method of principal components analysis is adopted in the model to enhance network stability. To address the uncertainty of the neural network structure and of the initial weights, genetic algorithms are employed. The experimental result reveals that the precision of software risk analysis can be improved by using the enhanced neural networks model.
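
    The core preprocessing idea, decorrelating the questionnaire variables with principal components before feeding them to a neural network, can be sketched with scikit-learn. The data below are synthetic stand-ins for questionnaire risk factors, and the genetic-algorithm weight initialization used in the paper is not reproduced:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.normal(size=(120, 10))                            # stand-in questionnaire scores
X[:, 5:] = X[:, :5] + 0.1 * rng.normal(size=(120, 5))     # deliberately collinear columns
y = (X[:, 0] + X[:, 3] > 0).astype(int)                   # stand-in high/low risk outcome

model = make_pipeline(
    StandardScaler(),
    PCA(n_components=5),                                  # remove multicollinearity
    MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0),
)
model.fit(X, y)
print("training accuracy:", model.score(X, y))
```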

  7. Space Suit Joint Torque Testing

    Science.gov (United States)

    Valish, Dana J.

    2011-01-01

    In 2009 and early 2010, a test was performed to quantify the torque required to manipulate joints in several existing operational and prototype space suits in an effort to develop joint torque requirements appropriate for a new Constellation Program space suit system. The same test method was levied on the Constellation space suit contractors to verify that their suit design meets the requirements. However, because the original test was set up and conducted by a single test operator there was some question as to whether this method was repeatable enough to be considered a standard verification method for Constellation or other future space suits. In order to validate the method itself, a representative subset of the previous test was repeated, using the same information that would be available to space suit contractors, but set up and conducted by someone not familiar with the previous test. The resultant data was compared using graphical and statistical analysis and a variance in torque values for some of the tested joints was apparent. Potential variables that could have affected the data were identified and re-testing was conducted in an attempt to eliminate these variables. The results of the retest will be used to determine if further testing and modification is necessary before the method can be validated.

  8. Performance Analysis of Software Effort Estimation Models Using Neural Networks

    Directory of Open Access Journals (Sweden)

    P.Latha

    2013-08-01

    Full Text Available Software effort estimation involves estimating the effort required to develop software. Cost and schedule overruns occur in software development due to wrong estimates made during the initial stage of development, so proper estimation is essential for successful completion of a software project. Many estimation techniques are available, among which neural-network-based techniques play a prominent role. The back-propagation network is the most widely used architecture; the Elman neural network, a recurrent network, can be used on par with it. For a good predictor system the difference between estimated effort and actual effort should be as low as possible. Data from historical NASA projects are used for training and testing. The experimental results confirm that the back-propagation algorithm is more efficient than the Elman neural network.
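
    A common way to compare such estimators is the mean magnitude of relative error (MMRE) between predicted and actual effort. A minimal scikit-learn sketch of a back-propagation regressor evaluated with MMRE, using synthetic size/effort pairs rather than the NASA dataset itself:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
kloc = rng.uniform(5, 200, 60)                                     # stand-in project sizes
effort = 2.8 * kloc ** 1.05 * rng.lognormal(0.0, 0.1, kloc.size)   # stand-in actual effort

X = np.log(kloc).reshape(-1, 1)
net = MLPRegressor(hidden_layer_sizes=(6,), max_iter=5000, random_state=0)
net.fit(X, np.log(effort))

predicted = np.exp(net.predict(X))
mmre = np.mean(np.abs(predicted - effort) / effort)   # mean magnitude of relative error
print(f"MMRE = {mmre:.3f}")
```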

  9. Software safety analysis on the model specified by NuSCR and SMV input language at requirements phase of software development life cycle using SMV

    International Nuclear Information System (INIS)

    Safety-critical software process is composed of a development process, a verification and validation (V and V) process and a safety analysis process. The safety analysis process has often been treated as an additional process and is not found in a conventional software process. But software safety analysis (SSA) is required if software is applied to a safety system, and the SSA shall be performed independently for the safety software throughout the software development life cycle (SDLC). Of all the phases in software development, requirements engineering is generally considered to play the most critical role in determining the overall software quality. NASA data demonstrate that nearly 75% of failures found in operational software were caused by errors in the requirements. The verification process in the requirements phase checks the correctness of the software requirements specification, and the safety analysis process analyzes the safety-related properties in detail. In this paper, a method for safety analysis at the requirements phase of the software development life cycle using the symbolic model verifier (SMV) is proposed. Hazards are discovered by hazard analysis and, in order to use SMV for the safety analysis, the safety-related properties are expressed in computation tree logic (CTL)
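
    Safety properties of this kind are usually written as CTL invariants or response formulas before being handed to the model checker. Two generic, hedged examples (the variable names are illustrative and not taken from the paper):

```latex
% Response: whenever the monitored process variable exceeds its trip setpoint,
% every execution path eventually asserts the trip signal.
\mathrm{AG}\left(\mathit{pv\_high} \rightarrow \mathrm{AF}\,\mathit{trip}\right)

% Invariant: the trip signal is never absent while the hazard condition persists.
\mathrm{AG}\,\neg\left(\mathit{hazard\_present} \wedge \neg\mathit{trip}\right)
```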

  10. The MEME Suite.

    Science.gov (United States)

    Bailey, Timothy L; Johnson, James; Grant, Charles E; Noble, William S

    2015-07-01

    The MEME Suite is a powerful, integrated set of web-based tools for studying sequence motifs in proteins, DNA and RNA. Such motifs encode many biological functions, and their detection and characterization is important in the study of molecular interactions in the cell, including the regulation of gene expression. Since the previous description of the MEME Suite in the 2009 Nucleic Acids Research Web Server Issue, we have added six new tools. Here we describe the capabilities of all the tools within the suite, give advice on their best use and provide several case studies to illustrate how to combine the results of various MEME Suite tools for successful motif-based analyses. The MEME Suite is freely available for academic use at http://meme-suite.org, and source code is also available for download and local installation. PMID:25953851

  11. Software Aging Analysis of Web Server Using Neural Networks

    Directory of Open Access Journals (Sweden)

    G.Sumathi

    2012-06-01

    Full Text Available Software aging is a phenomenon that refers to progressive performance degradation or transient failures or even crashes in long-running software systems such as web servers. It mainly occurs due to the deterioration of operating system resources, fragmentation and numerical error accumulation. A primitive method to fight against software aging is software rejuvenation. Software rejuvenation is a proactive fault management technique aimed at cleaning up the system internal state to prevent the occurrence of more severe crash failures in the future. It involves occasionally stopping the running software, cleaning its internal state and restarting it. An optimized schedule for performing the software rejuvenation has to be derived in advance because a long-running application cannot be put down now and then, as this may lead to wasted cost. This paper proposes a method to derive an accurate and optimized schedule for rejuvenation of a web server (Apache) by using a Radial Basis Function (RBF) based Feed Forward Neural Network, a variant of Artificial Neural Networks (ANN). Aging indicators are obtained through an experimental setup involving the Apache web server and clients, and act as input to the neural network model. This method is better than existing ones because the use of RBF leads to better accuracy and faster convergence.

  13. Instant Spring Tool Suite

    CERN Document Server

    Chiang, Geoff

    2013-01-01

    Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks. A tutorial guide that walks you through how to use the features of Spring Tool Suite using well defined sections for the different parts of Spring.Instant Spring Tool Suite is for novice to intermediate Java developers looking to get a head-start in enterprise application development using Spring Tool Suite and the Spring framework. If you are looking for a guide for effective application development using Spring Tool Suite, then this book is for you.

  14. The PRISM Benchmark Suite

    OpenAIRE

    Kwiatkowska, Marta; Norman, Gethin; Parker, David

    2012-01-01

    We present the PRISM benchmark suite: a collection of probabilistic models and property specifications, designed to facilitate testing, benchmarking and comparisons of probabilistic verification tools and implementations.

  15. Hydraulic network analysis with the software package NESEI

    International Nuclear Information System (INIS)

    The software package NESEI allows the steady-state and time-step-history hydraulic analysis of complex water supply networks. It runs on a PC under MS-DOS, WINDOWS or WIN-OS/2. A menu-guided user interface with a context-specific help system means that DOS expertise is not required. Contrary to the topological restrictions imposed by the Hardy-Cross method, our hydraulic program can deal with networks of any topology. Our mathematical algorithm (successive over-relaxation method) uses a numerical table representation of the Moody diagram instead of some specific analytical formulation and is therefore not limited to a certain flow regime. Fittings such as valves, throttles, backflow-preventing flaps, forward and backward pressure regulators and flow control valves may be simulated. Memory requirement and computer time increase approximately linearly with the number of nodes. A maximum of some 3000 nodes may be simulated with 640 kB RAM and over 16000 nodes with 8 MB. Operating modes for steady-state computation, time-step operation histories with reservoir accounting and autodimensioning of pipe diameters are possible. For the digitization of networks, data management and graphical presentation, commercial products like GIS-ARC/INFO, GISCAD and GISVIEW or AUTOCAD may be used. Data exchange with our package is performed via text and DBASE files. A rapid generation of instructive graphical result presentations is possible via menu selections and permits a high productivity to be achieved in the hydraulic analysis and optimisation of supply systems. Some auxiliary utility programs also allow small stand-alone installations without GIS or CAD support. A water pressure logger which may be connected at hydrants for extended periods was developed to allow an economic calibration of a hydraulic network model by field measurements. It is microprocessor-controlled, programmable via a menu form by PC and allows selective pre-event recording according to selectable triggering
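
    The successive over-relaxation (SOR) scheme mentioned above can be illustrated on a small linear system; the network-specific assembly of node equations and the tabulated Moody diagram are not reproduced here, so the matrix below is just a diagonally dominant stand-in:

```python
import numpy as np

def sor(A, b, omega=1.5, tol=1e-10, max_iter=10_000):
    """Successive over-relaxation for A x = b (A needs a nonzero diagonal)."""
    x = np.zeros_like(b, dtype=float)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(len(b)):
            sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
            x[i] = (1.0 - omega) * x_old[i] + omega * (b[i] - sigma) / A[i, i]
        if np.linalg.norm(x - x_old, ord=np.inf) < tol:
            break
    return x

A = np.array([[4.0, -1.0, 0.0],
              [-1.0, 4.0, -1.0],
              [0.0, -1.0, 4.0]])
b = np.array([15.0, 10.0, 10.0])
print(sor(A, b))                     # agrees with np.linalg.solve(A, b)
```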

  16. Hydra—The National Earthquake Information Center’s 24/7 seismic monitoring, analysis, catalog production, quality analysis, and special studies tool suite

    Science.gov (United States)

    Patton, John M.; Guy, Michelle R.; Benz, Harley M.; Buland, Raymond P.; Erickson, Brian K.; Kragness, David S.

    2016-08-18

    This report provides an overview of the capabilities and design of Hydra, the global seismic monitoring and analysis system used for earthquake response and catalog production at the U.S. Geological Survey National Earthquake Information Center (NEIC). Hydra supports the NEIC’s worldwide earthquake monitoring mission in areas such as seismic event detection, seismic data insertion and storage, seismic data processing and analysis, and seismic data output. The Hydra system automatically identifies seismic phase arrival times and detects the occurrence of earthquakes in near-real time. The system integrates and inserts parametric and waveform seismic data into discrete events in a database for analysis. Hydra computes seismic event parameters, including locations, multiple magnitudes, moment tensors, and depth estimates. Hydra supports the NEIC’s 24/7 analyst staff with a suite of seismic analysis graphical user interfaces. In addition to the NEIC’s monitoring needs, the system supports the processing of aftershock and temporary deployment data, and supports the NEIC’s quality assurance procedures. The Hydra system continues to be developed to expand its seismic analysis and monitoring capabilities.

  17. DEVELOPMENT OF EDUCATIONAL SOFTWARE FOR STRESS ANALYSIS OF AN AIRCRAFT WING

    Directory of Open Access Journals (Sweden)

    TAZKERA SADEQ

    2012-06-01

    Full Text Available Stress analysis software with a MATLAB graphical user interface (GUI) has been developed. The software can be used to estimate the load on a wing and to compute the stresses at any point along the span of the wing of a given aircraft. The generalized formulation allows stress analysis to be performed even for a multispar (multicell) wing. The software is expected to be a useful tool for an effective teaching-learning process in courses on aircraft structures and aircraft structural design.
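
    The core of such a tool is classical beam bending: integrate the spanwise load to get shear and bending moment, then evaluate the bending stress sigma = M*c/I at a station. A hedged sketch assuming an elliptical lift distribution and made-up section properties (the multispar/multicell shear-flow formulation in the software is more involved and is not reproduced):

```python
import numpy as np

half_span = 15.0                      # m, assumed
total_lift = 9.81 * 60_000.0          # N, assumed lift equal to aircraft weight
y = np.linspace(0.0, half_span, 200)  # spanwise stations, root to tip

# Elliptical lift per unit span on one wing half
w0 = 4.0 * (total_lift / 2.0) / (np.pi * half_span)
w = w0 * np.sqrt(1.0 - (y / half_span) ** 2)

# Shear and bending moment by integrating the load outboard of each station
shear = np.array([np.trapz(w[i:], y[i:]) for i in range(y.size)])
moment = np.array([np.trapz(shear[i:], y[i:]) for i in range(y.size)])

I = 2.0e-3    # m^4, assumed second moment of area of the root section
c = 0.15      # m, assumed distance from the neutral axis to the outer skin
sigma_root = moment[0] * c / I
print(f"root bending stress ~ {sigma_root / 1e6:.0f} MPa")
```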

  18. Opportunities and Challenges Applying Functional Data Analysis to the Study of Open Source Software Evolution

    OpenAIRE

    Stewart, Katherine J.; Darcy, David P.; Daniel, Sherae L.

    2006-01-01

    This paper explores the application of functional data analysis (FDA) as a means to study the dynamics of software evolution in the open source context. Several challenges in analyzing the data from software projects are discussed, an approach to overcoming those challenges is described, and preliminary results from the analysis of a sample of open source software (OSS) projects are provided. The results demonstrate the utility of FDA for uncovering and categorizing multiple distinct patterns...

  19. BEANS - a software package for distributed Big Data analysis

    CERN Document Server

    Hypki, Arkadiusz

    2016-01-01

    BEANS is a new web-based software tool, easy to install and maintain, for storing and analysing massive amounts of data in a distributed way. It provides a clear interface for querying, filtering, aggregating, and plotting data from an arbitrary number of datasets. Its main purpose is to simplify the process of storing, examining and finding new relations in so-called Big Data. The creation of BEANS is an answer to the growing need of the astronomical community for a versatile tool to store, analyse and compare complex astrophysical numerical simulations with observations (e.g. simulations of the Galaxy or star clusters with the Gaia archive). However, the software was built in a general form and is ready to use in any other research field or open-source software project.

  20. A Software Risk Analysis Model Using Bayesian Belief Network

    Institute of Scientific and Technical Information of China (English)

    Yong Hu; Juhua Chen; Mei Liu; Yang Yun; Junbiao Tang

    2006-01-01

    The uncertainty during the period of software project development often brings huge risks to contractors and clients. If we can find an effective method to predict the cost and quality of software projects based on facts like the project character and two-side cooperating capability at the beginning of the project, we can reduce the risk. A Bayesian Belief Network (BBN) is a good tool for analyzing uncertain consequences, but it is difficult to produce a precise network structure and conditional probability tables. In this paper, we built up the network structure by the Delphi method, then learned and updated the conditional probability tables and the nodes' confidence levels continuously according to the application cases, which gives the evaluation network learning abilities and lets it evaluate the software development risk of an organization more accurately. This paper also introduces the EM algorithm, which enhances the ability to handle hidden nodes arising from variant software projects.
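
    At its core a BBN combines prior beliefs about parent nodes with a conditional probability table (CPT) and marginalizes over the parents. A tiny pure-Python sketch with made-up probabilities (the Delphi-elicited structure and the EM learning step from the paper are not reproduced):

```python
# Hypothetical priors for two parent nodes
p_capability = {"high": 0.6, "low": 0.4}      # contractor capability
p_complexity = {"high": 0.5, "low": 0.5}      # project complexity

# Hypothetical CPT: P(project overrun | capability, complexity)
p_overrun = {
    ("high", "high"): 0.30,
    ("high", "low"):  0.10,
    ("low",  "high"): 0.70,
    ("low",  "low"):  0.40,
}

# Marginal probability of an overrun, summing over both parents
p = sum(
    p_capability[c] * p_complexity[x] * p_overrun[(c, x)]
    for c in p_capability
    for x in p_complexity
)
print(f"P(overrun) = {p:.3f}")                # 0.340 with these numbers
```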

  1. Trends in applied econometrics software development 1985-2008, an analysis of Journal of Applied Econometrics research articles, software reviews, data and code

    OpenAIRE

    Ooms, M.

    2008-01-01

    Trends in software development for applied econometrics emerge from an analysis of the research articles and software reviews of the Journal of Applied Econometrics, appearing since 1986. The data and code archive of the journal provides more specific information on software use for applied econometrics since 1995. GAUSS, Stata, MATLAB and Ox have been the most important software packages after 2001. I compare these higher-level programming languages and R in somewhat more detail. An increasing numbe...

  2. Automatic test suite evolution

    OpenAIRE

    Mirzaaghaei, Mehdi; Pezzè, Mauro

    2013-01-01

    Software testing is one of the most common approaches to verifying software systems. Despite the many automated techniques proposed in the literature, test cases are often generated manually. When a software system evolves during development and maintenance to accommodate requirement changes, bug fixes, or functionality extensions, test cases may become obsolete, and software developers need to evolve them to verify the new version of the software system. Due to time pressure and effort requir...

  3. A Comparative Analysis of Software Engineering with Knowledge Engineering

    OpenAIRE

    J. F. Vijay; C. Manoharan

    2010-01-01

    Problem statement: Software engineering is not only a technical discipline of its own. It is also a problem domain where technologies coming from other disciplines are relevant and can play an important role. One important example is knowledge engineering, a term that we use in the broad sense to encompass artificial intelligence, computational intelligence, knowledge bases, data mining and machine learning. We see a number of typical software development issues that can benefit from these di...

  4. Analysis of Whole Transcriptome Sequencing Data: Workflow and Software.

    Science.gov (United States)

    Yang, In Seok; Kim, Sangwoo

    2015-12-01

    RNA is a polymeric molecule implicated in various biological processes, such as the coding, decoding, regulation, and expression of genes. Numerous studies have examined RNA features using whole transcriptome sequencing (RNA-seq) approaches. RNA-seq is a powerful technique for characterizing and quantifying the transcriptome and accelerates the development of bioinformatics software. In this review, we introduce routine RNA-seq workflow together with related software, focusing particularly on transcriptome reconstruction and expression quantification. PMID:26865842

  5. Learning from Experience in Software Development: A Multilevel Analysis

    OpenAIRE

    Wai Fong Boh; Slaughter, Sandra A.; J. Alberto Espinosa

    2007-01-01

    This study examines whether individuals, groups, and organizational units learn from experience in software development and whether this learning improves productivity. Although prior research has found the existence of learning curves in manufacturing and service industries, it is not clear whether learning curves also apply to knowledge work like software development. We evaluate the relative productivity impacts from accumulating specialized experience in a system, diversified experience i...

  6. Analysis of Topology Poisoning Attacks in Software-Defined Networking

    OpenAIRE

    Thanh Bui, Tien

    2015-01-01

    Software-defined networking (SDN) is an emerging architecture with a great potential to foster the development of modern networks. By separating the control plane from the network devices and centralizing it at a software-based controller, SDN provides network-wide visibility and flexible programmability to network administrators. However, the security aspects of SDN are not yet fully understood. For example, while SDN is resistant to some topology poisoning attacks in which the attacker misl...

  7. Problem of Office Suite Training at the University

    OpenAIRE

    Natalia A. Nastashchuk; Svetlana S. Litvinova; Tatiana S. Moshkareva

    2013-01-01

    The paper considers the problem of office suite applications training, caused by the rapid change of versions, the variety of software developers, and the rapid development of software and hardware platforms. The content of office suite applications training, based on the system of office suite notions, their basic functionality, and standards of information technology development (OpenDocument Format Standard, ISO 26300-200X), is presented.

  8. Validation suite for MCNP

    Energy Technology Data Exchange (ETDEWEB)

    Mosteller, R. D. (Russell D.)

    2002-01-01

    Two validation suites, one for criticality and another for radiation shielding, have been defined and tested for the MCNP Monte Carlo code. All of the cases in the validation suites are based on experiments so that calculated and measured results can be compared in a meaningful way. The cases in the validation suites are described, and results from those cases are discussed. For several years, the distribution package for the MCNP Monte Carlo code [1] has included an installation test suite to verify that MCNP has been installed correctly. However, the cases in that suite have been constructed primarily to test options within the code and to execute quickly. Consequently, they do not produce well-converged answers, and many of them are physically unrealistic. To remedy these deficiencies, sets of validation suites are being defined and tested for specific types of applications. All of the cases in the validation suites are based on benchmark experiments. Consequently, the results from the measurements are reliable and quantifiable, and calculated results can be compared with them in a meaningful way. Currently, validation suites exist for criticality and radiation-shielding applications.

  9. Comparative analysis of results between CASMO, MCNP and Serpent for a suite of Benchmark problems on BWR reactors; Analisis comparativo de resultados entre CASMO, MCNP y SERPENT para una suite de problemas Benchmark en reactores BWR

    Energy Technology Data Exchange (ETDEWEB)

    Xolocostli M, J. V.; Vargas E, S.; Gomez T, A. M. [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico); Reyes F, M. del C.; Del Valle G, E., E-mail: vicente.xolocostli@inin.gob.mx [IPN, Escuela Superior de Fisica y Matematicas, UP - Adolfo Lopez Mateos, Edif. 9, 07738 Mexico D. F. (Mexico)

    2014-10-15

    In this paper a comparison is made between the CASMO-4, MCNP6 and Serpent codes in analyzing a suite of benchmark problems for BWR-type reactors. The benchmark problem consists of two different geometries: a fuel pin cell and a BWR-type assembly. To facilitate the study of reactor physics in the fuel pin, its nuclear characteristics are provided in detail, such as the burnup dependence, the reactivity of selected nuclides, etc. For the fuel assembly, the presented results concern the infinite multiplication factor for different burnup steps and different void conditions. The analysis of this set of benchmark problems provides comprehensive test problems for the next generation of BWR fuels with extended burnup. It is important to note that the purpose of this comparison is to validate the methodologies used in the modeling for different operating conditions, should another BWR assembly be considered. The results will lie within a range with some uncertainty, regardless of the code used. The Escuela Superior de Fisica y Matematicas of the Instituto Politecnico Nacional (IPN, Mexico) has accumulated some experience in using Serpent, owing to the potential of this code over other commercial codes such as CASMO and MCNP. The results obtained for the infinite multiplication factor are encouraging and motivate continuing the studies toward the generation of the cross sections (XS) of a core, so that in a next step a respective nuclear data library can be constructed and used by the codes developed as part of the development project of the Mexican Platform for the Analysis of Nuclear Reactors, AZTLAN. (Author)

  10. Evaluation of Distribution Analysis Software for DER Applications

    Energy Technology Data Exchange (ETDEWEB)

    Staunton, RH

    2003-01-23

    unstoppable. In response, energy providers will be forced to both fully acknowledge the trend and plan for accommodating DER [3]. With bureaucratic barriers [4], lack of time/resources, tariffs, etc. still seen in certain regions of the country, changes still need to be made. Given continued technical advances in DER, the time is fast approaching when the industry, nation-wide, must not only accept DER freely but also provide or review in-depth technical assessments of how DER should be integrated into and managed throughout the distribution system. Characterization studies are needed to fully understand how both the utility system and DER devices themselves will respond to all reasonable events (e.g., grid disturbances, faults, rapid growth, diverse and multiple DER systems, large reactive loads). Some of this work has already begun as it relates to operation and control of DER [5] and microturbine performance characterization [6,7]. One of the most urgently needed tools that can provide these types of analyses is a distribution network analysis program in combination with models for various DER. Together, they can be used for (1) analyzing DER placement in distribution networks and (2) helping to ensure that adequate transmission reliability is maintained. Surveys of the market show products that represent a partial match to these needs; specifically, software that has been developed to plan electrical distribution systems and analyze reliability (in a near total absence of DER). The first part of this study (Sections 2 and 3 of the report) looks at a number of these software programs and provides both summary descriptions and comparisons. The second part of this study (Section 4 of the report) considers the suitability of these analysis tools for DER studies. It considers steady state modeling and assessment work performed by ORNL using one commercially available tool on feeder data provided by a southern utility. Appendix A provides a technical report on the results of

  11. Suite of Benchmark Tests to Conduct Mesh-Convergence Analysis of Nonlinear and Non-constant Coefficient Transport Codes

    Science.gov (United States)

    Zamani, K.; Bombardelli, F. A.

    2014-12-01

    Verification of geophysics codes is imperative to avoid serious academic as well as practical consequences. In cases where access to a given source code is not possible, the Method of Manufactured Solutions (MMS) cannot be employed in code verification. In contrast, employing the Method of Exact Solutions (MES) has several practical advantages. In this research, we first provide four new one-dimensional analytical solutions designed for code verification; these solutions are able to uncover particular imperfections in solvers of the advection-diffusion-reaction equation, such as nonlinear advection, diffusion or source terms, as well as non-constant-coefficient equations. After that, we provide a solution of Burgers' equation in a novel setup. The proposed solutions satisfy continuity of mass for the ambient flow, which is a crucial factor for coupled hydrodynamics-transport solvers. Then, we use the derived analytical solutions for code verification. To clarify gray-literature issues in the verification of transport codes, we designed a comprehensive test suite to uncover any imperfection in transport solvers via a hierarchical increase in the level of the tests' complexity. The test suite includes hundreds of unit tests and system tests to check the corresponding portions of the code. Examples in the suite start by testing a simple case of unidirectional advection, then bidirectional advection and tidal flow, and build up to nonlinear cases. We design tests to check nonlinearity in velocity, dispersivity and reactions. The concealing effect of scales (Peclet and Damkohler numbers) on the mesh-convergence study and appropriate remedies are also discussed. For the cases in which appropriate benchmarks for a mesh-convergence study are not available, we utilize symmetry. Auxiliary subroutines for automation of the test suite and report generation are designed. All in all, the test package is not only a robust tool for code verification but it also provides comprehensive
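
    The mesh-convergence part of such a study usually reports the observed order of accuracy: with an exact (MES) solution in hand, compute an error norm on successively refined grids and take p = log(e_coarse/e_fine) / log(h_coarse/h_fine). A minimal sketch with hypothetical error norms (the values below are illustrative, not from the paper):

```python
import math

# Hypothetical L2 error norms against an exact solution on three grids,
# each refined by a factor of two
h = [0.100, 0.050, 0.025]
e = [4.1e-3, 1.05e-3, 2.7e-4]

for (h1, e1), (h2, e2) in zip(zip(h, e), zip(h[1:], e[1:])):
    p = math.log(e1 / e2) / math.log(h1 / h2)
    print(f"h {h1} -> {h2}: observed order p = {p:.2f}")   # close to 2 for a second-order scheme
```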

  12. Army-NASA aircrew/aircraft integration program: Phase 4 A(3)I Man-Machine Integration Design and Analysis System (MIDAS) software detailed design document

    Science.gov (United States)

    Banda, Carolyn; Bushnell, David; Chen, Scott; Chiu, Alex; Constantine, Betsy; Murray, Jerry; Neukom, Christian; Prevost, Michael; Shankar, Renuka; Staveland, Lowell

    1991-01-01

    The Man-Machine Integration Design and Analysis System (MIDAS) is an integrated suite of software components that constitutes a prototype workstation to aid designers in applying human factors principles to the design of complex human-machine systems. MIDAS is intended to be used at the very early stages of conceptual design to provide an environment wherein designers can use computational representations of the crew station and operator, instead of hardware simulators and man-in-the-loop studies, to discover problems and ask 'what if' questions regarding the projected mission, equipment, and environment. This document is the Software Product Specification for MIDAS. Introductory descriptions of the processing requirements, hardware/software environment, structure, I/O, and control are given in the main body of the document for the overall MIDAS system, with detailed discussion of the individual modules included in Annexes A-J.

  13. Suite versus composite statistics

    Science.gov (United States)

    Balsillie, J.H.; Tanner, W.F.

    1999-01-01

    Suite and composite methodologies, two statistically valid approaches for producing statistical descriptive measures, are investigated for sample groups representing a probability distribution where, in addition, each sample is itself a probability distribution. Suite and composite means (first moment measures) are always equivalent. Composite standard deviations (second moment measures) are always larger than suite standard deviations. Suite and composite values for higher moment measures have more complex relationships. Very seldom, however, are they equivalent, and they normally yield statistically significant but different results. Multiple samples are preferable to single samples (including composites) because they permit the investigator to examine sample-to-sample variability. These and other relationships for suite and composite probability distribution analyses are investigated and reported using granulometric data.
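
    A small numerical illustration of the mean and standard-deviation relationships, reading "suite" as computing each measure per sample and then averaging, and "composite" as pooling all raw measurements (synthetic data, equal sample sizes assumed for simplicity):

```python
import numpy as np

rng = np.random.default_rng(3)
# Several samples (e.g., grain-size distributions), each itself a set of measurements
samples = [rng.normal(loc=mu, scale=0.4, size=500) for mu in (1.8, 2.1, 2.5)]

# Suite statistics: describe each sample first, then average the measures
suite_mean = np.mean([s.mean() for s in samples])
suite_std = np.mean([s.std(ddof=1) for s in samples])

# Composite statistics: pool all raw measurements into one distribution
pooled = np.concatenate(samples)
composite_mean, composite_std = pooled.mean(), pooled.std(ddof=1)

print(f"suite mean {suite_mean:.3f}  vs  composite mean {composite_mean:.3f}")   # equal
print(f"suite std  {suite_std:.3f}  vs  composite std  {composite_std:.3f}")     # composite larger
```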

  14. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Directory of Open Access Journals (Sweden)

    Pakarinen Jyri

    2010-01-01

    Full Text Available Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
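
    One basic measurement such a tool makes is total harmonic distortion (THD): drive the device with a sine, take an FFT of the output, and compare the energy at the harmonics with the fundamental. A minimal numpy sketch using a tanh nonlinearity as a stand-in for the device under test (the toolkit itself bundles several further analyses):

```python
import numpy as np

fs, f0, n = 48_000, 1_000.0, 48_000
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f0 * t)
y = np.tanh(2.0 * x)                                  # stand-in distorting device

spectrum = np.abs(np.fft.rfft(y * np.hanning(n)))
bin_of = lambda f: int(round(f * n / fs))
fundamental = spectrum[bin_of(f0)]
harmonics = [spectrum[bin_of(k * f0)] for k in range(2, 10)]

thd = np.sqrt(np.sum(np.square(harmonics))) / fundamental
print(f"THD ~ {100 * thd:.2f} %")
```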

  16. The NEPLAN software package a universal tool for electric power systems analysis

    CERN Document Server

    Kahle, K

    2002-01-01

    The NEPLAN software package has been used by CERN's Electric Power Systems Group since 1997. The software is designed for the calculation of short-circuit currents, load flow, motor start, dynamic stability, harmonic analysis and harmonic filter design. This paper describes the main features of the software package and their application to CERN's electric power systems. The implemented models of CERN's power systems are described in detail. Particular focus is given to fault calculations, harmonic analysis and filter design. Based on this software package and the CERN power network model, several recommendations are given.

  17. Risk Analysis and Decision-Making Software Package (1997 Version) User Manual

    Energy Technology Data Exchange (ETDEWEB)

    Chung, F.T.H.

    1999-02-11

    This manual provides instructions for using the U.S. Department of Energy's (DOE) risk analysis and decision making software (1997 version) developed at BDM Petroleum Technologies by BDM-Oklahoma, Inc. for DOE, under contract No. DE-AC22-94PC91OO8. This software provides petroleum producers with a simple, handy tool for exploration and production risk analysis and decision-making. It collects useful risk analysis tools in one package so that users do not have to use several programs separately. The software is simple to use, but still provides many functions. The 1997 version of the software package includes the following tools: (1) Investment risk (Gambler's ruin) analysis; (2) Monte Carlo simulation; (3) Best fit for distribution functions; (4) Sample and rank correlation; (5) Enhanced oil recovery method screening; and (6) artificial neural network. This software package is subject to change. Suggestions and comments from users are welcome and will be considered for future modifications and enhancements of the software. Please check the opening screen of the software for the current contact information. In the future, more tools will be added to this software package. This manual includes instructions on how to use the software but does not attempt to fully explain the theory and algorithms used to create it.
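
    The gambler's-ruin style question addressed by the investment-risk tool can be illustrated with a short Monte Carlo run: given a budget, a per-prospect cost and a success probability, estimate the chance of exhausting the budget before a discovery. The numbers below are purely illustrative and do not come from the manual:

```python
import numpy as np

rng = np.random.default_rng(4)
budget = 10_000_000.0        # exploration budget, $
well_cost = 1_500_000.0      # cost of drilling one prospect, $
p_success = 0.2              # chance that any one well is a discovery
trials = 100_000

ruined = 0
for _ in range(trials):
    cash, found = budget, False
    while cash >= well_cost and not found:
        cash -= well_cost
        found = rng.random() < p_success
    ruined += not found
print(f"P(ruin before a discovery) ~ {ruined / trials:.3f}")   # ~0.8**6 = 0.26 here
```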

  18. Astronomical Video Suites

    Science.gov (United States)

    Francisco Salgado, Jose

    2010-01-01

    Astronomer and visual artist Jose Francisco Salgado has directed two astronomical video suites to accompany live performances of classical music works. The suites feature awe-inspiring images, historical illustrations, and visualizations produced by NASA, ESA, and the Adler Planetarium. By the end of 2009, his video suites Gustav Holst's The Planets and Astronomical Pictures at an Exhibition will have been presented more than 40 times in over 10 countries. Lately Salgado, an avid photographer, has been experimenting with high dynamic range imaging, time-lapse, infrared, and fisheye photography, as well as with stereoscopic photography and video to enhance his multimedia works.

  19. Thematic Review and Analysis of Grounded Theory Application in Software Engineering

    Directory of Open Access Journals (Sweden)

    Omar Badreddin

    2013-01-01

    Full Text Available We present metacodes, a new concept to guide grounded theory (GT research in software engineering. Metacodes are high level codes that can help software engineering researchers guide the data coding process. Metacodes are constructed in the course of analyzing software engineering papers that use grounded theory as a research methodology. We performed a high level analysis to discover common themes in such papers and discovered that GT had been applied primarily in three software engineering disciplines: agile development processes, geographically distributed software development, and requirements engineering. For each category, we collected and analyzed all grounded theory codes and created, following a GT analysis process, what we call metacodes that can be used to drive further theory building. This paper surveys the use of grounded theory in software engineering and presents an overview of successes and challenges of applying this research methodology.

  20. Comparative analysis of methods for testing software of radio-electronic equipment

    OpenAIRE

    G. A. Mirskikh; Yu. Yu. Reutskaya

    2011-01-01

    An analysis is made of the concepts of quality and reliability of software products that form part of radio-electronic equipment. Basic testing methods for software products used in the design of hardware and software systems to ensure quality and reliability are given. We consider testing in accordance with the "black box" and "white box" methodologies, bottom-up and top-down integration testing, as well as various modifications of these methods. Effici...

  1. Algebraic software analysis and embedded simulation of a driving robot

    NARCIS (Netherlands)

    Merkx, L.L.F.; Cuijpers, P.J.L.; Duringhof, H.M.

    2007-01-01

    At TNO Automotive the Generic Driving Actuator (GDA) is developed. The GDA is a device capable of driving a vehicle fully automatically using the same interface as a human driver does. In this paper, the design of the GDA is discussed. The software and hardware of the GDA and its effect on vehicle be

  2. Graph based communication analysis for hardware/software codesign

    DEFF Research Database (Denmark)

    Knudsen, Peter Voigt; Madsen, Jan

    1999-01-01

    In this paper we present a coarse grain CDFG (Control/Data Flow Graph) model suitable for hardware/software partitioning of single processes and demonstrate how it is necessary to perform various transformations on the graph structure before partitioning in order to achieve a structure that allows...

  3. Reference Management Software: A Comparative Analysis of Four Products

    Science.gov (United States)

    Gilmour, Ron; Cobus-Kuo, Laura

    2011-01-01

    Reference management (RM) software is widely used by researchers in the health and natural sciences. Librarians are often called upon to provide support for these products. The present study compares four prominent RMs: CiteULike, RefWorks, Mendeley, and Zotero, in terms of features offered and the accuracy of the bibliographies that they…

  4. Program spectra analysis in embedded software: a case study

    NARCIS (Netherlands)

    Abreu, R.; Zoeteweij, P.; Van Gemund, A.J.C.

    2006-01-01

    Because of constraints imposed by the market, embedded software in consumer electronics is almost inevitably shipped with faults and the goal is just to reduce the inherent unreliability to an acceptable level before a product has to be released. Automatic fault diagnosis is a valuable tool to captu

  5. A Comparative Analysis of Software Engineering with Knowledge Engineering

    Directory of Open Access Journals (Sweden)

    J. F. Vijay

    2010-01-01

    Full Text Available Problem statement: Software engineering is not only a technical discipline of its own. It is also a problem domain where technologies coming from other disciplines are relevant and can play an important role. One important example is knowledge engineering, a term that we use in the broad sense to encompass artificial intelligence, computational intelligence, knowledge bases, data mining and machine learning. We see a number of typical software development issues that can benefit from these disciplines and, for the sake of clarifying the discussion, we have divided them into four categories: (1) planning, monitoring and quality control of projects; (2) quality and process improvement of software organizations; (3) decision-making support; (4) automation. Approach: First, the planning, monitoring and quality control of software development was typically based (unless it is entirely ad hoc) on past project data and/or expert opinion. Results: Several techniques coming from machine learning, computational intelligence and knowledge-based systems have been shown to be useful in this context. Second, software organizations are inherently learning organizations that need to improve, based on experience and project feedback, the way they develop software in changing and volatile environments. Large amounts of data, numerous documents and other forms of information are typically gathered on projects. The question then becomes how to enable the intelligent storage and use of such information in future projects. Third, during the course of a project, software engineers and managers have to face important, complex decisions. They need decision models to support them, especially when project pressure is intense. Techniques originally developed for building risk models based on expert elicitation or optimization heuristics can play a key role in such a context. The last category of applications concerns automation. Many automation problems, such as test data

  6. EDL Sensor Suite Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Optical Air Data Systems (OADS) L.L.C. proposes a LIDAR based remote measurement sensor suite capable of satisfying a significant number of the desired sensing...

  7. The Social Construction of the Software Operation

    DEFF Research Database (Denmark)

    Frederiksen, Helle Damborg; Rose, Jeremy

    2003-01-01

    be analyzed using structuration theory. This structurational analysis showed that the company’s software operation followed an easily recognizable and widely understood pattern. The software operation was organized in terms of development projects leading to applications that then needed maintenance...... challenge the underlying social practice of the software operation, the metrics program reinforced it by adopting the same underlying values. Our conclusion is that, under these circumstances, metrics programs are unlikely to result in radical changes to the software operation, and are best suited to small...

  8. Modular reweighting software for statistical mechanical analysis of biased equilibrium data

    Science.gov (United States)

    Sindhikara, Daniel J.

    2012-07-01

    Here a simple, useful, modular approach and software suite designed for statistical reweighting and analysis of equilibrium ensembles is presented. Statistical reweighting is useful and sometimes necessary for the analysis of equilibrium enhanced sampling methods, such as umbrella sampling or replica exchange, and also in experimental cases where biasing factors are explicitly known. Essentially, statistical reweighting allows extrapolation of data from one or more equilibrium ensembles to another. Here, the fundamental separable steps of statistical reweighting are broken up into modules, allowing for application to the general case and avoiding the black-box nature of some “all-inclusive” reweighting programs. Additionally, the programs included are, by design, written with few dependencies. The compilers required are either pre-installed on most systems or freely available for download with minimal trouble. Examples of the use of this suite applied to umbrella sampling and replica exchange molecular dynamics simulations will be shown along with advice on how to apply it in the general case.
    New version program summary. Program title: Modular reweighting version 2. Catalogue identifier: AEJH_v2_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJH_v2_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public License, version 3. No. of lines in distributed program, including test data, etc.: 179 118. No. of bytes in distributed program, including test data, etc.: 8 518 178. Distribution format: tar.gz. Programming language: C++, Python 2.6+, Perl 5+. Computer: Any. Operating system: Any. RAM: 50-500 MB. Supplementary material: An updated version of the original manuscript (Comput. Phys. Commun. 182 (2011) 2227) is available. Classification: 4.13. Catalogue identifier of previous version: AEJH_v1_0. Journal reference of previous version: Comput. Phys. Commun. 182 (2011) 2227. Does the new
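
    The essential reweighting step assigns each sample from a biased ensemble a weight proportional to exp(+U_bias/kT) and then forms weighted averages. A minimal numpy sketch for a single constant-force bias with synthetic samples (the suite's multi-ensemble, WHAM-style combination of windows is not reproduced here):

```python
import numpy as np

kT = 0.5961        # kcal/mol at roughly 300 K
f = 1.0            # assumed constant biasing force
k0 = 2.0           # stiffness of the underlying (unbiased) harmonic well

rng = np.random.default_rng(5)
# Stand-in for samples drawn from the *biased* ensemble U0(x) - f*x
x = rng.normal(f / k0, np.sqrt(kT / k0), 50_000)

u_bias = -f * x                      # bias energy of each sample (known to the analysis)
w = np.exp(u_bias / kT)              # undo the bias: weight by exp(+U_bias/kT)
w /= w.sum()

print(f"biased mean     <x> = {x.mean():+.3f}")        # ~ +0.50
print(f"reweighted mean <x> = {np.sum(w * x):+.3f}")   # ~  0.00, the unbiased minimum
```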

  9. The MEME Suite

    OpenAIRE

    Bailey, Timothy L; Johnson, James,; Grant, Charles E.; Noble, William S.

    2015-01-01

    The MEME Suite is a powerful, integrated set of web-based tools for studying sequence motifs in proteins, DNA and RNA. Such motifs encode many biological functions, and their detection and characterization is important in the study of molecular interactions in the cell, including the regulation of gene expression. Since the previous description of the MEME Suite in the 2009 Nucleic Acids Research Web Server Issue, we have added six new tools. Here we describe the capabilities of all the tools...

  10. Designing and developing of data evaluation and analysis software applied to gamma-ray spectrometry

    International Nuclear Information System (INIS)

    This study is intended to design and develop software for gamma spectral data evaluation and analysis suitable for a variety of gamma-ray spectrometry systems. The software is written in Visual C++. It is designed to run under the Microsoft Windows operating system. The software is capable of covering all the necessary steps for evaluation and analysis of the collected spectral data. These include peak search, energy calibration, gross and net peak area calculation, peak centroid determination and peak width calculation for the derived gamma-ray peaks. The software offers the ability to report qualitative and quantitative results. The analysis includes: Peak position identification (qualitative analysis) and calculation of its characteristics; Net peak area calculation by subtracting background; Radioactivity estimation (quantitative analysis) using the comparison method for gamma peaks from any radioisotopes present during counting; Radioactivity estimation (quantitative analysis) after efficiency calibration; Counting uncertainty calculation; Limit of detection (LOD) estimation. (author)
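
    The comparison method mentioned above scales a standard of known activity by the ratio of net peak areas in the same region of interest. A hedged sketch with hypothetical counts and equal counting times (efficiency-calibrated analysis and fuller uncertainty budgets follow the same pattern):

```python
import math

# Hypothetical gross and background counts in the same peak region of interest
gross_std, bkg_std = 52_400, 1_900      # standard of known activity
gross_smp, bkg_smp = 34_750, 1_850      # unknown sample
activity_std = 125.0                    # Bq, certified activity of the standard

net_std = gross_std - bkg_std
net_smp = gross_smp - bkg_smp
activity_smp = activity_std * net_smp / net_std

# Poisson counting uncertainty propagated through the ratio
u_rel = math.sqrt((gross_smp + bkg_smp) / net_smp**2 + (gross_std + bkg_std) / net_std**2)
print(f"sample activity = {activity_smp:.1f} +/- {activity_smp * u_rel:.1f} Bq")
```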

  11. A Methodology to Evaluate Object oriented Software Systems Using Change Requirement Traceability Based on Impact Analysis

    Directory of Open Access Journals (Sweden)

    Sunil T. D

    2014-06-01

    Full Text Available It is well known that software maintenance plays a major and important role in the software development life cycle. As object-oriented programming has become the standard, it is very important to understand the problems of maintaining object-oriented software systems. This paper aims at evaluating object-oriented software systems through a change-requirement-traceability-based impact analysis methodology for non-functional requirements using functional requirements. The major issues have been related to change impact algorithms and inheritance of functionality.

  12. On The Human, Organizational, and Technical Aspects of Software Development and Analysis

    Science.gov (United States)

    Damaševičius, Robertas

    Information systems are designed, constructed, and used by people. Therefore, a software design process is not purely a technical task, but a complex psycho-socio-technical process embedded within organizational, cultural, and social structures. These structures influence the behavior and products of the programmer's work such as source code and documentation. This chapter (1) discusses the non-technical (organizational, social, cultural, and psychological) aspects of software development reflected in program source code; (2) presents a taxonomy of the social disciplines of computer science; and (3) discusses the socio-technical software analysis methods for discovering the human, organizational, and technical aspects embedded within software development artifacts.

  13. Performance Analysis of the ATLAS Second Level Trigger Software

    CERN Document Server

    Bogaerts, J A C; Li, W; Middleton, R P; Werner, P; Wickens, F J; Zobernig, H

    2002-01-01

    In this paper we analyse the performance of the prototype software developed for the ATLAS Second Level Trigger. The OO framework written in C++ has been used to implement a distributed system which collects (simulated) detector data on which it executes event selection algorithms. The software has been used on testbeds of up to 100 nodes with various interconnect technologies. The final system will have to sustain traffic of ~ 40 Gbits/s and require an estimated number of ~750 processors. Timing measurements are crucial for issues such as trigger decision latency, assessment of required CPU and network capacity, scalability, and load-balancing. In addition, final architectural and technological choices, code optimisation and system tuning require a detailed understanding of both CPU utilisation and trigger decision latency. In this paper we describe the instrumentation used to disentangle effects due to such factors as OS system intervention, blocking on interlocks (applications are multi-threaded)...

  14. Critical analysis of interactive media with software affordances

    OpenAIRE

    Curinga, Matthew X.

    2014-01-01

    There is a long-standing and unsettled debate surrounding the ways that technology influences society. There is strong scholarship supporting the social construction perspective, arguing that the effects of technology are wholly socially and politically determined. This paper argues that the social constructivist position needs to be expanded if it is to be useful for more than observing the ways technologies are designed and used. We need to develop better ways to talk about software, compute...

  15. The khmer software package: enabling efficient nucleotide sequence analysis.

    Science.gov (United States)

    Crusoe, Michael R; Alameldin, Hussien F; Awad, Sherine; Boucher, Elmar; Caldwell, Adam; Cartwright, Reed; Charbonneau, Amanda; Constantinides, Bede; Edvenson, Greg; Fay, Scott; Fenton, Jacob; Fenzl, Thomas; Fish, Jordan; Garcia-Gutierrez, Leonor; Garland, Phillip; Gluck, Jonathan; González, Iván; Guermond, Sarah; Guo, Jiarong; Gupta, Aditi; Herr, Joshua R; Howe, Adina; Hyer, Alex; Härpfer, Andreas; Irber, Luiz; Kidd, Rhys; Lin, David; Lippi, Justin; Mansour, Tamer; McA'Nulty, Pamela; McDonald, Eric; Mizzi, Jessica; Murray, Kevin D; Nahum, Joshua R; Nanlohy, Kaben; Nederbragt, Alexander Johan; Ortiz-Zuazaga, Humberto; Ory, Jeramia; Pell, Jason; Pepe-Ranney, Charles; Russ, Zachary N; Schwarz, Erich; Scott, Camille; Seaman, Josiah; Sievert, Scott; Simpson, Jared; Skennerton, Connor T; Spencer, James; Srinivasan, Ramakrishnan; Standage, Daniel; Stapleton, James A; Steinman, Susan R; Stein, Joe; Taylor, Benjamin; Trimble, Will; Wiencko, Heather L; Wright, Michael; Wyss, Brian; Zhang, Qingpeng; Zyme, En; Brown, C Titus

    2015-01-01

    The khmer package is a freely available software library for working efficiently with fixed length DNA words, or k-mers. khmer provides implementations of a probabilistic k-mer counting data structure, a compressible De Bruijn graph representation, De Bruijn graph partitioning, and digital normalization. khmer is implemented in C++ and Python, and is freely available under the BSD license at  https://github.com/dib-lab/khmer/.
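
    To make the idea of counting fixed-length DNA words concrete, here is a hedged sketch in plain Python using an exact dictionary-based counter. It only illustrates the concept; khmer itself relies on probabilistic, memory-efficient data structures rather than the exact counting shown here.

        # Conceptual sketch only: exact k-mer counting with a Python Counter.
        from collections import Counter

        def count_kmers(sequence, k):
            """Map every k-mer occurring in `sequence` to its frequency."""
            sequence = sequence.upper()
            return Counter(sequence[i:i + k] for i in range(len(sequence) - k + 1))

        counts = count_kmers("ACGTACGTGACG", 4)
        print(counts.most_common(3))   # 'ACGT' occurs twice in this toy sequence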

  17. Specification and analysis of requirements negotiation strategy in software ecosystems

    OpenAIRE

    Fricker, S

    2009-01-01

    The development of software products and systems generally requires collaboration of many individuals, groups, and organizations that form an ecosystem of interdependent stakeholders. The way the interests and expectations of such stakeholders are communicated is critical for whether they are heard, hence whether the stakeholders are successful in influencing future solutions to meet their needs. This paper proposes a model based on negotiation and network theory for analyzing and designing f...

  18. Rapid software development: ANALYSIS OF AGILE METHODS FOR APP STARTUPS

    OpenAIRE

    Wahlqvist, Daniel

    2014-01-01

    This thesis is focused on software development using so-called Agile methods. The scope of research is startup companies creating consumer apps. The thesis work was performed at a Swedish app startup, Storypic/Accelit AB. An overview of current research on Agile methods is given. A qualitative case study was undertaken in four parts: (1) observing the team, (2) testing business hypotheses, (3) interviews with the team, and (4) user feedback. Analyzing the findings, some conclusions are drawn: An ag...

  19. Potku - New analysis software for heavy ion elastic recoil detection analysis

    Science.gov (United States)

    Arstila, K.; Julin, J.; Laitinen, M. I.; Aalto, J.; Konu, T.; Kärkkäinen, S.; Rahkonen, S.; Raunio, M.; Itkonen, J.; Santanen, J.-P.; Tuovinen, T.; Sajavaara, T.

    2014-07-01

    Time-of-flight elastic recoil detection (ToF-ERD) analysis software has been developed. The software combines a Python-language graphical front-end with a C code computing back-end in a user-friendly way. The software uses a list of coincident time-of-flight-energy (ToF-E) events as an input. The ToF calibration can be determined with a simple graphical procedure. The graphical interface allows the user to select different elements and isotopes from a ToF-E histogram and to convert the selections to individual elemental energy and depth profiles. The resulting sample composition can be presented as relative or absolute concentrations by integrating the depth profiles over user-defined ranges. Beam induced composition changes can be studied by displaying the event-based data in fractions relative to the substrate reference data. Optional angular input data allows for kinematic correction of the depth profiles. This open source software is distributed under the GPL license for Linux, Mac, and Windows environments.
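
    A minimal sketch of the kinematics behind a ToF-E conversion is given below; it is not Potku code, and the flight path length and example flight time are invented values for illustration.

        # Illustrative sketch: convert a calibrated time of flight to recoil energy
        # via E = 1/2 m (L/t)^2 for an ion of known mass. Not taken from Potku.
        AMU = 1.66053906660e-27   # kg per atomic mass unit
        EV = 1.602176634e-19      # J per electronvolt

        def tof_to_energy_mev(tof_ns, mass_amu, flight_path_m=0.623):
            """Recoil energy in MeV; the default flight path is an assumed example."""
            v = flight_path_m / (tof_ns * 1e-9)                 # m/s
            return 0.5 * mass_amu * AMU * v ** 2 / EV / 1e6

        # Example: a 28Si recoil with an assumed 150 ns flight time.
        print(round(tof_to_energy_mev(150.0, 28.0), 2), "MeV")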

  20. The IFPUG guide to IT and software measurement

    CERN Document Server

    IFPUG

    2012-01-01

    The widespread deployment of millions of current and emerging software applications has placed software economic studies among the most critical of any form of business analysis. Unfortunately, a lack of an integrated suite of metrics makes software economic analysis extremely difficult. The International Function Point Users Group (IFPUG), a nonprofit and member-governed organization, has become the recognized leader in promoting the effective management of application software development and maintenance activities. The IFPUG Guide to IT and Software Measurement brings together 52 leading so

  1. Domain analysis for the reuse of software development experiences

    Science.gov (United States)

    Basili, V. R.; Briand, L. C.; Thomas, W. M.

    1994-01-01

    We need to be able to learn from past experiences so we can improve our software processes and products. The Experience Factory is an organizational structure designed to support and encourage the effective reuse of software experiences. This structure consists of two organizations, separating project development concerns from the organizational concerns of experience packaging and learning. The experience factory provides the processes and support for analyzing, packaging, and improving the organization's stored experience. The project organization is structured to reuse this stored experience in its development efforts. However, a number of questions arise: What past experiences are relevant? Can they all be used (reused) on our current project? How do we take advantage of what has been learned in other parts of the organization? How do we take advantage of experience in the world-at-large? Can someone else's best practices be used in our organization with confidence? This paper describes approaches to help answer these questions. We propose both quantitative and qualitative approaches for effectively reusing software development experiences.

  2. The Architecture of MEG Simulation and Analysis Software

    CERN Document Server

    Cattaneo, PaoloW; Sawada, Ryu; Schneebeli, Matthias; Yamada, Shuei

    2011-01-01

    MEG (Mu to Electron Gamma) is an experiment dedicated to the search for the $\mu^+ \rightarrow e^+\gamma$ decay that is strongly suppressed in the Standard Model but predicted in several Super Symmetric extensions of it at an accessible rate. MEG is a small-size experiment ($\approx$ 50-60 physicists at any time) with a life span of about 10 years. The limited human resources available, in particular in the core offline group, emphasized the importance of reusing software and exploiting existing expertise. Great care has been devoted to providing a simple system that hides implementation details from the average programmer. This allowed many members of the collaboration with limited programming skill to contribute to the development of the experiment's software. The offline software is based on two frameworks: REM in FORTRAN 77, used for the event generation and the detector simulation package GEM, based on GEANT 3, and ROME in C++, used in the readout simulation Bartender and in the reconstruct...

  3. Space Telecommunications Radio System Software Architecture Concepts and Analysis

    Science.gov (United States)

    Handler, Louis M.; Hall, Charles S.; Briones, Janette C.; Blaser, Tammy M.

    2008-01-01

    The Space Telecommunications Radio System (STRS) project investigated various Software Defined Radio (SDR) architectures for Space. An STRS architecture has been selected that separates the STRS operating environment from its various waveforms and also abstracts any specialized hardware to limit its effect on the operating environment. The design supports software evolution where new functionality is incorporated into the radio. Radio hardware functionality has been moving from hardware based ASICs into firmware and software based processors such as FPGAs, DSPs and General Purpose Processors (GPPs). Use cases capture the requirements of a system by describing how the system should interact with the users or other systems (the actors) to achieve a specific goal. The Unified Modeling Language (UML) is used to illustrate the Use Cases in a variety of ways. The Top Level Use Case diagram shows groupings of the use cases and how the actors are involved. The state diagrams depict the various states that a system or object may be in and the transitions between those states. The sequence diagrams show the main flow of activity as described in the use cases.

  4. Analysis and design of software ecosystem architectures – towards the 4S telemedicine ecosystem

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Hansen, Klaus Marius; Kyng, Morten;

    2014-01-01

    ... and application stove-pipes that inhibit the adoption of telemedical solutions. To what extent can a software ecosystem approach to telemedicine alleviate this? Objective In this article, we define the concept of software ecosystem architecture as the structure(s) of a software ecosystem comprising elements, relations among them, and properties of both. Our objective is to show how this concept can be used i) in the analysis of existing software ecosystems and ii) in the design of new software ecosystems. Method We performed a mixed-method study that consisted of a case study and an experiment. For i), we performed a descriptive, revelatory case study of the Danish telemedicine ecosystem and for ii), we experimentally designed, implemented, and evaluated the architecture of 4S. Results We contribute in three areas. First, we define the software ecosystem architecture concept that captures organization...

  5. "Leagile??? software development: an experience report analysis of the application of lean approaches in agile software development

    OpenAIRE

    Wang, Xiaofeng; Conboy, Kieran; Cawley, Oisín

    2012-01-01

    peer-reviewed In recent years there has been a noticeable shift in attention from those who use agile software development toward lean software development, often labelled as a shift "from agile to lean". However, the reality may not be as simple or linear as this label implies. To provide a better understanding of lean software development approaches and how they are applied in agile software development, we have examined 30 experience reports published in past agile software...

  6. Multi-dimensional project evaluation: Combining cost-benefit analysis and multi-criteria analysis with the COSIMA software system

    DEFF Research Database (Denmark)

    This paper proposes a methodology that integrates quantitative and qualitative assessment. The methodology proposed combines conventional cost-benefit analysis (CBA) with multi-criteria analysis (MCA). The CBA methodology, based on welfare theory, assures that the project with the highest welfare ... different methods for combining cost-benefit analysis and multi-criteria analysis are examined and compared and a software system is presented. The software system gives the decision makers some possibilities regarding preference analysis, sensitivity and risk analysis. The aim of the software ... and software system for CBA and MCA decision making is finally compared with other methods for combining the CBA and MCA. Ultimately, some conclusions are made and perspectives are drawn. Keywords: Cost-benefit analysis, Multi-criteria analysis, Multiple Criteria Decision Aiding, Transport infrastructure...

  7. Eval: A software package for analysis of genome annotations

    Directory of Open Access Journals (Sweden)

    Brent Michael R

    2003-10-01

    Full Text Available Abstract Summary Eval is a flexible tool for analyzing the performance of gene annotation systems. It provides summaries and graphical distributions for many descriptive statistics about any set of annotations, regardless of their source. It also compares sets of predictions to standard annotations and to one another. Input is in the standard Gene Transfer Format (GTF). Eval can be run interactively or via the command line, in which case output options include easily parsable tab-delimited files. Availability To obtain the module package with documentation, go to http://genes.cse.wustl.edu/ and follow links for Resources, then Software. Please contact brent@cse.wustl.edu
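
    Since GTF is a plain tab-delimited format, a record can be read with a few lines of code. The sketch below is a generic illustration of the field layout, not Eval's own parser; the example line is invented.

        # Illustrative sketch: splitting one GTF record into its nine standard fields.
        def parse_gtf_line(line):
            seqname, source, feature, start, end, score, strand, frame, attrs = \
                line.rstrip("\n").split("\t")
            attributes = {}
            for pair in attrs.strip().rstrip(";").split(";"):
                key, _, value = pair.strip().partition(" ")
                attributes[key] = value.strip('"')
            return {"seqname": seqname, "feature": feature, "start": int(start),
                    "end": int(end), "strand": strand, "attributes": attributes}

        example = ('chr1\tpredictor\texon\t1300\t1500\t.\t+\t.\t'
                   'gene_id "g1"; transcript_id "g1.t1";')
        print(parse_gtf_line(example)["attributes"]["gene_id"])   # -> g1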

  8. Software tools for the analysis of video meteors emission spectra

    Science.gov (United States)

    Madiedo, J. M.; Toscano, F. M.; Trigo-Rodriguez, J. M.

    2011-10-01

    One of the goals of the SPanish Meteor Network (SPMN) is related to the study of the chemical composition of meteoroids by analyzing the emission spectra resulting from the ablation of these particles of interplanetary matter in the atmosphere. With this aim, some of the CCD video devices we employ to observe the night sky are endowed with holographic diffraction gratings, and a continuous monitoring of meteor activity is performed. We have recently developed new software to analyze these spectra. A description of this computer program is given, and some of the results obtained so far are presented here.

  9. Waste management facility accident analysis (WASTE ACC) system: software for analysis of waste management alternatives

    International Nuclear Information System (INIS)

    This paper describes the Waste Management Facility Accident Analysis (WASTE_ACC) software, which was developed at Argonne National Laboratory (ANL) to support the US Department of Energy's (DOE's) Waste Management (WM) Programmatic Environmental Impact Statement (PEIS). WASTE_ACC is a decision support and database system that is compatible with Microsoft® Windows™. It assesses potential atmospheric releases from accidents at waste management facilities. The software provides the user with an easy-to-use tool to determine the risk-dominant accident sequences for the many possible combinations of process technologies, waste and facility types, and alternative cases described in the WM PEIS. In addition, its structure will allow additional alternative cases and assumptions to be tested as part of the future DOE programmatic decision-making process. The WASTE_ACC system demonstrates one approach to performing a generic, systemwide evaluation of accident risks at waste management facilities. The advantages of WASTE_ACC are threefold. First, the software gets waste volume and radiological profile data that were used to perform other WM PEIS-related analyses directly from the WASTE_MGMT system. Second, the system allows for a consistent analysis across all sites and waste streams, which enables decision makers to understand more fully the trade-offs among various policy options and scenarios. Third, the system is easy to operate; even complex scenario runs are completed within minutes.

  10. Integrating Multi-Vendor Software Analysis into the Lifecycle for Reliability, Productivity, and Performance Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of the proposed work is to create new ways to manage, visualize, and share data produced by multiple software analysis tools, and to create a framework for...

  11. NIST Statistical Test Suite

    OpenAIRE

    Paul, Rourab; Dey, Hemanta; Chakrabrti, Amlan; Ghosh, Ranjan

    2016-01-01

    The NIST Statistical Test Suite has 15 tests. The principal strategy of the NIST Statistical Test Suite is to judge the statistical randomness properties of random bit generating algorithms. Based on 300 to 500 different keys, the algorithm generates an even number of long random sequences of n bits each, with n varying between 1.3 and 1.5 million, each of which is tested by the 15 tests. Each test has a specific statistic parameter for bit sequences under the assumption of randomness and calcu...
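
    As a flavour of what the individual tests compute, the sketch below implements the simplest of the 15, the frequency (monobit) test, which checks whether the proportion of ones is close to one half. It follows the standard published formula; it is not code from the paper.

        # Frequency (monobit) test sketch: p-value = erfc(|S_n| / sqrt(2n)).
        import math

        def monobit_p_value(bits):
            """bits: iterable of 0/1 values. Returns the frequency-test p-value."""
            bits = list(bits)
            n = len(bits)
            s_n = sum(1 if b else -1 for b in bits)     # map 0 -> -1, 1 -> +1
            s_obs = abs(s_n) / math.sqrt(n)
            return math.erfc(s_obs / math.sqrt(2))

        # A sequence is conventionally judged non-random when the p-value < 0.01.
        print(monobit_p_value([1, 0, 1, 1, 0, 1, 0, 1, 0, 0]))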

  12. Development of tools for safety analysis of control software in advanced reactors

    Energy Technology Data Exchange (ETDEWEB)

    Guarro, S.; Yau, M.; Motamed, M. [Advanced Systems Concepts Associates, El Segundo, CA (United States)

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described.

  13. A unified approach to feature-centric analysis of object-oriented software

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    Feature-centric comprehension of software is a prerequisite to incorporating modifications requested by users during software evolution and maintenance. However, feature-centric understanding of large object-oriented programs is difficult to achieve due to size, complexity and implicit character of mappings between features and source code. In this paper, we address these issues through our unified approach to feature-centric analysis of object-oriented software. Our approach supports discovery of feature-code traceability links and their analysis from three perspectives and at three levels ... Featureous supports program comprehension by means of concrete cognitive design elements.

  14. AgriSuit

    NARCIS (Netherlands)

    Yalew, S.G.; Griensven, van A.; Zaag, van der P.

    2016-01-01

    A web-based framework (AgriSuit) that integrates various global data from different sources for multi-criteria based agricultural land suitability assessment based on the Google Earth Engine (GEE) platform is developed and presented. The platform enables online data gathering, training and classi

  15. Novell ZENworks Suite 7

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    On August 22, Novell released the latest ZENworks Suite 7, which is more powerful and more stable. Liu Changchun, senior technical support engineer at Novell, advises: "Use Novell ZENworks Suite to standardize the management of your network."

  16. Performance Analysis of Software to Hardware Task Migration in Codesign

    CERN Document Server

    Sebai, Dorsaf; Bennour, Imed

    2010-01-01

    The complexity of multimedia applications, in terms of intensity of computation and heterogeneity of treated data, has led designers to deploy them on multiprocessor systems-on-chip. The complexity of these systems on one hand and the expectations of consumers on the other hand complicate the designers' job of conceiving and delivering robust, successful systems within the shortest deadlines. They have to explore the different solutions of the design space and estimate their performance in order to deduce the solution that respects their design constraints. In this context, we propose the modeling of one possible design-space solution: software-to-hardware task migration. This modeling exploits synchronous dataflow graphs to take into account the different migration impacts and to estimate their performance in terms of throughput.

  17. Development of a wearable motion capture suit and virtual reality biofeedback system for the instruction and analysis of sports rehabilitation exercises.

    Science.gov (United States)

    Fitzgerald, Diarmaid; Foody, John; Kelly, Dan; Ward, Tomas; Markham, Charles; McDonald, John; Caulfield, Brian

    2007-01-01

    This paper describes the design and development of a computer game for instructing an athlete through a series of prescribed rehabilitation exercises. In an attempt to prevent or treat musculoskeletal-type injuries along with trying to improve physical performance, athletes are prescribed exercise programmes by appropriately trained specialists. Typically, athletes are shown how to perform each exercise in the clinic following examination, but they often have no way of knowing if their technique is correct while they are performing their home exercise programme. We describe a system that allows an automatic audit of this activity. Our system utilises ten inertial motion tracking sensors incorporated in a wearable body suit which allows a Bluetooth connection from a root hub to a laptop/computer. Using our specifically designed software programme, the athlete can be instructed and analysed as he/she performs the individually tailored exercise programme and a log is recorded of the time and performance level of each exercise completed. We describe a case study that illustrates how a clinician can at a later date review the athlete's progress and subsequently alter the exercise programme as they see fit. PMID:18003097

  18. Software for analysis and manipulation of genetic linkage data.

    Science.gov (United States)

    Weaver, R; Helms, C; Mishra, S K; Donis-Keller, H

    1992-06-01

    We present eight computer programs written in the C programming language that are designed to analyze genotypic data and to support existing software used to construct genetic linkage maps. Although each program has a unique purpose, they all share the common goals of affording a greater understanding of genetic linkage data and of automating tasks to make computers more effective tools for map building. The PIC/HET and FAMINFO programs automate calculation of relevant quantities such as heterozygosity, PIC, allele frequencies, and informativeness of markers and pedigrees. PREINPUT simplifies data submissions to the Centre d'Etude du Polymorphisme Humain (CEPH) data base by creating a file with genotype assignments that CEPH's INPUT program would otherwise require to be input manually. INHERIT is a program written specifically for mapping the X chromosome: by assigning a dummy allele to males, in the nonpseudoautosomal region, it eliminates falsely perceived noninheritances in the data set. The remaining four programs complement the previously published genetic linkage mapping software CRI-MAP and LINKAGE. TWOTABLE produces a more readable format for the output of CRI-MAP two-point calculations; UNMERGE is the converse to CRI-MAP's merge option; and GENLINK and LINKGEN automatically convert between the genotypic data file formats required by these packages. All eight applications read input from the same types of data files that are used by CRI-MAP and LINKAGE. Their use has simplified the management of data, has increased knowledge of the content of information in pedigrees, and has reduced the amount of time needed to construct genetic linkage maps of chromosomes. PMID:1598906
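
    For readers unfamiliar with the quantities mentioned above, the sketch below shows how marker heterozygosity and PIC are commonly computed from allele frequencies (Botstein et al. 1980). It is an independent Python illustration, not the published C programs, and the example frequencies are invented.

        # Illustrative sketch: heterozygosity and PIC from allele frequencies.
        def heterozygosity(freqs):
            return 1.0 - sum(p * p for p in freqs)

        def pic(freqs):
            """Polymorphism information content for one marker."""
            double_het = sum(2 * freqs[i] ** 2 * freqs[j] ** 2
                             for i in range(len(freqs))
                             for j in range(i + 1, len(freqs)))
            return heterozygosity(freqs) - double_het

        freqs = [0.4, 0.3, 0.2, 0.1]          # invented allele frequencies
        print(round(heterozygosity(freqs), 3), round(pic(freqs), 3))   # 0.7 0.645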

  20. Advanced space system analysis software. Technical, user, and programmer guide

    Science.gov (United States)

    Farrell, C. E.; Zimbelman, H. F.

    1981-01-01

    The LASS computer program provides a tool for interactive preliminary and conceptual design of LSS. Eight program modules were developed, including four automated model geometry generators, an associated mass properties module, an appendage synthesizer module, an rf analysis module, and an orbital transfer analysis module. The existing rigid body controls analysis module was modified to permit analysis of effects of solar pressure on orbital performance. A description of each module, user instructions, and programmer information are included.

  1. Analysis and recommendations for a reliable programming of software based safety systems

    International Nuclear Information System (INIS)

    The present paper summarizes the results of several studies performed for the development of high-reliability software on i486 microprocessors, towards its utilization in control and safety systems for nuclear power plants. The work is based on software programmed in the C language. Several recommendations oriented to high-reliability software are analyzed, relating the requirements on the high-level language to their influence at the assembler level. Several metrics are implemented that allow for the quantification of the results achieved. New metrics were developed and others were adapted, in order to obtain more efficient indexes for describing the software. Such metrics help to visualize how well the software under development conforms to the quality rules in use. A specific program developed to assist the reliability analyst in this quantification is also presented in the paper. It performs the analysis of an executable program written in C, disassembling it and evaluating its internal structures. (author)

  2. BASTILLE - Better Analysis Software to Treat ILL Experiments - a unified, unifying approach to data reduction and analysis

    International Nuclear Information System (INIS)

    Data reduction and analysis is a key component in the production of scientific results. If this component, like any other in the chain, is weak, the final output is compromised. The current situation for data reduction and analysis may be regarded as adequate, but it is variable, depending on the instrument, and should be improved. In particular the delivery of new and upgraded instruments in Millennium Phase I and those proposed for Phase II will bring new demands and challenges for software development. Failure to meet these challenges will hamper the exploitation of higher data rates and the delivery of new science. The proposed project is to provide a single, underpinning software infrastructure for data analysis, which would ensure: 1) a clear vision of software provision at ILL; 2) a clear role for the 'Computing for Science' Group (CS) in maintaining and developing the infrastructure and the codes; 3) a well-defined framework for recruiting and training CS staff; 4) ease and efficiency of development within a common, well-defined software environment; 5) safeguarding of key, existing software; and 6) ease of communication with other software like instrument control software to allow real-time data analysis and experiment control, or software from other institutes or sources

  3. Improving the Accuracy of Software-Based Energy Analysis for Residential Buildings (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Polly, B.

    2011-09-01

    This presentation describes the basic components of software-based energy analysis for residential buildings, explores the concepts of 'error' and 'accuracy' when analysis predictions are compared to measured data, and explains how NREL is working to continuously improve the accuracy of energy analysis methods.
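
    Two measures that are widely used when software predictions are compared against measured data are the normalized mean bias error and the coefficient of variation of the RMSE. The sketch below computes both for illustrative monthly data; it is a generic example, not NREL's code, and the numbers are invented.

        # Illustrative sketch: NMBE and CV(RMSE) between predicted and measured use.
        import math

        def nmbe(predicted, measured):
            mean_meas = sum(measured) / len(measured)
            return sum(p - m for p, m in zip(predicted, measured)) / (len(measured) * mean_meas)

        def cv_rmse(predicted, measured):
            mean_meas = sum(measured) / len(measured)
            rmse = math.sqrt(sum((p - m) ** 2 for p, m in zip(predicted, measured)) / len(measured))
            return rmse / mean_meas

        measured = [950, 870, 640, 510, 480, 525]      # e.g. monthly kWh (invented)
        predicted = [905, 900, 615, 540, 470, 550]
        print(f"NMBE = {nmbe(predicted, measured):.1%}, CV(RMSE) = {cv_rmse(predicted, measured):.1%}")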

  4. Comparing Results from Constant Comparative and Computer Software Methods: A Reflection about Qualitative Data Analysis

    Science.gov (United States)

    Putten, Jim Vander; Nolen, Amanda L.

    2010-01-01

    This study compared qualitative research results obtained by manual constant comparative analysis with results obtained by computer software analysis of the same data. An investigation of issues of trustworthiness and accuracy ensued. Results indicated that the inductive constant comparative data analysis generated 51 codes and two coding levels…

  5. A COMPARISON OF STEPWISE AND FUZZY MULTIPLE REGRESSION ANALYSIS TECHNIQUES FOR MANAGING SOFTWARE PROJECT RISKS: ANALYSIS PHASE

    Directory of Open Access Journals (Sweden)

    Abdelrafe Elzamly

    2014-01-01

    Full Text Available Risk is not always avoidable, but it is controllable. The aim of this study is to identify whether these techniques are effective in reducing software failure. This motivates the authors to continue the effort to enrich the management of software project risks by combining mining and quantitative approaches with large data sets. In this study, two new techniques are introduced, namely stepwise multiple regression analysis and fuzzy multiple regression, to manage software risks. Two evaluation procedures, MMRE and Pred(25), are used to compare the accuracy of the techniques. The model's accuracy improves slightly with stepwise multiple regression rather than fuzzy multiple regression. This study will guide software managers in applying software risk management practices to real-world software development organizations and in verifying the effectiveness of the new techniques and approaches on a software project. The study has been conducted on a group of software projects using a survey questionnaire. It is hoped that this will enable software managers to improve their decisions and increase the probability of software project success.
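
    The two evaluation procedures named above are easy to state precisely; the sketch below computes both for an invented set of actual and predicted values and is offered only as an illustration of the measures, not of the paper's models.

        # Illustrative sketch: MMRE and Pred(25) accuracy measures.
        def mre(actual, predicted):
            return abs(actual - predicted) / actual

        def mmre(actuals, predictions):
            return sum(mre(a, p) for a, p in zip(actuals, predictions)) / len(actuals)

        def pred(actuals, predictions, level=0.25):
            hits = sum(1 for a, p in zip(actuals, predictions) if mre(a, p) <= level)
            return hits / len(actuals)

        actual = [120, 80, 200, 150, 60]       # invented observed values
        predicted = [110, 95, 170, 160, 90]    # invented model outputs
        print(f"MMRE = {mmre(actual, predicted):.2f}, Pred(25) = {pred(actual, predicted):.2f}")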

  6. Design and validation of Segment - freely available software for cardiovascular image analysis

    Directory of Open Access Journals (Sweden)

    Engblom Henrik

    2010-01-01

    Full Text Available Abstract Background Commercially available software for cardiovascular image analysis often has limited functionality and frequently lacks the careful validation that is required for clinical studies. We have already implemented a cardiovascular image analysis software package and released it as freeware for the research community. However, it was distributed as a stand-alone application and other researchers could not extend it by writing their own custom image analysis algorithms. We believe that the work required to make a clinically applicable prototype can be reduced by making the software extensible, so that researchers can develop their own modules or improvements. Such an initiative might then serve as a bridge between image analysis research and cardiovascular research. The aim of this article is therefore to present the design and validation of a cardiovascular image analysis software package (Segment) and to announce its release in a source code format. Results Segment can be used for image analysis in magnetic resonance imaging (MRI), computed tomography (CT), single photon emission computed tomography (SPECT) and positron emission tomography (PET). Some of its main features include loading of DICOM images from all major scanner vendors, simultaneous display of multiple image stacks and plane intersections, automated segmentation of the left ventricle, quantification of MRI flow, tools for manual and general object segmentation, quantitative regional wall motion analysis, myocardial viability analysis and image fusion tools. Here we present an overview of the validation results and validation procedures for the functionality of the software. We describe a technique to ensure continued accuracy and validity of the software by implementing and using a test script that tests the functionality of the software and validates the output. The software has been made freely available for research purposes in a source code format on the project home

  7. A Parametric Model of Shoulder Articulation for Virtual Assessment of Space Suit Fit

    Science.gov (United States)

    Kim, K. Han; Young, Karen S.; Bernal, Yaritza; Boppana, Abhishektha; Vu, Linh Q.; Benson, Elizabeth A.; Jarvis, Sarah; Rajulu, Sudhakar L.

    2016-01-01

    Shoulder injury is one of the most severe risks that have the potential to impair crewmembers' performance and health in long duration space flight. Overall, 64% of crewmembers experience shoulder pain after extra-vehicular training in a space suit, and 14% of symptomatic crewmembers require surgical repair (Williams & Johnson, 2003). Suboptimal suit fit, in particular at the shoulder region, has been identified as one of the predominant risk factors. However, traditional suit fit assessments and laser scans represent only a single person's data, and thus may not be generalized across wide variations of body shapes and poses. The aim of this work is to develop a software tool based on a statistical analysis of a large dataset of crewmember body shapes. This tool can accurately predict the skin deformation and shape variations for any body size and shoulder pose for a target population, from which the geometry can be exported and evaluated against suit models in commercial CAD software. A preliminary software tool was developed by statistically analyzing 150 body shapes matched with body dimension ranges specified in the Human-Systems Integration Requirements of NASA ("baseline model"). Further, the baseline model was incorporated with shoulder joint articulation ("articulation model"), using additional subjects scanned in a variety of shoulder poses across a pre-specified range of motion. Scan data was cleaned and aligned using body landmarks. The skin deformation patterns were dimensionally reduced and the co-variation with shoulder angles was analyzed. A software tool is currently in development and will be presented in the final proceeding. This tool would allow suit engineers to parametrically generate body shapes in strategically targeted anthropometry dimensions and shoulder poses. This would also enable virtual fit assessments, with which the contact volume and clearance between the suit and body surface can be predictively quantified at reduced time and

  8. STEM - software test and evaluation methods: fault detection using static analysis techniques

    International Nuclear Information System (INIS)

    STEM is a software reliability project with the objective of evaluating a number of fault detection and fault estimation methods which can be applied to high integrity software. This Report gives some interim results of applying both manual and computer-based static analysis techniques, in particular SPADE, to an early CERL version of the PODS software containing known faults. The main results of this study are that: The scope for thorough verification is determined by the quality of the design documentation; documentation defects become especially apparent when verification is attempted. For well-defined software, the thoroughness of SPADE-assisted verification for detecting a large class of faults was successfully demonstrated. For imprecisely-defined software (not recommended for high-integrity systems) the use of tools such as SPADE is difficult and inappropriate. Analysis and verification tools are helpful, through their reliability and thoroughness. However, they are designed to assist, not replace, a human in validating software. Manual inspection can still reveal errors (such as errors in specification and errors of transcription of systems constants) which current tools cannot detect. There is a need for tools to automatically detect typographical errors in system constants, for example by reporting outliers to patterns. To obtain the maximum benefit from advanced tools, they should be applied during software development (when verification problems can be detected and corrected) rather than retrospectively. (author)

  9. Requirement analysis of the safety-critical software implementation for the nuclear power plant

    International Nuclear Information System (INIS)

    Safety-critical software shall be implemented under strict regulations and standards along with hardware qualification. In general, safety-critical software has been implemented using functional block language (FBL) and structured languages like C in real projects. Software design shall comply with such characteristics as modularity, simplicity, minimal use of subroutines, and exclusion of interrupt logic. To meet these prerequisites, we used a computer-aided software engineering (CASE) tool to substantiate the requirements traceability matrix that was previously developed manually using word processors or spreadsheets. A coding standard and manual have also been developed to confirm the quality of the software development process, in terms of readability, consistency, and maintainability in compliance with NUREG/CR-6463. System-level preliminary hazard analysis (PHA) is performed by analyzing the preliminary safety analysis report (PSAR) and the FMEA document. The modularity concept is effectively implemented for the overall module configurations and functions using the RTP software development tool. The response time imposed on the basis of the deterministic structure of the safety-critical software was measured.

  10. The R software fundamentals of programming and statistical analysis

    CERN Document Server

    Lafaye de Micheaux, Pierre; Liquet, Benoit

    2013-01-01

    The contents of The R Software are presented so as to be both comprehensive and easy for the reader to use. Besides its application as a self-learning text, this book can support lectures on R at any level from beginner to advanced. This book can serve as a textbook on R for beginners as well as more advanced users, working on Windows, macOS or Linux OSes. The first part of the book deals with the heart of the R language and its fundamental concepts, including data organization, import and export, various manipulations, documentation, plots, programming and maintenance. The last chapter in this part deals with object-oriented programming as well as interfacing R with C/C++ or Fortran, and contains a section on debugging techniques. This is followed by the second part of the book, which provides detailed explanations on how to perform many standard statistical analyses, mainly in the Biostatistics field. Topics from mathematical and statistical settings that are included are matrix operations, integration, o...

  11. Systematic Analysis Method of Shear-Wave Splitting:SAM Software System

    Institute of Scientific and Technical Information of China (English)

    Gao Yuan; Liu Xiqiang; Liang Wei; Hao Ping

    2004-01-01

    In order to make more effective use of the data from regional digital seismograph networks and to promote the study of shear-wave splitting and its application to earthquake stress forecasting, the SAM software system, i.e., software for the systematic analysis method of shear-wave splitting, has been developed. This paper introduces the design aims, system structure, functions and characteristics of the SAM software system and shows some graphical interfaces for data input and result output. Lastly, it gives a preliminary discussion of the study of shear-wave splitting and its application to earthquake forecasting.

  12. Space Suit Spins

    Science.gov (United States)

    2005-01-01

    Space is a hostile environment where astronauts combat extreme temperatures, dangerous radiation, and a near-breathless vacuum. Life support in these unforgiving circumstances is crucial and complex, and failure is not an option for the devices meant to keep astronauts safe in an environment that presents constant opposition. A space suit must meet stringent requirements for life support. The suit has to be made of durable material to withstand the impact of space debris and protect against radiation. It must provide essential oxygen, pressure, heating, and cooling while retaining mobility and dexterity. It is not a simple article of clothing but rather a complex modern armor that the space explorers must don if they are to continue exploring the heavens

  13. Microstructural analysis of quartz grains in Vasyugan suite sandstones of layer Ui1-21 in Kazanskoe deposit

    International Nuclear Information System (INIS)

    Microstructural analysis of quartz grains in sandstones revealed preferred directions which define and influence porosity and permeability anisotropy in oil and gas reservoirs. In this research, we investigated the Upper Jurassic sandstone reservoir sediments from 14 wells in the Kazanskoe field. The authors studied the orientation of elongated quartz grains and intergranular fractures within grains, as well as the pore space in oriented thin sections of sandstones. The analysis of elongated quartz grains in the bedding plane showed three main types of preferred directions in quartz grain orientation along different axes in sandstone reservoirs. The obtained results allow identifying the variability of facies and the dynamic depositional environment of the Upper Jurassic sandstone formation. Subsequently, these results can be used in field modeling, as well as in pattern optimization of injection and production wells.

  14. A suite of Gateway® cloning vectors for high-throughput genetic analysis in Saccharomyces cerevisiae

    OpenAIRE

    Alberti, Simon; Gitler, Aaron D.; Lindquist, Susan

    2007-01-01

    In the post-genomic era, academic and biotechnological research is increasingly shifting its attention from single proteins to the analysis of complex protein networks. This change in experimental design requires the use of simple and experimentally tractable organisms, such as the unicellular eukaryote Saccharomyces cerevisiae, and a range of new high-throughput techniques. The Gateway® system has emerged as a powerful high-throughput cloning method that allows for the in vitro recombination...

  15. Development of high performance casting analysis software by coupled parallel computation

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Up to now, much casting analysis software has continued to develop new ways of accessing real casting processes. These include melt flow analysis, heat transfer analysis for solidification calculation, mechanical property predictions and microstructure predictions. These trials were successful in obtaining results close to real situations, so that CAE technologies have become indispensable for designing or developing new casting processes. But in manufacturing fields, CAE technologies are not used so frequently because of the difficulty of using the software or insufficient computing performance. To introduce CAE technologies to the manufacturing field, high-performance analysis is essential to shorten the gap between product design time and prototyping time. Software code optimization can be helpful, but it is not enough, because the codes developed by software experts are already well optimized. As an alternative approach to high-performance computation, parallel computation technologies are eagerly being applied to CAE technologies to shorten the analysis time. In this research, SMP (Shared Memory Processing) and MPI (Message Passing Interface) methods for parallelization were applied to the commercial software "Z-Cast" to calculate the casting processes. In the code parallelization process, network stabilization and core optimization were also carried out under the Microsoft Windows platform, and their performance and results were compared with those of normal sequential analysis codes.

  16. Spectral graph theory analysis of software-defined networks to improve performance and security

    OpenAIRE

    Parker, Thomas C.

    2015-01-01

    Software-defined networks are revolutionizing networking by providing unprecedented visibility into and control over data communication networks. The focus of this work is to develop a method to extract network features, develop a closed-loop control framework for a software-defined network, and build a test bed to validate the proposed scheme. The method developed to extract the network features is called the dual-basis analysis, which is based on the eigendecomposition of a weighted graph t...
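
    The kind of spectral feature referred to above can be illustrated with a few lines of NumPy: build a weighted graph Laplacian and take its eigendecomposition. This is a generic sketch of the underlying mathematics, not the thesis' dual-basis code, and the topology is invented.

        # Illustrative sketch: eigendecomposition of a weighted graph Laplacian.
        import numpy as np

        W = np.array([[0, 2, 0, 1],             # symmetric weighted adjacency matrix
                      [2, 0, 3, 0],             # of a small 4-node example network
                      [0, 3, 0, 1],
                      [1, 0, 1, 0]], dtype=float)

        L = np.diag(W.sum(axis=1)) - W          # combinatorial graph Laplacian
        eigenvalues, eigenvectors = np.linalg.eigh(L)

        # The second-smallest eigenvalue (algebraic connectivity) reflects how well
        # connected the topology is; its eigenvector (Fiedler vector) suggests cuts.
        print(np.round(eigenvalues, 3))
        print(np.round(eigenvectors[:, 1], 3))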

  17. Flexible Global Software Development (GSD): Antecedents of Success in Requirements Analysis

    OpenAIRE

    Vanita Yadav; Monica Adya; Varadharajan Sridhar; Dhruv Nath

    2009-01-01

    Globalization of software development has resulted in a rapid shift away from the traditional collocated, on-site development model, to the offshoring model. Emerging trends indicate an increasing interest in offshoring even in early phases like requirements analysis. Additionally, the flexibility offered by the agile development approach makes it attractive for adaptation in globally distributed software work. A question of significance then is what impacts the success of offshoring earlier ...

  18. Software Vulnerability Analysis

    Institute of Scientific and Technical Information of China (English)

    李新明; 李艺; 徐晓梅; 韩存兵

    2003-01-01

    Software vulnerability is the root cause of computer system security problems. Analyzing vulnerability based on the essence of software vulnerability is a new research topic. This paper analyzes the main definitions and taxonomies of vulnerability, studies vulnerability databases and tools for vulnerability analysis and detection, and gives details about what causes the most common vulnerabilities in the LINUX/UNIX operating systems.

  19. Parallel line analysis: multifunctional software for the biomedical sciences

    Science.gov (United States)

    Swank, P. R.; Lewis, M. L.; Damron, K. L.; Morrison, D. R.

    1990-01-01

    An easy to use, interactive FORTRAN program for analyzing the results of parallel line assays is described. The program is menu driven and consists of five major components: data entry, data editing, manual analysis, manual plotting, and automatic analysis and plotting. Data can be entered from the terminal or from previously created data files. The data editing portion of the program is used to inspect and modify data and to statistically identify outliers. The manual analysis component is used to test the assumptions necessary for parallel line assays using analysis of covariance techniques and to determine potency ratios with confidence limits. The manual plotting component provides a graphic display of the data on the terminal screen or on a standard line printer. The automatic portion runs through multiple analyses without operator input. Data may be saved in a special file to expedite input at a future time.
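
    The core parallel-line calculation can be sketched in a few lines: fit both preparations on a log-dose scale with a shared slope, then convert the intercept shift into a relative potency. The sketch below uses NumPy least squares and invented responses; it is not the FORTRAN program described above and omits the covariance-analysis validity tests.

        # Illustrative sketch: log relative potency from a parallel-line fit.
        import numpy as np

        def parallel_line_potency(log_dose_s, resp_s, log_dose_t, resp_t):
            """Return (common slope, log10 relative potency of test vs standard)."""
            x = np.concatenate([log_dose_s, log_dose_t])
            y = np.concatenate([resp_s, resp_t])
            is_test = np.concatenate([np.zeros(len(resp_s)), np.ones(len(resp_t))])
            # Columns: intercept (standard), intercept (test), shared slope.
            design = np.column_stack([1 - is_test, is_test, x])
            (a_s, a_t, b), *_ = np.linalg.lstsq(design, y, rcond=None)
            return b, (a_t - a_s) / b

        log_dose = np.log10([1.0, 2.0, 4.0, 8.0])
        resp_std = np.array([10.1, 14.8, 20.2, 25.1])    # invented responses
        resp_test = np.array([12.9, 18.0, 23.2, 27.9])   # shifted up: more potent
        slope, log_rp = parallel_line_potency(log_dose, resp_std, log_dose, resp_test)
        print(f"relative potency ~ {10 ** log_rp:.2f}")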

  20. Software in military aviation and drone mishaps: Analysis and recommendations for the investigation process

    International Nuclear Information System (INIS)

    Software plays a central role in military systems. It is also an important factor in many recent incidents and accidents. A safety gap is growing between our software-intensive technological capabilities and our understanding of the ways they can fail or lead to accidents. Traditional forms of accident investigation are poorly equipped to trace the sources of software failure; for instance, software does not age in the same way that hardware components fail over time. As such, it can be hard to trace the causes of software failure or the mechanisms by which it contributed to accidents back into the development and procurement chain to address the deeper, systemic causes of potential accidents. To identify some of these failure mechanisms, we examined the database of the Air Force Accident Investigation Board (AIB) and analyzed mishaps in which software was involved. Although we have chosen to focus on military aviation, many of the insights also apply to civil aviation. Our analysis led to several results and recommendations. Some were specific, relating for example to shortcomings in the testing and validation of particular avionic subsystems. Others were broader in scope: for instance, we challenged both aspects of the investigation process and the findings in several cases, and we provided recommendations, technical and organizational, for improvements. We also identified important safety blind spots in the investigations with respect to software, whose contribution to the escalation of the adverse events was often neglected in the accident reports. These blind spots, we argued, constitute an important missed learning opportunity for improving accident prevention, which is especially unfortunate at a time when Remotely Piloted Air Systems (RPAS) are being integrated into the National Airspace. Our findings support the growing recognition that the traditional notion of software failure as non-compliance with requirements is too limited to capture the

  1. A software tool for design of process monitoring and analysis systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    A well designed process monitoring and analysis system is necessary to consistently achieve any predefined end product quality. Systematic computer aided methods and tools provide the means to design the necessary process monitoring and analysis systems and/or to validate any existing monitoring and analysis system. A software to achieve this has been developed. Two developed supporting tools for the design, a knowledge base (consisting of the process knowledge as well as the knowledge on measurement methods & tools) and a model library (consisting of the process operational models), have been extended rigorously and integrated with the user interface, which made the software more generic and applicable to a wide range of problems. The software for the design of a process monitoring and analysis system is presented and illustrated with a tablet manufacturing process example.

  2. Gear Meshing Transmission Analysis of the Automobile Gearbox Based on the Software MASTA

    Directory of Open Access Journals (Sweden)

    Yongxiang Li

    2013-03-01

    Full Text Available As the main drive components of the automobile manual gearbox, the gears and their meshing play an important role in transmission performance. Aiming at the existing problems of traditional gear meshing analysis, this study takes a five-speed gearbox as an example and, based on the MASTA software, a professional CAE package for simulating and analyzing gearboxes, accomplishes the gear mesh analysis of the automobile gearbox. Furthermore, a simulation model of the gearbox is built to simulate the actual load conditions and complete the analysis process for the gears. A new design concept is put forward: using the specialized software MASTA for transmission modeling and simulation analysis can greatly improve the design level of the gearbox, reduce the number of tests and shorten the research and development period. Finally, it can provide a reference for the development and application of new transmission gears.

  3. Detection and Quantification of Nitrogen Compounds in the First Drilled Martian Solid Samples by the Sample Analysis at Mars (SAM) Instrument Suite on the Mars Science Laboratory (MSL)

    Science.gov (United States)

    Stern, Jennifer C.; Navarro-Gonzalez, Rafael; Freissinet, Caroline; McKay, Christopher P.; Archer, P. Douglas, Jr.; Buch, Arnaud; Coll, Patrice; Eigenbrode, Jennifer L.; Franz, Heather B.; Glavin, Daniel P.; Ming, Douglas W.; Steele, Andrew; Szopa, Cyril; Wray, James J.; Conrad, Pamela G.; Mahaffy, Paul R.

    2014-01-01

    The Sample Analysis at Mars (SAM) instrument suite on the Mars Science Laboratory (MSL) Curiosity Rover detected both reduced and oxidized nitrogen-bearing compounds during the pyrolysis of surface materials from three sites at Gale Crater. Preliminary detections of nitrogen species include NO, HCN, ClCN, and TFMA (trifluoro-N-methyl-acetamide). Confirmation of indigenous Martian nitrogen-bearing compounds requires quantifying the N contribution from the terrestrial derivatization reagents carried for SAM's wet chemistry experiment, which contribute to the SAM background. Nitrogen species detected in the SAM solid sample analyses can also be produced during laboratory pyrolysis experiments where these reagents are heated in the presence of perchlorate, a compound that has also been identified by SAM in Mars solid samples.

  4. Detection and Quantification of Nitrogen Compounds in the First Drilled Martian Solid Samples by the Sample Analysis at Mars (SAM) Instrument Suite on the Mars Science Laboratory (MSL)

    Science.gov (United States)

    Stern, J. C.; Navarro-Gonzales, R.; Freissinet, C.; McKay, C. P.; Archer, P. D., Jr.; Buch, A.; Brunner, A. E.; Coll, P.; Eigenbrode, J. L.; Franz, H. B.; Glavin, D. P.; McAdam, A. C.; Ming, D.; Steele, A.; Sutter, B.; Szopa, C.; Wray, J. J.; Conrad, P.; Mahaffy, P. R.

    2014-01-01

    The Sample Analysis at Mars (SAM) instrument suite on the Mars Science Laboratory (MSL) Curiosity Rover detected both reduced and oxidized nitrogen-bearing compounds during the pyrolysis of surface materials at Yellowknife Bay in Gale Crater. Preliminary detections of nitrogen species include NO, HCN, ClCN, CH3CN, and TFMA (trifluoro-N-methyl-acetamide). Confirmation of indigenous Martian N-bearing compounds requires quantifying N contribution from the terrestrial derivatization reagents (e.g. N-methyl-N-tertbutyldimethylsilyltrifluoroacetamide, MTBSTFA and dimethylformamide, DMF) carried for SAM's wet chemistry experiment that contribute to the SAM background. Nitrogen species detected in the SAM solid sample analyses can also be produced during laboratory pyrolysis experiments where these reagents are heated in the presence of perchlorate, a compound that has also been identified by SAM in Mars solid samples.

  5. An Analysis and Design of the Virtual Simulation Software Based on Pattern

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The paper makes a detailed analysis and design of the Vega application software based on the Windows NT platform. It covers object-oriented software analysis and design, design patterns and the Windows kernel mechanism. The paper puts forward a design pattern, the fence pattern, and, building on this pattern and on Windows NT memory-mapped files, presents a Vega application solution based on the multi-process technique. Although the design solution was developed for a real-time simulation system, it rests on a clear analysis of the Vega system and therefore has broad practicability and many uses.

  6. Demonstration of a software design and statistical analysis methodology with application to patient outcomes data sets

    OpenAIRE

    Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard

    2013-01-01

    Purpose: With the emergence of clinical outcomes databases as tools used routinely within institutions comes the need for software tools to support automated statistical analysis of these large data sets and intrainstitutional exchange from independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that address both issues.

  7. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens

    2004-01-01

    This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended to be used as part of an industrial systems design cycle. Structural analysis is a graph-based technique where principal relations between variables express the system...

  8. Scilab and Maxima Environment: Towards Free Software in Numerical Analysis

    Science.gov (United States)

    Mora, Angel; Galan, Jose Luis; Aguilera, Gabriel; Fernandez, Alvaro; Merida, Enrique; Rodriguez, Pedro

    2010-01-01

    In this work we will present the ScilabUMA environment we have developed as an alternative to Matlab. This environment connects Scilab (for numerical analysis) and Maxima (for symbolic computations). Furthermore, the developed interface is, in our opinion at least, as powerful as the interface of Matlab. (Contains 3 figures.)

  9. Field Of View Of A Spacecraft Antenna: Analysis And Software

    Science.gov (United States)

    Wu, Te-Kao; Kipp, R.; Lee, S. W.

    1995-01-01

    Report summarizes computational analysis of the field of view of the rotating elliptical-cross-section parabolic-reflector antenna for the SeaWinds spacecraft. Issues considered include blockage and diffraction by other objects near the antenna, related concerns about electromagnetic interference and electromagnetic compatibility, and how far away and in what configuration other objects must be positioned with respect to the antenna to achieve the required performance.

  10. A Framework for Effective Object-Oriented Software Change Impact Analysis

    Directory of Open Access Journals (Sweden)

    Bassey Isong

    2015-03-01

    Full Text Available Object-oriented (OO) software has complex dependencies and different change types which frequently affect its maintenance in terms of ripple-effect identification, or may introduce faults which are hard to detect. As change is both important and risky, change impact analysis (CIA) is a technique used to preserve the quality of the software system. Several CIA techniques exist, but they provide little or no clear information on OO software system representation for effective change impact prediction. Additionally, OO classes are not fault- or failure-free, and their fault-proneness is not considered during CIA. There is no known CIA approach that incorporates both change impact and fault prediction. Consequently, making changes to software components while neglecting their dependencies and fault-proneness may have unexpected effects on their quality or may increase their failure risks. Therefore, this paper proposes a novel framework for OO software CIA that allows for impact and fault predictions. Moreover, an intermediate OO program representation that explicitly represents the software and allows its structural complexity to be quantified using complex networks is proposed. The objective is to enhance static CIA and facilitate program comprehension. To assess its effectiveness, a controlled experiment was conducted using a student project with respect to maintenance duration and correctness. The results obtained were promising, indicating its importance for impact analysis.

  11. Reliability and accuracy of three different computerized cephalometric analysis software.

    Science.gov (United States)

    Rusu, Oana; Petcu, Ana Elena; Drăgan, Eliza; Haba, Danisia; Moscalu, Mihaela; Zetu, Irina Nicoleta

    2015-01-01

    The aim of this investigation was to determine, compare and evaluate three different computerized tracing programs, in which the lateral cephalograms were digitized on screen. 39 randomly selected cephalometric radiographs were used in the present study. Three programs, Planmeca Romexis® (Romexis 3.2.0, Helsinki, Finland), Orthalis (France) and AxCeph (A.C 2.3.0.74, Ljubljana, Slovenia), were evaluated. 12 skeletal, 9 dental and 3 soft tissue parameters were measured, consisting of 11 linear and 13 angular measurements. Statistical analysis was carried out using multivariate analysis of variance (MANOVA), the Levene test, the Tukey Honestly Significant Difference (HSD) test and the Kruskal-Wallis test. The measurements obtained with the cephalometric analysis programs used in the study were reliable. PMID:25970975
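
    The statistical tests named above are standard and easy to reproduce outside dedicated packages. The following is a minimal, hypothetical sketch (not the study's code or data) of how one cephalometric measurement could be compared across the three programs with a Kruskal-Wallis test in Python using SciPy:

```python
# Minimal sketch (not from the study): comparing one hypothetical
# cephalometric measurement, e.g. the SNA angle, across three tracing
# programs with a Kruskal-Wallis test. All values are made-up placeholders.
from scipy import stats

sna_romexis  = [81.2, 80.5, 82.0, 79.8, 81.6]   # hypothetical measurements
sna_orthalis = [81.0, 80.9, 81.7, 80.1, 81.3]
sna_axceph   = [80.8, 80.2, 81.9, 79.9, 81.5]

h_stat, p_value = stats.kruskal(sna_romexis, sna_orthalis, sna_axceph)
print(f"Kruskal-Wallis H = {h_stat:.3f}, p = {p_value:.3f}")
```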

  12. Implementation of a timeline analysis software for digital forensic investigations

    OpenAIRE

    Nisén, Patrik

    2013-01-01

    Organizations today are trying to manage the many risks they perceive to be threatening the security of their valuable information assets, but often these risks materialize as security incidents. Managing risks proactively is important, but equally important and challenging is to respond efficiently to the incidents that have already occurred, to minimize their impact on business processes. A part of managing security incidents is the technical analysis of any related computer systems, also ...

  13. Nucleonica. Web-based software tools for simulation and analysis

    International Nuclear Information System (INIS)

    The authors present a description of the Nucleonica web-based portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, an intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data resources and some applications is given.

  14. Statistical Analysis for Test Papers with Software SPSS

    Institute of Scientific and Technical Information of China (English)

    张燕君

    2012-01-01

    Test paper evaluation is an important part of test management, and its results are a significant basis for the scientific assessment of teaching and learning. Taking an English test paper from high school students' monthly examination as the object, this paper focuses on the interpretation of SPSS output concerning item-level and whole-paper quantitative analysis. By analyzing and evaluating the papers, teachers obtain feedback with which to check students' progress and adjust their teaching process.

  15. HERMES: A user-friendly connectivity analysis software

    OpenAIRE

    Niso Galán, Julia Guiomar; Bruña Fernandez, Ricardo; Pereda, Ernesto; Gutierrez, Ricardo; Bajo Breton, Ricardo; Maestú, Fernando; Pozo Guerrero, Francisco del

    2012-01-01

    The analysis of the interdependence between time series has become an important field of research, mainly as a result of advances in the characterization of dynamical systems from the signals they produce, and the introduction of concepts such as Generalized (GS) and Phase synchronization (PS). This increase in the number of approaches to tackle the existence of the so-called functional (FC) and effective connectivity (EC) (Friston 1994) between two, (or among many) neural networks, along wit...
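
    Phase synchronization (PS) indices of the kind HERMES computes can be illustrated in a few lines. The sketch below is a hypothetical Python example, not HERMES code: it estimates the phase-locking value between two noisy synthetic signals via the Hilbert transform.

```python
# Minimal sketch of one phase-synchronization index, the phase-locking
# value (PLV), between two signals; illustrative only, not HERMES code.
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 2000)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * rng.standard_normal(t.size)
y = np.sin(2 * np.pi * 5 * t + 0.3) + 0.5 * rng.standard_normal(t.size)

phase_x = np.angle(hilbert(x))          # instantaneous phases
phase_y = np.angle(hilbert(y))
plv = np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))
print(f"PLV = {plv:.3f}")               # 1 = perfect phase locking, 0 = none
```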

  16. PyPWA: A partial-wave/amplitude analysis software framework

    Science.gov (United States)

    Salgado, Carlos

    2016-05-01

    The PyPWA project aims to develop a software framework for Partial Wave and Amplitude Analysis of data, providing the user with software tools to identify resonances from multi-particle final states in photoproduction. Most of the code is written in Python. The software is divided into two main branches: a general shell where amplitude parameters (or any parametric model) are estimated from the data. This branch also includes software to produce simulated data sets using the fitted amplitudes. A second branch contains a specific realization of the isobar model (with room to include Deck-type and other isobar model extensions) to perform PWA with an interface into the computing resources at Jefferson Lab. We are currently implementing parallelism and vectorization using Intel's Xeon Phi family of coprocessors.
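
    The core task described above — estimating the parameters of a parametric intensity model from event data — can be illustrated with a toy unbinned maximum-likelihood fit. The sketch below is a hypothetical Python example and does not use PyPWA's actual API; the angular model and the data are invented.

```python
# Minimal sketch of fitting a parametric model to event data by
# unbinned maximum likelihood, the general task PyPWA addresses.
# The cos(theta) intensity model and "events" are hypothetical.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
cos_theta = rng.uniform(-1, 1, 5000)          # fake event angles

def intensity(c, a):
    """Toy angular intensity I(cos_theta) = 1 + a*cos_theta**2 (unnormalized)."""
    return 1.0 + a * c**2

def neg_log_likelihood(params):
    a = params[0]
    norm = 2.0 + 2.0 * a / 3.0                # integral of I over [-1, 1]
    return -np.sum(np.log(intensity(cos_theta, a) / norm))

fit = minimize(neg_log_likelihood, x0=[0.5], bounds=[(-0.9, 10.0)])
print("fitted a =", fit.x[0])
```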

  17. Software Safety Analysis of a Flight Guidance System

    Science.gov (United States)

    Butler, Ricky W. (Technical Monitor); Tribble, Alan C.; Miller, Steven P.; Lempia, David L.

    2004-01-01

    This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified and the presence of the safety properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of the formal-methods analysis technique, model checking, which was used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.

  18. Learning DHTMLX suite UI

    CERN Document Server

    Geske, Eli

    2013-01-01

    A fast-paced, example-based guide to learning DHTMLX. "Learning DHTMLX Suite UI" is for web designers who have a basic knowledge of JavaScript and who are looking for powerful tools that will give them an extra edge in their own application development. This book is also useful for experienced developers who wish to get started with DHTMLX without going through the trouble of learning its quirks through trial and error. Readers are expected to have some knowledge of JavaScript, HTML, Document Object Model, and the ability to install a local web server.

  19. The Implication of Using NVivo Software in Qualitative Data Analysis: Evidence-Based Reflections.

    Science.gov (United States)

    Zamawe, F C

    2015-03-01

    For a long time, electronic data analysis has been associated with quantitative methods. However, Computer Assisted Qualitative Data Analysis Software (CAQDAS) is increasingly being developed. Although CAQDAS has been around for decades, very few qualitative health researchers report using it. This may be due to the difficulties that one has to go through to master the software and the misconceptions associated with using CAQDAS. While the issue of mastering CAQDAS has received ample attention, little has been done to address the misconceptions associated with it. In this paper, the author reflects on his experience of interacting with one of the popular CAQDAS packages (NVivo) in order to provide evidence-based implications of using the software. The key message is that, unlike statistical software, the main function of CAQDAS is not to analyse data but rather to aid the analysis process, which the researcher must always remain in control of. In other words, researchers must equally know that no software can analyse qualitative data. CAQDAS packages are basically data management tools, which support the researcher during analysis.

  20. Felyx : A Free Open Software Solution for the Analysis of Large Earth Observation Datasets

    Science.gov (United States)

    Piolle, Jean-Francois; Shutler, Jamie; Poulter, David; Guidetti, Veronica; Donlon, Craig

    2014-05-01

    The GHRSST project, by assembling large collections of earth observation data from various sources and agencies, has also raised the need to provide the user community with tools to inter-compare these data and to assess and monitor their quality. The ESA/Medspiration project, which implemented the first operating node of the GHRSST system for Europe, also paved the way towards such generic analytics tools by developing the High Resolution Diagnostic Dataset System (HR-DDS) and Satellite to In situ Multi-sensor Match-up Databases. Building on this heritage, ESA is now funding the development by IFREMER, PML and Pelamis of felyx, a web tool merging the two capabilities into a single software solution. It consists of a free, open software solution, written in Python and JavaScript, whose aim is to provide Earth Observation data producers and users with an open-source, flexible and reusable tool that allows the quality and performance of data streams (satellite, in situ and model) to be easily monitored and studied. The primary concept of felyx is to work as an extraction tool, subsetting source data over predefined target areas (which can be static or moving): these data subsets, and associated metrics, can then be accessed by users or client applications either as raw files, as automatic alerts and reports generated periodically, or through a flexible web interface enabling statistical analysis and visualization. Felyx presents itself as an open-source suite of tools, written in Python and JavaScript, enabling: * subsetting large local or remote collections of Earth Observation data over predefined sites (geographical boxes) or moving targets (ship, buoy, hurricane), storing locally the extracted data (referred to as miniProds). These miniProds constitute a much smaller representative subset of the original collection on which one can perform any kind of processing or assessment without having to cope with heavy volumes of data. * computing statistical metrics over these
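
    The extraction concept described above — subsetting a source data stream over a predefined box and deriving metrics from the resulting miniProd — can be sketched in a few lines. The example below is illustrative only and is not felyx code; the file name and the "sst" variable name are assumptions.

```python
# Minimal sketch of a felyx-style extraction: subset a gridded field
# over a predefined box, compute summary metrics, and store the subset.
# File name and variable name ("sst") are hypothetical.
import xarray as xr

ds = xr.open_dataset("sst_granule.nc")                        # hypothetical input file
box = ds["sst"].sel(lat=slice(40, 45), lon=slice(-10, -5))    # the "miniProd"

metrics = {
    "mean": float(box.mean()),
    "std": float(box.std()),
    "count": int(box.count()),
}
box.to_netcdf("miniprod_subset.nc")                           # store the extracted subset
print(metrics)
```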

  1. Analysis of Software Test Item Generation- Comparison Between High Skilled and Low Skilled Engineers

    Institute of Scientific and Technical Information of China (English)

    Masayuki Hirayama; Osamu Mizuno; Tohru Kikuno

    2005-01-01

    Recent software systems contain many functions to provide various services. Given this tendency, it is difficult to ensure software quality and to eliminate crucial faults with conventional software testing methods. Taking the effect of the test engineer's skill on test item generation into consideration, we propose a new test item generation method, which supports the generation of test items for illegal behavior of the system. The proposed method can generate test items based on use-case analysis, deviation analysis for legal behavior, and fault tree analysis for system fault situations. From the results of the experimental applications of our method, we confirmed that test items for illegal behavior of a system were effectively generated, and also that the proposed method could effectively assist test item generation by an engineer with a low skill level.

  2. OMA and OPA—Software-Supported Mass Spectra Analysis of Native and Modified Nucleic Acids

    Science.gov (United States)

    Nyakas, Adrien; Blum, Lorenz C.; Stucki, Silvan R.; Reymond, Jean-Louis; Schürch, Stefan

    2013-02-01

    The platform-independent software package consisting of the oligonucleotide mass assembler (OMA) and the oligonucleotide peak analyzer (OPA) was created to support the analysis of oligonucleotide mass spectra. It calculates all theoretically possible fragments of a given input sequence and annotates them on an experimental spectrum, thus saving a large amount of manual processing time. The software performs analysis of precursor and product ion spectra of oligonucleotides and their analogues comprising user-defined modifications of the backbone, the nucleobases, or the sugar moiety, as well as adducts with metal ions or drugs. The ability to expand the library of building blocks and to implement individual structural variations makes it extremely useful for supporting the analysis of therapeutically active compounds. The functionality of the software tool is demonstrated on the examples of a platinated double-stranded oligonucleotide and a modified RNA sequence. Experiments also reveal the unique dissociation behavior of platinated higher-order DNA structures.

  3. Automated structure solution with the PHENIX suite

    Energy Technology Data Exchange (ETDEWEB)

    Terwilliger, Thomas C [Los Alamos National Laboratory; Zwart, Peter H [LBNL; Afonine, Pavel V [LBNL; Grosse - Kunstleve, Ralf W [LBNL

    2008-01-01

    Significant time and effort are often required to solve and complete a macromolecular crystal structure. The development of automated computational methods for the analysis, solution, and completion of crystallographic structures has the potential to produce minimally biased models in a short time without the need for manual intervention. The PHENIX software suite is a highly automated system for macromolecular structure determination that can rapidly arrive at an initial partial model of a structure without significant human intervention, given moderate-resolution and good-quality data. This achievement has been made possible by the development of new algorithms for structure determination, maximum-likelihood molecular replacement (PHASER), heavy-atom search (HySS), template- and pattern-based automated model-building (RESOLVE, TEXTAL), automated macromolecular refinement (phenix.refine), and iterative model-building, density modification and refinement that can operate at moderate resolution (RESOLVE, AutoBuild). These algorithms are based on a highly integrated and comprehensive set of crystallographic libraries that have been built and made available to the community. The algorithms are tightly linked and made easily accessible to users through the PHENIX Wizards and the PHENIX GUI.

  4. Automated Structure Solution with the PHENIX Suite

    Energy Technology Data Exchange (ETDEWEB)

    Zwart, Peter H.; Zwart, Peter H.; Afonine, Pavel; Grosse-Kunstleve, Ralf W.; Hung, Li-Wei; Ioerger, Tom R.; McCoy, A.J.; McKee, Eric; Moriarty, Nigel; Read, Randy J.; Sacchettini, James C.; Sauter, Nicholas K.; Storoni, L.C.; Terwilliger, Tomas C.; Adams, Paul D.

    2008-06-09

    Significant time and effort are often required to solve and complete a macromolecular crystal structure. The development of automated computational methods for the analysis, solution and completion of crystallographic structures has the potential to produce minimally biased models in a short time without the need for manual intervention. The PHENIX software suite is a highly automated system for macromolecular structure determination that can rapidly arrive at an initial partial model of a structure without significant human intervention, given moderate resolution and good quality data. This achievement has been made possible by the development of new algorithms for structure determination, maximum-likelihood molecular replacement (PHASER), heavy-atom search (HySS), template and pattern-based automated model-building (RESOLVE, TEXTAL), automated macromolecular refinement (phenix.refine), and iterative model-building, density modification and refinement that can operate at moderate resolution (RESOLVE, AutoBuild). These algorithms are based on a highly integrated and comprehensive set of crystallographic libraries that have been built and made available to the community. The algorithms are tightly linked and made easily accessible to users through the PHENIX Wizards and the PHENIX GUI.

  5. Long Term Preservation of Data Analysis Software at the NASA/IPAC Infrared Science Archive

    OpenAIRE

    Teplitz, Harry I.; Groom, Steven; Brooke, Timothy; Desai, Vandana; Engler, Diane; Fowler, John; Good, John; Khan, Iffat; Levine, Deborah; Alexov, Anastasia

    2012-01-01

    The NASA/IPAC Infrared Science Archive (IRSA) curates both data and analysis tools from NASA's infrared missions. As part of our primary goal, we provide long term access to mission-specific software from projects such as IRAS and Spitzer. We will review the efforts by IRSA (and within the greater IPAC before that) to keep the IRAS and Spitzer software tools current and available. Data analysis tools are a vital part of the Spitzer Heritage Archive. The IRAS tools HIRES and SCANPI have been i...

  6. Verification Problems of Nuclear Installations Safety Software of Strength Analysis (NISS SA)

    International Nuclear Information System (INIS)

    The use of software in Ukraine for strength analysis of nuclear installation systems and elements is nowadays rather limited owing to the absence of a uniform concept for their verification and application and to the complexity of methodological guides. Great importance is given to the creation of a certification methodology, with numerical studies allowing reliability estimation criteria, common to all nuclear installation safety software for strength analysis (NISS SA), to be established. Two examples of such studies are given: the dependence of FEM solution convergence on the number of degrees of freedom of the discrete models, and the influence of the FEM equation formulation methods on the character of solution convergence.

  7. Software systems for processing and analysis at the NOVA high-energy laser facility

    International Nuclear Information System (INIS)

    A typical laser interaction experiment at the NOVA high-energy laser facility produces in excess of 20 Mbytes of digitized data. Extensive processing and analysis of this raw data from a wide variety of instruments is necessary to produce results that can be readily used to interpret the experiment. Using VAX-based computer hardware, software systems have been set up to convert the digitized instrument output to physics quantities describing the experiment. A relational data-base management system is used to coordinate all levels of processing and analysis. Software development emphasizes structured design, flexibility, automation, and ease of use

  8. Explicet: graphical user interface software for metadata-driven management, analysis and visualization of microbiome data.

    Science.gov (United States)

    Robertson, Charles E; Harris, J Kirk; Wagner, Brandie D; Granger, David; Browne, Kathy; Tatem, Beth; Feazel, Leah M; Park, Kristin; Pace, Norman R; Frank, Daniel N

    2013-12-01

    Studies of the human microbiome, and microbial community ecology in general, have blossomed of late and are now a burgeoning source of exciting research findings. Along with the advent of next-generation sequencing platforms, which have dramatically increased the scope of microbiome-related projects, several high-performance sequence analysis pipelines (e.g. QIIME, MOTHUR, VAMPS) are now available to investigators for microbiome analysis. The subject of our manuscript, the graphical user interface-based Explicet software package, fills a previously unmet need for a robust, yet intuitive means of integrating the outputs of the software pipelines with user-specified metadata and then visualizing the combined data. PMID:24021386

  9. Lessons Learned from Application of System and Software Level RAMS Analysis to a Space Control System

    Science.gov (United States)

    Silva, N.; Esper, A.

    2012-01-01

    The work presented in this article represents the results of applying RAMS analysis to a critical space control system, both at system and software levels. The system level RAMS analysis allowed the assignment of criticalities to the high level components, which was further refined by a tailored software level RAMS analysis. The importance of the software level RAMS analysis in the identification of new failure modes and its impact on the system level RAMS analysis is discussed. Recommendations of changes in the software architecture have also been proposed in order to reduce the criticality of the SW components to an acceptable minimum. The dependability analysis was performed in accordance with ECSS-Q-ST-80, which had to be tailored and complemented in some aspects. This tailoring is also detailed in the article, and lessons learned from its application are shared, stating the importance to space systems safety evaluations. The paper presents the applied techniques, the relevant results obtained, the effort required for performing the tasks and the planned strategy for ROI estimation, as well as the soft skills required and acquired during these activities.

  10. Integrated software for imaging data analysis applied to edge plasma physic and operational safety

    Energy Technology Data Exchange (ETDEWEB)

    Martin, V.; Moncada, V. [CEA, IRFM, F-13108 Saint Paul-lez-Durance (France); Dunand, G. [Sophia Conseil Company, F-06560 Sophia Antipolis (France); Corre, Y.; Delchambre, E. [CEA, IRFM, F-13108 Saint Paul-lez-Durance (France); Travere, J.M., E-mail: jean-marcel.travere@cea.fr [CEA, IRFM, F-13108 Saint Paul-lez-Durance (France)

    2011-06-15

    Fusion tokamaks are complex devices requiring many diagnostics for real-time control of the plasma and off-line physical analysis. In current tokamaks, imaging diagnostics have become increasingly used for these two purposes. Such systems produce a lot of data, encouraging physicists to use shared tools and codes for data access and analysis. While general-purpose software programs for data display and analysis are widespread, a need exists to develop similar applications for quantitative imaging data analysis applied to plasma physics. In this paper, we introduce a new integrated software program, named WOLFF, dedicated to this task. The main contribution of this software is to gather under the same framework different functionalities for (1) data access and display, (2) signal, image, and video processing, and (3) quantitative analysis based on physical models. After an overview of existing solutions for data processing in the field of plasma data, we present the WOLFF architecture and its currently implemented features. The capabilities of the software are then demonstrated through three applications in the field of physical analysis (heat and particle flux calculations) and tokamak operational safety.

  11. Software Tools for Robust Analysis of High-Dimensional Data

    Directory of Open Access Journals (Sweden)

    Valentin Todorov

    2014-06-01

    Full Text Available The present work discusses robust multivariate methods specifically designed for high dimensions. Their implementation in R is presented and their application is illustrated on examples. The first group are algorithms for outlier detection, already introduced elsewhere and implemented in other packages. The value added of the new package is that all methods follow the same design pattern and thus can use the same graphical and diagnostic tools. The next topic covered is sparse principal components, including an object-oriented interface to the standard method proposed by Zou, Hastie, and Tibshirani (2006) and the robust one proposed by Croux, Filzmoser, and Fritz (2013). Robust partial least squares (see Hubert and Vanden Branden 2003) as well as partial least squares for discriminant analysis conclude the scope of the new package.

  12. Safety analysis of the software of the power density limiting computer (DNB-module) at Grafenrheinfeld

    International Nuclear Information System (INIS)

    This report starts with a brief description of the computer system used at Grafenrheinfeld for limiting the power density in the reactor core (DNB-Module), followed by a discussion of methods for verification of the system's software. The different language levels on which analysis may be performed are presented, and criteria for choosing the appropriate language level with regard to verification are given. The program analysis splits the software into sections which are manageable and testable. Then the detailed structure of control flow and data flow as well as the functional properties of these sections are investigated. Under certain premises, completely analysed sections can be regarded as error-free, i.e. their failure probability can be assumed to be zero. In addition, test cases for the various software sections are established on the basis of the program analysis. The test strategy comprises the module (sectional) test, partially performed with the test cases mentioned above, as well as an integration test of the entire DNB software. Both tests may be performed automatically - the test embedment is already established to a certain extent - thus allowing execution of a great number of tests; a feature which is extremely important in view of an assessment of quantitative software parameters. (orig.)

  13. Anthropometric Accommodation in Space Suit Design

    Science.gov (United States)

    Rajulu, Sudhakar; Thaxton, Sherry

    2007-01-01

    Design requirements for next generation hardware are in process at NASA. Anthropometry requirements are given in terms of minimum and maximum sizes for critical dimensions that hardware must accommodate. These dimensions drive vehicle design and suit design, and implicitly have an effect on crew selection and participation. At this stage in the process, stakeholders such as cockpit and suit designers were asked to provide lists of dimensions that will be critical for their design. In addition, they were asked to provide technically feasible minimum and maximum ranges for these dimensions. Using an adjusted 1988 Anthropometric Survey of U.S. Army (ANSUR) database to represent a future astronaut population, the accommodation ranges provided by the suit critical dimensions were calculated. This project involved participation from the Anthropometry and Biomechanics facility (ABF) as well as suit designers, with suit designers providing expertise about feasible hardware dimensions and the ABF providing accommodation analysis. The initial analysis provided the suit design team with the accommodation levels associated with the critical dimensions provided early in the study. Additional outcomes will include a comparison of principal components analysis as an alternate method for anthropometric analysis.

  14. Computer Software for Design, Analysis and Control of Fluid Power Systems

    DEFF Research Database (Denmark)

    Conrad, Finn; Sørensen, Torben; Grahl-Madsen, Mads

    1999-01-01

    This Deliverable presents contributions from SWING's Task 2.3, Analysis of available software solutions. The Deliverable focuses on the results from this analysis, having in mind the task objective to carry out a thorough analysis of the state-of-the-art solutions for fluid power systems modelling...... and modelling IT tools in the implementation planning (WP3) and pilot implementation (WP4), in particular the training programme for key people in the individual SME and/or cluster....

  15. LEVERAGING BIG DATA FOR SUSTAINED COMPETITIVE ADVANTAGE; A STRATEGIC ANALYSIS OF A FINANCIAL SOFTWARE COMPANY

    OpenAIRE

    Jean-Philippe Aubin

    2014-01-01

    This paper is a strategic analysis of Dataphile, a financial software company located in Vancouver, British Columbia. The purpose of this paper is to investigate Dataphile and present strategic options on how it can leverage new trends for sustained competitive advantage. It conducts an external and internal analysis of the company and describes its current strategic positioning. The analysis then provides Dataphile with various strategic alternatives it could pursue and proposes a recommenda...

  16. Computing and software

    Directory of Open Access Journals (Sweden)

    White, G. C.

    2004-06-01

    Full Text Available The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up–to–date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementation of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non–linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK require that each new model's likelihood must be programmed specifically for that model. They wishfully think that future software might allow the user to combine

  17. [Application of Stata software to test heterogeneity in meta-analysis method].

    Science.gov (United States)

    Wang, Dan; Mou, Zhen-yun; Zhai, Jun-xia; Zong, Hong-xia; Zhao, Xiao-dong

    2008-07-01

    To introduce the application of Stata software to heterogeneity test in meta-analysis. A data set was set up according to the example in the study, and the corresponding commands of the methods in Stata 9 software were applied to test the example. The methods used were Q-test and I2 statistic attached to the fixed effect model forest plot, H statistic and Galbraith plot. The existence of the heterogeneity among studies could be detected by Q-test and H statistic and the degree of the heterogeneity could be detected by I2 statistic. The outliers which were the sources of the heterogeneity could be spotted from the Galbraith plot. Heterogeneity test in meta-analysis can be completed by the four methods in Stata software simply and quickly. H and I2 statistics are more robust, and the outliers of the heterogeneity can be clearly seen in the Galbraith plot among the four methods.
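
    The heterogeneity statistics mentioned above have simple closed forms (Cochran's Q, H = sqrt(Q/df), I² = (Q − df)/Q), so they can be reproduced outside Stata. The sketch below is an illustrative Python version with made-up study data; it is not the Stata commands used in the paper.

```python
# Minimal sketch of the heterogeneity statistics discussed above
# (Cochran's Q, H and I^2) computed from per-study effect sizes and
# standard errors; the numbers are made-up, and this is not Stata code.
import numpy as np

effects = np.array([0.30, 0.10, 0.45, 0.25, 0.05])   # hypothetical study effects
se = np.array([0.12, 0.10, 0.15, 0.11, 0.09])        # hypothetical standard errors

w = 1.0 / se**2                                      # inverse-variance weights
pooled = np.sum(w * effects) / np.sum(w)             # fixed-effect pooled estimate
q = np.sum(w * (effects - pooled) ** 2)              # Cochran's Q
df = len(effects) - 1
h = np.sqrt(q / df)                                  # H statistic
i2 = max(0.0, (q - df) / q) * 100                    # I^2 in percent
print(f"Q = {q:.2f}, H = {h:.2f}, I2 = {i2:.1f}%")
```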

  18. Proceedings Fifth Workshop on Formal Languages and Analysis of Contract-Oriented Software

    CERN Document Server

    Pimentel, Ernesto

    2011-01-01

    This volume consists of the proceedings of the 5th Workshop on Formal Languages and Analysis of Contract-Oriented Software (FLACOS'11). The FLACOS Workshops serve as annual meeting places to bring together researchers and practitioners working on language-based solutions to contract-oriented software development. High-level models of contracts are needed as a tool to negotiate contracts and provide services conforming to them. This Workshop provides language-based solutions to the above issues through formalization of contracts, design of appropriate abstraction mechanisms, and formal analysis of contract languages and software. The program of this edition consists of 5 regular papers and 3 invited presentations. Detailed information about the FLACOS 2011 Workshop can be found at http://flacos2011.lcc.uma.es/. The 5th edition of the FLACOS Workshop was organized by the University of Málaga. It took place in Málaga, Spain, during September 22-23, 2011.

  19. Featureous: infrastructure for feature-centric analysis of object-oriented software

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    The decentralized nature of collaborations between objects in object-oriented software makes it difficult to understand how user-observable program features are implemented and how their implementations relate to each other. It is worthwhile to improve this situation, since feature-centric program understanding and modification are essential during software evolution and maintenance. In this paper, we present an infrastructure built on top of the NetBeans IDE called Featureous that allows for rapid construction of tools for feature-centric analysis of object-oriented software. Our infrastructure encompasses a lightweight feature location mechanism, a number of analytical views and an API allowing for addition of third-party extensions. To form a common conceptual framework for future feature-centric extensions, we propose to structure feature-centric analysis along three dimensions: perspective...

  20. [Signal Processing Suite Design

    Science.gov (United States)

    Sahr, John D.; Mir, Hasan; Morabito, Andrew; Grossman, Matthew

    2003-01-01

    Our role in this project was to participate in the design of the signal processing suite to analyze plasma density measurements on board a small constellation (3 or 4) of satellites in Low Earth Orbit. As we are new to spacecraft experiments, one of the challenges was simply to gain an understanding of the quantity of data which would flow from the satellites, and possibly to interact with the design teams in generating optimal sampling patterns. For example, as the fleet of satellites was intended to fly through the same volume of space (displaced slightly in time and space), the bulk plasma structure should be common among the spacecraft. Therefore, an optimal, limited-bandwidth data downlink would take advantage of this commonality. Also, motivated by techniques in ionospheric radar, we hoped to investigate the possibility of employing aperiodic sampling in order to gain access to a wider spatial spectrum without suffering aliasing in k-space.

  1. Clementine sensor suite

    Energy Technology Data Exchange (ETDEWEB)

    Ledebuhr, A.G. [Lawrence Livermore National Lab., CA (United States)

    1994-11-15

    LLNL designed and built the suite of six miniaturized light-weight space-qualified sensors utilized in the Clementine mission. A major goal of the Clementine program was to demonstrate technologies originally developed for Ballistic Missile Defense Organization Programs. These sensors were modified to gather data from the moon. This overview presents each of these sensors and some preliminary on-orbit performance estimates. The basic subsystems of these sensors include optical baffles to reject off-axis stray light, light-weight ruggedized optical systems, filter wheel assemblies, radiation tolerant focal plane arrays, radiation hardened control and readout electronics and low mass and power mechanical cryogenic coolers for the infrared sensors. Descriptions of each sensor type are given along with design specifications, photographs and on-orbit data collected.

  2. Public-domain software for root image analysis

    Directory of Open Access Journals (Sweden)

    Mirian Cristina Gomes Costa

    2014-10-01

    Full Text Available In the search for high efficiency in root studies, computational systems have been developed to analyze digital images. ImageJ and Safira are public-domain systems that may be used for image analysis of washed roots. However, differences in root properties measured using ImageJ and Safira are suspected. This study compared values of root length and surface area obtained with the public-domain systems with values obtained by a reference method. Root samples were collected in a banana plantation in an area of a shallower Typic Carbonatic Haplic Cambisol (CXk) and an area of a deeper Typic Haplic Ta Eutrophic Cambisol (CXve), at six depths in five replications. Root images were digitized and the systems ImageJ and Safira used to determine root length and surface area. The line-intersect method modified by Tennant was used as reference; values of root length and surface area measured with the different systems were analyzed by Pearson's correlation coefficient and compared by the confidence interval and t-test. Both systems ImageJ and Safira had positive correlation coefficients with the reference method for root length and surface area data in CXk and CXve. The correlation coefficient ranged from 0.54 to 0.80, with the lowest value observed for ImageJ in the measurement of the surface area of roots sampled in CXve. The CI (95 %) revealed that root length measurements with Safira did not differ from those with the reference method in CXk (-77.3 to 244.0 mm). Regarding surface area measurements, Safira did not differ from the reference method for samples collected in CXk (-530.6 to 565.8 mm²) as well as in CXve (-4231 to 612.1 mm²). However, measurements with ImageJ were different from those obtained by the reference method, underestimating length and surface area in samples collected in CXk and CXve. Both ImageJ and Safira allow an identification of increases or decreases in root length and surface area. However, Safira results for root length and surface area are
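
    The agreement analysis described above (Pearson correlation and a paired comparison against the line-intersect reference) is straightforward to reproduce. The sketch below is a hypothetical Python illustration with made-up numbers, not the study's data or code.

```python
# Minimal sketch of the agreement analysis described above: Pearson
# correlation and a paired t-test between a software measurement and
# the reference (line-intersect) method. Values are made-up placeholders.
from scipy import stats

reference = [120.0, 95.5, 210.3, 75.2, 160.8]   # hypothetical root lengths (mm)
software  = [118.4, 97.1, 205.9, 70.6, 155.2]   # hypothetical image-analysis output

r, p_corr = stats.pearsonr(reference, software)
t, p_diff = stats.ttest_rel(reference, software)
print(f"Pearson r = {r:.2f} (p = {p_corr:.3f}); paired t = {t:.2f} (p = {p_diff:.3f})")
```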

  3. Methodology for object-oriented real-time systems analysis and design: Software engineering

    Science.gov (United States)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly 'seamlessly' from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structuring of the system so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation when the original specification and perhaps high-level design is non-object oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions which emphasizes data and control flows followed by the abstraction of objects where the operations or methods of the objects correspond to processes in the data flow diagrams and then design in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects) each having its own time-behavior defined by a set of states and state-transition rules and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly-connected models which progress from the object-oriented real-time systems analysis and design system analysis logical models through the physical architectural models and the high-level design stages. The methodology is appropriate to the overall specification including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.

  4. Future space suit design considerations.

    Science.gov (United States)

    1991-07-01

    Future space travel to the moon and Mars will present new challenges in space suit design. This paper examines the impact that working on the surface environment of the moon and Mars will have on the requirements of space suits. In particular, habitat pressures will impact suit weight and design. Potential structural materials are explored, as are the difficulties in designing a suit to withstand the severe dust conditions expected.

  5. A document-driven method for certifying scientific computing software for use in nuclear safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Smith, W. Spencer; Koothoor, Mimitha [Computing and Software Department, McMaster University, Hamilton (Canada)

    2016-04-15

    This paper presents a documentation and development method to facilitate the certification of scientific computing software used in the safety analysis of nuclear facilities. To study the problems faced during quality assurance and certification activities, a case study was performed on legacy software used for thermal analysis of a fuel pin in a nuclear reactor. Although no errors were uncovered in the code, 27 issues of incompleteness and inconsistency were found with the documentation. This work proposes that software documentation follow a rational process, which includes a software requirements specification following a template that is reusable, maintainable, and understandable. To develop the design and implementation, this paper suggests literate programming as an alternative to traditional structured programming. Literate programming allows for documenting of numerical algorithms and code together in what is termed the literate programmer's manual. This manual is developed with explicit traceability to the software requirements specification. The traceability between the theory, numerical algorithms, and implementation facilitates achieving completeness and consistency, as well as simplifies the process of verification and the associated certification.

  6. Software and Database Usage on Metabolomic Studies: Using XCMS on LC-MS Data Analysis

    Directory of Open Access Journals (Sweden)

    Mustafa Celebier

    2014-04-01

    Full Text Available The metabolome is the complete set of small-molecule metabolites to be found in a cell or a single organism. Metabolomics is the scientific study that determines and identifies the chemicals in the metabolome with advanced analytical techniques. Nowadays, elucidating the molecular mechanism of a disease with genome analysis and proteome analysis alone is not sufficient; a holistic assessment that includes metabolomic studies provides more rational and accurate results. Metabolite levels in an organism are associated with cellular functions, so determining metabolite amounts identifies the phenotype of a cell or tissue related to genetic and other variations. Even though the analysis of metabolites for medical diagnosis and therapy has been performed for a long time, studies to improve analysis methods for metabolite profiling have increased recently. The applications of metabolomics include the identification of biomarkers, enzyme-substrate interactions, drug-activity studies, metabolic pathway analysis and other studies related to systems biology. Preprocessing and computing the data obtained from LC-MS, GC-MS, CE-MS and NMR for metabolite profiling help avoid time-consuming manual data analysis and possible random errors during profiling. In addition, such preprocessing allows low-abundance metabolites, which cannot be analyzed by manual processing, to be identified. Therefore, the use of software and databases for this purpose cannot be ignored. In this study, the software and databases used in metabolomics are briefly presented and their capabilities for metabolite profiling are evaluated. In particular, the performance of one of the most popular software packages, XCMS, in the evaluation of LC-MS results for metabolomics is reviewed. In the near future, metabolomics with software and database support is estimated to be a routine
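
    XCMS itself is an R/Bioconductor package, so the sketch below is only a rough Python analogue of its central step — automated peak detection on a chromatogram — applied to a synthetic signal; it is not XCMS code.

```python
# Minimal sketch of the kind of peak detection XCMS automates, shown in
# Python on a synthetic chromatogram (illustrative analogue, not XCMS).
import numpy as np
from scipy.signal import find_peaks

rt = np.linspace(0, 600, 6000)                        # retention time (s)
chromatogram = (1e5 * np.exp(-((rt - 150) / 5) ** 2)  # two synthetic peaks
                + 4e4 * np.exp(-((rt - 320) / 8) ** 2)
                + np.random.default_rng(2).normal(0, 500, rt.size))

peaks, props = find_peaks(chromatogram, height=5e3, prominence=5e3)
for idx, height in zip(peaks, props["peak_heights"]):
    print(f"peak at rt = {rt[idx]:.1f} s, intensity = {height:.0f}")
```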

  7. An Assessment of a Beowulf System for a Wide Class of Analysis and Design Software

    Science.gov (United States)

    Katz, D. S.; Cwik, T.; Kwan, B. H.; Lou, J. Z.; Springer, P. L.; Sterling, T. L.; Wang, P.

    1997-01-01

    This paper discusses Beowulf systems, focusing on Hyglac, the Beowulf system installed at the Jet Propulsion Laboratory. The purpose of the paper is to assess how a system of this type will perform while running a variety of scientific and engineering analysis and design software.

  8. Long Term Preservation of Data Analysis Software at the NASA/IPAC Infrared Science Archive

    NARCIS (Netherlands)

    H.I. Teplitz; S. Groom; T. Brooke; V. Desai; D. Engler; J. Fowler; J. Good; I. Khan; D. Levine; A. Alexov

    2011-01-01

    The NASA/IPAC Infrared Science Archive (IRSA) curates both data and analysis tools from NASA's infrared missions. As part of our primary goal, we provide long term access to mission-specific software from projects such as IRAS and Spitzer. We will review the efforts by IRSA (and within the greater I

  9. Onboard utilization of ground control points for image correction. Volume 4: Correlation analysis software design

    Science.gov (United States)

    1981-01-01

    The software utilized for image correction accuracy measurement is described. The correlation analysis program is written to provide the user with various tools for analyzing different correlation algorithms. The algorithms were tested using LANDSAT imagery in two different spectral bands. Three classification algorithms are implemented.
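
    One classic correlation algorithm of the kind such a program compares is normalized cross-correlation of a ground-control-point chip against an image search window. The sketch below is an illustrative Python implementation with synthetic data, not the software described in the report.

```python
# Minimal sketch of normalized cross-correlation used to register a
# ground-control-point "chip" against an image search window.
# Illustrative only, not the original correlation analysis software.
import numpy as np

def normalized_cross_correlation(chip, window):
    """Return the (row, col) offset of the best match of chip in window."""
    ch, cw = chip.shape
    best_score, best_rc = -np.inf, (0, 0)
    c = (chip - chip.mean()) / chip.std()
    for r in range(window.shape[0] - ch + 1):
        for col in range(window.shape[1] - cw + 1):
            patch = window[r:r + ch, col:col + cw]
            p = (patch - patch.mean()) / (patch.std() + 1e-12)
            score = np.mean(c * p)
            if score > best_score:
                best_score, best_rc = score, (r, col)
    return best_rc, best_score

rng = np.random.default_rng(3)
window = rng.random((64, 64))
chip = window[20:36, 30:46].copy()            # true offset is (20, 30)
print(normalized_cross_correlation(chip, window))
```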

  10. Global review of open access risk assessment software packages valid for global or continental scale analysis

    Science.gov (United States)

    Daniell, James; Simpson, Alanna; Gunasekara, Rashmin; Baca, Abigail; Schaefer, Andreas; Ishizawa, Oscar; Murnane, Rick; Tijssen, Annegien; Deparday, Vivien; Forni, Marc; Himmelfarb, Anne; Leder, Jan

    2015-04-01

    Over the past few decades, a plethora of open access software packages for the calculation of earthquake, volcanic, tsunami, storm surge, wind and flood risk have been produced globally. As part of the World Bank GFDRR Review released at the Understanding Risk 2014 Conference, over 80 such open access risk assessment software packages were examined. Commercial software was not considered in the evaluation. A preliminary analysis was used to determine whether the 80 models were currently supported and if they were open access. This process was used to select a subset of 31 models that include 8 earthquake models, 4 cyclone models, 11 flood models, and 8 storm surge/tsunami models for more detailed analysis. By using multi-criteria decision analysis (MCDA) and simple descriptions of the software uses, the review allows users to select a few relevant software packages for their own testing and development. The detailed analysis evaluated the models on the basis of over 100 criteria and provides a synopsis of available open access natural hazard risk modelling tools. In addition, volcano software packages have since been added, making the compendium of risk software tools in excess of 100. There has been a huge increase in the quality and availability of open access/source software over the past few years. For example, private entities such as Deltares now have an open source policy regarding some flood models (NGHS). In addition, leaders in developing risk models in the public sector, such as Geoscience Australia (EQRM, TCRM, TsuDAT, AnuGA) or CAPRA (ERN-Flood, Hurricane, CRISIS2007 etc.), are launching and/or helping many other initiatives. As we achieve greater interoperability between modelling tools, we will also achieve a future wherein different open source and open access modelling tools will be increasingly connected and adapted towards unified multi-risk model platforms and highly customised solutions. It was seen that many software tools could be improved by enabling user

  11. Application of Artificial Intelligence technology to the analysis and synthesis of reliable software systems

    Science.gov (United States)

    Wild, Christian; Eckhardt, Dave

    1987-01-01

    The development of a methodology for the production of highly reliable software is one of the greatest challenges facing the computer industry. Meeting this challenge will undoubtedly involve the integration of many technologies. This paper describes the use of Artificial Intelligence technologies in the automated analysis of the formal algebraic specifications of abstract data types. These technologies include symbolic execution of specifications using techniques of automated deduction and machine learning through the use of examples. On-going research into the role of knowledge representation and problem solving in the process of developing software is also discussed.

  12. PROMETHEE Method and Sensitivity Analysis in the Software Application for the Support of Decision-Making

    Directory of Open Access Journals (Sweden)

    Petr Moldrik

    2008-01-01

    Full Text Available PROMETHEE is one of the methods that fall under multi-criteria analysis (MCA). MCA, as the name itself indicates, deals with the evaluation of particular variants according to several criteria. The developed software application (MCA8) for the support of multi-criteria decision-making was upgraded with the PROMETHEE method and a graphic tool that enables the execution of sensitivity analysis. This analysis is used to ascertain how a given model output depends upon the input parameters. The MCA8 software application with the mentioned graphic upgrade was developed for the purpose of solving multi-criteria decision tasks. In MCA8 it is possible to perform sensitivity analysis in a simple form – through column graphs. We can change criteria significances (weights) directly in these column graphs and immediately watch the changes in the order of variants.
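
    The PROMETHEE ranking itself reduces to a small amount of arithmetic: pairwise preferences per criterion, weighted aggregation, and positive/negative outranking flows. The sketch below is an illustrative Python version of PROMETHEE II with the "usual" preference function; the alternatives, scores and weights are invented and are not taken from the MCA8 application.

```python
# Minimal sketch of PROMETHEE II net flows with the "usual" preference
# function (1 if strictly better, 0 otherwise). Alternatives, criteria
# scores and weights are hypothetical, not taken from MCA8.
import numpy as np

scores = np.array([[7.0, 3.0, 9.0],      # alternatives (rows) x criteria (cols)
                   [5.0, 8.0, 6.0],
                   [9.0, 4.0, 4.0]])
weights = np.array([0.5, 0.3, 0.2])      # criterion weights, sum to 1; all maximized

n = scores.shape[0]
pi = np.zeros((n, n))                    # aggregated preference of a over b
for a in range(n):
    for b in range(n):
        if a != b:
            pref = (scores[a] > scores[b]).astype(float)   # usual criterion
            pi[a, b] = np.dot(weights, pref)

phi_plus = pi.sum(axis=1) / (n - 1)      # positive (leaving) flow
phi_minus = pi.sum(axis=0) / (n - 1)     # negative (entering) flow
net_flow = phi_plus - phi_minus          # PROMETHEE II ranking criterion
print(dict(zip("ABC", np.round(net_flow, 3))))
```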

  13. mtsslSuite: In silico spin labelling, trilateration and distance-constrained rigid body docking in PyMOL

    Science.gov (United States)

    Hagelueken, Gregor; Abdullin, Dinar; Ward, Richard; Schiemann, Olav

    2013-10-01

    Nanometer distance measurements based on electron paramagnetic resonance methods in combination with site-directed spin labelling are powerful tools for the structural analysis of macromolecules. The software package mtsslSuite provides scientists with a set of tools for the translation of experimental distance distributions into structural information. The package is based on the previously published mtsslWizard software for in silico spin labelling. The mtsslSuite includes a new version of MtsslWizard that has improved performance and now includes additional types of spin labels. Moreover, it contains applications for the trilateration of paramagnetic centres in biomolecules and for rigid-body docking of subdomains of macromolecular complexes. The mtsslSuite is tested on a number of challenging test cases and its strengths and weaknesses are evaluated.
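
    The trilateration step mentioned above amounts to finding the point whose distances to the labelled sites best match the measured distances, which can be posed as a nonlinear least-squares problem. The sketch below is a hypothetical Python illustration, not mtsslSuite code; the label coordinates and distances are invented.

```python
# Minimal sketch of trilateration of a paramagnetic centre from
# spin-label distance measurements via nonlinear least squares.
# Coordinates and distances are illustrative, not mtsslSuite output.
import numpy as np
from scipy.optimize import least_squares

labels = np.array([[0.0, 0.0, 0.0],      # hypothetical spin-label positions (Angstrom)
                   [30.0, 0.0, 0.0],
                   [0.0, 30.0, 0.0],
                   [0.0, 0.0, 30.0]])
true_site = np.array([12.0, 8.0, 15.0])
distances = np.linalg.norm(labels - true_site, axis=1)   # "measured" distances

def residuals(x):
    # Difference between predicted and measured label-to-site distances.
    return np.linalg.norm(labels - x, axis=1) - distances

fit = least_squares(residuals, x0=np.array([10.0, 10.0, 10.0]))
print("located centre:", np.round(fit.x, 2))
```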

  14. Meta-Analyst: software for meta-analysis of binary, continuous and diagnostic data

    Directory of Open Access Journals (Sweden)

    Schmid Christopher H

    2009-12-01

    Full Text Available Abstract Background Meta-analysis is increasingly used as a key source of evidence synthesis to inform clinical practice. The theory and statistical foundations of meta-analysis continually evolve, providing solutions to many new and challenging problems. In practice, most meta-analyses are performed in general statistical packages or dedicated meta-analysis programs. Results Herein, we introduce Meta-Analyst, a novel, powerful, intuitive, and free meta-analysis program for the meta-analysis of a variety of problems. Meta-Analyst is implemented in C# atop the Microsoft .NET framework, and features a graphical user interface. The software performs several meta-analysis and meta-regression models for binary and continuous outcomes, as well as analyses for diagnostic and prognostic test studies in the frequentist and Bayesian frameworks. Moreover, Meta-Analyst includes a flexible tool to edit and customize generated meta-analysis graphs (e.g., forest plots) and provides output in many formats (images, Adobe PDF, Microsoft Word-ready RTF). The software architecture employed allows for rapid changes to be made to either the Graphical User Interface (GUI) or to the analytic modules. We verified the numerical precision of Meta-Analyst by comparing its output with that from standard meta-analysis routines in Stata over a large database of 11,803 meta-analyses of binary outcome data, and 6,881 meta-analyses of continuous outcome data from the Cochrane Library of Systematic Reviews. Results from analyses of diagnostic and prognostic test studies have been verified in a limited number of meta-analyses versus MetaDisc and MetaTest. Bayesian statistical analyses use the OpenBUGS calculation engine (and are thus as accurate as the standalone OpenBUGS software). Conclusion We have developed and validated a new program for conducting meta-analyses that combines the advantages of existing software for this task.
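
    Meta-Analyst itself is a GUI application; purely as a minimal illustration of one of the model families it covers, the hypothetical Python sketch below pools binary-outcome studies with a fixed-effect, inverse-variance combination of log odds ratios. The study counts are invented, and a 0.5 continuity correction is applied throughout for simplicity.

```python
import numpy as np

def pooled_odds_ratio(events_t, n_t, events_c, n_c):
    """Fixed-effect, inverse-variance pooling of log odds ratios for binary outcomes.

    Each argument holds one value per study; a 0.5 continuity correction is
    added to every 2x2 cell.  Returns the pooled OR and its 95% CI.
    """
    a = np.asarray(events_t, float) + 0.5                  # events, treatment arm
    b = np.asarray(n_t, float) - np.asarray(events_t, float) + 0.5
    c = np.asarray(events_c, float) + 0.5                  # events, control arm
    d = np.asarray(n_c, float) - np.asarray(events_c, float) + 0.5

    log_or = np.log(a * d / (b * c))                       # per-study effect sizes
    weights = 1.0 / (1/a + 1/b + 1/c + 1/d)                # inverse of the variances

    pooled = np.sum(weights * log_or) / np.sum(weights)
    se = np.sqrt(1.0 / np.sum(weights))
    return np.exp(pooled), (np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se))

# Two hypothetical studies: events and sample sizes in treatment and control arms.
or_hat, ci = pooled_odds_ratio([12, 30], [100, 200], [20, 45], [100, 200])
print(f"pooled OR = {or_hat:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```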

  15. Detecting Optic Atrophy in Multiple Sclerosis Patients Using New Colorimetric Analysis Software: From Idea to Application.

    Science.gov (United States)

    Bambo, Maria Pilar; Garcia-Martin, Elena; Perez-Olivan, Susana; Larrosa-Povés, José Manuel; Polo-Llorens, Vicente; Gonzalez-De la Rosa, Manuel

    2016-01-01

    Neuro-ophthalmologists typically observe a temporal pallor of the optic disc in patients with multiple sclerosis. Here, we describe the emergence of an idea to quantify these optic disc color changes in multiple sclerosis patients. We recruited 12 multiple sclerosis patients with previous optic neuritis attack and obtained photographs of their optic discs. The Laguna ONhE, a new colorimetric software using hemoglobin as the reference pigment in the papilla, was used for the analysis. The papilla of these multiple sclerosis patients showed greater pallor, especially in the temporal sector. The software detected the pallor and assigned hemoglobin percentages below normal reference values. Measurements of optic disc hemoglobin levels obtained with the Laguna ONhE software program had good ability to detect optic atrophy and, consequently, axonal loss in multiple sclerosis patients. This new technology is easy to implement in routine clinical practice.

  16. Comparative analysis of methods for testing software of radio-electronic equipment

    Directory of Open Access Journals (Sweden)

    G. A. Mirskikh

    2011-03-01

    Full Text Available An analysis of the concepts of quality and reliability of software products that are part of radio-electronic equipment is carried out. Basic testing methods for software products used in the design of hardware and software systems to ensure quality and reliability are given. We consider testing according to the "black box" and "white box" methodologies, bottom-up and top-down integration testing, as well as various modifications of these methods. Effective criteria are presented that allow the selection of a testing method for programs based on their structure and on the organizational and financial factors that affect the quality of the implementation of the design process.

  17. Fuzzy system for risk analysis in software projects through the attributes of the quality standard ISO 25000

    Directory of Open Access Journals (Sweden)

    Chau Sen Shia

    2014-02-01

    Full Text Available With the growth in demand for products and services in the IT area, companies encounter difficulties in establishing a metric or measure of service quality that addresses qualitative values measurably in their planning. In this work, fuzzy logic, the SQuaRE standard (measurement of the quality of software products), the Likert scale, the GQM method (Goal-Question-Metric, an indicator of software quality) and Boehm's project risk analysis model were used to assess the quality of services and to support decision-making, according to the demand and requests for software development. With the aim of improving the quality of the services provided, the application is used to integrate the team and follow the life cycle of a project from its initial phase, and to assist in the comparison with the proposed schedule during requirements elicitation.

  18. Reliability Analysis of Component Software in Wireless Sensor Networks Based on Transformation of Testing Data

    Directory of Open Access Journals (Sweden)

    Chunyan Hou

    2009-08-01

    Full Text Available We develop an approach to component software reliability analysis which includes the benefits of both time-domain and structure-based approaches. This approach overcomes the deficiency of existing NHPP techniques, which fall short of addressing repair and internal system structures simultaneously. Our solution adopts a method of transformation of testing data to cover both methods, and is expected to improve reliability prediction. This paradigm allows the component-based software testing process to deviate from the assumptions of NHPP models, and accounts for software structure by modelling the testing process. According to the testing model, it builds the mapping relation from the testing profile to the operational profile, which enables the transformation of the testing data to build the reliability dataset required by NHPP models. Finally, an example is evaluated to validate and show the effectiveness of this approach.
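
    The record names NHPP models only generically; under that assumption, the hypothetical Python sketch below fits the classical Goel-Okumoto mean value function to cumulative (already transformed) failure-count data and derives a simple reliability estimate for the next interval. The failure counts and parameters are invented, and scipy is assumed to be available.

```python
import numpy as np
from scipy.optimize import curve_fit

def goel_okumoto(t, a, b):
    """Mean value function of the Goel-Okumoto NHPP: expected failures by time t."""
    return a * (1.0 - np.exp(-b * t))

# Hypothetical (transformed) testing data: cumulative failures observed over time.
t = np.array([10, 20, 30, 40, 50, 60, 70, 80], float)
cum_failures = np.array([9, 16, 22, 26, 29, 31, 33, 34], float)

(a_hat, b_hat), _ = curve_fit(goel_okumoto, t, cum_failures, p0=[40.0, 0.02])

# Reliability over the next mission interval dt: probability of no failure in (T, T+dt].
T, dt = 80.0, 10.0
reliability = np.exp(-(goel_okumoto(T + dt, a_hat, b_hat) - goel_okumoto(T, a_hat, b_hat)))
print(a_hat, b_hat, reliability)
```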

  19. Software package for the design and analysis of DNA origami structures

    DEFF Research Database (Denmark)

    Andersen, Ebbe Sloth; Nielsen, Morten Muhlig; Dong, Mingdong;

    A software package was developed for the semi-automated design of DNA origamis and further data analysis of Atomic Force Microscopy (AFM) images. As an example, we design the shape of a bottlenose dolphin and analyze it by means of high resolution AFM imaging. A high yield of DNA dolphins was observed on the mica surface, with a fraction of the dolphin nanostructures showing extensive tail flexibility of approximately 90 degrees. The Java editor and tools are free software distributed under the GNU license. The open architecture of the editor makes it easy for the scientific community to contribute new tools and functionalities. Documentation, tutorials and software will be made available online.

  20. FEATUREOUS: AN INTEGRATED ENVIRONMENT FOR FEATURE-CENTRIC ANALYSIS AND MODIFICATION OF OBJECT-ORIENTED SOFTWARE

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2011-01-01

    The decentralized nature of collaborations between objects in object-oriented software makes it difficult to understand the implementations of user-observable program features and their respective interdependencies. As feature-centric program understanding and modification are essential during software maintenance and evolution, this situation needs to change. In this paper, we present Featureous, an integrated development environment built on top of the NetBeans IDE that facilitates feature-centric analysis of object-oriented software. Our integrated development environment encompasses a lightweight feature location mechanism, a number of reusable analytical views, and necessary APIs for supporting future extensions. The base of the integrated development environment is a conceptual framework comprising three complementary dimensions of comprehension: perspective, abstraction...

  1. Automated Source Code Analysis to Identify and Remove Software Security Vulnerabilities: Case Studies on Java Programs

    Directory of Open Access Journals (Sweden)

    Natarajan Meghanathan

    2013-01-01

    Full Text Available The high-level contribution of this paper is to illustrate the development of generic solution strategies to remove software security vulnerabilities that could be identified using automated tools for source code analysis on software programs (developed in Java). We use the Source Code Analyzer and Audit Workbench automated tools, developed by HP Fortify Inc., for our testing purposes. We present case studies involving a file writer program embedded with features for password validation, and connection-oriented server socket programs, to discover, analyze the impact of, and remove the following software security vulnerabilities: (i) Hardcoded Password, (ii) Empty Password Initialization, (iii) Denial of Service, (iv) System Information Leak, (v) Unreleased Resource, (vi) Path Manipulation, and (vii) Resource Injection. For each of these vulnerabilities, we describe the potential risks associated with leaving them unattended in a software program, and provide the solutions (including code snippets in Java) that can be incorporated to remove these vulnerabilities. The proposed solutions are very generic in nature, and can be suitably modified to correct any such vulnerabilities in software developed in any other programming language.

  2. Software Analysis of New Space Gravity Data for Geophysics and Climate Research

    Science.gov (United States)

    Deese, Rupert; Ivins, Erik R.; Fielding, Eric J.

    2012-01-01

    Both the Gravity Recovery and Climate Experiment (GRACE) and Gravity field and steady-state Ocean Circulation Explorer (GOCE) satellites are returning rich data for the study of the solid earth, the oceans, and the climate. Current software analysis tools do not provide researchers with the ease and flexibility required to make full use of this data. We evaluate the capabilities and shortcomings of existing software tools including Mathematica, the GOCE User Toolbox, the ICGEM's (International Center for Global Earth Models) web server, and Tesseroids. Using existing tools as necessary, we design and implement software with the capability to produce gridded data and publication quality renderings from raw gravity data. The straightforward software interface marks an improvement over previously existing tools and makes new space gravity data more useful to researchers. Using the software we calculate Bouguer anomalies of the gravity tensor's vertical component in the Gulf of Mexico, Antarctica, and the 2010 Maule earthquake region. These maps identify promising areas of future research.

  3. COMPARATIVE ANALYSIS OF COMPUTER SOFTWARE AND BRAILLE LITERACY TO EDUCATE STUDENTS HAVING VISUAL IMPAIRMENT

    Directory of Open Access Journals (Sweden)

    Ismat Bano

    2011-10-01

    Full Text Available This research investigates the comparative analysis of computer software and Braille literacy for educating students with visual impairment. The main objective of this research focuses on comparing the feasibility and usage of Braille literacy and computer software for educating children with visual impairment. The main objectives of the study were to identify the importance of Braille and computer literacy as perceived by male and female students with visual impairment, to identify the importance of Braille and computer literacy in different classes of students with visual impairment, and to identify the difference in the importance of Braille and computer literacy across different schools of students with visual impairment. Five special education institutions were selected where students with visual impairment were studying. A convenient sample of 100 students was taken from these schools. A three-point rating scale was used as the research instrument. The researchers personally collected data from the respondents. Data were analyzed through SPSS. Major findings showed that students were more interested in the Braille system than in computer software. The Braille system and the required material were present in all the schools, while computer teachers with the required experience were not available in these institutions. Teachers were found to be expert in Braille literacy as compared to computer software. It was recommended that proper awareness of the most recent technologies is necessary for teachers in special education institutions. Students as well as teachers should be provided with opportunities for hands-on practice to create interest in computer software use in special education.

  4. STATIC CODE ANALYSIS FOR SOFTWARE QUALITY IMPROVEMENT: A CASE STUDY IN BCI FRAMEWORK DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Indar Sugiarto

    2008-01-01

    Full Text Available This paper shows how a systematic approach to software testing using the static code analysis method can be used to improve the software quality of a BCI framework. The method is best performed during the development phase of framework programs. In the proposed approach, we evaluate several software metrics which are based on the principles of object-oriented design. Since such a method depends on the underlying programming language, we describe the method in terms of C++ programming, with the Qt platform also currently being used. One of the most important metrics is the so-called software complexity. Applying the software complexity calculation using both the McCabe and Halstead methods to the BCI framework, which consists of two important types of BCI, namely SSVEP and P300, we found that there are two classes in the framework which are very complex and prone to violation of the cohesion principle in OOP. The other metrics fit the criteria of the proposed framework aspects, such as: MPC is less than 20; average complexity is around a value of 5; and the maximum depth is below 10 blocks. Such variables are considered very important when further developing the BCI framework in the future.
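
    The framework analysed in the record is written in C++/Qt; purely as an illustration of the McCabe metric the abstract refers to, the hypothetical sketch below counts decision points in Python source with the standard ast module. Halstead metrics, which count operators and operands, are omitted, and the counting rule used here is a common simplification rather than the paper's exact procedure.

```python
import ast

DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                  ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(source):
    """Approximate McCabe complexity: 1 + number of decision points.

    Each branch adds one independent path; an `and`/`or` chain is counted
    once (as a single BoolOp node) for simplicity.
    """
    tree = ast.parse(source)
    decisions = sum(isinstance(node, DECISION_NODES) for node in ast.walk(tree))
    return 1 + decisions

sample = """
def classify(x):
    if x < 0:
        return "negative"
    for _ in range(3):
        if x > 10 and x % 2 == 0:
            return "big even"
    return "other"
"""
print(cyclomatic_complexity(sample))   # 1 + if + for + if + BoolOp = 5
```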

  5. Software for analysis of chemical mixtures--composition, occurrence, distribution, and possible toxicity

    Science.gov (United States)

    Scott, Jonathon C.; Skach, Kenneth A.; Toccalino, Patricia L.

    2013-01-01

    The composition, occurrence, distribution, and possible toxicity of chemical mixtures in the environment are research concerns of the U.S. Geological Survey and others. The presence of specific chemical mixtures may serve as indicators of natural phenomena or human-caused events. Chemical mixtures may also have ecological, industrial, geochemical, or toxicological effects. Chemical-mixture occurrences vary by analyte composition and concentration. Four related computer programs have been developed by the National Water-Quality Assessment Program of the U.S. Geological Survey for research of chemical-mixture compositions, occurrences, distributions, and possible toxicities. The compositions and occurrences are identified for the user-supplied data, and therefore the resultant counts are constrained by the user’s choices for the selection of chemicals, reporting limits for the analytical methods, spatial coverage, and time span for the data supplied. The distribution of chemical mixtures may be spatial, temporal, and (or) related to some other variable, such as chemical usage. Possible toxicities optionally are estimated from user-supplied benchmark data. The software for the analysis of chemical mixtures described in this report is designed to work with chemical-analysis data files retrieved from the U.S. Geological Survey National Water Information System but can also be used with appropriately formatted data from other sources. Installation and usage of the mixture software are documented. This mixture software was designed to function with minimal changes on a variety of computer-operating systems. To obtain the software described herein and other U.S. Geological Survey software, visit http://water.usgs.gov/software/.

  6. Analysis of and Reflection on Software Protection

    Institute of Scientific and Technical Information of China (English)

    袁淑丹; 黎成; 任子亭

    2014-01-01

    After analysing previous software encryption methods and carrying out a comparative study, this paper arrives at a software protection scheme that combines hardware and software, and accordingly proposes research on software encryption protection based on the hard disk serial number. Hardware and software encryption techniques are used in combination. On the hardware side, comparative analysis shows that the hard disk serial number should serve as the basis for encryption: because a computer's hard disk serial number is unique, a "one code, one machine" mechanism can be realised more effectively. On the software side, the encryption technique is further improved by combining a symmetric encryption algorithm with an asymmetric encryption algorithm, so that the strength of the software protection is further increased.
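
    The paper's own scheme (symmetric plus asymmetric encryption keyed to the disk serial number) is not reproduced here; as a minimal, hypothetical Python sketch of the underlying "one code, one machine" idea, the code below derives a machine-bound licence from the serial number with an HMAC, using only the standard library. The serial number, vendor secret and licence format are illustrative assumptions, not the authors' design.

```python
import hashlib
import hmac

def machine_bound_key(disk_serial: str, product_secret: bytes) -> bytes:
    """Derive a per-machine key from the hard-disk serial number.

    The serial number ties the licence to one machine ("one code, one machine");
    the vendor-side secret prevents users from recomputing keys themselves.
    """
    return hmac.new(product_secret, disk_serial.encode("utf-8"), hashlib.sha256).digest()

def issue_licence(disk_serial: str, product_secret: bytes) -> str:
    return machine_bound_key(disk_serial, product_secret).hex()

def licence_is_valid(disk_serial: str, licence: str, product_secret: bytes) -> bool:
    expected = issue_licence(disk_serial, product_secret)
    return hmac.compare_digest(expected, licence)

secret = b"vendor-private-secret"                          # held only by the vendor
lic = issue_licence("WD-WX41A1234567", secret)             # generated at purchase time
print(licence_is_valid("WD-WX41A1234567", lic, secret))    # True on the licensed machine
print(licence_is_valid("OTHER-SERIAL", lic, secret))       # False on any other machine
```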

  7. The review of the modeling methods and numerical analysis software for nanotechnology in material science

    Directory of Open Access Journals (Sweden)

    SMIRNOV Vladimir Alexeevich

    2014-10-01

    Full Text Available Due to the high demand for building materials with a universal set of properties which extends their application area, research efforts are focusing on nanotechnology in materials science. A rational combination of theoretical studies, mathematical modeling and simulation can favour reduced resource and time consumption when nanomodified materials are being developed. The development of a composite material is based on the principles of system analysis, which requires the determination of criteria and a further classification of modeling methods. In this work the criteria of spatial scale, dominant type of interaction and heterogeneity are used for such a classification. The presented classification became a framework for the analysis of methods and software which can be applied to the development of building materials. For each of the selected spatial levels - from the atomistic one to the macrostructural level of a constructional coarse-grained composite - existing theories, modeling algorithms and tools have been considered. At the level of the macrostructure, which is formed under the influence of gravity and exterior forces, one can apply probabilistic and geometrical methods to study the obtained structure. The existing models are suitable for packing density analysis and the solution of percolation problems at the macroscopic level, but there are still no software tools which could be applied in nanotechnology to carry out systematic investigations. At the microstructure level it is possible to use the particle method along with probabilistic and statistical methods to explore structure formation, but the available software tools are only partially suitable for numerical analysis of microstructure models. Therefore, modeling of the microstructure is rather complicated; the model has to include a potential of pairwise interaction. After the model has been constructed and parameters of the pairwise potential have been determined, many software packages for solution of ordinary
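
    The review does not prescribe a specific pairwise potential; purely as an illustration of the kind of interaction model a microstructure simulation needs, the hypothetical Python sketch below evaluates the total Lennard-Jones energy of a small particle configuration. The parameter values are arbitrary reduced units.

```python
import numpy as np

def lennard_jones_energy(positions, epsilon=1.0, sigma=1.0):
    """Total pairwise Lennard-Jones energy of a particle configuration.

    positions : (n, 3) particle coordinates; epsilon sets the well depth and
    sigma the zero-crossing distance of the pair potential.
    """
    positions = np.asarray(positions, float)
    energy = 0.0
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            r = np.linalg.norm(positions[i] - positions[j])
            sr6 = (sigma / r) ** 6
            energy += 4.0 * epsilon * (sr6**2 - sr6)       # 4e[(s/r)^12 - (s/r)^6]
    return energy

# Two particles at the potential minimum r = 2**(1/6) * sigma give energy -epsilon.
print(lennard_jones_energy([[0, 0, 0], [2**(1/6), 0, 0]]))   # approximately -1.0
```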

  8. Maintaining and improving of the training program on the analysis software in CMS

    International Nuclear Information System (INIS)

    Since 2009, the CMS experiment at LHC has provided intensive training on the use of Physics Analysis Tools (PAT), a collection of common analysis tools designed to share expertise and maximize productivity in the physics analysis. More than ten one-week courses preceded by prerequisite studies have been organized and the feedback from the participants has been carefully analyzed. This note describes how the training team designs, maintains and improves the course contents based on the feedback, the evolving analysis practices and the software development.

  9. A Comparative Study of Measurement Accuracy of Cyber Space Analysis Software with Manual Method in Mixed Dentition

    Directory of Open Access Journals (Sweden)

    Sheibani Nia A.

    2011-04-01

    Full Text Available Statement of Problem: One of the considerations regarding space analysis in study casts is the issue of speed and precision of the analysis. Purpose: This research aimed to design software to conduct space analysis and to evaluate its accuracy compared to manual space analysis. Materials and Method: This research was conducted in two stages: exploratory and cross-sectional. The subjects were selected randomly from patients between 7 and 11 years of age referring to the orthodontics clinic at the dentistry school of the Islamic Azad University. About 30 study models (15 pairs) were randomly selected. Space analysis with the manual method was performed. Space analysis with the aid of the Cyber Space Analysis software was also carried out by locating the required landmarks on digital images of dental casts. The accuracy of this software in comparison with the manual method in space analysis for mixed dentition was evaluated using Student's t-test. Results: The average time required for space analysis using the software was 3.4 minutes, as compared to the manual method, which took 7.81 minutes on average. The results from software-assisted space analysis showed no significant difference from those obtained with the manual method (p < 0.0001). Conclusion: The results from software-assisted space analysis were similar to those of the manual method. However, software-assisted space analysis is carried out more quickly than the manual method.

  10. GEMBASSY: an EMBOSS associated software package for comprehensive genome analyses

    OpenAIRE

    Itaya, Hidetoshi; Oshita, Kazuki; Arakawa, Kazuharu; Tomita, Masaru

    2013-01-01

    The popular European Molecular Biology Open Software Suite (EMBOSS) currently contains over 400 tools used in various bioinformatics researches, equipped with sophisticated development frameworks for interoperability and tool discoverability as well as rich documentations and various user interfaces. In order to further strengthen EMBOSS in the fields of genomics, we here present a novel EMBOSS associated software (EMBASSY) package named GEMBASSY, which adds more than 50 analysis tools from t...

  11. Gardony Map Drawing Analyzer: Software for quantitative analysis of sketch maps.

    Science.gov (United States)

    Gardony, Aaron L; Taylor, Holly A; Brunyé, Tad T

    2016-03-01

    Sketch maps are effective tools for assessing spatial memory. However, despite their widespread use in cognitive science research, sketch map analysis techniques remain unstandardized and carry limitations. In the present article, we present the Gardony Map Drawing Analyzer (GMDA), an open-source software package for sketch map analysis. GMDA combines novel and established analysis techniques into a graphical user interface that permits rapid computational sketch map analysis. GMDA calculates GMDA-unique measures based on pairwise comparisons between landmarks, as well as bidimensional regression parameters (Friedman & Kohler, 2003), which together reflect sketch map quality at two levels: configural and individual landmark. The configural measures assess the overall landmark configuration and provide a whole-map analysis. Individual landmark measures, introduced in GMDA, assess individual landmark placement and indicate how individual landmarks contribute to the configural scores. Together, these measures provide a more complete psychometric picture of sketch map analysis, allowing for comparisons between sketch maps and between landmarks. The calculated measures reflect specific and cognitively relevant aspects of interlandmark spatial relationships, including distance and angular representation. GMDA supports complex environments (up to 48 landmarks) and two software modes that capture aspects of maps not addressed by existing techniques, such as landmark size and shape variation and interlandmark containment relationships. We describe the software and its operation and present a formal specification of calculation procedures for its unique measures. We then validate the software by demonstrating the capabilities and reliability of its measures using simulation and experimental data. The most recent version of GMDA is available at www.aarongardony.com/tools/map-drawing-analyzer. PMID:25673320
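
    GMDA's own configural and landmark measures (including its bidimensional regression parameters) are not reproduced here; as a hypothetical Python sketch in the same spirit, the code below scores how well a sketch map preserves the pairwise inter-landmark distances of the real environment. Landmark coordinates are invented for illustration.

```python
import numpy as np
from itertools import combinations

def configural_distance_score(actual, sketched):
    """Correlate pairwise inter-landmark distances between a sketch map and reality.

    actual, sketched : (n, 2) arrays of landmark coordinates in the same order.
    Returns the Pearson r over all landmark pairs; 1.0 means distances are
    perfectly preserved up to a uniform scaling.
    """
    actual = np.asarray(actual, float)
    sketched = np.asarray(sketched, float)
    pairs = list(combinations(range(len(actual)), 2))
    d_act = np.array([np.linalg.norm(actual[i] - actual[j]) for i, j in pairs])
    d_skt = np.array([np.linalg.norm(sketched[i] - sketched[j]) for i, j in pairs])
    return np.corrcoef(d_act, d_skt)[0, 1]

actual   = [[0, 0], [4, 0], [4, 3], [0, 3]]          # "true" landmark layout
sketched = [[0, 0], [3.6, 0.2], [4.1, 2.7], [-0.3, 3.1]]   # participant's sketch map
print(configural_distance_score(actual, sketched))
```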

  12. PyElph - a software tool for gel images analysis and phylogenetics

    Directory of Open Access Journals (Sweden)

    Pavel Ana Brânduşa

    2012-01-01

    Full Text Available Abstract Background This paper presents PyElph, a software tool which automatically extracts data from gel images, computes the molecular weights of the analyzed molecules or fragments, compares DNA patterns which result from experiments with molecular genetic markers and, also, generates phylogenetic trees computed by five clustering methods, using the information extracted from the analyzed gel image. The software can be successfully used for population genetics, phylogenetics, taxonomic studies and other applications which require gel image analysis. Researchers and students working in molecular biology and genetics would benefit greatly from the proposed software because it is free, open source, easy to use, has a friendly Graphical User Interface and does not depend on specific image acquisition devices like other commercial programs with similar functionalities do. Results PyElph software tool is entirely implemented in Python which is a very popular programming language among the bioinformatics community. It provides a very friendly Graphical User Interface which was designed in six steps that gradually lead to the results. The user is guided through the following steps: image loading and preparation, lane detection, band detection, molecular weights computation based on a molecular weight marker, band matching and finally, the computation and visualization of phylogenetic trees. A strong point of the software is the visualization component for the processed data. The Graphical User Interface provides operations for image manipulation and highlights lanes, bands and band matching in the analyzed gel image. All the data and images generated in each step can be saved. The software has been tested on several DNA patterns obtained from experiments with different genetic markers. Examples of genetic markers which can be analyzed using PyElph are RFLP (Restriction Fragment Length Polymorphism), AFLP (Amplified Fragment Length Polymorphism), RAPD

  13. Space Suit Joint Torque Measurement Method Validation

    Science.gov (United States)

    Valish, Dana; Eversley, Karina

    2012-01-01

    In 2009 and early 2010, a test method was developed and performed to quantify the torque required to manipulate joints in several existing operational and prototype space suits. This was done in an effort to develop joint torque requirements appropriate for a new Constellation Program space suit system. The same test method was levied on the Constellation space suit contractors to verify that their suit design met the requirements. However, because the original test was set up and conducted by a single test operator there was some question as to whether this method was repeatable enough to be considered a standard verification method for Constellation or other future development programs. In order to validate the method itself, a representative subset of the previous test was repeated, using the same information that would be available to space suit contractors, but set up and conducted by someone not familiar with the previous test. The resultant data was compared using graphical and statistical analysis; the results indicated a significant variance in values reported for a subset of the re-tested joints. Potential variables that could have affected the data were identified and a third round of testing was conducted in an attempt to eliminate and/or quantify the effects of these variables. The results of the third test effort will be used to determine whether or not the proposed joint torque methodology can be applied to future space suit development contracts.

  14. XTCE GOVSAT Tool Suite 1.0

    Science.gov (United States)

    Rice, J. Kevin

    2013-01-01

    The XTCE GOVSAT software suite contains three tools: validation, search, and reporting. The Extensible Markup Language (XML) Telemetric and Command Exchange (XTCE) GOVSAT Tool Suite is written in Java for manipulating XTCE XML files. XTCE is a Consultative Committee for Space Data Systems (CCSDS) and Object Management Group (OMG) specification for describing the format and information in telemetry and command packet streams. These descriptions are files that are used to configure real-time telemetry and command systems for mission operations. XTCE's purpose is to exchange database information between different systems. XTCE GOVSAT consists of rules for narrowing the use of XTCE for missions. The Validation Tool is used to syntax check GOVSAT XML files. The Search Tool is used to search the GOVSAT XML files (e.g., for command and telemetry mnemonics) and view the results. Finally, the Reporting Tool is used to create command and telemetry reports. These reports can be displayed or printed for use by the operations team.

  15. Development and validation of a video analysis software for marine benthic applications

    Science.gov (United States)

    Romero-Ramirez, A.; Grémare, A.; Bernard, G.; Pascal, L.; Maire, O.; Duchêne, J. C.

    2016-10-01

    Our aim in the EU funded JERICO project was to develop a flexible and scalable imaging platform that could be used in the widest possible set of ecological situations. Depending on research objectives, both image acquisition and analysis procedures may indeed differ. Up to now, attempts to automate image analysis procedures have consisted of the development of pieces of software specifically designed for a given objective. This led to the conception of a new software: AVIExplore. Its general architecture and its three constitutive modules, AVIExplore - Mobile, AVIExplore - Fixed and AVIExplore - ScriptEdit, are presented. AVIExplore provides a unique environment for video analysis. Its main features include: (1) image selection tools allowing for the division of videos into homogeneous sections, (2) automatic extraction of targeted information, (3) solutions for long-term time-series as well as large spatial scale image acquisition, (4) real-time acquisition and in some cases real-time analysis, and (5) a large range of customized image-analysis possibilities through a script editor. The flexibility of use of AVIExplore is illustrated and validated by three case studies: (1) coral identification and mapping, (2) identification and quantification of different types of behaviors in a mud shrimp, and (3) quantification of filtering activity in a passive suspension-feeder. The accuracy of the software, measured by comparison with visual assessment, is 90.2%, 82.7%, and 98.3% for the three case studies, respectively. Some of the advantages and current limitations of the software as well as some of its foreseen advancements are then briefly discussed.

  16. An Effective Strategy to Build Up a Balanced Test Suite for Spectrum-Based Fault Localization

    Directory of Open Access Journals (Sweden)

    Ning Li

    2016-01-01

    Full Text Available During past decades, many automated software fault diagnosis techniques, including Spectrum-Based Fault Localization (SBFL), have been proposed to improve the efficiency of software debugging activity. In the field of SBFL, suspiciousness calculation is closely related to the number of failed and passed test cases. Studies have shown that the ratio of the number of failed to passed test cases has a more significant impact on the accuracy of SBFL than the total number of test cases, and that a balanced test suite is more beneficial to improving the accuracy of SBFL. Based on theoretical analysis, we propose a PNF (Passed test cases, Not executing the Faulty statement) strategy to reduce the test suite and build a more balanced one for SBFL, which can be used in regression testing. We evaluated the strategy in experiments using the Siemens programs and the Space program. Experiments indicated that our PNF strategy can be used to construct a new test suite effectively. Compared with the original test suite, the new one has a smaller size (on average 90% of test cases were removed in the experiments) and a more balanced ratio of failed to passed test cases, while it has the same statement coverage and fault localization accuracy.
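
    The PNF reduction strategy itself is not spelled out in the record; what can be illustrated is the suspiciousness calculation it is designed to serve. The hypothetical Python sketch below scores statements with the widely used Ochiai formula from per-test coverage vectors and pass/fail results; the coverage data are invented.

```python
import math

def ochiai_suspiciousness(coverage, results):
    """Spectrum-based fault localization with the Ochiai formula.

    coverage : list of per-test coverage vectors (1 = statement executed)
    results  : list of booleans, True if the test case passed
    Returns one suspiciousness score per statement; higher = more suspicious.
    """
    n_stmts = len(coverage[0])
    total_failed = sum(1 for passed in results if not passed)
    scores = []
    for s in range(n_stmts):
        ef = sum(1 for cov, passed in zip(coverage, results) if cov[s] and not passed)
        ep = sum(1 for cov, passed in zip(coverage, results) if cov[s] and passed)
        denom = math.sqrt(total_failed * (ef + ep))
        scores.append(ef / denom if denom else 0.0)
    return scores

# Four statements, four tests; only the last test fails and it covers statement 2.
coverage = [[1, 1, 0, 1], [1, 0, 0, 1], [1, 1, 0, 0], [1, 0, 1, 1]]
results = [True, True, True, False]
print(ochiai_suspiciousness(coverage, results))   # statement 2 ranks highest
```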

  17. Structured System Test Suite Generation Process for Multi-Agent System

    Directory of Open Access Journals (Sweden)

    Zina Houhamdi,

    2011-04-01

    Full Text Available In recent years, Agent-Oriented Software Engineering (AOSE) methodologies have been proposed to develop complex distributed systems based upon the agent paradigm. The implementation of such systems usually takes the form of Multi-Agent Systems (MAS). MAS testing is a challenging task because these systems are often programmed to be autonomous and deliberative, and they operate in an open world, which requires context awareness. In this paper, we introduce a novel approach for goal-oriented software system testing. It specifies a testing process that complements the goal-oriented methodology Tropos and reinforces the mutual relationship between goal analysis and testing. Furthermore, it defines a structured and comprehensive system test suite derivation process for engineering software agents by providing a systematic way of deriving test cases from goal analysis.

  18. TiLIA: a software package for image analysis of firefly flash patterns.

    Science.gov (United States)

    Konno, Junsuke; Hatta-Ohashi, Yoko; Akiyoshi, Ryutaro; Thancharoen, Anchana; Silalom, Somyot; Sakchoowong, Watana; Yiu, Vor; Ohba, Nobuyoshi; Suzuki, Hirobumi

    2016-05-01

    As flash signaling patterns of fireflies are species specific, signal-pattern analysis is important for understanding this system of communication. Here, we present time-lapse image analysis (TiLIA), a free open-source software package for signal and flight pattern analyses of fireflies that uses video-recorded image data. TiLIA enables flight path tracing of individual fireflies and provides frame-by-frame coordinates and light intensity data. As an example of TiLIA capabilities, we demonstrate flash pattern analysis of the fireflies Luciola cruciata and L. lateralis during courtship behavior. PMID:27069594

  19. HPC Benchmark Suite NMx Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Intelligent Automation Inc., (IAI) and University of Central Florida (UCF) propose to develop a comprehensive numerical test suite for benchmarking current and...

  20. ICAS-PAT: A Software for Design, Analysis and Validation of PAT Systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2010-01-01

    ... end product qualities. In an earlier article, Singh et al. [Singh, R., Gernaey, K. V., Gani, R. (2009). Model-based computer-aided framework for design of process monitoring and analysis systems. Computers & Chemical Engineering, 33, 22–42] proposed the use of a systematic model and data based methodology to design appropriate PAT systems. This methodology has now been implemented into a systematic computer-aided framework to develop a software (ICAS-PAT) for design, validation and analysis of PAT systems. Two supporting tools needed by ICAS-PAT have also been developed: a knowledge base (consisting of process knowledge as well as knowledge on measurement methods and tools) and a generic model library (consisting of process operational models). Through a tablet manufacturing process example, the application of ICAS-PAT is illustrated, highlighting as well the main features of the software.

  1. Software systems for processing and analysis of experimental data at the Nova laser facility

    International Nuclear Information System (INIS)

    A typical laser-plasma interaction experiment at the Nova laser facility produces in excess of 20 megabytes of digitized data. Extensive processing and analysis of this raw data from a wide variety of instruments is necessary to produce data that can be readily used to interpret the experiment. The authors describe how using VAX based computer hardware, a software system has been set up to convert the digitized instrument output to physics quantities describing the experiment. A relational data base management system is used to coordinate all levels of processing and analysis. Extensive data bases of instrument response and set-up parameters are used at all levels of processing and archiving. An extensive set of programs is used to handle the large amounts of X, Y, Z data recorded on film by the bulk of Nova diagnostics. Software development emphasizes structured design, flexibility, automation and ease of use

  2. Integrated State Estimation and Contingency Analysis Software Implementation using High Performance Computing Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yousu; Glaesemann, Kurt R.; Rice, Mark J.; Huang, Zhenyu

    2015-12-31

    Power system simulation tools are traditionally developed in sequential mode and codes are optimized for single core computing only. However, the increasing complexity in the power grid models requires more intensive computation. The traditional simulation tools will soon not be able to meet the grid operation requirements. Therefore, power system simulation tools need to evolve accordingly to provide faster and better results for grid operations. This paper presents an integrated state estimation and contingency analysis software implementation using high performance computing techniques. The software is able to solve large size state estimation problems within one second and achieve a near-linear speedup of 9,800 with 10,000 cores for contingency analysis application. The performance evaluation is presented to show its effectiveness.

  3. OpenMS: a flexible open-source software platform for mass spectrometry data analysis.

    Science.gov (United States)

    Röst, Hannes L; Sachsenberg, Timo; Aiche, Stephan; Bielow, Chris; Weisser, Hendrik; Aicheler, Fabian; Andreotti, Sandro; Ehrlich, Hans-Christian; Gutenbrunner, Petra; Kenar, Erhan; Liang, Xiao; Nahnsen, Sven; Nilse, Lars; Pfeuffer, Julianus; Rosenberger, George; Rurik, Marc; Schmitt, Uwe; Veit, Johannes; Walzer, Mathias; Wojnar, David; Wolski, Witold E; Schilling, Oliver; Choudhary, Jyoti S; Malmström, Lars; Aebersold, Ruedi; Reinert, Knut; Kohlbacher, Oliver

    2016-08-30

    High-resolution mass spectrometry (MS) has become an important tool in the life sciences, contributing to the diagnosis and understanding of human diseases, elucidating biomolecular structural information and characterizing cellular signaling networks. However, the rapid growth in the volume and complexity of MS data makes transparent, accurate and reproducible analysis difficult. We present OpenMS 2.0 (http://www.openms.de), a robust, open-source, cross-platform software specifically designed for the flexible and reproducible analysis of high-throughput MS data. The extensible OpenMS software implements common mass spectrometric data processing tasks through a well-defined application programming interface in C++ and Python and through standardized open data formats. OpenMS additionally provides a set of 185 tools and ready-made workflows for common mass spectrometric data processing tasks, which enable users to perform complex quantitative mass spectrometric analyses with ease.

  4. ACEMAN (II): a PDP-11 software package for acoustic emission analysis

    International Nuclear Information System (INIS)

    A powerful, but easy-to-use, software package (ACEMAN) for acoustic emission analysis has been developed at Berkeley Nuclear Laboratories. The system is based on a PDP-11 minicomputer with 24 K of memory, an RK05 DISK Drive and a Tektronix 4010 Graphics terminal. The operation of the system is described in detail in terms of the functions performed in response to the various command mnemonics. The ACEMAN software package offers many useful facilities not found on other acoustic emission monitoring systems. Its main features, many of which are unique, are summarised. The ACEMAN system automatically handles arrays of up to 12 sensors in real-time operation during which data are acquired, analysed, stored on the computer disk for future analysis and displayed on the terminal if required. (author)

  6. Swallowing quantitative analysis software

    OpenAIRE

    André Augusto Spadotto; Ana Rita Gatto; Paula Cristina Cola; Arlindo Neto Montagnoli; Arthur Oscar Schelp; Roberta Gonçalves da Silva; Seizo Yamashita; José Carlos Pereira; Maria Aparecida Coelho de Arruda Henry

    2008-01-01

    OBJECTIVE: To present a software program that allows a detailed analysis of swallowing dynamics. MATERIALS AND METHODS: Ten individuals who had suffered a stroke participated in this study, six of them male, with a mean age of 57.6 years. Videofluoroscopy of swallowing was performed and the images were digitized on a microcomputer, with subsequent analysis of the pharyngeal transit time of swallowing by means of a chronometer and of the software. RESULTS: The mean transit time...

  7. Advanced EVA Suit Camera System Development Project

    Science.gov (United States)

    Mock, Kyla

    2016-01-01

    The National Aeronautics and Space Administration (NASA) at the Johnson Space Center (JSC) is developing a new extra-vehicular activity (EVA) suit known as the Advanced EVA Z2 Suit. All of the improvements to the EVA Suit provide the opportunity to update the technology of the video imagery. My summer internship project involved improving the video streaming capabilities of the cameras that will be used on the Z2 Suit for data acquisition. To accomplish this, I familiarized myself with the architecture of the camera that is currently being tested to be able to make improvements on the design. Because there is a lot of benefit to saving space, power, and weight on the EVA suit, my job was to use Altium Design to start designing a much smaller and simplified interface board for the camera's microprocessor and external components. This involved checking datasheets of various components and checking signal connections to ensure that this architecture could be used for both the Z2 suit and potentially other future projects. The Orion spacecraft is a specific project that may benefit from this condensed camera interface design. The camera's physical placement on the suit also needed to be determined and tested so that image resolution can be maximized. Many of the options of the camera placement may be tested along with other future suit testing. There are multiple teams that work on different parts of the suit, so the camera's placement could directly affect their research or design. For this reason, a big part of my project was initiating contact with other branches and setting up multiple meetings to learn more about the pros and cons of the potential camera placements we are analyzing. Collaboration with the multiple teams working on the Advanced EVA Z2 Suit is absolutely necessary and these comparisons will be used as further progress is made for the overall suit design. This prototype will not be finished in time for the scheduled Z2 Suit testing, so my time was

  8. Cardiomyocyte MEA data analysis (CardioMDA) - a novel field potential data analysis software for pluripotent stem cell derived cardiomyocytes.

    Directory of Open Access Journals (Sweden)

    Paruthi Pradhapan

    Full Text Available Cardiac safety pharmacology requires in-vitro testing of all drug candidates before clinical trials in order to ensure they are screened for cardiotoxic effects which may result in severe arrhythmias. Micro-electrode arrays (MEA) serve as a complement to current in-vitro methods for drug safety testing. However, MEA recordings produce huge volumes of data, and manual analysis forms a bottleneck for high-throughput screening. To overcome this issue, we have developed an offline, semi-automatic data analysis software, 'Cardiomyocyte MEA Data Analysis (CardioMDA)', equipped with correlation analysis and ensemble averaging techniques to improve the accuracy, reliability and throughput rate of analysing human pluripotent stem cell derived cardiomyocyte (CM) field potentials. With the program, true field potential and arrhythmogenic complexes can be distinguished from one another. The averaged field potential complexes, analysed using our software to determine the field potential duration, were compared with the analogous values obtained from manual analysis. The reliability of the correlation analysis algorithm, evaluated using various arrhythmogenic and morphology-changing signals, revealed a mean sensitivity and specificity of 99.27% and 94.49%, respectively, in determining true field potential complexes. The field potential duration of the averaged waveforms corresponded well to the manually analysed data, thus demonstrating the reliability of the software. The software also has the capability to create overlay plots for signals recorded under different drug concentrations in order to visualize and compare the magnitude of the response of different ion channels as a result of drug treatment. Our novel field potential analysis platform will facilitate the analysis of CM MEA signals in a semi-automated way and provide a reliable means of efficient and swift analysis for cardiomyocyte drug or disease model studies.
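
    CardioMDA's algorithms are not distributed with this record; the hypothetical Python sketch below only illustrates the general idea of combining correlation analysis with ensemble averaging: candidate complexes are accepted when they correlate strongly with a reference template, and the accepted segments are averaged before the field potential duration is measured. The function name, window length and threshold are illustrative assumptions.

```python
import numpy as np

def ensemble_average(signal, peak_indices, template, window=300, min_corr=0.9):
    """Average only those field-potential complexes that correlate with a template.

    signal       : 1-D MEA field potential trace
    peak_indices : candidate complex locations (sample indices)
    template     : reference complex of length `window`
    min_corr     : Pearson correlation threshold for accepting a complex
    """
    accepted = []
    half = window // 2
    for idx in peak_indices:
        if idx - half < 0 or idx + half > len(signal):
            continue                            # skip complexes cut off at the edges
        segment = signal[idx - half: idx + half]
        r = np.corrcoef(segment, template)[0, 1]
        if r >= min_corr:                       # reject arrhythmogenic or distorted beats
            accepted.append(segment)
    if not accepted:
        return None
    return np.mean(accepted, axis=0)            # averaged complex for FPD measurement
```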

  9. CSA06 Computing, Software and Analysis challenge at the Spanish Tier-1 and Tier-2 sites

    CERN Document Server

    Alcaraz, J; Cabrillo, Iban Jose; Colino, Nicanor; Cuevas-Maestro, J; Delgado Peris, Antonio; Fernandez Menendez, Javier; Flix, Jose; García-Abia, Pablo; González-Caballero, I; Hernández, Jose M; Marco, Rafael; Martinez Ruiz del Arbol, Pablo; Matorras, Francisco; Merino, Gonzalo; Rodríguez-Calonge, F J; Vizan Garcia, Jesus Manuel

    2007-01-01

    This note describes the participation of the Spanish centres PIC, CIEMAT and IFCA as Tier-1 and Tier-2 sites in the CMS CSA06 Computing, Software and Analysis challenge. A number of the facilities, services and workflows have been demonstrated at 25% of the scale expected for 2008 running. Very valuable experience has been gained running the complex computing system under realistic conditions at a significant scale. The focus of this note is on presenting achieved results, operational experience and lessons learnt during the challenge.

  10. Using a data model from software design to data analysis: What have we learned?

    International Nuclear Information System (INIS)

    The ADAMO data system is being used in a number of particle physics experiments. Experience with it indicates that data modelling is a powerful program design method that extends across the whole software life-cycle, although existing support tools are not yet satisfactory. The entity relationship data of ADAMO can be handled by various programming languages, and can be used effectively in interactive data analysis. (orig.)

  11. Structural dynamics teaching example: A linear test analysis case using open software

    DEFF Research Database (Denmark)

    Sturesson, P. O.; Brandt, A.; Ristinmaa, M.

    2013-01-01

    ... experimental modal analysis data. By using open software, based on MATLAB® as a basis for the example, the applied numerical methods are made transparent to the student. The example is built on a combination of the free CALFEM® and ABRAVIBE toolboxes, and thus all code used in this paper is publicly available as open source code. © The Society for Experimental Mechanics, Inc. 2013.

  12. Periodic precipitation a microcomputer analysis of transport and reaction processes in diffusion media, with software development

    CERN Document Server

    Henisch, H K

    1991-01-01

    Containing illustrations, worked examples, graphs and tables, this book deals with periodic precipitation (also known as Liesegang Ring formation) in terms of mathematical models and their logical consequences, and is entirely concerned with microcomputer analysis and software development. Three distinctive periodic precipitation mechanisms are included: binary diffusion-reaction; solubility modulation, and competitive particle growth. The book provides didactic illustrations of a valuable investigational procedure, in the form of hypothetical experimentation by microcomputer. The development

  13. An Analysis of Mimosa pudica Leaves Movement by Using LoggerPro Software

    Science.gov (United States)

    Sugito; Susilo; Handayani, L.; Marwoto, P.

    2016-08-01

    The unique phenomena of Mimosa pudica are the closing and opening movements of its leaves when they receive a stimulus. By using suitable software, these movements can be plotted as a graph that can be analysed. LoggerPro provides the facilities needed to analyse recorded videos of the plant's reaction to a stimulus. Then, through the resulting graph, analysis of some variables can be carried out. The result showed that the plant's movement fits an equation of y = mx + c.
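
    LoggerPro performs the fit interactively; as a minimal stand-alone illustration of the same y = mx + c fit, the Python sketch below uses hypothetical leaf-angle readings digitised from video frames. The numbers are invented and not taken from the study.

```python
import numpy as np

# Hypothetical leaf-angle readings digitised from video frames (time in s, angle in deg).
t = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5])
angle = np.array([42.0, 38.9, 36.1, 33.0, 29.8, 27.1])

m, c = np.polyfit(t, angle, 1)        # least-squares fit of y = m*x + c
print(f"y = {m:.2f}x + {c:.2f}")      # the slope m gives the closing rate
```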

  14. HDX Workbench: Software for the Analysis of H/D Exchange MS Data

    OpenAIRE

    Pascal, Bruce D; Willis, Scooter; Lauer, Janelle L.; Landgraf, Rachelle R.; West, Graham M.; Marciano, David; Novick, Scott; Goswami, Devrishi; Chalmers, Michael J.; Griffin, Patrick R.

    2012-01-01

    Hydrogen/deuterium exchange mass spectrometry (HDX-MS) is an established method for the interrogation of protein conformation and dynamics. While the data analysis challenge of HDX-MS has been addressed by a number of software packages, new computational tools are needed to keep pace with the improved methods and throughput of this technique. To address these needs, we report an integrated desktop program titled HDX Workbench, which facilitates automation, management, visualization, and stati...

  15. A Study of Method on Connectivity Analysis of Between Software Components

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    An analysis and computation method for the connectivity between components based on logical subtyping is first presented; the concepts of virtual interface and real interface, together with a quantitative analysis and computation formula for the connectivity between interfaces, are also introduced, based on an extendable software architecture specification language model. We provide a new idea for solving the problem of connection between reusable components.

  16. An Evaluation on the Usage of Intelligent Video Analysis Software for Marketing Strategies

    Directory of Open Access Journals (Sweden)

    Kadri Gökhan Yılmaz

    2013-12-01

    Full Text Available This study investigates the historical development of the relation between companies and technology. In particular, it focuses on new technology adoption in the retail industry, due to both the widespread use of technology in this sector and its technology-guiding role. The usage of one of the current new technologies, intelligent video analysis software systems, in the retail industry is evaluated, and measures for such systems are determined.

  17. CONAN: copy number variation analysis software for genome-wide association studies

    Directory of Open Access Journals (Sweden)

    Wichmann Heinz-Erich

    2010-06-01

    Full Text Available Abstract Background Genome-wide association studies (GWAS) based on single nucleotide polymorphisms (SNPs) revolutionized our perception of the genetic regulation of complex traits and diseases. Copy number variations (CNVs) promise to shed additional light on the genetic basis of monogenic as well as complex diseases and phenotypes. Indeed, the number of detected associations between CNVs and certain phenotypes is constantly increasing. However, while several software packages support the determination of CNVs from SNP chip data, the downstream statistical inference of CNV-phenotype associations is still subject to complicated and inefficient in-house solutions, thus strongly limiting the performance of GWAS based on CNVs. Results CONAN is a freely available client-server software solution which provides an intuitive graphical user interface for categorizing, analyzing and associating CNVs with phenotypes. Moreover, CONAN assists the evaluation process by visualizing detected associations via Manhattan plots in order to enable a rapid identification of genome-wide significant CNV regions. Various file formats including the information on CNVs in population samples are supported as input data. Conclusions CONAN facilitates the performance of GWAS based on CNVs and the visual analysis of calculated results. CONAN provides a rapid, valid and straightforward software solution to identify genetic variation underlying the 'missing' heritability for complex traits that remains unexplained by recent GWAS. The freely available software can be downloaded at http://genepi-conan.i-med.ac.at.
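
    CONAN's own association engine is not described in detail in the record; as a minimal, hypothetical Python sketch of the kind of downstream test it automates, the code below cross-tabulates CNV carrier status against a binary phenotype and applies a chi-square test. The cohort is a toy example, and scipy is assumed to be available.

```python
import numpy as np
from scipy.stats import chi2_contingency

def cnv_association(carrier, case):
    """Test a CNV region for association with a binary phenotype.

    carrier : booleans, True if the sample carries a CNV in the region
    case    : booleans, True for affected samples
    Returns the 2x2 contingency table and the chi-square p-value.
    """
    carrier = np.asarray(carrier, bool)
    case = np.asarray(case, bool)
    table = np.array([
        [np.sum(carrier & case),  np.sum(carrier & ~case)],
        [np.sum(~carrier & case), np.sum(~carrier & ~case)],
    ])
    _, p, _, _ = chi2_contingency(table)
    return table, p

# Toy cohort of 200 samples: CNV carriers are enriched among the 100 cases.
carrier = [True] * 30 + [False] * 70 + [True] * 10 + [False] * 90
case = [True] * 100 + [False] * 100
table, p = cnv_association(carrier, case)
print(table, p)
```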

  18. RCAUSE – A ROOT CAUSE ANALYSIS MODEL TO IDENTIFY THE ROOT CAUSES OF SOFTWARE REENGINEERING PROBLEMS

    Directory of Open Access Journals (Sweden)

    Er. Anand Rajavat

    2011-01-01

    Full Text Available Organizations that wish to modernize their legacy systems must adopt a financially viable evolution strategy to satisfy the needs of the modern business environment. Various options are available to modernize a legacy system into a more contemporary system. Over the last few years, legacy system reengineering has emerged as a popular system modernization technique. Reengineering generally focuses on increased productivity and quality of the system. However, many of these efforts are less than successful because they concentrate only on the symptoms of software reengineering risk without targeting the root causes of those risks. A subjective assessment (diagnosis) of software reengineering risk from the different domains of a legacy system is required to identify the root causes of those risks. The goal of this paper is to highlight the root causes of software reengineering risk. We propose a root cause analysis model, RCause, that classifies the root causes of software reengineering risk into three distinct but connected areas of interest, i.e. the system domain, the managerial domain and the technical domain.

  19. Incorporation of Cutting for Customized Suits

    Institute of Scientific and Technical Information of China (English)

    XU Ji-hong; ZHANG Wen-bin

    2006-01-01

    The theoretical male body sizes and their distribution plan are studied in consideration of the national standard and the information provided by several garment companies. Through nine marking trials of two suits with the CAD marking module, the fabric lengths and rates are derived. The formulas for the marking fabric length, the length total, and the girth total are obtained by using SPSS software. Moreover, by comparing the two incorporation methods of cutting, one in arithmetic sequence and the other in geometric sequence, it is found that the one in arithmetic sequence is better than the one in geometric sequence.

  20. Basic analysis of reflectometry data software package for the analysis of multilayered structures according to reflectometry data

    International Nuclear Information System (INIS)

    The main principles of developing the Basic Analysis of Reflectometry Data (BARD) software package, which is aimed at obtaining a unified (standardized) tool for analyzing the structure of thin multilayer films and nanostructures of different nature based on reflectometry data, are considered. This software package contains both traditionally used procedures for processing reflectometry data and the authors’ original developments on the basis of new methods for carrying out and analyzing reflectometry experiments. The structure of the package, its functional possibilities, examples of application, and prospects of development are reviewed.