WorldWideScience

Sample records for analysis software suite

  1. The ESA's Space Trajectory Analysis software suite

    Science.gov (United States)

    Ortega, Guillermo

    The European Space Agency (ESA) initiated in 2005 an internal activity to develop an open source software suite involving university science departments and research institutions all over the world. This project is called the "Space Trajectory Analysis" or STA. This article describes the birth of STA and its present configuration. One of STA's aims is to promote the exchange of technical ideas and to raise knowledge and competence in the areas of applied mathematics, space engineering, and informatics at the university level. Conceived as a research and education tool to support the analysis phase of a space mission, STA is able to visualize a wide range of space trajectories. These include among others ascent, re-entry, descent and landing trajectories, orbits around planets and moons, interplanetary trajectories, rendezvous trajectories, etc. The article explains that the STA project is an original idea of the Technical Directorate of ESA. It was born in August 2005 to provide a framework for astrodynamics research at the university level. As research and education software applicable to academia, a number of universities support this development by joining ESA in leading it. The partnership between ESA and the universities is expressed in the STA Steering Board. Together with ESA, each university has a chair on the board whose tasks are to develop, control, promote, maintain, and expand the software suite. The article describes that STA provides calculations in the fields of spacecraft tracking, attitude analysis, coverage and visibility analysis, orbit determination, position and velocity of solar system bodies, etc. STA implements the concept of a "space scenario" composed of solar system bodies, spacecraft, ground stations, pads, etc. It is able to propagate the orbit of a spacecraft using the orbital propagators included. STA is able to compute communication links between objects of a scenario (coverage, line of sight), and to represent the trajectory computations and
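
    A flavor of the propagation such a tool performs can be sketched in a few lines; the Python snippet below integrates the Newtonian two-body equations of motion with SciPy (a minimal illustration with invented initial conditions, not STA code or its interfaces):

        import numpy as np
        from scipy.integrate import solve_ivp

        MU_EARTH = 398600.4418  # Earth's gravitational parameter, km^3/s^2

        def two_body(t, state):
            # Newtonian two-body acceleration: a = -mu * r / |r|^3
            r, v = state[:3], state[3:]
            return np.concatenate([v, -MU_EARTH * r / np.linalg.norm(r) ** 3])

        # Illustrative low-Earth-orbit state: position (km), velocity (km/s)
        state0 = np.array([7000.0, 0.0, 0.0, 0.0, 7.546, 0.0])
        sol = solve_ivp(two_body, (0.0, 5400.0), state0, rtol=1e-9, atol=1e-12)
        print("position after 90 min:", sol.y[:3, -1])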

  2. eXtended CASA Line Analysis Software Suite (XCLASS)

    CERN Document Server

    Möller, T; Schilke, P

    2015-01-01

    The eXtended CASA Line Analysis Software Suite (XCLASS) is a toolbox for the Common Astronomy Software Applications package (CASA) containing new functions for modeling interferometric and single dish data. Among the tools is the myXCLASS program, which calculates synthetic spectra by solving the radiative transfer equation for an isothermal object in one dimension, where finite source size and dust attenuation are taken into account as well. Molecular data required by the myXCLASS program are taken from an embedded SQLite3 database containing entries from the Cologne Database for Molecular Spectroscopy (CDMS) and JPL using the Virtual Atomic and Molecular Data Center (VAMDC) portal. Additionally, the toolbox provides an interface for the model optimizer package Modeling and Analysis Generic Interface for eXternal numerical codes (MAGIX), which helps to find the best description of observational data using myXCLASS (or another external model program), i.e., finding the parameter set that most closely reproduces t...

  3. eXtended CASA Line Analysis Software Suite (XCLASS)

    Science.gov (United States)

    Möller, T.; Endres, C.; Schilke, P.

    2017-01-01

    The eXtended CASA Line Analysis Software Suite (XCLASS) is a toolbox for the Common Astronomy Software Applications package (CASA) containing new functions for modeling interferometric and single dish data. Among the tools is the myXCLASS program, which calculates synthetic spectra by solving the radiative transfer equation for an isothermal object in one dimension, where finite source size and dust attenuation are taken into account as well. Molecular data required by the myXCLASS program are taken from an embedded SQLite3 database containing entries from the Cologne Database for Molecular Spectroscopy (CDMS) and JPL using the Virtual Atomic and Molecular Data Center (VAMDC) portal. Additionally, the toolbox provides an interface for the model optimizer package Modeling and Analysis Generic Interface for eXternal numerical codes (MAGIX), which helps to find the best description of observational data using myXCLASS (or another external model program), that is, finding the parameter set that most closely reproduces the data. http://www.astro.uni-koeln.de/projects/schilke/myXCLASSInterface A copy of the code is available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/598/A7
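
    The core quantity myXCLASS computes is the emergent intensity of an isothermal slab. A standard form of that one-dimensional radiative transfer solution (notation assumed here, not transcribed from the XCLASS documentation) is

        $$ I_\nu = \eta \left[ B_\nu(T_{\mathrm{ex}}) - I_\nu^{\mathrm{bg}} \right] \left( 1 - e^{-\tau_\nu} \right) + I_\nu^{\mathrm{bg}} $$

    where $B_\nu$ is the Planck function at excitation temperature $T_{\mathrm{ex}}$, $\tau_\nu$ the combined line and dust optical depth, and $\eta$ a beam filling factor accounting for the finite source size.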

  4. A Comprehensive Software Suite for the Analysis of cDNAs

    Institute of Scientific and Technical Information of China (English)

    Kazuharu Arakawa; Haruo Suzuki; Kosuke Fujishima; Kenji Fujimoto; Sho Ueda; Motomu Matsui; Masaru Tomita

    2005-01-01

    We have developed a comprehensive software suite for bioinformatics research of cDNAs; it is aimed at rapid characterization of the features of genes and the proteins they encode. Methods implemented include the detection of translation initiation and termination signals, statistical analysis of codon usage, comparative study of amino acid composition, comparative modeling of the structures of product proteins, prediction of alternative splice forms, and metabolic pathway reconstruction. The software package is freely available under the GNU General Public License at http://www.g-language.org/data/cdna/.
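
    Of the listed methods, the codon-usage statistics are the easiest to illustrate; a minimal Python sketch (schematic, not the G-language implementation):

        from collections import Counter

        def codon_usage(cds):
            """Relative codon frequencies in a coding sequence (assumed in-frame)."""
            cds = cds.upper()
            codons = [cds[i:i + 3] for i in range(0, len(cds) - 2, 3)]
            counts = Counter(codons)
            total = sum(counts.values())
            return {codon: n / total for codon, n in counts.items()}

        print(codon_usage("ATGGCTGCTAAATAA"))  # toy CDS: ATG GCT GCT AAA TAA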

  5. Tokamak-independent software analysis suite for multi-spectral line-polarization MSE diagnostics

    Science.gov (United States)

    Scott, S. D.; Mumgaard, R. T.

    2016-11-01

    A tokamak-independent analysis suite has been developed to process data from Motional Stark Effect (MSE) diagnostics. The software supports multi-spectral line-polarization MSE diagnostics which simultaneously measure emission at the MSE σ and π lines as well as at two "background" wavelengths that are displaced from the MSE spectrum by a few nanometers. This analysis accurately estimates the amplitude of partially polarized background light at the σ and π wavelengths even in situations where the background light changes rapidly in time and space, a distinct improvement over traditional "time-interpolation" background estimation. The signal amplitude at many frequencies is computed using a numerical-beat algorithm which allows the retardance of the MSE photo-elastic modulators (PEMs) to be monitored during routine operation. It also allows the use of summed intensities at multiple frequencies in the calculation of polarization direction, which increases the effective signal strength and reduces sensitivity to PEM retardance drift. The software allows the polarization angles to be corrected for calibration drift using a system that illuminates the MSE diagnostic with polarized light at four known polarization angles within ten seconds of a plasma discharge. The software suite is modular, parallelized, and portable to other facilities.
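
    In dual-PEM polarimetry of this kind, the polarization direction is recovered from the ratio of signal amplitudes at harmonics of the two PEM drive frequencies. A schematic Python version (the harmonic choice and calibration factor below are illustrative assumptions, not the published algorithm):

        import numpy as np

        def polarization_angle(a_2f1, a_2f2, k=1.0):
            """Polarization angle from amplitudes at the second harmonics of the
            two PEM drive frequencies; k absorbs retardance/gain calibration."""
            return 0.5 * np.arctan2(a_2f1, k * a_2f2)  # radians

        # The amplitudes would come from the numerical-beat demodulation of the
        # detector signal; toy values here.
        print(np.degrees(polarization_angle(0.42, 0.37)))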

  6. The MetaProteomeAnalyzer: a powerful open-source software suite for metaproteomics data analysis and interpretation.

    Science.gov (United States)

    Muth, Thilo; Behne, Alexander; Heyer, Robert; Kohrs, Fabian; Benndorf, Dirk; Hoffmann, Marcus; Lehtevä, Miro; Reichl, Udo; Martens, Lennart; Rapp, Erdmann

    2015-03-06

    The enormous challenges of mass spectrometry-based metaproteomics are primarily related to the analysis and interpretation of the acquired data. This includes reliable identification of mass spectra and the meaningful integration of taxonomic and functional meta-information from samples containing hundreds of unknown species. To ease these difficulties, we developed a dedicated software suite, the MetaProteomeAnalyzer, an intuitive open-source tool for metaproteomics data analysis and interpretation, which includes multiple search engines and the ability to decrease data redundancy by grouping protein hits into so-called meta-proteins. We also designed a graph database back-end for the MetaProteomeAnalyzer to allow seamless analysis of results. The functionality of the MetaProteomeAnalyzer is demonstrated using a sample of a microbial community taken from a biogas plant.
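
    The meta-protein idea can be illustrated with a toy grouping rule, merging protein hits whenever they share an identified peptide (hypothetical data structures; the actual grouping rules in the MetaProteomeAnalyzer are richer):

        def group_meta_proteins(protein_peptides):
            """Merge proteins into meta-proteins whenever they share a peptide.
            protein_peptides: dict mapping protein id -> set of peptide sequences."""
            groups = []
            for protein, peptides in protein_peptides.items():
                merged = [g for g in groups if g["peptides"] & peptides]
                for g in merged:
                    groups.remove(g)
                union, members = set(peptides), {protein}
                for g in merged:
                    union |= g["peptides"]
                    members |= g["proteins"]
                groups.append({"proteins": members, "peptides": union})
            return groups

        hits = {"P1": {"AAK", "GLR"}, "P2": {"GLR", "MMT"}, "P3": {"QQS"}}
        print(group_meta_proteins(hits))  # P1 and P2 merge; P3 stays alone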

  7. ORBS, ORCS, OACS, a Software Suite for Data Reduction and Analysis of the Hyperspectral Imagers SITELLE and SpIOMM

    Science.gov (United States)

    Martin, T.; Drissen, L.; Joncas, G.

    2015-09-01

    SITELLE (installed in 2015 at the Canada-France-Hawaii Telescope) and SpIOMM (a prototype attached to the Observatoire du Mont-Mégantic) are the first Imaging Fourier Transform Spectrometers (IFTS) capable of obtaining a hyperspectral data cube which samples a 12-arcminute field of view into four million visible spectra. The result of each observation is made up of two interferometric data cubes which need to be merged, corrected, transformed and calibrated in order to get a spectral cube of the observed region ready to be analysed. ORBS is a fully automatic data reduction software that has been entirely designed for this purpose. The data size (up to 68 GB for the larger science cases) and the computational needs have been challenging, and the highly parallelized object-oriented architecture of ORBS reflects the solutions adopted, which made it possible to process 68 GB of raw data in less than 11 hours using 8 cores and 22.6 GB of RAM. It is based on a core framework (ORB) that has been designed to support the whole software suite for data analysis (ORCS and OACS), data simulation (ORUS) and data acquisition (IRIS). They all aim to provide a strong basis for the creation and development of specialized analysis modules that could benefit the scientific community working with SITELLE and SpIOMM.
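
    At the heart of any IFTS reduction is the Fourier transform of each pixel's interferogram into a spectrum; a bare-bones Python illustration (the apodization, phase correction, and wavelength calibration ORBS performs are omitted):

        import numpy as np

        def cube_to_spectra(interferogram_cube, step_cm=1.0e-5):
            """FFT each pixel's interferogram (axis 2) into a spectrum.
            interferogram_cube: array of shape (ny, nx, n_steps)."""
            spectra = np.abs(np.fft.rfft(interferogram_cube, axis=2))
            # Wavenumber axis from the optical path difference sampling step
            sigma = np.fft.rfftfreq(interferogram_cube.shape[2], d=step_cm)
            return sigma, spectra

        cube = np.random.rand(4, 4, 256)  # toy 4x4-pixel cube, 256 OPD steps
        sigma, spectra = cube_to_spectra(cube)
        print(spectra.shape)  # (4, 4, 129)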

  8. Analysis of Array-CGH Data Using the R and Bioconductor Software Suite

    Directory of Open Access Journals (Sweden)

    Winfried A. Hofmann

    2009-01-01

    Background. Array-based comparative genomic hybridization (array-CGH) is an emerging high-resolution and high-throughput molecular genetic technique that allows genome-wide screening for chromosome alterations. DNA copy number alterations (CNAs) are a hallmark of somatic mutations in tumor genomes and congenital abnormalities that lead to diseases such as mental retardation. However, accurate identification of amplified or deleted regions requires a sequence of different computational analysis steps of the microarray data. Results. We have developed a user-friendly and versatile tool for the normalization, visualization, breakpoint detection, and comparative analysis of array-CGH data which allows the accurate and sensitive detection of CNAs. Conclusion. The implemented option for the determination of minimal altered regions (MARs) from a series of tumor samples is a step forward in the identification of new tumor suppressor genes or oncogenes.
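
    The normalization and calling steps can be caricatured in a few lines of Python rather than R (median-centering of log2 ratios with thresholded gain/loss calls; real Bioconductor pipelines add segmentation and statistics):

        import numpy as np

        def call_cnas(log2_ratios, threshold=0.3):
            """Median-center log2 ratios, then call gains/losses per probe."""
            centered = log2_ratios - np.median(log2_ratios)
            calls = np.where(centered > threshold, "gain",
                             np.where(centered < -threshold, "loss", "normal"))
            return centered, calls

        ratios = np.array([0.05, 0.62, 0.58, -0.45, 0.01])
        print(call_cnas(ratios)[1])  # ['normal' 'gain' 'gain' 'loss' 'normal']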

  9. The BTeV Software Tutorial Suite

    Energy Technology Data Exchange (ETDEWEB)

    Robert K. Kutschke

    2004-02-20

    The BTeV Collaboration is starting to develop its C++ based offline software suite, an integral part of which is a series of tutorials. These tutorials are targeted at a diverse audience, including new graduate students, experienced physicists with little or no C++ experience, those with just enough C++ to be dangerous, and experts who need only an overview of the available tools. The tutorials must teach both C++ in general and the BTeV-specific tools in particular. Finally, they must teach physicists how to find and use the detailed documentation. This report will review the status of the BTeV experiment, give an overview of the plans for and the state of the software, and will then describe the plans for the tutorial suite.

  10. A metrics suite for coupling measurement of software architecture

    Institute of Scientific and Technical Information of China (English)

    KONG Qing-yan; LUN Li-jun; ZHAO Jia-hua; WANG Yi-he

    2009-01-01

    To better evaluate the quality of software architecture, a metrics suite is proposed to measure the coupling of software architecture models, in which CBC is used to measure the coupling between components, CBCC is used to measure the coupling of transferring message between components, CBCCT is used to measure the coupling of software architecture, WCBCC is used to measure the coupling of transferring message with weight between components, and WCBCCT is used to measure the coupling of message transmission with weight in the whole software architecture. The proposed algorithm for the coupling metrics is applied to the design of server software architecture. Analysis of an example validates the feasibility of this metrics suite.
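
    The abstract does not define the metrics formally, so the following Python sketch is only a plausible reading of component-level coupling counts (names and data layout are hypothetical):

        def cbc(links, c1, c2):
            """CBC-style count of message links between two components.
            links: iterable of (sender, receiver, weight) triples."""
            return sum(1 for s, r, _ in links if {s, r} == {c1, c2})

        def wcbcc(links, c1, c2):
            """WCBCC-style weighted message traffic between two components."""
            return sum(w for s, r, w in links if {s, r} == {c1, c2})

        links = [("UI", "Core", 2.0), ("Core", "DB", 1.0), ("UI", "Core", 0.5)]
        print(cbc(links, "UI", "Core"), wcbcc(links, "UI", "Core"))  # 2 2.5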

  11. Engineering Software Suite Validates System Design

    Science.gov (United States)

    2007-01-01

    EDAptive Computing Inc.'s (ECI) EDAstar engineering software tool suite, created to capture and validate system design requirements, was significantly funded by NASA's Ames Research Center through five Small Business Innovation Research (SBIR) contracts. These programs specifically developed Syscape, used to capture executable specifications of multi-disciplinary systems, and VectorGen, used to automatically generate tests to ensure system implementations meet specifications. According to the company, the VectorGen tests considerably reduce the time and effort required to validate implementation of components, thereby ensuring their safe and reliable operation. EDASHIELD, an additional product offering from ECI, can be used to diagnose, predict, and correct errors after a system has been deployed using EDASTAR-created models. Initial commercialization for EDASTAR included application by a large prime contractor in a military setting, and customers include various branches within the U.S. Department of Defense, industry giants like the Lockheed Martin Corporation, Science Applications International Corporation, and Ball Aerospace and Technologies Corporation, as well as NASA's Langley and Glenn Research Centers.

  12. Strengthening Software Authentication with the ROSE Software Suite

    Energy Technology Data Exchange (ETDEWEB)

    White, G

    2006-06-15

    Many recent nonproliferation and arms control software projects include a software authentication regime. These include U.S. Government-sponsored projects both in the United States and in the Russian Federation (RF). This trend toward requiring software authentication is only accelerating. Demonstrating assurance that software performs as expected without hidden "backdoors" is crucial to a project's success. In this context, "authentication" is defined as determining that a software package performs only its intended purpose and performs said purpose correctly and reliably over the planned duration of an agreement. In addition to visual inspections by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and not extensible. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool has to be based on a complete language compiler. ROSE is precisely such a compiler infrastructure developed within the Department of Energy (DOE) and targeted at the optimization of scientific applications and user-defined libraries within large-scale applications (typically applications of a million lines of code). ROSE is a robust, source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C and C++ (handling the full C, C99, C++ languages and with current collaborations to support Fortran90). We propose to extend ROSE to address a number of security-specific requirements, and apply it to software authentication for nonproliferation and arms control projects.
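
    ROSE itself is a C++ compiler infrastructure, but the flavor of rule-based detection of suspicious constructs can be conveyed with Python's own ast module (an analogy only, not the ROSE API):

        import ast

        SUSPICIOUS = {"eval", "exec", "system"}  # illustrative rule set

        def find_suspicious_calls(source):
            """Flag calls whose target name is on a deny list."""
            findings = []
            for node in ast.walk(ast.parse(source)):
                if isinstance(node, ast.Call):
                    name = getattr(node.func, "id", getattr(node.func, "attr", ""))
                    if name in SUSPICIOUS:
                        findings.append((node.lineno, name))
            return findings

        print(find_suspicious_calls("import os\nos.system('ls')\n"))  # [(2, 'system')]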

  13. CAMEO (Computer-Aided Management of Emergency Operations) Software Suite

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — CAMEO is the umbrella name for a system of software applications used widely to plan for and respond to chemical emergencies. All of the programs in the suite work...

  14. RIBER/DIBER: a software suite for crystal content analysis in the studies of protein-nucleic acid complexes.

    Science.gov (United States)

    Chojnowski, Grzegorz; Bujnicki, Janusz M; Bochtler, Matthias

    2012-03-15

    Co-crystallization experiments of proteins with nucleic acids do not guarantee that both components are present in the crystal. We have previously developed DIBER to predict crystal content when protein and DNA are present in the crystallization mix. Here, we present RIBER, which should be used when protein and RNA are in the crystallization drop. The combined RIBER/DIBER suite builds on machine learning techniques to make reliable, quantitative predictions of crystal content for non-expert users and high-throughput crystallography.

  15. Tier-3 Monitoring Software Suite (T3MON) proposal

    CERN Document Server

    Andreeva, J; The ATLAS collaboration; Klimentov, A; Korenkov, V; Oleynik, D; Panitkin, S; Petrosyan, A

    2011-01-01

    The ATLAS Distributed Computing activities have so far concentrated in the “central” part of the computing system of the experiment, namely the first 3 tiers (the CERN Tier0, the 10 Tier1 centres and the 60+ Tier2s). This is a coherent system to perform data processing and management on a global scale and to host (re)processing and simulation activities down to group and user analysis. Many ATLAS institutes and national communities have built (or plan to build) Tier-3 facilities. The definition of the Tier-3 concept has been outlined (REFERENCE). Tier-3 centres consist of non-pledged resources, mostly dedicated to data analysis by geographically close or local scientific groups. Tier-3 sites comprise a range of architectures and many do not possess Grid middleware, which would render application of Tier-2 monitoring systems useless. This document describes a strategy to develop a software suite for monitoring of Tier-3 sites. This software suite will enable local monitoring of the Tier-3 sites and the global vie...

  16. Controlatron Neutron Tube Test Suite Software Manual - Operation Manual (V2.2)

    CERN Document Server

    Noel, W P; Hertrich, R J; Martinez, M L; Wallace, D L

    2002-01-01

    The Controlatron Software Suite is a custom-built application to perform automated testing of Controlatron neutron tubes. The software package was designed to allow users to design tests and to run a series of test suites on a tube. The data are output to ASCII files of a pre-defined format for data analysis and viewing with the Controlatron Data Viewer Application. This manual discusses the operation of the Controlatron Test Suite Software and gives a brief discussion of state machine theory, as a state machine is the functional basis of the software.

  17. Extending and Enhancing SAS (Static Analysis Suite)

    CERN Document Server

    Ho, David

    2016-01-01

    The Static Analysis Suite (SAS) is an open-source software package used to perform static analysis on C and C++ code, helping to ensure safety, readability and maintainability. In this Summer Student project, SAS was enhanced to improve ease of use and user customisation. A straightforward method of integrating static analysis into a project at compilation time was provided using the automated build tool CMake. The process of adding checkers to the suite was streamlined and simplified by developing an automatic code generator. To make SAS more suitable for continuous integration, a reporting mechanism summarising results was added. This suitability has been demonstrated by inclusion of SAS in the Future Circular Collider Software nightly build system. Scalability of the improved package was demonstrated by using the tool to analyse the ROOT code base.

  18. Recent developments in the tmLQCD software suite

    CERN Document Server

    Abdel-Rehim, Abdou; Deuzeman, Alber; Jansen, Karl; Kostrzewa, Bartosz; Scorzato, Luigi; Urbach, Carsten

    2013-01-01

    We present an overview of recent developments in the tmLQCD software suite. We summarise the features of the code, including actions and operators implemented. In particular, we discuss the optimisation efforts for modern architectures using the Blue Gene/Q system as an example.

  19. Approaches in highly parameterized inversion—PEST++ Version 3, a Parameter ESTimation and uncertainty analysis software suite optimized for large environmental models

    Science.gov (United States)

    Welter, David E.; White, Jeremy T.; Hunt, Randall J.; Doherty, John E.

    2015-09-18

    The PEST++ Version 1 object-oriented parameter estimation code is here extended to Version 3 to incorporate additional algorithms and tools to further improve support for large and complex environmental modeling problems. PEST++ Version 3 includes the Gauss-Marquardt-Levenberg (GML) algorithm for nonlinear parameter estimation, Tikhonov regularization, integrated linear-based uncertainty quantification, options of integrated TCP/IP based parallel run management or external independent run management by use of a Version 2 update of the GENIE Version 1 software code, and utilities for global sensitivity analyses. The Version 3 code design is consistent with PEST++ Version 1 and continues to be designed to lower the barriers of entry for users as well as developers while providing efficient and optimized algorithms capable of accommodating large, highly parameterized inverse problems. As such, this effort continues the original focus of (1) implementing the most popular and powerful features of the PEST software suite in a fashion that is easy for novice or experienced modelers to use and (2) developing a software framework that is easy to extend.
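
    The GML step at the core of such codes is the classic damped normal-equations update; in conventional notation (assumed here, not transcribed from the PEST++ documentation):

        $$ \left( \mathbf{J}^{\mathsf{T}} \mathbf{Q} \mathbf{J} + \lambda \mathbf{I} \right) \Delta \mathbf{p} = \mathbf{J}^{\mathsf{T}} \mathbf{Q} \, \mathbf{r} $$

    where $\mathbf{J}$ is the Jacobian of model outputs with respect to parameters, $\mathbf{Q}$ the observation weight matrix, $\mathbf{r}$ the current residual vector, and $\lambda$ the Marquardt damping parameter; Tikhonov regularization augments $\mathbf{J}$ and $\mathbf{r}$ with preferred-value constraints.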

  20. Beyond Petascale with the HipGISAXS Software Suite

    Science.gov (United States)

    Hexemer, Alexander; Li, Sherry; Chourou, Slim; Sarje, Abhinav

    2014-03-01

    We have developed HipGISAXS, a software suite to analyze GISAXS and SAXS data for structural characterization of materials at the nanoscale using X-rays. The software has been developed as a massively-parallel system capable of harnessing the raw computational power offered by clusters and supercomputers built using graphics processors (GPUs), Intel Phi co-processors, or commodity multi-core CPUs. Currently the forward GISAXS simulation is a major component of HipGISAXS, which simulates the X-ray scattering process based on the Distorted Wave Born Approximation (DWBA) theory, for any given nanostructures and morphologies with a set of experimental configurations. These simulations are compute-intensive and have a high degree of parallelism available, making them well-suited for fine-grained parallel computations on highly parallel many-core processors like GPUs. Furthermore, a large number of such simulations can be carried out simultaneously for various experimental input parameters. HipGISAXS also includes a Reverse Monte Carlo based modeling tool for SAXS data. With HipGISAXS we have demonstrated a sustained compute performance of over 1 Petaflop on 8000 GPU nodes of the Titan supercomputer at ORNL, and have shown it to be highly scalable.

  1. User Guide for the STAYSL PNNL Suite of Software Tools

    Energy Technology Data Exchange (ETDEWEB)

    Greenwood, Lawrence R.; Johnson, Christian D.

    2013-02-27

    The STAYSL PNNL software suite provides a set of tools for working with neutron activation rates measured in a nuclear fission reactor, an accelerator-based neutron source, or any neutron field to determine the neutron flux spectrum through a generalized least-squares approach. This process is referred to as neutron spectral adjustment since the preferred approach is to use measured data to adjust neutron spectra provided by neutron physics calculations. The input data consist of the reaction rates based on measured activities, an initial estimate of the neutron flux spectrum, neutron activation cross sections and their associated uncertainties (covariances), and relevant correction factors. The output consists of the adjusted neutron flux spectrum and associated covariance matrix, which is useful for neutron dosimetry and radiation damage calculations.
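
    The generalized least-squares adjustment described here has the standard form; in conventional notation (assumed, not quoted from the STAYSL PNNL manual):

        $$ \boldsymbol{\phi}' = \boldsymbol{\phi} + \mathbf{C}_{\phi} \mathbf{A}^{\mathsf{T}} \left( \mathbf{A} \mathbf{C}_{\phi} \mathbf{A}^{\mathsf{T}} + \mathbf{C}_{R} \right)^{-1} \left( \mathbf{R} - \mathbf{A} \boldsymbol{\phi} \right) $$

    where $\boldsymbol{\phi}$ is the prior group flux spectrum, $\mathbf{A}$ the matrix of group-folded activation cross sections, $\mathbf{R}$ the measured reaction rates, and $\mathbf{C}_{\phi}$, $\mathbf{C}_{R}$ their covariance matrices; the adjusted covariance matrix follows analogously.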

  2. Digitizing Special Collections Using the CONTENTdm Software Suite.

    Science.gov (United States)

    Bond, Trevor; Cornish, Alan

    2002-01-01

    Describes the CONTENTdm software used by Washington State University Libraries to enhance its online collection building efforts. Topics include online image collections; software architecture; providing access to collections; integrating CONTENTdm with other imaging products; improving reference; and access for remote users. (LRW)

  3. MaTeLo: Automated Testing Suite for Software Validation

    Science.gov (United States)

    Guiotto, A.; Acquaroli, B.; Martelli, A.

    It is universally known that testing has a predominant role when developing software: more and more effort is spent on testing to detect programming faults, to evaluate code reliability or performance, and to ensure that a critical function of a system meets given requirements. The share of time spent on testing should not be neglected, and this explains why there is a real need to improve the development process, especially as systems are becoming larger and larger. It is necessary to keep the schedule and budget of developments under control, and controlling the testing phase is a real issue, often underestimated in many industrial sectors. Industry is at different stages of maturity regarding testing, and the MaTeLo project is committed to promoting the use of statistical tools & methods to answer European industry's needs: • have the ability to choose relevant test cases instead of a human-biased selection • know when to stop testing (definition of a stopping criterion) instead of a vague and informal criterion • adopt an identical strategy for different developments • automate the testing process, and thus make testing less prone to human error. MaTeLo (Markov Test Logic) is a study currently under development in the frame of the IST program of the European Community. The aim of the project is to define, implement and validate a new approach for supporting the software testing activities in various industrial fields. One of the major goals is in particular to provide software teams with a new tool able to automatically produce and execute the test cases starting from the software specifications. Further, the tool is conceived to provide metrics that could help technical staff to determine software quality and to evaluate how well expected results are met. The tool is based on Markov chain theory and belongs to the family of statistical software testing tools [Runeson] [Whittaker].
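
    Statistical test generation from a Markov usage model reduces to weighted random walks; a minimal Python sketch (the states and probabilities are invented, and MaTeLo's model format and generation strategies are far richer):

        import random

        # Hypothetical usage model: state -> list of (next_state, probability)
        MODEL = {
            "start":  [("login", 1.0)],
            "login":  [("browse", 0.8), ("logout", 0.2)],
            "browse": [("browse", 0.5), ("logout", 0.5)],
        }

        def generate_test_case(model, start="start", end="logout", max_len=50):
            """One random walk through the usage model yields one test case."""
            state, path = start, [start]
            while state != end and len(path) < max_len:
                nxt, probs = zip(*model[state])
                state = random.choices(nxt, weights=probs)[0]
                path.append(state)
            return path

        random.seed(1)
        print(generate_test_case(MODEL))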

  4. Multi Objective Test Suite Reduction for GUI Based Software Using NSGA-II

    Directory of Open Access Journals (Sweden)

    Neha Chaudhary

    2016-08-01

    Regression testing is performed to ensure that modified code does not have any unintended side effects on the software. If regression testing is performed with the retest-all method it will be very time consuming as a testing activity. Therefore test suite reduction methods are used to reduce the size of the original test suite. The objective of test suite reduction is to remove those test cases which are redundant or less important in their fault-revealing capability. Test suite reduction can only be used when time is too critical to run all test cases and selective testing must be done. Various methods exist in the literature for test suite reduction of traditional software. Most of the methods are based on single-objective optimization. In the case of multi-objective optimization of a test suite, researchers usually assign different weight values to different objectives and combine them into a single objective. However, in test suite reduction multiple Pareto-optimal solutions are present, and it is difficult to select one test case over another. Since GUI-based software is our concern, very few reduction techniques exist and none of them consider multi-objective-based reduction. In this work we propose a new test suite reduction technique based on two objectives: event weight and the number of faults identified by a test case. We evaluated our results for 2 different applications and achieved a 20% reduction in test suite size for both applications. In the Terp Paint 3.0 application 15.6% of fault-revealing capability is compromised, and for Notepad 11.1% of fault-revealing capability is reduced.
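
    The two objectives lend themselves to the standard Pareto-dominance test at the core of NSGA-II; a schematic Python version (the objective encoding is an assumption):

        def dominates(a, b):
            """True if solution a Pareto-dominates b: no worse in all objectives
            and strictly better in at least one. Objectives here are
            (event_weight, faults_revealed), both to be maximized."""
            return (all(x >= y for x, y in zip(a, b))
                    and any(x > y for x, y in zip(a, b)))

        candidates = {"S1": (0.9, 12), "S2": (0.7, 12), "S3": (0.8, 15)}
        front = [n for n, obj in candidates.items()
                 if not any(dominates(o, obj)
                            for m, o in candidates.items() if m != n)]
        print(front)  # ['S1', 'S3'] -- the non-dominated reduced suites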

  5. Technical Note: DIRART- A software suite for deformable image registration and adaptive radiotherapy research

    Energy Technology Data Exchange (ETDEWEB)

    Yang Deshan; Brame, Scott; El Naqa, Issam; Apte, Aditya; Wu Yu; Murty Goddu, S.; Mutic, Sasa; Deasy, Joseph O.; Low, Daniel A. [Department of Radiation Oncology, School of Medicine, Washington University in Saint Louis, Missouri 63110 (United States)

    2011-01-15

    Purpose: Recent years have witnessed tremendous progress in image-guided radiotherapy technology and a growing interest in the possibilities for adapting treatment planning and delivery over the course of treatment. One obstacle faced by the research community has been the lack of a comprehensive open-source software toolkit dedicated to adaptive radiotherapy (ART). To address this need, the authors have developed a software suite called the Deformable Image Registration and Adaptive Radiotherapy Toolkit (DIRART). Methods: DIRART is an open-source toolkit developed in MATLAB. It is designed in an object-oriented style with a focus on user-friendliness, features, and flexibility. It contains four classes of DIR algorithms, including the newer inverse-consistency algorithms that provide consistent displacement vector fields in both directions. It also contains common ART functions, an integrated graphical user interface, a variety of visualization and image-processing features, dose metric analysis functions, and interface routines. These interface routines make DIRART a powerful complement to the Computational Environment for Radiotherapy Research (CERR) and popular image-processing toolkits such as ITK. Results: DIRART provides a set of image processing/registration algorithms and postprocessing functions to facilitate the development and testing of DIR algorithms. It also offers a wide range of options for DIR results visualization, evaluation, and validation. Conclusions: By exchanging data with treatment planning systems via DICOM-RT files and CERR, and by bringing image registration algorithms closer to radiotherapy applications, DIRART is potentially a convenient and flexible platform that may facilitate ART and DIR research.

  6. A new methane control and prediction software suite for longwall mines

    Science.gov (United States)

    Dougherty, Heather N.; Özgen Karacan, C.

    2011-09-01

    This paper presents technical and application aspects of a new software suite, MCP (Methane Control and Prediction), developed for addressing some of the methane and methane-control issues in longwall coal mines. The software suite consists of dynamic link library (DLL) extensions to MS-Access TM, written in C++. In order to create the DLLs, various statistical and mathematical approaches and prediction and classification artificial neural network (ANN) methods were used. The current version of the MCP suite (version 1.3), discussed in this paper, has four separate modules that (a) predict the dynamic elastic properties of coal-measure rocks, (b) predict ventilation emissions from longwall mines, (c) determine the type of degasification system that needs to be utilized for given situations and (d) assess the production performance of gob gas ventholes that are used to extract methane from longwall gobs. These modules can be used with data from basic logs and from mining, longwall panel, productivity, and coal bed characteristics. The application of these modules, separately or in combination, to methane capture and control related problems will help improve the safety of mines. Currently, its new version 2.0 is available and can be downloaded free of charge from http://www.cdc.gov/niosh/mining/products/product180.htm. The models discussed in this paper can be found under "ancillary models" and under "methane prediction models" for specific U.S. conditions in the new version.

  7. Data Analysis and Graphing in an Introductory Physics Laboratory: Spreadsheet versus Statistics Suite

    Science.gov (United States)

    Peterlin, Primoz

    2010-01-01

    Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared analysing data collected in three selected experiments taken from an introductory physics laboratory, which include a linear dependence, a nonlinear dependence and a histogram. The merits of each method are compared. (Contains 7…

  8. Software safety hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lawrence, J.D. [Lawrence Livermore National Lab., CA (United States)

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper.

  9. The AST3 controlling and operating software suite for automatic sky survey

    Science.gov (United States)

    Hu, Yi; Shang, Zhaohui; Ma, Bin; Hu, Keliang

    2016-07-01

    We have developed a specialized software package, called ast3suite, to achieve remote control and automatic sky surveying for AST3 (Antarctic Survey Telescope) from scratch. It includes several daemon servers and many basic commands. Each program does only one single task, and they work together to make AST3 a robotic telescope. A survey script calls the basic commands to carry out an automatic sky survey. Ast3suite was carefully tested in Mohe, China in 2013 and has been used at Dome A, Antarctica in 2015 and 2016 with the real hardware for practical sky surveys. Both test results and practical use showed that ast3suite worked very well without any manual assistance, as we expected.

  10. Xmipp 3.0: an improved software suite for image processing in electron microscopy.

    Science.gov (United States)

    de la Rosa-Trevín, J M; Otón, J; Marabini, R; Zaldívar, A; Vargas, J; Carazo, J M; Sorzano, C O S

    2013-11-01

    Xmipp is a specialized software package for image processing in electron microscopy, mainly focused on 3D reconstruction of macromolecules through single-particle analysis. In this article we present Xmipp 3.0, a major release which introduces several improvements and new developments over the previous version. A central improvement is the concept of a project that stores the entire processing workflow from data import to final results. It is now possible to monitor, reproduce and restart all computing tasks as well as graphically explore the complete set of interrelated tasks associated with a given project. Other graphical tools have also been improved, such as data visualization, particle picking and parameter "wizards" that allow the visual selection of some key parameters. Many standard image formats are transparently supported for input/output from all programs. Additionally, results have been standardized, facilitating the interoperation between different Xmipp programs. Finally, as a result of a large code refactoring, the underlying C++ libraries are better suited for future developments and all code has been optimized. Xmipp is an open-source package that is freely available for download from: http://xmipp.cnb.csic.es.

  11. ANALYSIS OF DESIGN ELEMENTS IN SKI SUITS

    Directory of Open Access Journals (Sweden)

    Birsen Çileroğlu

    2014-06-01

    The popularity of ski sport in the 19th century necessitated a new perspective on protective skiing clothing against mountain climates and excessive cold. Winter clothing was the basis of ski attire during this period. By the beginning of the 20th century, lining cloth was used to minimize the wind effect. The difference between the men's and women's ski attire of the time consisted of a knee-length skirt worn over golf trousers. Subsequent to the First World War, skiing suit models were influenced by the period's uniforms, and producers reflected fashion trends in ski clothing. In conformance with the prevailing trends, ski trousers were designed and produced for women, thus leading to a reduction in gender differences. Increases in ski tourism and the holding of the first Winter Olympics in 1924 resulted in variations in ski attires, development of design characteristics, growth in user numbers, and enlargement of production capacities. Designers emphasized the combined presence of elegance and practicality in their skiing attire collections. In the 1930s, ski suits influenced by pilots' uniforms included characteristics permitting freedom of motion, and the design elements exhibited changes in terms of style, material and aerodynamics. In time, ski attires showed varying design features distinguishing professionals from amateurs. While protective functionality was the primary consideration for amateurs, for professionals aerodynamic design was also a leading factor. Eventually, the increased differences in design characteristics were exhibited in ski suit collections, world-renowned brands were formed, and production and sales volumes rose significantly. During the 20th century, ski suits influenced by fashion trends acquired unique styles, reached a position of dominance to impact current fashion trends, and, apart from sports attire, became a style determinant in the clothing of cold climates.

  12. Software Suite for Gene and Protein Annotation Prediction and Similarity Search.

    Science.gov (United States)

    Chicco, Davide; Masseroli, Marco

    2015-01-01

    In the computational biology community, machine learning algorithms are key instruments for many applications, including the prediction of gene functions based upon the available biomolecular annotations. Additionally, they may also be employed to compute similarity between genes or proteins. Here, we describe and discuss a software suite we developed to implement and make publicly available some such prediction methods and a computational technique based upon Latent Semantic Indexing (LSI), which leverages both inferred and available annotations to search for semantically similar genes. The suite consists of three components. BioAnnotationPredictor is a computational software module to predict new gene functions based upon Singular Value Decomposition of available annotations. SimilBio is a Web module that leverages annotations available or predicted by BioAnnotationPredictor to discover similarities between genes via LSI. The suite also includes SemSim, a new Web service built upon these modules to allow accessing them programmatically. We integrated SemSim in the Bio Search Computing framework (http://www.bioinformatics.deib.polimi.it/bio-seco/seco/), where users can exploit the Search Computing technology to run multi-topic complex queries on multiple integrated Web services. Accordingly, researchers may obtain ranked answers involving the computation of the functional similarity between genes in support of biomedical knowledge discovery.
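
    The SVD-based prediction idea can be sketched briefly: truncate the SVD of the gene-annotation matrix and read the reconstructed values as scores for candidate annotations (a toy NumPy version, not the BioAnnotationPredictor code):

        import numpy as np

        def svd_predict(A, k=2):
            """Rank-k reconstruction of a binary gene x annotation matrix;
            high scores at zero entries suggest candidate new annotations."""
            U, s, Vt = np.linalg.svd(A, full_matrices=False)
            return (U[:, :k] * s[:k]) @ Vt[:k, :]

        A = np.array([[1, 1, 0, 0],
                      [1, 1, 1, 0],
                      [0, 0, 1, 1]], dtype=float)
        # Reconstructed scores; zero entries with high scores are candidates
        print(np.round(svd_predict(A, k=2), 2))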

  13. SAGES: a suite of freely-available software tools for electronic disease surveillance in resource-limited settings.

    Directory of Open Access Journals (Sweden)

    Sheri L Lewis

    Public health surveillance is undergoing a revolution driven by advances in the field of information technology. Many countries have experienced vast improvements in the collection, ingestion, analysis, visualization, and dissemination of public health data. Resource-limited countries have lagged behind due to challenges in information technology infrastructure, public health resources, and the costs of proprietary software. The Suite for Automated Global Electronic bioSurveillance (SAGES) is a collection of modular, flexible, freely-available software tools for electronic disease surveillance in resource-limited settings. One or more SAGES tools may be used in concert with existing surveillance applications, or the SAGES tools may be used en masse for an end-to-end biosurveillance capability. This flexibility allows for the development of an inexpensive, customized, and sustainable disease surveillance system. The ability to rapidly assess anomalous disease activity may lead to more efficient use of limited resources and better compliance with World Health Organization International Health Regulations.

  14. A Suite of Tools for ROC Analysis of Spatial Models

    Directory of Open Access Journals (Sweden)

    Hermann Rodrigues

    2013-09-01

    The Receiver Operating Characteristic (ROC) is widely used for assessing the performance of classification algorithms. In GIScience, ROC has been applied to assess models aimed at predicting events, such as land use/cover change (LUCC), species distribution and disease risk. However, GIS software packages offer few statistical tests and guidance tools for ROC analysis and interpretation. This paper presents a suite of GIS tools designed to facilitate ROC curve analysis for GIS users by applying proper statistical tests and analysis procedures. The tools are freely available as models and submodels of Dinamica EGO freeware. The tools give the ROC curve, the area under the curve (AUC), partial AUC, lower and upper AUCs, the confidence interval of AUC, the density of events in probability bins, and tests to evaluate the difference between the AUCs of two models. We first present the procedures and statistical tests implemented in Dinamica EGO, then the application of the tools to assess LUCC and species distribution models. Finally, we interpret and discuss the ROC-related statistics resulting from various case studies.
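
    For reference, the headline statistic these tools compute can be written compactly; a rank-sum AUC in Python (Dinamica EGO's implementations add partial AUC, confidence intervals, and map handling):

        import numpy as np

        def roc_auc(labels, scores):
            """AUC via the rank-sum formulation: the probability that a random
            positive outscores a random negative. labels: 1 = event, 0 = not."""
            labels, scores = np.asarray(labels), np.asarray(scores)
            order = np.argsort(scores)
            ranks = np.empty(len(scores))
            ranks[order] = np.arange(1, len(scores) + 1)
            n_pos, n_neg = labels.sum(), (1 - labels).sum()
            return (ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

        print(roc_auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # 0.75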

  15. A comprehensive software suite for protein family construction and functional site prediction

    Science.gov (United States)

    Haft, David Renfrew; Haft, Daniel H.

    2017-01-01

    In functionally diverse protein families, conservation in short signature regions may outperform full-length sequence comparisons for identifying proteins that belong to a subgroup within which one specific aspect of their function is conserved. The SIMBAL workflow (Sites Inferred by Metabolic Background Assertion Labeling) is a data-mining procedure for finding such signature regions. It begins by using clues from genomic context, such as co-occurrence or conserved gene neighborhoods, to build a useful training set from a large number of uncharacterized but mutually homologous proteins. When training set construction is successful, the YES partition is enriched in proteins that share function with the user’s query sequence, while the NO partition is depleted. A selected query sequence is then mined for short signature regions whose closest matches overwhelmingly favor proteins from the YES partition. High-scoring signature regions typically contain key residues critical to functional specificity, so proteins with the highest sequence similarity across these regions tend to share the same function. The SIMBAL algorithm was described previously, but significant manual effort, expertise, and a supporting software infrastructure were required to prepare the requisite training sets. Here, we describe a new, distributable software suite that speeds up and simplifies the process for using SIMBAL, most notably by providing tools that automate training set construction. These tools have broad utility for comparative genomics, allowing for flexible collection of proteins or protein domains based on genomic context as well as homology, a capability that can greatly assist in protein family construction. Armed with this new software suite, SIMBAL can serve as a fast and powerful in silico alternative to direct experimentation for characterizing proteins and their functional interactions. PMID:28182651
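
    The signature-mining step can be caricatured as sliding-window scoring of the query against the YES and NO partitions; the toy Python below uses shared k-mers as a stand-in for the sequence-similarity scores SIMBAL actually uses:

        def kmer_set(seq, k=5):
            return {seq[i:i + k] for i in range(len(seq) - k + 1)}

        def window_scores(query, yes_seqs, no_seqs, width=20):
            """Score each query window by how strongly its matches favor the
            YES partition over the NO partition (toy k-mer overlap metric)."""
            scores = []
            for i in range(len(query) - width + 1):
                w = kmer_set(query[i:i + width])
                yes = sum(len(w & kmer_set(s)) for s in yes_seqs)
                no = sum(len(w & kmer_set(s)) for s in no_seqs)
                scores.append((i, yes - no))
            return scores

        yes = ["MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"]
        no = ["MSDNGPQNQRNAPRITFGGPSDSTGSNQNGERS"]
        print(window_scores("MKTAYIAKQRQISFVKSH", yes, no, width=10)[:3])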

  16. Automatic Feature Interaction Analysis in PacoSuite

    Directory of Open Access Journals (Sweden)

    Wim Vanderperren

    2004-10-01

    In this paper, we build upon previous work that aims at recuperating aspect-oriented ideas into component-based software development. In that research, a composition adapter was proposed in order to capture crosscutting concerns in the PacoSuite component-based methodology. A composition adapter is visually applied onto a given component composition and the changes it describes are automatically applied. Stacking multiple composition adapters onto the same component composition can however lead to unpredictable and undesired side-effects. In this paper, we propose a solution for this issue, widely known as the feature interaction problem. We present a classification of different interaction levels among composition adapters and the algorithms required to verify them. The proposed algorithms are however of an exponential nature and depend on both the composition adapters and the component composition as a whole. In order to enhance the performance of our feature interaction analysis, we present a set of theorems that define the interaction levels solely in terms of the properties of the composition adapters themselves.

  17. Kinematic Analysis of Exoskeleton Suit for Human Arm

    Directory of Open Access Journals (Sweden)

    Surachai Panich

    2010-01-01

    Problem statement: There are many robotic arms developed for providing care to physically disabled people. It is difficult to find robot designs in the literature that articulate such a procedure. Therefore, it is our hope that the design work shown in this study may serve as a good example of a systematic method for rehabilitation robot design. Approach: The arm exoskeleton suit was developed to increase human strength, endurance, or speed, enabling users to perform tasks that they previously could not perform. It should not impede the user's natural motion and should be comfortable and safe to wear and easy to use. Although movement is difficult for them, they usually want to go somewhere by themselves. Results: The kinematics of the exoskeleton suit for the human arm is simulated with MATLAB software. The exoskeleton suit for the human arm consists of one link length, three link twists, two link offsets and three joint angles. Conclusion: This study introduced the kinematics of an exoskeleton suit for the human arm. The exoskeleton suit can be used as an instrument for anyone who needs to improve human performance. It will increase human strength, so that the wearer can lift heavy loads, or help handicapped patients who cannot use their arms.
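
    The parameter count quoted (one link length, three link twists, two link offsets, three joint angles) is characteristic of a Denavit-Hartenberg model; a compact forward-kinematics sketch in Python, with illustrative parameter values rather than the paper's:

        import numpy as np

        def dh_transform(theta, d, a, alpha):
            """Standard Denavit-Hartenberg homogeneous transform for one joint."""
            ct, st = np.cos(theta), np.sin(theta)
            ca, sa = np.cos(alpha), np.sin(alpha)
            return np.array([[ct, -st * ca,  st * sa, a * ct],
                             [st,  ct * ca, -ct * sa, a * st],
                             [0.0,      sa,       ca,      d],
                             [0.0,     0.0,      0.0,    1.0]])

        # Illustrative 3-joint arm: (theta, d, a, alpha) per row
        dh_rows = [(0.3, 0.0, 0.0, np.pi / 2),
                   (0.5, 0.1, 0.0, -np.pi / 2),
                   (0.2, 0.1, 0.25, 0.0)]
        T = np.eye(4)
        for row in dh_rows:
            T = T @ dh_transform(*row)
        print(T[:3, 3])  # end-effector position in the base frame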

  18. The Toast++ software suite for forward and inverse modeling in optical tomography.

    Science.gov (United States)

    Schweiger, Martin; Arridge, Simon

    2014-04-01

    We present the Toast++ open-source software environment for solving the forward and inverse problems in diffuse optical tomography (DOT). The software suite consists of a set of libraries to simulate near-infrared light propagation in highly scattering media with complex boundaries and heterogeneous internal parameter distribution, based on a finite-element solver. Steady-state, time- and frequency-domain data acquisition systems can be modeled. The forward solver is implemented in C++ and supports performance acceleration with parallelization for shared and distributed memory architectures, as well as graphics processing computation. Building on the numerical forward solver, Toast++ contains model-based iterative inverse solvers for reconstructing the volume distribution of absorption and scattering parameters from boundary measurements of light transmission. A range of regularization methods are provided, including the possibility of incorporating prior knowledge of internal structure. The user can link to the Toast++ libraries either directly to compile application programs for DOT, or make use of the included MATLAB and PYTHON bindings to generate script-based solutions. This approach allows rapid prototyping and provides a rich toolset in both environments for debugging, testing, and visualization.
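
    The forward model being solved is the diffusion approximation to the radiative transfer equation; in the frequency domain it is commonly written (standard notation, not quoted from the Toast++ documentation) as

        $$ -\nabla \cdot \kappa(\mathbf{r}) \nabla \Phi(\mathbf{r}, \omega) + \left( \mu_a(\mathbf{r}) + \frac{i\omega}{c} \right) \Phi(\mathbf{r}, \omega) = q(\mathbf{r}, \omega) $$

    with diffusion coefficient $\kappa = 1/[3(\mu_a + \mu_s')]$, absorption $\mu_a$, reduced scattering $\mu_s'$, photon density $\Phi$, and source $q$; the finite-element solver discretizes this equation over the mesh.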

  19. The PyRosetta Toolkit: a graphical user interface for the Rosetta software suite.

    Directory of Open Access Journals (Sweden)

    Jared Adolf-Bryfogle

    The Rosetta Molecular Modeling suite is a command-line-only collection of applications that enable high-resolution modeling and design of proteins and other molecules. Although extremely useful, Rosetta can be difficult to learn for scientists with little computational or programming experience. To that end, we have created a Graphical User Interface (GUI) for Rosetta, called the PyRosetta Toolkit, for creating and running protocols in Rosetta for common molecular modeling and protein design tasks and for analyzing the results of Rosetta calculations. The program is highly extensible so that developers can add new protocols and analysis tools to the PyRosetta Toolkit GUI.

  20. Results and Analysis from Space Suit Joint Torque Testing

    Science.gov (United States)

    Matty, Jennifer E.; Aitchison, Lindsay

    2009-01-01

    A space suit's mobility is critical to an astronaut's ability to perform work efficiently. As mobility increases, the astronaut can perform tasks for longer durations with less fatigue. The term mobility, with respect to space suits, is defined in terms of two key components: joint range of motion and joint torque. Individually these measures describe the path through which a joint travels and the force required to move it through that path. Previous space suit mobility requirements were defined as the collective result of these two measures and verified by the completion of discrete functional tasks. While a valid way to impose mobility requirements, such a method does necessitate a solid understanding of the operational scenarios in which the final suit will be performing. Because the Constellation space suit system requirements are being finalized with a relatively immature concept of operations, the Space Suit Element team elected to define mobility in terms of its constituent parts to increase the likelihood that the future pressure garment will be mobile enough to enable a broad scope of undefined exploration activities. The range of motion requirements were defined by measuring the ranges of motion test subjects achieved while performing a series of joint-maximizing tasks in a variety of flight and prototype space suits. The definition of joint torque requirements has proved more elusive. NASA evaluated several different approaches to the problem before deciding to generate requirements based on unmanned joint torque evaluations of six different space suit configurations being articulated through 16 separate joint movements. This paper discusses the experiment design, data analysis and results, and the process used to determine the final values for the Constellation pressure garment joint torque requirements.

  1. The application of formal software engineering methods to the unattended and remote monitoring software suite at Los Alamos National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Determan, John Clifford [Los Alamos National Laboratory; Longo, Joseph F [Los Alamos National Laboratory; Michel, Kelly D [Los Alamos National Laboratory

    2009-01-01

    The Unattended and Remote Monitoring (UNARM) system is a collection of specialized hardware and software used by the International Atomic Energy Agency (IAEA) to institute nuclear safeguards at many nuclear facilities around the world. The hardware consists of detectors, instruments, and networked computers for acquiring various forms of data, including but not limited to radiation data, global position coordinates, camera images, isotopic data, and operator declarations. The software provides two primary functions: the secure and reliable collection of this data from the instruments and the ability to perform an integrated review and analysis of the disparate data sources. Several years ago the team responsible for maintaining the software portion of the UNARM system began the process of formalizing its operations. These formal operations include a configuration management system, a change control board, an issue tracking system, and extensive formal testing, for both functionality and reliability. Functionality is tested with formal test cases chosen to fully represent the data types and methods of analysis that will be commonly encountered. Reliability is tested with iterative, concurrent testing where up to five analyses are executed simultaneously for thousands of cycles. Iterative concurrent testing helps ensure that there are no resource conflicts or leaks when multiple system components are in use simultaneously. The goal of this work is to provide a high quality, reliable product, commensurate with the criticality of the application. Testing results will be presented that demonstrate that this goal has been achieved and the impact of the introduction of a formal software engineering framework to the UNARM product will be presented.

  2. Personal computer security: part 1. Firewalls, antivirus software, and Internet security suites.

    Science.gov (United States)

    Caruso, Ronald D

    2003-01-01

    Personal computer (PC) security in the era of the Health Insurance Portability and Accountability Act of 1996 (HIPAA) involves two interrelated elements: safeguarding the basic computer system itself and protecting the information it contains and transmits, including personal files. HIPAA regulations have toughened the requirements for securing patient information, requiring every radiologist with such data to take further precautions. Security starts with physically securing the computer. Account passwords and a password-protected screen saver should also be set up. A modern antivirus program can easily be installed and configured. File scanning and updating of virus definitions are simple processes that can largely be automated and should be performed at least weekly. A software firewall is also essential for protection from outside intrusion, and an inexpensive hardware firewall can provide yet another layer of protection. An Internet security suite yields additional safety. Regular updating of the security features of installed programs is important. Obtaining a moderate degree of PC safety and security is somewhat inconvenient but is necessary and well worth the effort.

  3. CCP4 Software Suite: history, evolution, content, challenges and future developments

    Directory of Open Access Journals (Sweden)

    Krissinel, Eugene

    2015-04-01

    Collaborative Computational Project Number 4 (CCP4) in Protein Crystallography is a public resource for producing and supporting a world-leading, integrated suite of programs that allows researchers to determine macromolecular structures by X-ray crystallography and other biophysical techniques. CCP4 supports the widest possible researcher community, embracing academic, not-for-profit, and for-profit research. The primary aims of CCP4 include development and support of the development of cutting-edge approaches to experimental determination and analysis of protein structure, and their integration into the suite for worldwide dissemination. In addition, CCP4 plays an important role in the education and training of scientists in experimental structural biology. In this paper, we overview CCP4's 35-year-long history and the (technical) milestones of its evolution. We also consider how the particular structure of the CCP4 Suite and Collaboration has emerged, its main functionality, its current state, and plans for the future.

  4. Software for multistate analysis

    Directory of Open Access Journals (Sweden)

    Frans J. Willekens

    2014-08-01

    Full Text Available Background: The growing interest in pathways, the increased availability of life-history data, innovations in statistical and demographic techniques, and advances in software technology have stimulated the development of software packages for multistate modeling of life histories. Objective: In the paper we list and briefly discuss several software packages for multistate analysis of life-history data. The packages cover the estimation of multistate models (transition rates and transition probabilities), multistate life tables, multistate population projections, and microsimulation. Methods: Brief description of software packages in a historical and comparative perspective. Results: During the past 10 years the advances in multistate modeling software have been impressive. New computational tools accompany the development of new methods in statistics and demography. The statistical theory of counting processes is the preferred method for the estimation of multistate models and R is the preferred programming platform. Conclusions: Innovations in method, data, and computer technology have removed the traditional barriers to multistate modeling of life histories and the computation of informative life-course indicators. The challenge ahead of us is to model and predict individual life histories.
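
    As an illustration of the occurrence-exposure style of estimation behind such packages, the sketch below computes transition rates from a toy episode list; the states, durations, and counts are invented:

    ```python
    import numpy as np

    # Toy episode data: (origin state, destination state, years spent in origin)
    episodes = [(0, 1, 2.5), (0, 1, 1.0), (1, 0, 0.5),
                (1, 2, 3.0), (0, 2, 4.0), (2, 0, 1.5)]
    n_states = 3

    transitions = np.zeros((n_states, n_states))   # counts of i -> j moves
    exposure = np.zeros(n_states)                  # person-years at risk in i
    for origin, dest, years in episodes:
        transitions[origin, dest] += 1
        exposure[origin] += years

    rates = transitions / exposure[:, None]        # occurrence-exposure rates
    print(rates)
    ```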

  5. The Sample Analysis at Mars Investigation and Instrument Suite

    Science.gov (United States)

    Mahaffy, Paul; Webster, Christopher R.; Conrad, Pamela G.; Arvey, Robert; Bleacher, Lora; Brinckerhoff, William B.; Eigenbrode, Jennifer L.; Chalmers, Robert A.; Dworkin, Jason P.; Errigo, Therese; Farley, Rodger; Feng, Steven; Frazier, Gregory; Glavin, Daniel P.; Harpold, Daniel N.; Jordan, Patrick; Kellogg, James; Lewis, Jesse; Martin, David K.; Maurer, John; McAdam, Amy C.; McLennan, Douglas; Pavlov, Alexander A.; Raaen, Eric; Sheinman, Oren

    2012-01-01

    The Sample Analysis at Mars (SAM) investigation of the Mars Science Laboratory (MSL) addresses the chemical and isotopic composition of the atmosphere and volatiles extracted from solid samples. The SAM investigation is designed to contribute substantially to the mission goal of quantitatively assessing the habitability of Mars as an essential step in the search for past or present life on Mars. SAM is a 40 kg instrument suite located in the interior of MSL's Curiosity rover. The SAM instruments are a quadrupole mass spectrometer, a tunable laser spectrometer, and a 6-column gas chromatograph all coupled through solid and gas processing systems to provide complementary information on the same samples. The SAM suite is able to measure a suite of light isotopes and to analyze volatiles directly from the atmosphere or thermally released from solid samples. In addition to measurements of simple inorganic compounds and noble gases SAM will conduct a sensitive search for organic compounds with either thermal or chemical extraction from sieved samples delivered by the sample processing system on the Curiosity rover's robotic arm.

  6. The Sample Analysis at Mars Investigation and Instrument Suite

    Science.gov (United States)

    Mahaffy, Paul; Webster, Chris R.; Cabane, M.; Conrad, Pamela G.; Coll, Patrice; Atreya, Sushil K.; Arvey, Robert; Barciniak, Michael; Benna, Mehdi; Bleacher, L.; Brinckerhoff, William B.; Eigenbrode, Jennifer L.; Carignan, Daniel; Cascia, Mark; Chalmers, Robert A.; Dworkin, Jason P.; Errigo, Therese; Everson, Paula; Franz, Heather; Farley, Rodger; Feng, Steven; Frazier, Gregory; Freissinet, Caroline; Glavin, Daniel P.; Harpold, Daniel N.

    2012-01-01

    The Sample Analysis at Mars (SAM) investigation of the Mars Science Laboratory (MSL) addresses the chemical and isotopic composition of the atmosphere and volatiles extracted from solid samples. The SAM investigation is designed to contribute substantially to the mission goal of quantitatively assessing the habitability of Mars as an essential step in the search for past or present life on Mars. SAM is a 40 kg instrument suite located in the interior of MSL's Curiosity rover. The SAM instruments are a quadrupole mass spectrometer, a tunable laser spectrometer, and a 6-column gas chromatograph all coupled through solid and gas processing systems to provide complementary information on the same samples. The SAM suite is able to measure a suite of light isotopes and to analyze volatiles directly from the atmosphere or thermally released from solid samples. In addition to measurements of simple inorganic compounds and noble gases SAM will conduct a sensitive search for organic compounds with either thermal or chemical extraction from sieved samples delivered by the sample processing system on the Curiosity rover's robotic arm.

  7. The Sample Analysis at Mars Investigation and Instrument Suite

    Science.gov (United States)

    Mahaffy, Paul R.; Webster, Christopher R.; Cabane, Michel; Conrad, Pamela G.; Coll, Patrice; Atreya, Sushil K.; Arvey, Robert; Barciniak, Michael; Benna, Mehdi; Bleacher, Lora; Brinckerhoff, William B.; Eigenbrode, Jennifer L.; Carignan, Daniel; Cascia, Mark; Chalmers, Robert A.; Dworkin, Jason P.; Errigo, Therese; Everson, Paula; Franz, Heather; Farley, Rodger; Feng, Steven; Frazier, Gregory; Freissinet, Caroline; Glavin, Daniel P.; Harpold, Daniel N.; Hawk, Douglas; Holmes, Vincent; Johnson, Christopher S.; Jones, Andrea; Jordan, Patrick; Kellogg, James; Lewis, Jesse; Lyness, Eric; Malespin, Charles A.; Martin, David K.; Maurer, John; McAdam, Amy C.; McLennan, Douglas; Nolan, Thomas J.; Noriega, Marvin; Pavlov, Alexander A.; Prats, Benito; Raaen, Eric; Sheinman, Oren; Sheppard, David; Smith, James; Stern, Jennifer C.; Tan, Florence; Trainer, Melissa; Ming, Douglas W.; Morris, Richard V.; Jones, John; Gundersen, Cindy; Steele, Andrew; Wray, James; Botta, Oliver; Leshin, Laurie A.; Owen, Tobias; Battel, Steve; Jakosky, Bruce M.; Manning, Heidi; Squyres, Steven; Navarro-González, Rafael; McKay, Christopher P.; Raulin, Francois; Sternberg, Robert; Buch, Arnaud; Sorensen, Paul; Kline-Schoder, Robert; Coscia, David; Szopa, Cyril; Teinturier, Samuel; Baffes, Curt; Feldman, Jason; Flesch, Greg; Forouhar, Siamak; Garcia, Ray; Keymeulen, Didier; Woodward, Steve; Block, Bruce P.; Arnett, Ken; Miller, Ryan; Edmonson, Charles; Gorevan, Stephen; Mumm, Erik

    2012-09-01

    The Sample Analysis at Mars (SAM) investigation of the Mars Science Laboratory (MSL) addresses the chemical and isotopic composition of the atmosphere and volatiles extracted from solid samples. The SAM investigation is designed to contribute substantially to the mission goal of quantitatively assessing the habitability of Mars as an essential step in the search for past or present life on Mars. SAM is a 40 kg instrument suite located in the interior of MSL's Curiosity rover. The SAM instruments are a quadrupole mass spectrometer, a tunable laser spectrometer, and a 6-column gas chromatograph all coupled through solid and gas processing systems to provide complementary information on the same samples. The SAM suite is able to measure a suite of light isotopes and to analyze volatiles directly from the atmosphere or thermally released from solid samples. In addition to measurements of simple inorganic compounds and noble gases SAM will conduct a sensitive search for organic compounds with either thermal or chemical extraction from sieved samples delivered by the sample processing system on the Curiosity rover's robotic arm.

  8. Using Rscript for Software Analysis

    NARCIS (Netherlands)

    Klint, P.

    2008-01-01

    RSCRIPT is a concept language that explores the design space of relation-based languages for software analysis. We briefly sketch the RSCRIPT language by way of a standard example, summarize our experience, and point at future developments.

  9. DelPhi: a comprehensive suite for DelPhi software and associated resources

    Directory of Open Access Journals (Sweden)

    Li Lin

    2012-05-01

    Full Text Available Abstract Background Accurate modeling of electrostatic potential and corresponding energies becomes increasingly important for understanding properties of biological macromolecules and their complexes. However, this is not an easy task due to the irregular shape of biological entities and the presence of water and mobile ions. Results Here we report a comprehensive suite for the well-known Poisson-Boltzmann solver, DelPhi, enriched with additional features to facilitate DelPhi usage. The suite allows for easy download of both DelPhi executable files and source code along with a makefile for local installations. The users can obtain the DelPhi manual and parameter files required for the corresponding investigation. Non-experienced researchers can download examples containing all necessary data to carry out DelPhi runs on a set of selected examples illustrating various DelPhi features and demonstrating DelPhi's accuracy against analytical solutions. Conclusions The DelPhi suite offers not only the DelPhi executable and source files, examples and parameter files, but also provides links to third-party resources that either utilize DelPhi or provide plugins for DelPhi. In addition, users and developers are offered a forum to share ideas, resolve issues, report bugs and seek help with respect to the DelPhi package. The resource is available free of charge for academic users from URL: http://compbio.clemson.edu/DelPhi.php.

  10. Cyber-physical systems software development: way of working and tool suite

    NARCIS (Netherlands)

    Bezemer, Maarten Matthijs

    2013-01-01

    Designing embedded control software for modern cyber-physical systems becomes more and more difficult, because of the increasing amount and complexity of their requirements. The regular requirements are extended with modern requirements, for example, to get a general purpose cyber-physical system ca

  11. Software Design for Smile Analysis

    Directory of Open Access Journals (Sweden)

    A. Sarkhosh

    2010-12-01

    Full Text Available Introduction: Esthetics and attractiveness of the smile is one of the major demands in contemporary orthodontic treatment. In order to improve a smile design, it is necessary to record a "posed smile" as an intentional, non-pressure, static, natural and reproducible smile. The record then should be analyzed to determine its characteristics. In this study, we intended to design and introduce software to analyze the smile rapidly and precisely in order to produce an attractive smile for the patients. Materials and Methods: For this purpose, a practical study was performed to design the multimedia software "Smile Analysis", which can receive patients' photographs and videographs. After giving records to the software, the operator should mark the points and lines which are displayed on the system's guide and also define the correct scale for each image. Thirty-three variables are measured by the software and displayed on the report page. Reliability of measurements in both image and video was significantly high (0.7-1). Results: In order to evaluate intra-operator and inter-operator reliability, five cases were selected randomly. Statistical analysis showed that calculations performed in smile analysis software were both valid and highly reliable (for both video and photo). Conclusion: The results obtained from smile analysis could be used in diagnosis, treatment planning and evaluation of the treatment progress.
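
    A sketch of the kind of landmark-based measurement such software performs, assuming hypothetical landmark coordinates and a ruler-based scale calibration; the actual variables and formulas of Smile Analysis are not reproduced here:

    ```python
    import math

    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    # Hypothetical calibration: a ruler of known length photographed with the face
    ruler_px, ruler_mm = 200.0, 10.0
    mm_per_px = ruler_mm / ruler_px

    # Hypothetical landmark coordinates (pixels) marked by the operator
    left_commissure, right_commissure = (120, 300), (360, 295)
    upper_lip, lower_lip = (240, 280), (240, 330)

    smile_width = dist(left_commissure, right_commissure) * mm_per_px
    smile_height = dist(upper_lip, lower_lip) * mm_per_px
    smile_index = smile_width / smile_height   # one common width/height ratio
    print(f"width = {smile_width:.1f} mm, height = {smile_height:.1f} mm, "
          f"index = {smile_index:.2f}")
    ```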

  12. Development of an e-VLBI Data Transport Software Suite with VDIF

    Science.gov (United States)

    Sekido, Mamoru; Takefuji, Kazuhiro; Kimura, Moritaka; Hobiger, Thomas; Kokado, Kensuke; Nozawa, Kentarou; Kurihara, Shinobu; Shinno, Takuya; Takahashi, Fujinobu

    2010-01-01

    We have developed a software library (KVTP-lib) for VLBI data transmission over the network with the VDIF (VLBI Data Interchange Format), which is the newly proposed standard VLBI data format designed for electronic data transfer over the network. The software package keeps the application layer (VDIF frame) and the transmission layer separate, so that each layer can be developed efficiently. The real-time VLBI data transmission tool sudp-send is an application tool based on the KVTP-lib library. sudp-send captures the VLBI data stream from the VSI-H interface with the K5/VSI PC-board and writes the data to file in standard Linux file format or transmits it to the network using the simple-UDP (SUDP) protocol. Another tool, sudp-recv, receives the data stream from the network and writes the data to file in a specific VLBI format (K5/VSSP, VDIF, or Mark 5B). This software system has been implemented on the Wettzell-Tsukuba baseline; evaluation before operational employment is under way.
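
    The layer separation described above can be illustrated with a short sketch in which a simplified stand-in frame header (not the actual VDIF header layout) is built separately from the UDP transport that carries it; the host, port, and payloads are placeholders:

    ```python
    import socket
    import struct

    def make_frame(frame_no: int, payload: bytes) -> bytes:
        """Application layer: build one data frame from a simplified stand-in
        header (frame number, payload length) plus the sample payload. The
        real VDIF header layout is defined by the standard, not shown here."""
        return struct.pack("!II", frame_no, len(payload)) + payload

    def send_frames(host: str, port: int, frames) -> None:
        """Transport layer: push each application-layer frame out over UDP."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        for n, payload in enumerate(frames):
            sock.sendto(make_frame(n, payload), (host, port))
        sock.close()

    # Placeholder destination and dummy payloads
    send_frames("127.0.0.1", 50000, [b"\x00" * 1024 for _ in range(10)])
    ```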

  13. Team Software Development for Aerothermodynamic and Aerodynamic Analysis and Design

    Science.gov (United States)

    Alexandrov, N.; Atkins, H. L.; Bibb, K. L.; Biedron, R. T.; Carpenter, M. H.; Gnoffo, P. A.; Hammond, D. P.; Jones, W. T.; Kleb, W. L.; Lee-Rausch, E. M.

    2003-01-01

    A collaborative approach to software development is described. The approach employs agile development techniques: project retrospectives, Scrum status meetings, and elements of Extreme Programming to efficiently develop a cohesive and extensible software suite. The software product under development is a fluid dynamics simulator for performing aerodynamic and aerothermodynamic analysis and design. The functionality of the software product is achieved both through the merging, with substantial rewrite, of separate legacy codes and the authorship of new routines. Examples of rapid implementation of new functionality demonstrate the benefits obtained with this agile software development process. The appendix contains a discussion of coding issues encountered while porting legacy Fortran 77 code to Fortran 95, software design principles, and a Fortran 95 coding standard.

  14. Inertial motion capture system for biomechanical analysis in pressure suits

    Science.gov (United States)

    Di Capua, Massimiliano

    A non-invasive system has been developed at the University of Maryland Space Systems Laboratory with the goal of providing a new capability for quantifying the motion of the human inside a space suit. Based on an array of six microprocessors and eighteen microelectromechanical (MEMS) inertial measurement units (IMUs), the Body Pose Measurement System (BPMS) allows the monitoring of the kinematics of the suit occupant in an unobtrusive, self-contained, lightweight and compact fashion, without requiring any external equipment such as that necessary with modern optical motion capture systems. BPMS measures and stores the accelerations, angular rates and magnetic fields acting upon each IMU, which are mounted on the head, torso, and each segment of each limb. In order to convert the raw data into a more useful form, such as a set of body segment angles quantifying pose and motion, a series of geometrical models and a non-linear complementary filter were implemented. The first portion of this work focuses on assessing system performance, which was measured by comparing the BPMS filtered data against rigid body angles measured through an external VICON optical motion capture system. This type of system is the industry standard, and is used here for independent measurement of body pose angles. By comparing the two sets of data, performance metrics such as BPMS system operational conditions, accuracy, and drift were evaluated and correlated against VICON data. After the system and models were verified and their capabilities and limitations assessed, a series of pressure suit evaluations were conducted. Three different pressure suits were used to identify the relationship between usable range of motion and internal suit pressure. In addition to addressing range of motion, a series of exploration tasks were also performed, recorded, and analysed in order to identify different motion patterns and trajectories as suit pressure is increased and overall suit mobility is reduced.
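
    A minimal sketch of the non-linear complementary-filter idea used to fuse IMU data, with invented gyro and accelerometer inputs (the BPMS filter and geometrical models are considerably more elaborate):

    ```python
    import math

    def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
        """Blend the integrated gyro rate (accurate short-term, drifts) with
        the accelerometer-derived angle (noisy short-term, drift-free)."""
        return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

    # Toy stream: constant 10 deg/s rotation, noisy accelerometer readings
    angle, dt = 0.0, 0.01
    for k in range(100):
        true_angle = 10.0 * dt * (k + 1)
        accel_angle = true_angle + 0.5 * math.sin(k)  # noise stand-in
        angle = complementary_filter(angle, 10.0, accel_angle, dt)
    print(f"estimated angle after 1 s: {angle:.2f} deg")
    ```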

  15. The gastrointestinal electrical mapping suite (GEMS): software for analyzing and visualizing high-resolution (multi-electrode) recordings in spatiotemporal detail

    Directory of Open Access Journals (Sweden)

    Yassi Rita

    2012-06-01

    Full Text Available Abstract Background Gastrointestinal contractions are controlled by an underlying bioelectrical activity. High-resolution spatiotemporal electrical mapping has become an important advance for investigating gastrointestinal electrical behaviors in health and motility disorders. However, research progress has been constrained by the low efficiency of the data analysis tasks. This work introduces a new efficient software package: GEMS (Gastrointestinal Electrical Mapping Suite), for analyzing and visualizing high-resolution multi-electrode gastrointestinal mapping data in spatiotemporal detail. Results GEMS incorporates a number of new and previously validated automated analytical and visualization methods into a coherent framework coupled to an intuitive and user-friendly graphical user interface. GEMS is implemented using MATLAB®, which combines sophisticated mathematical operations and GUI compatibility. Recorded slow wave data can be filtered via a range of inbuilt techniques, efficiently analyzed via automated event-detection and cycle clustering algorithms, and high quality isochronal activation maps, velocity field maps, amplitude maps, frequency (time interval) maps and data animations can be rapidly generated. Normal and dysrhythmic activities can be analyzed, including initiation and conduction abnormalities. The software is distributed free to academics via a community user website and forum (http://sites.google.com/site/gimappingsuite). Conclusions This software allows for the rapid analysis and generation of critical results from gastrointestinal high-resolution electrical mapping data, including quantitative analysis and graphical outputs for qualitative analysis. The software is designed to be used by non-experts in data and signal processing, and is intended to be used by clinical researchers as well as physiologists and bioengineers. The use and distribution of this software package will greatly accelerate efforts to improve the

  16. Application of Software Safety Analysis Methods

    Energy Technology Data Exchange (ETDEWEB)

    Park, G. Y.; Hur, S.; Cheon, S. W.; Kim, D. H.; Lee, D. Y.; Kwon, K. C. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Lee, S. J.; Koo, Y. H. [Doosan Heavy Industries and Construction Co., Daejeon (Korea, Republic of)

    2009-05-15

    A fully digitalized reactor protection system, which is called the IDiPS-RPS, was developed through the KNICS project. The IDiPS-RPS has four redundant and separated channels. Each channel is mainly composed of a group of bistable processors which redundantly compare process variables with their corresponding setpoints and a group of coincidence processors that generate a final trip signal when a trip condition is satisfied. Each channel also contains a test processor called the ATIP and a display and command processor called the COM. All the functions were implemented in software. During the development of the safety software, various software safety analysis methods were applied, in parallel with the verification and validation (V and V) activities, throughout the software development life cycle. The software safety analysis methods employed were the software hazard and operability (Software HAZOP) study, the software fault tree analysis (Software FTA), and the software failure modes and effects analysis (Software FMEA).

  17. PARENT: A Parallel Software Suite for the Calculation of Configurational Entropy in Biomolecular Systems.

    Science.gov (United States)

    Fleck, Markus; Polyansky, Anton A; Zagrovic, Bojan

    2016-04-12

    Accurate estimation of configurational entropy from in silico-generated biomolecular ensembles, e.g., from molecular dynamics (MD) trajectories, depends strongly on exhaustive sampling for physical reasons. This, however, creates a major computational problem for the subsequent estimation of configurational entropy using the Maximum Information Spanning Tree (MIST) or Mutual Information Expansion (MIE) approaches for internal molecular coordinates. In particular, the available software for such estimation exhibits serious limitations when it comes to molecules with hundreds or thousands of atoms, because of its reliance on a serial program architecture. To overcome this problem, we have developed a parallel, hybrid MPI/OpenMP C++ implementation of MIST and MIE, called PARENT, which is particularly optimized for high-performance computing and provides efficient estimation of configurational entropy in different biological processes (e.g., protein-protein interactions). In addition, PARENT allows for a detailed mapping of intramolecular allosteric networks. Here, we benchmark the program on a set of 1-μs-long MD trajectories of 10 different protein complexes and their components, demonstrating robustness and good scalability. A direct comparison between MIST and MIE on the same dataset demonstrates a superior convergence behavior for the former approach, when it comes to total simulation length and configurational-space binning.
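
    The flavor of such estimation can be shown with a toy first-order calculation: the entropy of a single binned internal coordinate, which is the leading term of an MIE-style expansion (higher orders subtract mutual-information corrections). The data and bin count below are invented:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Toy "trajectory": samples of one internal coordinate (a dihedral, radians)
    dihedral = rng.vonmises(mu=0.0, kappa=4.0, size=100_000)

    counts, edges = np.histogram(dihedral, bins=60, range=(-np.pi, np.pi))
    width = edges[1] - edges[0]
    p = counts / counts.sum()
    p = p[p > 0]
    # Differential entropy of this single coordinate, in units of k_B
    s1 = -(p * np.log(p / width)).sum()
    print(f"first-order entropy term: {s1:.3f} k_B")
    ```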

  18. OARDAS stray radiation analysis software

    Science.gov (United States)

    Rock, David F.

    1999-09-01

    OARDAS (Off-Axis Rejection Design Analysis Software) is a Raytheon in-house code designed to aid in stray light analysis. The code development started in 1982, and by 1986 the program was fully operational. Since that time, the work has continued, not with the goal of creating a marketable product, but with a focus on creating a powerful, user-friendly, highly graphical tool that makes stray light analysis as easy and efficient as possible. The goal has been to optimize the analysis process, with a clear emphasis on designing an interface between computer and user that lets each do what it does best. The code evolution has resulted in a number of analysis features that are unique to the industry. This paper looks at a variety of stray light analysis problems that the analyst is typically faced with and shows how they are approached using OARDAS.

  19. Spherical Coordinate Systems for Streamlining Suited Mobility Analysis

    Science.gov (United States)

    Benson, Elizabeth; Cowley, Matthew S.; Harvill, Lauren; Rajulu, Sudhakar

    2014-01-01

    When describing human motion, biomechanists generally report joint angles in terms of Euler angle rotation sequences. However, there are known limitations in using this method to describe complex motions such as the shoulder joint during a baseball pitch. Euler angle notation uses a series of three rotations about an axis, where each rotation is dependent upon the preceding rotation. As such, the Euler angles need to be regarded as a set to get accurate angle information. Unfortunately, it is often difficult to visualize and understand these complex motion representations. One of our key functions is to help design engineers understand how a human will perform with new designs, and all too often the traditional use of Euler rotations becomes as much of a hindrance as a help. It is believed that using a spherical coordinate system will allow ABF personnel to more quickly and easily transmit important mobility data to engineers, in a format that is readily understandable and directly translatable to their design efforts. Objectives: The goal of this project is to establish new analysis and visualization techniques to aid in the examination and comprehension of complex motions. Methods: This project consisted of a series of small sub-projects, meant to validate and verify the method before it was implemented in the ABF's data analysis practices. The first stage was a proof of concept, where a mechanical test rig was built and instrumented with an inclinometer, so that its angle from horizontal was known. The test rig was tracked in 3D using an optical motion capture system, and its position and orientation were reported in both Euler and spherical reference systems. The rig was meant to simulate flexion/extension, transverse rotation and abduction/adduction of the human shoulder, but without the variability inherent in human motion. In the second phase of the project, the ABF estimated the error inherent in a spherical coordinate system, and evaluated how this error would
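
    The following sketch illustrates the alternative representation: converting a limb-pointing vector into an azimuth/elevation pair, which is often easier to visualize than a dependent Euler triple. The vector values are made up:

    ```python
    import math

    def vector_to_spherical(x: float, y: float, z: float):
        """Return (azimuth, elevation) in degrees for a limb-pointing vector.
        A single direction pair is often easier to read than a dependent
        Euler-angle triple."""
        azimuth = math.degrees(math.atan2(y, x))
        elevation = math.degrees(math.atan2(z, math.hypot(x, y)))
        return azimuth, elevation

    # Made-up upper-arm vector from shoulder to elbow, in the torso frame
    print(vector_to_spherical(0.20, 0.05, -0.25))
    ```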

  20. Software Performs Complex Design Analysis

    Science.gov (United States)

    2008-01-01

    Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help gain greater understanding of the structural response of components to loads, stresses and strains, and the prediction of failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: 1) inadequate shape parameterization algorithms, and 2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to utilize its revolutionary, one-of-a-kind arbitrary shape deformation (ASD) capability, a major advancement in solving these two aforementioned problems, to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing the CFD designers to freely create their own shape parameters, therefore eliminating the restriction of only being able to use the computer-aided design (CAD) parameters. The problem of inadequate algorithms for CFD grid modification is solved by the fact that the new software performs a smooth volumetric deformation. This eliminates the extremely costly process of having to remesh the grid for every shape change desired. The program can perform a design change in a markedly reduced amount of time, a process that would traditionally involve the designer returning to the CAD model to reshape and then remesh the shapes, something that has been known to take hours, days, even weeks or months, depending upon the size of the model.

  1. A suite of models to support the quantitative assessment of spread in pest risk analysis

    NARCIS (Netherlands)

    Robinet, C.; Kehlenbeck, H.; Werf, van der W.

    2012-01-01

    In the framework of the EU project PRATIQUE (KBBE-2007-212459, Enhancements of Pest Risk Analysis Techniques), a suite of models was developed to support the quantitative assessment of spread in pest risk analysis. This dataset contains the model codes (R language) for the four models in the suite. Three

  2. Human Factors Analysis in Software Engineering

    Institute of Scientific and Technical Information of China (English)

    Xu Ren-zuo; Ma Ruo-feng; Liu Li-na; Xiong Zhong-wei

    2004-01-01

    The general human factors analysis analyzes human functions, effects and influence in a system. In a narrow sense, however, it analyzes human influence upon the reliability of a system; it includes traditional human reliability analysis, human error analysis, man-machine interface analysis, human character analysis, and others. Whether a software development project in software engineering succeeds is largely determined by human factors. In this paper, we discuss the intentions of human factors analysis, demonstrate its importance for software engineering by listing some instances, and finally make a preliminary inquiry into the mentality that a practitioner in software engineering should possess.

  3. Fault tree analysis of KNICS RPS software

    Energy Technology Data Exchange (ETDEWEB)

    Park, Gee Yong; Kwon, Kee Choon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Koh, Kwang Yong; Jee, Eun Kyoung; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Lee, Dae Hyung [Doosan Heavy Industries and Construction, Yongin (Korea, Republic of)

    2008-08-15

    This paper describes the application of a software Fault Tree Analysis (FTA) as one of the analysis techniques for a Software Safety Analysis (SSA) at the design phase and its analysis results for the safety-critical software of a digital reactor protection system, which is called the KNICS RPS, being developed in the KNICS (Korea Nuclear Instrumentation and Control Systems) project. The software modules in the design description were represented by Function Blocks (FBs), and the software FTA was performed based on the well-defined fault tree templates for the FBs. The SSA, which is part of the verification and validation (V and V) activities, was activated at each phase of the software lifecycle for the KNICS RPS. At the design phase, the software HAZOP (Hazard and Operability) and the software FTA were employed in the SSA in such a way that the software HAZOP was performed first and then the software FTA was applied. The software FTA was applied to some critical modules selected from the software HAZOP analysis.

  4. Linear Analysis and Verification Suite for Edge Turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Myra, J R; Umansky, M

    2008-04-24

    The edge and scrape-off-layer region of a tokamak plasma is subject to well known resistive and ideal instabilities that are driven by various curvature- and sheath-related mechanisms. While the boundary plasma is typically strongly turbulent in experiments, it is useful to have computational tools that can analyze the linear eigenmode structure, predict quantitative trends in growth rates, and elucidate the underlying drive mechanisms. Furthermore, measurement of the linear growth rate of unstable modes emerging from a known, established equilibrium configuration provides one of the few quantitative ways of rigorously benchmarking large-scale plasma turbulence codes with each other and with a universal standard. In this report, a suite of codes that can describe linearized, nonlocal (e.g. separatrix-spanning) modes in axisymmetric (realistic divertor), toroidal geometry is discussed. Examples of several benchmark comparisons are given, and future development plans for a new eigenvalue edge code are presented.

  5. Integrating security analysis and safeguards software engineering

    Energy Technology Data Exchange (ETDEWEB)

    Spencer, D.D.; Axline, R.M.

    1989-01-01

    These initiatives will work together to provide more secure safeguards software, as well as other critical systems software. The resulting design tools and methodologies, the evolving guidelines for software security, and the adversary-resistant software components will be applied to the software design at each stage to increase the design's inherent security and to make the design easier to analyze. The resident hardware monitor or other architectural innovations will provide complementary additions to the design to remove some of the burden of security from the software. The security analysis process, supported by new analysis methodologies and tools, will be applied to the software design as it evolves in an attempt to identify and remove vulnerabilities at the earliest possible point in the safeguards system life cycle. The result should be better and more verifiably secure software systems.

  6. Integrated Methodology for Software Reliability Analysis

    Directory of Open Access Journals (Sweden)

    Marian Pompiliu CRISTESCU

    2012-01-01

    Full Text Available The techniques most commonly used to ensure the safety and reliability of systems are applied together as a whole, and in most cases the software components are overlooked or too little analyzed. The present paper describes the applicability of fault tree analysis to software systems, an analysis defined as Software Fault Tree Analysis (SFTA), in which fault trees are evaluated using binary decision diagrams, all of this being integrated and used with the help of a Java reliability library.
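
    A minimal sketch of quantitative fault-tree evaluation, assuming independent basic events combined through AND/OR gates; the event names and probabilities are invented, and BDD-based evaluation (which handles shared events exactly) is not shown:

    ```python
    def gate_and(*p):
        """Top fails only if every input fails (independent events)."""
        out = 1.0
        for x in p:
            out *= x
        return out

    def gate_or(*p):
        """Top fails if any input fails (independent events)."""
        out = 1.0
        for x in p:
            out *= 1.0 - x
        return 1.0 - out

    # Invented top event: trip fails if both redundant comparison modules
    # fail, or the voting logic fails.
    p_cmp_a, p_cmp_b, p_vote = 1e-3, 1e-3, 1e-5
    p_top = gate_or(gate_and(p_cmp_a, p_cmp_b), p_vote)
    print(f"P(top event) = {p_top:.2e}")
    ```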

  7. Software architecture analysis of usability

    NARCIS (Netherlands)

    Folmer, Eelke

    2005-01-01

    One of the qualities that has received increased attention in recent decades is usability. A software product with poor usability is likely to fail in a highly competitive market; therefore software developing organizations are paying more and more attention to ensuring the usability of their softwa

  8. A Coupled Calculation Suite for Atucha II Operational Transients Analysis

    Directory of Open Access Journals (Sweden)

    Oscar Mazzantini

    2011-01-01

    Full Text Available While more than a decade ago reactor and thermal hydraulic calculations were tedious and often needed a lot of approximations and simplifications that forced the designers to take a very conservative approach, computational resources available nowadays allow engineers to cope with increasingly complex problems in a reasonable time. The use of best-estimate calculations provides tools to justify convenient engineering margins, reduces costs, and maximises economic benefits. In this direction, a suite of coupled best-estimate specific calculation codes was developed to analyse the behaviour of the Atucha II nuclear power plant in Argentina. The developed tool includes three-dimensional spatial neutron kinetics, a channel-level model of the core thermal hydraulics with subcooled boiling correlations, a one-dimensional model of the primary and secondary circuits including pumps, steam generators, heat exchangers, and the turbine with all their associated control loops, and a complete simulation of the reactor control, limitation, and protection system working in closed-loop conditions as a faithful representation of the real power plant. In the present paper, a description of the coupling scheme between the codes involved is given, and some examples of their application to Atucha II are shown.
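
    The essence of such closed-loop coupling can be sketched as a fixed-point (Picard) iteration between two solvers; the feedback models below are invented scalar stand-ins, not the Atucha II models:

    ```python
    def neutronics(fuel_temp):
        """Invented stand-in: Doppler feedback lowers power as fuel heats."""
        return 100.0 / (1.0 + 0.002 * (fuel_temp - 600.0))

    def thermal_hydraulics(power):
        """Invented stand-in: fuel temperature rises with power."""
        return 200.0 + 4.5 * power

    power, temp = 0.0, 600.0
    for it in range(100):
        new_power = neutronics(temp)              # neutronics step
        new_temp = thermal_hydraulics(new_power)  # thermal-hydraulics step
        converged = abs(new_power - power) < 1e-3
        power, temp = new_power, new_temp
        if converged:
            break
    print(f"iterations: {it + 1}, power = {power:.2f}, T_fuel = {temp:.1f}")
    ```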

  9. Software architecture analysis of usability

    NARCIS (Netherlands)

    Folmer, E; van Gurp, J; Bosch, J; Bastide, R; Palanque, P; Roth, J

    2005-01-01

    Studies of software engineering projects show that a large number of usability related change requests are made after its deployment. Fixing usability problems during the later stages of development often proves to be costly, since many of the necessary changes require changes to the system that can

  10. Numerical methods in software and analysis

    CERN Document Server

    Rice, John R

    1992-01-01

    Numerical Methods, Software, and Analysis, Second Edition introduces science and engineering students to the methods, tools, and ideas of numerical computation. Introductory courses in numerical methods face a fundamental problem: there is too little time to learn too much. This text solves that problem by using high-quality mathematical software. In fact, the objective of the text is to present scientific problem solving using standard mathematical software. This book discusses numerous programs and software packages focusing on the IMSL library (including the PROTRAN system) and ACM Algorithm

  11. SiLK: A Tool Suite for Unsampled Network Flow Analysis at Scale

    Science.gov (United States)

    2014-06-01

    ... make the problem manageable, but sampling unacceptably reduces the fidelity of analytic conclusions. In this paper we discuss SiLK, a tool suite ... created to analyze this high-volume data source without sampling. SiLK implementation and architectural design are optimized to manage this Big Data

  12. Software safety analysis practice in installation phase

    Energy Technology Data Exchange (ETDEWEB)

    Huang, H. W.; Chen, M. H.; Shyu, S. S., E-mail: hwhwang@iner.gov.t [Institute of Nuclear Energy Research, No. 1000 Wenhua Road, Chiaan Village, Longtan Township, 32546 Taoyuan County, Taiwan (China)

    2010-10-15

    This work performed a software safety analysis in the installation phase of the Lungmen nuclear power plant in Taiwan, under the cooperation of the Institute of Nuclear Energy Research and TPC. The US Nuclear Regulatory Commission requests licensees to perform software safety analysis and software verification and validation in each phase of the software development life cycle, per Branch Technical Position 7-14. In this work, 37 safety-grade digital instrumentation and control systems were analyzed by failure mode and effects analysis, which is suggested by IEEE Standard 7-4.3.2-2003. During the installation phase, skew tests for the safety-grade network and point-to-point tests were performed. The failure mode and effects analysis showed all the single failure modes can be resolved by redundant means. Most of the common mode failures can be resolved by operator manual actions. (Author)

  13. A New Method For Processing Backscatter Imagery Collected By Multibeam Sonars: The HAWAII MR1 Sidescan Sonar Software Suite

    Science.gov (United States)

    Davis, R.; Appelgate, B.

    2003-12-01

    The Hawaii Mapping Research Group (HMRG) of the University of Hawaii has developed and used in-house software to process seafloor acoustic imagery (sidescan sonar) data acquired by its own shallow-towed HAWAII-MR1 phase-difference sidescan bathymetry mapping system, as well as phase-difference sonars operated by other institutions (NAVO SEAMAP-C, LDEO SCAMP, WHOI DSL-120A). During 2003 the HAWAII MR1 software tools were modified to operate on multibeam backscatter data collected by the University of Hawaii research vessel Kilo Moana to test whether the software could improve the quality of the multibeam backscatter and the resulting data products. We found that the MR1 processing tools are effective in eliminating speckle and stripe noise, and cross-track amplitude variability due to angle-varying gain. HMRG software was also used to produce final mosaics of sonar imagery. Once we determined the viability of using the MR1 tools to process Kilo Moana data, we expanded the toolkit to allow operation on data from other multibeam systems as well. HMRG processing software runs on Linux, Unix and Irix platforms. The main processing modules can be run in either graphical or command-line modes. The preferred processing scheme involves using the graphical interface to interactively determine noise filters appropriate for each survey. Once characterized, these filters are then applied in batch processes that run faster than graphical methods allow. Once processing is completed, the HMRG software is used to grid the data and then to assemble individual grids into mosaics. The mosaics can be output as Sun raster files, geotiffs, or Generic Mapping Tools (GMT) grids, which allow the mosaics to be imported into other GIS and charting programs. Here we present results from several recent surveys that used different hull-mounted multibeam systems: 12 kHz Sea Beam 2112 data from the US Coast Guard Cutter Healy, 12 kHz Simrad EM120 data from deep water surveys aboard the R/V Kilo Moana

  14. Software abstractions logic, language, and analysis

    CERN Document Server

    Jackson, Daniel

    2011-01-01

    In Software Abstractions Daniel Jackson introduces an approach to software design that draws on traditional formal methods but exploits automated tools to find flaws as early as possible. This approach--which Jackson calls "lightweight formal methods" or "agile modeling"--takes from formal specification the idea of a precise and expressive notation based on a tiny core of simple and robust concepts but replaces conventional analysis based on theorem proving with a fully automated analysis that gives designers immediate feedback. Jackson has developed Alloy, a language that captures the essence of software abstractions simply and succinctly, using a minimal toolkit of mathematical notions. This revised edition updates the text, examples, and appendixes to be fully compatible with the latest version of Alloy (Alloy 4). The designer can use automated analysis not only to correct errors but also to make models that are more precise and elegant. This approach, Jackson says, can rescue designers from "the tarpit of...

  15. Power and performance software analysis and optimization

    CERN Document Server

    Kukunas, Jim

    2015-01-01

    Power and Performance: Software Analysis and Optimization is a guide to solving performance problems in modern Linux systems. Power-efficient chips are no help if the software those chips run on is inefficient. Starting with the necessary architectural background as a foundation, the book demonstrates the proper usage of performance analysis tools in order to pinpoint the cause of performance problems, and includes best practices for handling common performance issues those tools identify. Provides expert perspective from a key member of Intel's optimization team on how processors and memory

  16. The emerging Web 2.0 social software: an enabling suite of sociable technologies in health and health care education.

    Science.gov (United States)

    Kamel Boulos, Maged N; Wheeler, Steve

    2007-03-01

    Web 2.0 sociable technologies and social software are presented as enablers in health and health care, for organizations, clinicians, patients and laypersons. They include social networking services, collaborative filtering, social bookmarking, folksonomies, social search engines, file sharing and tagging, mashups, instant messaging, and online multi-player games. The more popular Web 2.0 applications in education, namely wikis, blogs and podcasts, are but the tip of the social software iceberg. Web 2.0 technologies represent a quite revolutionary way of managing and repurposing/remixing online information and knowledge repositories, including clinical and research information, in comparison with the traditional Web 1.0 model. The paper also offers a glimpse of future software, touching on Web 3.0 (the Semantic Web) and how it could be combined with Web 2.0 to produce the ultimate architecture of participation. Although the tools presented in this review look very promising and potentially fit for purpose in many health care applications and scenarios, careful thinking, testing and evaluation research are still needed in order to establish 'best practice models' for leveraging these emerging technologies to boost our teaching and learning productivity, foster stronger 'communities of practice', and support continuing medical education/professional development (CME/CPD) and patient education.

  17. The Pragmatic Analysis on Offensive Words-With Suits as a Case

    Institute of Scientific and Technical Information of China (English)

    高盈盈; 李卉艳

    2016-01-01

    Because research on offensive words has focused on people in symmetric power contexts and ignored dynamic contexts, there is a lack of systematic analysis of the mechanism behind the realization forms of offensive words. Based on an analysis of Suits, this paper finds that power plays a significant role in constraining people's offensive words, and that their realization forms vary in different power contexts.

  18. Software for computerised analysis of cardiotocographic traces.

    Science.gov (United States)

    Romano, M; Bifulco, P; Ruffo, M; Improta, G; Clemente, F; Cesarelli, M

    2016-02-01

    Despite the widespread use of cardiotocography in foetal monitoring, the evaluation of foetal status suffers from considerable inter- and intra-observer variability. In order to overcome the main limitations of visual cardiotocographic assessment, computerised methods to analyse cardiotocographic recordings have recently been developed. In this study, new software for automated analysis of foetal heart rate is presented. It allows an automatic procedure for measuring the most relevant parameters derivable from cardiotocographic traces. Simulated and real cardiotocographic traces were analysed to test software reliability. In artificial traces, we simulated a set number of events (accelerations, decelerations and contractions) to be recognised. In the case of real signals, instead, results of the computerised analysis were compared with the visual assessment performed by 18 expert clinicians, and three performance indexes were computed to gain information about the performance of the proposed software. The software showed preliminary performance that we judged satisfactory, in that the results fully matched the requirements, as proved by tests on artificial signals in which all simulated events were detected by the software. Performance indexes computed in comparison with the obstetricians' evaluations are, on the contrary, less satisfactory; in fact they yielded the following values of the statistical parameters: sensitivity equal to 93%, positive predictive value equal to 82% and accuracy equal to 77%. Very probably this arises from the high variability of trace annotation carried out by clinicians.
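
    For reference, these three indexes are computed from event counts as follows; the counts below are invented, chosen only to give values of the same order as those reported (definitions of accuracy for event detection can vary between studies):

    ```python
    def detection_indexes(tp: int, fp: int, fn: int, tn: int):
        sensitivity = tp / (tp + fn)               # detected share of true events
        ppv = tp / (tp + fp)                       # detections that are real
        accuracy = (tp + tn) / (tp + fp + fn + tn)
        return sensitivity, ppv, accuracy

    # Invented counts, roughly matching the reported 93% / 82% / 77%
    print(detection_indexes(tp=93, fp=20, fn=7, tn=10))
    ```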

  19. GRACAT, Software for grounding and collision analysis

    DEFF Research Database (Denmark)

    Friis-Hansen, Peter; Simonsen, Bo Cerup

    2002-01-01

    and grounding accidents. The software consists of three basic analysis modules and one risk mitigation module: 1) frequency, 2) damage, and 3) consequence. These modules can be used individually or in series and the analyses can be performed in deterministic or probabilistic mode. Finally, in the mitigation...

  20. Intraprocedural Dataflow Analysis for Software Product Lines

    DEFF Research Database (Denmark)

    Brabrand, Claus; Ribeiro, Márcio; Tolêdo, Társis;

    2013-01-01

    Software product lines (SPLs) developed using annotative approaches such as conditional compilation come with an inherent risk of constructing erroneous products. For this reason, it is essential to be able to analyze such SPLs. However, as dataflow analysis techniques are not able to deal with SPLs...

  1. Advanced Software Methods for Physics Analysis

    Science.gov (United States)

    Lista, L.

    2006-01-01

    Unprecedented data analysis complexity is experienced in modern High Energy Physics experiments. The complexity arises from the growing size of recorded data samples, the large number of data analyses performed by different users in each single experiment, and the level of complexity of each single analysis. For this reason, the requirements on software for data analysis impose a very high level of reliability. We present two concrete examples: the former from BaBar experience with the migration to a new Analysis Model with the definition of a new model for the Event Data Store, the latter about a toolkit for multivariate statistical and parametric Monte Carlo analysis developed using generic programming.

  2. Software for analysis of visual meteor data

    Science.gov (United States)

    Veljković, Kristina; Ivanović, Ilija

    2014-02-01

    In this paper, we will present new software for analysis of IMO data collected from visual observations. The software consists of a package of functions written in the statistical programming language R, as well as a Java application which uses these functions in a user friendly environment. R code contains various filters for selection of data, methods for calculation of Zenithal Hourly Rate (ZHR), solar longitude, population index and graphical representation of ZHR and distribution of observed magnitudes. The Java application allows everyone to use these functions without any knowledge of R. Both R code and the Java application are open source and free with user manuals and examples provided.
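
    The ZHR computation such scripts perform typically follows the standard IMO formula, correcting the raw count for limiting magnitude, field obstruction and radiant altitude; the observing-interval numbers below are invented:

    ```python
    import math

    def zhr(n, t_eff, lm, radiant_alt_deg, r, f=1.0):
        """Zenithal Hourly Rate for one observing interval.
        n: shower meteors seen; t_eff: effective observing time (hours);
        lm: stellar limiting magnitude; radiant_alt_deg: radiant altitude;
        r: population index; f: field-obstruction correction factor."""
        c_lm = r ** (6.5 - lm)                         # correct to lm = 6.5
        c_rad = 1.0 / math.sin(math.radians(radiant_alt_deg))
        return n * c_lm * f * c_rad / t_eff

    # Invented interval: 24 shower meteors in 1.5 h, lm 5.9, radiant at 55 deg
    print(f"ZHR = {zhr(24, 1.5, 5.9, 55.0, r=2.2):.0f}")
    ```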

  3. Improving Software Systems By Flow Control Analysis

    Directory of Open Access Journals (Sweden)

    Piotr Poznanski

    2012-01-01

    Full Text Available Using agile methods during the implementation of a system that meets mission-critical requirements can be a real challenge. A change in a system built of dozens or even hundreds of specialized devices with embedded software requires the cooperation of a large group of engineers. This article presents a solution that supports the parallel work of groups of system analysts and software developers. Applying formal rules to requirements written in natural language enables formal analysis of artifacts that act as a bridge between software and system requirements. The formalism and textual form of the requirements allowed the automatic generation of a message flow graph for the (sub)system, called the "big-picture-model". Flow diagram analysis helped to avoid a large number of defects whose repair cost in extreme cases could undermine the legitimacy of agile methods in projects of this scale. Retrospectively, a reduction of technical debt was observed. Continuous analysis of the "big picture model" improves the control of the quality parameters of the software architecture. The article also tries to explain why a commercial platform based on the UML modeling language may not be sufficient in projects of this complexity.

  4. 76 FR 60939 - Metal Fatigue Analysis Performed by Computer Software

    Science.gov (United States)

    2011-09-30

    ... COMMISSION Metal Fatigue Analysis Performed by Computer Software AGENCY: Nuclear Regulatory Commission... applicants' analyses and methodologies using the computer software package, WESTEMS™, to demonstrate... by Computer Software Addressees All holders of, and applicants for, a power reactor operating...

  5. Automating Risk Analysis of Software Design Models

    Directory of Open Access Journals (Sweden)

    Maxime Frydman

    2014-01-01

    Full Text Available The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  6. Development of Advanced Suite of Deterministic Codes for VHTR Physics Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kang Seog; Cho, J. Y.; Lee, K. H. (and others)

    2007-07-15

    An advanced suite of deterministic codes for VHTR physics analysis has been developed for detailed analysis of current and advanced reactor designs as part of a US-ROK collaborative I-NERI project. The code suite includes the conventional 2-step procedure, in which few-group constants are generated by a transport lattice calculation and the reactor physics analysis is performed by a 3-dimensional diffusion calculation, as well as a whole-core transport code that can model local heterogeneities directly at the core level. Particular modeling issues in the physics analysis of gas-cooled VHTRs were resolved, including the double heterogeneity of the coated fuel particles, neutron streaming in the coolant channels, a strong core-reflector interaction, and large spectrum shifts due to changes of the surrounding environment, temperature and burnup. The geometry handling capability of the DeCART code was extended to deal with the hexagonal fuel elements of the VHTR core. The developed code suite was validated and verified by comparing the computational results with those of Monte Carlo calculations for the benchmark problems.

  7. Development of automated conjunctival hyperemia analysis software.

    Science.gov (United States)

    Sumi, Tamaki; Yoneda, Tsuyoshi; Fukuda, Ken; Hoshikawa, Yasuhiro; Kobayashi, Masahiko; Yanagi, Masahide; Kiuchi, Yoshiaki; Yasumitsu-Lovell, Kahoko; Fukushima, Atsuki

    2013-11-01

    Conjunctival hyperemia is observed in a variety of ocular inflammatory conditions. The evaluation of hyperemia is indispensable for the treatment of patients with ocular inflammation. However, the major methods currently available for evaluation are based on nonquantitative and subjective methods. Therefore, we developed novel software to evaluate bulbar hyperemia quantitatively and objectively. First, we investigated whether the histamine-induced hyperemia of guinea pigs could be quantified by image analysis. Bulbar conjunctival images were taken by means of a digital camera, followed by the binarization of the images and the selection of regions of interest (ROIs) for evaluation. The ROIs were evaluated by counting the number of absolute pixel values. Pixel values peaked significantly 1 minute after histamine challenge was performed and were still increased after 5 minutes. Second, we applied the same method to antigen (ovalbumin)-induced hyperemia of sensitized guinea pigs, acquiring similar results except for the substantial upregulation in the first 5 minutes after challenge. Finally, we analyzed human bulbar hyperemia using the new software we developed especially for human usage. The new software allows the automatic calculation of pixel values once the ROIs have been selected. In our clinical trials, the percentage of blood vessel coverage of ROIs was significantly higher in the images of hyperemia caused by allergic conjunctival diseases and hyperemia induced by Bimatoprost, compared with those of healthy volunteers. We propose that this newly developed automated hyperemia analysis software will be an objective clinical tool for the evaluation of ocular hyperemia.
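
    A minimal sketch of the binarize-then-count approach described above, using a random array as a stand-in for a conjunctival photograph; the threshold and ROI bounds are arbitrary:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # Random stand-in for one channel of a bulbar conjunctiva photograph
    image = rng.integers(0, 256, size=(480, 640), dtype=np.uint8)

    # Binarization: pixels above an arbitrary threshold count as "vessel"
    vessel_mask = image > 200

    # Rectangular ROI; the real software lets the operator select ROIs
    roi = vessel_mask[100:300, 200:500]
    coverage_pct = 100.0 * roi.sum() / roi.size
    print(f"vessel coverage of ROI: {coverage_pct:.1f}%")
    ```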

  8. Complexity of software trustworthiness and its dynamical statistical analysis methods

    Institute of Scientific and Technical Information of China (English)

    ZHENG ZhiMing; MA ShiLong; LI Wei; JIANG Xin; WEI Wei; MA LiLi; TANG ShaoTing

    2009-01-01

    Developing trusted software has become an important trend and a natural choice in the development of software technology and applications. At present, the methods of measurement and assessment of software trustworthiness cannot guarantee safe and reliable operations of software systems completely and effectively. Based on the dynamical system study, this paper interprets the characteristics of behaviors of software systems and the basic scientific problems of software trustworthiness complexity, analyzes the characteristics of complexity of software trustworthiness, and proposes to study the software trustworthiness measurement in terms of the complexity of software trustworthiness. Using dynamical statistical analysis methods, the paper advances an invariant-measure based assessment method of software trustworthiness by statistical indices, and hereby provides a dynamical criterion for the untrustworthiness of software systems. By an example, the feasibility of the proposed dynamical statistical analysis method in software trustworthiness measurement is demonstrated using numerical simulations and theoretical analysis.

  9. Analysis and design for architecture-based software

    Institute of Scientific and Technical Information of China (English)

    Jia Xiaolin; He Jian; Qin Zheng; Wang Xianghua

    2005-01-01

    The technologies of software architecture are introduced, and the software analysis-and-design process is divided into requirement analysis, software architecture design and system design. Using these technologies, a model of the architecture-centric software analysis and design process (ACSADP) is proposed. Meanwhile, with regard to the completeness, consistency and correctness between the software requirements and design results, the theories of function and process control are applied to ACSADP. Finally, a model of an integrated development environment (IDE) for ACSADP is proposed. Practice demonstrates that the ACSADP model can aid developers in managing the software process effectively and improving the quality of software analysis and design.

  10. A Comparative Analysis of Institutional Repository Software

    OpenAIRE

    2010-01-01

    This proposal outlines the design of a comparative analysis of the four institutional repository software packages that were represented at the 4th International Conference on Open Repositories held in 2009 in Atlanta, Georgia: EPrints, DSpace, Fedora and Zentity (The 4th International Conference on Open Repositories website, https://or09.library.gatech.edu). The study includes 23 qualitative and quantitative measures taken from default installations of the four repositories on a benchmark ma...

  11. LTP data analysis software and infrastructure

    Science.gov (United States)

    Nofrarias Serra, Miquel

    The LTP (LISA Technology Package) is the core part of the LISA Pathfinder mission. The main goal of the mission is to study the sources of any disturbances that perturb the motion of the freely-falling test masses from their geodesic trajectories, as well as to test various technologies needed for LISA. The LTP experiment is designed as a sequence of experimental runs in which the performance of the instrument is studied and characterised under different operating conditions. In order to best optimise subsequent experimental runs, each run must be promptly analysed to ensure that the following ones make best use of the available knowledge of the instrument. To do this, a robust and flexible data analysis software package is required. The software developed for the LTP data analysis is a comprehensive data analysis tool based on MATLAB. The environment provides an object-oriented approach to data analysis which allows the user to design and run data analysis pipelines, either graphically or via scripts. The output objects of the analyses contain a full history of the processing that took place; this history tree can be inspected and used to rebuild the objects. This poster introduces the analysis environment and the concepts that have gone into its design.
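    The "history tree" is the architecturally interesting idea here: every operation returns a new object that records how it was made, so results can be inspected and rebuilt. A toy Python illustration (the actual LTP toolbox is MATLAB-based; all class and method names below are invented for the sketch):

        import numpy as np

        class AnalysisObject:
            def __init__(self, data, history=None):
                self.data = np.asarray(data, dtype=float)
                self.history = history or []    # list of (operation, parameters)

            def detrend(self):
                out = self.data - self.data.mean()
                return AnalysisObject(out, self.history + [("detrend", {})])

            def scale(self, factor):
                return AnalysisObject(self.data * factor,
                                      self.history + [("scale", {"factor": factor})])

            def rebuild(self, raw):
                """Replay the recorded history on fresh raw data."""
                obj = AnalysisObject(raw)
                for op, params in self.history:
                    obj = getattr(obj, op)(**params)
                return obj

        result = AnalysisObject([1.0, 2.0, 3.0]).detrend().scale(2.0)
        print(result.history)   # [('detrend', {}), ('scale', {'factor': 2.0})]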

  12. R suite for the Reduction and Analysis of UFO Orbit Data

    Science.gov (United States)

    Campbell-Burns, P.; Kacerek, R.

    2016-02-01

    This paper presents work undertaken by UKMON to compile a suite of simple R scripts for the reduction and analysis of meteor data. The application of R in this context is by no means an original idea and there is no doubt that it has been used already in many reports to the IMO. However, we are unaware of any common libraries or shared resources available to the meteor community. By sharing our work we hope to stimulate interest and discussion. Graphs shown in this paper are illustrative and are based on current data from both EDMOND and UKMON.

  13. Static analysis of software the abstract interpretation

    CERN Document Server

    Boulanger, Jean-Louis

    2013-01-01

    The existing literature currently available to students and researchers is very general, covering only the formal techniques of static analysis. This book presents real examples of the formal techniques called "abstract interpretation" currently being used in various industrial fields: railway, aeronautics, space, automotive, etc. The purpose of this book is to present students and researchers, in a single book, with the wealth of experience of people who are intrinsically involved in the realization and evaluation of software-based safety-critical systems. As the authors are people curr
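    For readers new to the technique, here is a toy instance of abstract interpretation over the classic interval domain, as a Python sketch; the domain and its operations are textbook material, not drawn from the book itself:

        from dataclasses import dataclass

        @dataclass(frozen=True)
        class Interval:
            lo: float
            hi: float

            def __add__(self, other):
                return Interval(self.lo + other.lo, self.hi + other.hi)

            def __mul__(self, other):
                products = [self.lo * other.lo, self.lo * other.hi,
                            self.hi * other.lo, self.hi * other.hi]
                return Interval(min(products), max(products))

        # Abstractly evaluate y = x*x + x for x in [-2, 3]: every concrete
        # run is guaranteed to land inside the resulting interval.
        x = Interval(-2.0, 3.0)
        y = x * x + x
        print(y)   # Interval(lo=-8.0, hi=12.0) -- sound, over-approximate bounds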

  14. VSI-E Software Suite

    CERN Document Server

    Lapsley, D

    2004-01-01

    As broadband access to high-speed research and education networks has become increasingly available to radio telescopes around the world, the use of e-VLBI has also increased. High-bandwidth e-VLBI experiments have been achieved across wide areas (Whitney et al., 2003). e-VLBI has also been used for the transfer of data from "production" experiments. As the use of e-VLBI becomes more and more prevalent, the need for some form of standard framework facilitating interoperability between the various acquisition, transport, and processing systems around the world is becoming increasingly important. This is the motivation behind the VLBI Standard Interface - Electronic (VSI-E) standard. VSI-E is currently in draft form and is going through the standards process within the VLBI community. In this poster, we describe an initial reference implementation of the VSI-E protocol. The implementation has been done in C/C++ and takes the form of a reusable library. It has been developed...

  15. A Software Tool for Quantitative Seismicity Analysis - ZMAP

    Science.gov (United States)

    Wiemer, S.; Gerstenberger, M.

    2001-12-01

    Earthquake catalogs are probably the most basic product of seismology, and remain arguably the most useful for tectonic studies. Modern seismograph networks can locate up to 100,000 earthquakes annually, providing a continuous and sometimes overwhelming stream of data. ZMAP is a set of tools driven by a graphical user interface (GUI), designed to help seismologists analyze catalog data. ZMAP is primarily a research tool suited to the evaluation of catalog quality and to addressing specific hypotheses; however, it can also be useful in routine network operations. Examples of ZMAP features include catalog quality assessment (artifacts, completeness, explosion contamination), interactive data exploration, mapping of transients in seismicity (rate changes, b-values, p-values), fractal dimension analysis, and stress tensor inversions. Roughly 100 scientists worldwide have used the software at least occasionally, and about 30 peer-reviewed publications have made use of ZMAP. The ZMAP code is open source, written in MATLAB (The MathWorks), a commercial language widely used in the natural sciences. ZMAP was first published in 1994 and has continued to grow over the past 7 years; recently, we released ZMAP v.6. The poster will introduce the features of ZMAP. We will specifically focus on ZMAP features related to time-dependent probabilistic hazard assessment. We are currently implementing a ZMAP-based system that computes probabilistic hazard maps, which combine the stationary background hazard as well as aftershock and foreshock hazard into a comprehensive time-dependent probabilistic hazard map. These maps will be displayed in near real time on the Internet. This poster is also intended as a forum for ZMAP users to provide feedback and discuss the future of ZMAP.
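    One representative ZMAP-style computation is the maximum-likelihood b-value of the Gutenberg-Richter law. A hedged sketch using the standard Aki/Utsu estimator; the catalog values and binning width are made up, and this is not ZMAP's actual code:

        import numpy as np

        def b_value(mags, mc, dm=0.1):
            """Aki/Utsu maximum-likelihood b-value for events with M >= mc,
            where dm is the magnitude binning width of the catalog."""
            m = np.asarray(mags)
            m = m[m >= mc]
            return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

        catalog = [2.1, 2.3, 2.0, 3.4, 2.2, 2.8, 2.5, 4.1, 2.0, 2.6]
        print(f"b = {b_value(catalog, mc=2.0):.2f}")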

  16. statnet: Software Tools for the Representation, Visualization, Analysis and Simulation of Network Data

    Directory of Open Access Journals (Sweden)

    Mark S. Handcock

    2007-12-01

    statnet is a suite of software packages for statistical network analysis. The packages implement recent advances in network modeling based on exponential-family random graph models (ERGMs). The components of the package provide a comprehensive framework for ERGM-based network modeling, including tools for model estimation, model evaluation, model-based network simulation, and network visualization. This broad functionality is powered by a central Markov chain Monte Carlo (MCMC) algorithm. The coding is optimized for speed and robustness.
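    The central MCMC step can be illustrated with an edge-toggle Metropolis sampler for an ERGM with edge and triangle statistics, where P(G) is proportional to exp(theta_edge*edges + theta_tri*triangles). This is a toy Python sketch of the idea; statnet itself is R code, and all parameters here are purely illustrative:

        import math
        import random

        def ergm_sample(n, theta_edge, theta_tri, steps=20000, seed=0):
            rng = random.Random(seed)
            adj = [set() for _ in range(n)]
            for _ in range(steps):
                i, j = rng.sample(range(n), 2)
                common = len(adj[i] & adj[j])     # triangles the pair (i, j) closes
                adding = j not in adj[i]
                sign = 1 if adding else -1
                delta = sign * (theta_edge + theta_tri * common)  # change in log-prob
                if delta >= 0 or rng.random() < math.exp(delta):
                    if adding:
                        adj[i].add(j); adj[j].add(i)
                    else:
                        adj[i].discard(j); adj[j].discard(i)
            return adj

        g = ergm_sample(n=20, theta_edge=-1.5, theta_tri=0.2)
        print(sum(len(a) for a in g) // 2, "edges in the sampled graph")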

  17. STAR: Software Toolkit for Analysis Research

    Energy Technology Data Exchange (ETDEWEB)

    Doak, J.E.; Prommel, J.M.; Whiteson, R.; Hoffbauer, B.L.; Thomas, T.R. [Los Alamos National Lab., NM (United States); Helman, P. [New Mexico Univ., Albuquerque, NM (United States). Dept. of Computer Science

    1993-08-01

    Analyzing vast quantities of data from diverse information sources is an increasingly important element of nonproliferation and arms control analysis. Much of the work in this area has used human analysts to assimilate, integrate, and interpret complex information gathered from various sources. With the advent of fast computers, we now have the capability to automate this process, thereby shifting the burden away from humans. In addition, huge data storage capabilities now exist which have made it possible to formulate large integrated databases comprising many terabytes of information spanning a variety of subjects. We are currently designing a Software Toolkit for Analysis Research (STAR) to address these issues. The goal of STAR is to produce a research tool that facilitates the development and interchange of algorithms for locating phenomena of interest to nonproliferation and arms control experts. One major component deals with the preparation of information. The ability to manage and effectively transform raw data into a meaningful form is a prerequisite for analysis by any methodology. The relevant information to be analyzed can be unstructured text, structured data, signals, or images. Text can be numerical and/or character, stored in raw data files, databases, or streams of bytes, or compressed into bits, in formats ranging from fixed, to character-delimited, to a count followed by content. The data can be analyzed in real-time or batch mode. Once the data are preprocessed, different analysis techniques can be applied. Some are built using expert knowledge; others are trained using data collected over a period of time. Currently, we are considering three classes of analyzers for use in our software toolkit: (1) traditional machine learning techniques, (2) purely statistical systems, and (3) expert systems.

  18. SWOT Analysis of Software Development Process Models

    Directory of Open Access Journals (Sweden)

    Ashish B. Sasankar

    2011-09-01

    Software worth billions of dollars has gone to waste in the past due to the lack of proper techniques for developing software, resulting in a software crisis. Historically, the process of software development has played an important role in software engineering. A number of life cycle models have been developed in the last three decades. This paper is an attempt to analyze software process models using the SWOT method. The objective is to identify the strengths, weaknesses, opportunities, and threats of the Waterfall, Spiral, and Prototype models, among others.

  19. Visual querying and analysis of large software repositories

    NARCIS (Netherlands)

    Voinea, Lucian; Telea, Alexandru

    2009-01-01

    We present a software framework for mining software repositories. Our extensible framework enables the integration of data extraction from repositories with data analysis and interactive visualization. We demonstrate the applicability of the framework by presenting several case studies performed on

  20. Software Speeds Up Analysis of Breast Cancer Risk

    Science.gov (United States)

    Software Speeds Up Analysis of Breast Cancer Risk: Study. THURSDAY, Sept. 22, 2016 (HealthDay News) -- Software that quickly analyzes mammograms and patient history to ... (truncated record; full story: https://medlineplus.gov/news/fullstory_161117.html)

  1. Scientific Data Analysis and Software Support: Geodynamics

    Science.gov (United States)

    Klosko, Steven; Sanchez, B. (Technical Monitor)

    2000-01-01

    The support on this contract centers on development of data analysis strategies, geodynamic models, and software codes to study four-dimensional geodynamic and oceanographic processes, as well as studies and mission support for near-Earth and interplanetary satellite missions. SRE had a subcontract to maintain the optical laboratory for the LTP, where instruments such as MOLA and GLAS are developed. NVI performed work on a Raytheon laser altimetry task through a subcontract, providing data analysis and final data production for distribution to users. HBG had a subcontract for specialized digital topography analysis and map generation. Over the course of this contract, Raytheon ITSS staff have supported over 60 individual tasks. Some tasks have remained in place during this entire interval whereas others have been completed and were of shorter duration. Over the course of events, task numbers were changed to reflect changes in the character of the work or new funding sources. The description presented below will detail the technical accomplishments that have been achieved according to their science and technology areas. What will be shown is a brief overview of the progress that has been made in each of these investigative and software development areas. Raytheon ITSS staff members have received many awards for their work on this contract, including GSFC Group Achievement Awards for TOPEX Precision Orbit Determination and the Joint Gravity Model One Team. NASA JPL gave the TOPEX/POSEIDON team a medal commemorating the completion of the primary mission and a Certificate of Appreciation. Raytheon ITSS has also received a Certificate of Appreciation from GSFC for its extensive support of the Shuttle Laser Altimeter Experiment.

  2. Method for detecting software anomalies based on recurrence plot analysis

    OpenAIRE

    Michał Mosdorf

    2012-01-01

    This paper evaluates a method for detecting software anomalies based on recurrence plot analysis of the trace log generated by software execution. The described method is based on windowed recurrence quantification analysis for selected measures (e.g., recurrence rate, RR, or determinism, DET). Initial results show that the proposed method is useful in detecting silent software anomalies that do not result in typical crashes (e.g., exceptions).

  3. Method for detecting software anomalies based on recurrence plot analysis

    Directory of Open Access Journals (Sweden)

    Michał Mosdorf

    2012-03-01

    This paper evaluates a method for detecting software anomalies based on recurrence plot analysis of the trace log generated by software execution. The described method is based on windowed recurrence quantification analysis for selected measures (e.g., recurrence rate, RR, or determinism, DET). Initial results show that the proposed method is useful in detecting silent software anomalies that do not result in typical crashes (e.g., exceptions).
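    The two measures named above can be computed from a recurrence matrix in a few lines. A hedged Python sketch (the threshold, minimum diagonal length, and demo series are assumptions; the main diagonal is included for simplicity, and the paper's own windowed implementation is not shown):

        import numpy as np

        def rqa_measures(x, eps=0.1, lmin=2):
            x = np.asarray(x, dtype=float)
            n = len(x)
            # Recurrence matrix: points closer than eps are "recurrent".
            R = (np.abs(x[:, None] - x[None, :]) < eps).astype(int)
            rr = R.sum() / (n * n)                       # recurrence rate (RR)
            # Determinism (DET): fraction of recurrence points lying on
            # diagonal lines of length >= lmin.
            diag_points = 0
            for k in range(-(n - 1), n):
                run = 0
                for v in list(np.diagonal(R, offset=k)) + [0]:  # sentinel flushes last run
                    if v:
                        run += 1
                    else:
                        if run >= lmin:
                            diag_points += run
                        run = 0
            det = diag_points / R.sum() if R.sum() else 0.0
            return rr, det

        trace = np.sin(np.linspace(0, 8 * np.pi, 200))   # stand-in for a trace-log metric
        rr, det = rqa_measures(trace)
        print(f"RR = {rr:.3f}, DET = {det:.3f}")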

  4. Modification of ACAP for use in accuracy assessment of safety analysis software at OPG

    Energy Technology Data Exchange (ETDEWEB)

    Popescu, A.I.; Pascoe, J.; Luxat, J.C. [Ontario Power Generation Inc., Nuclear Safety Technology Dept., Toronto, Ontario (Canada)

    2002-07-01

    ACAP (Automated Code Assessment Program) is a software tool designed to compare code results either with experimental measurements or with the results of other codes, by means of figures of merit (FOM). The FOM are based on equations used in approximation theory, time-series data analysis, and statistical analysis, and provide an objective assessment of the agreement between individual or suite comparisons. This paper describes new ACAP features and FOM developed and implemented at OPG. These modifications were performed to increase productivity and enable ACAP to quantify the accuracy of safety analysis (SA) software in support of the OPG Software Qualification process. The capabilities added to ACAP are focused on data spectral analysis and assessment of the normalcy of the residual value distribution. The latter functionality is provided by new FOM that compare the distribution of measured-minus-computed data with a normal distribution using the normal probability plot method. Other new features implemented at OPG relate to linkage with other Windows applications, improvement of the plot engine (power spectra graphics, Q-Q plots), etc. Application of ACAP to quantify the accuracy of the GOTHIC code's modeling of buoyancy-induced gas mixing using the LSGMF tests is described. Analyses of 170 experiment/code prediction comparisons using ACAP indicate that it is well suited for qualification of accuracy for software used in safety analysis. (author)
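    One FOM of the kind described, assessing normality of the measured-minus-computed residuals via a normal probability (Q-Q) plot correlation, might look as follows. This is a sketch under assumed data, not ACAP's implementation:

        import numpy as np
        from scipy import stats

        def normality_fom(measured, computed):
            resid = np.sort(np.asarray(measured) - np.asarray(computed))
            n = len(resid)
            # Plotting positions and theoretical normal quantiles (Q-Q plot axes).
            p = (np.arange(1, n + 1) - 0.5) / n
            q = stats.norm.ppf(p)
            return np.corrcoef(q, resid)[0, 1]   # ~1.0 means near-normal residuals

        rng = np.random.default_rng(1)
        measured = rng.normal(10.0, 0.5, 200)    # invented "measurements"
        computed = np.full(200, 10.0)            # invented code predictions
        print(f"Q-Q correlation FOM: {normality_fom(measured, computed):.3f}")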

  5. Mutation Analysis Approach to Develop Reliable Object-Oriented Software

    Directory of Open Access Journals (Sweden)

    Monalisa Sarma

    2014-01-01

    Modern programs are large and complex, and it is essential that they be highly reliable. To help developers build reliable software, the Java programming language provides a rich set of exceptions and exception handling mechanisms. Given a program with exception handling constructs, effective testing must detect whether all possible exceptions are raised and caught. However, complex exception handling constructs make it tedious to trace which exceptions are handled where and which exceptions are passed on. In this paper, we address this problem and propose a mutation analysis approach to developing reliable object-oriented programs. We apply a number of mutation operators to create a large set of mutant programs with different types of faults. We then generate test cases and test data to uncover exception-related faults. The resulting test suite is applied to the mutant programs, and the mutation score is measured to verify whether the test suite is effective. We have tested our approach on a number of case studies to substantiate the efficacy of the proposed mutation analysis technique.
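    The mutation-score bookkeeping at the heart of the approach is simple to state: the score is the fraction of mutants "killed" (distinguished from the original) by at least one test. A minimal sketch (the mutants and tests are trivial stand-ins, not the paper's Java operators):

        def original(a, b):
            return a + b

        mutants = [
            lambda a, b: a - b,      # arithmetic-operator mutant
            lambda a, b: a + b + 1,  # constant-perturbation mutant
            lambda a, b: b + a,      # equivalent mutant (never killed)
        ]

        tests = [((2, 3), 5), ((0, 0), 0), ((-1, 4), 3)]

        killed = sum(
            any(m(*args) != expected for args, expected in tests)
            for m in mutants
        )
        print(f"mutation score: {killed}/{len(mutants)} = {killed/len(mutants):.2f}")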

  6. Analysis of Empirical Software Effort Estimation Models

    CERN Document Server

    Basha, Saleem

    2010-01-01

    Reliable effort estimation remains an ongoing challenge for software engineers. Accurate effort estimation is the state of the art of software engineering; it is the preliminary phase between the client and the business enterprise, and the relationship between the two begins with the estimation of the software. The credibility of the business enterprise to the client increases with accurate estimation. Effort estimation often requires generalizing from a small number of historical projects, and generalization from such limited experience is an inherently under-constrained problem. Accurate estimation is a complex process because it amounts to software effort prediction and, as the term indicates, a prediction never becomes an actual. This work follows the basics of empirical software effort estimation models. The goal of this paper is to study empirical software effort estimation. The primary conclusion is that no single technique is best for all sit...
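    One classic empirical model of the kind surveyed is basic COCOMO, effort = a * KLOC^b person-months, with Boehm's published coefficients per project mode. A small illustration (the 32-KLOC input is arbitrary):

        # Basic COCOMO coefficients (Boehm 1981) by project mode.
        COCOMO_COEFFS = {
            "organic":       (2.4, 1.05),
            "semi-detached": (3.0, 1.12),
            "embedded":      (3.6, 1.20),
        }

        def basic_cocomo_effort(kloc: float, mode: str = "organic") -> float:
            a, b = COCOMO_COEFFS[mode]
            return a * kloc ** b   # effort in person-months

        print(f"{basic_cocomo_effort(32, 'organic'):.1f} person-months")  # ~91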

  7. LANDSAFE: LANDING SITE RISK ANALYSIS SOFTWARE FRAMEWORK

    Directory of Open Access Journals (Sweden)

    R. Schmidt

    2012-08-01

    The European Space Agency (ESA) is planning a Lunar Lander mission in the 2018 timeframe that will demonstrate precise soft landing in the polar regions of the Moon. To ensure a safe and successful landing, a careful risk analysis has to be carried out, comprising the identification of favorable target areas and the evaluation of the surface conditions in these areas. Features like craters, boulders, steep slopes, rough surfaces, and shadowed areas have to be identified in order to assess the risk associated with a landing site in terms of a successful touchdown and subsequent surface operation of the lander. In addition, global illumination conditions at the landing site have to be simulated and analyzed. The Landing Site Risk Analysis software framework (LandSAfe) is a system for the analysis, selection, and certification of safe landing sites on the lunar surface. LandSAfe generates several data products including high-resolution digital terrain models (DTMs), hazard maps, illumination maps, temperature maps, and surface reflectance maps which assist the user in evaluating potential landing site candidates. This paper presents the LandSAfe system and describes the methods and products of its different modules. For one candidate landing site on the rim of Shackleton crater at the south pole of the Moon, a high-resolution DTM is showcased.
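    One LandSAfe-style product, a slope/hazard map derived from a DTM, can be sketched as follows. The grid, cell size, and 15-degree hazard limit are assumptions for the example, not mission values:

        import numpy as np

        def slope_map(dtm: np.ndarray, cell_size: float) -> np.ndarray:
            """Slope in degrees from a regular-grid DTM (heights in meters)."""
            dz_dy, dz_dx = np.gradient(dtm, cell_size)
            return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

        dtm = np.outer(np.linspace(0, 50, 100), np.ones(100))   # uniform ramp
        slopes = slope_map(dtm, cell_size=10.0)
        hazard = slopes > 15.0                                  # too steep to land
        print(f"max slope: {slopes.max():.1f} deg, hazardous cells: {hazard.sum()}")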

  8. A suite of R packages for web-enabled modeling and analysis of surface waters

    Science.gov (United States)

    Read, J. S.; Winslow, L. A.; Nüst, D.; De Cicco, L.; Walker, J. I.

    2014-12-01

    Researchers often create redundant methods for downloading, manipulating, and analyzing data from online resources. Moreover, the reproducibility of science can be hampered by complicated and voluminous data, lack of time for documentation and long-term maintenance of software, and fear of exposing programming skills. The combination of these factors can encourage unshared one-off programmatic solutions instead of openly provided reusable methods. Federal and academic researchers in the water resources and informatics domains have collaborated to address these issues. The result of this collaboration is a suite of modular R packages that can be used independently or as elements in reproducible analytical workflows. These documented and freely available R packages were designed to fill basic needs for the effective use of water data: the retrieval of time-series and spatial data from web resources (dataRetrieval, geoknife), performing quality assurance and quality control checks of these data with robust statistical methods (sensorQC), the creation of useful data derivatives (including physically- and biologically-relevant indices; GDopp, LakeMetabolizer), and the execution and evaluation of models (glmtools, rLakeAnalyzer). Here, we share details and recommendations for the collaborative coding process, and highlight the benefits of an open-source tool development pattern with a popular programming language in the water resources discipline (such as R). We provide examples of reproducible science driven by large volumes of web-available data using these tools, explore benefits of accessing packages as standardized web processing services (WPS) and present a working platform that allows domain experts to publish scientific algorithms in a service-oriented architecture (WPS4R). We assert that in the era of open data, tools that leverage these data should also be freely shared, transparent, and developed in an open innovation environment.

  9. STATISTICAL ANALYSIS FOR OBJECT ORIENTED DESIGN SOFTWARE SECURITY METRICS

    OpenAIRE

    Amjan.Shaik; Dr.C.R.K.Reddy; Dr.A.Damodaran

    2010-01-01

    In the last decade, empirical studies on object-oriented design metrics have shown some of them to be useful for predicting the fault-proneness of classes in object-oriented software systems. In the era of computerization the object-oriented paradigm is becoming more and more pronounced. This has provoked the need for high-quality object-oriented software, as traditional metrics cannot be applied to object-oriented systems. This paper gives an evaluation of the CK suite of metrics. There are q...

  10. An Automated Solar Synoptic Analysis Software System

    Science.gov (United States)

    Hong, S.; Lee, S.; Oh, S.; Kim, J.; Lee, J.; Kim, Y.; Lee, J.; Moon, Y.; Lee, D.

    2012-12-01

    We have developed an automated software system for identifying solar active regions, filament channels, and coronal holes, the three major solar sources of space weather. Space weather forecasters at the NOAA Space Weather Prediction Center produce solar synoptic drawings on a daily basis to predict solar activities, i.e., solar flares, filament eruptions, high-speed solar wind streams, and co-rotating interaction regions, as well as their possible effects on the Earth. As an attempt to emulate this process in a fully automated and consistent way, we developed a software system named ASSA (Automated Solar Synoptic Analysis). When identifying solar active regions, ASSA uses high-resolution SDO HMI intensitygrams and magnetograms as inputs and provides the McIntosh classification and Mt. Wilson magnetic classification of each active region by applying appropriate image processing techniques such as thresholding, morphology extraction, and region growing. At the same time, it extracts morphological and physical properties of active regions in a quantitative way for the short-term prediction of flares and CMEs. When identifying filament channels and coronal holes, images of the global H-alpha network and SDO AIA 193 are used for morphological identification, and SDO HMI magnetograms for quantitative verification. The output results of ASSA are routinely checked and validated against NOAA's daily SRS (Solar Region Summary) and UCOHO (URSIgram code for coronal hole information). A couple of preliminary scientific results are presented using available outputs. ASSA will be deployed at the Korean Space Weather Center and serve its customers in operational status by the end of 2012.
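    The thresholding and region-growing steps mentioned above can be illustrated with a toy 4-connected region grower; the arrays, seed choice, and thresholds are invented for the sketch, and ASSA's actual pipeline is more elaborate:

        import numpy as np
        from collections import deque

        def grow_region(img, seed, threshold):
            """Collect the 4-connected pixels >= threshold around a seed pixel."""
            rows, cols = img.shape
            region, queue = set(), deque([seed])
            while queue:
                r, c = queue.popleft()
                if (r, c) in region or not (0 <= r < rows and 0 <= c < cols):
                    continue
                if img[r, c] < threshold:
                    continue
                region.add((r, c))
                queue.extend([(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)])
            return region

        img = np.zeros((50, 50))
        img[20:30, 20:30] = 1.0                      # bright "active region"
        seed = tuple(np.argwhere(img >= 0.5)[0])     # any above-threshold pixel
        print(len(grow_region(img, seed, 0.5)), "pixels in the grown region")  # 100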

  11. Software reliability experiments data analysis and investigation

    Science.gov (United States)

    Walker, J. Leslie; Caglayan, Alper K.

    1991-01-01

    The objectives are to investigate the fundamental reasons which cause independently developed software programs to fail dependently, and to examine fault tolerant software structures which maximize reliability gain in the presence of such dependent failure behavior. The authors used 20 redundant programs from a software reliability experiment to analyze the software errors causing coincident failures, to compare the reliability of N-version and recovery block structures composed of these programs, and to examine the impact of diversity on software reliability using subpopulations of these programs. The results indicate that both conceptually related and unrelated errors can cause coincident failures and that recovery block structures offer more reliability gain than N-version structures if acceptance checks that fail independently from the software components are available. The authors present a theory of general program checkers that have potential application for acceptance tests.
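    The two fault-tolerant structures compared in the study differ in how they combine redundant versions. A schematic Python contrast (the component "versions" and acceptance test are stand-ins, not the experiment's programs):

        from collections import Counter

        def n_version(versions, x):
            """Return the majority answer across independently developed versions."""
            votes = Counter(v(x) for v in versions)
            answer, count = votes.most_common(1)[0]
            return answer if count > len(versions) // 2 else None   # no majority

        def recovery_block(versions, acceptance_test, x):
            """Try versions in order; return the first output passing the test."""
            for v in versions:
                y = v(x)
                if acceptance_test(x, y):
                    return y
            return None

        versions = [lambda x: x * x, lambda x: x * x, lambda x: x * x + 1]  # one faulty
        accept = lambda x, y: abs(y - x * x) < 0.5   # an independent acceptance check
        print(n_version(versions, 3))                # 9: majority outvotes the fault
        print(recovery_block(versions, accept, 3))   # 9: first version passes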

  12. Software patterns, knowledge maps, and domain analysis

    CERN Document Server

    Fayad, Mohamed E; Hegde, Srikanth GK; Basia, Anshu; Vakil, Ashka

    2014-01-01

    Preface. Acknowledgments. Authors. INTRODUCTION: An Overview of Knowledge Maps. Introduction: Key Concepts - Software Stable Models, Knowledge Maps, Pattern Language, Goals, Capabilities (Enduring Business Themes + Business Objects). The Motivation. The Problem. The Objectives. Overview of Software Stability Concepts. Overview of Knowledge Maps. Pattern Languages versus Knowledge Maps: A Brief Comparison. The Solution. Knowledge Maps Methodology or Concurrent Software Development Model. Why Knowledge Maps? Research Methodology Undertaken. Research Verification and Validation. The Stratification of This Book. Summary

  13. Analysis on Some of Software Reliability Models

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The Software Reliability & Maintainability Evaluation Tool (SRMET 3.0), developed by the Software Evaluation and Test Center of the China Aerospace Mechanical Corporation, is introduced in detail in this paper. SRMET 3.0 is supported by seven software reliability models and four software maintainability models. Numerical characteristics of all these models are studied in depth, and corresponding numerical algorithms for each model are also given.

  14. Testing the global capabilities of the Antelope software suite: fast location and Mb determination of teleseismic events using the ASAIN and GSN seismic networks

    Science.gov (United States)

    Pesaresi, D.; Russi, M.; Plasencia, M.; Cravos, C.

    2009-04-01

    The Italian National Institute for Oceanography and Experimental Geophysics (Istituto Nazionale di Oceanografia e di Geofisica Sperimentale, OGS) runs the Antarctic Seismographic Argentinean Italian Network (ASAIN), made up of 5 seismic stations located in the Scotia Sea region of Antarctica and in Argentina: data from these stations are transferred in real time to the OGS headquarters in Trieste (Italy) via satellite links. OGS also runs, in close cooperation with the Friuli-Venezia Giulia Civil Defense, the North East (NI) Italy seismic network, making use of the Antelope commercial software suite from BRTT as the main acquisition system. As a test of the global capabilities of Antelope, we set up an instance of Antelope acquiring data in real time from both the regional ASAIN seismic network in Antarctica and a subset of the Global Seismic Network (GSN) funded by the Incorporated Research Institutions for Seismology (IRIS). The facilities of the IRIS Data Management System, and specifically the IRIS Data Management Center, were used for real-time access to the waveforms required in this study. Preliminary results over a one-month period indicate that about 82% of the earthquakes with magnitude M>5.0 listed in the PDE catalogue of the National Earthquake Information Center (NEIC) of the United States Geological Survey (USGS) were also correctly detected by Antelope, with an average location error of 0.05 degrees and an average body wave magnitude Mb estimation error below 0.1. The average time difference between event origin time and the actual time of event determination by Antelope was about 45': comparison with 20', the IASPEI91 P-wave travel time for 180 degrees distance, and 25', the estimated data latency of our test system, indicates that Antelope is a serious candidate for regional and global early warning systems. Updated figures calculated over a longer period of time will be presented and discussed.

  15. Efficacy of a Newly Designed Cephalometric Analysis Software for McNamara Analysis in Comparison with Dolphin Software.

    Directory of Open Access Journals (Sweden)

    Mahtab Nouri

    2015-02-01

    Cephalometric norms of McNamara analysis have been studied in various populations due to their optimal efficiency. The Dolphin cephalometric software greatly facilitates this analysis for orthodontic measurements. However, Dolphin is very expensive and cannot be afforded by many clinicians in developing countries. A suitable alternative software program in Farsi/English would greatly help Farsi-speaking clinicians. The present study aimed to develop an affordable Iranian cephalometric analysis software program and compare it with Dolphin, the standard software available on the market for cephalometric analysis. In this descriptive diagnostic study, 150 lateral cephalograms of individuals with normal occlusion were selected in Mashhad and Qazvin, two major cities of Iran mainly populated by the Fars ethnicity, the main Iranian ethnic group. After tracing the cephalograms, the McNamara analysis standards were measured both with Dolphin and with the new software. The cephalometric software was designed using Microsoft Visual C++ under Windows XP. Measurements made with the new software were compared with those of Dolphin on both series of cephalograms. Validity and reliability were tested using the intra-class correlation coefficient. Calculations showed a very high correlation between the results of the Iranian cephalometric analysis software and Dolphin, confirming the validity and optimal efficacy of the newly designed software (ICC 0.570-1.0). According to our results, the newly designed software has acceptable validity and reliability and can be used for orthodontic diagnosis, treatment planning, and assessment of treatment outcome.

  16. Software architecture reliability analysis using failure scenarios

    NARCIS (Netherlands)

    Tekinerdogan, Bedir; Sozer, Hasan; Aksit, Mehmet

    2008-01-01

    With the increasing size and complexity of software in embedded systems, software has now become a primary threat for the reliability. Several mature conventional reliability engineering techniques exist in literature but traditionally these have primarily addressed failures in hardware components a

  17. Analysis of Test Efficiency during Software Development Process

    CERN Document Server

    Nair, T R Gopalakrishnan; Tiwari, Pranesh Kumar

    2012-01-01

    One of the prerequisites of any organization is unvarying sustainability in a dynamic and competitive industrial environment. The development of high-quality software is therefore an inevitable constraint of any software industry. Defect management being one of the most influential factors in the production of high-quality software, it is obligatory for software organizations to orient themselves towards effective defect management. Since the dawn of software development, testing has been deemed a promising technique of defect management in all IT industries. This paper provides an empirical investigation of several projects through a case study comprising four software companies with various production capabilities. The aim of this investigation is to analyze the efficiency of test teams during the software development process. The study indicates very low test efficiency at the requirements analysis phase and even lower test efficiency at the design phase of software development. Subsequently, the study calls for a str...

  18. User manual for freight transportation analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Terziev, M.N.; Wilson, L.B.

    1976-12-01

    Under sponsorship of the Federal Energy Administration, The Center for Transportation Studies at M.I.T. developed and tested a methodology for analysis of the impacts of various government and carrier policies on the demand for freight transportation. The purpose of this document is to familiarize the reader with the computer programs included in this methodology. The purpose of the computer software developed for this project is threefold. First, programs are used to calculate the cost of each of the transport alternatives available for the purchase of a given commodity by a receiver in a given industrial sector. Furthermore, these programs identify the least-cost alternative, and thus provide a forecasting capability at the disaggregate level. Given a description of the population of receivers in the destination city, a second group of programs applies the costing and forecasting programs to each receiver in a sample drawn from the population. The disaggregate forecasts are summed to produce an aggregate forecast of modal tonnages for the given origin/destination city-pair. Finally, a third group of programs computes fuel consumed in transportation from the aggregate modal tonnages. These three groups of programs were placed under the control of a master routine which coordinates the input and output of data.

  19. Development of a New VLBI Data Analysis Software

    Science.gov (United States)

    Bolotin, Sergei; Gipson, John M.; MacMillan, Daniel S.

    2010-01-01

    We present an overview of a new VLBI analysis software package under development at NASA GSFC. The new software will replace CALC/SOLVE and many related utility programs. It will have the capabilities of the current system as well as incorporate new models and data analysis techniques. In this paper we give a conceptual overview of the new software and formulate its main goals. The software should be flexible and modular, to implement models and estimation techniques that currently exist or will appear in the future. On the other hand, it should be reliable and possess production quality for processing standard VLBI sessions. It also needs to be capable of processing observations from a fully deployed network of VLBI2010 stations in a reasonable time. We describe the software development process and outline the software architecture.

  20. Rapid Optical Characterization Suite for in situ Target Analysis of Rock Surfaces Project

    Data.gov (United States)

    National Aeronautics and Space Administration — ROCSTAR is an in situ instrument suite that can accomplish rapid mineral and molecular identification without sample preparation for in situ planetary exploration;...

  1. Using the Beopt Automated Residential Simulation Test Suite to Enable Comparative Analysis Between Energy Simulation Engines: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Tabares-Velasco, Paulo Cesar [National Renewable Energy Lab. (NREL), Golden, CO (United States); Maguire, Jeff [National Renewable Energy Lab. (NREL), Golden, CO (United States); Horowitz, Scott [National Renewable Energy Lab. (NREL), Golden, CO (United States); Christensen, Craig [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2014-09-01

    Verification and validation are crucial software quality control procedures to follow when developing and implementing models. This is particularly important because a variety of stakeholders rely on accurate predictions from building simulation programs. This study uses the BEopt Automated Residential Simulation Test Suite (BARTS) to facilitate comparison of two energy simulation engines across various building components and includes building models that isolate the impacts of specific components on annual energy consumption. As a case study, BARTS has been used to identify important discrepancies between the engines for several components of the building models. These discrepancies are caused by differences in the algorithms used by the engines or coding errors.

  2. Using the BEopt Automated Residential Simulation Test Suite to Enable Comparative Analysis Between Energy Simulation Engines: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Tabares-Velasco, P. C.; Maguire, J.; Horowitz, S.; Christensen, C.

    2014-09-01

    Verification and validation are crucial software quality control procedures when developing and implementing models. This is particularly important as a variety of stakeholders rely on accurate predictions from building simulation programs. This study uses the BEopt Automated Residential Simulation Test Suite (BARTS) to facilitate comparison of two energy simulation engines across various building components and includes models that isolate the impacts of specific building components on annual energy consumption. As a case study, BARTS has been used to identify important discrepancies between the engines for several components of the building models; these discrepancies are caused by differences in the models used by the engines or coding errors.

  3. The Combustion Experiment on the Sample Analysis at Mars (SAM) Instrument Suite on the Curiosity Rover

    Science.gov (United States)

    Stern, J. C.; Malespin, C. A.; Eigenbrode, J. L.; Graham, H. V.; Archer, P. D., Jr.; Brunner, A. E.; Freissinet, C.; Franz, H. B.; Fuentes, J.; Glavin, D. P.; Leshin, L. A.; Mahaffy, P. R.; McAdam, A. C.; Ming, D. W.; Navvaro-Gonzales, R.; Niles, P. B.; Steele, A.

    2014-01-01

    The combustion experiment on the Sample Analysis at Mars (SAM) suite on Curiosity will heat a sample of Mars regolith in the presence of oxygen and measure the composition of the evolved gases using quadrupole mass spectrometry (QMS) and tunable laser spectrometry (TLS). QMS will enable detection of combustion products such as CO, CO2, NO, and other oxidized species, while TLS will enable precise measurements of the abundance and carbon isotopic composition (δ13C) of the evolved CO2 and the hydrogen isotopic composition (δD) of H2O. SAM will perform a two-step combustion to isolate combustible materials below ~550 °C and above ~550 °C. The combustion experiment on SAM, if properly designed and executed, has the potential to answer multiple questions regarding the origins of volatiles seen thus far in SAM evolved gas analysis (EGA) on Mars. Constraints imposed by SAM and MSL time and power resources, as well as SAM consumables (oxygen gas), will limit the number of SAM combustion experiments, so it is imperative to design an experiment targeting the most pressing science questions. Low-temperature combustion experiments will primarily target the quantification of carbon (and nitrogen) contributed by the SAM wet chemistry reagents MTBSTFA (N-methyl-N-tert-butyldimethylsilyltrifluoroacetamide) and DMF (dimethylformamide), which have been identified in the background of blank and sample runs and may adsorb to the sample while the cup is in the Sample Manipulation System (SMS). In addition, differences between the sample and "blank" may yield information regarding the abundance and δ13C of bulk (both organic and inorganic) martian carbon. High-temperature combustion experiments primarily aim to detect refractory organic matter, if present in the Cumberland fines, as well as to address the question of the quantification and δD value of water evolved from hydroxyl hydrogen in clay minerals.
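    For reference, the delta notation used above expresses a per-mil deviation of the sample isotope ratio from a standard (VPDB for carbon). A small illustration using the conventional Craig (1957) VPDB 13C/12C value; the measured ratio below is invented:

        R_VPDB = 0.0112372   # 13C/12C of the VPDB standard (conventional value)

        def delta13C(r_sample: float) -> float:
            """delta13C in per mil from a measured 13C/12C ratio."""
            return (r_sample / R_VPDB - 1.0) * 1000.0

        print(f"{delta13C(0.0110124):.1f} per mil")   # ~ -20.0 per mil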

  4. Runtime analysis of search heuristics on software engineering problems

    Institute of Scientific and Technical Information of China (English)

    Per Kristian LEHRE; Xin YAO

    2009-01-01

    Many software engineering tasks can potentially be automated using search heuristics. However, much work is needed in designing and evaluating search heuristics before this approach can be routinely applied to a software engineering problem. Experimental methodology should be complemented with theoretical analysis to achieve this goal. Recently, there have been significant theoretical advances in the runtime analysis of evolutionary algorithms (EAs) and other search heuristics in other problem domains. We suggest that these methods could be transferred and adapted to gain insight into the behaviour of search heuristics on software engineering problems while automating software engineering.
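    A canonical example of the kind of result such runtime analyses produce is the (1+1) EA on OneMax, whose expected optimization time is known to be Theta(n log n). A minimal sketch of the algorithm itself (problem size and seed are arbitrary):

        import random

        def one_plus_one_ea(n, seed=0):
            rng = random.Random(seed)
            x = [rng.randint(0, 1) for _ in range(n)]
            steps = 0
            while sum(x) < n:                        # OneMax optimum: all ones
                y = [bit ^ (rng.random() < 1.0 / n) for bit in x]  # flip each bit w.p. 1/n
                if sum(y) >= sum(x):                 # accept if not worse
                    x = y
                steps += 1
            return steps

        print(one_plus_one_ea(50), "iterations to optimize OneMax(n=50)")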

  5. Computer software for process hazards analysis.

    Science.gov (United States)

    Hyatt, N

    2000-10-01

    Computerized software tools are assuming major significance in conducting HAZOPs. This is because they have the potential to offer better online presentations and performance to HAZOP teams, as well as better documentation and downstream tracking. The chances of something being "missed" are greatly reduced. We know, only too well, that HAZOP sessions can be like the industrial equivalent of a trip to the dentist. Sessions can (and usually do) become arduous and painstaking. To make the process easier for all those involved, we need all the help computerized software can provide. In this paper I have outlined the challenges addressed in the production of Windows software for performing HAZOP and other forms of PHA. The object is to produce more "intelligent", more user-friendly software for performing HAZOP where technical interaction between team members is of key significance. HAZOP techniques, having already proven themselves, are extending into the field of computer control and human error. This makes further demands on HAZOP software and emphasizes its importance.

  6. Software Piracy in Research: A Moral Analysis.

    Science.gov (United States)

    Santillanes, Gary; Felder, Ryan Marshall

    2015-08-01

    Researchers in virtually every discipline rely on sophisticated proprietary software for their work. However, some researchers are unable to afford the licenses and instead procure the software illegally. We discuss the prohibition of software piracy by intellectual property laws, and argue that the moral basis for the copyright law offers the possibility of cases where software piracy may be morally justified. The ethics codes that scientific institutions abide by are informed by a rule-consequentialist logic: by preserving personal rights to authored works, people able to do so will be incentivized to create. By showing that the law has this rule-consequentialist grounding, we suggest that scientists who blindly adopt their institutional ethics codes will commit themselves to accepting that software piracy could be morally justified, in some cases. We hope that this conclusion will spark debate over important tensions between ethics codes, copyright law, and the underlying moral basis for these regulations. We conclude by offering practical solutions (other than piracy) for researchers.

  7. Failure-Modes-And-Effects Analysis Of Software Logic

    Science.gov (United States)

    Garcia, Danny; Hartline, Thomas; Minor, Terry; Statum, David; Vice, David

    1996-01-01

    Rigorous analysis applied early in design effort. Method of identifying potential inadequacies and modes and effects of failures caused by inadequacies (failure-modes-and-effects analysis or "FMEA" for short) devised for application to software logic.

  8. Data and Analysis Center for Software.

    Science.gov (United States)

    1980-06-01

    DATA AND ANALYSIS CENTER FOR SOFTWARE (U). Duvall, L. M.; Gloss-Soler, S. A.; Martens, J. June 1980. F30602-78-6-0255, RADC-TR-80-204, UNCLASSIFIED. [The remainder of this scanned record is illegible; the only recoverable closing fragment concerns] inquiries received relating to software technology.

  9. Regression Testing Cost Reduction Suite

    Directory of Open Access Journals (Sweden)

    Mohamed Alaa El-Din

    2014-08-01

    The estimated cost of software maintenance exceeds 70 percent of total software costs [1], and a large portion of this maintenance expense is devoted to regression testing. Regression testing is an expensive and frequently executed maintenance activity used to revalidate modified software, so any reduction in its cost would help to reduce the software maintenance cost. Test suites, once developed, are reused and updated frequently as the software evolves. As a result, some test cases in the suite may become redundant when the software is modified over time, since the requirements they cover are also covered by other test cases. Due to the resource and time constraints of re-executing large test suites, it is important to develop techniques that minimize available test suites by removing redundant test cases. In general, the test suite minimization problem is NP-complete. This paper proposes an effective approach for reducing the cost of the regression testing process, applied to a real-time case study. The reduction in the cost of each regression testing cycle was found to be substantial, especially for programs containing a high number of selected statements, which in turn maximizes the benefits of using the approach in regression testing of complex software systems. The reduction in regression test suite size will reduce the effort and time required by testing teams to execute the regression test suite. Since regression testing is done frequently in the software maintenance phase, the overall software maintenance cost can be reduced considerably by applying the proposed approach.
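    Although the paper's own technique is not reproduced here, the classic greedy heuristic for the (NP-complete) minimization problem gives the flavor: repeatedly keep the test covering the most not-yet-covered requirements. A sketch with invented coverage data:

        def greedy_minimize(coverage: dict) -> list:
            """coverage maps test name -> set of requirements/statements covered."""
            remaining = set().union(*coverage.values())
            kept = []
            while remaining:
                best = max(coverage, key=lambda t: len(coverage[t] & remaining))
                if not coverage[best] & remaining:
                    break                      # nothing left is coverable
                kept.append(best)
                remaining -= coverage[best]
            return kept

        suite = {
            "t1": {"r1", "r2"},
            "t2": {"r2", "r3"},
            "t3": {"r1", "r2", "r3"},   # makes t1 and t2 redundant
            "t4": {"r4"},
        }
        print(greedy_minimize(suite))    # ['t3', 't4']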

  10. Data and Analysis Center for Software

    Science.gov (United States)

    1993-08-01

    [This scanned record is heavily OCR-damaged.] ... refinement and technology transition program for the ... software and techniques. All ... products were developed in accordance with ... provided a vehicle for software developers/reusers to make intelligent choices regarding the selection of one component over another, or selecting

  11. ORION Environmental Control and Life Support Systems Suit Loop and Pressure Control Analysis

    Science.gov (United States)

    Eckhardt, Brad; Conger, Bruce; Stambaugh, Imelda C.

    2015-01-01

    Under NASA's ORION Multi-Purpose Crew Vehicle (MPCV) Environmental Control and Life Support System (ECLSS) Project, the Crew and Thermal Systems Division at Johnson Space Center (JSC) has developed performance models of the air system using Thermal Desktop/FloCAD. The Thermal Desktop model includes an Air Revitalization System (ARS) Loop, a Suit Loop, a Cabin Loop, and a Pressure Control System (PCS) for supplying make-up gas (N2 and O2) to the Cabin and Suit Loop. The ARS and PCS are designed to maintain air quality at acceptable O2, CO2, and humidity levels, as well as internal pressures in the vehicle Cabin and during suited operations. This effort required development of a suite of Thermal Desktop Orion ECLSS models to address the need for various simulation capabilities regarding ECLSS performance. An initial, highly detailed model of the ARS Loop was developed to simulate rapid pressure transients (water hammer effects) within the ARS Loop caused by events such as cycling of the Pressure Swing Adsorption (PSA) beds; it required high temporal resolution (small time steps) during simulation. A second ECLSS model was developed to simulate events which occur over longer periods of time (over 30 minutes), where O2, CO2, and humidity levels, as well as internal pressures, needed to be monitored in the cabin and during suited operations. Stand-alone models of the PCS and the Negative Pressure Relief Valve (NPRV) were developed to study thermal effects within the PCS during emergency scenarios (cabin leak) and cabin pressurization during vehicle re-entry into Earth's atmosphere. Results from the Orion ECLSS models were used during the Orion Delta-PDR (July 2014) to address Key Design Requirements (KDRs) for Suit Loop operations for multiple mission scenarios.

  12. Theoretical and software considerations for nonlinear dynamic analysis

    Science.gov (United States)

    Schmidt, R. J.; Dodds, R. H., Jr.

    1983-01-01

    In the finite element method for structural analysis, it is generally necessary to discretize the structural model into a very large number of elements to accurately evaluate displacements, strains, and stresses. As the complexity of the model increases, the number of degrees of freedom can easily exceed the capacity of present-day software systems. Improvements to structural analysis software, including more efficient use of existing hardware and improved structural modeling techniques, are discussed. One modeling technique that is used successfully in static linear and nonlinear analysis is multilevel substructuring. This research extends the use of multilevel substructure modeling to include dynamic analysis and defines the requirements for a general-purpose software system capable of efficient nonlinear dynamic analysis. The multilevel substructuring technique is presented, the analytical formulations and computational procedures for dynamic analysis and nonlinear mechanics are reviewed, and an approach to the design and implementation of a general-purpose structural software system is presented.
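    The building block of substructuring is static condensation (Guyan reduction): interior degrees of freedom of a substructure are condensed out, leaving a reduced stiffness matrix on the boundary DOFs. A toy illustration on a 1D spring chain (the matrix is invented, not from the paper):

        import numpy as np

        def condense(K, boundary, interior):
            """Reduced boundary stiffness: Kbb - Kbi * Kii^-1 * Kib."""
            Kbb = K[np.ix_(boundary, boundary)]
            Kbi = K[np.ix_(boundary, interior)]
            Kib = K[np.ix_(interior, boundary)]
            Kii = K[np.ix_(interior, interior)]
            return Kbb - Kbi @ np.linalg.solve(Kii, Kib)

        # 3-DOF spring chain (unit stiffness), condense out the middle DOF.
        K = np.array([[ 2., -1.,  0.],
                      [-1.,  2., -1.],
                      [ 0., -1.,  2.]])
        K_red = condense(K, boundary=[0, 2], interior=[1])
        print(K_red)   # [[1.5, -0.5], [-0.5, 1.5]]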

  13. Software metrics a guide to planning, analysis, and application

    CERN Document Server

    Pandian, C Ravindranath

    2003-01-01

    Software Metrics: A Guide to Planning, Analysis, and Application simplifies software measurement and explains its value as a pragmatic tool for management. Ideas and techniques presented in this book are derived from best practices. The ideas are field-proven, down to earth, and straightforward, making this volume an invaluable resource for those striving for process improvement.

  14. Integrated analysis software for bulk power system stability

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, T.; Nagao, T.; Takahashi, K. [Central Research Inst. of Electric Power Industry, Tokyo (Japan)

    1994-12-31

    This paper presents three software packages developed by the Central Research Institute of the Electric Power Industry (CRIEPI) for bulk power network analysis, together with the user support system which arranges, easily and with high reliability, the tremendous amount of data necessary for these packages. (author) 3 refs., 7 figs., 2 tabs.

  15. An Analysis of Security and Privacy Issues in Smart Grid Software Architectures on Clouds

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Kumbhare, Alok; Cao, Baohua; Prasanna, Viktor K.

    2011-07-09

    Power utilities globally are increasingly upgrading to Smart Grids that use bi-directional communication with the consumer to enable an information-driven approach to distributed energy management. Clouds offer features well suited for Smart Grid software platforms and applications, such as elastic resources and shared services. However, the security and privacy concerns inherent in an information rich Smart Grid environment are further exacerbated by their deployment on Clouds. Here, we present an analysis of security and privacy issues in a Smart Grids software architecture operating on different Cloud environments, in the form of a taxonomy. We use the Los Angeles Smart Grid Project that is underway in the largest U.S. municipal utility to drive this analysis that will benefit both Cloud practitioners targeting Smart Grid applications, and Cloud researchers investigating security and privacy.

  16. The Einstein Suite: A Web-Based Tool for Rapid and Collaborative Engineering Design and Analysis

    Science.gov (United States)

    Palmer, Richard S.

    1997-01-01

    Taken together the components of the Einstein Suite provide two revolutionary capabilities - they have the potential to change the way engineering and financial engineering are performed by: (1) providing currently unavailable functionality, and (2) providing a 10-100 times improvement over currently available but impractical or costly functionality.

  17. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Science.gov (United States)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool proved to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.

  18. Free software for performing physical analysis of systems for digital radiography and mammography

    Energy Technology Data Exchange (ETDEWEB)

    Donini, Bruno; Lanconelli, Nico, E-mail: nico.lanconelli@unibo.it [Alma Mater Studiorum, Department of Physics and Astronomy, University of Bologna, Bologna 40127 (Italy); Rivetti, Stefano [Fisica Medica, Ospedale di Sassuolo S.p.A., Sassuolo 41049 (Italy); Bertolini, Marco [Medical Physics Unit, Azienda Ospedaliera ASMN, Istituto di Ricovero e Cura a Carattere Scientifico, Reggio Emilia 42123 (Italy)

    2014-05-15

    Purpose: In this paper, the authors present free software for assisting users in the physical characterization of x-ray digital systems and in image quality checks. Methods: The program was developed as a plugin for the well-known public-domain suite ImageJ. The software assists users in calculating various physical parameters such as the response curve (also termed the signal transfer property), the modulation transfer function (MTF), the noise power spectrum (NPS), and the detective quantum efficiency (DQE). It also includes several image quality checks: defective pixel analysis, uniformity, dark analysis, and lag. Results: The software was made available in 2009 and has been used over the last couple of years by many users, who gave us valuable feedback for improving its usability. It was tested on several clinical systems for digital radiography and mammography, and various published papers have made use of its outcomes. Conclusions: This software is potentially beneficial to a variety of users: physicists working in hospitals and staff working in radiological departments, such as medical physicists, physicians, and engineers. The plugin, together with a brief user manual, is freely available online (http://www.medphys.it/downloads.htm). With our plugin, users can estimate the three most important parameters used for physical characterization (MTF, NPS, and DQE). The plugin can run on any operating system equipped with the ImageJ suite. The authors validated the software by comparing MTF and NPS curves on a common set of images with those obtained with other dedicated programs, achieving very good agreement.
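    One of the computations such a plugin performs, the MTF, can be sketched from first principles: differentiate an edge spread function (ESF) into a line spread function (LSF) and take the normalized FFT magnitude. The synthetic edge below stands in for real edge-image data; this is not the plugin's code:

        import numpy as np

        def mtf_from_esf(esf: np.ndarray, pitch_mm: float):
            lsf = np.gradient(esf)                         # ESF -> LSF
            spectrum = np.abs(np.fft.rfft(lsf))
            mtf = spectrum / spectrum[0]                   # normalize so MTF(0) = 1
            freqs = np.fft.rfftfreq(len(lsf), d=pitch_mm)  # cycles/mm
            return freqs, mtf

        x = np.arange(-64, 64) * 0.1                       # 0.1 mm pixel pitch
        esf = 1.0 / (1.0 + np.exp(-x / 0.15))              # smooth synthetic edge
        freqs, mtf = mtf_from_esf(esf, pitch_mm=0.1)
        print(f"MTF at {freqs[10]:.2f} cycles/mm: {mtf[10]:.2f}")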

  19. Modular Open-Source Software for Item Factor Analysis

    Science.gov (United States)

    Pritikin, Joshua N.; Hunter, Micheal D.; Boker, Steven M.

    2015-01-01

    This article introduces an item factor analysis (IFA) module for "OpenMx," a free, open-source, and modular statistical modeling package that runs within the R programming environment on GNU/Linux, Mac OS X, and Microsoft Windows. The IFA module offers a novel model specification language that is well suited to programmatic generation…

  20. GammaLib and ctools. A software framework for the analysis of astronomical gamma-ray data

    Science.gov (United States)

    Knödlseder, J.; Mayer, M.; Deil, C.; Cayrou, J.-B.; Owen, E.; Kelley-Hoskins, N.; Lu, C.-C.; Buehler, R.; Forest, F.; Louge, T.; Siejkowski, H.; Kosack, K.; Gerard, L.; Schulz, A.; Martin, P.; Sanchez, D.; Ohm, S.; Hassan, T.; Brau-Nogué, S.

    2016-08-01

    The field of gamma-ray astronomy has seen important progress during the last decade, yet to date no common software framework has been developed for the scientific analysis of gamma-ray telescope data. We propose to fill this gap by means of the GammaLib software, a generic library that we have developed to support the analysis of gamma-ray event data. GammaLib was written in C++ and all functionality is available in Python through an extension module. Based on this framework we have developed the ctools software package, a suite of software tools that enables flexible workflows to be built for the analysis of Imaging Air Cherenkov Telescope event data. The ctools are inspired by science analysis software available for existing high-energy astronomy instruments, and they follow the modular ftools model developed by the High Energy Astrophysics Science Archive Research Center. The ctools were written in Python and C++, and can be either used from the command line via shell scripts or directly from Python. In this paper we present the GammaLib and ctools software versions 1.0 that were released at the end of 2015. GammaLib and ctools are ready for the science analysis of Imaging Air Cherenkov Telescope event data, and also support the analysis of Fermi-LAT data and the exploitation of the COMPTEL legacy data archive. We propose using ctools as the science tools software for the Cherenkov Telescope Array Observatory.

  1. Analysis of Variance: What Is Your Statistical Software Actually Doing?

    Science.gov (United States)

    Li, Jian; Lomax, Richard G.

    2011-01-01

    Users assume statistical software packages produce accurate results. In this article, the authors systematically examined Statistical Package for the Social Sciences (SPSS) and Statistical Analysis System (SAS) for 3 analysis of variance (ANOVA) designs, mixed-effects ANOVA, fixed-effects analysis of covariance (ANCOVA), and nested ANOVA. For each…

  2. New Graphical User Interface for EXAFS analysis with the GNXAS suite of programs

    Science.gov (United States)

    Hatada, Keisuke; Iesari, Fabio; Properzi, Leonardo; Minicucci, M.; di Cicco, Andrea

    2016-05-01

    GNXAS is a suite of programs based on multiple scattering calculations which performs structural refinement of EXAFS spectra. It can be used for any system, although it has been mainly developed to determine the local structure of disordered substances. We developed a user-friendly graphical user interface (GUI), written in wxPython, to facilitate use of the codes. The GUI and the codes are multiplatform, running on Windows, Macintosh and Linux systems, and are freely available (http://gnxas.unicam.it). In this work we illustrate the features and potential of this newly developed version of GNXAS (w-GNXAS).

  3. Research and Development on Food Nutrition Statistical Analysis Software System

    Directory of Open Access Journals (Sweden)

    Du Li

    2013-12-01

    Full Text Available Designing and developing food nutrition statistical analysis software can automate nutrition calculations, improve the working efficiency of nutrition professionals, and support the computerization of nutrition education and outreach. In the software development process, software engineering methods and database technology are used to calculate daily human nutritional intake, and an intelligent system is used to evaluate the user's health condition. Experiments show that the system can correctly evaluate human health conditions and offer reasonable suggestions, exploring a new road to solving complex nutrition computation problems with information engineering.
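    The daily-intake calculation described above is, at its core, a dose-weighted sum over a food composition database. A minimal sketch, with hypothetical database entries and nutrient fields, might look like this:

        # hypothetical nutrient content per 100 g of each food
        FOOD_DB = {
            "rice": {"energy_kcal": 130.0, "protein_g": 2.7},
            "tofu": {"energy_kcal": 76.0, "protein_g": 8.1},
        }

        def daily_intake(consumed):
            # consumed: list of (food_name, grams_eaten) pairs for one day
            totals = {}
            for food, grams in consumed:
                for nutrient, per_100g in FOOD_DB[food].items():
                    totals[nutrient] = totals.get(nutrient, 0.0) + per_100g * grams / 100.0
            return totals

        print(daily_intake([("rice", 250), ("tofu", 150)]))
        # {'energy_kcal': 439.0, 'protein_g': 18.9}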

  4. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report describes software safety analysis techniques and engineering guidelines for developing safety critical software, in order to identify the state of the art in this field and to give the software safety engineer a trail map between the codes-and-standards layer and the design-methodology-and-documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures (CMFs), high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity analysis guidelines.

  5. The RUMBA software: tools for neuroimaging data analysis.

    Science.gov (United States)

    Bly, Benjamin Martin; Rebbechi, Donovan; Hanson, Stephen Jose; Grasso, Giorgio

    2004-01-01

    The enormous scale and complexity of data sets in functional neuroimaging make it crucial to have well-designed and flexible software for image processing, modeling, and statistical analysis. At present, researchers must choose between general purpose scientific computing environments (e.g., Splus and Matlab), and specialized human brain mapping packages that implement particular analysis strategies (e.g., AFNI, SPM, VoxBo, FSL or FIASCO). For the vast majority of users in Human Brain Mapping and Cognitive Neuroscience, general purpose computing environments provide an insufficient framework for a complex data-analysis regime. On the other hand, the operational particulars of more specialized neuroimaging analysis packages are difficult or impossible to modify, and they provide little transparency or flexibility to the user for approaches other than massively multiple comparisons based on inferential statistics derived from linear models. In order to address these problems, we have developed open-source software that allows a wide array of data analysis procedures. The RUMBA software includes programming tools that simplify the development of novel methods, and it accommodates data in several standard image formats. A scripting interface, along with programming libraries, defines a number of useful analytic procedures and provides an interface to data analysis procedures. The software also supports a graphical functional programming environment for implementing data analysis streams based on modular functional components. With these features, the RUMBA software provides researchers with programmability, reusability, modular analysis tools, novel data analysis streams, and an analysis environment in which multiple approaches can be contrasted and compared. The RUMBA software retains the flexibility of general scientific computing environments while adding a framework in which both experts and novices can develop and adapt neuroimaging-specific analyses.

  6. JEM-X science analysis software

    DEFF Research Database (Denmark)

    Westergaard, Niels Jørgen Stenfeldt; Kretschmar, P.; Oxborrow, Carol Anne

    2003-01-01

    The science analysis of the data from JEM-X on INTEGRAL is performed through a number of levels including corrections, good time selection, imaging and source finding, spectrum and light-curve extraction. These levels consist of individual executables and the running of the complete analysis is c...

  7. Power Analysis Software for Educational Researchers

    Science.gov (United States)

    Peng, Chao-Ying Joanne; Long, Haiying; Abaci, Serdar

    2012-01-01

    Given the importance of statistical power analysis in quantitative research and the repeated emphasis on it by American Educational Research Association/American Psychological Association journals, the authors examined the reporting practice of power analysis by the quantitative studies published in 12 education/psychology journals between 2005…

  8. Establishment of the Data and Analysis Center for Software (DACS).

    Science.gov (United States)

    1982-01-01

    This effort resulted in the development of a set of report generation, graphical, and data analysis software packages which will be utilized to provide synthesized bibliographic database information, custom searches, and state-of-the-art (SOA) reports on software topics.

  9. Strategic Analysis of a Video Compression Software Project

    OpenAIRE

    Bai, Chun Jung Rosalind

    2008-01-01

    The objective of this project is to develop a strategic recommendation for market entry of the Client's new software product, based on a breakthrough predictive-decoding technology. The analysis examines the videoconferencing market and reveals a strong demand for software products that can reduce delays in interactive video communications while maintaining reasonable video quality. The evaluation of the key external competitive forces suggests that the market has low intensity o...

  10. A "Toolbox" Equivalent Process for Safety Analysis Software

    Energy Technology Data Exchange (ETDEWEB)

    O'Kula, K.R.

    2004-04-30

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 (Quality Assurance for Safety-Related Software) identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls that prevent or mitigate potential accidents. The development and maintenance of a collection, or "toolbox," of multiple-site use, standard solution, Software Quality Assurance (SQA)-compliant safety software is one of the major improvements identified in the associated DOE Implementation Plan (IP). The DOE safety analysis toolbox will contain a set of appropriately quality-assured, configuration-controlled safety analysis codes, recognized for DOE-wide safety basis applications. Currently, six widely applied safety analysis computer codes have been designated for toolbox consideration. While the toolbox concept considerably reduces SQA burdens among DOE users of these codes, many users of unique, single-purpose, or single-site software may still have sufficient technical justification to continue use of their computer code of choice, but are thwarted by the multiple-site condition on toolbox candidate software. The process discussed here provides a roadmap for an equivalency argument, i.e., establishing satisfactory SQA credentials for single-site software that can be deemed "toolbox-equivalent." The process is based on the model established to meet IP Commitment 4.2.1.2: establish SQA criteria for the safety analysis "toolbox" codes. Implementing criteria that establish the set of prescriptive SQA requirements are based on implementation plans/procedures from the Savannah River Site, also incorporating aspects of those from the Waste Isolation Pilot Plant (SNL component) and the Yucca Mountain Project. The major requirements are met with evidence of a software quality assurance plan, software requirements and

  11. Do You Need ERP? In the Business World, Enterprise Resource Planning Software Keeps Costs down and Productivity up. Should Districts Follow Suit?

    Science.gov (United States)

    Careless, James

    2007-01-01

    Enterprise resource planning (ERP) software does what school leaders have always wanted their computer systems to do: it sees all. By integrating every IT application an organization has--from purchasing and inventory control to payroll--ERPs create a single unified system. Not only does this give IT managers a holistic view of what is happening…

  12. Adapted wavelet analysis from theory to software

    CERN Document Server

    Wickerhauser, Mladen Victor

    1994-01-01

    This detail-oriented text is intended for engineers and applied mathematicians who must write computer programs to perform wavelet and related analysis on real data. It contains an overview of mathematical prerequisites and proceeds to describe hands-on programming techniques to implement special programs for signal analysis and other applications. From the table of contents: - Mathematical Preliminaries - Programming Techniques - The Discrete Fourier Transform - Local Trigonometric Transforms - Quadrature Filters - The Discrete Wavelet Transform - Wavelet Packets - The Best Basis Algorithm - Multidimensional Library Trees - Time-Frequency Analysis - Some Applications - Solutions to Some of the Exercises - List of Symbols - Quadrature Filter Coefficients

  13. Development of output user interface software to support analysis

    Science.gov (United States)

    Wahanani, Nursinta Adi; Natsir, Khairina; Hartini, Entin

    2014-09-01

    Data processing software packages such as VSOP and MCNPX are scientifically proven and complete, but their results are huge, complex text files. In the analysis process, users need additional tools such as Microsoft Excel to present informative results. This research develops user interface software for the output of VSOP and MCNPX: VSOP program output is used to support neutronic analysis, and MCNPX program output is used to support burn-up analysis. The software was developed using iterative development methods, which allow for revision and addition of features according to user needs. Processing time with this software is 500 times faster than with conventional methods using Microsoft Excel. Python is used as the programming language because it is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac, and Amiga, among others. The values supporting neutronic analysis are k-eff, burn-up, and the masses of Pu-239 and Pu-241. Burn-up analysis uses the mass inventory values of the actinides (thorium, plutonium, neptunium, and uranium). The values are visualized graphically to support analysis.

  14. Development of output user interface software to support analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wahanani, Nursinta Adi, E-mail: sintaadi@batan.go.id; Natsir, Khairina, E-mail: sintaadi@batan.go.id; Hartini, Entin, E-mail: sintaadi@batan.go.id [Center for Development of Nuclear Informatics - National Nuclear Energy Agency, PUSPIPTEK, Serpong, Tangerang, Banten (Indonesia)

    2014-09-30

    Data processing software packages such as VSOP and MCNPX are scientifically proven and complete, but their results are huge, complex text files. In the analysis process, users need additional tools such as Microsoft Excel to present informative results. This research develops user interface software for the output of VSOP and MCNPX: VSOP program output is used to support neutronic analysis, and MCNPX program output is used to support burn-up analysis. The software was developed using iterative development methods, which allow for revision and addition of features according to user needs. Processing time with this software is 500 times faster than with conventional methods using Microsoft Excel. Python is used as the programming language because it is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac, and Amiga, among others. The values supporting neutronic analysis are k-eff, burn-up, and the masses of Pu-239 and Pu-241. Burn-up analysis uses the mass inventory values of the actinides (thorium, plutonium, neptunium, and uranium). The values are visualized graphically to support analysis.
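    As a flavor of the post-processing involved, the sketch below extracts k-eff values from a large text output file and plots them against burn-up step. The line format matched by the regular expression is hypothetical, not the actual VSOP or MCNPX output syntax.

        import re
        import matplotlib.pyplot as plt

        # hypothetical output lines of the form "keff = 1.00123", one per burn-up step
        PATTERN = re.compile(r"keff\s*=\s*([0-9.]+)")

        def extract_keff(path):
            values = []
            with open(path) as fh:
                for line in fh:
                    match = PATTERN.search(line)
                    if match:
                        values.append(float(match.group(1)))
            return values

        keff = extract_keff("mcnpx_output.txt")   # hypothetical file name
        plt.plot(keff, marker="o")
        plt.xlabel("burn-up step")
        plt.ylabel("k-eff")
        plt.show()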

  15. QSoas: A Versatile Software for Data Analysis.

    Science.gov (United States)

    Fourmond, Vincent

    2016-05-17

    Undoubtedly, the most natural way to confirm a model is to quantitatively verify its predictions. However, this is not done systematically, and one reason is the lack of appropriate tools for analyzing data: existing tools either do not implement the required models or lack the flexibility needed to perform data analysis in a reasonable time. We present QSoas, an open-source, cross-platform data analysis program written to overcome these problems. In addition to standard data analysis procedures and full automation using scripts, QSoas features a very powerful data fitting interface with support for arbitrary functions, integration of differential equations and kinetic systems, and flexible global fits. QSoas is available from http://www.qsoas.org.

  16. Accuracy of 3D Imaging Software in Cephalometric Analysis

    Science.gov (United States)

    2013-06-21

    This study evaluated the accuracy of a 3D orthodontic software program (Dolphin 3D) used for the measurement and analysis of craniofacial dimensions from three-dimensional reconstructions.

  17. Software Quality Attribute Analysis by Architecture Reconstruction (SQUA3RE)

    NARCIS (Netherlands)

    Stormer, C.

    2007-01-01

    Software Quality Attribute Analysis by Architecture Reconstruction (SQUA3RE) is a method that fosters a goal-driven process to evaluate the impact of what-if scenarios on existing systems. The method is partitioned into SQA2 and ARE. The SQA2 part provides the analysis models that can be used for q

  18. Buying in to bioinformatics: an introduction to commercial sequence analysis software.

    Science.gov (United States)

    Smith, David Roy

    2015-07-01

    Advancements in high-throughput nucleotide sequencing techniques have brought with them state-of-the-art bioinformatics programs and software packages. Given the importance of molecular sequence data in contemporary life science research, these software suites are becoming an essential component of many labs and classrooms, and as such are frequently designed for non-computer specialists and marketed as one-stop bioinformatics toolkits. Although beautifully designed and powerful, user-friendly bioinformatics packages can be expensive and, as more arrive on the market each year, it can be difficult for researchers, teachers and students to choose the right software for their needs, especially if they do not have a bioinformatics background. This review highlights some of the currently available and most popular commercial bioinformatics packages, discussing their prices, usability, features and suitability for teaching. Although several commercial bioinformatics programs are arguably overpriced and overhyped, many are well designed, sophisticated and, in my opinion, worth the investment. Whether you are just beginning your foray into molecular sequence analysis or are an experienced genomicist, I encourage you to explore proprietary software bundles. They have the potential to streamline your research, increase your productivity, energize your classroom and, if anything, add a bit of zest to the often dry, detached world of bioinformatics.

  19. Analysis on testing and operational reliability of software

    Institute of Scientific and Technical Information of China (English)

    ZHAO Jing; LIU Hong-wei; CUI Gang; WANG Hui-qiang

    2008-01-01

    Software reliability was estimated based on NHPP (non-homogeneous Poisson process) software reliability growth models. Testing reliability and operational reliability may be essentially different. On the basis of analyzing the similarities and differences between the testing phase and the operational phase, and using the concepts of operational reliability and testing reliability, different forms of comparison between the operational failure ratio and the predicted testing failure ratio were conducted, and a detailed mathematical discussion and analysis were performed. Finally, optimal software release was studied using software failure data. The results show that two kinds of conclusions can be derived by applying this method: one is to continue testing to meet the reliability level required by users; the other is to stop testing once the required operational reliability is met, so that the testing cost can be reduced.
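    For concreteness, the sketch below fits one common NHPP model, the Goel-Okumoto model with mean value function m(t) = a(1 - exp(-bt)), to hypothetical failure times by maximum likelihood; the paper's own model variant and data may differ.

        import numpy as np
        from scipy.optimize import minimize

        def mean_value(t, a, b):
            # expected cumulative number of failures by time t (Goel-Okumoto)
            return a * (1.0 - np.exp(-b * t))

        def neg_log_likelihood(params, times):
            a, b = params
            if a <= 0.0 or b <= 0.0:
                return np.inf
            # NHPP log-likelihood: sum(log lambda(t_i)) - m(T), lambda(t) = a*b*exp(-b*t)
            intensity = a * b * np.exp(-b * times)
            return -(np.sum(np.log(intensity)) - mean_value(times[-1], a, b))

        times = np.array([10.0, 25.0, 43.0, 70.0, 106.0, 150.0, 204.0, 270.0])
        fit = minimize(neg_log_likelihood, x0=[10.0, 0.01], args=(times,),
                       method="Nelder-Mead")
        a_hat, b_hat = fit.x
        # predicted testing failure intensity at the end of the observation window
        print(a_hat * b_hat * np.exp(-b_hat * times[-1]))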

  20. IFDOTMETER: A New Software Application for Automated Immunofluorescence Analysis.

    Science.gov (United States)

    Rodríguez-Arribas, Mario; Pizarro-Estrella, Elisa; Gómez-Sánchez, Rubén; Yakhine-Diop, S M S; Gragera-Hidalgo, Antonio; Cristo, Alejandro; Bravo-San Pedro, Jose M; González-Polo, Rosa A; Fuentes, José M

    2016-04-01

    Most laboratories interested in autophagy use different imaging software for managing and analyzing heterogeneous parameters in immunofluorescence experiments (e.g., LC3-puncta quantification and determination of the number and size of lysosomes). One solution would be software that works on a user's laptop or workstation, can access all image settings, and provides quick and easy-to-use analysis of the data. Thus, we have designed and implemented an application called IFDOTMETER, which can run on all major operating systems because it has been programmed in Java (Sun Microsystems). Briefly, the IFDOTMETER software has been created to quantify a variety of biological hallmarks, including mitochondrial morphology and nuclear condensation. The program interface is intuitive and user-friendly, making it useful for users not familiar with image analysis software. By setting previously defined parameters, the software can automatically analyze a large number of images without the supervision of the researcher. Once analysis is complete, the results are stored in a spreadsheet. Using software for high-throughput cell image analysis offers researchers the possibility of performing comprehensive and precise analysis of a large number of images in an automated manner, making this routine task easier.

  1. WHIPPET: a collaborative software environment for medical image processing and analysis

    Science.gov (United States)

    Hu, Yangqiu; Haynor, David R.; Maravilla, Kenneth R.

    2007-03-01

    While there are many publicly available software packages for medical image processing, making them available to end users in clinical and research labs remains non-trivial. An even more challenging task is to mix these packages to form pipelines that meet specific needs seamlessly, because each piece of software usually has its own input/output formats, parameter sets, and so on. To address these issues, we are building WHIPPET (Washington Heterogeneous Image Processing Pipeline EnvironmenT), a collaborative platform for integrating image analysis tools from different sources. The central idea is to develop a set of Python scripts which glue the different packages together and make it possible to connect them in processing pipelines. To achieve this, an analysis is carried out for each candidate package for WHIPPET, describing input/output formats, parameters, ROI description methods, scripting and extensibility and classifying its compatibility with other WHIPPET components as image file level, scripting level, function extension level, or source code level. We then identify components that can be connected in a pipeline directly via image format conversion. We set up a TWiki server for web-based collaboration so that component analysis and task request can be performed online, as well as project tracking, knowledge base management, and technical support. Currently WHIPPET includes the FSL, MIPAV, FreeSurfer, BrainSuite, Measure, DTIQuery, and 3D Slicer software packages, and is expanding. Users have identified several needed task modules and we report on their implementation.

  2. Power Analysis Tutorial for Experimental Design Software

    Science.gov (United States)

    2014-11-01

    This tutorial covers statistical power analysis for experimental design software. It includes a JMP Monte Carlo simulation script (Appendix E), and notes that in Design-Expert, when constructing a design, the user is asked for delta and sigma for the default power analysis model.

  3. The software analysis project for the Office of Human Resources

    Science.gov (United States)

    Tureman, Robert L., Jr.

    1994-01-01

    There were two major sections of the project for the Office of Human Resources (OHR). The first section was a planning study to analyze software use, with the goals of recommending software purchases and determining whether the need exists for a file server. The second section was analysis and distribution planning for a retirement planning computer program entitled VISION, provided by NASA Headquarters. The software planning study was developed to help OHR analyze the current administrative desktop computing environment and make decisions regarding software acquisition and implementation. Three major areas were addressed by the study: the current environment, new software requirements, and strategies regarding the implementation of a server in the Office. To gather data on the current environment, employees were surveyed and an inventory of computers was produced. The surveys were compiled and analyzed by the ASEE fellow, with interpretation help from OHR staff. New software requirements represented a compilation and analysis of the surveyed requests of OHR personnel. Finally, the information on the use of a server represents research done by the ASEE fellow and analysis of survey data to determine software requirements for a server. This included selection of a methodology to estimate the number of copies of each software program required, given current use and estimated growth. The report presents the results of the computing survey, a description of the current computing environment, recommendations for changes in the computing environment, current software needs, management advantages of using a server, and management considerations in the implementation of a server. In addition, detailed specifications were presented for the hardware and software recommendations to offer a complete picture to OHR management. The retirement planning computer program available to NASA employees will aid in long-range retirement planning. The intended audience is the NASA civil

  4. Software for Data Analysis Programming with R

    CERN Document Server

    Chambers, John

    2008-01-01

    Although statistical design is one of the oldest branches of statistics, its importance is ever increasing, especially in the face of the data flood that often faces statisticians. It is important to recognize the appropriate design, and to understand how to effectively implement it, being aware that the default settings from a computer package can easily provide an incorrect analysis. The goal of this book is to describe the principles that drive good design, paying attention to both the theoretical background and the problems arising from real experimental situations. Designs are motivated t

  5. Applications of the BEam Cross section Analysis Software (BECAS)

    DEFF Research Database (Denmark)

    Blasques, José Pedro Albergaria Amaral; Bitsche, Robert; Fedorov, Vladimir;

    2013-01-01

    A newly developed framework is presented for structural design and analysis of long slender beam-like structures, e.g., wind turbine blades. The framework is based on the BEam Cross section Analysis Software – BECAS – a finite element based cross section analysis tool. BECAS is used for the generation of beam finite element models which correctly account for effects stemming from material anisotropy and inhomogeneity in cross sections of arbitrary geometry. This type of modelling approach allows for an accurate yet computationally inexpensive representation of a general class of three…

  6. FIRE: an open-software suite for real-time 2D/3D image registration for image guided radiotherapy research

    Science.gov (United States)

    Furtado, H.; Gendrin, C.; Spoerk, J.; Steiner, E.; Underwood, T.; Kuenzler, T.; Georg, D.; Birkfellner, W.

    2016-03-01

    Radiotherapy treatments have changed at a tremendously rapid pace. Dose delivered to the tumor has escalated while organs at risk (OARs) are better spared. The impact of moving tumors during dose delivery has become higher due to very steep dose gradients. Intra-fractional tumor motion has to be managed adequately to reduce errors in dose delivery. For tumors with large motion such as tumors in the lung, tracking is an approach that can reduce position uncertainty. Tumor tracking approaches range from purely image intensity based techniques to motion estimation based on surrogate tracking. Research efforts are often based on custom designed software platforms which take too much time and effort to develop. To address this challenge we have developed an open software platform especially focusing on tumor motion management. FLIRT is a freely available open-source software platform. The core method for tumor tracking is purely intensity based 2D/3D registration. The platform is written in C++ using the Qt framework for the user interface. The performance critical methods are implemented on the graphics processor using the CUDA extension. One registration can be as fast as 90ms (11Hz). This is suitable to track tumors moving due to respiration (~0.3Hz) or heartbeat (~1Hz). Apart from focusing on high performance, the platform is designed to be flexible and easy to use. Current use cases range from tracking feasibility studies, patient positioning and method validation. Such a framework has the potential of enabling the research community to rapidly perform patient studies or try new methods.
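    The similarity metric driving the 2D/3D registration is not specified above; normalized cross-correlation is one common intensity-based choice, sketched here in NumPy (the platform's GPU/CUDA implementation of such measures is what makes the 90 ms timing achievable).

        import numpy as np

        def ncc(drr, xray):
            # normalized cross-correlation between a simulated projection (DRR)
            # rendered from the 3D volume and the live 2D X-ray image;
            # 1.0 indicates a perfect intensity match
            a = drr.astype(float) - drr.mean()
            b = xray.astype(float) - xray.mean()
            denom = np.sqrt((a * a).sum() * (b * b).sum())
            return float((a * b).sum() / denom) if denom > 0.0 else 0.0

    Registration then amounts to searching over the rigid pose used to render the DRR for the pose that maximizes this score.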

  7. Availability Analysis of Application Servers Using Software Rejuvenation and Virtualization

    Institute of Scientific and Technical Information of China (English)

    Thandar Thein; Jong Sou Park

    2009-01-01

    Demands on software reliability and availability have increased tremendously due to the nature of present-day applications. We focus on the software aspect of high availability for application servers, since server unavailability more often originates from software faults than from hardware faults. The software rejuvenation technique has been widely used to avoid the occurrence of unplanned failures, mainly due to the phenomena of software aging or transient failures. In this paper, we first present a new way of using virtual machine based software rejuvenation, named VMSR, to offer high availability for application server systems. Second, we model a single physical server which is used to host multiple virtual machines (VMs) with the VMSR framework using stochastic modeling, and evaluate it through both numerical analysis and simulation with the SHARPE (Symbolic Hierarchical Automated Reliability and Performance Evaluator) tool. This VMSR model is very general and can capture application server characteristics, failure behavior, and performability measures. Our results demonstrate that the VMSR approach is a practical way to ensure uninterrupted availability and to optimize performance for aging applications.

  8. The BEPCII Data Production and BESIII Offline Analysis Software System

    Institute of Scientific and Technical Information of China (English)

    Zepu Mao

    2001-01-01

    The BES detector has operated for about 12 years, and the BES offline data analysis environment has been developed and upgraded along with the developments of the BES hardware and software. The BESIII software system will operate for many years, so it should keep up with new developments in software technology and be highly flexible, powerful, stable, and easy to maintain. The following points should be taken into account: 1) to benefit the collaboration and improve exchanges with international HEP experiments, the system should be set up by adopting or referring to the newest software technology from advanced experiments around the world; 2) it should support the hundreds of existing BES software packages and serve both experts who are familiar with the BESII software and computing environment and new members who will benefit from the new system; 3) most of the existing BESII packages will be modified or re-designed according to the hardware changes.

  9. Control and analysis software for a laser scanning microdensitometer

    Indian Academy of Sciences (India)

    H R Bundel; C P Navathe; P A Naik; P D Gupta

    2006-02-01

    A PC-based control software and data acquisition system has been developed for an existing commercial microdensitometer (Biomed make, model No. SL-2D/1D UV/VIS) to facilitate scanning and analysis of X-ray films. The software, developed in LabVIEW, includes operation of the microdensitometer in 1D and 2D scans and analysis of spatial or spectral data on X-ray films, such as optical density, intensity, and wavelength. It provides a user-friendly graphical user interface (GUI) to analyse the scanned data and to store the analysed data and images in popular formats such as Excel for data and JPEG for images. It also has an on-line calibration facility with standard optical density tablets. The control software and data acquisition system is simple, inexpensive, and versatile.

  10. RAVEN, a New Software for Dynamic Risk Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cristian Rabiti; Andrea Alfonsi; Joshua Cogliati; Diego Mandelli; Robert Kinoshita

    2014-06-01

    RAVEN is a generic software driver to perform parametric and probabilistic analysis of codes simulating complex systems. Initially developed to provide dynamic risk analysis capabilities to the RELAP-7 code [1], it is currently being generalized with the addition of Application Programming Interfaces (APIs). These interfaces are used to extend RAVEN's capabilities to any software, as long as all the parameters that need to be perturbed are accessible through input files or directly via Python interfaces. RAVEN is capable of investigating the system response by probing the input space using Monte Carlo, grid, or Latin Hypercube sampling schemes, but its strength lies in system feature discovery, such as finding limit surfaces that separate regions of the input space leading to system failure, using dynamic supervised learning techniques. The paper presents an overview of the software capabilities and their implementation schemes, followed by some application examples.
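    Of the sampling strategies mentioned, Latin Hypercube sampling is the least obvious to implement; a minimal NumPy sketch, independent of RAVEN's actual implementation, is:

        import numpy as np

        def latin_hypercube(n_samples, n_dims, seed=None):
            # one stratum per sample in each dimension, permuted independently,
            # so every 1-D projection of the design is evenly stratified
            rng = np.random.default_rng(seed)
            design = np.empty((n_samples, n_dims))
            for d in range(n_dims):
                strata = (np.arange(n_samples) + rng.random(n_samples)) / n_samples
                design[:, d] = rng.permutation(strata)
            return design

        # e.g. 8 perturbation points for two uncertain input parameters in [0, 1)
        print(latin_hypercube(8, 2, seed=42))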

  11. Analysis of Gumbel Model for Software Reliability Using Bayesian Paradigm

    Directory of Open Access Journals (Sweden)

    Raj Kumar

    2012-12-01

    Full Text Available In this paper, we illustrate the suitability of the Gumbel model for software reliability data. The model parameters are estimated using likelihood-based inferential procedures: classical as well as Bayesian. The quasi Newton-Raphson algorithm is applied to obtain the maximum likelihood estimates and associated probability intervals. The Bayesian estimates of the parameters of the Gumbel model are obtained using the Markov Chain Monte Carlo (MCMC) simulation method in OpenBUGS (established software for Bayesian analysis using Markov Chain Monte Carlo methods). R functions are developed to study the statistical properties of the model, to provide model validation and comparison tools, and to analyze the MCMC samples generated from OpenBUGS. Details of applying MCMC to parameter estimation for the Gumbel model are elaborated, and a real software reliability data set is considered to illustrate the methods of inference discussed in this paper.
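    As an illustration of the classical (maximum likelihood) side of such an analysis, the sketch below fits a Gumbel distribution to hypothetical failure-time data with SciPy; the paper itself uses a quasi Newton-Raphson implementation and OpenBUGS for the Bayesian estimates.

        import numpy as np
        from scipy import stats

        # hypothetical times between failures (hours) from a software test log
        times = np.array([4.8, 9.5, 12.1, 20.3, 27.9, 33.0, 41.2, 55.7])

        # maximum likelihood estimates of the Gumbel location and scale parameters
        loc_hat, scale_hat = stats.gumbel_r.fit(times)

        # e.g. estimated reliability at t = 50 h: R(t) = 1 - F(t)
        print(1.0 - stats.gumbel_r.cdf(50.0, loc=loc_hat, scale=scale_hat))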

  12. An Overview of the XGAM Code and Related Software for Gamma-ray Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Younes, W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-11-13

    The XGAM spectrum-fitting code and associated software were developed specifically to analyze the complex gamma-ray spectra that can result from neutron-induced reactions. The XGAM code is designed to fit a spectrum over the entire available gamma-ray energy range as a single entity, in contrast to the more traditional piecewise approaches. This global-fit philosophy enforces background continuity as well as consistency between local and global behavior throughout the spectrum, and in a natural way. This report presents XGAM and the suite of programs built around it with an emphasis on how they fit into an overall analysis methodology for complex gamma-ray data. An application to the analysis of time-dependent delayed gamma-ray yields from 235U fission is shown in order to showcase the codes and how they interact.

  13. Orbiter subsystem hardware/software interaction analysis. Volume 8: AFT reaction control system, part 2

    Science.gov (United States)

    Becker, D. D.

    1980-01-01

    The orbiter subsystems and interfacing program elements which interact with the orbiter computer flight software are analyzed. The failure modes identified in the subsystem/element failure mode and effects analysis are examined. Potential interaction with the software is examined through an evaluation of the software requirements. The analysis is restricted to flight software requirements and excludes utility/checkout software. The results of the hardware/software interaction analysis for the forward reaction control system are presented.

  14. Software for analysis of equine ground reaction force data

    NARCIS (Netherlands)

    Schamhardt, H.C.; Merkens, H.W.; Lammertink, J.L.M.A.

    1986-01-01

    Software for analysis of force plate recordings of the horse at the normal walk is described. The data from a number of stance phases are averaged to obtain a representative tracing for that horse. The amplitudes of a number of characteristic peaks in the force-time curves are used to compare left and right

  15. Comparative Analysis and Evaluation of Existing Risk Management Software

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available This article focuses on the specific features of existing software packages for risk management, differentiating three categories. As representatives of these categories we consider the Crystal Ball, Haufe Risikomanager, and MIS Risk Management solutions, outlining their strengths and weaknesses in a comparative analysis.

  16. PROTEINCHALLENGE: Crowd sourcing in proteomics analysis and software development

    DEFF Research Database (Denmark)

    Martin, Sarah F.; Falkenberg, Heiner; Dyrlund, Thomas Franck;

    2013-01-01

    , including arguments for community-wide open source software development and "big data" compatible solutions for the future. In the meantime, we have laid out ten top tips for data processing. With these at hand, a first large-scale proteomics analysis hopefully becomes less daunting to navigate...

  17. A Knowledge-based Environment for Software Process Performance Analysis

    Directory of Open Access Journals (Sweden)

    Natália Chaves Lessa Schots

    2015-08-01

    Full Text Available Background: Process performance analysis is a key step for implementing continuous improvement in software organizations. However, the knowledge needed to execute such analysis is not trivial, and the person responsible for executing it must be provided with appropriate support. Aim: This paper presents a knowledge-based environment, named SPEAKER, proposed for supporting software organizations during the execution of process performance analysis. SPEAKER comprises a body of knowledge and a set of activities and tasks for software process performance analysis, along with supporting tools for executing these activities and tasks. Method: We conducted an informal literature review and a systematic mapping study, which provided the basic requirements for the proposed environment. We implemented the SPEAKER environment by integrating supporting tools for the execution of performance analysis activities and tasks with the knowledge necessary to execute them, in order to meet the variability presented by the characteristics of these activities. Results: In this paper, we describe each SPEAKER module and the individual evaluations of these modules, and we also present an example of use showing how the environment can guide the user through a specific performance analysis activity. Conclusion: Although we have only conducted individual evaluations of SPEAKER's modules, the example of use indicates the feasibility of the proposed environment. The environment as a whole will therefore be further evaluated to verify whether it attains its goal of assisting non-specialists in the execution of process performance analysis.

  18. Software"Visual Image"for classical women jackets and men suits reconstruction and recognition%经典男女正装的重建与识别的"视觉形象"软件

    Institute of Scientific and Technical Information of China (English)

    Ye Hongguang; Chen Mingzhen; Victor Kuzmichev

    2008-01-01

    "视觉形象"这种新软件其功能是帮助重建20世纪五十年代至21世纪初男女正装图片中地实际尺寸和其所属年代风格的对应.该软件的开发是在研究了大量地时尚杂志地原始图片及服装企业所使用地板型基础上建立地数据库.该程序可以作为独立地计算机软件或服装CAD系统其中一个应用模块.%New software named"Visual Image"allows to reconstruct real sizes of women jackets and men suits have been taken from their photos and to establish the possible time(1950th, 1960th, 1970th. 1980th, 1990th,2000th)when the style was created.Software is consisted the original data bases obtained after exploration a lot of photos from fashion magazines and pattern block for cutting used by industrial enterprises.The program can be used as independent software or as an integrated module in CAD.

  19. Array2BIO: A Comprehensive Suite of Utilities for the Analysis of Microarray Data

    Energy Technology Data Exchange (ETDEWEB)

    Loots, G G; Chain, P G; Mabery, S; Rasley, A; Garcia, E; Ovcharenko, I

    2006-02-13

    We have developed an integrative and automated toolkit for the analysis of Affymetrix microarray data, named Array2BIO. It identifies groups of coexpressed genes using two complementary approaches: comparative analysis of signal versus control microarrays, and clustering analysis of gene expression across different conditions. The identified genes are assigned to functional categories based on the Gene Ontology classification, with detection of the corresponding KEGG protein interaction pathways. Array2BIO reliably handles low-expressor genes and provides a set of statistical methods to quantify the odds of observations, including the Benjamini-Hochberg and Bonferroni multiple testing corrections. An automated interface with the ECR Browser provides evolutionary conservation analysis of identified gene loci, while the interconnection with Creme allows high-throughput analysis of human promoter regions and prediction of the gene regulatory elements that underlie the observed expression patterns. Array2BIO is publicly available at http://array2bio.dcode.org.
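    Of the two corrections named above, the Benjamini-Hochberg procedure is the less familiar; a minimal step-up implementation, independent of Array2BIO's own code, is:

        import numpy as np

        def benjamini_hochberg(pvals, alpha=0.05):
            # step-up FDR control: reject all hypotheses up to the largest k
            # satisfying p_(k) <= (k / m) * alpha; returns a boolean rejection mask
            p = np.asarray(pvals, dtype=float)
            m = p.size
            order = np.argsort(p)
            below = p[order] <= (np.arange(1, m + 1) / m) * alpha
            rejected = np.zeros(m, dtype=bool)
            if below.any():
                k = np.max(np.nonzero(below)[0])
                rejected[order[: k + 1]] = True
            return rejected

        print(benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.60], alpha=0.05))
        # [ True  True False False False]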

  20. One-Click Data Analysis Software for Science Operations

    Science.gov (United States)

    Navarro, Vicente

    2015-12-01

    One of the important activities of the ESA Science Operations Centre is to provide Data Analysis Software (DAS) to enable users and scientists to process data further to higher levels. During operations and post-operations, Data Analysis Software (DAS) is fully maintained and updated for new OS and library releases. Nonetheless, once a mission goes into the "legacy" phase, funds become very limited and long-term preservation becomes more and more difficult. Building on Virtual Machine (VM), cloud computing, and Software as a Service (SaaS) technologies, this project has aimed at providing long-term preservation of Data Analysis Software for the following missions: PIA for ISO (1995), SAS for XMM-Newton (1999), Hipe for Herschel (2009), and EXIA for EXOSAT (1983). The following goals have guided the architecture: support for all operations, post-operations and archive/legacy phases; support for local (user's computer) and cloud environments (ESAC-Cloud, Amazon AWS); support for expert users, requiring full capabilities; and provision of a simple web-based interface. This talk describes the architecture, challenges, results, and lessons learnt in this project.

  1. Software and codes for analysis of concentrating solar power technologies.

    Energy Technology Data Exchange (ETDEWEB)

    Ho, Clifford Kuofei

    2008-12-01

    This report presents a review and evaluation of software and codes that have been used to support Sandia National Laboratories' concentrating solar power (CSP) program. Additional software packages developed by other institutions and companies that could potentially improve Sandia's analysis capabilities in the CSP program are also evaluated. The software and codes are grouped according to specific CSP technologies: power tower systems, linear concentrator systems, and dish/engine systems. A description of each code is presented for each specific CSP technology, along with details regarding availability, maintenance, and references. A summary of all the codes is then presented, with recommendations regarding their use and retention. A description of probabilistic methods for uncertainty and sensitivity analyses of concentrating solar power technologies is also provided.

  2. GammaLib and ctools: A software framework for the analysis of astronomical gamma-ray data

    CERN Document Server

    Knödlseder, J; Deil, C; Cayrou, J -B; Owen, E; Kelley-Hoskins, N; Lu, C -C; Buehler, R; Forest, F; Louge, T; Siejkowski, H; Kosack, K; Gerard, L; Schulz, A; Martin, P; Sanchez, D; Ohm, S; Hassan, T; Brau-Nogué, S

    2016-01-01

    The field of gamma-ray astronomy has seen important progress during the last decade, yet so far no common software framework exists for the scientific analysis of gamma-ray telescope data. We propose to fill this gap by means of the GammaLib software, a generic library that we have developed to support the analysis of gamma-ray event data. GammaLib has been written in C++ and all functionality is available in Python through an extension module. On top of this framework we have developed the ctools software package, a suite of software tools that enables flexible workflows to be built for the analysis of Imaging Air Cherenkov Telescope event data. The ctools are inspired by science analysis software available for existing high-energy astronomy instruments, and they follow the modular ftools model developed by the High Energy Astrophysics Science Archive Research Center. The ctools have been written in Python and C++, and can be either used from the command line, via shell scripts, or directly from Python...

  3. Simulated spectra for QA/QC of spectral analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Jackman, K. R. (Kevin R.); Biegalski, S. R.

    2004-01-01

    Monte Carlo simulated spectra have been developed to test the peak analysis algorithms of several spectral analysis software packages. Using MCNP 5, generic sample spectra were generated in order to perform ANSI N42.14 standard spectral tests on Canberra Genie-2000, Ortec GammaVision, and UniSampo. The reference spectra were generated in MCNP 5 using an F8 (pulse height) tally with a model of an actual germanium detector used in counting; the model matches the detector's resolution, energy calibration, and efficiency. The simulated spectra proved useful in testing the reliability and performance of spectral analysis programs. The software packages were analyzed and found to be in compliance with the ANSI N42.14 tests of the peak-search and peak-fitting algorithms. This method of using simulated spectra can be used to perform the ANSI N42.14 tests on the reliability and performance of spectral analysis programs in the absence of standard radioactive materials.
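    The same idea, a reference spectrum with known peaks plus Poisson counting statistics, can be mocked up without MCNP for quick sanity checks of a peak-search algorithm. The continuum shape, peak position, width, and count levels below are invented for the example.

        import numpy as np

        rng = np.random.default_rng(0)
        channels = np.arange(4096)

        # smooth continuum plus one full-energy peak at a known channel
        continuum = 50.0 * np.exp(-channels / 1500.0)
        peak = 400.0 * np.exp(-0.5 * ((channels - 2664.0) / 3.0) ** 2)

        # Poisson counting statistics turn the expected spectrum into a test spectrum
        spectrum = rng.poisson(continuum + peak)

        # a peak-search algorithm should recover a centroid near channel 2664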

  4. STING Millennium: a web-based suite of programs for comprehensive and simultaneous analysis of protein structure and sequence

    Science.gov (United States)

    Neshich, Goran; Togawa, Roberto C.; Mancini, Adauto L.; Kuser, Paula R.; Yamagishi, Michel E. B.; Pappas, Georgios; Torres, Wellington V.; Campos, Tharsis Fonseca e; Ferreira, Leonardo L.; Luna, Fabio M.; Oliveira, Adilton G.; Miura, Ronald T.; Inoue, Marcus K.; Horita, Luiz G.; de Souza, Dimas F.; Dominiquini, Fabiana; Álvaro, Alexandre; Lima, Cleber S.; Ogawa, Fabio O.; Gomes, Gabriel B.; Palandrani, Juliana F.; dos Santos, Gabriela F.; de Freitas, Esther M.; Mattiuz, Amanda R.; Costa, Ivan C.; de Almeida, Celso L.; Souza, Savio; Baudet, Christian; Higa, Roberto H.

    2003-01-01

    STING Millennium Suite (SMS) is a new web-based suite of programs and databases providing visualization and complex analysis of molecular sequence and structure for the data deposited at the Protein Data Bank (PDB). SMS operates with a collection of both publicly available data (PDB, HSSP, Prosite) and its own data (contacts, interface contacts, surface accessibility). Biologists find SMS useful because it provides a variety of algorithms and validated data, wrapped up in a user-friendly web interface. Using SMS it is now possible to analyze sequence-to-structure relationships, the quality of the structure, the nature and volume of atomic contacts of intra- and inter-chain type, the relative conservation of amino acids at specific sequence positions based on multiple sequence alignment, and indications of folding essential residues (FER) based on the relationship of residue conservation to intra-chain contacts and Cα–Cα and Cβ–Cβ distance geometry. Specific emphasis in SMS is given to interface forming residues (IFR), the amino acids that define the interactive portion of protein surfaces. SMS may simultaneously display and analyze previously superimposed structures. PDB updates trigger SMS updates in a synchronized fashion. SMS is freely accessible for public data at http://www.cbi.cnptia.embrapa.br, http://mirrors.rcsb.org/SMS and http://trantor.bioc.columbia.edu/SMS. PMID:12824333

  5. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004), applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify relief breakpoints (knickpoints) along drainage profiles. The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. To assess its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), a region of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of identification of deformed areas. This software tool may therefore be considered useful in neotectonic analyses of large areas and may be applied anywhere DEM coverage exists.
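    A toy version of the core operation, flagging slope breaks along a stream's longitudinal profile extracted from a DEM, takes a few lines of NumPy. The threshold and synthetic profile are illustrative only and do not reproduce the Hack (1973) or Etchebehere et al. (2004) procedures.

        import numpy as np

        def find_knickpoints(distance, elevation, threshold=0.002):
            # local channel slope, then the change in slope between adjacent samples
            slope = np.gradient(elevation, distance)
            slope_change = np.abs(np.diff(slope))
            return np.nonzero(slope_change > threshold)[0] + 1

        # synthetic profile: a gentle reach, an abrupt change in gradient, a gentle reach
        d = np.linspace(0.0, 5000.0, 251)
        z = np.where(d < 2500.0, 300.0 - 0.010 * d, 285.0 - 0.004 * d)
        print(find_knickpoints(d, z))   # -> [125 126], flanking the break at d = 2500 m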

  6. SMV model-based safety analysis of software requirements

    Energy Technology Data Exchange (ETDEWEB)

    Koh, Kwang Yong [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of); Seong, Poong Hyun [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of)], E-mail: phseong@kaist.ac.kr

    2009-02-15

    Fault tree analysis (FTA) is one of the most frequently applied safety analysis techniques when developing safety-critical industrial systems such as software-based emergency shutdown systems of nuclear power plants and has been used for safety analysis of software requirements in the nuclear industry. However, the conventional method for safety analysis of software requirements has several problems in terms of correctness and efficiency; the fault tree generated from natural language specifications may contain flaws or errors while the manual work of safety verification is very labor-intensive and time-consuming. In this paper, we propose a new approach to resolve problems of the conventional method; we generate a fault tree from a symbolic model verifier (SMV) model, not from natural language specifications, and verify safety properties automatically, not manually, by a model checker SMV. To demonstrate the feasibility of this approach, we applied it to shutdown system 2 (SDS2) of Wolsong nuclear power plant (NPP). In spite of subtle ambiguities present in the approach, the results of this case study demonstrate its overall feasibility and effectiveness.

  7. Model Based Analysis and Test Generation for Flight Software

    Science.gov (United States)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  8. Effectiveness of an Automatic Tracking Software in Underwater Motion Analysis

    Directory of Open Access Journals (Sweden)

    Fabrício A. Magalhaes

    2013-12-01

    Full Text Available Tracking of markers placed on anatomical landmarks is a common practice in sports science for performing the kinematic analyses that interest both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of a software program developed for automatic tracking of underwater movements (DVP), based on the Kanade-Lucas-Tomasi feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 marker positions) were manually tracked to determine the markers' center coordinates. Then, the videos were automatically tracked using DVP and a commercially available software program (COM). Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and correct the position of the cursor whenever the distance between the calculated marker coordinate and the reference one was greater than 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, manual interventions were 10.4% lower for DVP (7.4%) than for COM (17.8%). Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing by type of marker rather than type of exercise, with 9.9% fewer manual interventions for DVP than for COM. In conclusion, based on these results, the automatic tracking software presented here can be used as a valid and useful tool for underwater motion analysis.
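    DVP's own implementation is not public, but the Kanade-Lucas-Tomasi tracker it builds on is available through OpenCV's pyramidal Lucas-Kanade routine. A minimal sketch, with a hypothetical file name and marker coordinates:

        import cv2
        import numpy as np

        cap = cv2.VideoCapture("underwater_trial.avi")      # hypothetical recording
        ok, first = cap.read()
        prev_gray = cv2.cvtColor(first, cv2.COLOR_BGR2GRAY)

        # initial marker centers, e.g. digitized by the operator (hypothetical values)
        points = np.array([[[320.0, 240.0]], [[400.0, 260.0]]], dtype=np.float32)

        lk = dict(winSize=(21, 21), maxLevel=3,
                  criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))

        trajectory = [points.reshape(-1, 2).copy()]
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            # pyramidal Lucas-Kanade: propagate each marker into the next frame
            points, status, err = cv2.calcOpticalFlowPyrLK(prev_gray, gray,
                                                           points, None, **lk)
            trajectory.append(points.reshape(-1, 2).copy())
            prev_gray = gray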

  9. Calibration Analysis Software for the ATLAS Pixel Detector

    CERN Document Server

    Stramaglia, Maria Elena; The ATLAS collaboration

    2015-01-01

    The calibration of the ATLAS Pixel detector at LHC fulfils two main purposes: to tune the front-end configuration parameters for establishing the best operational settings and to measure the tuning performance through a subset of scans. An analysis framework has been set up in order to take actions on the detector given the outcome of a calibration scan (e.g. to create a mask for disabling noisy pixels). The software framework to control all aspects of the Pixel detector scans and analyses is called Calibration Console. The introduction of a new layer, equipped with new Front End-I4 chips, required an update of the Console architecture. It now handles scans and scan analyses applied together to chips with different characteristics. An overview of the newly developed Calibration Analysis Software is presented, together with some preliminary results.

  11. Stromatoporoid biometrics using image analysis software: A first order approach

    Science.gov (United States)

    Wolniewicz, Pawel

    2010-04-01

    Strommetric is a new image analysis computer program that performs morphometric measurements of stromatoporoid sponges. The program measures 15 features of skeletal elements (pillars and laminae) visible in both longitudinal and transverse thin sections. The software is implemented in C++, using the Open Computer Vision (OpenCV) library. The image analysis system distinguishes skeletal elements from sparry calcite using Otsu's method for image thresholding. More than 150 photos of thin sections were used as a test set, from which 36,159 measurements were obtained. The software provided about one hundred times more data than the manual method used until now. The data obtained are reproducible, even if the work is repeated by different workers. The method thus makes biometric studies of stromatoporoids objective.
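
    The thresholding step is easy to reproduce with OpenCV. The sketch below, with a placeholder file name, shows Otsu's method selecting a global threshold and a connected-components pass isolating candidate skeletal elements; it illustrates the approach, not Strommetric itself.

      import cv2

      # Otsu thresholding as used to separate skeletal elements from sparry
      # calcite; "thin_section.png" is a placeholder image.
      img = cv2.imread("thin_section.png", cv2.IMREAD_GRAYSCALE)
      level, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
      print("Otsu threshold:", level)
      # Connected components then give individual pillars/laminae on which
      # morphometric measurements (lengths, widths, spacings) can be taken.
      n_elements, labels = cv2.connectedComponents(binary)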

  12. TOM software toolbox: acquisition and analysis for electron tomography.

    Science.gov (United States)

    Nickell, Stephan; Förster, Friedrich; Linaroudis, Alexandros; Net, William Del; Beck, Florian; Hegerl, Reiner; Baumeister, Wolfgang; Plitzko, Jürgen M

    2005-03-01

    Automated data acquisition procedures have changed the perspectives of electron tomography (ET) in a profound manner. Elaborate data acquisition schemes with autotuning functions minimize exposure of the specimen to the electron beam and sophisticated image analysis routines retrieve a maximum of information from noisy data sets. "TOM software toolbox" integrates established algorithms and new concepts tailored to the special needs of low dose ET. It provides a user-friendly unified platform for all processing steps: acquisition, alignment, reconstruction, and analysis. Designed as a collection of computational procedures it is a complete software solution within a highly flexible framework. TOM represents a new way of working with the electron microscope and can serve as the basis for future high-throughput applications.

  13. Calibration analysis software for the ATLAS Pixel Detector

    Science.gov (United States)

    Stramaglia, Maria Elena

    2016-07-01

    The calibration of the ATLAS Pixel Detector at LHC fulfils two main purposes: to tune the front-end configuration parameters for establishing the best operational settings and to measure the tuning performance through a subset of scans. An analysis framework has been set up in order to take actions on the detector given the outcome of a calibration scan (e.g. to create a mask for disabling noisy pixels). The software framework to control all aspects of the Pixel Detector scans and analyses is called calibration console. The introduction of a new layer, equipped with new FE-I4 chips, required an update of the console architecture. It now handles scans and scan analyses applied together to chips with different characteristics. An overview of the newly developed calibration analysis software will be presented, together with some preliminary results.

  14. Search for Chemical Biomarkers on Mars Using the Sample Analysis at Mars Instrument Suite on the Mars Science Laboratory

    Science.gov (United States)

    Glavin, D. P.; Conrad, P.; Dworkin, J. P.; Eigenbrode, J.; Mahaffy, P. R.

    2011-01-01

    One key goal for the future exploration of Mars is the search for chemical biomarkers, including complex organic compounds important in life on Earth. The Sample Analysis at Mars (SAM) instrument suite on the Mars Science Laboratory (MSL) will provide the most sensitive measurements of the organic composition of rocks and regolith samples ever carried out in situ on Mars. SAM consists of a gas chromatograph (GC), quadrupole mass spectrometer (QMS), and tunable laser spectrometer to measure volatiles in the atmosphere and released from rock powders heated up to 1000 °C. The measurement of organics in solid samples will be accomplished by three experiments: (1) pyrolysis QMS to identify alkane fragments and simple aromatic compounds; (2) pyrolysis GCMS to separate and identify complex mixtures of larger hydrocarbons; and (3) chemical derivatization and GCMS to extract less volatile compounds, including amino and carboxylic acids, that are not detectable by the other two experiments.

  15. Orion Relative Navigation Flight Software Analysis and Design

    Science.gov (United States)

    D'Souza, Chris; Christian, John; Zanetti, Renato

    2011-01-01

    The Orion Relative Navigation System has sought to take advantage of the latest developments in sensor and algorithm technology while living under the constraints of mass, power, volume, and throughput. In particular, the only sensor specifically designed for relative navigation is the Vision Navigation System (VNS), a lidar-based sensor. But it uses the Star Trackers, GPS (when available) and IMUs, which are part of the overall Orion navigation sensor suite, to produce a relative state accurate enough to dock with the ISS. The Orion Relative Navigation System has significantly matured as the program has evolved from the design phase to the flight software implementation phase. With the development of the VNS and the STORRM flight test of the Orion Relative Navigation hardware, much of the performance of the system will be characterized before the first flight. However, challenges abound, not the least of which are the elimination of the RF range and range-rate system and the development of the FSW in the Matlab/Simulink/Stateflow environment. This paper will address the features of and rationale for the Orion Relative Navigation design, the performance of the FSW in a 6-DOF environment, and the initial results of the hardware performance from the STORRM flight.

  16. Evaluating the Quantitative Capabilities of Metagenomic Analysis Software.

    Science.gov (United States)

    Kerepesi, Csaba; Grolmusz, Vince

    2016-05-01

    DNA sequencing technologies are applied widely and frequently today to describe metagenomes, i.e., microbial communities in environmental or clinical samples, without the need for culturing them. These technologies usually return short (100-300 base-pair long) DNA reads, and these reads are processed by metagenomic analysis software that assigns phylogenetic composition information to the dataset. Here we evaluate three metagenomic analysis software packages (AmphoraNet--a webserver implementation of AMPHORA2--, MG-RAST, and MEGAN5) for their capability of assigning quantitative phylogenetic information to the data, describing the frequency of appearance of microorganisms of the same taxa in the sample. The difficulty of the task arises from the fact that longer genomes produce more reads from the same organism than shorter genomes, and some software assigns higher frequencies to species with longer genomes than to those with shorter ones. This phenomenon is called the "genome length bias." Dozens of complex artificial metagenome benchmarks can be found in the literature. Because of the complexity of those benchmarks, it is usually difficult to judge the resistance of a metagenomic software package to this genome length bias. Therefore, we have made a simple benchmark for evaluating the "taxon counting" in a metagenomic sample: we took the same number of copies of three full bacterial genomes of different lengths, broke them up randomly into short reads with an average length of 150 bp, and mixed the reads, creating our simple benchmark. Because of its simplicity, the benchmark is not supposed to serve as a mock metagenome, but if a software package fails on this simple task, it will surely fail on most real metagenomes. We applied the three software packages to the benchmark. The ideal quantitative solution would assign the same proportion to the three bacterial taxa. We found that AMPHORA2/AmphoraNet gave the most accurate results and the other two software packages were under
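
    The benchmark construction is simple enough to sketch. The following Python fragment, with synthetic stand-in genomes scaled down for speed, follows the recipe described above: equal copy numbers of three genomes of different lengths, fragmented into reads of average length 150 bp and shuffled.

      import random

      # Recipe from the abstract, with small random stand-in "genomes" so the
      # sketch runs quickly; real bacterial genomes are megabases long.
      random.seed(0)
      sizes = {"short": 200_000, "mid": 400_000, "long": 800_000}
      genomes = {name: "".join(random.choice("ACGT") for _ in range(n))
                 for name, n in sizes.items()}

      def fragment(genome, mean_len=150):
          """Break a genome into reads of ~150 bp at random positions."""
          reads, pos = [], 0
          while pos < len(genome):
              step = max(50, int(random.gauss(mean_len, 20)))
              reads.append(genome[pos:pos + step])
              pos += step
          return reads

      copies = 3                      # same copy number for every genome
      reads = []
      for name, seq in genomes.items():
          for _ in range(copies):
              reads.extend(fragment(seq))
      random.shuffle(reads)
      # An unbiased taxon counter should report 1/3 per taxon, even though
      # the "long" genome contributed four times as many reads as "short".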

  17. SIMA: Python software for analysis of dynamic fluorescence imaging data

    Directory of Open Access Journals (Sweden)

    Patrick eKaifosh

    2014-09-01

    Fluorescence imaging is a powerful method for monitoring dynamic signals in the nervous system. However, analysis of dynamic fluorescence imaging data remains burdensome, in part due to the shortage of available software tools. To address this need, we have developed SIMA, an open source Python package that facilitates common analysis tasks related to fluorescence imaging. Functionality of this package includes correction of motion artifacts occurring during in vivo imaging with laser-scanning microscopy, segmentation of imaged fields into regions of interest (ROIs), and extraction of signals from the segmented ROIs. We have also developed a graphical user interface (GUI) for manual editing of the automatically segmented ROIs and automated registration of ROIs across multiple imaging datasets. This software has been designed with flexibility in mind to allow for future extension with different analysis methods and potential integration with other packages. Software, documentation, and source code for the SIMA package and ROI Buddy GUI are freely available at http://www.losonczylab.org/sima/.
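
    The final step, signal extraction from segmented ROIs, reduces to averaging pixel values inside each ROI mask frame by frame. The sketch below shows that step in plain NumPy on a synthetic movie; it illustrates the operation rather than the SIMA API itself.

      import numpy as np

      # Signal extraction from a segmented ROI, in plain NumPy; "movie" is a
      # synthetic frames x height x width stack standing in for real data.
      T, H, W = 500, 128, 128
      movie = np.random.rand(T, H, W).astype(np.float32)

      roi_mask = np.zeros((H, W), dtype=bool)
      roi_mask[40:60, 50:70] = True            # one rectangular ROI

      trace = movie[:, roi_mask].mean(axis=1)  # mean fluorescence per frame
      f0 = np.percentile(trace, 10)            # 10th-percentile baseline
      dff = (trace - f0) / f0                  # dF/F normalisation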

  18. Supervised Semi-Automated Data Analysis Software for Gas Chromatography / Differential Mobility Spectrometry (GC/DMS) Metabolomics Applications.

    Science.gov (United States)

    Peirano, Daniel J; Pasamontes, Alberto; Davis, Cristina E

    2016-09-01

    Modern differential mobility spectrometers (DMS) produce complex and multi-dimensional data streams that allow for near-real-time or post-hoc chemical detection for a variety of applications. An active area of interest for this technology is metabolite monitoring for biological applications, and these data sets regularly have unique technical and data analysis end-user requirements. While there are initial publications on how investigators have individually processed and analyzed their DMS metabolomic data, there are no user-ready commercial or open source software packages that are easily used for this purpose. We have created custom software uniquely suited to analyze gas chromatography / differential mobility spectrometry (GC/DMS) data from biological sources. Here we explain the implementation of the software, describe the user features that are available, and provide an example of how this software functions using a previously published data set. The software is compatible with many commercial or home-made DMS systems. Because the software is versatile, it can also potentially be used for other similarly structured data sets, such as GC/GC and other IMS modalities.

  19. A Parametric Model of Shoulder Articulation for Virtual Assessment of Space Suit Fit

    Science.gov (United States)

    Kim, K. Han; Young, Karen S.; Bernal, Yaritza; Boppana, Abhishektha; Vu, Linh Q.; Benson, Elizabeth A.; Jarvis, Sarah; Rajulu, Sudhakar L.

    2016-01-01

    Suboptimal suit fit is a known risk factor for crewmember shoulder injury. Suit fit assessment is, however, prohibitively time-consuming and cannot be generalized across wide variations of body shapes and poses. In this work, we have developed a new design tool based on the statistical analysis of body shape scans. This tool is aimed at predicting the skin deformation and shape variations for any body size and shoulder pose for a target population. This new process, when incorporated with CAD software, will enable virtual suit fit assessments, predictively quantifying the contact volume and clearance between the suit and the body surface at reduced time and cost.
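
    A common way to build such a statistical shape model, assumed here purely for illustration, is principal component analysis over the vertex coordinates of registered scans; the sketch below uses synthetic data and is not NASA's tool.

      import numpy as np

      # PCA-based statistical shape model on synthetic "scans": rows are
      # subjects, columns are flattened (x, y, z) coordinates of registered
      # vertices. Real use requires vertex correspondence across scans.
      n_subjects, n_vertices = 40, 5000
      scans = np.random.rand(n_subjects, n_vertices * 3)

      mean_shape = scans.mean(axis=0)
      centered = scans - mean_shape
      U, S, Vt = np.linalg.svd(centered, full_matrices=False)
      modes = Vt[:5]                            # top 5 modes of shape variation
      sigmas = S[:5] / np.sqrt(n_subjects - 1)  # per-mode standard deviations

      # Synthesise a new body shape two standard deviations along mode 1;
      # such meshes can then be posed and checked against suit geometry in CAD.
      new_shape = mean_shape + 2.0 * sigmas[0] * modes[0]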

  20. Models for composing software : an analysis of software composition and objects

    NARCIS (Netherlands)

    Bergmans, Lodewijk

    1999-01-01

    In this report, we investigate component-based software construction with a focus on composition. In particular we try to analyze the requirements and issues for components and software composition. As a means to understand this research area, we introduce a canonical model for representing software

  1. IMMAN: free software for information theory-based chemometric analysis.

    Science.gov (United States)

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis called IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches each. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon entropy is discussed, as well as the introduction of the Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty, are incorporated into the IMMAN software (http://mobiosd-hub.com/imman-soft/), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing value processing, dataset partitioning, and browsing. Moreover, single-parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, and comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
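
    As an illustration of one supervised criterion named above, the sketch below computes information gain for a continuous descriptor after equal-interval discretization; the data and bin count are synthetic choices, not IMMAN defaults.

      import numpy as np

      def entropy(labels):
          _, counts = np.unique(labels, return_counts=True)
          p = counts / counts.sum()
          return -np.sum(p * np.log2(p))

      def information_gain(x, y, n_bins=10):
          """IG of class labels y given descriptor x, equal-interval bins."""
          edges = np.linspace(x.min(), x.max(), n_bins + 1)
          d = np.digitize(x, edges[1:-1])       # bin index per sample
          h_cond = sum((d == b).mean() * entropy(y[d == b]) for b in np.unique(d))
          return entropy(y) - h_cond

      rng = np.random.default_rng(1)
      y = rng.integers(0, 2, 200)               # two activity classes
      x = y + rng.normal(0.0, 0.5, 200)         # an informative descriptor
      print(information_gain(x, y))             # clearly > 0 for this x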

  2. Analysis of signal acquisition in GPS receiver software

    Directory of Open Access Journals (Sweden)

    Vlada S. Sokolović

    2011-01-01

    This paper presents a critical analysis of the signal processing flow carried out in GPS receiver software, which serves as a basis for a critical comparison of different signal processing architectures within the GPS receiver. Increased flexibility and a reduction in the commercial cost of GPS devices, including mobile devices, can be achieved by using software-defined radio (SDR) technology. The SDR approach can be realized when certain hardware components in a GPS receiver are replaced. Signal processing in the SDR is implemented using a programmable DSP (Digital Signal Processing) or FPGA (Field Programmable Gate Array) circuit, which allows a simple change of digital signal processing algorithms and a simple change of the receiver parameters. The starting point of the research is the signal generated on the satellite, the structure of which is shown in the paper. Based on the GPS signal structure, a receiver is realized whose task is to extract the appropriate signal from the spectrum and detect it. Based on collected navigation data, the receiver calculates the position of the end user. The signal coming from the satellite may be at the carrier frequency L1 or L2. Since the SPS is used in the civil service, all the tests shown in this work were performed on the L1 signal. The signal arriving at the receiver is generated with spread spectrum technology and lies below the noise level. Such signals often interfere with signals from the environment, which makes it difficult for a receiver to perform proper detection and signal processing. Therefore, signal processing technology is continually being improved, aiming at more accurate and faster signal processing. All tests were carried out on a signal acquired from the satellite using the SE4110 input circuit, which performs filtering, amplification and signal selection. The samples of the received signal were forwarded to a computer for data post-processing, i. e
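
    The acquisition stage that such receiver software performs can be sketched compactly: circular correlation of the sampled signal against a PRN replica, repeated over a grid of Doppler bins, with the peak giving code phase and Doppler. In the illustration below a random +/-1 sequence stands in for the real C/A code, and the sample rate and received signal are synthetic.

      import numpy as np

      fs, n = 2.048e6, 2048                 # sample rate and samples per code period
      t = np.arange(n) / fs
      rng = np.random.default_rng(7)
      prn = rng.choice([-1.0, 1.0], n)      # stand-in for the real C/A code

      # Synthetic received signal: code delayed 300 samples, 2 kHz Doppler, noise.
      rx = np.roll(prn, 300) * np.cos(2 * np.pi * 2000.0 * t) + rng.normal(0, 1, n)

      best = (0.0, 0.0, 0)
      prn_fft_conj = np.conj(np.fft.fft(prn))
      for doppler in np.arange(-5000.0, 5001.0, 500.0):
          baseband = rx * np.exp(-2j * np.pi * doppler * t)    # carrier wipe-off
          corr = np.abs(np.fft.ifft(np.fft.fft(baseband) * prn_fft_conj))
          if corr.max() > best[0]:
              best = (corr.max(), doppler, int(corr.argmax()))
      print("Doppler %.0f Hz, code phase %d samples" % (best[1], best[2]))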

  3. JPL multipolarization workstation - Hardware, software and examples of data analysis

    Science.gov (United States)

    Burnette, Fred; Norikane, Lynne

    1987-01-01

    A low-cost stand-alone interactive image processing workstation has been developed for operations on multipolarization JPL aircraft SAR data, as well as data from future spaceborne imaging radars. A recently developed data compression technique is used to reduce the data volume of a typical data set to 10 Mbytes, so that interactive analysis may be accomplished in a timely and efficient manner on a supermicrocomputer. In addition to presenting a hardware description of the workstation, attention is given to the software that has been developed. Three illustrative examples of data analysis are presented.

  4. A METHOD FOR SELECTING SOFTWARE FOR DYNAMIC EVENT ANALYSIS I: PROBLEM SELECTION

    Energy Technology Data Exchange (ETDEWEB)

    J. M. Lacy; S. R. Novascone; W. D. Richins; T. K. Larson

    2007-08-01

    New nuclear power reactor designs will require resistance to a variety of possible malevolent attacks, as well as traditional dynamic accident scenarios. The design/analysis team may be faced with a broad range of phenomena including air and ground blasts, high-velocity penetrators or shaped charges, and vehicle or aircraft impacts. With a host of software tools available to address these high-energy events, the analysis team must evaluate and select the software most appropriate for their particular set of problems. The accuracy of the selected software should then be validated with respect to the phenomena governing the interaction of the threat and structure. In this paper, we present a method for systematically comparing current high-energy physics codes for specific applications in new reactor design. Several codes are available for the study of blast, impact, and other shock phenomena. Historically, these packages were developed to study specific phenomena such as explosives performance, penetrator/target interaction, or accidental impacts. As developers generalize the capabilities of their software, legacy biases and assumptions can remain that could affect the applicability of the code to other processes and phenomena. R&D institutions generally adopt one or two software packages and use them almost exclusively, performing benchmarks on a single-problem basis. At the Idaho National Laboratory (INL), new comparative information was desired to permit researchers to select the best code for a particular application by matching its characteristics to the physics, materials, and rate scale (or scales) representing the problem at hand. A study was undertaken to investigate the comparative characteristics of a group of shock and high-strain rate physics codes including ABAQUS, LS-DYNA, CTH, ALEGRA, ALE-3D, and RADIOSS. A series of benchmark problems were identified to exercise the features and capabilities of the subject software. To be useful, benchmark problems

  5. PET-Tool: a software suite for comprehensive processing and managing of Paired-End diTag (PET) sequence data

    Directory of Open Access Journals (Sweden)

    Wei Chia-Lin

    2006-08-01

    Abstract Background We recently developed the Paired End diTag (PET) strategy for efficient characterization of mammalian transcriptomes and genomes. The paired-end nature of short PET sequences derived from long DNA fragments raised a new set of bioinformatics challenges, including how to extract PETs from raw sequence reads, and how to correctly yet efficiently map PETs to reference genome sequences. To accommodate and streamline data analysis of the large volume of PET sequences generated from each PET experiment, an automated PET data processing pipeline is desirable. Results We designed an integrated computation program package, PET-Tool, to automatically process PET sequences and map them to the genome sequences. The Tool was implemented as a web-based application composed of four modules: the Extractor module for PET extraction; the Examiner module for analytic evaluation of PET sequence quality; the Mapper module for locating PET sequences in the genome sequences; and the ProjectManager module for data organization. The performance of PET-Tool was evaluated through the analysis of 2.7 million PET sequences. It was demonstrated that PET-Tool is accurate and efficient in extracting PET sequences and removing artifacts from large-volume datasets. Using optimized mapping criteria, over 70% of quality PET sequences were mapped specifically to the genome sequences. With a 2.4 GHz LINUX machine, it takes approximately six hours to process one million PETs from extraction to mapping. Conclusion The speed, accuracy, and comprehensiveness have proved that PET-Tool is an important and useful component in PET experiments, and it can be extended to accommodate other related analyses of paired-end sequences. The Tool also provides user-friendly functions for data quality checking and a system for multi-layer data management.
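
    For a flavor of what the Extractor module must do, the fragment below splits a raw ditag read into its 5' and 3' tags around a linker; the tag length and spacer sequence are hypothetical, not PET-Tool's actual parameters.

      # Illustrative sketch of ditag extraction; TAG_LEN and SPACER are
      # hypothetical stand-ins, not PET-Tool's real settings.
      TAG_LEN = 18
      SPACER = "GGTCGAC"   # hypothetical linker between the two tags

      def extract_pet(read):
          """Return (tag5, tag3) or None if the spacer cannot be located."""
          idx = read.find(SPACER, TAG_LEN - 2)  # allow slight length variation
          if idx < 0:
              return None                       # artifact: no linker found
          tag5 = read[:idx][-TAG_LEN:]
          tag3 = read[idx + len(SPACER):][:TAG_LEN]
          if len(tag5) < TAG_LEN or len(tag3) < TAG_LEN:
              return None                       # artifact: truncated tags
          return tag5, tag3

      read = "ACGTACGTACGTACGTAC" + "GGTCGAC" + "TTGCATGCATGCATGCAT"
      print(extract_pet(read))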

  6. BIM Software Capability and Interoperability Analysis: An analytical approach toward structural usage of BIM software (S-BIM)

    OpenAIRE

    A. Taher, Ali

    2016-01-01

    This study focused on the structural analysis of BIM models. Different commercial software packages (Autodesk products and Rhinoceros) are presented through modelling and analysis of different structures with varying complexity, section properties, geometry, and material. Besides the commercial software, different architectural tools and different tools for structural analysis are evaluated (Dynamo, Grasshopper, add-on tools, direct link, indirect link via IFC). BIM and Structural BIM (S-BIM)

  7. BABELOMICS: a suite of web tools for functional annotation and analysis of groups of genes in high-throughput experiments.

    Science.gov (United States)

    Al-Shahrour, Fátima; Minguez, Pablo; Vaquerizas, Juan M; Conde, Lucía; Dopazo, Joaquín

    2005-07-01

    We present Babelomics, a complete suite of web tools for the functional analysis of groups of genes in high-throughput experiments, which includes the use of information on Gene Ontology terms, InterPro motifs, KEGG pathways, Swiss-Prot keywords, analysis of predicted transcription factor binding sites, chromosomal positions and presence in tissues with determined histological characteristics, through five integrated modules: FatiGO (fast assignment and transference of information), FatiWise, the transcription factor association test, GenomeGO and the tissues mining tool, respectively. Additionally, another module, FatiScan, provides a new procedure that integrates biological information with experimental results in order to find groups of genes with modest but coordinated significant differential behaviour. FatiScan is highly sensitive and is capable of finding significant asymmetries in the distribution of genes of common function across a list of ordered genes even if these asymmetries are not extreme. The strong multiple-testing nature of the contrasts made by the tools is taken into account. All the tools are integrated in the gene expression analysis package GEPAS. Babelomics is the natural evolution of our tool FatiGO (which analysed almost 22,000 experiments during the last year) to include more sources of information and new ways of using it. Babelomics can be found at http://www.babelomics.org.

  8. Software applications toward quantitative metabolic flux analysis and modeling.

    Science.gov (United States)

    Dandekar, Thomas; Fieselmann, Astrid; Majeed, Saman; Ahmed, Zeeshan

    2014-01-01

    Metabolites and their pathways are central for adaptation and survival. Metabolic modeling elucidates in silico all the possible flux pathways (flux balance analysis, FBA) and predicts the actual fluxes under a given situation; further refinement of these models is possible by including experimental isotopologue data. In this review, we initially introduce the key theoretical concepts and the different analysis steps in the modeling process before comparing flux calculation and metabolite analysis programs such as C13, BioOpt, COBRA toolbox, Metatool, efmtool, FiatFlux, ReMatch, VANTED, iMAT and YANA. Their respective strengths and limitations are discussed and compared to alternative software. While data analysis of metabolites, calculation of metabolic fluxes and pathways, and their condition-specific changes are all possible, we highlight the considerations that need to be taken into account before deciding on specific software. Current challenges in the field include the computation of large-scale networks (in elementary mode analysis), regulatory interactions and detailed kinetics, and these are discussed in the light of powerful new approaches.
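
    The FBA step mentioned above is a linear program: maximize an objective flux subject to the steady-state mass balance S v = 0 and bounds on the fluxes v. A minimal sketch for an invented three-reaction network:

      import numpy as np
      from scipy.optimize import linprog

      # Toy network (invented for illustration):
      # R1: uptake -> A,  R2: A -> B,  R3: B -> biomass (the objective)
      S = np.array([
          [1, -1,  0],   # metabolite A: produced by R1, consumed by R2
          [0,  1, -1],   # metabolite B: produced by R2, consumed by R3
      ])
      c = np.array([0, 0, 1])          # maximise flux through R3
      bounds = [(0, 10), (0, 10), (0, 10)]

      # linprog minimises, so negate the objective.
      res = linprog(-c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
      print("optimal fluxes:", res.x)  # expect [10, 10, 10]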

  9. UPVapor: Cofrentes nuclear power plant production results analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Curiel, M. [Logistica y Acondicionamientos Industriales SAU, Sorolla Center, local 10, Av. de las Cortes Valencianas No. 58, 46015 Valencia (Spain); Palomo, M. J. [ISIRYM, Universidad Politecnica de Valencia, Camino de Vera s/n, Valencia (Spain); Baraza, A. [Iberdrola Generacion S. A., Central Nuclear Cofrentes, Carretera Almansa Requena s/n, 04662 Cofrentes, Valencia (Spain); Vaquer, J., E-mail: m.curiel@lainsa.co [TITANIA Servicios Tecnologicos SL, Sorolla Center, local 10, Av. de las Cortes Valencianas No. 58, 46015 Valencia (Spain)

    2010-10-15

    UPVapor software version 02 has been developed for the data analysis department of the Cofrentes nuclear power plant (Spain). It is a graphical analysis environment in which users have access to all the plant variables registered in the process computer system (SIEC). To this end, UPVapor offers many advanced graphic tools for ease of work, as well as a friendly environment that is easy to use and highly configurable. Plant variables are classified in the same way as in the SIEC computer, and their values are taken from it through the Iberdrola network. UPVapor can generate two different types of graphics: evolution graphs and X-Y graphs. The first type analyses the evolution of up to twenty plant variables over a user-defined time period, based on historic plant files. Many tools are available: cursors, graphic configuration, moving averages, invalid-data visualization, and more. Moreover, a particular analysis configuration can be saved as a pre-selection, making it possible to load the pre-selection directly and quickly monitor a group of preselected plant variables. In X-Y graphs, it is possible to plot one variable against another over a defined time. As an option, users can filter the data by a given range of a variable, with the possibility of programming up to five filters. Like the other graph type, the X-Y graph has many configuration, saving and printing options. With UPVapor, data analysts can save valuable time during daily work and, as it is easy to use, it permits other users to perform their own analyses without asking the analysts to develop them. Besides, it can be used from any work centre with access to the network. (Author)

  10. Feature-Oriented Nonfunctional Requirement Analysis for Software Product Line

    Institute of Scientific and Technical Information of China (English)

    Xin Peng; Seok-Won Lee; Wen-Yun Zhao

    2009-01-01

    Domain analysis in software product line (SPL) development provides a basis for core assets design and implementation by a systematic and comprehensive commonality/variability analysis. In feature-oriented SPL methods, products of the domain analysis are domain feature models and corresponding feature decision models to facilitate application-oriented customization. As in requirement analysis for a single system, the domain analysis in the SPL development should consider both functional and nonfunctional domain requirements. However, the nonfunctional requirements (NFRs) are often neglected in the existing domain analysis methods. In this paper, we propose a context-based method of the NFR analysis for the SPL development. In the method, NFRs are materialized by connecting nonfunctional goals with real-world context, thus NFR elicitation and variability analysis can be performed by context analysis for the whole domain with the assistance of NFR templates and NFR graphs. After the variability analysis, our method integrates both functional and nonfunctional perspectives by incorporating the nonfunctional goals and operationalizations into an initial functional feature model. NFR-related constraints are also elicited and integrated. Finally, a decision model with both functional and nonfunctional perspectives is constructed to facilitate application-oriented feature model customization. A computer-aided grading system (CAGS) product line is employed to demonstrate the method throughout the paper.

  11. International Atomic Energy Agency intercomparison of ion beam analysis software

    Science.gov (United States)

    Barradas, N. P.; Arstila, K.; Battistig, G.; Bianconi, M.; Dytlewski, N.; Jeynes, C.; Kótai, E.; Lulli, G.; Mayer, M.; Rauhala, E.; Szilágyi, E.; Thompson, M.

    2007-09-01

    Ion beam analysis (IBA) includes a group of techniques for the determination of elemental concentration depth profiles of thin film materials. Often the final results rely on simulations, fits and calculations, made by dedicated codes written for specific techniques. Here we evaluate numerical codes dedicated to the analysis of Rutherford backscattering spectrometry, non-Rutherford elastic backscattering spectrometry, elastic recoil detection analysis and non-resonant nuclear reaction analysis data. Several software packages have been presented and made available to the community. New codes regularly appear, and old codes continue to be used and occasionally updated and expanded. However, those codes have to date not been validated, or even compared to each other. Consequently, IBA practitioners use codes whose validity, correctness and accuracy have never been validated beyond the authors' efforts. In this work, we present the results of an IBA software intercomparison exercise, where seven different packages participated. These were DEPTH, GISA, DataFurnace (NDF), RBX, RUMP, SIMNRA (all analytical codes) and MCERD (a Monte Carlo code). In a first step, a series of simulations were defined, testing different capabilities of the codes, for fixed conditions. In a second step, a set of real experimental data were analysed. The main conclusion is that the codes perform well within the limits of their design, and that the largest differences in the results obtained are due to differences in the fundamental databases used (stopping power and scattering cross section). In particular, spectra can be calculated including Rutherford cross sections with screening, energy resolution convolutions including energy straggling, and pileup effects, with agreement between the codes available at the 0.1% level. This same agreement is also available for the non-RBS techniques. This agreement is not limited to calculation of spectra from particular structures with predetermined

  12. Analysis of Performance of Stereoscopic-Vision Software

    Science.gov (United States)

    Kim, Won; Ansar, Adnan; Steele, Robert; Steinke, Robert

    2007-01-01

    A team of JPL researchers has analyzed stereoscopic vision software and produced a document describing its performance. This software is of the type used in maneuvering exploratory robotic vehicles on Martian terrain. The software in question utilizes correlations between portions of the images recorded by two electronic cameras to compute stereoscopic disparities, which, in conjunction with camera models, are used in computing distances to terrain points to be included in constructing a three-dimensional model of the terrain. The analysis included effects of correlation-window size, a pyramidal image down-sampling scheme, vertical misalignment, focus, maximum disparity, stereo baseline, and range ripples. Contributions of sub-pixel interpolation, vertical misalignment, and foreshortening to stereo correlation error were examined theoretically and experimentally. It was found that camera-calibration inaccuracy contributes to both down-range and cross-range error but stereo correlation error affects only the down-range error. Experimental data for quantifying the stereo disparity error were obtained by use of reflective metrological targets taped to corners of bricks placed at known positions relative to the cameras. For the particular 1,024-by-768-pixel cameras of the system analyzed, the standard deviation of the down-range disparity error was found to be 0.32 pixel.
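
    The link between disparity error and down-range error follows from the standard pinhole-stereo relation Z = fB/d, so an error dd in disparity maps to approximately dZ = Z^2/(fB) dd. The sketch below plugs in the 0.32-pixel figure reported above with illustrative camera parameters, not those of the analyzed system:

      # Pinhole-stereo error propagation with illustrative numbers.
      f_pixels = 1200.0    # focal length in pixels (assumed)
      baseline = 0.30      # stereo baseline in meters (assumed)
      sigma_d = 0.32       # disparity error in pixels, as reported above

      for Z in (2.0, 5.0, 10.0):                        # ranges in meters
          sigma_z = Z ** 2 / (f_pixels * baseline) * sigma_d
          print("Z = %4.1f m  ->  down-range sigma ~ %.3f m" % (Z, sigma_z))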

  13. Engine structures analysis software: Component Specific Modeling (COSMO)

    Science.gov (United States)

    McKnight, R. L.; Maffeo, R. J.; Schwartz, S.

    1994-08-01

    A component specific modeling software program has been developed for propulsion systems. This expert program is capable of formulating the component geometry as finite element meshes for structural analysis which, in the future, can be spun off as NURBS geometry for manufacturing. COSMO currently has geometry recipes for combustors, turbine blades, vanes, and disks. Component geometry recipes for nozzles, inlets, frames, shafts, and ducts are being added. COSMO uses component recipes that work through neutral files with the Technology Benefit Estimator (T/BEST) program, which provides the necessary base parameters and loadings. This report contains the user's manual for combustors, turbine blades, vanes, and disks.

  14. Development of software for the thermohydraulic analysis of air coolers

    Directory of Open Access Journals (Sweden)

    Šerbanović Slobodan P.

    2003-01-01

    Air coolers consume much more energy than other heat exchangers due to the large fan power required. This is an additional reason to establish reliable methods for the rational design and thermohydraulic analysis of these devices. The optimal values of the outlet temperature and air flow rate are of particular importance. The paper presents a methodology for the thermohydraulic calculation of air cooler performance, which is incorporated in the "Air Cooler" software module. The module covers two options: cooling and/or condensation of process fluids by ambient air. The calculated results can be presented in various ways, i.e. in tabular and graphical form.

  15. Development of RCM analysis software for Korean nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Ho; Choi, Kwang Hee; Jeong, Hyeong Jong [Korea Electric Power Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    A software package called the KEPCO RCM workstation (KRCM) has been developed to optimize the maintenance strategies of Korean nuclear power plants. The program modules of the KRCM were designed in a manner that combines EPRI methodologies and KEPRI analysis techniques. The KRCM is being applied to three pilot systems: the chemical and volume control system, the main steam system, and the compressed air system of Yonggwang Units 1 and 2. In addition, the KRCM can be utilized as a tool to meet part of the requirements of the maintenance rule (MR) imposed by the U.S. NRC. 3 refs., 4 figs. (Author)

  16. Integrating software architectures for distributed simulations and simulation analysis communities.

    Energy Technology Data Exchange (ETDEWEB)

    Goldsby, Michael E.; Fellig, Daniel; Linebarger, John Michael; Moore, Patrick Curtis; Sa, Timothy J.; Hawley, Marilyn F.

    2005-10-01

    The one-year Software Architecture LDRD (No. 79819) was a cross-site effort between Sandia California and Sandia New Mexico. The purpose of this research was to further develop and demonstrate integrating software architecture frameworks for distributed simulation and distributed collaboration in the homeland security domain. The integrated frameworks were initially developed through the Weapons of Mass Destruction Decision Analysis Center (WMD-DAC), sited at SNL/CA, and the National Infrastructure Simulation & Analysis Center (NISAC), sited at SNL/NM. The primary deliverable was a demonstration of both a federation of distributed simulations and a federation of distributed collaborative simulation analysis communities in the context of the same integrated scenario, which was the release of smallpox in San Diego, California. To our knowledge this was the first time such a combination of federations under a single scenario has ever been demonstrated. A secondary deliverable was the creation of the standalone GroupMeld™ collaboration client, which uses the GroupMeld™ synchronous collaboration framework. In addition, a small pilot experiment that used both integrating frameworks allowed a greater range of crisis management options to be performed and evaluated than would have been possible without the use of the frameworks.

  17. Development of a software for INAA analysis automation

    Energy Technology Data Exchange (ETDEWEB)

    Zahn, Guilherme S.; Genezini, Frederico A.; Figueiredo, Ana Maria G.; Ticianelli, Regina B., E-mail: gzahn@ipen [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2013-07-01

    In this work, software has been developed to automate the post-counting tasks in comparative INAA; it aims to be more flexible than the available options, integrating itself with some of the routines currently in use in the IPEN Activation Analysis Laboratory and allowing the user to choose between a fully-automatic analysis or an Excel-oriented one. The software makes use of the Genie 2000 data importing and analysis routines and stores each 'energy-counts-uncertainty' table as a separate ASCII file that can be used later on if required by the analyst. Moreover, it generates an Excel-compatible CSV (comma separated values) file with only the relevant results from the analyses for each sample or comparator, as well as the results of the concentration calculations and the results obtained with four different statistical tools (unweighted average, weighted average, normalized residuals and the Rajeval technique), allowing the analyst to double-check the results. Finally, a 'summary' CSV file is also produced, with the final concentration results obtained for each element in each sample. (author)

  18. PROTEINCHALLENGE: Crowd sourcing in proteomics analysis and software development

    DEFF Research Database (Denmark)

    Martin, Sarah F.; Falkenberg, Heiner; Dyrlund, Thomas Franck

    2013-01-01

    In large-scale proteomics studies there is a temptation, after months of experimental work, to plug resulting data into a convenient—if poorly implemented—set of tools, which may neither do the data justice nor help answer the scientific question. In this paper we have captured key concerns, including arguments for community-wide open source software development and "big data" compatible solutions for the future. However, there is clearly a real need for robust tools, standard operating procedures and general acceptance of best practices. Thus we submit to the proteomics community a call for a community-wide open set of proteomics analysis challenges—PROTEINCHALLENGE—that directly target and compare data analysis workflows. For the meantime, we have laid out ten top tips for data processing. With these at hand, a first large-scale proteomics analysis hopefully becomes less daunting to navigate.

  19. Finite Element Analysis of Wheel Rim Using Abaqus Software

    Directory of Open Access Journals (Sweden)

    Bimal Bastin

    2017-02-01

    The rim is the "outer edge of a wheel, holding the tire"; it makes up the outer circular design of the wheel on which the inside edge of the tire is mounted on vehicles such as automobiles. A standard automotive steel wheel rim is made from rectangular sheet metal. Design is an important industrial activity which influences the quality of the product being produced. The wheel rim was modeled using the modeling software SOLIDWORKS, and the model was then imported into ABAQUS for analysis. Static load analysis was performed by applying a pressure of 5 N/mm2. The materials considered for analysis are steel alloy, aluminium, magnesium, and forged steel. The displacement of the rim under the static load was noted for each material, and the maximum principal stresses were also noted.

  20. Interfacing with an EVA Suit

    Science.gov (United States)

    Ross, Amy

    2011-01-01

    A NASA spacesuit under the EVA Technology Domain consists of a suit system; a PLSS; and a Power, Avionics, and Software (PAS) system. Ross described the basic functions, components, and interfaces of the PLSS, which consists of oxygen, ventilation, and thermal control subsystems; electronics; and interfaces. Design challenges were reviewed from a packaging perspective. Ross also discussed the development of the PLSS over the last two decades.

  1. Ant Algorithm-Based Regression Test Suite Optimization for GUI Software

    Institute of Scientific and Technical Information of China (English)

    于长钺; 张萌萌; 窦平安; 于秀山

    2012-01-01

    Aimed at the large number of regression test cases caused by the graphical input/output, event-driven, and randomly triggered event features of GUI (Graphical User Interface) software, a mathematical model of regression test suite optimization for GUI software is constructed on the basis of the GUI event model. The objective function and constraints of the model are given, and an ant colony algorithm is presented to solve the problem; rules for ant pheromone updating and ant path selection are defined. Simulation results show that, under the premise that coverage is guaranteed, this method can effectively reduce both the number and the length of regression test cases.
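
    A hedged sketch of this kind of ant-colony test-suite reduction, not the paper's exact pheromone and path-selection rules: each test case covers a set of GUI events, ants assemble candidate suites with probabilities weighted by pheromone times fresh coverage, and the best (shortest full-coverage) suite reinforces its pheromone.

      import random

      random.seed(3)
      events = set(range(20))
      tests = [set(random.sample(sorted(events), random.randint(3, 8))) for _ in range(15)]
      # Guarantee that the pool can cover everything (illustration only).
      for e in events - set().union(*tests):
          tests[random.randrange(len(tests))].add(e)

      pheromone = [1.0] * len(tests)
      best = list(range(len(tests)))          # trivial suite: every test case

      for _iteration in range(50):
          for _ant in range(10):
              uncovered, suite = set(events), []
              while uncovered:
                  # Path selection rule: pheromone x heuristic (new coverage).
                  weights = [pheromone[i] * (len(tests[i] & uncovered) + 1e-9)
                             for i in range(len(tests))]
                  i = random.choices(range(len(tests)), weights)[0]
                  if tests[i] & uncovered:
                      suite.append(i)
                      uncovered -= tests[i]
              if len(suite) < len(best):
                  best = suite
          # Pheromone update rule: evaporation plus reward for the best suite.
          pheromone = [0.9 * p for p in pheromone]
          for i in best:
              pheromone[i] += 1.0 / len(best)

      print("reduced suite:", sorted(best), "- covers all", len(events), "events")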

  2. Data-Interpolating Variational Analysis (DIVA) software : recent development and application

    Science.gov (United States)

    Watelet, Sylvain; Barth, Alexander; Troupin, Charles; Ouberdous, Mohamed; Beckers, Jean-Marie

    2014-05-01

    The Data-Interpolating Variational Analysis (DIVA) software is a tool designed to reconstruct a continuous field from discrete measurements. This method is based on the numerical implementation of the Variational Inverse Model (VIM), which consists of the minimization of a cost function, allowing the choice of the analyzed field that best fits the data sets. The problem is solved efficiently using a finite-element method. This statistical method is particularly suited to dealing with irregularly spaced observations, producing outputs on a regular grid. Initially created to work in two dimensions, the software is now able to handle 3D or even 4D analyses, in order to easily produce ocean climatologies. These analyses can easily be improved by taking advantage of DIVA's ability to take topographic and dynamic constraints into account (coastal relief, prevailing wind impacting the advection, ...). In DIVA, we assume errors on measurements are not correlated, which means we do not consider the effect of correlated observation errors on the analysis and therefore use a diagonal observation error covariance matrix. However, oceanographic data sets are generally clustered in space and time, thus introducing some correlation between observations. In order to determine the impact of such an approximation and provide strategies to mitigate its effects, we conducted several synthetic experiments with known correlation structure. Overall, the best results were obtained with a variant of the covariance inflation method. Finally, a new application of DIVA to satellite altimetry data will be presented: these data have particular space and time distributions, as they consist of repeated tracks (~10-35 days) of measurements with a distance of less than 10 km between two successive measurements in a given track. The tools designed to determine the analysis parameters were adapted to these specificities. Moreover, different weights were applied to measurements in order to
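
    For reference, the cost function minimized by the VIM can be written schematically as follows (after the form used in the DIVA literature; the alpha coefficients weight the regularization terms and the mu_j weight the data misfits):

      J[\varphi] = \sum_{j=1}^{N_d} \mu_j \left[ d_j - \varphi(x_j, y_j) \right]^2
                 + \int_D \left( \alpha_2 \, \nabla\nabla\varphi : \nabla\nabla\varphi
                 + \alpha_1 \, \nabla\varphi \cdot \nabla\varphi
                 + \alpha_0 \, \varphi^2 \right) dD

    The first sum penalizes misfit to the N_d observations while the integral penalizes roughness of the field; minimizing J over a finite-element mesh yields the gridded analysis.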

  3. Don't Blame the Software: Using Qualitative Data Analysis Software Successfully in Doctoral Research

    Directory of Open Access Journals (Sweden)

    Michelle Salmona

    2016-07-01

    In this article, we explore the learning experiences of doctoral candidates as they use qualitative data analysis software (QDAS). Of particular interest is the process of adopting technology during the development of research methodology. Using an action research approach, data were gathered over five years from advanced doctoral research candidates and supervisors. The technology acceptance model (TAM) was then applied as a theoretical analytic lens for better understanding how students interact with new technology. Findings relate to two significant barriers which doctoral students confront: 1. aligning perceptions of ease of use and usefulness is essential in overcoming resistance to technological change; 2. transparency into the research process through technology promotes insights into methodological challenges. Transitioning through both barriers requires a competent foundation in qualitative research. The study acknowledges the importance of higher degree research, curriculum reform and doctoral supervision in post-graduate research training, together with their interconnected relationships in support of high-quality inquiry. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1603117

  4. General Meta-Models to Analysis of Software Architecture Definitions

    Directory of Open Access Journals (Sweden)

    GholamAli Nejad HajAli Irani

    2011-12-01

    An important step toward understanding software architecture is providing a clear definition of it. More than 150 valid definitions have been presented for identifying software architecture, so a comparison among them is needed to give us a better understanding of the existing definitions. In this paper an analysis of different issues in the current definitions is provided, based on their incorporated elements. In line with this objective, the definitions are first collected and, after an analysis over them, broken into their constituent elements, which are shown in one table. Selected parameters in the table are then classified into groups for comparison purposes, and all parameters of each individual group are specified and compared with each other. This procedure is carried out for all groups in turn. Finally, a meta-model is developed for each group. The aim is not to accept or reject a specific definition, but rather to contrast the definitions and their respective constituent elements in order to construct a background for gaining better perceptions of software architecture, which in turn can benefit the introduction of an appropriate definition.

  5. Evaluation of Peak-Fitting Software for Gamma Spectrum Analysis

    CERN Document Server

    Zahn, Guilherme S; Moralles, Maurício

    2015-01-01

    In all applications of gamma-ray spectroscopy, one of the most important and delicate parts of the data analysis is the fitting of the gamma-ray spectra, where information such as the number of counts, the position of the centroid and the width, for instance, is associated with each peak of each spectrum. There is a wide choice of computer programs that perform this type of analysis, and the ones most commonly used in routine work are those that automatically locate and fit the peaks. The fit can be made in several different ways -- the most common are to fit a Gaussian function to each peak or simply to integrate the area under the peak, but some programs go far beyond this and include several small corrections to the simple Gaussian peak function in order to compensate for secondary effects. In this work several gamma-ray spectroscopy programs are compared in the task of finding and fitting the gamma-ray peaks in spectra taken with standard sources of $^{137}$Cs, $^{60}$Co, $^{133}$Ba and $^{152}$Eu. The results...
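
    The simplest of the fitting strategies mentioned, a Gaussian on a linear background, can be sketched with standard least squares; the spectrum below is synthetic, and production software adds low-energy tails, step functions and other corrections:

      import numpy as np
      from scipy.optimize import curve_fit

      def peak(x, area, centroid, sigma, a, b):
          """Gaussian peak plus linear background."""
          g = area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-(x - centroid) ** 2 / (2 * sigma ** 2))
          return g + a * x + b

      rng = np.random.default_rng(0)
      x = np.arange(600.0, 730.0)                       # channel numbers
      truth = (5000.0, 661.7, 2.5, -0.05, 80.0)          # e.g. a 137Cs-like line
      y = rng.poisson(peak(x, *truth)).astype(float)     # counting statistics

      p0 = (y.sum(), x[y.argmax()], 2.0, 0.0, y.min())   # crude initial guess
      popt, pcov = curve_fit(peak, x, y, p0=p0, sigma=np.sqrt(np.maximum(y, 1)))
      print("centroid = %.2f +/- %.2f" % (popt[1], np.sqrt(pcov[1, 1])))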

  6. Real-time development of data acquisition and analysis software for hands-on physiology education in neuroscience: G-PRIME.

    Science.gov (United States)

    Lott, Gus K; Johnson, Bruce R; Bonow, Robert H; Land, Bruce R; Hoy, Ronald R

    2009-01-01

    We report on the real-time creation of an application for hands-on neurophysiology in an advanced undergraduate teaching laboratory. Enabled by the rapid software development tools included in the Matlab technical computing environment (The Mathworks, Natick, MA), a team, consisting of a neurophysiology educator and a biophysicist trained as an electrical engineer, interfaced to a course of approximately 15 students from engineering and biology backgrounds. The result is the powerful freeware data acquisition and analysis environment, "g-PRIME." The software was developed from week to week in response to curriculum demands, and student feedback. The program evolved from a simple software oscilloscope, enabling RC circuit analysis, to a suite of tools supporting analysis of neuronal excitability and synaptic transmission analysis in invertebrate model systems. The program has subsequently expanded in application to university courses, research, and high school projects in the US and abroad as free courseware.

  7. HistFitter software framework for statistical data analysis

    CERN Document Server

    Baak, M.; Côte, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fitted to data and interpreted with statistical tests. A key innovation of HistFitter is its design, which is rooted in core analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its very fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with mu...

  8. The COMPTEL Processing and Analysis Software system (COMPASS)

    Science.gov (United States)

    de Vries, C. P.; COMPTEL Collaboration

    The data analysis system of the gamma-ray Compton Telescope (COMPTEL) onboard the Compton-GRO spacecraft is described. A continuous stream of data of the order of 1 kbyte per second is generated by the instrument. The data processing and analysis software is built around a relational database management system (RDBMS) in order to be able to trace the heritage and processing status of all data in the processing pipeline. Four institutes cooperate in this effort, requiring procedures to keep local RDBMS contents identical between the sites and swift exchange of data using network facilities. Lately, there has been a gradual move of the system from central processing facilities towards clusters of workstations.

  9. Asterias: A Parallelized Web-based Suite for the Analysis of Expression and aCGH Data

    Directory of Open Access Journals (Sweden)

    Ramón Díaz-Uriarte

    2007-01-01

    The analysis of expression and CGH arrays plays a central role in the study of complex diseases, especially cancer, including finding markers for early diagnosis and prognosis, choosing an optimal therapy, or increasing our understanding of cancer development and metastasis. Asterias (http://www.asterias.info) is an integrated collection of freely-accessible web tools for the analysis of gene expression and aCGH data. Most of the tools use parallel computing (via MPI) and run on a server with 60 CPUs for computation; compared to a desktop or server-based but not parallelized application, parallelization provides speed-ups of factors up to 50. Most of our applications allow the user to obtain additional information for user-selected genes (chromosomal location, PubMed ids, Gene Ontology terms, etc.) by using clickable links in tables and/or figures. Our tools include: normalization of expression and aCGH data (DNMAD); converting between different types of gene/clone and protein identifiers (IDconverter/IDClight); filtering and imputation (preP); finding differentially expressed genes related to patient class and survival data (Pomelo II); searching for models of class prediction (Tnasas); using random forests to search for minimal models for class prediction or for large subsets of genes with predictive capacity (GeneSrF); searching for molecular signatures and predictive genes with survival data (SignS); detecting regions of genomic DNA gain or loss (ADaCGH). The capability to send results between different applications, access to additional functional information, and parallelized computation make our suite unique and exploit features only available to web-based applications.

  10. Flexible software platform for fast-scan cyclic voltammetry data acquisition and analysis.

    Science.gov (United States)

    Bucher, Elizabeth S; Brooks, Kenneth; Verber, Matthew D; Keithley, Richard B; Owesson-White, Catarina; Carroll, Susan; Takmakov, Pavel; McKinney, Collin J; Wightman, R Mark

    2013-11-05

    Over the last several decades, fast-scan cyclic voltammetry (FSCV) has proved to be a valuable analytical tool for the real-time measurement of neurotransmitter dynamics in vitro and in vivo. Indeed, FSCV has found application in a wide variety of disciplines including electrochemistry, neurobiology, and behavioral psychology. The maturation of FSCV as an in vivo technique led users to pose increasingly complex questions that require a more sophisticated experimental design. To accommodate recent and future advances in FSCV application, our lab has developed High Definition Cyclic Voltammetry (HDCV). HDCV is an electrochemical software suite that includes data acquisition and analysis programs. The data collection program delivers greater experimental flexibility and better user feedback through live displays. It supports experiments involving multiple electrodes with customized waveforms. It is compatible with transistor-transistor logic-based systems that are used for monitoring animal behavior, and it enables simultaneous recording of electrochemical and electrophysiological data. HDCV analysis streamlines data processing with superior filtering options, seamlessly manages behavioral events, and integrates chemometric processing. Furthermore, analysis is capable of handling single files collected over extended periods of time, allowing the user to consider biological events on both subsecond and multiminute time scales. Here we describe and demonstrate the utility of HDCV for in vivo experiments.

  11. Visual data mining and analysis of software repositories

    NARCIS (Netherlands)

    Voinea, Lucian; Telea, Alexandru

    2007-01-01

    In this article we describe an ongoing effort to integrate information visualization techniques into the process of configuration management for software systems. Our focus is to help software engineers manage the evolution of large and complex software systems by offering them effective and efficie

  12. Detection and Quantification of Nitrogen Compounds in Martian Solid Samples by the Sample Analysis at Mars (SAM) Instrument Suite

    Science.gov (United States)

    Stern, Jennifer C.; Navarro-Gonzalez, Rafael; Freissinet, Caroline; McKay, Christopher P.; Archer, Paul Douglas; Buch, Arnaud; Eigenbrode, Jennifer L.; Franz, Heather; Glavin, Daniel Patrick; Ming, Douglas W.; Steele, Andrew; Szopa, Cyril; Wray, James J.; Conrad, Pamela Gales; Mahaffy, Paul R.

    2013-01-01

    The Sample Analysis at Mars (SAM) instrument suite on the Mars Science Laboratory (MSL) Curiosity Rover detected both reduced and oxidized nitrogen-bearing compounds during the pyrolysis of surface materials from three sites at Gale Crater. Preliminary detections of nitrogen species include NO, HCN, ClCN, CH3CN, and TFMA (trifluoro-N-methyl-acetamide). On Earth, nitrogen is a crucial bio-element, and nitrogen availability controls productivity in many environments. Nitrogen has also recently been detected in the form of CN in inclusions in the Martian meteorite Tissint, and isotopically heavy nitrogen (delta N-15 approx. +100 per mil) has been measured during stepped combustion experiments on several SNC meteorites. The detection of nitrogen-bearing compounds in Martian regolith would have important implications for the habitability of ancient Mars. However, confirmation of indigenous Martian nitrogen-bearing compounds will require ruling out their formation from the terrestrial derivatization reagents (e.g. N-methyl-N-tert-butyldimethylsilyl-trifluoroacetamide, MTBSTFA, and dimethylformamide, DMF) carried for SAM's wet chemistry experiment, which contribute to the SAM background. The nitrogen species we detect in the SAM solid-sample analyses can also be produced during laboratory pyrolysis experiments in which these reagents are heated in the presence of perchlorate, a compound that has also been identified by SAM in Mars solid samples. However, this does not preclude a Martian origin for some of these compounds, which are present in nanomolar concentrations in SAM evolved gas analyses. Analysis of SAM data and laboratory breadboard tests are underway to determine whether nitrogen species are present at higher concentrations than can be accounted for by maximum estimates of the nitrogen contribution from MTBSTFA and DMF. In addition, methods are currently being developed to use GC Column 6 (functionally similar to a commercial Q-Bond column) to separate and identify

  13. The PROOF benchmark suite measuring PROOF performance

    Science.gov (United States)

    Ryu, S.; Ganis, G.

    2012-06-01

    The PROOF benchmark suite is a new utility suite for PROOF that measures performance and scalability. The primary goal of the benchmark suite is to determine optimal configuration parameters for a set of machines to be used as a PROOF cluster. The suite measures the performance of the cluster for a set of standard tasks as a function of the number of effective processes. Cluster administrators can use the suite to measure the performance of the cluster and find optimal configuration parameters. PROOF developers can also use the suite to measure performance, identify problems, and improve their software. In this paper, the new tool is explained in detail and use cases are presented to illustrate its application.

  14. Integrated Software Environment for Pressurized Thermal Shock Analysis

    Directory of Open Access Journals (Sweden)

    Dino Araneo

    2011-01-01

    Full Text Available The present paper describes the main features, and an application to a real Nuclear Power Plant (NPP), of an Integrated Software Environment (in the following referred to as "platform") developed at the University of Pisa (UNIPI) to perform Pressurized Thermal Shock (PTS) analysis. The platform is written in Java for portability and implements all the steps foreseen in the methodology developed at UNIPI for the deterministic analysis of PTS scenarios. The methodology starts with the thermal hydraulic analysis of the NPP with a system code (such as Relap5-3D and Cathare2) during a selected transient scenario. The results so obtained are then processed to provide boundary conditions for the next step, that is, a CFD calculation. Once the system pressure and the RPV wall temperature are known, the stresses inside the RPV wall can be calculated by means of a Finite Element (FE) code. The last step of the methodology is the Fracture Mechanics (FM) analysis, using weight functions, aimed at evaluating the stress intensity factor (KI) at the crack tip, to be compared with the critical stress intensity factor KIc. The platform automates all these steps foreseen in the methodology once the user specifies a number of boundary conditions at the beginning of the simulation.
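
    A minimal sketch of the final fracture-mechanics step, assuming the classical Green's function for a through crack in an infinite plate (not the UNIPI platform's code; the crack size, toughness, and stress profile below are hypothetical):

        # Illustrative KI calculation via a weight (Green's) function, compared with KIc.
        import numpy as np
        from scipy.integrate import quad

        a = 0.01            # crack half-length [m] (assumed)
        K_Ic = 50e6         # fracture toughness [Pa*sqrt(m)] (assumed)

        def sigma(x):
            # crack-face stress profile from the FE step; uniform 100 MPa here
            return 100e6

        def green(x):
            # Green's function for a center crack, tip at x = +a
            return np.sqrt((a + x) / (a - x)) / np.sqrt(np.pi * a)

        K_I, _ = quad(lambda x: sigma(x) * green(x), -a, a, limit=200)
        print(f"K_I = {K_I/1e6:.1f} MPa*sqrt(m); safe: {K_I < K_Ic}")
        # For uniform stress this reproduces the textbook K_I = sigma*sqrt(pi*a).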

  15. Cost Analysis of Poor Quality Using a Software Simulation

    Directory of Open Access Journals (Sweden)

    Jana Fabianová

    2017-02-01

    Full Text Available The issues of quality, the cost of poor quality, and the factors affecting quality are crucial to maintaining competitiveness in business activities. The use of software applications and computer simulation enables more effective quality management. Simulation tools make it possible to incorporate the variability of several variables in experiments and to evaluate their combined impact on the final output. The article presents a case study on the use of Monte Carlo computer simulation in the field of quality management. Two approaches for determining the cost of poor quality are introduced. The first is retrospective: the cost of poor quality and the production process are calculated from historical data. The second approach uses the probabilistic characteristics of the input variables by means of simulation, and thus gives a prospective view of the costs of poor quality. Simulation output in the form of tornado and sensitivity charts complements the risk analysis.
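
    A minimal sketch of the prospective (simulation-based) approach, with all distributions and cost parameters invented for illustration:

        # Monte Carlo estimate of the cost of poor quality from probabilistic inputs.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 100_000
        units = 10_000                                   # annual production (assumed)
        defect_rate = rng.triangular(0.01, 0.02, 0.05, n)
        rework_cost = rng.normal(12.0, 2.0, n)           # cost per reworked unit
        scrap_share = rng.uniform(0.1, 0.3, n)           # share of defects scrapped
        scrap_cost = 40.0                                # cost per scrapped unit

        defects = units * defect_rate
        cost = defects * ((1 - scrap_share) * rework_cost + scrap_share * scrap_cost)
        print(f"mean cost of poor quality: {cost.mean():,.0f}")
        print(f"5th-95th percentile: {np.percentile(cost, [5, 95])}")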

  16. Development of the free-space optical communications analysis software

    Science.gov (United States)

    Jeganathan, Muthu; Mecherle, G. Stephen; Lesh, James R.

    1998-05-01

    The Free-space Optical Communication Analysis Software (FOCAS) was developed at the Jet Propulsion Laboratory (JPL) to provide mission planners, systems engineers and communications engineers with an easy-to-use tool to analyze direct-detection optical communication links. The FOCAS program is implemented in Microsoft Excel, which gives it all the power and flexibility built into the spreadsheet. An easy-to-use interface to the spreadsheet, developed using Visual Basic for Applications (VBA), allows easy input of data and parameters. A host of pre-defined components allows an analyst to configure a link without having to know the details of the components. FOCAS replaces OPTI, an over-a-decade-old FORTRAN program previously in wide use at JPL. This paper describes the features and capabilities of the Excel-spreadsheet-based FOCAS program.
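
    For illustration, a generic direct-detection link-budget calculation of the kind such a tool automates (not FOCAS itself; every value below is an assumption):

        # Received power = transmit power + aperture gains - space and system losses (dB).
        import math

        wavelength = 1.064e-6       # m
        range_m = 4.0e11            # spacecraft range [m] (assumed)
        p_tx_dbw = 10 * math.log10(5.0)                        # 5 W laser
        g_tx = 20 * math.log10(math.pi * 0.3 / wavelength)     # 0.3 m transmit aperture
        g_rx = 20 * math.log10(math.pi * 10.0 / wavelength)    # 10 m receive telescope
        space_loss = 20 * math.log10(wavelength / (4 * math.pi * range_m))
        losses_db = -6.0            # pointing, optics, atmosphere (lumped, assumed)

        p_rx_dbw = p_tx_dbw + g_tx + g_rx + space_loss + losses_db
        print(f"received power: {p_rx_dbw:.1f} dBW")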

  17. Analysis of Software Development Methodologies to Build Safety Software Applications for the SATEX-II: A Mexican Experimental Satellite

    Science.gov (United States)

    Aguilar Cisneros, Jorge; Vargas Martinez, Hector; Pedroza Melendez, Alejandro; Alonso Arevalo, Miguel

    2013-09-01

    Mexico is a country where the experience of building software for satellite applications is just beginning. This is a delicate situation because in the near future we will need to develop software for the SATEX-II (Mexican Experimental Satellite). SATEX-II is a project of SOMECyTA (the Mexican Society of Aerospace Science and Technology). We have experience applying software development methodologies such as TSP (Team Software Process) and SCRUM in other areas. We analyzed these methodologies and concluded that they can be applied to develop software for the SATEX-II; we also supported these methodologies with the ESA PSS-05-0 standard, in particular with ESA PSS-05-11. Our analysis focused on the main characteristics of each methodology and how these methodologies could be used together with the ESA PSS-05-0 standards. Our outcomes may, in general, be used by teams who need to build small satellites, but, in particular, they are going to be used when we build the on-board software applications for the SATEX-II.

  18. CancellationTools: All-in-one software for administration and analysis of cancellation tasks.

    Science.gov (United States)

    Dalmaijer, Edwin S; Van der Stigchel, Stefan; Nijboer, Tanja C W; Cornelissen, Tim H W; Husain, Masud

    2015-12-01

    In a cancellation task, a participant is required to search for and cross out ("cancel") targets, which are usually embedded among distractor stimuli. The number of cancelled targets and their location can be used to diagnose the neglect syndrome after stroke. In addition, the organization of search provides a potentially useful way to measure executive control over multitarget search. Although many useful cancellation measures have been introduced, most fail to make their way into research studies and clinical practice due to the practical difficulty of acquiring such parameters from traditional pen-and-paper measures. Here we present new, open-source software that is freely available to all. It allows researchers and clinicians to flexibly administer computerized cancellation tasks using stimuli of their choice, and to directly analyze the data in a convenient manner. The automated analysis suite provides output that includes almost all of the currently existing measures, as well as several new ones introduced here. All tasks can be performed using either a computer mouse or a touchscreen as an input device, and an online version of the task runtime is available for tablet devices. A summary of the results is produced in a single A4-sized PDF document, including high quality data visualizations. For research purposes, batch analysis of large datasets is possible. In sum, CancellationTools allows users to employ a flexible, computerized cancellation task, which provides extensive benefits and ease of use.

  19. Software for neutron activation analysis at reactor IBR-2, FLNP, JINR

    CERN Document Server

    Zlokazov, V B

    2004-01-01

    A Delphi program suite, developed for processing gamma-spectra of the induced activity of nuclei obtained from neutron activation measurements at the reactor IBR-2, FLNP, JINR, is reported. This suite contains components intended for carrying out all the operations of the analysis cycle, starting with a data acquisition program for gamma-spectrometers, Gamma (written in C++ Builder), and including Delphi programs for the subsequent steps of the analysis. (6 refs).

  20. The Application and Extension of Backward Software Analysis

    CERN Document Server

    Perisic, Aleksandar

    2010-01-01

    Backward software analysis is a method that emanates from executing a program backwards: instead of taking input data and following the execution path, we start from the output data and, by executing the program backwards command by command, analyze the data that could lead to the current output. The changed perspective forces a developer to think in a new way about the program. It can be applied as a thorough procedure or as a casual method. This method brings many advantages in testing and in algorithm and system analysis. For example, in testing the advantage is obvious if the set of output data is smaller than the set of possible inputs. For some programs or algorithms, we know the output data more precisely, so backward analysis can help in reducing the number of test cases or even in the strict verification of an algorithm. The difficulty lies in the fact that we need types of data that no programming language currently supports, so we need additional effort to understand how this method works, or what effort we need to ...
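
    As a toy illustration of the idea (our construction, not the paper's notation), a trivial program can be run "backwards" by enumerating the inputs that explain an observed output:

        # Forward execution vs. backward analysis of a tiny hypothetical program.
        def forward(x):
            return (x * 2 + 3) % 10   # the program under analysis (invented)

        def backward(y, domain=range(100)):
            # enumerate all inputs whose forward execution yields the output y
            return [x for x in domain if forward(x) == y]

        print(backward(7))   # candidate inputs that could explain output 7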

  1. STATISTICAL ANALYSIS ON SOFTWARE METRICS AFFECTING MODULARITY IN OPEN SOURCE SOFTWARE

    Directory of Open Access Journals (Sweden)

    Andi Wahju Rahardjo Emanuel

    2011-06-01

    Full Text Available Modularity has been identified by many researchers as one of the success factors of Open Source Software (OSS) projects. This modularity trait is influenced by software metrics such as size, complexity, cohesion, and coupling. In this research, we analyze the following software metrics: Size metrics (NCLOC, Lines, and Statements), Complexity metrics (McCabe's Cyclomatic Complexity), Cohesion metrics (LCOM4), and Coupling metrics (RFC, Afferent Coupling, and Efferent Coupling) of 59 Java-based OSS projects from Sourceforge.net. By assuming that the number of downloads can be used as an indication of the success of these projects, the OSS projects selected are those which have been downloaded more than 100,000 times. The software metrics reflecting the modularity of these projects are collected using the SONAR tool and then statistically analyzed using scatter graphs, Pearson product-moment correlation, and least-square-fit linear approximation. It can be shown that there are only three independent metrics reflecting modularity, namely NCLOC, LCOM4, and Afferent Coupling, whereas there is also one inconclusive result regarding Efferent Coupling.
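
    A sketch of the statistical procedure described above, using invented metric values in place of the SONAR measurements:

        # Pearson product-moment correlation plus a least-squares linear fit
        # between a modularity metric and the number of downloads.
        import numpy as np
        from scipy.stats import pearsonr

        ncloc = np.array([12e3, 45e3, 8e3, 60e3, 23e3])        # assumed values
        downloads = np.array([1.2e5, 4.0e5, 1.1e5, 6.5e5, 2.2e5])

        r, p = pearsonr(ncloc, downloads)
        slope, intercept = np.polyfit(ncloc, downloads, 1)     # least-squares fit
        print(f"Pearson r = {r:.2f} (p = {p:.3f}); fit: y = {slope:.2f}x + {intercept:.0f}")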

  2. Software use cases to elicit the software requirements analysis within the ASTRI project

    Science.gov (United States)

    Conforti, Vito; Antolini, Elisa; Bonnoli, Giacomo; Bruno, Pietro; Bulgarelli, Andrea; Capalbi, Milvia; Fioretti, Valentina; Fugazza, Dino; Gardiol, Daniele; Grillo, Alessandro; Leto, Giuseppe; Lombardi, Saverio; Lucarelli, Fabrizio; Maccarone, Maria Concetta; Malaguti, Giuseppe; Pareschi, Giovanni; Russo, Federico; Sangiorgi, Pierluca; Schwarz, Joseph; Scuderi, Salvatore; Tanci, Claudio; Tosti, Gino; Trifoglio, Massimo; Vercellone, Stefano; Zanmar Sanchez, Ricardo

    2016-07-01

    The Italian National Institute for Astrophysics (INAF) is leading the Astrofisica con Specchi a Tecnologia Replicante Italiana (ASTRI) project whose main purpose is the realization of small size telescopes (SST) for the Cherenkov Telescope Array (CTA). The first goal of the ASTRI project has been the development and operation of an innovative end-to-end telescope prototype using a dual-mirror optical configuration (SST-2M) equipped with a camera based on silicon photo-multipliers and very fast read-out electronics. The ASTRI SST-2M prototype has been installed in Italy at the INAF "M.G. Fracastoro" Astronomical Station located at Serra La Nave, on Mount Etna, Sicily. This prototype will be used to test several mechanical, optical, control hardware and software solutions which will be used in the ASTRI mini-array, comprising nine telescopes proposed to be placed at the CTA southern site. The ASTRI mini-array is a collaborative and international effort led by INAF and carried out by Italy, Brazil and South-Africa. We present here the use cases, through UML (Unified Modeling Language) diagrams and text details, that describe the functional requirements of the software that will manage the ASTRI SST-2M prototype, and the lessons learned thanks to these activities. We intend to adopt the same approach for the Mini-Array Software System that will manage the ASTRI mini-array operations. Use cases are of importance for the whole software life cycle; in particular they provide valuable support to the validation and verification activities. Following the iterative development approach, which breaks down the software development into smaller chunks, we have analysed the requirements, developed, and then tested the code in repeated cycles. The use case technique allowed us to formalize the problem through user stories that describe how the user procedurally interacts with the software system. Through the use cases we improved the communication among team members, fostered

  3. Thermal performance analysis of an antigravity suit

    Institute of Scientific and Technical Information of China (English)

    邱义芬; 李艳杰; 任兆生

    2009-01-01

    To study the thermal performance of an antigravity suit, a thermal system simulation model was built encompassing the human body, the antigravity suit, and the environment. The suit was divided into 15 segments, each with a different thermal resistance and gas permeability. The heat and mass transfer processes of the system were analyzed according to the suit's characteristics. The model can simulate human skin temperature, core temperature, and sweat rate; the combined index of heat stress (CIHS) was calculated from these parameters to evaluate the body's heat load. Experiments were designed to validate the model's results, and the differences between experimental and calculated results are small. Finally, the influences of the suit's thermal resistance and moisture permeability index on the body's heat load were analyzed with the model.

  4. Data-Interpolating Variational Analysis (DIVA) software : recent development and application

    Science.gov (United States)

    Watelet, Sylvain; Beckers, Jean-Marie; Barth, Alexander; Back, Örjan

    2016-04-01

    The Data-Interpolating Variational Analysis (DIVA) software is a tool designed to reconstruct a continuous field from discrete measurements. This method is based on the numerical implementation of the Variational Inverse Model (VIM), which consists of a minimization of a cost function, allowing the choice of the analysed field that fits the data sets best. The problem is solved efficiently using a finite-element method. This statistical method is particularly suited to dealing with irregularly-spaced observations, producing outputs on a regular grid. Initially created to work in a two-dimensional way, the software is now able to handle 3D or even 4D analysis, in order to easily produce ocean climatologies. These analyses can easily be improved by taking advantage of DIVA's ability to take topographic and dynamic constraints into account (coastal relief, prevailing wind impacting the advection, ...). DIVA is an open-source software which is continuously upgraded and distributed for free through frequent version releases. The development is funded by the EMODnet and SeaDataNet projects and includes many discussions and feedback from the user community. Here, we present two recent major upgrades: the data weighting option and the bottom-based analyses. Since DIVA works with a diagonal observation error covariance matrix, it is assumed that the observation errors are uncorrelated in space and time. In practice, this assumption is not always valid, especially when dealing e.g. with cruise measurements (same instrument) or with time series at a fixed geographic point (representativity error). The data weighting option proposes to decrease the weights of such observations in the analysis. These weights are based on an exponential function of a 3D (x,y,t) distance between observations. A comparison between non-weighted and weighted analyses will be shown. It has been a recurrent request from DIVA users to improve the way the analyses near the ocean bottom
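
    A sketch of the weighting idea as we read it (not DIVA's implementation; the correlation length and coordinates are assumptions):

        # Down-weight observations that are crowded in (x, y, t) via an
        # exponential function of their mutual distances.
        import numpy as np

        def weights(coords, L=1.0):
            # coords: (n, 3) array of (x, y, t); scale t beforehand if needed
            d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
            crowding = np.exp(-d / L).sum(axis=1)   # >= 1 (self term included)
            return 1.0 / crowding

        obs = np.array([[0.0, 0, 0], [0.1, 0, 0], [5.0, 5, 0]])  # two clustered, one isolated
        print(weights(obs))   # clustered points receive lower weight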

  5. RVA. 3-D Visualization and Analysis Software to Support Management of Oil and Gas Resources

    Energy Technology Data Exchange (ETDEWEB)

    Keefer, Donald A. [Univ. of Illinois, Champaign, IL (United States); Shaffer, Eric G. [Univ. of Illinois, Champaign, IL (United States); Storsved, Brynne [Univ. of Illinois, Champaign, IL (United States); Vanmoer, Mark [Univ. of Illinois, Champaign, IL (United States); Angrave, Lawrence [Univ. of Illinois, Champaign, IL (United States); Damico, James R. [Univ. of Illinois, Champaign, IL (United States); Grigsby, Nathan [Univ. of Illinois, Champaign, IL (United States)

    2015-12-01

    A free software application, RVA, has been developed as a plugin to the US DOE-funded ParaView visualization package, to provide support in the visualization and analysis of complex reservoirs being managed using multi-fluid EOR techniques. RVA, for Reservoir Visualization and Analysis, was developed as an open-source plugin to the 64-bit Windows version of ParaView 3.14. RVA was developed at the University of Illinois at Urbana-Champaign, with contributions from the Illinois State Geological Survey, Department of Computer Science and National Center for Supercomputing Applications. RVA was designed to utilize and enhance the state-of-the-art visualization capabilities within ParaView, readily allowing joint visualization of geologic framework and reservoir fluid simulation model results. Particular emphasis was placed on enabling visualization and analysis of simulation results highlighting multiple fluid phases, multiple properties for each fluid phase (including flow lines), multiple geologic models and multiple time steps. Additional advanced functionality was provided through the development of custom code to implement data mining capabilities. The built-in functionality of ParaView provides the capacity to process and visualize data sets ranging from small models on local desktop systems to extremely large models created and stored on remote supercomputers. The RVA plugin that we developed and the associated User Manual provide improved functionality through new software tools, and instruction in the use of ParaView-RVA, targeted to petroleum engineers and geologists in industry and research. The RVA web site (http://rva.cs.illinois.edu) provides an overview of functions, and the development web site (https://github.com/shaffer1/RVA) provides ready access to the source code, compiled binaries, user manual, and a suite of demonstration data sets. Key functionality has been included to support a range of reservoir visualization and analysis needs, including

  6. The CCP13 FibreFix program suite: semi-automated analysis of diffraction patterns from non-crystalline materials.

    Science.gov (United States)

    Rajkumar, Ganeshalingam; Al-Khayat, Hind A; Eakins, Felicity; Knupp, Carlo; Squire, John M

    2007-02-01

    The extraction of useful information from recorded diffraction patterns from non-crystalline materials is non-trivial and is not a well defined operation. Unlike protein crystallography where one expects to see well behaved diffraction spots in predictable positions defined by standard space groups, the diffraction patterns from non-crystalline materials are very diverse. They can range from uniaxially oriented fibre patterns which are completely sampled as Bragg peaks, but rotationally averaged around the fibre axis, to fibre patterns that are completely unsampled, to either kind of pattern with considerable axial misalignment (disorientation), to liquid-like order and even to mixtures of these various structure types. In the case of protein crystallography, the specimen is generated artificially and only used if the degree of order is sufficient to yield a three-dimensional density map of high enough resolution to be interpreted sensibly. However, with non-crystalline diffraction, many of the specimens of interest are naturally occurring (e.g. cellulose, rubber, collagen, muscle, hair, silk) and to elucidate their structure it is necessary to extract structural information from the materials as they actually are and to whatever resolution is available. Even when synthetic fibres are generated from purified components (e.g. nylon, polyethylene, DNA, polysaccharides, amyloids etc.) and diffraction occurs to high resolution, it is rarely possible to obtain perfect uniaxial alignment. The CCP13 project was established in the 1990s to generate software which will be generally useful for analysis of non-crystalline diffraction patterns. Various individual programs were written which allowed separate steps in the analysis procedure to be carried out. Many of these programs have now been integrated into a single user-friendly package known as FibreFix, which is freely downloadable from http://www.ccp13.ac.uk. Here the main features of FibreFix are outlined and some of

  7. Control software analysis, Part I Open-loop properties

    CERN Document Server

    Feron, Eric

    2008-01-01

    As the digital world enters further into everyday life, questions are raised about the increasing challenges brought by the interaction of real-time software with physical devices. Many accidents and incidents encountered in areas as diverse as medical systems, transportation systems or weapon systems are ultimately attributed to "software failures". Since real-time software that interacts with physical systems might as well be called control software, the long litany of accidents due to real-time software failures might be taken as an equally long list of opportunities for control systems engineering. In this paper, we are interested only in run-time errors in those pieces of software that are a direct implementation of control system specifications: For well-defined and well-understood control architectures such as those present in standard textbooks on digital control systems, the current state of theoretical computer science is well-equipped enough to address and analyze control algorithms. It appears tha...

  8. Analysis of Software Design Artifacts for Socio-Technical Aspects

    OpenAIRE

    Damaševičius, Robertas; Kaunas University of Technology

    2007-01-01

    Software systems are not purely technical objects. They are designed, constructed and used by people. Therefore, software design process is not purely a technical task, but a socio-technical process embedded within organizational and social structures. These social structures influence and govern their work behavior and final work products such as program source code and documentation. This paper discusses the organizational, social and psychological aspects of software design; and formulates...

  9. Software maintenance: an analysis of industrial needs and constraints

    OpenAIRE

    Haziza, Marc; Voidrot, Jean-François; Queille, Jean-Pierre; Pofelski, Lech; Blazy, Sandrine

    1992-01-01

    The results are given of a series of case studies conducted at different industrial sites in the framework of the ESF/EPSOM (Eureka Software Factory/European Platform for Software Maintenance) project. The approach taken in the case studies was to directly contact software maintainers and obtain their own view of their activity, mainly through the use of interactive methods based on group work. This approach is intended to complement statistical studies which can be found in the literature, b...

  10. The Financial Analysis System: An Integrated Software System for Financial Analysis and Modeling.

    Science.gov (United States)

    Groomer, S. Michael

    This paper discusses the Financial Analysis System (FAS), a software system for financial analysis, display, and modeling of the data found in the COMPUSTAT Annual Industrial, Over-the-Counter and Canadian Company files. The educational utility of FAS is also discussed briefly. (Author)

  11. How qualitative data analysis software may support the qualitative analysis process

    NARCIS (Netherlands)

    Peters, V.A.M.; Wester, F.P.J.

    2007-01-01

    The last decades have shown large progress in the elaboration of procedures for qualitative data analysis and in the development of computer programs to support this kind of analysis. We believe, however, that the link between methodology and computer software tools is too loose, especially for a no

  12. HistFitter software framework for statistical data analysis

    Science.gov (United States)

    Baak, M.; Besjes, G. J.; Côté, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-04-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface.

  13. Reduction EMI of BLDC Motor Drive Based on Software Analysis

    Directory of Open Access Journals (Sweden)

    Navid Mousavi

    2016-01-01

    Full Text Available In the BLDC motor-drive system, the leakage current from the motor to the ground network and the presence of high-frequency components in the DC link current are the most important sources of conducted interference. The leakage currents of the motors, flowing through the common ground, interfere with other equipment because of the high density of electrical and electronic systems in spacecraft and aircraft. Moreover, such systems generally share common DC buses, which aggravates the problem. Operation of the electric motor gives rise to high-frequency components in the DC link current, which can interfere with other subsystems. In this paper, the electromagnetic noise is analyzed and a method is proposed based on the frequency spectra of the DC link current and of the leakage current from the motor to the ground network. The proposed method presents a new filtering-based process to overcome EMI. To support the requirements analysis, the Maxwell software is used.

  14. Pattern recognition software and techniques for biological image analysis.

    Directory of Open Access Journals (Sweden)

    Lior Shamir

    Full Text Available The increasing prevalence of automated image acquisition systems is enabling new types of microscopy experiments that generate large image datasets. However, there is a perceived lack of robust image analysis systems required to process these diverse datasets. Most automated image analysis systems are tailored for specific types of microscopy, contrast methods, probes, and even cell types. This imposes significant constraints on experimental design, limiting their application to the narrow set of imaging methods for which they were designed. One of the approaches to address these limitations is pattern recognition, which was originally developed for remote sensing, and is increasingly being applied to the biology domain. This approach relies on training a computer to recognize patterns in images rather than developing algorithms or tuning parameters for specific image processing tasks. The generality of this approach promises to enable data mining in extensive image repositories, and provide objective and quantitative imaging assays for routine use. Here, we provide a brief overview of the technologies behind pattern recognition and its use in computer vision for biological and biomedical imaging. We list available software tools that can be used by biologists and suggest practical experimental considerations to make the best use of pattern recognition techniques for imaging assays.

  15. Pattern recognition software and techniques for biological image analysis.

    Science.gov (United States)

    Shamir, Lior; Delaney, John D; Orlov, Nikita; Eckley, D Mark; Goldberg, Ilya G

    2010-11-24

    The increasing prevalence of automated image acquisition systems is enabling new types of microscopy experiments that generate large image datasets. However, there is a perceived lack of robust image analysis systems required to process these diverse datasets. Most automated image analysis systems are tailored for specific types of microscopy, contrast methods, probes, and even cell types. This imposes significant constraints on experimental design, limiting their application to the narrow set of imaging methods for which they were designed. One of the approaches to address these limitations is pattern recognition, which was originally developed for remote sensing, and is increasingly being applied to the biology domain. This approach relies on training a computer to recognize patterns in images rather than developing algorithms or tuning parameters for specific image processing tasks. The generality of this approach promises to enable data mining in extensive image repositories, and provide objective and quantitative imaging assays for routine use. Here, we provide a brief overview of the technologies behind pattern recognition and its use in computer vision for biological and biomedical imaging. We list available software tools that can be used by biologists and suggest practical experimental considerations to make the best use of pattern recognition techniques for imaging assays.

  16. Research and Development of Statistical Analysis Software System of Maize Seedling Experiment

    Directory of Open Access Journals (Sweden)

    Hui Cao

    2014-03-01

    Full Text Available In this study, software engineering methods were used to develop a software system for the statistics and analysis of maize seedling experiments. During development, the B/S (browser/server) structure software design method was used, and a set of statistical indicators for maize seedling evaluation was established. The experimental results indicated that this software system could perform quality statistics and analysis for maize seedling experiments very well. The development of this software system explored a new method for the screening of maize seedlings.

  17. A Flexible Method for Producing F.E.M. Analysis of Bone Using Open-Source Software

    Science.gov (United States)

    Boppana, Abhishektha; Sefcik, Ryan; Meyers, Jerry G.; Lewandowski, Beth E.

    2016-01-01

    This project, performed in support of the NASA GRC Space Academy summer program, sought to develop an open-source workflow methodology that segmented medical image data, created a 3D model from the segmented data, and prepared the model for finite-element analysis. In an initial step, a technological survey evaluated the performance of various existing open-source software packages that claim to perform these tasks. However, the survey concluded that no single package exhibited the wide array of functionality required for the potential NASA application in the area of bone, muscle and biofluidic studies. As a result, the development of a series of Python scripts provided the bridging mechanism to address the shortcomings of the available open-source tools. The implementation of the VTK library provided the quickest and most effective means of segmenting regions of interest from the medical images; it allowed for the export of a 3D model by using the marching cubes algorithm to build a surface mesh. Developing the model domain from this extracted information required the surface mesh to be processed in the open-source software packages Blender and Gmsh. The Preview program of the FEBio suite proved to be sufficient for volume-filling the model with an unstructured mesh and preparing boundary specifications for finite element analysis. To fully allow FEM modeling, an in-house Python script allowed the assignment of material properties on an element-by-element basis by performing a weighted interpolation of the voxel intensities of the parent medical image, correlated to published relations between image intensity and material properties such as ash density. A graphical user interface combined the Python scripts and other software into a user-friendly interface. The work using Python scripts provides a potential alternative to expensive commercial software and inadequate, limited open-source freeware programs for the creation of 3D computational models. More work
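
    A minimal sketch of the segmentation-to-surface step described above, using VTK's marching cubes (the file names and iso-value are assumptions):

        # Extract a bone surface mesh from a medical image volume with VTK.
        import vtk

        reader = vtk.vtkNIFTIImageReader()
        reader.SetFileName("ct_volume.nii")        # hypothetical input scan

        mc = vtk.vtkMarchingCubes()
        mc.SetInputConnection(reader.GetOutputPort())
        mc.SetValue(0, 400)                        # iso-value for bone (assumed HU)

        writer = vtk.vtkSTLWriter()
        writer.SetInputConnection(mc.GetOutputPort())
        writer.SetFileName("bone_surface.stl")     # surface mesh for Blender/Gmsh/FEBio
        writer.Write()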

  18. Instant Spring Tool Suite

    CERN Document Server

    Chiang, Geoff

    2013-01-01

    Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks. A tutorial guide that walks you through how to use the features of Spring Tool Suite using well-defined sections for the different parts of Spring. Instant Spring Tool Suite is for novice to intermediate Java developers looking to get a head-start in enterprise application development using Spring Tool Suite and the Spring framework. If you are looking for a guide for effective application development using Spring Tool Suite, then this book is for you.

  19. TweezPal - Optical tweezers analysis and calibration software

    Science.gov (United States)

    Osterman, Natan

    2010-11-01

    Optical tweezers, a powerful tool for optical trapping, micromanipulation and force transduction, have in recent years become a standard technique commonly used in many research laboratories and university courses. Knowledge about the optical force acting on a trapped object can be gained only after a calibration procedure which has to be performed (by an expert) for each type of trapped object. In this paper we present TweezPal, a user-friendly, standalone Windows software tool for optical tweezers analysis and calibration. Using TweezPal, the procedure can be performed in a matter of minutes even by non-expert users. The calibration is based on the Brownian motion of a particle trapped in a stationary optical trap, which is monitored using video or photodiode detection. The particle trajectory is imported into the software, which instantly calculates the position histogram, trapping potential, stiffness and anisotropy. Program summary. Program title: TweezPal Catalogue identifier: AEGR_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGR_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 44 891 No. of bytes in distributed program, including test data, etc.: 792 653 Distribution format: tar.gz Programming language: Borland Delphi Computer: Any PC running Microsoft Windows Operating system: Windows 95, 98, 2000, XP, Vista, 7 RAM: 12 Mbytes Classification: 3, 4.14, 18, 23 Nature of problem: Quick, robust and user-friendly calibration and analysis of optical tweezers. The optical trap is calibrated from the trajectory of a trapped particle undergoing Brownian motion in a stationary optical trap (input data) using two methods. Solution method: Elimination of the experimental drift in position data. Direct calculation of the trap stiffness from the positional
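
    A sketch of the standard equipartition calibration that such tools implement, applied to a simulated trajectory (the stiffness and temperature values are assumptions):

        # Trap stiffness from the variance of a trapped bead's Brownian fluctuations.
        import numpy as np

        kB, T = 1.380649e-23, 295.0        # J/K, lab temperature (assumed)
        kappa_true = 1e-6                  # N/m, used only to simulate data
        x = np.random.default_rng(0).normal(0.0, np.sqrt(kB * T / kappa_true), 100_000)

        x = x - x.mean()                   # remove drift/offset
        kappa = kB * T / np.var(x)         # equipartition: kappa * <x^2> = kB * T
        print(f"estimated stiffness: {kappa*1e6:.3f} pN/um")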

  20. Research on Application of Enhanced Neural Networks in Software Risk Analysis

    Institute of Scientific and Technical Information of China (English)

    Zhenbang Rong; Juhua Chen; Mei Liu; Yong Hu

    2006-01-01

    This paper puts forward a risk analysis model for software projects using enhanced neural networks. The data for analysis are acquired through questionnaires from real software projects. To address the multicollinearity in software risks, the method of principal components analysis is adopted in the model to enhance network stability. To address the uncertainty of the neural network structure and of the initial weights, genetic algorithms are employed. The experimental results reveal that the precision of software risk analysis can be improved by using the enhanced neural network model.
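
    A sketch of the principal-components step, assuming scikit-learn in place of the paper's tools and an invented, deliberately collinear risk-factor matrix:

        # PCA to reduce multicollinearity among questionnaire risk factors
        # before they are fed to a neural network.
        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        base = rng.normal(size=(200, 3))
        # 8 risk factors built from 3 latent drivers, hence strongly collinear
        X = np.hstack([base, base @ rng.normal(size=(3, 5)) + 0.05 * rng.normal(size=(200, 5))])

        pca = PCA(n_components=0.95)       # keep components explaining 95% of variance
        Z = pca.fit_transform(X)           # decorrelated inputs for the network
        print(X.shape, "->", Z.shape, pca.explained_variance_ratio_.round(3))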

  1. Software Developed for the Reduction, Analysis and Presentation of MILOCSURVNORLANT Environmental Data,

    Science.gov (United States)

    seasonal and spatial dependence upon environmental factors. The major software for the reduction, analysis and presentation of MILOCSURVNORLANT 70 data, developed on an Elliott 503 computer, is described.

  2. Prototype Software for Automated Structural Analysis of Systems

    DEFF Research Database (Denmark)

    Jørgensen, A.; Izadi-Zamanabadi, Roozbeh; Kristensen, M.

    2004-01-01

    In this paper we present a prototype software tool developed to analyse the structural model of automated systems in order to identify redundant information that can then be utilized for Fault Detection and Isolation (FDI) purposes. The dedicated algorithms in this software tool use a tri

  3. A pattern framework for software quality assessment and tradeoff analysis

    NARCIS (Netherlands)

    Folmer, Eelke; Boscht, Jan

    2007-01-01

    The earliest design decisions often have a significant impact on software quality and are the most costly to revoke. One of the challenges in architecture design is to reduce the frequency of retrofit problems in software designs; not being able to improve the quality of a system cost effectively, a

  4. IFDOTMETER : A New Software Application for Automated Immunofluorescence Analysis

    NARCIS (Netherlands)

    Rodriguez-Arribas, Mario; Pizarro-Estrella, Elisa; Gomez-Sanchez, Ruben; Yakhine-Diop, S. M. S.; Gragera-Hidalgo, Antonio; Cristo, Alejandro; Bravo-San Pedro, Jose M.; Gonzalez-Polo, Rosa A.; Fuentes, Jose M.

    2016-01-01

    Most laboratories interested in autophagy use different imaging software for managing and analyzing heterogeneous parameters in immunofluorescence experiments (e.g., LC3-puncta quantification and determination of the number and size of lysosomes). One solution would be software that works on a user'

  5. An Analysis of Open Source Security Software Products Downloads

    Science.gov (United States)

    Barta, Brian J.

    2014-01-01

    Despite the continued demand for open source security software, a gap in the identification of success factors related to the success of open source security software persists. There are no studies that accurately assess the extent of this persistent gap, particularly with respect to the strength of the relationships of open source software…

  6. Multi-criteria decision analysis methods and software

    CERN Document Server

    Ishizaka, Alessio

    2013-01-01

    This book presents an introduction to MCDA followed by more detailed chapters about each of the leading methods used in this field. Comparison of methods and software is also featured to enable readers to choose the most appropriate method needed in their research. Worked examples as well as the software featured in the book are available on an accompanying website.

  7. Software safety analysis on the model specified by NuSCR and SMV input language at requirements phase of software development life cycle using SMV

    Energy Technology Data Exchange (ETDEWEB)

    Koh, Kwang Yong; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    2005-07-01

    The safety-critical software process is composed of a development process, a verification and validation (V and V) process and a safety analysis process. The safety analysis process has often been treated as an additional process and is not found in a conventional software process. But software safety analysis (SSA) is required if software is applied to a safety system, and the SSA shall be performed independently for the safety software throughout the software development life cycle (SDLC). Of all the phases in software development, requirements engineering is generally considered to play the most critical role in determining the overall software quality. NASA data demonstrate that nearly 75% of failures found in operational software were caused by errors in the requirements. The verification process in the requirements phase checks the correctness of the software requirements specification, and the safety analysis process analyzes the safety-related properties in detail. In this paper, a method for safety analysis at the requirements phase of the software development life cycle using the symbolic model verifier (SMV) is proposed. Hazards are discovered by hazard analysis, and in order to use SMV for the safety analysis, the safety-related properties are expressed in computation tree logic (CTL)
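
    As an illustration (the atomic propositions are hypothetical, not from the paper), a typical CTL safety property handed to a model checker such as SMV might read:

        \mathrm{AG}\,\big(\mathit{overpressure} \rightarrow \mathrm{AF}\,\mathit{trip\_signal}\big)

    that is, in every reachable state, an overpressure condition is always eventually followed by a trip signal along all execution paths.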

  8. Performance Analysis of Software Effort Estimation Models Using Neural Networks

    Directory of Open Access Journals (Sweden)

    P.Latha

    2013-08-01

    Full Text Available Software effort estimation involves estimating the effort required to develop software. Cost overruns and schedule overruns occur in software development due to wrong estimates made during the initial stage of development, so proper estimation is essential for the successful completion of a project. Many estimation techniques are available, among which neural-network-based techniques play a prominent role. The back-propagation network is the most widely used architecture; the Elman neural network, a recurrent network, can be used on a par with it. For a good predictor system, the difference between estimated effort and actual effort should be as low as possible. Data from historical NASA projects are used for training and testing. The experimental results confirm that the back-propagation algorithm is more efficient than the Elman neural network.
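
    A minimal sketch of neural-network effort estimation (not the paper's exact setup; the tiny dataset is invented for shape):

        # Back-propagation-trained MLP mapping project features to effort.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        X = np.array([[10, 2], [25, 3], [40, 4], [60, 4], [90, 5]])  # [KLOC, complexity]
        y = np.array([24, 62, 110, 170, 280])                        # person-months

        model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
        model.fit(X, y)
        print(model.predict([[50, 4]]))   # estimated effort for a new project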

  9. Software Aging Analysis of Web Server Using Neural Networks

    Directory of Open Access Journals (Sweden)

    G.Sumathi

    2012-05-01

    Full Text Available Software aging is a phenomenon that refers to progressive performance degradation, transient failures, or even crashes in long-running software systems such as web servers. It mainly occurs due to the deterioration of operating system resources, fragmentation and numerical error accumulation. A basic method to fight software aging is software rejuvenation, a proactive fault management technique aimed at cleaning up the system's internal state to prevent the occurrence of more severe crash failures in the future. It involves occasionally stopping the running software, cleaning its internal state and restarting it. An optimized schedule for performing the software rejuvenation has to be derived in advance, because a long-running application cannot be brought down at arbitrary times without wasted cost. This paper proposes a method to derive an accurate and optimized rejuvenation schedule for a web server (Apache) by using a Radial Basis Function (RBF) based feed-forward neural network, a variant of Artificial Neural Networks (ANN). Aging indicators are obtained through an experimental setup involving the Apache web server and clients, and act as input to the neural network model. This method is better than existing ones because the use of RBF leads to better accuracy and faster convergence.

  10. Hydra—The National Earthquake Information Center’s 24/7 seismic monitoring, analysis, catalog production, quality analysis, and special studies tool suite

    Science.gov (United States)

    Patton, John M.; Guy, Michelle R.; Benz, Harley M.; Buland, Raymond P.; Erickson, Brian K.; Kragness, David S.

    2016-08-18

    This report provides an overview of the capabilities and design of Hydra, the global seismic monitoring and analysis system used for earthquake response and catalog production at the U.S. Geological Survey National Earthquake Information Center (NEIC). Hydra supports the NEIC's worldwide earthquake monitoring mission in areas such as seismic event detection, seismic data insertion and storage, seismic data processing and analysis, and seismic data output. The Hydra system automatically identifies seismic phase arrival times and detects the occurrence of earthquakes in near-real time. The system integrates and inserts parametric and waveform seismic data into discrete events in a database for analysis. Hydra computes seismic event parameters, including locations, multiple magnitudes, moment tensors, and depth estimates. Hydra supports the NEIC's 24/7 analyst staff with a suite of seismic analysis graphical user interfaces. In addition to the NEIC's monitoring needs, the system supports the processing of aftershock and temporary deployment data, and supports the NEIC's quality assurance procedures. The Hydra system continues to be developed to expand its seismic analysis and monitoring capabilities.

  11. Object-oriented data handler for sequence analysis software development.

    Science.gov (United States)

    Ptitsyn, A A; Grigorovich, D A

    1995-12-01

    We report an object-oriented data handler and supplementary tools for the development of molecular genetics application software for various sequence analyses. Our data handler has a flexible and expandable format that supports the most common data types for molecular genetic software. New data types can be constructed in an object-oriented manner from the basic units. The data handler includes an object library, a format-converting program and a viewer that can visualize simultaneously the data contained in several files to construct a general picture from separate data. This software has been implemented on an IBM PC-compatible personal computer.

  12. Comparative analysis of results between CASMO, MCNP and Serpent for a suite of Benchmark problems on BWR reactors; Analisis comparativo de resultados entre CASMO, MCNP y SERPENT para una suite de problemas Benchmark en reactores BWR

    Energy Technology Data Exchange (ETDEWEB)

    Xolocostli M, J. V.; Vargas E, S.; Gomez T, A. M. [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico); Reyes F, M. del C.; Del Valle G, E., E-mail: vicente.xolocostli@inin.gob.mx [IPN, Escuela Superior de Fisica y Matematicas, UP - Adolfo Lopez Mateos, Edif. 9, 07738 Mexico D. F. (Mexico)

    2014-10-15

    In this paper a comparison is made between the CASMO-4, MCNP6 and Serpent codes in analyzing a suite of benchmark problems for BWR-type reactors. The benchmark problem consists of two different geometries: a fuel pin cell and a BWR-type assembly. To facilitate the study of reactor physics in the fuel pin, its nuclear characteristics are provided in detail, such as the burnup dependence, the reactivity of selected nuclides, etc. With respect to the fuel assembly, the results presented concern the infinite multiplication factor at different burnup steps and different void conditions. The analysis of this set of benchmark problems provides comprehensive test problems for the next generation of BWR fuels with extended burnup. It is important to note that the purpose of this comparison is to validate the methodologies used in the modeling of different operating conditions, should the case be that of another BWR assembly. The results will lie within a range with some uncertainty, considering that this does not depend on the code that is used. The Escuela Superior de Fisica y Matematicas of Instituto Politecnico Nacional (IPN, Mexico) has accumulated some experience in using Serpent, due to the potential of this code over other commercial codes such as CASMO and MCNP. The results obtained for the infinite multiplication factor are encouraging and motivate continuing these studies with the generation of the cross sections (XS) of a core, so that in a next step a respective nuclear data library can be constructed and used by codes developed as part of the project to develop the Mexican Platform for Analysis of Nuclear Reactors, AZTLAN. (Author)

  13. DEVELOPMENT OF EDUCATIONAL SOFTWARE FOR STRESS ANALYSIS OF AN AIRCRAFT WING

    Directory of Open Access Journals (Sweden)

    TAZKERA SADEQ

    2012-06-01

    Full Text Available Stress analysis software based on a MATLAB graphical user interface (GUI) has been developed. The software can be used to estimate the load on a wing and to compute the stresses at any point along the span of the wing of a given aircraft. The generalized formulation allows stress analysis to be performed even for a multispar (multicell) wing. The software is expected to be a useful tool for the effective teaching and learning of courses on aircraft structures and aircraft structural design.
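
    An illustrative sketch of the underlying computation, in Python rather than the MATLAB GUI described above, for bending stress sigma = M*y/I at stations along a wing half-span (all numbers are hypothetical):

        # Bending moment and stress along a half-span under an assumed
        # elliptical lift distribution w(x) = w0*sqrt(1 - (x/L)^2).
        import numpy as np

        L, W = 5.0, 8000.0                 # half-span [m], lift on half wing [N]
        y_max, I = 0.08, 2.0e-5            # section half-depth [m], second moment [m^4]
        w0 = 4.0 * W / (np.pi * L)         # scales the distribution so it sums to W

        def moment(station):
            # bending moment at a station from the lift outboard of it (numerical)
            x = np.linspace(station, L, 400)
            w = w0 * np.sqrt(1.0 - (x / L) ** 2)
            return np.trapz(w * (x - station), x)

        for si in np.linspace(0.0, L, 6):
            M = moment(si)
            print(f"station {si:4.1f} m: M = {M:9.1f} N*m, sigma = {M*y_max/I/1e6:6.2f} MPa")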

  14. Orbiter subsystem hardware/software interaction analysis. Volume 8: Forward reaction control system

    Science.gov (United States)

    Becker, D. D.

    1980-01-01

    The results of the orbiter hardware/software interaction analysis for the AFT reaction control system are presented. The interaction between hardware failure modes and software are examined in order to identify associated issues and risks. All orbiter subsystems and interfacing program elements which interact with the orbiter computer flight software are analyzed. The failure modes identified in the subsystem/element failure mode and effects analysis are discussed.

  15. Assurance Cases for Design Analysis of Complex System of Systems Software

    Science.gov (United States)

    2016-06-13

    2009 Carnegie Mellon University. Presented at the AIAA Infotech@Aerospace Conference, Software Assurance Session, 8 April 2009, by Stephen Blanchette, Jr.

  16. Specification Improvement Through Analysis of Proof Structure (SITAPS): High Assurance Software Development

    Science.gov (United States)

    2016-02-01

    Specification Improvement Through Analysis of Proof Structure (SITAPS): High Assurance Software Development. BAE Systems, February 2016; contract number FA8750-13-C-0240. General adoption of these techniques has had limited penetration in the software development community. Two interrelated causes may account for

  17. DEVELOPMENT OF EDUCATIONAL SOFTWARE FOR STRESS ANALYSIS OF AN AIRCRAFT WING

    OpenAIRE

    TAZKERA SADEQ; J. S. MOHAMMED ALI

    2012-01-01

    Stress analysis software based on a MATLAB graphical user interface (GUI) has been developed. The software can be used to estimate the load on a wing and to compute the stresses at any point along the span of the wing of a given aircraft. The generalized formulation allows stress analysis to be performed even for a multispar (multicell) wing. The software is expected to be a useful tool for the effective teaching and learning of courses on aircraft structures and aircraft structural design.

  18. The ERP System for an Effective Management of a Small Software Company – Requirements Analysis

    Directory of Open Access Journals (Sweden)

    Jan Mittner

    2014-01-01

    Full Text Available As a questionnaire survey found, a significant share of small software companies are not satisfied with the way their company processes are supported by software systems. To change this situation, it is first necessary to specify the requirements for such software systems in small software companies. Based on an analysis of the literature and the market, and on our own experience, the first version of the ERP system requirements specification for small software companies was framed and subsequently validated by interviewing the executives of the target-group companies.

  19. Suite of Benchmark Tests to Conduct Mesh-Convergence Analysis of Nonlinear and Non-constant Coefficient Transport Codes

    Science.gov (United States)

    Zamani, K.; Bombardelli, F. A.

    2014-12-01

    Verification of geophysics codes is imperative to avoid serious academic as well as practical consequences. In cases where access to a given source code is not possible, the Method of Manufactured Solutions (MMS) cannot be employed in code verification. In contrast, employing the Method of Exact Solutions (MES) has several practical advantages. In this research, we first provide four new one-dimensional analytical solutions designed for code verification; these solutions are able to uncover particular imperfections in solvers of the advection-diffusion-reaction equation, such as nonlinear advection, diffusion or source terms, as well as non-constant-coefficient equations. After that, we provide a solution of Burgers' equation in a novel setup. The proposed solutions satisfy the continuity of mass for the ambient flow, which is a crucial factor for coupled hydrodynamics-transport solvers. We then use the derived analytical solutions for code verification. To clarify gray-literature issues in the verification of transport codes, we designed a comprehensive test suite to uncover any imperfection in transport solvers via a hierarchical increase in the level of the tests' complexity. The test suite includes hundreds of unit tests and system tests to check individual portions of the code. The tests in the suite start with a simple case of unidirectional advection, then bidirectional advection and tidal flow, and build up to nonlinear cases. We design tests to check nonlinearity in velocity, dispersivity and reactions. The concealing effect of scales (Peclet and Damkohler numbers) on the mesh-convergence study and appropriate remedies are also discussed. For the cases in which appropriate benchmarks for a mesh-convergence study are not available, we utilize symmetry. Auxiliary subroutines for the automation of the test suite and for report generation are designed. All in all, the test package is not only a robust tool for code verification but it also provides comprehensive
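
    A sketch of the core mesh-convergence computation such a suite automates: the observed order of accuracy from errors on successively refined grids (the error values below are illustrative, not from any code):

        # Observed order of accuracy p = log(e_coarse/e_fine) / log(h_coarse/h_fine).
        import math

        h = [0.1, 0.05, 0.025]           # grid spacings, refinement ratio r = 2
        err = [4.1e-3, 1.05e-3, 2.7e-4]  # L2 errors vs. the exact (MES) solution

        for k in range(len(h) - 1):
            p = math.log(err[k] / err[k + 1]) / math.log(h[k] / h[k + 1])
            print(f"observed order between h={h[k]} and h={h[k+1]}: p = {p:.2f}")
        # p should approach the scheme's formal order (e.g., 2) as h -> 0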

  20. BEANS - a software package for distributed Big Data analysis

    CERN Document Server

    Hypki, Arkadiusz

    2016-01-01

    BEANS software is a web-based, easy to install and maintain, new tool to store and analyse massive amounts of data in a distributed way. It provides a clear interface for querying, filtering, aggregating, and plotting data from an arbitrary number of datasets. Its main purpose is to simplify the process of storing, examining and finding new relations in so-called Big Data. Creation of the BEANS software is an answer to the growing need of the astronomical community for a versatile tool to store, analyse and compare complex astrophysical numerical simulations with observations (e.g. simulations of the Galaxy or star clusters with the Gaia archive). However, the software was built in a general form, is open source, and is ready for use in any other research field.

  1. A Software Risk Analysis Model Using Bayesian Belief Network

    Institute of Scientific and Technical Information of China (English)

    Yong Hu; Juhua Chen; Mei Liu; Yang Yun; Junbiao Tang

    2006-01-01

    The uncertainty during the period of software project development often brings huge risks to contractors and clients. If we can find an effective method to predict the cost and quality of software projects based on facts such as the project character and the two sides' cooperating capability at the beginning of the project, we can reduce the risk. A Bayesian Belief Network (BBN) is a good tool for analyzing uncertain consequences, but it is difficult to produce a precise network structure and conditional probability tables. In this paper, we built the network structure by the Delphi method, and the conditional probability tables and the nodes' confidence levels are learned and updated continuously from application cases, which gives the evaluation network learning abilities and lets it evaluate the software development risk of an organization more accurately. The paper also introduces the EM algorithm, which enhances the ability to handle hidden nodes caused by variant software projects.
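
    As a toy illustration of the kind of inference such a network supports, the posterior over a parent node given an observed risk level follows directly from the conditional probability table; the node names and probabilities below are invented for the example, not taken from the paper.

    import numpy as np

    # Toy two-node network: CooperationLevel -> ProjectRisk (names hypothetical)
    p_coop = np.array([0.6, 0.4])               # P(coop = good, poor)
    p_risk_given_coop = np.array([[0.9, 0.1],   # P(risk = low, high | good)
                                  [0.3, 0.7]])  # P(risk = low, high | poor)

    # Prior predictive: P(risk) = sum_c P(risk | c) P(c)
    p_risk = p_coop @ p_risk_given_coop
    print("P(risk high) =", p_risk[1])

    # Diagnostic inference by Bayes' rule: P(coop | risk = high)
    posterior = p_coop * p_risk_given_coop[:, 1] / p_risk[1]
    print("P(coop poor | risk high) =", posterior[1])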

  2. Network-based analysis of software change propagation.

    Science.gov (United States)

    Wang, Rongcun; Huang, Rubing; Qu, Binbin

    2014-01-01

    Object-oriented software systems frequently evolve to meet new change requirements. Understanding the characteristics of changes aids testers and system designers in improving the quality of software. Identifying important modules becomes a key issue in the process of evolution. In this context, a novel network-based approach is proposed to comprehensively investigate change distributions and the correlation between centrality measures and the scope of change propagation. First, software dependency networks are constructed at the class level. Then, the number of co-changes among classes is mined from software repositories. From the dependency relationships and the number of co-changes among classes, the scope of change propagation is calculated, and Spearman rank correlation is used to analyze the correlation between centrality measures and the scope of change propagation. Three case studies on the Java open source projects FindBugs, Hibernate, and Spring are conducted to research the characteristics of change propagation. Experimental results show that (i) the change distribution is very uneven; and (ii) PageRank, Degree, and CIRank are significantly correlated with the scope of change propagation. In particular, CIRank shows a higher correlation coefficient, which suggests it can be a more useful indicator for measuring the scope of change propagation of classes in object-oriented software systems.
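
    The correlation step of such an approach is straightforward to reproduce in outline with networkx and SciPy; the dependency edges and propagation scores below are invented placeholders, and CIRank, the authors' own metric, is not implemented here.

    import networkx as nx
    from scipy.stats import spearmanr

    # Hypothetical class-level dependency network
    g = nx.DiGraph([("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "A")])

    pagerank = nx.pagerank(g)
    degree = dict(g.degree())

    # Hypothetical scope-of-change-propagation scores mined from co-changes
    propagation = {"A": 9, "B": 4, "C": 7, "D": 3}

    nodes = sorted(g.nodes())
    for name, metric in [("PageRank", pagerank), ("Degree", degree)]:
        rho, p = spearmanr([metric[n] for n in nodes],
                           [propagation[n] for n in nodes])
        print(f"{name}: rho = {rho:.2f} (p = {p:.2f})")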

  3. Army-NASA aircrew/aircraft integration program: Phase 4 A(3)I Man-Machine Integration Design and Analysis System (MIDAS) software detailed design document

    Science.gov (United States)

    Banda, Carolyn; Bushnell, David; Chen, Scott; Chiu, Alex; Constantine, Betsy; Murray, Jerry; Neukom, Christian; Prevost, Michael; Shankar, Renuka; Staveland, Lowell

    1991-01-01

    The Man-Machine Integration Design and Analysis System (MIDAS) is an integrated suite of software components that constitutes a prototype workstation to aid designers in applying human factors principles to the design of complex human-machine systems. MIDAS is intended to be used at the very early stages of conceptual design to provide an environment wherein designers can use computational representations of the crew station and operator, instead of hardware simulators and man-in-the-loop studies, to discover problems and ask 'what if' questions regarding the projected mission, equipment, and environment. This document is the Software Product Specification for MIDAS. Introductory descriptions of the processing requirements, hardware/software environment, structure, I/O, and control are given in the main body of the document for the overall MIDAS system, with detailed discussion of the individual modules included in Annexes A-J.

  4. Inequalities in Open Source Software Development: Analysis of Contributor's Commits in Apache Software Foundation Projects.

    Science.gov (United States)

    Chełkowski, Tadeusz; Gloor, Peter; Jemielniak, Dariusz

    2016-01-01

    While researchers are becoming increasingly interested in studying the OSS phenomenon, there is still only a small number of studies analyzing larger samples of projects to investigate the structure of activities among OSS developers. The significant amount of information gathered in publicly available open-source software repositories and mailing-list archives offers an opportunity to analyze project structures and participant involvement. In this article, using commit data from 263 Apache project repositories (nearly all of them), we show that although OSS development is often described as collaborative, it in fact predominantly relies on radically solitary input and individual, non-collaborative contributions. We also show, in the first published study of this magnitude, that the engagement of contributors follows a power-law distribution.

  5. Analyzing the State of Static Analysis: A Large-Scale Evaluation in Open Source Software

    NARCIS (Netherlands)

    Beller, M.; Bholanath, R.; McIntosh, S.; Zaidman, A.E.

    2016-01-01

    The use of automatic static analysis has been a software engineering best practice for decades. However, we still do not know a lot about its use in real-world software projects: How prevalent is the use of Automated Static Analysis Tools (ASATs) such as FindBugs and JSHint? How do developers use these tools?

  6. Evaluation of Distribution Analysis Software for DER Applications

    Energy Technology Data Exchange (ETDEWEB)

    Staunton, RH

    2003-01-23

    unstoppable. In response, energy providers will be forced to both fully acknowledge the trend and plan for accommodating DER [3]. With bureaucratic barriers [4], lack of time/resources, tariffs, etc. still seen in certain regions of the country, changes still need to be made. Given continued technical advances in DER, the time is fast approaching when the industry, nation-wide, must not only accept DER freely but also provide or review in-depth technical assessments of how DER should be integrated into and managed throughout the distribution system. Characterization studies are needed to fully understand how both the utility system and DER devices themselves will respond to all reasonable events (e.g., grid disturbances, faults, rapid growth, diverse and multiple DER systems, large reactive loads). Some of this work has already begun as it relates to operation and control of DER [5] and microturbine performance characterization [6,7]. One of the most urgently needed tools that can provide these types of analyses is a distribution network analysis program in combination with models for various DER. Together, they can be used for (1) analyzing DER placement in distribution networks and (2) helping to ensure that adequate transmission reliability is maintained. Surveys of the market show products that represent a partial match to these needs; specifically, software that has been developed to plan electrical distribution systems and analyze reliability (in a near total absence of DER). The first part of this study (Sections 2 and 3 of the report) looks at a number of these software programs and provides both summary descriptions and comparisons. The second part of this study (Section 4 of the report) considers the suitability of these analysis tools for DER studies. It considers steady state modeling and assessment work performed by ORNL using one commercially available tool on feeder data provided by a southern utility. Appendix A provides a technical report on the results of

  7. The Effects of Development Team Skill on Software Product Quality

    Science.gov (United States)

    Beaver, Justin M.; Schiavone, Guy A.

    2006-01-01

    This paper provides an analysis of the effect of the skill and experience of the software development team on the quality of the final software product. A method for the assessment of software development team skill and experience is proposed, derived from a workforce management tool currently in use by the National Aeronautics and Space Administration. Using data from 26 small-scale software development projects, the team skill measures are correlated to 5 software product quality metrics from the ISO/IEC 9126 Software Engineering Product Quality standard. In the analysis of the results, development team skill is found to be a significant factor in the adequacy of the design and implementation. In addition, the results imply that inexperienced software developers are tasked with responsibilities ill-suited to their skill level, and thus have a significant adverse effect on the quality of the software product. Keywords: software quality, development skill, software metrics.

  8. Instrument-independent software tools for the analysis of MS-MS and LC-MS lipidomics data.

    Science.gov (United States)

    Haimi, Perttu; Chaithanya, Krishna; Kainu, Ville; Hermansson, Martin; Somerharju, Pentti

    2009-01-01

    Mass spectrometry (MS), particularly electrospray-MS, is the key tool in modern lipidomics. However, as even a modest-scale experiment produces a great amount of data, data processing often becomes limiting. Notably, the software provided with MS instruments is not well suited for quantitative analysis of lipidomes because of the great variety of species present and complexities in response calibration. Here we describe the use of two recently introduced software tools: lipid mass spectrum analysis (LIMSA) and spectrum extraction from chromatographic data (SECD), which significantly increase the speed and reliability of mass spectrometric analysis of complex lipidomes. LIMSA is a Microsoft Excel add-on that (1) finds and integrates the peaks in an imported spectrum, (2) identifies the peaks, (3) corrects the peak areas for overlap by isotopic peaks of other species and (4) quantifies the identified species using included internal standards. LIMSA is instrument-independent because it processes text-format MS spectra. Typically, the analysis of one spectrum takes only a few seconds. The SECD software allows one to display MS chromatograms as two-dimensional maps, which is useful for visual inspection of the data. More importantly, however, SECD allows one to extract mass spectra from user-defined regions of the map for further analysis with, e.g., LIMSA. The use of selected regions rather than simple time-range averaging significantly improves the signal-to-noise ratio, as signals outside the region of interest are more efficiently excluded. LIMSA and SECD have proven to be robust and convenient tools and are available free of charge from the authors.
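
    Step (3), correcting peak areas for isotopic overlap, is essentially a sequential stripping of each species' M+1 and M+2 contributions from the heavier peaks; the sketch below is schematic, with invented areas and isotope abundances, and is not LIMSA's actual code.

    # Peak areas by nominal m/z (values invented for illustration)
    areas = {700: 1000.0, 701: 620.0, 702: 310.0, 703: 90.0}

    # Relative isotopic abundances (M+1, M+2) for the species whose
    # monoisotopic peak sits at each m/z; placeholders, not real patterns
    isotope_pattern = {700: (0.45, 0.12), 701: (0.46, 0.12), 702: (0.46, 0.13)}

    corrected = {}
    for mz in sorted(areas):
        mono = areas[mz]              # already stripped of lighter species
        corrected[mz] = mono
        # Remove this species' M+1 and M+2 contributions from heavier peaks
        for offset, frac in enumerate(isotope_pattern.get(mz, ()), start=1):
            if mz + offset in areas:
                areas[mz + offset] -= mono * frac

    print(corrected)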

  9. Using recurrence plot analysis for software execution interpretation and fault detection

    Science.gov (United States)

    Mosdorf, M.

    2015-09-01

    This paper shows a method targeted at software execution interpretation and fault detection using recurrence plot analysis. In the proposed approach, recurrence plot analysis is applied to a software execution trace that contains the executed assembly instructions. Results of this analysis are subject to further processing with the PCA (Principal Component Analysis) method, which reduces the number of coefficients used for software execution classification. The method was used for the analysis of five algorithms: Bubble Sort, Quick Sort, Median Filter, FIR, and SHA-1. Results show that some of the collected traces could be easily assigned to particular algorithms (logs from the Bubble Sort and FIR algorithms) while others are more difficult to distinguish.
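
    The pipeline described above (recurrence plots of an execution trace followed by PCA) can be sketched compactly; the traces below are random stand-ins for instruction streams, and the threshold eps is an arbitrary choice.

    import numpy as np

    def recurrence_matrix(trace, eps):
        """Binary recurrence plot: R[i, j] = 1 if states i and j are close."""
        d = np.abs(trace[:, None] - trace[None, :])
        return (d < eps).astype(float)

    # Stand-ins for execution traces of instruction codes (invented data)
    rng = np.random.default_rng(0)
    traces = [rng.integers(0, 16, size=64).astype(float) for _ in range(10)]

    # One feature vector per trace: its flattened recurrence plot
    features = np.array([recurrence_matrix(t, eps=2.0).ravel() for t in traces])

    # PCA via SVD of the centered feature matrix
    centered = features - features.mean(axis=0)
    _, s, vt = np.linalg.svd(centered, full_matrices=False)
    components = centered @ vt[:2].T   # first two principal coordinates
    print(components.shape)            # (10, 2)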

  10. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Science.gov (United States)

    Pakarinen, Jyri

    2010-12-01

    Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.

  11. EDL Sensor Suite Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Optical Air Data Systems (OADS) L.L.C. proposes a LIDAR based remote measurement sensor suite capable of satisfying a significant number of the desired sensing...

  12. A Method for Software Requirement Volatility Analysis Using QFD

    Directory of Open Access Journals (Sweden)

    Yunarso Anang

    2016-10-01

    Full Text Available Changes to software requirements are inevitable during the development life cycle. Rather than avoiding the circumstance, it is easier to accept it and find a way to anticipate those changes. This paper proposes a method to analyze the volatility of requirements by using the Quality Function Deployment (QFD) method and an introduced degree of volatility. Customer requirements are deployed to software functions and subsequently to architectural design elements. Then, after determining the potential for change of the design elements, the degree of volatility of the software requirements is calculated. In this paper the method is described using a flow diagram, illustrated using a simple example, and evaluated using a case study.
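
    Read as QFD relationship matrices, the two deployments let a degree of volatility be propagated back from design elements to requirements by weighted matrix products; the sketch below is one plausible reading of that idea, with illustrative weights rather than the paper's data.

    import numpy as np

    # Relationship matrices (requirements -> functions -> design elements),
    # with strengths in the usual 0/1/3/9 QFD convention; values invented
    req_to_func = np.array([[9, 3, 0],
                            [1, 9, 3]])
    func_to_design = np.array([[9, 1],
                               [3, 9],
                               [0, 3]])

    # Estimated potential for change of each design element (0..1, invented)
    design_volatility = np.array([0.7, 0.2])

    # Propagate volatility back to the requirements and normalize
    raw = req_to_func @ func_to_design @ design_volatility
    degree_of_volatility = raw / raw.max()
    print(degree_of_volatility)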

  13. Risk Analysis and Decision-Making Software Package (1997 Version) User Manual

    Energy Technology Data Exchange (ETDEWEB)

    Chung, F.T.H.

    1999-02-11

    This manual provides instructions for using the U.S. Department of Energy's (DOE) risk analysis and decision-making software (1997 version) developed at BDM Petroleum Technologies by BDM-Oklahoma, Inc. for DOE under contract No. DE-AC22-94PC91008. This software provides petroleum producers with a simple, handy tool for exploration and production risk analysis and decision-making. It collects useful risk analysis tools in one package so that users do not have to use several programs separately. The software is simple to use, but still provides many functions. The 1997 version of the software package includes the following tools: (1) investment risk (gambler's ruin) analysis; (2) Monte Carlo simulation; (3) best fit for distribution functions; (4) sample and rank correlation; (5) enhanced oil recovery method screening; and (6) an artificial neural network. This software package is subject to change. Suggestions and comments from users are welcome and will be considered for future modifications and enhancements of the software. Please check the opening screen of the software for the current contact information. In the future, more tools will be added to this software package. This manual includes instructions on how to use the software but does not attempt to fully explain the theory and algorithms used to create it.
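
    Two of the listed tools combine naturally; for instance, a Monte Carlo estimate of the gambler's-ruin probability for a producer that wins or loses one unit of capital per venture. The parameters below are invented for illustration and do not come from the manual.

    import random

    def ruin_probability(capital, target, p_win, trials=20000, seed=7):
        """Monte Carlo gambler's-ruin estimate: chance of hitting 0 before
        reaching the target, betting one capital unit per venture."""
        rng = random.Random(seed)
        ruined = 0
        for _ in range(trials):
            c = capital
            while 0 < c < target:
                c += 1 if rng.random() < p_win else -1
            ruined += (c == 0)
        return ruined / trials

    # For p_win != 0.5 the exact answer is (r**i - r**N) / (1 - r**N)
    # with r = (1 - p_win) / p_win, start i, target N -- handy as a check.
    print(ruin_probability(capital=5, target=20, p_win=0.55))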

  14. The Social Construction of the Software Operation

    DEFF Research Database (Denmark)

    Frederiksen, Helle Damborg; Rose, Jeremy

    2003-01-01

    … be analyzed using structuration theory. This structurational analysis showed that the company’s software operation followed an easily recognizable and widely understood pattern. The software operation was organized in terms of development projects leading to applications that then needed maintenance … challenge the underlying social practice of the software operation, the metrics program reinforced it by adopting the same underlying values. Our conclusion is that, under these circumstances, metrics programs are unlikely to result in radical changes to the software operation, and are best suited to small…

  15. Thematic Review and Analysis of Grounded Theory Application in Software Engineering

    Directory of Open Access Journals (Sweden)

    Omar Badreddin

    2013-01-01

    Full Text Available We present metacodes, a new concept to guide grounded theory (GT research in software engineering. Metacodes are high level codes that can help software engineering researchers guide the data coding process. Metacodes are constructed in the course of analyzing software engineering papers that use grounded theory as a research methodology. We performed a high level analysis to discover common themes in such papers and discovered that GT had been applied primarily in three software engineering disciplines: agile development processes, geographically distributed software development, and requirements engineering. For each category, we collected and analyzed all grounded theory codes and created, following a GT analysis process, what we call metacodes that can be used to drive further theory building. This paper surveys the use of grounded theory in software engineering and presents an overview of successes and challenges of applying this research methodology.

  16. Reference Management Software: A Comparative Analysis of Four Products

    Science.gov (United States)

    Gilmour, Ron; Cobus-Kuo, Laura

    2011-01-01

    Reference management (RM) software is widely used by researchers in the health and natural sciences. Librarians are often called upon to provide support for these products. The present study compares four prominent RMs: CiteULike, RefWorks, Mendeley, and Zotero, in terms of features offered and the accuracy of the bibliographies that they…

  17. Graph based communication analysis for hardware/software codesign

    DEFF Research Database (Denmark)

    Knudsen, Peter Voigt; Madsen, Jan

    1999-01-01

    In this paper we present a coarse grain CDFG (Control/Data Flow Graph) model suitable for hardware/software partitioning of single processes and demonstrate how it is necessary to perform various transformations on the graph structure before partitioning in order to achieve a structure that allows...

  18. A Comparative Analysis of Software Engineering with Knowledge Engineering

    Directory of Open Access Journals (Sweden)

    J. F. Vijay

    2010-01-01

    Full Text Available Problem statement: Software engineering is not only a technical discipline of its own. It is also a problem domain where technologies coming from other disciplines are relevant and can play an important role. One important example is knowledge engineering, a term that we use in the broad sense to encompass artificial intelligence, computational intelligence, knowledge bases, data mining and machine learning. We see a number of typical software development issues that can benefit from these disciplines and, for the sake of clarifying the discussion, we have divided them into four categories: (1) planning, monitoring and quality control of projects; (2) quality and process improvement of software organizations; (3) decision-making support; and (4) automation. Approach: First, the planning, monitoring and quality control of software development is typically based, unless it is entirely ad hoc, on past project data and/or expert opinion. Results: Several techniques coming from machine learning, computational intelligence and knowledge-based systems have been shown to be useful in this context. Second, software organizations are inherently learning organizations that need to improve, based on experience and project feedback, the way they develop software in changing and volatile environments. Large amounts of data, numerous documents and other forms of information are typically gathered on projects. The question then becomes how to enable the intelligent storage and use of such information in future projects. Third, during the course of a project, software engineers and managers have to face important, complex decisions. They need decision models to support them, especially when project pressure is intense. Techniques originally developed for building risk models based on expert elicitation or optimization heuristics can play a key role in such a context. The last category of applications concerns automation. Many automation problems, such as test data

  19. Modular reweighting software for statistical mechanical analysis of biased equilibrium data

    Science.gov (United States)

    Sindhikara, Daniel J.

    2012-07-01

    Here a simple, useful, modular approach and software suite designed for statistical reweighting and analysis of equilibrium ensembles is presented. Statistical reweighting is useful and sometimes necessary for the analysis of equilibrium enhanced-sampling methods, such as umbrella sampling or replica exchange, and also in experimental cases where biasing factors are explicitly known. Essentially, statistical reweighting allows extrapolation of data from one or more equilibrium ensembles to another. Here, the fundamental separable steps of statistical reweighting are broken up into modules, allowing for application to the general case and avoiding the black-box nature of some “all-inclusive” reweighting programs. Additionally, the programs included are, by design, written with few dependencies. The compilers required are either pre-installed on most systems or freely available for download with minimal trouble. Examples of the use of this suite applied to umbrella sampling and replica exchange molecular dynamics simulations will be shown, along with advice on how to apply it in the general case.
    New version program summary
    Program title: Modular reweighting version 2
    Catalogue identifier: AEJH_v2_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJH_v2_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU General Public License, version 3
    No. of lines in distributed program, including test data, etc.: 179 118
    No. of bytes in distributed program, including test data, etc.: 8 518 178
    Distribution format: tar.gz
    Programming language: C++, Python 2.6+, Perl 5+
    Computer: Any
    Operating system: Any
    RAM: 50-500 MB
    Supplementary material: An updated version of the original manuscript (Comput. Phys. Commun. 182 (2011) 2227) is available
    Classification: 4.13
    Catalogue identifier of previous version: AEJH_v1_0
    Journal reference of previous version: Comput. Phys. Commun. 182 (2011) 2227
    Does the new
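
    The central operation such reweighting modules implement is undoing a known bias energy: samples from the biased run are weighted by exp(+U_bias/kT) before averaging. Below is a single-ensemble sketch with invented data, not the package's actual modules or file formats.

    import numpy as np

    kT = 0.596                 # kcal/mol at ~300 K
    rng = np.random.default_rng(1)

    # Samples x from a biased simulation and the known bias energy U_bias(x)
    x = rng.normal(loc=1.0, scale=0.5, size=10000)   # stand-in trajectory data
    u_bias = 0.5 * 2.0 * (x - 1.0) ** 2              # harmonic umbrella at x0=1

    # Unbiased expectation <A> = <A e^{+U/kT}>_biased / <e^{+U/kT}>_biased
    w = np.exp(u_bias / kT)
    a = x ** 2                                       # observable A(x) = x^2
    print("reweighted <x^2> =", np.sum(w * a) / np.sum(w))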

  20. The IFPUG guide to IT and software measurement

    CERN Document Server

    IFPUG

    2012-01-01

    The widespread deployment of millions of current and emerging software applications has placed software economic studies among the most critical of any form of business analysis. Unfortunately, a lack of an integrated suite of metrics makes software economic analysis extremely difficult. The International Function Point Users Group (IFPUG), a nonprofit and member-governed organization, has become the recognized leader in promoting the effective management of application software development and maintenance activities. The IFPUG Guide to IT and Software Measurement brings together 52 leading so

  1. Spectrum Monitoring Using Spectrum Analysis LabVIEW Software, Nanoceptors, and Various Digitizing Solutions

    Science.gov (United States)

    2015-02-01

    Spectrum Monitoring Using Spectrum Analysis LabVIEW Software, Nanoceptors, and Various Digitizing Solutions, by Joshua Smith, ARL-TR-7217, February 2015. Report type: Final; dates covered: 06/2014-07/2014.

  2. FieldTrip: Open source software for advanced analysis of MEG, EEG, and invasive electrophysiological data

    NARCIS (Netherlands)

    Oostenveld, R.; Fries, P.; Maris, E.; Schoffelen, J.M.

    2011-01-01

    This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow experimental…

  3. A Methodology to Evaluate Object oriented Software Systems Using Change Requirement Traceability Based on Impact Analysis

    Directory of Open Access Journals (Sweden)

    Sunil T. D

    2014-06-01

    Full Text Available It is a well-known fact that software maintenance plays a major role in the software development life cycle. As object-oriented programming has become the standard, it is very important to understand the problems of maintaining object-oriented software systems. This paper aims at evaluating object-oriented software systems through a change requirement traceability-based impact analysis methodology for non-functional requirements using functional requirements. The major issues have been related to change impact algorithms and inheritance of functionality.

  4. On The Human, Organizational, and Technical Aspects of Software Development and Analysis

    Science.gov (United States)

    Damaševičius, Robertas

    Information systems are designed, constructed, and used by people. Therefore, a software design process is not purely a technical task, but a complex psycho-socio-technical process embedded within organizational, cultural, and social structures. These structures influence the behavior and products of the programmer's work such as source code and documentation. This chapter (1) discusses the non-technical (organizational, social, cultural, and psychological) aspects of software development reflected in program source code; (2) presents a taxonomy of the social disciplines of computer science; and (3) discusses the socio-technical software analysis methods for discovering the human, organizational, and technical aspects embedded within software development artifacts.

  5. Analysis and design of software ecosystem architectures – towards the 4S telemedicine ecosystem

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Hansen, Klaus Marius; Kyng, Morten;

    2014-01-01

    … and application stove-pipes that inhibit the adoption of telemedical solutions. To which extent can a software ecosystem approach to telemedicine alleviate this? Objective: In this article, we define the concept of software ecosystem architecture as the structure(s) of a software ecosystem comprising elements, relations among them, and properties of both. Our objective is to show how this concept can be used i) in the analysis of existing software ecosystems and ii) in the design of new software ecosystems. Method: We performed a mixed-method study that consisted of a case study and an experiment. For i), we performed a descriptive, revelatory case study of the Danish telemedicine ecosystem and for ii), we experimentally designed, implemented, and evaluated the architecture of 4S. Results: We contribute in three areas. First, we define the software ecosystem architecture concept that captures organization…

  6. Analysis and design of software ecosystem architectures – Towards the 4S telemedicine ecosystem

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Hansen, Klaus Marius; Kyng, Morten

    2014-01-01

    … and application stove-pipes that inhibit the adoption of telemedical solutions. To which extent can a software ecosystem approach to telemedicine alleviate this? Objective: In this article, we define the concept of software ecosystem architecture as the structure(s) of a software ecosystem comprising elements, relations among them, and properties of both. Our objective is to show how this concept can be used i) in the analysis of existing software ecosystems and ii) in the design of new software ecosystems. Method: We performed a mixed-method study that consisted of a case study and an experiment. For i), we performed a descriptive, revelatory case study of the Danish telemedicine ecosystem and for ii), we experimentally designed, implemented, and evaluated the architecture of 4S. Results: We contribute in three areas. First, we define the software ecosystem architecture concept that captures organization…

  7. Performance Analysis of the ATLAS Second Level Trigger Software

    CERN Document Server

    Bogaerts, J A C; Li, W; Middleton, R P; Werner, P; Wickens, F J; Zobernig, H

    2002-01-01

    In this paper we analyse the performance of the prototype software developed for the ATLAS Second Level Trigger. The OO framework, written in C++, has been used to implement a distributed system which collects (simulated) detector data on which it executes event selection algorithms. The software has been used on testbeds of up to 100 nodes with various interconnect technologies. The final system will have to sustain traffic of ~40 Gbit/s and require an estimated ~750 processors. Timing measurements are crucial for issues such as trigger decision latency, assessment of required CPU and network capacity, scalability, and load-balancing. In addition, final architectural and technological choices, code optimisation and system tuning require a detailed understanding of both CPU utilisation and trigger decision latency. In this paper we describe the instrumentation used to disentangle effects due to such factors as OS system intervention, blocking on interlocks (applications are multi-threaded)...

  8. The khmer software package: enabling efficient nucleotide sequence analysis.

    Science.gov (United States)

    Crusoe, Michael R; Alameldin, Hussien F; Awad, Sherine; Boucher, Elmar; Caldwell, Adam; Cartwright, Reed; Charbonneau, Amanda; Constantinides, Bede; Edvenson, Greg; Fay, Scott; Fenton, Jacob; Fenzl, Thomas; Fish, Jordan; Garcia-Gutierrez, Leonor; Garland, Phillip; Gluck, Jonathan; González, Iván; Guermond, Sarah; Guo, Jiarong; Gupta, Aditi; Herr, Joshua R; Howe, Adina; Hyer, Alex; Härpfer, Andreas; Irber, Luiz; Kidd, Rhys; Lin, David; Lippi, Justin; Mansour, Tamer; McA'Nulty, Pamela; McDonald, Eric; Mizzi, Jessica; Murray, Kevin D; Nahum, Joshua R; Nanlohy, Kaben; Nederbragt, Alexander Johan; Ortiz-Zuazaga, Humberto; Ory, Jeramia; Pell, Jason; Pepe-Ranney, Charles; Russ, Zachary N; Schwarz, Erich; Scott, Camille; Seaman, Josiah; Sievert, Scott; Simpson, Jared; Skennerton, Connor T; Spencer, James; Srinivasan, Ramakrishnan; Standage, Daniel; Stapleton, James A; Steinman, Susan R; Stein, Joe; Taylor, Benjamin; Trimble, Will; Wiencko, Heather L; Wright, Michael; Wyss, Brian; Zhang, Qingpeng; Zyme, En; Brown, C Titus

    2015-01-01

    The khmer package is a freely available software library for working efficiently with fixed length DNA words, or k-mers. khmer provides implementations of a probabilistic k-mer counting data structure, a compressible De Bruijn graph representation, De Bruijn graph partitioning, and digital normalization. khmer is implemented in C++ and Python, and is freely available under the BSD license at  https://github.com/dib-lab/khmer/.
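
    The k-mer counting idea at khmer's core can be illustrated in a few lines of plain Python; this exact Counter-based version is only a conceptual stand-in, since khmer's point is to replace it with a fixed-memory probabilistic counting structure.

    from collections import Counter

    def count_kmers(seq, k):
        """Exact k-mer counting; khmer approximates this with a probabilistic,
        fixed-memory structure so that very large datasets stay tractable."""
        return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

    print(count_kmers("ACGTACGTGACG", 4).most_common(3))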

  9. Analysis of a Software Maintenance System: A Case Study

    Science.gov (United States)

    1992-11-01

    … coupled across a network, which meant that many of the software changes had to be coordinated. The problem changed from maintaining two sets of … Figure 4.2 shows a top-level view of the project process. A change (bug

  10. Loss Analysis of the Software-based Packet Capturing

    Directory of Open Access Journals (Sweden)

    Tamas Skopko

    2012-06-01

    Full Text Available Gigabit-per-second and higher bandwidths pose a greater challenge to lossless packet capturing on generic PC architectures. This is because software-based capture solutions did not improve as fast as network bandwidth, and they still rely heavily on the OS's packet-processing mechanism. There are hardware and operating system factors that primarily affect capture performance. This paper summarizes these parameters and shows how to predict the packet loss ratio during the capture process.

  11. Potku - New analysis software for heavy ion elastic recoil detection analysis

    Science.gov (United States)

    Arstila, K.; Julin, J.; Laitinen, M. I.; Aalto, J.; Konu, T.; Kärkkäinen, S.; Rahkonen, S.; Raunio, M.; Itkonen, J.; Santanen, J.-P.; Tuovinen, T.; Sajavaara, T.

    2014-07-01

    Time-of-flight elastic recoil detection (ToF-ERD) analysis software has been developed. The software combines a Python-language graphical front-end with a C code computing back-end in a user-friendly way. The software uses a list of coincident time-of-flight-energy (ToF-E) events as an input. The ToF calibration can be determined with a simple graphical procedure. The graphical interface allows the user to select different elements and isotopes from a ToF-E histogram and to convert the selections to individual elemental energy and depth profiles. The resulting sample composition can be presented as relative or absolute concentrations by integrating the depth profiles over user-defined ranges. Beam induced composition changes can be studied by displaying the event-based data in fractions relative to the substrate reference data. Optional angular input data allows for kinematic correction of the depth profiles. This open source software is distributed under the GPL license for Linux, Mac, and Windows environments.
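
    The ToF-to-energy conversion underlying a ToF-E coincidence analysis is plain non-relativistic kinematics: a recoil of mass m that crosses a timing gate of length L in time t has energy (standard relation, with L fixed by the detector geometry):

    E = \frac{1}{2}\, m\, v^{2} = \frac{1}{2}\, m \left(\frac{L}{t}\right)^{2}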

  12. The Architecture of MEG Simulation and Analysis Software

    CERN Document Server

    Cattaneo, PaoloW; Sawada, Ryu; Schneebeli, Matthias; Yamada, Shuei

    2011-01-01

    MEG (Mu to Electron Gamma) is an experiment dedicated to search for the $\\mu^+ \\rightarrow e^+\\gamma$ decay that is strongly suppressed in the Standard Model but predicted in several Super Symmetric extensions of it at an accessible rate. MEG is a small-size experiment ($\\approx 50-60$ physicists at any time) with a life span of about 10 years. The limited human resource available, in particular in the core offline group, emphasized the importance of reusing software and exploiting existing expertise. Great care has been devoted to provide a simple system that hides implementation details to the average programmer. That allowed many members of the collaboration to contribute to the development of the software of the experiment with limited programming skill. The offline software is based on two frameworks: {\\bf REM} in FORTRAN 77 used for the event generation and detector simulation package {\\bf GEM}, based on GEANT 3, and {\\bf ROME} in C++ used in the readout simulation {\\bf Bartender} and in the reconstruct...

  13. Space Telecommunications Radio System Software Architecture Concepts and Analysis

    Science.gov (United States)

    Handler, Louis M.; Hall, Charles S.; Briones, Janette C.; Blaser, Tammy M.

    2008-01-01

    The Space Telecommunications Radio System (STRS) project investigated various Software Defined Radio (SDR) architectures for space. An STRS architecture has been selected that separates the STRS operating environment from its various waveforms and also abstracts any specialized hardware to limit its effect on the operating environment. The design supports software evolution, where new functionality is incorporated into the radio. Radio hardware functionality has been moving from hardware-based ASICs into firmware- and software-based processors such as FPGAs, DSPs and General Purpose Processors (GPPs). Use cases capture the requirements of a system by describing how the system should interact with the users or other systems (the actors) to achieve a specific goal. The Unified Modeling Language (UML) is used to illustrate the use cases in a variety of ways. The top-level use case diagram shows groupings of the use cases and how the actors are involved. The state diagrams depict the various states that a system or object may be in and the transitions between those states. The sequence diagrams show the main flow of activity as described in the use cases.

  14. Analysis and design of software ecosystem architectures – towards the 4S telemedicine ecosystem

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Hansen, Klaus Marius; Kyng, Morten;

    2014-01-01

    … performed a descriptive, revelatory case study of the Danish telemedicine ecosystem and for ii), we experimentally designed, implemented, and evaluated the architecture of 4S. Results: We contribute in three areas. First, we define the software ecosystem architecture concept that captures organization…, relations among them, and properties of both. Our objective is to show how this concept can be used i) in the analysis of existing software ecosystems and ii) in the design of new software ecosystems. Method: We performed a mixed-method study that consisted of a case study and an experiment. For i), we … experience in creating and evolving the 4S telemedicine ecosystem. Conclusion: The concept of software ecosystem architecture can be used analytically and constructively in, respectively, the analysis and design of software ecosystems.

  15. Development of the Free-space Optical Communications Analysis Software (FOCAS)

    Science.gov (United States)

    Jeganathan, M.; Mecherle, G.; Lesh, J.

    1998-01-01

    The Free-space Optical Communications Analysis Software (FOCAS) was developed at the Jet Propulsion Laboratory (JPL) to provide mission planners, systems engineers and communications engineers with an easy-to-use tool to analyze optical communications links.

  16. Integrating Multi-Vendor Software Analysis into the Lifecycle for Reliability, Productivity, and Performance Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of the proposed work is to create new ways to manage, visualize, and share data produced by multiple software analysis tools, and to create a framework for...

  17. Development of tools for safety analysis of control software in advanced reactors

    Energy Technology Data Exchange (ETDEWEB)

    Guarro, S.; Yau, M.; Motamed, M. [Advanced Systems Concepts Associates, El Segundo, CA (United States)

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described.

  18. Analysis of a hardware and software fault tolerant processor for critical applications

    Science.gov (United States)

    Dugan, Joanne B.

    1993-01-01

    Computer systems for critical applications must be designed to tolerate software faults as well as hardware faults. A unified approach to tolerating hardware and software faults is characterized by classifying faults in terms of duration (transient or permanent) rather than source (hardware or software). Errors arising from transient faults can be handled through masking or voting, but errors arising from permanent faults require system reconfiguration to bypass the failed component. Most errors which are caused by software faults can be considered transient, in that they are input-dependent. Software faults are triggered by a particular set of inputs. Quantitative dependability analysis of systems which exhibit a unified approach to fault tolerance can be performed by a hierarchical combination of fault tree and Markov models. A methodology for analyzing hardware and software fault tolerant systems is applied to the analysis of a hypothetical system, loosely based on the Fault Tolerant Parallel Processor. The models consider both transient and permanent faults, hardware and software faults, independent and related software faults, automatic recovery, and reconfiguration.
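
    The masking-by-voting idea mentioned above has a standard closed form worth recalling: with three identical modules of reliability R(t) and an ideal voter, a triple-modular-redundant system works whenever at least two modules work (a textbook result, not specific to this report):

    R_{\mathrm{TMR}}(t) = R(t)^{3} + 3\,R(t)^{2}\bigl(1 - R(t)\bigr) = 3R(t)^{2} - 2R(t)^{3}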

  19. SEDA: A software package for the Statistical Earthquake Data Analysis

    Science.gov (United States)

    Lombardi, A. M.

    2017-01-01

    In this paper, the first version of the software SEDA (SEDAv1.0), designed to help seismologists statistically analyze earthquake data, is presented. The package consists of a user-friendly Matlab-based interface, which allows the user to easily interact with the application, and a computational core of Fortran codes, to guarantee maximum speed. The primary factor driving the development of SEDA is to guarantee research reproducibility, which is a growing movement among scientists and is highly recommended by the most important scientific journals. SEDAv1.0 is mainly devoted to producing accurate and fast outputs. Less care has been taken over the graphical appeal, which will be improved in the future. The main part of SEDAv1.0 is devoted to ETAS modeling. SEDAv1.0 contains a set of consistent tools on ETAS, allowing the estimation of parameters, the testing of the model on data, the simulation of catalogs, the identification of sequences and the calculation of forecasts. The peculiarities of the routines inside SEDAv1.0 are discussed in this paper. More specific details on the software are presented in the manual accompanying the program package. PMID:28290482
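
    For orientation, the ETAS model at the center of SEDA expresses the conditional intensity of seismicity as a background rate plus Omori-type contributions from all earlier events (standard form; parameter conventions vary between implementations):

    \lambda(t \mid \mathcal{H}_t) = \mu + \sum_{t_i < t} \frac{K\, e^{\alpha (m_i - m_0)}}{(t - t_i + c)^{p}}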

  20. SEDA: A software package for the Statistical Earthquake Data Analysis

    Science.gov (United States)

    Lombardi, A. M.

    2017-03-01

    In this paper, the first version of the software SEDA (SEDAv1.0), designed to help seismologists statistically analyze earthquake data, is presented. The package consists of a user-friendly Matlab-based interface, which allows the user to easily interact with the application, and a computational core of Fortran codes, to guarantee maximum speed. The primary factor driving the development of SEDA is to guarantee research reproducibility, which is a growing movement among scientists and is highly recommended by the most important scientific journals. SEDAv1.0 is mainly devoted to producing accurate and fast outputs. Less care has been taken over the graphical appeal, which will be improved in the future. The main part of SEDAv1.0 is devoted to ETAS modeling. SEDAv1.0 contains a set of consistent tools on ETAS, allowing the estimation of parameters, the testing of the model on data, the simulation of catalogs, the identification of sequences and the calculation of forecasts. The peculiarities of the routines inside SEDAv1.0 are discussed in this paper. More specific details on the software are presented in the manual accompanying the program package.

  1. SEDA: A software package for the Statistical Earthquake Data Analysis.

    Science.gov (United States)

    Lombardi, A M

    2017-03-14

    In this paper, the first version of the software SEDA (SEDAv1.0), designed to help seismologists statistically analyze earthquake data, is presented. The package consists of a user-friendly Matlab-based interface, which allows the user to easily interact with the application, and a computational core of Fortran codes, to guarantee maximum speed. The primary factor driving the development of SEDA is to guarantee research reproducibility, which is a growing movement among scientists and is highly recommended by the most important scientific journals. SEDAv1.0 is mainly devoted to producing accurate and fast outputs. Less care has been taken over the graphical appeal, which will be improved in the future. The main part of SEDAv1.0 is devoted to ETAS modeling. SEDAv1.0 contains a set of consistent tools on ETAS, allowing the estimation of parameters, the testing of the model on data, the simulation of catalogs, the identification of sequences and the calculation of forecasts. The peculiarities of the routines inside SEDAv1.0 are discussed in this paper. More specific details on the software are presented in the manual accompanying the program package.

  2. Politics during Software Process Improvement: A Metatriangulation Analysis

    DEFF Research Database (Denmark)

    Mûller, Sune Dueholm; Mathiassen, Lars; Kræmmergaard, Pernille

    2014-01-01

    Software Process Improvement (SPI) has played a dominant role in systems development innovation research and practice for more than 20 years. However, while extant theory acknowledges the political nature of SPI initiatives, researchers have yet to empirically investigate and theorize about how organizational politics might impact outcomes. Against this backdrop, we apply metatriangulation to build new theory based on rich data from an SPI project in four business units at a high-tech firm. Reflecting the diverse ways in which politics manifests, we first analyze behaviors and outcomes in each unit … -the-talk, and keeping-up-appearances. Finally, we combine the patterns we observed with insights from the literature to build a metaparadigm theory of SPI politics. In addition to contributing to Information Systems (IS) research in the key area of systems development innovation, the study furthers understanding of how …

  3. GRO/EGRET data analysis software: An integrated system of custom and commercial software using standard interfaces

    Science.gov (United States)

    Laubenthal, N. A.; Bertsch, D.; Lal, N.; Etienne, A.; Mcdonald, L.; Mattox, J.; Sreekumar, P.; Nolan, P.; Fierro, J.

    1992-01-01

    The Energetic Gamma Ray Telescope Experiment (EGRET) on the Compton Gamma Ray Observatory has been in orbit for more than a year and is being used to map the full sky for gamma rays in a wide energy range from 30 to 20,000 MeV. Already these measurements have resulted in a wide range of exciting new information on quasars, pulsars, galactic sources, and diffuse gamma ray emission. The central part of the analysis is done with sky maps that typically cover an 80 x 80 degree section of the sky for an exposure time of several days. Specific software developed for this program generates the counts, exposure, and intensity maps. The analysis is done on a network of UNIX based workstations and takes full advantage of a custom-built user interface called X-dialog. The maps that are generated are stored in the FITS format for a collection of energies. These, along with similar diffuse emission background maps generated from a model calculation, serve as input to a maximum likelihood program that produces maps of likelihood with optional contours that are used to evaluate regions for sources. Likelihood also evaluates the background corrected intensity at each location for each energy interval from which spectra can be generated. Being in a standard FITS format permits all of the maps to be easily accessed by the full complement of tools available in several commercial astronomical analysis systems. In the EGRET case, IDL is used to produce graphics plots in two and three dimensions and to quickly implement any special evaluation that might be desired. Other custom-built software, such as the spectral and pulsar analyses, take advantage of the XView toolkit for display and Postscript output for the color hard copy. This poster paper outlines the data flow and provides examples of the user interfaces and output products. It stresses the advantages that are derived from the integration of the specific instrument-unique software and powerful commercial tools for graphics and

  4. A Parametric Model of Shoulder Articulation for Virtual Assessment of Space Suit Fit

    Science.gov (United States)

    Kim, K. Han; Young, Karen S.; Bernal, Yaritza; Boppana, Abhishektha; Vu, Linh Q.; Benson, Elizabeth A.; Jarvis, Sarah; Rajulu, Sudhakar L.

    2016-01-01

    Shoulder injury is one of the most severe risks that have the potential to impair crewmembers' performance and health in long duration space flight. Overall, 64% of crewmembers experience shoulder pain after extra-vehicular training in a space suit, and 14% of symptomatic crewmembers require surgical repair (Williams & Johnson, 2003). Suboptimal suit fit, in particular at the shoulder region, has been identified as one of the predominant risk factors. However, traditional suit fit assessments and laser scans represent only a single person's data, and thus may not be generalized across wide variations of body shapes and poses. The aim of this work is to develop a software tool based on a statistical analysis of a large dataset of crewmember body shapes. This tool can accurately predict the skin deformation and shape variations for any body size and shoulder pose for a target population, from which the geometry can be exported and evaluated against suit models in commercial CAD software. A preliminary software tool was developed by statistically analyzing 150 body shapes matched with body dimension ranges specified in the Human-Systems Integration Requirements of NASA ("baseline model"). Further, the baseline model was incorporated with shoulder joint articulation ("articulation model"), using additional subjects scanned in a variety of shoulder poses across a pre-specified range of motion. Scan data was cleaned and aligned using body landmarks. The skin deformation patterns were dimensionally reduced and the co-variation with shoulder angles was analyzed. A software tool is currently in development and will be presented in the final proceeding. This tool would allow suit engineers to parametrically generate body shapes in strategically targeted anthropometry dimensions and shoulder poses. This would also enable virtual fit assessments, with which the contact volume and clearance between the suit and body surface can be predictively quantified at reduced time and

  5. Software for analysis and manipulation of genetic linkage data.

    Science.gov (United States)

    Weaver, R; Helms, C; Mishra, S K; Donis-Keller, H

    1992-06-01

    We present eight computer programs written in the C programming language that are designed to analyze genotypic data and to support existing software used to construct genetic linkage maps. Although each program has a unique purpose, they all share the common goals of affording a greater understanding of genetic linkage data and of automating tasks to make computers more effective tools for map building. The PIC/HET and FAMINFO programs automate calculation of relevant quantities such as heterozygosity, PIC, allele frequencies, and informativeness of markers and pedigrees. PREINPUT simplifies data submissions to the Centre d'Etude du Polymorphisme Humain (CEPH) data base by creating a file with genotype assignments that CEPH's INPUT program would otherwise require to be input manually. INHERIT is a program written specifically for mapping the X chromosome: by assigning a dummy allele to males, in the nonpseudoautosomal region, it eliminates falsely perceived noninheritances in the data set. The remaining four programs complement the previously published genetic linkage mapping software CRI-MAP and LINKAGE. TWOTABLE produces a more readable format for the output of CRI-MAP two-point calculations; UNMERGE is the converse to CRI-MAP's merge option; and GENLINK and LINKGEN automatically convert between the genotypic data file formats required by these packages. All eight applications read input from the same types of data files that are used by CRI-MAP and LINKAGE. Their use has simplified the management of data, has increased knowledge of the content of information in pedigrees, and has reduced the amount of time needed to construct genetic linkage maps of chromosomes.
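
    The quantities PIC/HET automates have standard closed forms in the allele frequencies p_i; a direct transcription in Python follows, with invented marker frequencies rather than data from the paper.

    def heterozygosity(p):
        """Expected heterozygosity H = 1 - sum(p_i^2)."""
        return 1.0 - sum(pi ** 2 for pi in p)

    def pic(p):
        """Polymorphism information content (Botstein et al., 1980):
        PIC = 1 - sum(p_i^2) - sum_{i<j} 2 * p_i^2 * p_j^2."""
        h = heterozygosity(p)
        return h - sum(2 * p[i] ** 2 * p[j] ** 2
                       for i in range(len(p)) for j in range(i + 1, len(p)))

    freqs = [0.4, 0.3, 0.2, 0.1]   # invented allele frequencies
    print(f"H = {heterozygosity(freqs):.3f}, PIC = {pic(freqs):.3f}")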

  6. Space Suit Spins

    Science.gov (United States)

    2005-01-01

    Space is a hostile environment where astronauts combat extreme temperatures, dangerous radiation, and a near-breathless vacuum. Life support in these unforgiving circumstances is crucial and complex, and failure is not an option for the devices meant to keep astronauts safe in an environment that presents constant opposition. A space suit must meet stringent requirements for life support. The suit has to be made of durable material to withstand the impact of space debris and protect against radiation. It must provide essential oxygen, pressure, heating, and cooling while retaining mobility and dexterity. It is not a simple article of clothing but rather a complex modern armor that the space explorers must don if they are to continue exploring the heavens

  7. Computer Software for Design, Analysis and Control of Fluid Power Systems

    DEFF Research Database (Denmark)

    Conrad, Finn; Sørensen, Torben; Grahl-Madsen, Mads

    1999-01-01

    This Deliverable presents contributions from SWING's Task 2.3, Analysis of available software solutions. The Deliverable focuses on the results of this analysis, keeping in mind the task objective: to carry out a thorough analysis of the state-of-the-art solutions for fluid power systems modelling…

  8. Comparing Results from Constant Comparative and Computer Software Methods: A Reflection about Qualitative Data Analysis

    Science.gov (United States)

    Putten, Jim Vander; Nolen, Amanda L.

    2010-01-01

    This study compared qualitative research results obtained by manual constant comparative analysis with results obtained by computer software analysis of the same data. An investigation of issues of trustworthiness and accuracy ensued. Results indicated that the inductive constant comparative data analysis generated 51 codes and two coding levels…

  9. Improving the Accuracy of Software-Based Energy Analysis for Residential Buildings (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Polly, B.

    2011-09-01

    This presentation describes the basic components of software-based energy analysis for residential buildings, explores the concepts of 'error' and 'accuracy' when analysis predictions are compared to measured data, and explains how NREL is working to continuously improve the accuracy of energy analysis methods.

  10. Calling For Diversity In Health Care Executive Suites And Evaluation Of Effects On Efficiency Using Data Envelopment Analysis

    Directory of Open Access Journals (Sweden)

    Jocelyn L. Steward

    2011-07-01

    Full Text Available Adequate diversity in the leadership of health care organizations is a problem that potentially affects overall performance.  In this paper, we propose the application of data envelopment analysis (DEA and strategic human information systems to determine how diversity affects the efficiency, stability, and long-term viability of health care organizations at the organization level.  Data envelopment analysis could also be applied within a given health care organization to examine how the organization’s diversity make-up in its various departments affects relative efficiencies across the departments.  After presenting a brief introduction of DEA, we provide examples of inputs and outputs used in a proposed DEA analysis. We also propose the use of strategic information systems in health care organizations in developing countries at both organization and departmental levels.  We suggest that both developed and developing countries would benefit from using these tools as they seek to control costs and improve health care systems.
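
    As an illustration of the kind of DEA computation the authors propose (the paper itself does not fix a formulation), the sketch below solves one standard variant, the input-oriented CCR multiplier model, as a linear program with SciPy. The departments, inputs, and outputs are invented toy data.

```python
# Sketch of an input-oriented CCR efficiency score; X (inputs) and
# Y (outputs) are illustrative data, not from the paper.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Efficiency of decision-making unit o given inputs X (n x m)
    and outputs Y (n x s), via the CCR multiplier LP."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([-Y[o], np.zeros(m)])   # maximize u . y_o
    A_ub = np.hstack([Y, -X])                  # u . y_j - v . x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)  # v . x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    return -res.fun

# Toy example: 3 departments, 1 input (staff), 1 output (patients served)
X = np.array([[10.0], [20.0], [30.0]])
Y = np.array([[100.0], [150.0], [300.0]])
for o in range(3):
    print(f"DMU {o}: efficiency = {ccr_efficiency(X, Y, o):.2f}")
```

    In this toy data, departments 0 and 2 score 1.0 (efficient) and department 1 scores 0.75, mirroring its lower output-to-input ratio.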

  11. HeteroGenius: A Framework for Hybrid Analysis of Heterogeneous Software Specifications

    Directory of Open Access Journals (Sweden)

    Manuel Giménez

    2014-01-01

    Full Text Available Nowadays, software artifacts are ubiquitous in our lives, being an essential part of home appliances, cars, and cell phones, and even of more critical activities like aeronautics and health sciences. In this context software failures may produce enormous losses, either economic or, in the worst case, in human lives. Software analysis is an area in software engineering concerned with the application of diverse techniques in order to prove the absence of errors in software pieces. In many cases different analysis techniques are applied by following specific methodological combinations that ensure better results. These interactions between tools are usually carried out at the user level and are not supported by the tools themselves. In this work we present HeteroGenius, a framework conceived to develop tools that allow users to perform hybrid analysis of heterogeneous software specifications. HeteroGenius was designed prioritising the possibility of adding new specification languages and analysis tools and enabling a synergic relation of the techniques under a graphical interface satisfying several well-known usability-enhancement criteria. As a case study we implemented the functionality of Dynamite on top of HeteroGenius.

  12. Systematic Analysis Method of Shear-Wave Splitting:SAM Software System

    Institute of Scientific and Technical Information of China (English)

    Gao Yuan; Liu Xiqiang; Liang Wei; Hao Ping

    2004-01-01

    In order to make more effective use of the data from regional digital seismograph networks and to promote the study of shear-wave splitting and its application to earthquake stress forecasting, the SAM software system, i.e., software implementing the systematic analysis method of shear-wave splitting, has been developed. This paper introduces the design aims, system structure, functions, and characteristics of the SAM software system and shows some graphical interfaces for data input and result output. Lastly, it gives a preliminary discussion of the study of shear-wave splitting and its application to earthquake forecasting.
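
    The record does not describe SAM's algorithms in detail; a common textbook approach to measuring splitting parameters is a grid search over fast-direction angle and delay time that maximizes the cross-correlation between rotated horizontal components. The sketch below, with invented parameter names, illustrates that generic method rather than SAM's actual code.

```python
# Hedged sketch of the classic rotation/cross-correlation grid search for
# splitting parameters (fast direction phi, delay dt); not SAM's code.
import numpy as np

def splitting_grid_search(north, east, dt_sample, max_lag=40):
    """Return (phi_deg, delay_s, corr) maximizing correlation between
    the fast component and the lag-shifted slow component."""
    best = (None, None, -np.inf)
    for phi in range(0, 180):
        a = np.deg2rad(phi)
        fast = north * np.cos(a) + east * np.sin(a)
        slow = -north * np.sin(a) + east * np.cos(a)
        for lag in range(1, max_lag):
            r = np.corrcoef(fast[lag:], slow[:-lag])[0, 1]
            if abs(r) > best[2]:
                best = (phi, lag * dt_sample, abs(r))
    return best
```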

  13. FEATUREOUS: AN INTEGRATED ENVIRONMENT FOR FEATURE-CENTRIC ANALYSIS AND MODIFICATION OF OBJECT-ORIENTED SOFTWARE

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2011-01-01

    The decentralized nature of collaborations between objects in object-oriented software makes it difficult to understand the implementations of user-observable program features and their respective interdependencies. As feature-centric program understanding and modification are essential during software maintenance and evolution, this situation needs to change. In this paper, we present Featureous, an integrated development environment built on top of the NetBeans IDE that facilitates feature-centric analysis of object-oriented software. Our integrated development environment encompasses … and granularity. Together, these dimensions allow the analyst to focus the analysis at the right mode of comprehension during software evolution. We demonstrate applicability of our integrated development environment by conducting a case study of change adoption using the JHotDraw SVG…

  14. The R software fundamentals of programming and statistical analysis

    CERN Document Server

    Lafaye de Micheaux, Pierre; Liquet, Benoit

    2013-01-01

    The contents of The R Software are presented so as to be both comprehensive and easy for the reader to use. Besides its application as a self-learning text, this book can support lectures on R at any level from beginner to advanced. It can serve as a textbook on R for beginners as well as more advanced users, working on Windows, macOS or Linux. The first part of the book deals with the heart of the R language and its fundamental concepts, including data organization, import and export, various manipulations, documentation, plots, programming and maintenance. The last chapter in this part deals with object-oriented programming as well as interfacing R with C/C++ or Fortran, and contains a section on debugging techniques. This is followed by the second part of the book, which provides detailed explanations on how to perform many standard statistical analyses, mainly in the biostatistics field. Topics from mathematical and statistical settings that are included are matrix operations, integration, o...

  15. Open Source Software Tools for Anomaly Detection Analysis

    Science.gov (United States)

    2014-04-01

    Open-source tools covered include the Environment for Developing KDD-Applications Supported by Index-Structures (ELKI), RapidMiner, the SHOGUN toolbox, the Waikato Environment for Knowledge Analysis (Weka), and Scikit-Learn. [Figure 2: RapidMiner output results]

  16. Risk-Based Measurement and Analysis: Application to Software Security

    Science.gov (United States)

    2012-02-01

    …diagram) and Failure Modes and Effects Analysis (FMEA) [Stamatis 2003]. Both of these techniques structure the discussion about what can go wrong… References cited: http://www.sei.cmu.edu/library/abstracts/reports/09tr022.cfm ; [Stamatis 2003] Stamatis, D. H. Failure Mode and Effect Analysis: FMEA from Theory to Execution.

  17. Development of high performance casting analysis software by coupled parallel computation

    Directory of Open Access Journals (Sweden)

    Sang Hyun CHO

    2007-08-01

    Full Text Available Casting analysis software has long been under development to provide new ways of approaching real casting processes, including melt flow analysis, heat transfer analysis for solidification calculation, mechanical property prediction, and microstructure prediction. These efforts have succeeded in producing results close to real situations, so that CAE technologies have become indispensable for designing and developing new casting processes. In manufacturing, however, CAE technologies are used less frequently because the software is difficult to use or the available computing performance is insufficient. To introduce CAE technologies to the manufacturing field, high-performance analysis is essential to shorten the gap between product design time and prototyping time. Optimizing the software code can help, but it is not enough, because the codes developed by software experts are already optimized. As an alternative route to high-performance computation, parallel computation technologies are being eagerly applied to CAE technologies to shorten analysis times. In this research, SMP (Shared Memory Processing) and MPI (Message Passing Interface) (1) parallelization methods were applied to the commercial software "Z-Cast" to calculate casting processes. In parallelizing the code, network stabilization and core optimization were also carried out under the Microsoft Windows platform, and their performance and results were compared with those of normal linear (serial) analysis codes.
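
    As a toy illustration of the shared-memory parallelization idea (not Z-Cast code), the sketch below splits an explicit 2-D heat-diffusion update, the kind of kernel that dominates solidification calculations, across worker processes by row blocks. The grid size, coefficients, boundary condition, and four-worker split are all assumptions.

```python
# Illustrative domain decomposition of an explicit heat-diffusion step.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

ALPHA, DT, DX = 1e-5, 0.01, 1e-3   # assumed diffusivity, time step, spacing

def update_block(args):
    """Explicit finite-difference update for one horizontal strip."""
    T, r0, r1 = args
    lam = ALPHA * DT / DX ** 2
    block = T[r0:r1].copy()
    for i in range(max(r0, 1), min(r1, T.shape[0] - 1)):
        block[i - r0, 1:-1] = T[i, 1:-1] + lam * (
            T[i + 1, 1:-1] + T[i - 1, 1:-1] + T[i, 2:] + T[i, :-2]
            - 4 * T[i, 1:-1])
    return r0, block

def parallel_step(T, nworkers=4):
    """One time step, domain-decomposed over row blocks."""
    bounds = np.linspace(0, T.shape[0], nworkers + 1, dtype=int)
    jobs = [(T, bounds[k], bounds[k + 1]) for k in range(nworkers)]
    new = T.copy()
    with ProcessPoolExecutor(nworkers) as ex:
        for r0, block in ex.map(update_block, jobs):
            new[r0:r0 + block.shape[0]] = block
    return new

if __name__ == "__main__":   # guard required on spawn-based platforms
    T = np.zeros((200, 200)); T[0, :] = 700.0   # arbitrary hot boundary
    for _ in range(100):
        T = parallel_step(T)
```

    A real SMP code would share the array rather than copy it to each worker; the sketch trades that efficiency for brevity.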

  18. Development of high performance casting analysis software by coupled parallel computation

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Casting analysis software has long been under development to provide new ways of approaching real casting processes, including melt flow analysis, heat transfer analysis for solidification calculation, mechanical property prediction, and microstructure prediction. These efforts have succeeded in producing results close to real situations, so that CAE technologies have become indispensable for designing and developing new casting processes. In manufacturing, however, CAE technologies are used less frequently because the software is difficult to use or the available computing performance is insufficient. To introduce CAE technologies to the manufacturing field, high-performance analysis is essential to shorten the gap between product design time and prototyping time. Optimizing the software code can help, but it is not enough, because the codes developed by software experts are already optimized. As an alternative route to high-performance computation, parallel computation technologies are being eagerly applied to CAE technologies to shorten analysis times. In this research, SMP (Shared Memory Processing) and MPI (Message Passing Interface) (1) parallelization methods were applied to the commercial software "Z-Cast" to calculate casting processes. In parallelizing the code, network stabilization and core optimization were also carried out under the Microsoft Windows platform, and their performance and results were compared with those of normal linear (serial) analysis codes.

  19. GiA Roots: software for the high throughput analysis of plant root system architecture

    OpenAIRE

    Galkovskyi Taras; Mileyko Yuriy; Bucksch Alexander; Moore Brad; Symonova Olga; Price Charles A; Topp Christopher N; Iyer-Pascuzzi Anjali S; Zurek Paul R; Fang Suqin; Harer John; Benfey Philip N; Weitz Joshua S

    2012-01-01

    Abstract Background Characterizing root system architecture (RSA) is essential to understanding the development and function of vascular plants. Identifying RSA-associated genes also represents an underexplored opportunity for crop improvement. Software tools are needed to accelerate the pace at which quantitative traits of RSA are estimated from images of root networks. Results We have developed GiA Roots (General Image Analysis of Roots), a semi-automated software tool designed specifically...

  20. 软件脆弱性分析%Software Vulnerability Analysis

    Institute of Scientific and Technical Information of China (English)

    李新明; 李艺; 徐晓梅; 韩存兵

    2003-01-01

    Software vulnerability is the root cause of computer system security problems. Analyzing vulnerability based on the essence of software vulnerability is a new research topic. This paper analyzes the main definitions and taxonomies of vulnerability, studies vulnerability databases and tools for vulnerability analysis and detection, and gives details about the causes of the most common vulnerabilities in the LINUX/UNIX operating systems.

  1. Detection and Quantification of Nitrogen Compounds in the First Drilled Martian Solid Samples by the Sample Analysis at Mars (SAM) Instrument Suite on the Mars Science Laboratory (MSL)

    Science.gov (United States)

    Stern, Jennifer C.; Navarro-Gonzalez, Rafael; Freissinet, Caroline; McKay, Christopher P.; Archer, P. Douglas, Jr.; Buch, Arnaud; Coll, Patrice; Eigenbrode, Jennifer L.; Franz, Heather B.; Glavin, Daniel P.; Ming, Douglas W.; Steele, Andrew; Szopa, Cyril; Wray, James J.; Conrad, Pamela G.; Mahaffy, Paul R.

    2014-01-01

    The Sample Analysis at Mars (SAM) instrument suite on the Mars Science Laboratory (MSL) Curiosity Rover detected both reduced and oxidized nitrogen-bearing compounds during the pyrolysis of surface materials from the three sites at Gale Crater. Preliminary detections of nitrogen species include NO, HCN, ClCN, and TFMA (trifluoro-N-methyl-acetamide). Confirmation of indigenous Martian nitrogen-bearing compounds requires quantifying the N contribution from the terrestrial derivatization reagents carried for SAM's wet chemistry experiment that contribute to the SAM background. Nitrogen species detected in the SAM solid sample analyses can also be produced during laboratory pyrolysis experiments in which these reagents are heated in the presence of perchlorate, a compound that has also been identified by SAM in Mars solid samples.

  2. Detection and Quantification of Nitrogen Compounds in the First Drilled Martian Solid Samples by the Sample Analysis at Mars (SAM) Instrument Suite on the Mars Science Laboratory (MSL)

    Science.gov (United States)

    Stern, J. C.; Navarro-Gonzales, R.; Freissinet, C.; McKay, C. P.; Archer, P. D., Jr.; Buch, A.; Brunner, A. E.; Coll, P.; Eigenbrode, J. L.; Franz, H. B.; Glavin, D. P.; McAdam, A. C.; Ming, D.; Steele, A.; Sutter, B.; Szopa, C.; Wray, J. J.; Conrad, P.; Mahaffy, P. R.

    2014-01-01

    The Sample Analysis at Mars (SAM) instrument suite on the Mars Science Laboratory (MSL) Curiosity Rover detected both reduced and oxidized nitrogen-bearing compounds during the pyrolysis of surface materials at Yellowknife Bay in Gale Crater. Preliminary detections of nitrogen species include NO, HCN, ClCN, CH3CN, and TFMA (trifluoro-N-methyl-acetamide). Confirmation of indigenous Martian N-bearing compounds requires quantifying N contribution from the terrestrial derivatization reagents (e.g. N-methyl-N-tertbutyldimethylsilyltrifluoroacetamide, MTBSTFA and dimethylformamide, DMF) carried for SAM's wet chemistry experiment that contribute to the SAM background. Nitrogen species detected in the SAM solid sample analyses can also be produced during laboratory pyrolysis experiments where these reagents are heated in the presence of perchlorate, a compound that has also been identified by SAM in Mars solid samples.

  3. Parallel line analysis: multifunctional software for the biomedical sciences

    Science.gov (United States)

    Swank, P. R.; Lewis, M. L.; Damron, K. L.; Morrison, D. R.

    1990-01-01

    An easy-to-use, interactive FORTRAN program for analyzing the results of parallel line assays is described. The program is menu driven and consists of five major components: data entry, data editing, manual analysis, manual plotting, and automatic analysis and plotting. Data can be entered from the terminal or from previously created data files. The data editing portion of the program is used to inspect and modify data and to statistically identify outliers. The manual analysis component is used to test the assumptions necessary for parallel line assays using analysis of covariance techniques and to determine potency ratios with confidence limits. The manual plotting component provides a graphic display of the data on the terminal screen or on a standard line printer. The automatic portion runs through multiple analyses without operator input. Data may be saved in a special file to expedite input at a future time.
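
    For readers unfamiliar with parallel-line assays, the core computation behind the potency estimate can be sketched as follows: fit both preparations with a common slope and separate intercepts, then take log potency ratio = (intercept difference) / slope. The data and function name below are invented; the confidence limits and ANCOVA parallelism test the program performs are omitted.

```python
# Minimal numpy sketch of a parallel-line potency estimate.
import numpy as np

def potency_ratio(logdose_s, resp_s, logdose_t, resp_t):
    n_s, n_t = len(logdose_s), len(logdose_t)
    # Design matrix columns: [log dose, 1{standard}, 1{test}]
    x = np.concatenate([logdose_s, logdose_t])
    d_s = np.concatenate([np.ones(n_s), np.zeros(n_t)])
    A = np.column_stack([x, d_s, 1.0 - d_s])
    y = np.concatenate([resp_s, resp_t])
    slope, a_s, a_t = np.linalg.lstsq(A, y, rcond=None)[0]
    return 10 ** ((a_t - a_s) / slope)   # horizontal shift between lines

# Toy assay: test preparation exactly twice as potent as the standard
ld = np.array([0.0, 0.5, 1.0])
print(potency_ratio(ld, 2 + 3 * ld, ld, 2 + 3 * (ld + np.log10(2))))  # ~2.0
```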

  4. Software phantom with realistic speckle modeling for validation of image analysis methods in echocardiography

    Science.gov (United States)

    Law, Yuen C.; Tenbrinck, Daniel; Jiang, Xiaoyi; Kuhlen, Torsten

    2014-03-01

    Computer-assisted processing and interpretation of medical ultrasound images is one of the most challenging tasks within image analysis. Physical phenomena in ultrasonographic images, e.g., the characteristic speckle noise and shadowing effects, make the majority of standard methods from image analysis non-optimal. Furthermore, validation of adapted computer vision methods proves to be difficult due to missing ground truth information. There is no widely accepted software phantom in the community, and existing software phantoms are not flexible enough to support the use of specific speckle models for different tissue types, e.g., muscle and fat tissue. In this work we propose an anatomical software phantom with a realistic speckle pattern simulation to fill this gap and provide a flexible tool for validation purposes in medical ultrasound image analysis. We discuss the generation of speckle patterns and perform statistical analysis of the simulated textures to obtain quantitative measures of the realism and accuracy of the resulting textures.
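
    As a sketch of what realistic speckle simulation typically involves (not the authors' phantom), the snippet below uses the standard convolution model: a point-spread function convolved with a complex random scatterer field, whose envelope yields fully developed, Rayleigh-distributed speckle. The PSF widths and pulse frequency are arbitrary assumptions.

```python
# Hedged sketch of the convolution speckle model for ultrasound B-mode.
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(0)
shape = (256, 256)

# Complex scatterer field with Gaussian-distributed quadratures
scatterers = rng.normal(size=shape) + 1j * rng.normal(size=shape)

# Separable Gaussian PSF, modulated axially by the transducer pulse
ax = np.arange(-15, 16)
axial = np.exp(-ax ** 2 / 50.0) * np.cos(2 * np.pi * ax / 4.0)
lateral = np.exp(-ax ** 2 / 200.0)
psf = np.outer(axial, lateral)

rf = fftconvolve(scatterers, psf, mode="same")            # complex RF image
envelope = np.abs(rf)                                     # Rayleigh speckle
bmode = 20 * np.log10(envelope / envelope.max() + 1e-6)   # log compression
```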

  5. Perspectives Gained in an Evaluation of Uncertainty, Sensitivity, and Decision Analysis Software

    Energy Technology Data Exchange (ETDEWEB)

    Davis, F.J.; Helton, J.C.

    1999-02-24

    The following software packages for uncertainty, sensitivity, and decision analysis were reviewed and also tested with several simple analysis problems: Crystal Ball, RiskQ, SUSA-PC, Analytica, PRISM, Ithink, Stella, LHS, STEPWISE, and JMP. Results from the review and test problems are presented. The study resulted in the recognition of the importance of four considerations in the selection of a software package: (1) the availability of an appropriate selection of distributions, (2) the ease with which data flows through the input sampling, model evaluation, and output analysis process, (3) the type of models that can be incorporated into the analysis process, and (4) the level of confidence in the software modeling and results.
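
    Among the packages reviewed, LHS implements Latin hypercube sampling; the sketch below shows the textbook scheme (one randomly permuted stratum per dimension) rather than any package's actual code. Names and the seed are illustrative.

```python
# Minimal Latin hypercube sampler on the unit hypercube.
import numpy as np

def latin_hypercube(n_samples, n_dims, seed=None):
    rng = np.random.default_rng(seed)
    u = rng.uniform(size=(n_samples, n_dims))
    samples = np.empty_like(u)
    for d in range(n_dims):
        perm = rng.permutation(n_samples)
        samples[:, d] = (perm + u[:, d]) / n_samples  # one point per stratum
    return samples  # uniform on [0, 1); map through inverse CDFs as needed

pts = latin_hypercube(5, 2, seed=42)
print(pts)
```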

  6. Gear Meshing Transmission Analysis of the Automobile Gearbox Based on the Software MASTA

    Directory of Open Access Journals (Sweden)

    Yongxiang Li

    2013-03-01

    Full Text Available As the main drive components of the automobile manual gearbox, the meshing gears play an important role in transmission performance. Aiming at the problems of traditional gear meshing analysis, this study takes a five-speed gearbox as an example and uses the MASTA software, a professional CAE package for simulating and analyzing gearboxes, to accomplish the gear meshing analysis of the automobile gearbox. Furthermore, a simulation model of the gearbox is built to simulate the actual load conditions and complete the analysis process for the gears. The results point to a new design concept: using the specialized software MASTA for transmission modeling and simulation analysis can greatly improve the design level of the gearbox, reduce the number of tests, and shorten the research and development period. Finally, it can provide references for the development and application of new transmission gears.

  7. A unified approach to feature-centric analysis of object-oriented software

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    Feature-centric comprehension of software is a prerequisite to incorporating modifications requested by users during software evolution and maintenance. However, feature-centric understanding of large object-oriented programs is difficult to achieve due to size, complexity, and the implicit character of mappings between features and source code. In this paper, we address these issues through our unified approach to feature-centric analysis of object-oriented software. Our approach supports discovery of feature-code traceability links and their analysis from three perspectives and at three levels of abstraction. We further improve scalability of analysis by partitioning features into canonical groups. To demonstrate feasibility of our approach, we use our NetBeans-integrated tool Featureous for conducting a case study of feature-centric analysis of the JHotDraw project. Lastly, we discuss how Featureous…

  8. A software tool for design of process monitoring and analysis systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    A well designed process monitoring and analysis system is necessary to consistently achieve any predefined end product quality. Systematic computer aided methods and tools provide the means to design the necessary process monitoring and analysis systems and/or to validate any existing monitoring and analysis system. A software to achieve this has been developed. Two developed supporting tools for the design, a knowledge base (consisting of the process knowledge as well as the knowledge on measurement methods & tools) and a model library (consisting of the process operational models), have been extended rigorously and integrated with the user interface, which made the software more generic and applicable to a wide range of problems. The software for the design of a process monitoring and analysis system is presented and illustrated with a tablet manufacturing process example.

  9. Learning DHTMLX suite UI

    CERN Document Server

    Geske, Eli

    2013-01-01

    A fast-paced, example-based guide to learning DHTMLX. "Learning DHTMLX Suite UI" is for web designers who have a basic knowledge of JavaScript and who are looking for powerful tools that will give them an extra edge in their own application development. This book is also useful for experienced developers who wish to get started with DHTMLX without going through the trouble of learning its quirks through trial and error. Readers are expected to have some knowledge of JavaScript, HTML, the Document Object Model, and the ability to install a local web server.

  10. An Analysis and Design of the Virtual Simulation Software Based on Pattern

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The paper presents a detailed analysis and design of the Vega application software on the Windows NT platform, covering object-oriented software analysis and design, design patterns, and the Windows kernel mechanism. The paper puts forward a design pattern, the fence pattern; building on this pattern and adopting Windows NT memory-mapped files, it presents a Vega application solution based on the multi-process technique. Although the design solution was developed for a real-time simulation system, it rests on a clear analysis of the Vega system; therefore, the solution is broadly practical and has many uses.
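
    The fence pattern itself is not specified in the record, but the memory-mapped-file mechanism it builds on can be sketched generically: two processes exchange state through one mapped file. Python's mmap module stands in for the Windows NT API here; the file name and layout are invented.

```python
# Hypothetical sketch: sharing a value between processes via a mapped file.
import mmap
import struct

PATH, SIZE = "simstate.bin", 8

# Writer: create the file, then update a shared double in place
with open(PATH, "wb") as f:
    f.write(b"\x00" * SIZE)
with open(PATH, "r+b") as f:
    mem = mmap.mmap(f.fileno(), SIZE)
    mem[:8] = struct.pack("d", 12.5)   # e.g., current simulation time
    mem.flush()

# Reader (could run in a different process): map the same file and read
with open(PATH, "r+b") as f:
    mem = mmap.mmap(f.fileno(), SIZE)
    (sim_time,) = struct.unpack("d", mem[:8])
print(sim_time)  # 12.5
```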

  11. PrimerSuite: A High-Throughput Web-Based Primer Design Program for Multiplex Bisulfite PCR

    Science.gov (United States)

    Lu, Jennifer; Johnston, Andrew; Berichon, Philippe; Ru, Ke-lin; Korbie, Darren; Trau, Matt

    2017-01-01

    The analysis of DNA methylation at CpG dinucleotides has become a major research focus due to its regulatory role in numerous biological processes, but the requisite need for assays which amplify bisulfite-converted DNA represents a major bottleneck due to the unique design constraints imposed on bisulfite-PCR primers. Moreover, a review of the literature indicated no available software solutions which accommodated both high-throughput primer design, support for multiplex amplification assays, and primer-dimer prediction. In response, the tri-modular software package PrimerSuite was developed to support bisulfite multiplex PCR applications. This software was constructed to (i) design bisulfite primers against multiple regions simultaneously (PrimerSuite), (ii) screen for primer-primer dimerizing artefacts (PrimerDimer), and (iii) support multiplex PCR assays (PrimerPlex). Moreover, a major focus in the development of this software package was the emphasis on extensive empirical validation, and over 1300 unique primer pairs have been successfully designed and screened, with over 94% of them producing amplicons of the expected size, and an average mapping efficiency of 93% when screened using bisulfite multiplex resequencing. The potential use of the software in other bisulfite-based applications such as methylation-specific PCR is under consideration for future updates. This resource is freely available for use at PrimerSuite website (www.primer-suite.com). PMID:28117430

  12. PrimerSuite: A High-Throughput Web-Based Primer Design Program for Multiplex Bisulfite PCR.

    Science.gov (United States)

    Lu, Jennifer; Johnston, Andrew; Berichon, Philippe; Ru, Ke-Lin; Korbie, Darren; Trau, Matt

    2017-01-24

    The analysis of DNA methylation at CpG dinucleotides has become a major research focus due to its regulatory role in numerous biological processes, but the requisite need for assays which amplify bisulfite-converted DNA represents a major bottleneck due to the unique design constraints imposed on bisulfite-PCR primers. Moreover, a review of the literature indicated no available software solutions which accommodated both high-throughput primer design, support for multiplex amplification assays, and primer-dimer prediction. In response, the tri-modular software package PrimerSuite was developed to support bisulfite multiplex PCR applications. This software was constructed to (i) design bisulfite primers against multiple regions simultaneously (PrimerSuite), (ii) screen for primer-primer dimerizing artefacts (PrimerDimer), and (iii) support multiplex PCR assays (PrimerPlex). Moreover, a major focus in the development of this software package was the emphasis on extensive empirical validation, and over 1300 unique primer pairs have been successfully designed and screened, with over 94% of them producing amplicons of the expected size, and an average mapping efficiency of 93% when screened using bisulfite multiplex resequencing. The potential use of the software in other bisulfite-based applications such as methylation-specific PCR is under consideration for future updates. This resource is freely available for use at PrimerSuite website (www.primer-suite.com).

  13. Scilab and Maxima Environment: Towards Free Software in Numerical Analysis

    Science.gov (United States)

    Mora, Angel; Galan, Jose Luis; Aguilera, Gabriel; Fernandez, Alvaro; Merida, Enrique; Rodriguez, Pedro

    2010-01-01

    In this work we will present the ScilabUMA environment we have developed as an alternative to Matlab. This environment connects Scilab (for numerical analysis) and Maxima (for symbolic computations). Furthermore, the developed interface is, in our opinion at least, as powerful as the interface of Matlab. (Contains 3 figures.)

  14. Field Of View Of A Spacecraft Antenna: Analysis And Software

    Science.gov (United States)

    Wu, Te-Kao; Kipp, R.; Lee, S. W.

    1995-01-01

    Report summarizes a computational analysis of the field of view of a rotating elliptical-cross-section parabolic-reflector antenna for the SeaWinds spacecraft. Issues considered include blockage and diffraction by other objects near the antenna, related concerns about electromagnetic interference and electromagnetic compatibility, and how far away, and in which configuration, other objects must be positioned with respect to the antenna to achieve the required performance.

  15. Automated Structure Solution with the PHENIX Suite

    Energy Technology Data Exchange (ETDEWEB)

    Zwart, Peter H.; Zwart, Peter H.; Afonine, Pavel; Grosse-Kunstleve, Ralf W.; Hung, Li-Wei; Ioerger, Tom R.; McCoy, A.J.; McKee, Eric; Moriarty, Nigel; Read, Randy J.; Sacchettini, James C.; Sauter, Nicholas K.; Storoni, L.C.; Terwilliger, Tomas C.; Adams, Paul D.

    2008-06-09

    Significant time and effort are often required to solve and complete a macromolecular crystal structure. The development of automated computational methods for the analysis, solution and completion of crystallographic structures has the potential to produce minimally biased models in a short time without the need for manual intervention. The PHENIX software suite is a highly automated system for macromolecular structure determination that can rapidly arrive at an initial partial model of a structure without significant human intervention, given moderate resolution and good quality data. This achievement has been made possible by the development of new algorithms for structure determination, maximum-likelihood molecular replacement (PHASER), heavy-atom search (HySS), template and pattern-based automated model-building (RESOLVE, TEXTAL), automated macromolecular refinement (phenix.refine), and iterative model-building, density modification and refinement that can operate at moderate resolution (RESOLVE, AutoBuild). These algorithms are based on a highly integrated and comprehensive set of crystallographic libraries that have been built and made available to the community. The algorithms are tightly linked and made easily accessible to users through the PHENIX Wizards and the PHENIX GUI.

  16. Automated structure solution with the PHENIX suite

    Energy Technology Data Exchange (ETDEWEB)

    Terwilliger, Thomas C [Los Alamos National Laboratory; Zwart, Peter H [LBNL; Afonine, Pavel V [LBNL; Grosse - Kunstleve, Ralf W [LBNL

    2008-01-01

    Significant time and effort are often required to solve and complete a macromolecular crystal structure. The development of automated computational methods for the analysis, solution, and completion of crystallographic structures has the potential to produce minimally biased models in a short time without the need for manual intervention. The PHENIX software suite is a highly automated system for macromolecular structure determination that can rapidly arrive at an initial partial model of a structure without significant human intervention, given moderate resolution and good-quality data. This achievement has been made possible by the development of new algorithms for structure determination, maximum-likelihood molecular replacement (PHASER), heavy-atom search (HySS), template- and pattern-based automated model-building (RESOLVE, TEXTAL), automated macromolecular refinement (phenix.refine), and iterative model-building, density modification and refinement that can operate at moderate resolution (RESOLVE, AutoBuild). These algorithms are based on a highly integrated and comprehensive set of crystallographic libraries that have been built and made available to the community. The algorithms are tightly linked and made easily accessible to users through the PHENIX Wizards and the PHENIX GUI.

  17. Assessing climate model software quality: a defect density analysis of three models

    Directory of Open Access Journals (Sweden)

    J. Pipitone

    2012-08-01

    Full Text Available A climate model is an executable theory of the climate; the model encapsulates climatological theories in software so that they can be simulated and their implications investigated. Thus, in order to trust a climate model, one must trust that the software it is built from is built correctly. Our study explores the nature of software quality in the context of climate modelling. We performed an analysis of defect reports and defect fixes in several versions of leading global climate models by collecting defect data from bug tracking systems and version control repository comments. We found that the climate models all have very low defect densities compared to well-known, similarly sized open-source projects. We discuss the implications of our findings for the assessment of climate model software trustworthiness.
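
    Defect density, the metric underlying the study, is simply the defect count normalized by code size, conventionally per thousand lines of code (KLOC); a trivial sketch with made-up numbers follows.

```python
# Defect density = defects per thousand lines of code (KLOC).
def defect_density(defect_count, sloc):
    return defect_count / (sloc / 1000.0)

# Invented example: 25 confirmed defects in a 400,000-line model
print(defect_density(25, 400_000))   # 0.0625 defects per KLOC
```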

  18. The Implication of Using NVivo Software in Qualitative Data Analysis: Evidence-Based Reflections.

    Science.gov (United States)

    Zamawe, F C

    2015-03-01

    For a long time, electronic data analysis has been associated with quantitative methods. However, Computer Assisted Qualitative Data Analysis Software (CAQDAS) are increasingly being developed. Although the CAQDAS has been there for decades, very few qualitative health researchers report using it. This may be due to the difficulties that one has to go through to master the software and the misconceptions that are associated with using CAQDAS. While the issue of mastering CAQDAS has received ample attention, little has been done to address the misconceptions associated with CAQDAS. In this paper, the author reflects on his experience of interacting with one of the popular CAQDAS (NVivo) in order to provide evidence-based implications of using the software. The key message is that unlike statistical software, the main function of CAQDAS is not to analyse data but rather to aid the analysis process, which the researcher must always remain in control of. In other words, researchers must equally know that no software can analyse qualitative data. CAQDAS are basically data management packages, which support the researcher during analysis.

  19. Felyx : A Free Open Software Solution for the Analysis of Large Earth Observation Datasets

    Science.gov (United States)

    Piolle, Jean-Francois; Shutler, Jamie; Poulter, David; Guidetti, Veronica; Donlon, Craig

    2014-05-01

    The GHRSST project, by assembling large collections of earth observation data from various sources and agencies, has also raised the need to provide the user community with tools to inter-compare them and to assess and monitor their quality. The ESA/Medspiration project, which implemented the first operating node of the GHRSST system for Europe, also paved the way successfully towards such generic analytics tools by developing the High Resolution Diagnostic Dataset System (HR-DDS) and Satellite-to-In-situ Multi-sensor Match-up Databases. Building on this heritage, ESA is now funding the development by IFREMER, PML and Pelamis of Felyx, a web tool merging the two capabilities into a single software solution. It will consist of a free, open software solution, written in Python and JavaScript, whose aim is to provide Earth Observation data producers and users with an open-source, flexible and reusable tool allowing the quality and performance of data streams (satellite, in situ and model) to be easily monitored and studied. The primary concept of Felyx is to work as an extraction tool, subsetting source data over predefined target areas (which can be static or moving): these data subsets, and associated metrics, can then be accessed by users or client applications either as raw files, as automatic alerts and reports generated periodically, or through a flexible web interface enabling statistical analysis and visualization. Felyx presents itself as an open-source suite of tools, written in Python and JavaScript, enabling: * subsetting large local or remote collections of Earth Observation data over predefined sites (geographical boxes) or moving targets (ship, buoy, hurricane), storing locally the extracted data (referred to as miniProds). These miniProds constitute a much smaller representative subset of the original collection on which one can perform any kind of processing or assessment without having to cope with heavy volumes of data. * computing statistical metrics over these

  20. Software for the Spectral Analysis of Hot Stars

    CERN Document Server

    Rauch, Thomas; Stampa, Ulrike; Demleitner, Markus; Koesterke, Lars

    2009-01-01

    In a collaboration of the German Astrophysical Virtual Observatory (GAVO) and AstroGrid-D, the German Astronomy Community Grid (GACG), we provide a VO service for the access and calculation of stellar synthetic energy distributions (SEDs) based on static as well as expanding non-LTE model atmospheres. At three levels, a VO user may directly compare observed and theoretical SEDs: The easiest and fastest way is to use pre-calculated SEDs from the GAVO database. For individual objects, grids of model atmospheres and SEDs can be calculated on the compute resources of AstroGrid-D within reasonable wallclock time. Experienced VO users may even create their own atomic-data files for a more detailed analysis. This VO service also opens the perspective for a new approach to automated spectral analysis of a large number of observations, e.g., those provided by multi-object spectrographs.

  1. A Software-Assisted Qualitative Content Analysis of News Articles: Example and Reflections

    Directory of Open Access Journals (Sweden)

    Florian Kaefer

    2015-05-01

    Full Text Available This article offers a step-by-step description of how qualitative data analysis software can be used for a qualitative content analysis of newspaper articles. Using NVivo as an example, it illustrates how software tools can facilitate analytical flexibility and how they can enhance transparency and trustworthiness of the qualitative research process. Following a brief discussion of the key characteristics, advantages and limitations of qualitative data analysis software, the article describes a qualitative content analysis of 230 newspaper articles, conducted to determine international media perceptions of New Zealand's environmental performance in connection with climate change and carbon emissions. The article proposes a multi-level coding approach during the analysis of news texts that combines quantitative and qualitative elements, allowing the researcher to move back and forth in coding and between analytical levels. The article concludes that while qualitative data analysis software, such as NVivo, will not do the analysis for the researcher, it can make the analytical process more flexible, transparent and ultimately more trustworthy. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs150283

  2. Analysis of Software Test Item Generation- Comparison Between High Skilled and Low Skilled Engineers

    Institute of Scientific and Technical Information of China (English)

    Masayuki Hirayama; Osamu Mizuno; Tohru Kikuno

    2005-01-01

    Recent software systems contain many functions to provide various services. Given this tendency, it is difficult to ensure software quality and to eliminate crucial faults with conventional software testing methods. Taking the effect of the test engineer's skill on test item generation into consideration, we propose a new test item generation method, which supports the generation of test items for illegal behavior of the system. The proposed method can generate test items based on use-case analysis, deviation analysis for legal behavior, and fault tree analysis for system fault situations. From the results of experimental applications of our method, we confirmed that test items for illegal behavior of a system were generated effectively, and that the proposed method can effectively assist test item generation by an engineer with a low skill level.

  3. Anthropometric Accommodation in Space Suit Design

    Science.gov (United States)

    Rajulu, Sudhakar; Thaxton, Sherry

    2007-01-01

    Design requirements for next generation hardware are in process at NASA. Anthropometry requirements are given in terms of minimum and maximum sizes for critical dimensions that hardware must accommodate. These dimensions drive vehicle design and suit design, and implicitly have an effect on crew selection and participation. At this stage in the process, stakeholders such as cockpit and suit designers were asked to provide lists of dimensions that will be critical for their design. In addition, they were asked to provide technically feasible minimum and maximum ranges for these dimensions. Using an adjusted 1988 Anthropometric Survey of U.S. Army (ANSUR) database to represent a future astronaut population, the accommodation ranges provided by the suit critical dimensions were calculated. This project involved participation from the Anthropometry and Biomechanics facility (ABF) as well as suit designers, with suit designers providing expertise about feasible hardware dimensions and the ABF providing accommodation analysis. The initial analysis provided the suit design team with the accommodation levels associated with the critical dimensions provided early in the study. Additional outcomes will include a comparison of principal components analysis as an alternate method for anthropometric analysis.

  4. Statistical Analysis for Test Papers with Software SPSS

    Institute of Scientific and Technical Information of China (English)

    张燕君

    2012-01-01

    Test paper evaluation is an important part of test management, and its results are a significant basis for the scientific summation of teaching and learning. Taking an English test paper from a high school students' monthly examination as the object, this paper focuses on the interpretation of SPSS output concerning item-level and whole-paper quantitative analysis. By analyzing and evaluating the papers, teachers gain feedback with which to check students' progress and adjust their teaching process.

  5. Software Safety Analysis of a Flight Guidance System

    Science.gov (United States)

    Butler, Ricky W. (Technical Monitor); Tribble, Alan C.; Miller, Steven P.; Lempia, David L.

    2004-01-01

    This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified and the presence of the safety properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of the formal methods analysis technique model checking that was used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.

  6. Splitting a Large Software Archive for Easing Future Software Evolution: An Industrial Experience Report using Formal Concept Analysis

    NARCIS (Netherlands)

    Glorie, M.; Zaidman, A.E.; Hofland, L.; Van Deursen, A.

    2008-01-01

    Preprint of paper published in: CSMR 2008 - 12th European Conference on Software Maintenance and Reengineering, 1-4 April 2008; doi:10.1109/CSMR.2008.4493310 Philips medical systems produces medical diagnostic imaging products, such as MR, X-ray and CT scanners. The software of these devices is com

  7. Analyzing the State of Static Analysis: A Large-Scale Evaluation in Open Source Software

    OpenAIRE

    2016-01-01

    The use of automatic static analysis has been a software engineering best practice for decades. However, we still do not know a lot about its use in real-world software projects: How prevalent is the use of Automated Static Analysis Tools (ASATs) such as FindBugs and JSHint? How do developers use these tools, and how does their use evolve over time? We research these questions in two studies on nine different ASATs for Java, JavaScript, Ruby, and Python with a population of 122 and 168,214 op...

  8. Clementine sensor suite

    Energy Technology Data Exchange (ETDEWEB)

    Ledebuhr, A.G. [Lawrence Livermore National Lab., CA (United States)

    1994-11-15

    LLNL designed and built the suite of six miniaturized light-weight space-qualified sensors utilized in the Clementine mission. A major goal of the Clementine program was to demonstrate technologies originally developed for Ballistic Missile Defense Organization Programs. These sensors were modified to gather data from the moon. This overview presents each of these sensors and some preliminary on-orbit performance estimates. The basic subsystems of these sensors include optical baffles to reject off-axis stray light, light-weight ruggedized optical systems, filter wheel assemblies, radiation tolerant focal plane arrays, radiation hardened control and readout electronics and low mass and power mechanical cryogenic coolers for the infrared sensors. Descriptions of each sensor type are given along with design specifications, photographs and on-orbit data collected.

  9. Refining the granulite suite

    Science.gov (United States)

    Taylor, G. Jeffrey; Norman, Marc D.; Keil, Klaus; Cushing, Janet A.

    1992-01-01

    Early studies of rocks retrieved from the Moon during the Apollo missions defined a group of rocks as granulites or 'granulitic impactites'. This included rocks with cataclastic, granulitic, and poikilitic or poikiloblastic textures. Petrographic studies indicate that the textures of 'granulitic breccias' are significantly varied so as to redefine the granulitic suite into at least two distinct groups. The first group consists of rocks that have true granulitic textures: polygonal to rounded, equant grains that are annealed, and have triple junctions with small dispersions from the average 120 degrees. The second group of rocks have poikilitic or poikiloblastic textures, with subhedral to euhedral plagioclase and/or olivine grains enclosed in pyroxene oikocrysts. In some instances, the relationship between the minerals resembles an orthocumulate texture. Rocks previously thought of as granulites may have formed in more than one way. These formation mechanisms are briefly discussed.

  10. FloWave.US: validated, open-source, and flexible software for ultrasound blood flow analysis.

    Science.gov (United States)

    Coolbaugh, Crystal L; Bush, Emily C; Caskey, Charles F; Damon, Bruce M; Towse, Theodore F

    2016-10-01

    Automated software improves the accuracy and reliability of blood velocity, vessel diameter, blood flow, and shear rate ultrasound measurements, but existing software offers limited flexibility to customize and validate analyses. We developed FloWave.US (open-source software to automate ultrasound blood flow analysis) and demonstrated the validity of its blood velocity (aggregate relative error, 4.32%) and vessel diameter (0.31%) measures with a skeletal muscle ultrasound flow phantom. Compared with a commercial, manual analysis software program, FloWave.US produced equivalent in vivo cardiac cycle time-averaged mean (TAMean) velocities at rest and following a 10-s muscle contraction (mean bias …); processing of blood flow data was 9.8 times faster than the manual method. Finally, a case study of a lower extremity muscle contraction experiment highlighted the ability of FloWave.US to measure small fluctuations in TAMean velocity, vessel diameter, and mean blood flow at specific time points in the cardiac cycle. In summary, the collective features of our newly designed software (accuracy, reliability, reduced processing time, cost-effectiveness, and flexibility) offer advantages over existing proprietary options. Further, public distribution of FloWave.US allows researchers to easily access and customize code to adapt ultrasound blood flow analysis to a variety of vascular physiology applications.
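
    The derived quantities named in the record follow from standard hemodynamic formulas; the sketch below (not FloWave.US source code) computes mean blood flow from TAMean velocity and vessel diameter, and shear rate under the usual Poiseuille assumption. The example values are invented.

```python
# Standard derived measures from TAMean velocity and vessel diameter.
import math

def mean_blood_flow(tamean_cm_s, diameter_cm):
    """Flow in mL/min from TAMean velocity (cm/s) and diameter (cm)."""
    area = math.pi * (diameter_cm / 2.0) ** 2   # cross-section, cm^2
    return tamean_cm_s * area * 60.0            # cm^3/min == mL/min

def mean_shear_rate(tamean_cm_s, diameter_cm):
    """Shear rate (1/s) = 8 * V / D for Poiseuille flow."""
    return 8.0 * tamean_cm_s / diameter_cm

print(mean_blood_flow(15.0, 0.4))   # ~113 mL/min for the toy values
print(mean_shear_rate(15.0, 0.4))   # 300 1/s
```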

  11. Future space suit design considerations.

    Science.gov (United States)

    1991-07-01

    Future space travel to the moon and Mars will present new challenges in space suit design. This paper examines the impact that working on the surface environment of the moon and Mars will have on the requirements of space suits. In particular, habitat pressures will impact suit weight and design. Potential structural materials are explored, as are the difficulties in designing a suit to withstand the severe dust conditions expected.

  12. Software Tools for Robust Analysis of High-Dimensional Data

    Directory of Open Access Journals (Sweden)

    Valentin Todorov

    2014-06-01

    Full Text Available The present work discusses robust multivariate methods specifically designed for high dimensions. Their implementation in R is presented and their application is illustrated on examples. The first group are algorithms for outlier detection, already introduced elsewhere and implemented in other packages. The value added of the new package is that all methods follow the same design pattern and thus can use the same graphical and diagnostic tools. The next topic covered is sparse principal components, including an object-oriented interface to the standard method proposed by Zou, Hastie, and Tibshirani (2006) and the robust one proposed by Croux, Filzmoser, and Fritz (2013). Robust partial least squares (see Hubert and Vanden Branden 2003) as well as partial least squares for discriminant analysis conclude the scope of the new package.

  13. STEM_CELL: a software tool for electron microscopy: part 2--analysis of crystalline materials.

    Science.gov (United States)

    Grillo, Vincenzo; Rossi, Francesca

    2013-02-01

    A new graphical software package (STEM_CELL) for the analysis of HRTEM and STEM-HAADF images is introduced here in detail. The advantage of the software, beyond its graphical interface, is that it combines different analysis algorithms with simulation (described in an associated article) to produce novel analysis methodologies. Several implementations of and improvements to state-of-the-art approaches are reported for image analysis, filtering, normalization, and background subtraction. In particular, two important methodological results are highlighted here: (i) the definition of a procedure for atomic-scale quantitative analysis of HAADF images; (ii) the extension of geometric phase analysis to large regions, up to potentially 1 μm, through the use of undersampled images with aliasing effects.

  14. Multi-dimensional project evaluation: Combining cost-benefit analysis and multi-criteria analysis with the COSIMA software system

    DEFF Research Database (Denmark)

    Multi-criteria analysis (MCA) breaks down a problem into its constituent parts in order to better understand the problem and consequently arrive at a decision. However, while MCA opens up for the possibility to include non-market impacts, it does not provide the decision makers with guidance combining the CBA with MCA. In the paper, different methods for combining cost-benefit analysis and multi-criteria analysis are examined and compared, and a software system is presented. The software system gives the decision makers some possibilities regarding preference analysis, sensitivity and risk analysis. The aim of the software … (citizens in Nuuk and other citizens in Greenland) are examined and compared. The cost-benefit analysis of the three airport alternatives includes impacts like travel time (for business and local travellers), waiting time, drawback of shifts, regularity, out-of-pocket costs, and operating costs …

  15. A Systematic Analysis of Functional Safety Certification Practices in Industrial Robot Software Development

    Directory of Open Access Journals (Sweden)

    Tong Xie

    2017-01-01

    Full Text Available For decades, industrial robotics has delivered on the promise of speed, efficiency and productivity. The last several years have seen a sharp resurgence in orders of industrial robots in China, and the areas addressed within industrial robotics have extended into safety-critical domains. However, safety standards have not yet been implemented widely in academia and engineering applications, particularly in robot software development. This paper presents a systematic analysis of functional safety certification practices in the development of safety-critical software for industrial robots, to identify the safety certification practices used for the development of industrial robots in China and how these practices comply with safety standard requirements. Our review of Chinese academic papers shows that safety standards are barely used in the software development of industrial robots. The majority of the papers propose various solutions to achieve safety, but only about two thirds of the papers refer to non-standardized approaches, which mainly address the system level rather than the software development level. In addition, our research shows that, with the development of artificial intelligence, this emerging field is still on a quest for standardized and suitable approaches to developing safety-critical software.

  16. [Application of Stata software to test heterogeneity in meta-analysis method].

    Science.gov (United States)

    Wang, Dan; Mou, Zhen-yun; Zhai, Jun-xia; Zong, Hong-xia; Zhao, Xiao-dong

    2008-07-01

    To introduce the application of Stata software to heterogeneity testing in meta-analysis. A data set was set up according to the example in the study, and the corresponding commands in Stata 9 were applied to test the example. The methods used were the Q-test and the I2 statistic attached to the fixed-effect model forest plot, the H statistic, and the Galbraith plot. The existence of heterogeneity among studies can be detected by the Q-test and the H statistic, and its degree by the I2 statistic. The outliers that are the sources of the heterogeneity can be spotted in the Galbraith plot. Heterogeneity testing in meta-analysis can be completed simply and quickly by these four methods in Stata; the H and I2 statistics are the more robust of the four, and the outliers driving heterogeneity can be seen clearly in the Galbraith plot.
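
    Although the record concerns Stata commands, the statistics themselves are simple; a hedged numpy sketch of Cochran's Q, I2, and H for a fixed-effect meta-analysis follows, with toy effect sizes and variances standing in for real study data.

```python
# Heterogeneity statistics for a fixed-effect meta-analysis.
import numpy as np

def heterogeneity(effects, variances):
    effects = np.asarray(effects, dtype=float)
    w = 1.0 / np.asarray(variances, dtype=float)   # inverse-variance weights
    theta = np.sum(w * effects) / np.sum(w)        # pooled fixed-effect estimate
    q = np.sum(w * (effects - theta) ** 2)         # Cochran's Q
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100.0            # I^2, percent
    h = np.sqrt(q / df)                            # H statistic
    return theta, q, i2, h

print(heterogeneity([0.30, 0.10, 0.52, 0.45], [0.010, 0.020, 0.015, 0.010]))
```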

  17. Featureous: infrastructure for feature-centric analysis of object-oriented software

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    The decentralized nature of collaborations between objects in object-oriented software makes it difficult to understand how user-observable program features are implemented and how their implementations relate to each other. It is worthwhile to improve this situation, since feature-centric program understanding and modification are essential during software evolution and maintenance. In this paper, we present an infrastructure built on top of the NetBeans IDE called Featureous that allows for rapid construction of tools for feature-centric analysis of object-oriented software. Our infrastructure encompasses a lightweight feature location mechanism, a number of analytical views and an API allowing for addition of third-party extensions. To form a common conceptual framework for future feature-centric extensions, we propose to structure feature-centric analysis along three dimensions: perspective…

  18. Meta-analysis for Discovering Rare-Variant Associations: Statistical Methods and Software Programs.

    Science.gov (United States)

    Tang, Zheng-Zheng; Lin, Dan-Yu

    2015-07-02

    There is heightened interest in using next-generation sequencing technologies to identify rare variants that influence complex human diseases and traits. Meta-analysis is essential to this endeavor because large sample sizes are required for detecting associations with rare variants. In this article, we provide a comprehensive overview of statistical methods for meta-analysis of sequencing studies for discovering rare-variant associations. Specifically, we discuss the calculation of relevant summary statistics from participating studies, the construction of gene-level association tests, the choice of transformation for quantitative traits, the use of fixed-effects versus random-effects models, and the removal of shadow association signals through conditional analysis. We also show that meta-analysis based on properly calculated summary statistics is as powerful as joint analysis of individual-participant data. In addition, we demonstrate the performance of different meta-analysis methods by using both simulated and empirical data. We then compare four major software packages for meta-analysis of rare-variant associations-MASS, RAREMETAL, MetaSKAT, and seqMeta-in terms of the underlying statistical methodology, analysis pipeline, and software interface. Finally, we present PreMeta, a software interface that integrates the four meta-analysis packages and allows a consortium to combine otherwise incompatible summary statistics.

  19. Methodology for object-oriented real-time systems analysis and design: Software engineering

    Science.gov (United States)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structurings of the system, so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation and the original specification, and perhaps the high-level design, is not object oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects whose operations or methods correspond to processes in the data flow diagrams, and then designing in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time-behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly connected models which progress from the logical models of object-oriented real-time systems analysis through the physical architectural models and the high-level design stages. The methodology is appropriate to the overall specification, including hardware and software modules. In software modules, the systems-analysis objects are transformed into software objects.
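
    To make the second approach concrete, here is a toy sketch of a 'systems-analysis object': an entity whose time-behavior is a set of states plus state-transition rules. The valve example and all names are invented, and real-time timing concerns are omitted.

```python
# Toy entity with explicit states and state-transition rules.
class ValveController:
    TRANSITIONS = {
        ("closed", "open_cmd"): "opening",
        ("opening", "fully_open"): "open",
        ("open", "close_cmd"): "closing",
        ("closing", "fully_closed"): "closed",
    }

    def __init__(self):
        self.state = "closed"

    def on_event(self, event):
        # Unknown (state, event) pairs leave the state unchanged
        self.state = self.TRANSITIONS.get((self.state, event), self.state)

v = ValveController()
for e in ["open_cmd", "fully_open", "close_cmd"]:
    v.on_event(e)
print(v.state)  # closing
```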

  20. Public-domain software for root image analysis

    Directory of Open Access Journals (Sweden)

    Mirian Cristina Gomes Costa

    2014-10-01

    In the search for high efficiency in root studies, computational systems have been developed to analyze digital images. ImageJ and Safira are public-domain systems that may be used for image analysis of washed roots. However, differences between the root properties measured by ImageJ and by Safira were expected. This study compared values of root length and surface area obtained with the public-domain systems against values obtained by a reference method. Root samples were collected in a banana plantation in an area of a shallower Typic Carbonatic Haplic Cambisol (CXk) and an area of a deeper Typic Haplic Ta Eutrophic Cambisol (CXve), at six depths in five replications. Root images were digitized and the systems ImageJ and Safira were used to determine root length and surface area. The line-intersect method modified by Tennant was used as reference; values of root length and surface area measured with the different systems were analyzed by Pearson's correlation coefficient and compared by confidence interval and t-test. Both ImageJ and Safira had positive correlation coefficients with the reference method for root length and surface area data in CXk and CXve. The correlation coefficient ranged from 0.54 to 0.80, with the lowest value observed for ImageJ in the measurement of the surface area of roots sampled in CXve. The 95 % confidence interval revealed that root length measurements with Safira did not differ from those with the reference method in CXk (-77.3 to 244.0 mm). Regarding surface area measurements, Safira did not differ from the reference method for samples collected in CXk (-530.6 to 565.8 mm²) as well as in CXve (-4231 to 612.1 mm²). However, measurements with ImageJ differed from those obtained by the reference method, underestimating length and surface area in samples collected in CXk and CXve. Both ImageJ and Safira allow an identification of increases or decreases in root length and surface area. However, Safira results for root length and surface area are
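    The comparison against the reference method rests on Pearson correlation, confidence intervals and t-tests. The Python sketch below reproduces that kind of check on invented root-length values; it is not part of ImageJ or Safira.

      # Comparing an image-analysis measurement against the line-intersect
      # reference: correlation, paired t-test, and a 95 % CI of the difference.
      import numpy as np
      from scipy import stats

      reference = np.array([120.0, 95.0, 143.0, 88.0, 110.0])  # mm, Tennant
      image_sys = np.array([115.0, 99.0, 138.0, 90.0, 105.0])  # mm, e.g. ImageJ

      r, r_p = stats.pearsonr(reference, image_sys)
      t, t_p = stats.ttest_rel(reference, image_sys)

      diff = image_sys - reference
      ci = stats.t.interval(0.95, len(diff) - 1,
                            loc=diff.mean(), scale=stats.sem(diff))
      print(f"Pearson r={r:.2f} (p={r_p:.3f}); paired t-test p={t_p:.3f}; "
            f"95 % CI of difference: {ci[0]:.1f} to {ci[1]:.1f} mm")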

  1. A document-driven method for certifying scientific computing software for use in nuclear safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Smith, W. Spencer; Koothoor, Mimitha [Computing and Software Department, McMaster University, Hamilton (Canada)

    2016-04-15

    This paper presents a documentation and development method to facilitate the certification of scientific computing software used in the safety analysis of nuclear facilities. To study the problems faced during quality assurance and certification activities, a case study was performed on legacy software used for thermal analysis of a fuel pin in a nuclear reactor. Although no errors were uncovered in the code, 27 issues of incompleteness and inconsistency were found with the documentation. This work proposes that software documentation follow a rational process, which includes a software requirements specification following a template that is reusable, maintainable, and understandable. To develop the design and implementation, this paper suggests literate programming as an alternative to traditional structured programming. Literate programming allows for documenting of numerical algorithms and code together in what is termed the literate programmer's manual. This manual is developed with explicit traceability to the software requirements specification. The traceability between the theory, numerical algorithms, and implementation facilitates achieving completeness and consistency, as well as simplifies the process of verification and the associated certification.

  2. mtsslSuite: In silico spin labelling, trilateration and distance-constrained rigid body docking in PyMOL

    Science.gov (United States)

    Hagelueken, Gregor; Abdullin, Dinar; Ward, Richard; Schiemann, Olav

    2013-10-01

    Nanometer distance measurements based on electron paramagnetic resonance methods in combination with site-directed spin labelling are powerful tools for the structural analysis of macromolecules. The software package mtsslSuite provides scientists with a set of tools for the translation of experimental distance distributions into structural information. The package is based on the previously published mtsslWizard software for in silico spin labelling. The mtsslSuite includes a new version of MtsslWizard that has improved performance and now includes additional types of spin labels. Moreover, it contains applications for the trilateration of paramagnetic centres in biomolecules and for rigid-body docking of subdomains of macromolecular complexes. The mtsslSuite is tested on a number of challenging test cases and its strengths and weaknesses are evaluated.
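    Trilateration, as mentioned above, locates a paramagnetic centre from several spin-label positions and measured distances. The Python sketch below solves that geometric problem by least squares; the coordinates and distances are invented, and this is not the mtsslSuite implementation.

      # Locate an unknown centre from distances to known spin-label positions.
      import numpy as np
      from scipy.optimize import least_squares

      labels = np.array([[0.0, 0.0, 0.0],      # known spin-label coordinates (nm)
                         [3.0, 0.0, 0.0],
                         [0.0, 3.0, 0.0],
                         [0.0, 0.0, 3.0]])
      dists = np.array([2.0, 2.2, 2.1, 2.4])   # measured distances (nm)

      def residuals(p):
          # Mismatch between modelled and measured distances.
          return np.linalg.norm(labels - p, axis=1) - dists

      fit = least_squares(residuals, x0=labels.mean(axis=0))
      print("estimated centre:", fit.x)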

  3. An Assessment of a Beowulf System for a Wide Class of Analysis and Design Software

    Science.gov (United States)

    Katz, D. S.; Cwik, T.; Kwan, B. H.; Lou, J. Z.; Springer, P. L.; Sterling, T. L.; Wang, P.

    1997-01-01

    This paper discusses Beowulf systems, focusing on Hyglac, the Beowulf system installed at the Jet Propulsion Laboratory. The purpose of the paper is to assess how a system of this type will perform while running a variety of scientific and engineering analysis and design software.

  4. Structural dynamics teaching example: A linear test analysis case using open software

    DEFF Research Database (Denmark)

    Sturesson, P. O.; Brandt, A.; Ristinmaa, M.

    2013-01-01

    experimental modal analysis data. By using open software based on MATLAB® as a basis for the example, the applied numerical methods are made transparent to the student. The example is built on a combination of the free CALFEM® and ABRAVIBE toolboxes, and thus all code used in this paper is publicly available.

  5. The Design of Lessons Using Mathematics Analysis Software to Support Multiple Representations in Secondary School Mathematics

    Science.gov (United States)

    Pierce, Robyn; Stacey, Kaye; Wander, Roger; Ball, Lynda

    2011-01-01

    Current technologies incorporating sophisticated mathematical analysis software (calculation, graphing, dynamic geometry, tables, and more) provide easy access to multiple representations of mathematical problems. Realising the affordances of such technology for students' learning requires carefully designed lessons. This paper reports on design…

  6. Long Term Preservation of Data Analysis Software at the NASA/IPAC Infrared Science Archive

    NARCIS (Netherlands)

    H.I. Teplitz; S. Groom; T. Brooke; V. Desai; D. Engler; J. Fowler; J. Good; I. Khan; D. Levine; A. Alexov

    2011-01-01

    The NASA/IPAC Infrared Science Archive (IRSA) curates both data and analysis tools from NASA's infrared missions. As part of our primary goal, we provide long term access to mission-specific software from projects such as IRAS and Spitzer. We will review the efforts by IRSA (and within the greater I

  7. POSTMan (POST-translational modification analysis), a software application for PTM discovery.

    Science.gov (United States)

    Arntzen, Magnus Ø; Osland, Christoffer Leif; Raa, Christopher Rasch-Olsen; Kopperud, Reidun; Døskeland, Stein-Ove; Lewis, Aurélia E; D'Santos, Clive S

    2009-03-01

    Post-translationally modified peptides present in low concentrations are often not selected for CID, resulting in no sequence information for these peptides. We have developed a software tool, POSTMan (POST-translational Modification analysis), that allows post-translationally modified peptides to be targeted for fragmentation. The software aligns LC-MS runs (MS(1) data) between individual runs or within a single run and isolates pairs of peptides which differ by a user-defined mass difference (post-translationally modified peptides). The method was validated for acetylated peptides and allowed an assessment of even the basal protein phosphorylation of phenylalanine hydroxylase (PHA) in intact cells.
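    The core pairing step is simple: scan MS(1) features for mass pairs separated by the modification mass. The Python sketch below does this for acetylation (+42.0106 Da) on invented masses; it is not POSTMan's code.

      # Find peptide pairs whose masses differ by a user-defined shift.
      def find_ptm_pairs(masses, delta, tol=0.01):
          """Return index pairs (i, j) with masses[j] - masses[i] ~= delta."""
          pairs = []
          for i, m1 in enumerate(masses):
              for j, m2 in enumerate(masses):
                  if i != j and abs((m2 - m1) - delta) <= tol:
                      pairs.append((i, j))
          return pairs

      features = [800.40, 842.41, 1024.55, 1066.57, 1200.00]  # Da, illustrative
      print(find_ptm_pairs(features, delta=42.0106))  # -> [(0, 1), (2, 3)]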

  8. Validation of a Video Analysis Software Package for Quantifying Movement Velocity in Resistance Exercises.

    Science.gov (United States)

    Sañudo, Borja; Rueda, David; Pozo-Cruz, Borja Del; de Hoyo, Moisés; Carrasco, Luis

    2016-10-01

    Sañudo, B, Rueda, D, del Pozo-Cruz, B, de Hoyo, M, and Carrasco, L. Validation of a video analysis software package for quantifying movement velocity in resistance exercises. J Strength Cond Res 30(10): 2934-2941, 2016-The aim of this study was to establish the validity of a video analysis software package in measuring mean propulsive velocity (MPV) and the maximal velocity during bench press. Twenty-one healthy males (21 ± 1 year) with weight training experience were recruited, and the MPV and the maximal velocity of the concentric phase (Vmax) were compared with a linear position transducer system during a standard bench press exercise. Participants performed a 1 repetition maximum test using the supine bench press exercise. The testing procedures involved the simultaneous assessment of bench press propulsive velocity using 2 kinematic (linear position transducer and semi-automated tracking software) systems. High Pearson's correlation coefficients for MPV and Vmax between both devices (r = 0.473 to 0.993) were observed. The intraclass correlation coefficients for barbell velocity data and the kinematic data obtained from video analysis were high (>0.79). In addition, the low coefficients of variation indicate that measurements had low variability. Finally, Bland-Altman plots with the limits of agreement of the MPV and Vmax with different loads showed a negative trend, which indicated that the video analysis had higher values than the linear transducer. In conclusion, this study has demonstrated that the software used for the video analysis was an easy to use and cost-effective tool with a very high degree of concurrent validity. This software can be used to evaluate changes in velocity of training load in resistance training, which may be important for the prescription and monitoring of training programmes.
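    The Bland-Altman part of that validation reduces to a bias and 1.96-standard-deviation limits of agreement on the paired differences. A minimal Python sketch, with invented velocities rather than the study's data:

      # Bland-Altman agreement between two velocity-measurement devices.
      import numpy as np

      transducer = np.array([0.45, 0.62, 0.80, 0.55, 0.70])  # m/s
      video      = np.array([0.48, 0.66, 0.83, 0.57, 0.74])  # m/s

      diff = video - transducer
      bias = diff.mean()
      sd = diff.std(ddof=1)
      loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95 % limits of agreement
      print(f"bias={bias:.3f} m/s, "
            f"limits of agreement: {loa[0]:.3f} to {loa[1]:.3f} m/s")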

  9. Meta-Analyst: software for meta-analysis of binary, continuous and diagnostic data

    Directory of Open Access Journals (Sweden)

    Schmid Christopher H

    2009-12-01

    Background: Meta-analysis is increasingly used as a key source of evidence synthesis to inform clinical practice. The theory and statistical foundations of meta-analysis continually evolve, providing solutions to many new and challenging problems. In practice, most meta-analyses are performed in general statistical packages or dedicated meta-analysis programs. Results: Herein, we introduce Meta-Analyst, a novel, powerful, intuitive, and free meta-analysis program for the meta-analysis of a variety of problems. Meta-Analyst is implemented in C# atop the Microsoft .NET framework and features a graphical user interface. The software performs several meta-analysis and meta-regression models for binary and continuous outcomes, as well as analyses for diagnostic and prognostic test studies in the frequentist and Bayesian frameworks. Moreover, Meta-Analyst includes a flexible tool to edit and customize generated meta-analysis graphs (e.g., forest plots) and provides output in many formats (images, Adobe PDF, Microsoft Word-ready RTF). The software architecture employed allows for rapid changes to be made to either the Graphical User Interface (GUI) or to the analytic modules. We verified the numerical precision of Meta-Analyst by comparing its output with that from standard meta-analysis routines in Stata over a large database of 11,803 meta-analyses of binary outcome data and 6,881 meta-analyses of continuous outcome data from the Cochrane Library of Systematic Reviews. Results from analyses of diagnostic and prognostic test studies have been verified in a limited number of meta-analyses against MetaDisc and MetaTest. Bayesian statistical analyses use the OpenBUGS calculation engine (and are thus as accurate as the standalone OpenBUGS software). Conclusion: We have developed and validated a new program for conducting meta-analyses that combines the advantages of existing software for this task.

  10. The implementation of SOMO (SOlution MOdeller) in the UltraScan analytical ultracentrifugation data analysis suite: enhanced capabilities allow the reliable hydrodynamic modeling of virtually any kind of biomacromolecule.

    Science.gov (United States)

    Brookes, Emre; Demeler, Borries; Rosano, Camillo; Rocco, Mattia

    2010-02-01

    The interpretation of solution hydrodynamic data in terms of macromolecular structural parameters is not a straightforward task. Over the years, several approaches have been developed to cope with this problem, the most widely used being bead modeling in various flavors. We report here the implementation of the SOMO (SOlution MOdeller; Rai et al. in Structure 13:723-734, 2005) bead modeling suite within one of the most widely used analytical ultracentrifugation data analysis software packages, UltraScan (Demeler in Modern analytical ultracentrifugation: techniques and methods, Royal Society of Chemistry, UK, 2005). The US-SOMO version is now under complete graphical interface control, and has been freed from several constraints present in the original implementation. In the direct beads-per-atoms method, virtually any kind of residue as defined in the Protein Data Bank (e.g., proteins, nucleic acids, carbohydrates, prosthetic groups, detergents, etc.) can now be represented with beads whose number, size and position are all defined in user-editable tables. For large structures, a cubic grid method based on the original AtoB program (Byron in Biophys J 72:408-415, 1997) can be applied either directly on the atomic structure, or on a previously generated bead model. The hydrodynamic parameters are then computed in the rigid-body approximation. An extensive set of tests was conducted to further validate the method, and the results are presented here. Owing to its accuracy, speed, and versatility, US-SOMO should allow users to take full advantage of the potential of solution hydrodynamics as a complement to higher resolution techniques in biomacromolecular modeling.
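    To give a flavour of hydrodynamic computation on a bead model, the Python sketch below evaluates the textbook Kirkwood approximation for the translational diffusion coefficient of N equal beads. It is vastly simpler than the US-SOMO computation, and the bead geometry is invented.

      # Kirkwood approximation: Dt = kT/(6*pi*eta*a*N) * (1 + (a/N) * sum 1/r_ij).
      import numpy as np

      kB, T, eta = 1.380649e-23, 293.15, 1.002e-3  # SI units, water at 20 C

      def kirkwood_Dt(coords, a):
          """coords: (N, 3) bead centres in metres; a: bead radius in metres."""
          N = len(coords)
          rij = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
          inv = 1.0 / rij[~np.eye(N, dtype=bool)]  # 1/r over all pairs i != j
          return kB * T / (6 * np.pi * eta * a * N) * (1 + (a / N) * inv.sum())

      # Four hypothetical 1 nm beads on a line, 2.5 nm apart.
      beads = np.array([[i * 2.5e-9, 0.0, 0.0] for i in range(4)])
      print(f"Dt = {kirkwood_Dt(beads, 1.0e-9):.3e} m^2/s")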

  11. MIDAS: software for analysis and visualisation of interallelic disequilibrium between multiallelic markers

    Directory of Open Access Journals (Sweden)

    Day Ian NM

    2006-04-01

    Background: Various software tools are available for the display of pairwise linkage disequilibrium across multiple single nucleotide polymorphisms. The HapMap project also presents these graphics within their website. However, these approaches are limited in their use of data from multiallelic markers and provide limited information in a graphical form. Results: We have developed a software package (MIDAS – Multiallelic Interallelic Disequilibrium Analysis Software) for the estimation and graphical display of interallelic linkage disequilibrium. Linkage disequilibrium is analysed for each allelic combination (of one allele from each of two loci), between all pairwise combinations of any type of multiallelic loci in a contig or any set of many loci (including single nucleotide polymorphisms, microsatellites, minisatellites and haplotypes). Data are presented graphically in a novel and informative way, and can also be exported in tabular form for other analyses. This approach facilitates visualisation of patterns of linkage disequilibrium across genomic regions, analysis of the relationships between different alleles of multiallelic markers, and inferences about patterns of evolution and selection. Conclusion: MIDAS is a linkage disequilibrium analysis program with a comprehensive graphical user interface providing novel views of patterns of linkage disequilibrium between all types of multiallelic and biallelic markers. Availability: Available from http://www.genes.org.uk/software/midas and http://www.sgel.humgen.soton.ac.uk/midas
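    For a single allelic combination, interallelic disequilibrium reduces to D = pAB - pA*pB, normalised to D' by its theoretical bound. A minimal Python sketch with invented haplotype frequencies (MIDAS itself estimates these from genotype data):

      # Pairwise linkage disequilibrium D and (signed) D' for one allele pair.
      def d_prime(p_ab, p_a, p_b):
          d = p_ab - p_a * p_b
          if d >= 0:
              d_max = min(p_a * (1 - p_b), (1 - p_a) * p_b)
          else:
              d_max = min(p_a * p_b, (1 - p_a) * (1 - p_b))
          return d, (d / d_max if d_max > 0 else 0.0)

      d, dp = d_prime(p_ab=0.30, p_a=0.40, p_b=0.50)
      print(f"D={d:.3f}, D'={dp:.3f}")  # -> D=0.100, D'=0.500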

  12. Comparative analysis of methods for testing software of radio-electronic equipment

    Directory of Open Access Journals (Sweden)

    G. A. Mirskikh

    2011-03-01

    This paper analyses the concepts of quality and reliability of software products that are part of radio-electronic equipment. The basic testing methods used in the design of hardware-software systems to ensure quality and reliability are presented. We consider "black box" and "white box" testing, bottom-up and top-down integration testing, and various modifications of these methods. Criteria are given for selecting a testing method based on program structure and on the organizational and financial factors that affect the quality of the design process.

  13. Fuzzy system for risk analysis in software projects through the attributes of quality standards iso 25000

    Directory of Open Access Journals (Sweden)

    Chau Sen Shia

    2014-02-01

    With the growth in demand for products and services in the IT area, companies encounter difficulties in establishing a metric or measure of quality of services, to address measurably qualitative values in their planning. In this work, fuzzy logic, the SQuaRE standard (measurement of the quality of software products), the Likert scale, the GQM (Goal-Question-Metric) method, an indicator of software quality, and Boehm's project risk analysis model were used to assess the quality of services and support decision-making, according to demand and requests for software development. With the aim of improving the quality in the provision of services, the application is used to integrate the team and follow the life cycle of a project from its initial phase, and to assist in the comparison with the proposed schedule during requirements elicitation.

  14. Reliability Analysis of Component Software in Wireless Sensor Networks Based on Transformation of Testing Data

    Directory of Open Access Journals (Sweden)

    Chunyan Hou

    2009-08-01

    We develop an approach to component software reliability analysis that combines the benefits of both time-domain and structure-based approaches. This approach overcomes the deficiency of existing NHPP techniques, which fall short of addressing repair and internal system structure simultaneously. Our solution adopts a transformation of the testing data to cover both methods, and is expected to improve reliability prediction. This paradigm accommodates component-based software testing processes that do not meet the assumptions of NHPP models, and accounts for software structure by modeling the testing process. From the testing model, it builds a mapping from the testing profile to the operational profile, which enables transformation of the testing data into the reliability dataset required by NHPP models. Finally, an example is evaluated to validate the approach and show its effectiveness.
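    NHPP reliability models fitted to such data are typically mean-value functions like Goel-Okumoto, m(t) = a(1 - e^(-bt)). The Python sketch below fits it to invented cumulative failure counts; it only illustrates the model class the paper builds on, not the paper's transformation method.

      # Fit the Goel-Okumoto NHPP mean value function to cumulative failures.
      import numpy as np
      from scipy.optimize import curve_fit

      def goel_okumoto(t, a, b):
          return a * (1.0 - np.exp(-b * t))

      t = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)        # test weeks
      cum_failures = np.array([5, 9, 12, 14, 16, 17, 18, 18.5])  # cumulative

      (a, b), _ = curve_fit(goel_okumoto, t, cum_failures, p0=(20.0, 0.3))
      print(f"expected total faults a={a:.1f}, detection rate b={b:.2f}")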

  15. Software package for the design and analysis of DNA origami structures

    DEFF Research Database (Denmark)

    Andersen, Ebbe Sloth; Nielsen, Morten Muhlig; Dong, Mingdong

    A software package was developed for the semi-automated design of DNA origamis and further data analysis of Atomic Force Microscopy (AFM) images. As an example, we design the shape of a bottlenose dolphin and analyze it by means of high resolution AFM imaging. A high yield of DNA dolphins was observed on the mica surface, with a fraction of the dolphin nanostructures showing extensive tail flexibility of approximately 90 degrees. The Java editor and tools are free software distributed under the GNU license. The open architecture of the editor makes it easy for the scientific community to contribute new tools and functionalities. Documentation, tutorials and software will be made available online.

  16. Detecting Optic Atrophy in Multiple Sclerosis Patients Using New Colorimetric Analysis Software: From Idea to Application.

    Science.gov (United States)

    Bambo, Maria Pilar; Garcia-Martin, Elena; Perez-Olivan, Susana; Larrosa-Povés, José Manuel; Polo-Llorens, Vicente; Gonzalez-De la Rosa, Manuel

    2016-01-01

    Neuro-ophthalmologists typically observe a temporal pallor of the optic disc in patients with multiple sclerosis. Here, we describe the emergence of an idea to quantify these optic disc color changes in multiple sclerosis patients. We recruited 12 multiple sclerosis patients with previous optic neuritis attack and obtained photographs of their optic discs. The Laguna ONhE, a new colorimetric software using hemoglobin as the reference pigment in the papilla, was used for the analysis. The papilla of these multiple sclerosis patients showed greater pallor, especially in the temporal sector. The software detected the pallor and assigned hemoglobin percentages below normal reference values. Measurements of optic disc hemoglobin levels obtained with the Laguna ONhE software program had good ability to detect optic atrophy and, consequently, axonal loss in multiple sclerosis patients. This new technology is easy to implement in routine clinical practice.

  17. The Measurement and Analysis Risk Factors Dependence Correlation in Software Project

    Science.gov (United States)

    Jianjie, Ding; Hong, Hou; Kegang, Hao; Xiaoqun, Guo

    The complexity of the software process gives rise to various fuzzy correlations among process management risk factors, such as dependence correlations among software risk factors. It is difficult to analyze risk data directly with mathematical tools because risk data are uncertain and rough. Based on rough set theory and the data in a risk management library, a risk factors dependence correlation analysis system (RFDCAS) is established, and a dependence coefficient, with a calculation formula based on equivalence classes, is proposed. The RFDCAS unveils the dependence correlations among risk factors, contributes to risk management, and can help discover problems in software process improvement management.
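    In rough set theory, the dependence of a decision attribute on condition attributes is commonly measured as gamma = |POS_C(D)| / |U|, the fraction of records whose condition equivalence class determines a single decision. The Python sketch below computes this on hypothetical risk records; the abstract does not give the RFDCAS formula itself, so this is only the standard rough-set measure.

      # Rough-set dependence degree of a decision attribute on conditions.
      from collections import defaultdict

      def dependence(records, cond_keys, dec_key):
          classes = defaultdict(list)
          for r in records:
              classes[tuple(r[k] for k in cond_keys)].append(r[dec_key])
          # Positive region: equivalence classes with a unique decision value.
          positive = sum(len(v) for v in classes.values() if len(set(v)) == 1)
          return positive / len(records)

      risks = [
          {"schedule": "tight", "staff": "new", "overrun": "yes"},
          {"schedule": "tight", "staff": "new", "overrun": "yes"},
          {"schedule": "loose", "staff": "expert", "overrun": "no"},
          {"schedule": "loose", "staff": "new", "overrun": "yes"},
          {"schedule": "loose", "staff": "new", "overrun": "no"},
      ]
      print(dependence(risks, ("schedule", "staff"), "overrun"))  # -> 0.6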

  18. Software Analysis of New Space Gravity Data for Geophysics and Climate Research

    Science.gov (United States)

    Deese, Rupert; Ivins, Erik R.; Fielding, Eric J.

    2012-01-01

    Both the Gravity Recovery and Climate Experiment (GRACE) and Gravity field and steady-state Ocean Circulation Explorer (GOCE) satellites are returning rich data for the study of the solid earth, the oceans, and the climate. Current software analysis tools do not provide researchers with the ease and flexibility required to make full use of this data. We evaluate the capabilities and shortcomings of existing software tools including Mathematica, the GOCE User Toolbox, the ICGEM's (International Center for Global Earth Models) web server, and Tesseroids. Using existing tools as necessary, we design and implement software with the capability to produce gridded data and publication-quality renderings from raw gravity data. The straightforward software interface marks an improvement over previously existing tools and makes new space gravity data more useful to researchers. Using the software we calculate Bouguer anomalies of the gravity tensor's vertical component in the Gulf of Mexico, Antarctica, and the 2010 Maule earthquake region. These maps identify promising areas of future research.
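    As a reminder of the kind of reduction involved, the simple Bouguer slab correction subtracts the attraction of a rock slab of thickness h: delta_g = 2*pi*G*rho*h. A minimal Python sketch with a standard crustal density; this is not the paper's processing chain.

      # Simple Bouguer slab correction for a station at elevation h.
      import math

      G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
      rho = 2670.0    # standard crustal density, kg/m^3
      h = 1500.0      # station elevation above the datum, m

      delta_g = 2 * math.pi * G * rho * h  # m/s^2
      print(f"Bouguer slab correction: {delta_g * 1e5:.1f} mGal")  # ~167.9 mGal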

  19. The cleanroom case study in the Software Engineering Laboratory: Project description and early analysis

    Science.gov (United States)

    Green, Scott; Kouchakdjian, Ara; Basili, Victor; Weidow, David

    1990-01-01

    This case study analyzes the application of the cleanroom software development methodology to the development of production software at the NASA/Goddard Space Flight Center. The cleanroom methodology emphasizes human discipline in program verification to produce reliable software products that are right the first time. Preliminary analysis of the cleanroom case study shows that the method can be applied successfully in the FDD environment and may increase staff productivity and product quality. Compared to typical Software Engineering Laboratory (SEL) activities, there is evidence of lower failure rates, a more complete and consistent set of inline code documentation, a different distribution of phase effort activity, and a different growth profile in terms of lines of code developed. The major goals of the study were to: (1) assess the process used in the SEL cleanroom model with respect to team structure, team activities, and effort distribution; (2) analyze the products of the SEL cleanroom model and determine the impact on measures of interest, including reliability, productivity, overall life-cycle cost, and software quality; and (3) analyze the residual products in the application of the SEL cleanroom model, such as fault distribution, error characteristics, system growth, and computer usage.

  20. STATIC CODE ANALYSIS FOR SOFTWARE QUALITY IMPROVEMENT: A CASE STUDY IN BCI FRAMEWORK DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Indar Sugiarto

    2008-01-01

    This paper shows how a systematic approach to software testing using the static code analysis method can be used to improve the software quality of a BCI framework. The method is best performed during the development phase of framework programs. In the proposed approach, we evaluate several software metrics which are based on the principles of object-oriented design. Since such a method depends on the underlying programming language, we describe the method in terms of C++, with the Qt platform also currently being used. One of the most important metrics is the so-called software complexity. Applying the software complexity calculation using both the McCabe and Halstead methods to the BCI framework, which consists of two important types of BCI, namely SSVEP and P300, we found that there are two classes in the framework which are very complex and prone to violation of the cohesion principle in OOP. The other metrics fit the criteria of the proposed framework aspects, such as: MPC is less than 20; average complexity is around a value of 5; and the maximum depth is below 10 blocks. Such variables are considered very important when further developing the BCI framework in the future.
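    McCabe's cyclomatic complexity is essentially one plus the number of branch points in a routine. The Python sketch below counts it for Python source via the ast module; the framework above is C++/Qt, so this only illustrates the metric, not the tooling the authors used.

      # McCabe-style cyclomatic complexity: 1 + number of branch points.
      import ast

      BRANCHES = (ast.If, ast.For, ast.While, ast.Try,
                  ast.BoolOp, ast.IfExp, ast.ExceptHandler)

      def cyclomatic_complexity(source):
          tree = ast.parse(source)
          return 1 + sum(isinstance(node, BRANCHES) for node in ast.walk(tree))

      code = """
      def classify(x):
          if x < 0:
              return "neg"
          for i in range(x):
              if i % 2 == 0 and i > 2:
                  print(i)
          return "done"
      """
      import textwrap  # dedent so the embedded snippet parses cleanly
      print(cyclomatic_complexity(textwrap.dedent(code)))  # -> 5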

  1. COMPARATIVE ANALYSIS OF COMPUTER SOFTWARE AND BRAILLE LITERACY TO EDUCATE STUDENTS HAVING VISUAL IMPAIRMENT

    Directory of Open Access Journals (Sweden)

    Ismat Bano

    2011-10-01

    This research investigates a comparative analysis of computer software and Braille literacy for educating students with visual impairment. The main objective of this research is to compare the feasibility and usage of Braille literacy and computer software for educating children with visual impairment. The main objectives of the study were to identify the importance of Braille and computer literacy as perceived by male and female students with visual impairment, to identify the importance of Braille and computer literacy in different classes of students with visual impairment, and to identify differences in the importance of Braille and computer literacy across different schools of students with visual impairment. Five special education institutions were selected where students with visual impairment were studying. A convenience sample of 100 students was taken from these schools. A three-point rating scale was used as the research instrument. The researchers personally collected data from the respondents. Data were analyzed through SPSS. Major findings showed that students were more interested in the Braille system than in computer software. The Braille system and required materials were present in all the schools, while computer teachers with the required experience were not available in these institutions. Teachers were found to be expert in Braille literacy as compared to computer software. It was recommended that proper awareness of the most recent technologies is necessary for teachers in special education institutions. Students as well as teachers should be provided opportunities for hands-on practice to create interest in the use of computer software in special education.

  2. Analysis and Reflections on Software Protection

    Institute of Scientific and Technical Information of China (English)

    袁淑丹; 黎成; 任子亭

    2014-01-01

    After analysing and comparing previous software encryption methods, this paper arrives at a protection scheme that combines hardware and software, and proposes software encryption protection based on the hard-disk serial number. Hardware and software encryption techniques are used together: on the hardware side, comparative analysis shows that the hard-disk serial number should serve as the basis for encryption. Because a computer's hard-disk serial number is unique, a one-code-one-machine mechanism can be better realised. The software encryption technique is further improved by combining a symmetric encryption algorithm with an asymmetric encryption algorithm, further increasing the strength of the software's protection.
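    A minimal Python sketch of the idea: derive a machine-specific symmetric key from the hard-disk serial number, so that encrypted data only unlocks on that machine. The serial below is a placeholder (real code would query the disk), the paper's asymmetric layer is omitted, and the third-party 'cryptography' package is assumed.

      # One-code-one-machine: bind a symmetric key to a disk serial number.
      import base64
      import hashlib
      from cryptography.fernet import Fernet

      def key_from_serial(serial: str) -> bytes:
          digest = hashlib.sha256(serial.encode("utf-8")).digest()
          return base64.urlsafe_b64encode(digest)  # Fernet expects a base64 key

      serial = "WD-WXA1E3345678"  # hypothetical hard-disk serial number
      cipher = Fernet(key_from_serial(serial))
      token = cipher.encrypt(b"licence: valid until 2025-01-01")
      print(cipher.decrypt(token))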

  3. Software for analysis of chemical mixtures--composition, occurrence, distribution, and possible toxicity

    Science.gov (United States)

    Scott, Jonathon C.; Skach, Kenneth A.; Toccalino, Patricia L.

    2013-01-01

    The composition, occurrence, distribution, and possible toxicity of chemical mixtures in the environment are research concerns of the U.S. Geological Survey and others. The presence of specific chemical mixtures may serve as indicators of natural phenomena or human-caused events. Chemical mixtures may also have ecological, industrial, geochemical, or toxicological effects. Chemical-mixture occurrences vary by analyte composition and concentration. Four related computer programs have been developed by the National Water-Quality Assessment Program of the U.S. Geological Survey for research of chemical-mixture compositions, occurrences, distributions, and possible toxicities. The compositions and occurrences are identified for the user-supplied data, and therefore the resultant counts are constrained by the user’s choices for the selection of chemicals, reporting limits for the analytical methods, spatial coverage, and time span for the data supplied. The distribution of chemical mixtures may be spatial, temporal, and (or) related to some other variable, such as chemical usage. Possible toxicities optionally are estimated from user-supplied benchmark data. The software for the analysis of chemical mixtures described in this report is designed to work with chemical-analysis data files retrieved from the U.S. Geological Survey National Water Information System but can also be used with appropriately formatted data from other sources. Installation and usage of the mixture software are documented. This mixture software was designed to function with minimal changes on a variety of computer-operating systems. To obtain the software described herein and other U.S. Geological Survey software, visit http://water.usgs.gov/software/.

  4. The review of the modeling methods and numerical analysis software for nanotechnology in material science

    Directory of Open Access Journals (Sweden)

    SMIRNOV Vladimir Alexeevich

    2014-10-01

    Due to the high demand for building materials with a universal set of properties which extend their application area, research efforts are focusing on nanotechnology in material science. A rational combination of theoretical studies, mathematical modeling and simulation can favour reduced resource and time consumption when nanomodified materials are being developed. The development of a composite material is based on the principles of system analysis, which provides for the necessity of criteria determination and further classification of modeling methods. In this work, the criteria of spatial scale, dominant type of interaction and heterogeneity are used for such classification. The presented classification became a framework for the analysis of methods and software which can be applied to the development of building materials. For each of the selected spatial levels, from the atomistic level to the macrostructural level of a constructional coarse-grained composite, existing theories, modeling algorithms and tools have been considered. At the level of the macrostructure, which is formed under the influence of gravity and exterior forces, one can apply probabilistic and geometrical methods to study the obtained structure. The existing models are suitable for packing density analysis and the solution of percolation problems at the macroscopic level, but there are still no software tools which could be applied in nanotechnology to carry out systematic investigations. At the microstructure level it is possible to use the particle method along with probabilistic and statistical methods to explore structure formation, but available software tools are only partially suitable for numerical analysis of microstructure models. Therefore, modeling of the microstructure is rather complicated; the model has to include a potential of pairwise interaction. After the model has been constructed and the parameters of the pairwise potential have been determined, many software packages for the solution of ordinary

  5. GEMBASSY: an EMBOSS associated software package for comprehensive genome analyses

    OpenAIRE

    Itaya, Hidetoshi; Oshita, Kazuki; Arakawa, Kazuharu; Tomita, Masaru

    2013-01-01

    The popular European Molecular Biology Open Software Suite (EMBOSS) currently contains over 400 tools used in various areas of bioinformatics research, equipped with sophisticated development frameworks for interoperability and tool discoverability as well as rich documentation and various user interfaces. In order to further strengthen EMBOSS in the field of genomics, we here present a novel EMBOSS associated software (EMBASSY) package named GEMBASSY, which adds more than 50 analysis tools from t...

  6. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens

    2004-01-01

    This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended for use as part of an industrial systems design cycle. Structural analysis is a graph-based technique in which principal relations between variables express the structure of the system as nodes and edges of a graph. SaTool analyses the structure graph to provide knowledge about fundamental properties of the system in normal and faulty conditions. Salient features of SaTool include rapid analysis of the possibility to diagnose faults, and of the ability to make autonomous recovery should faults occur.

  7. Internet-Based Software Tools for Analysis and Processing of LIDAR Point Cloud Data via the OpenTopography Portal

    Science.gov (United States)

    Nandigam, V.; Crosby, C. J.; Baru, C.; Arrowsmith, R.

    2009-12-01

    LIDAR is an excellent example of the new generation of powerful remote sensing data now available to Earth science researchers. Capable of producing digital elevation models (DEMs) more than an order of magnitude higher resolution than those currently available, LIDAR data allows earth scientists to study the processes that contribute to landscape evolution at resolutions not previously possible, yet essential for their appropriate representation. Along with these high-resolution datasets comes an increase in the volume and complexity of data that the user must efficiently manage and process in order for it to be scientifically useful. Although there are expensive commercial LIDAR software applications available, processing and analysis of these datasets are typically computationally inefficient on the conventional hardware and software that is currently available to most of the Earth science community. We have designed and implemented an Internet-based system, the OpenTopography Portal, that provides integrated access to high-resolution LIDAR data as well as web-based tools for processing of these datasets. By using remote data storage and high performance compute resources, the OpenTopography Portal attempts to simplify data access and standard LIDAR processing tasks for the Earth Science community. The OpenTopography Portal allows users to access massive amounts of raw point cloud LIDAR data as well as a suite of DEM generation tools to enable users to generate custom digital elevation models to best fit their science applications. The Cyberinfrastructure software tools for processing the data are freely available via the portal and conveniently integrated with the data selection in a single user-friendly interface. The ability to run these tools on powerful Cyberinfrastructure resources instead of their own labs provides a huge advantage in terms of performance and compute power. The system also encourages users to explore data processing methods and the
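    One core processing task the portal exposes is generating a custom DEM from point-cloud returns. The Python sketch below shows the simplest gridding approach, binning points and averaging elevations per cell, on random demo data; OpenTopography's actual DEM-generation algorithms are more capable.

      # Grid a point cloud into a DEM by averaging elevations per cell.
      import numpy as np

      rng = np.random.default_rng(0)
      pts = rng.uniform([0, 0, 100], [100, 100, 120], size=(10000, 3))  # x, y, z

      cell = 10.0                          # grid resolution in metres
      nx = ny = int(100 / cell)
      ix = np.clip((pts[:, 0] / cell).astype(int), 0, nx - 1)
      iy = np.clip((pts[:, 1] / cell).astype(int), 0, ny - 1)

      dem_sum = np.zeros((ny, nx))
      dem_cnt = np.zeros((ny, nx))
      np.add.at(dem_sum, (iy, ix), pts[:, 2])  # accumulate elevations per cell
      np.add.at(dem_cnt, (iy, ix), 1)
      dem = np.where(dem_cnt > 0, dem_sum / np.maximum(dem_cnt, 1), np.nan)
      print(dem.shape, float(np.nanmean(dem)))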

  8. Development and Use of Mathematical Models and Software Frameworks for Integrated Analysis of Agricultural Systems and Associated Water Use Impacts

    Directory of Open Access Journals (Sweden)

    J.C. Chrispell

    2016-05-01

    The development of appropriate water management strategies requires, in part, a methodology for quantifying and evaluating the impact of water policy decisions on regional stakeholders. In this work, we describe the framework we are developing to enhance the body of resources available to policy makers, farmers, and other community members in their efforts to understand, quantify, and assess the often competing objectives water consumers have with respect to usage. The foundation for the framework is the construction of a simulation-based optimization software tool using two existing software packages. In particular, we couple a robust optimization software suite (DAKOTA) with the USGS MF-OWHM water management simulation tool to provide a flexible software environment that will enable the evaluation of one or multiple (possibly competing) user-defined (or stakeholder) objectives. We introduce the individual software components and outline the communication strategy we defined for the coupled development. We present numerical results for case studies related to crop portfolio management with several defined objectives. The objectives are not optimally satisfied for any single user class, demonstrating the capability of the software tool to aid in the evaluation of a variety of competing interests.
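    The coupling pattern described above is an optimiser repeatedly invoking a simulator and scoring the result. In the Python sketch below, scipy stands in for DAKOTA and a toy two-crop function stands in for MF-OWHM; every number is invented.

      # Simulation-based optimisation loop: optimiser -> simulator -> objective.
      from scipy.optimize import minimize

      def run_simulator(acres):
          """Mock water-management simulator: (profit, water use) per portfolio."""
          profit = 500.0 * acres[0] + 300.0 * acres[1]
          water = 4.0 * acres[0] + 1.5 * acres[1]
          return profit, water

      def objective(acres):
          profit, water = run_simulator(acres)
          return -profit + 50.0 * water  # weighted trade-off of two objectives

      res = minimize(objective, x0=[10.0, 10.0],
                     bounds=[(0.0, 100.0), (0.0, 100.0)], method="L-BFGS-B")
      print("acreage per crop:", res.x)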

  9. Development and use of mathematical models and software frameworks for integrated analysis of agricultural systems and associated water use impacts

    Science.gov (United States)

    Fowler, K. R.; Jenkins, E.W.; Parno, M.; Chrispell, J.C.; Colón, A. I.; Hanson, Randall T.

    2016-01-01

    The development of appropriate water management strategies requires, in part, a methodology for quantifying and evaluating the impact of water policy decisions on regional stakeholders. In this work, we describe the framework we are developing to enhance the body of resources available to policy makers, farmers, and other community members in their efforts to understand, quantify, and assess the often competing objectives water consumers have with respect to usage. The foundation for the framework is the construction of a simulation-based optimization software tool using two existing software packages. In particular, we couple a robust optimization software suite (DAKOTA) with the USGS MF-OWHM water management simulation tool to provide a flexible software environment that will enable the evaluation of one or multiple (possibly competing) user-defined (or stakeholder) objectives. We introduce the individual software components and outline the communication strategy we defined for the coupled development. We present numerical results for case studies related to crop portfolio management with several defined objectives. The objectives are not optimally satisfied for any single user class, demonstrating the capability of the software tool to aid in the evaluation of a variety of competing interests.

  10. An Effective Strategy to Build Up a Balanced Test Suite for Spectrum-Based Fault Localization

    Directory of Open Access Journals (Sweden)

    Ning Li

    2016-01-01

    During past decades, many automated software fault diagnosis techniques, including Spectrum-Based Fault Localization (SBFL), have been proposed to improve the efficiency of software debugging activity. In the field of SBFL, suspiciousness calculation is closely related to the number of failed and passed test cases. Studies have shown that the ratio of failed to passed test cases has a more significant impact on the accuracy of SBFL than the total number of test cases, and a balanced test suite is more beneficial to improving the accuracy of SBFL. Based on theoretical analysis, we propose a PNF (Passed test cases, Not execute Faulty statement) strategy to reduce the test suite and build a more balanced one for SBFL, which can be used in regression testing. We evaluated the strategy in experiments using the Siemens and Space programs. Experiments indicated that our PNF strategy can be used to construct a new test suite effectively. Compared with the original test suite, the new one has a smaller size (on average, 90% of test cases were removed in the experiments) and a more balanced ratio of failed to passed test cases, while it has the same statement coverage and fault localization accuracy.
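    Suspiciousness formulas make the role of the failed/passed ratio concrete. The Python sketch below scores statements with the classic Tarantula formula on invented coverage counts; the paper's experiments use SBFL formulas of this family, though not necessarily this one.

      # Tarantula suspiciousness from per-statement coverage counts.
      def tarantula(ef, ep, total_failed, total_passed):
          """ef/ep: number of failed/passed tests executing the statement."""
          fail_rate = ef / total_failed if total_failed else 0.0
          pass_rate = ep / total_passed if total_passed else 0.0
          denom = fail_rate + pass_rate
          return fail_rate / denom if denom else 0.0

      # Three statements in a suite with 2 failed and 8 passed tests.
      coverage = {"s1": (2, 1), "s2": (1, 6), "s3": (0, 8)}
      for stmt, (ef, ep) in coverage.items():
          print(stmt, round(tarantula(ef, ep, 2, 8), 3))
      # s1 ranks highest: covered by all failed but few passed tests.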

  11. Security Risk Minimization for Desktop and Mobile Software Systems. An In-Depth Analysis

    Directory of Open Access Journals (Sweden)

    Florina Camelia PUICAN

    2014-01-01

    In an extremely rapidly growing industry such as information technology today, continuous and efficient workflows need to be established within any integrated enterprise or consumer software system. Considering the current trend of data and information migrating to mobile devices, which have become more than just simple gadgets, the security threats and vulnerabilities of software products have created a new playground for attackers, especially when a system offers cross-platform (desktop and mobile) functionality and applicability. In this context, the paper proposes an in-depth analysis of some of the weaknesses software systems present, providing also a set of solutions for minimizing and mitigating the risks of any solution, be it mobile or desktop. Furthermore, even though consumer and enterprise systems have fundamentally different structures and architectures (due to the different needs of the end user), data loss or information leakage may and will affect any type of machine if proper securing of the systems is not taken into consideration; therefore, risk minimization through an in-depth analysis of any integrated software system becomes mandatory and needs extensive care.

  12. PyElph - a software tool for gel images analysis and phylogenetics

    Directory of Open Access Journals (Sweden)

    Pavel Ana Brânduşa

    2012-01-01

    Background: This paper presents PyElph, a software tool which automatically extracts data from gel images, computes the molecular weights of the analyzed molecules or fragments, compares DNA patterns which result from experiments with molecular genetic markers and also generates phylogenetic trees computed by five clustering methods, using the information extracted from the analyzed gel image. The software can be successfully used for population genetics, phylogenetics, taxonomic studies and other applications which require gel image analysis. Researchers and students working in molecular biology and genetics would benefit greatly from the proposed software because it is free, open source, easy to use, has a friendly Graphical User Interface and does not depend on specific image acquisition devices like other commercial programs with similar functionalities do. Results: The PyElph software tool is entirely implemented in Python, which is a very popular programming language in the bioinformatics community. It provides a very friendly Graphical User Interface which was designed in six steps that gradually lead to the results. The user is guided through the following steps: image loading and preparation, lane detection, band detection, molecular weight computation based on a molecular weight marker, band matching and, finally, the computation and visualization of phylogenetic trees. A strong point of the software is the visualization component for the processed data. The Graphical User Interface provides operations for image manipulation and highlights lanes, bands and band matching in the analyzed gel image. All the data and images generated in each step can be saved. The software has been tested on several DNA patterns obtained from experiments with different genetic markers. Examples of genetic markers which can be analyzed using PyElph are RFLP (Restriction Fragment Length Polymorphism), AFLP (Amplified Fragment Length Polymorphism), RAPD
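    The molecular-weight step described above is conventionally a log-linear calibration: fit log10(size) of the marker bands against migration distance, then interpolate the sample bands. A minimal Python sketch with invented distances; PyElph's own implementation may differ.

      # Estimate fragment sizes from a molecular weight marker lane.
      import numpy as np

      marker_dist = np.array([10.0, 18.0, 27.0, 38.0, 50.0])  # migration, mm
      marker_bp   = np.array([3000, 1500, 700, 300, 100])     # known sizes, bp

      slope, intercept = np.polyfit(marker_dist, np.log10(marker_bp), 1)

      sample_dist = np.array([22.0, 41.0])                    # sample bands, mm
      sample_bp = 10 ** (slope * sample_dist + intercept)
      print(np.round(sample_bp))  # estimated fragment sizes in bp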

  13. HPC Benchmark Suite NMx Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Intelligent Automation Inc., (IAI) and University of Central Florida (UCF) propose to develop a comprehensive numerical test suite for benchmarking current and...

  14. Advanced EVA Suit Camera System Development Project

    Science.gov (United States)

    Mock, Kyla

    2016-01-01

    The National Aeronautics and Space Administration (NASA) at the Johnson Space Center (JSC) is developing a new extra-vehicular activity (EVA) suit known as the Advanced EVA Z2 Suit. All of the improvements to the EVA Suit provide the opportunity to update the technology of the video imagery. My summer internship project involved improving the video streaming capabilities of the cameras that will be used on the Z2 Suit for data acquisition. To accomplish this, I familiarized myself with the architecture of the camera that is currently being tested to be able to make improvements on the design. Because there is a lot of benefit to saving space, power, and weight on the EVA suit, my job was to use Altium Design to start designing a much smaller and simplified interface board for the camera's microprocessor and external components. This involved checking datasheets of various components and checking signal connections to ensure that this architecture could be used for both the Z2 suit and potentially other future projects. The Orion spacecraft is a specific project that may benefit from this condensed camera interface design. The camera's physical placement on the suit also needed to be determined and tested so that image resolution can be maximized. Many of the options of the camera placement may be tested along with other future suit testing. There are multiple teams that work on different parts of the suit, so the camera's placement could directly affect their research or design. For this reason, a big part of my project was initiating contact with other branches and setting up multiple meetings to learn more about the pros and cons of the potential camera placements we are analyzing. Collaboration with the multiple teams working on the Advanced EVA Z2 Suit is absolutely necessary and these comparisons will be used as further progress is made for the overall suit design. This prototype will not be finished in time for the scheduled Z2 Suit testing, so my time was

  15. Development of a User Interface for a Regression Analysis Software Tool

    Science.gov (United States)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  16. Analysis of the Articulated Total Body (ATB) and Mathematical Dynamics Model (MADYMO) Software Suites for Modeling Anthropomorphic Test Devices (ATDs) in Blast Environments

    Science.gov (United States)

    2013-05-01

    [Comparison table, partially recoverable – Restraints: Kelvin, Maxwell, and Cardan restraints (MADYMO); Surfaces: planes and ellipsoids (ATB) vs. planes, ellipsoids, hyper-ellipsoids, and FE facets (MADYMO); Contacts: functions based on...] Kelvin restraints are forces calculated from a system with a spring in parallel with a damper. In MADYMO, there are more options for restraint systems, including the following: Kelvin, Maxwell, and Cardan restraints. Maxwell restraints are forces calculated from a system with a spring in series with a damper. Cardan restraints apply opposite torques on the connecting
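    To make the two idealisations concrete: a Kelvin element gives F = k*x + c*v directly, while a Maxwell element obeys dF/dt = k*(v - F/c). The Python sketch below integrates both for a sinusoidal stretch with invented parameters; it has no connection to the ATB or MADYMO code.

      # Kelvin (parallel) vs. Maxwell (series) spring-damper restraint forces.
      import math

      k, c = 1000.0, 50.0        # stiffness (N/m), damping coefficient (N*s/m)
      dt, F_maxwell = 1e-4, 0.0

      for step in range(10000):  # one second of a 1 Hz, 1 cm stretch
          t = step * dt
          x = 0.01 * math.sin(2 * math.pi * t)                # displacement, m
          v = 0.01 * 2 * math.pi * math.cos(2 * math.pi * t)  # velocity, m/s
          F_kelvin = k * x + c * v                    # algebraic in x and v
          F_maxwell += dt * k * (v - F_maxwell / c)   # explicit Euler step

      print(f"t=1 s: Kelvin {F_kelvin:.2f} N, Maxwell {F_maxwell:.2f} N")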

  17. Development and validation of a video analysis software for marine benthic applications

    Science.gov (United States)

    Romero-Ramirez, A.; Grémare, A.; Bernard, G.; Pascal, L.; Maire, O.; Duchêne, J. C.

    2016-10-01

    Our aim in the EU-funded JERICO project was to develop a flexible and scalable imaging platform that could be used in the widest possible set of ecological situations. Depending on research objectives, both image acquisition and analysis procedures may indeed differ. Up to now, attempts at automating image analysis procedures have consisted of the development of pieces of software specifically designed for a given objective. This led to the conception of a new software package: AVIExplore. Its general architecture and its three constitutive modules, AVIExplore - Mobile, AVIExplore - Fixed and AVIExplore - ScriptEdit, are presented. AVIExplore provides a unique environment for video analysis. Its main features include: (1) image selection tools allowing for the division of videos into homogeneous sections, (2) automatic extraction of targeted information, (3) solutions for long-term time series as well as large spatial scale image acquisition, (4) real-time acquisition and, in some cases, real-time analysis, and (5) a large range of customized image-analysis possibilities through a script editor. The flexibility of use of AVIExplore is illustrated and validated by three case studies: (1) coral identification and mapping, (2) identification and quantification of different types of behaviors in a mud shrimp, and (3) quantification of filtering activity in a passive suspension-feeder. The accuracy of the software was measured by comparison with visual assessment: 90.2%, 82.7%, and 98.3% for the three case studies, respectively. Some of the advantages and current limitations of the software, as well as some of its foreseen advancements, are then briefly discussed.

  18. Assessment of Suited Reach Envelope in an Underwater Environment

    Science.gov (United States)

    Kim, Han; Benson, Elizabeth; Bernal, Yaritza; Jarvis, Sarah; Meginnis, Ian; Rajulu, Sudhakar

    2017-01-01

    Predicting the performance of a crewmember in an extravehicular activity (EVA) space suit presents unique challenges. The kinematic patterns of suited motions are difficult to reproduce in gravity. Additionally, 3-D suited kinematics have been practically and technically difficult to quantify in an underwater environment, in which crewmembers are commonly trained and assessed for performance. The goal of this study is to develop a hardware and software system to predictively evaluate the kinematic mobility of suited crewmembers by measuring the 3-D reach envelope of the suit in an underwater environment. This work is ultimately aimed at developing quantitative metrics to compare the mobility of the existing Extravehicular Mobility Unit (EMU) to newly developed space suits, such as the Z-2. The EMU has been extensively used at NASA since 1981 for EVA outside the Space Shuttle and International Space Station. The Z-2 suit is NASA's newest prototype space suit. The suit comprises new upper torso and lower torso architectures, which were designed to improve test subject mobility.

  19. Development and evaluation of a web-based software for crash data collection, processing and analysis.

    Science.gov (United States)

    Montella, Alfonso; Chiaradonna, Salvatore; Criscuolo, Giorgio; De Martino, Salvatore

    2017-02-05

    The first step in the development of an effective safety management system is to create reliable crash databases, since the quality of decision making in road safety depends on the quality of the data on which decisions are based. Improving crash data is a worldwide priority, as highlighted in the Global Plan for the Decade of Action for Road Safety adopted by the United Nations, which recognizes that the overall goal of the plan will be attained by improving the quality of data collection at the national, regional and global levels. Crash databases provide the basic information for effective highway safety efforts at any level of government, but lack of uniformity among countries and among the different jurisdictions in the same country is observed. Several existing databases show significant drawbacks which hinder their effective use for safety analysis and improvement. Furthermore, modern technologies offer great potential for significant improvements of existing methods and procedures for crash data collection, processing and analysis. To address these issues, in this paper we present the development and evaluation of a web-based, platform-independent software for crash data collection, processing and analysis. The software is designed for mobile and desktop electronic devices and enables a guided and automated drafting of the crash report, assisting police officers both on-site and in the office. The software development was based both on a detailed critical review of existing Australasian, EU, and U.S. crash databases and software as well as on continuous consultation with the stakeholders. The evaluation was carried out comparing the completeness, timeliness, and accuracy of crash data before and after the use of the software in the city of Vico Equense, in the south of Italy, showing significant advantages. The amount of collected information increased from 82 variables to 268 variables, i.e., a 227% increase. The time saving was more than one hour per crash, i

  20. A comparison of conventional and computer-assisted semen analysis (CRISMAS software) using samples from 166 young Danish men

    DEFF Research Database (Denmark)

    Vested, Anne; Ramlau-Hansen, Cecilia; Bonde, Jens P;

    2011-01-01

    The aim of the present study was to compare assessments of sperm concentration and sperm motility analysed by conventional semen analysis with those obtained by computer-assisted semen analysis (CASA) (Copenhagen Rigshospitalet Image House Sperm Motility Analysis System (CRISMAS) 4.6 software) using semen samples from 166 young Danish men. The CRISMAS software identifies sperm concentration and classifies spermatozoa into three motility categories. To enable comparison of the two methods, the four motility stages obtained by conventional semen analysis were, based on their velocity, … and motility analysis. This needs to be accounted for in clinics using this software and in studies of determinants of these semen characteristics.

  1. TScratch: a novel and simple software tool for automated analysis of monolayer wound healing assays.

    Science.gov (United States)

    Gebäck, Tobias; Schulz, Martin Michael Peter; Koumoutsakos, Petros; Detmar, Michael

    2009-04-01

    Cell migration plays a major role in development, physiology, and disease, and is frequently evaluated in vitro by the monolayer wound healing assay. The assay analysis, however, is a time-consuming task that is often performed manually. In order to accelerate this analysis, we have developed TScratch, a new, freely available image analysis technique and associated software tool that uses the fast discrete curvelet transform to automate the measurement of the area occupied by cells in the images. This tool helps to significantly reduce the time needed for analysis and enables objective and reproducible quantification of assays. The software also offers a graphical user interface which allows easy inspection of analysis results and, if desired, manual modification of analysis parameters. The automated analysis was validated by comparing its results with manual-analysis results for a range of different cell lines. The comparisons demonstrate a close agreement for the vast majority of images that were examined and indicate that the present computational tool can reproduce statistically significant results in experiments with well-known cell migration inhibitors and enhancers.
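
    As an illustration of the quantity TScratch automates — the open, cell-free area of a wound-healing image — a minimal sketch follows. TScratch itself uses the fast discrete curvelet transform; the sketch below substitutes a simple texture threshold, and the input file name is hypothetical.

```python
# Minimal sketch of open-wound-area measurement, assuming a grayscale
# phase-contrast image "scratch.png" (hypothetical file). TScratch itself
# uses the fast discrete curvelet transform; a plain texture threshold is
# used here only to illustrate the quantity being computed.
import numpy as np
from scipy import ndimage
from skimage import io, filters

img = io.imread("scratch.png", as_gray=True)
# Cells are textured while the cell-free wound is smooth: use the local
# gradient magnitude as a simple texture measure.
texture = ndimage.gaussian_gradient_magnitude(img, sigma=2)
cells = texture > filters.threshold_otsu(texture)
open_fraction = 1.0 - cells.mean()  # fraction of the image free of cells
print(f"Open (cell-free) area: {open_fraction:.1%}")
```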

  2. OpenMS: a flexible open-source software platform for mass spectrometry data analysis.

    Science.gov (United States)

    Röst, Hannes L; Sachsenberg, Timo; Aiche, Stephan; Bielow, Chris; Weisser, Hendrik; Aicheler, Fabian; Andreotti, Sandro; Ehrlich, Hans-Christian; Gutenbrunner, Petra; Kenar, Erhan; Liang, Xiao; Nahnsen, Sven; Nilse, Lars; Pfeuffer, Julianus; Rosenberger, George; Rurik, Marc; Schmitt, Uwe; Veit, Johannes; Walzer, Mathias; Wojnar, David; Wolski, Witold E; Schilling, Oliver; Choudhary, Jyoti S; Malmström, Lars; Aebersold, Ruedi; Reinert, Knut; Kohlbacher, Oliver

    2016-08-30

    High-resolution mass spectrometry (MS) has become an important tool in the life sciences, contributing to the diagnosis and understanding of human diseases, elucidating biomolecular structural information and characterizing cellular signaling networks. However, the rapid growth in the volume and complexity of MS data makes transparent, accurate and reproducible analysis difficult. We present OpenMS 2.0 (http://www.openms.de), a robust, open-source, cross-platform software specifically designed for the flexible and reproducible analysis of high-throughput MS data. The extensible OpenMS software implements common mass spectrometric data processing tasks through a well-defined application programming interface in C++ and Python and through standardized open data formats. OpenMS additionally provides a set of 185 tools and ready-made workflows for common mass spectrometric data processing tasks, which enable users to perform complex quantitative mass spectrometric analyses with ease.
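
    OpenMS's Python bindings are distributed as pyopenms; a minimal sketch of loading an mzML file through that API follows (the input file name is hypothetical).

```python
# Load an mzML file with pyopenms and report basic per-spectrum
# information. "sample.mzML" is a hypothetical input file.
import pyopenms as oms

exp = oms.MSExperiment()
oms.MzMLFile().load("sample.mzML", exp)

print("Spectra:", exp.getNrSpectra())
for spec in exp:
    if spec.getMSLevel() == 1:
        mz, intensity = spec.get_peaks()
        print(f"RT {spec.getRT():.1f}s  peaks: {mz.size}")
```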

  3. An Analysis of Mimosa pudica Leaves Movement by Using LoggerPro Software

    Science.gov (United States)

    Sugito; Susilo; Handayani, L.; Marwoto, P.

    2016-08-01

    The unique phenomena of Mimosa pudica are the closing and opening movements of its leaves when they receive a stimulus. With suitable software, these movements can be plotted as a graph and analysed. LoggerPro provides the facilities needed to analyse recorded videos of the plant's reaction to a stimulus, and the resulting graph then allows several variables to be analysed. The results showed that the plant's movement fits an equation of the form y = mx + c.
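
    The reported fit y = mx + c can be reproduced on exported tracking data with an ordinary least-squares line; in the sketch below the (t, y) values are hypothetical placeholders for a LoggerPro export.

```python
# Fitting y = mx + c to tracked leaf positions exported from video
# analysis. The (t, y) values are hypothetical placeholders.
import numpy as np

t = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5])   # time (s)
y = np.array([10.0, 8.9, 8.1, 7.0, 6.1, 4.9])  # leaf position (mm)

m, c = np.polyfit(t, y, deg=1)                 # least-squares line
print(f"y = {m:.2f} t + {c:.2f}")
```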

  4. Periodic precipitation a microcomputer analysis of transport and reaction processes in diffusion media, with software development

    CERN Document Server

    Henisch, H K

    1991-01-01

    Containing illustrations, worked examples, graphs and tables, this book deals with periodic precipitation (also known as Liesegang Ring formation) in terms of mathematical models and their logical consequences, and is entirely concerned with microcomputer analysis and software development. Three distinctive periodic precipitation mechanisms are included: binary diffusion-reaction; solubility modulation, and competitive particle growth. The book provides didactic illustrations of a valuable investigational procedure, in the form of hypothetical experimentation by microcomputer. The development

  5. [Hardware and software for EMG recording and analysis of respiratory muscles of human].

    Science.gov (United States)

    Solnushkin, S D; Chakhman, V N; Segizbaeva, M O; Pogodin, M A; Aleksandrov, V G

    2014-01-01

    This paper presents a new hardware and software system that not only records the EMG of different groups of respiratory muscles, but also performs amplitude-frequency analysis of the signal, making it possible to determine changes in the contribution of each respiratory muscle to the work of breathing and to detect early signs of respiratory muscle fatigue. The presented complex can be used for functional diagnostics of breathing in patients as well as in healthy people and athletes.
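
    The abstract does not specify the exact amplitude-frequency analysis used; a common fatigue indicator of this kind is the median frequency of the EMG power spectrum, which drifts downward as a muscle fatigues. A minimal sketch on synthetic data:

```python
# Median-frequency estimate from an EMG power spectrum, a common fatigue
# indicator (median frequency falls as a muscle fatigues). The signal
# below is synthetic; the described system's exact analysis is not
# specified in the abstract.
import numpy as np
from scipy.signal import welch

fs = 1000                               # sampling rate (Hz), assumed
rng = np.random.default_rng(0)
emg = rng.normal(size=10 * fs)          # placeholder for a recorded EMG

f, pxx = welch(emg, fs=fs, nperseg=1024)
cum = np.cumsum(pxx)
median_freq = f[np.searchsorted(cum, cum[-1] / 2)]
print(f"Median frequency: {median_freq:.1f} Hz")
```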

  6. Spectral Graph Theory Analysis of Software-Defined Networks to Improve Performance and Security

    Science.gov (United States)

    2015-09-01

    Background on graph theory, spectral graph theory and closed-loop control is provided in Chapter II, including the dual-basis and its role in defining the state of the network. …finding is a significant research area within graph theory; its applications range from finding groups within social networks [35] to finding clusters… (Dissertation by Thomas C. Parker, September 2015.)
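
    A typical starting point for spectral graph analysis of a network is the Laplacian spectrum; the second-smallest eigenvalue (the algebraic connectivity) measures how well connected the topology is. A minimal sketch on an illustrative four-node network:

```python
# Algebraic connectivity (Fiedler value) of a small network from its
# graph Laplacian L = D - A. The 4-node topology here is illustrative,
# not taken from the dissertation.
import numpy as np

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)  # adjacency matrix
L = np.diag(A.sum(axis=1)) - A             # graph Laplacian
eigenvalues = np.linalg.eigvalsh(L)        # sorted ascending
# The second-smallest eigenvalue is > 0 exactly when the graph is connected.
print("Algebraic connectivity:", eigenvalues[1])
```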

  7. HOURS: Simulation and analysis software for the KM3NeT

    Science.gov (United States)

    Tsirigotis, A. G.; Leisos, A.; Tzamarias, S. E.

    2017-02-01

    The Hellenic Open University Reconstruction & Simulation (HOURS) software package contains a realistic simulation package of the detector response of very large (km3-scale) underwater neutrino telescopes, including an accurate description of all the relevant physical processes, the production of signal and background as well as several analysis strategies for triggering and pattern recognition, event reconstruction, tracking and energy estimation. HOURS also provides tools for simulating calibration techniques and other studies for estimating the detector sensitivity to several neutrino sources.

  8. A Study of Method on Connectivity Analysis of Between Software Components

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    An analysis and computation method for connectivity between components based on logical subtyping is first presented; the concepts of virtual interface and real interface, and a quantitative formula for computing connectivity between interfaces, are also introduced, based on an extensible software-architecture specification language model. We provide a new idea for solving the problem of connections between reusable components.

  9. CSA06 Computing, Software and Analysis challenge at the Spanish Tier-1 and Tier-2 sites

    CERN Document Server

    Alcaraz, J; Cabrillo, Iban Jose; Colino, Nicanor; Cuevas-Maestro, J; Delgado Peris, Antonio; Fernandez Menendez, Javier; Flix, Jose; García-Abia, Pablo; González-Caballero, I; Hernández, Jose M; Marco, Rafael; Martinez Ruiz del Arbol, Pablo; Matorras, Francisco; Merino, Gonzalo; Rodríguez-Calonge, F J; Vizan Garcia, Jesus Manuel

    2007-01-01

    This note describes the participation of the Spanish centres PIC, CIEMAT and IFCA, as Tier-1 and Tier-2 sites, in the CMS CSA06 Computing, Software and Analysis challenge. A number of the facilities, services and workflows were demonstrated at 25% of the scale expected in 2008. Very valuable experience was gained running the complex computing system under realistic conditions at a significant scale. The focus of this note is on presenting achieved results, operational experience and lessons learnt during the challenge.

  10. RCAUSE – A ROOT CAUSE ANALYSIS MODEL TO IDENTIFY THE ROOT CAUSES OF SOFTWARE REENGINEERING PROBLEMS

    Directory of Open Access Journals (Sweden)

    Er. Anand Rajavat

    2011-01-01

    Full Text Available Organizations that wish to modernize their legacy systems must adopt a financially viable evolution strategy to satisfy the needs of the modern business environment. Various options are available to modernize a legacy system into a more contemporary system. Over the last few years, legacy system reengineering has emerged as a popular system modernization technique. Reengineering generally focuses on increasing the productivity and quality of the system. However, many of these efforts are less than successful because they concentrate only on the symptoms of software reengineering risk without targeting the root causes of those risks. A subjective assessment (diagnosis) of software reengineering risk across the different domains of a legacy system is required to identify those root causes. The goal of this paper is to highlight the root causes of software reengineering risk. We propose a root cause analysis model, RCause, that classifies the root causes of software reengineering risk into three distinct but connected areas of interest: the system domain, the managerial domain and the technical domain.

  11. CONAN: copy number variation analysis software for genome-wide association studies

    Directory of Open Access Journals (Sweden)

    Wichmann Heinz-Erich

    2010-06-01

    Full Text Available Abstract Background Genome-wide association studies (GWAS) based on single nucleotide polymorphisms (SNPs) revolutionized our perception of the genetic regulation of complex traits and diseases. Copy number variations (CNVs) promise to shed additional light on the genetic basis of monogenic as well as complex diseases and phenotypes. Indeed, the number of detected associations between CNVs and certain phenotypes is constantly increasing. However, while several software packages support the determination of CNVs from SNP chip data, the downstream statistical inference of CNV-phenotype associations is still subject to complicated and inefficient in-house solutions, thus strongly limiting the performance of GWAS based on CNVs. Results CONAN is a freely available client-server software solution which provides an intuitive graphical user interface for categorizing, analyzing and associating CNVs with phenotypes. Moreover, CONAN assists the evaluation process by visualizing detected associations via Manhattan plots in order to enable a rapid identification of genome-wide significant CNV regions. Various file formats including the information on CNVs in population samples are supported as input data. Conclusions CONAN facilitates the performance of GWAS based on CNVs and the visual analysis of calculated results. CONAN provides a rapid, valid and straightforward software solution to identify genetic variation underlying the 'missing' heritability for complex traits that remains unexplained by recent GWAS. The freely available software can be downloaded at http://genepi-conan.i-med.ac.at.

  12. Information Extraction for System-Software Safety Analysis: Calendar Year 2008 Year-End Report

    Science.gov (United States)

    Malin, Jane T.

    2009-01-01

    This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.
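
    Step 3 of the pipeline — graph analysis to find paths from hazard sources to vulnerable entities — can be illustrated with a plain path search over a component-connection graph; the node names and edges below are hypothetical.

```python
# Illustrative path search from a hazard source to a vulnerable function
# over a component-connection graph, in the spirit of step 3 above.
# Node names and edges are hypothetical.
import networkx as nx

g = nx.DiGraph()
g.add_edges_from([
    ("thermal_runaway", "battery_bus"),
    ("battery_bus", "flight_computer"),
    ("battery_bus", "comm_unit"),
    ("flight_computer", "guidance_function"),
])

for path in nx.all_simple_paths(g, "thermal_runaway", "guidance_function"):
    print(" -> ".join(path))
```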

  13. T-REX: software for the processing and analysis of T-RFLP data

    Directory of Open Access Journals (Sweden)

    Culman Steven W

    2009-06-01

    Full Text Available Abstract Background Despite increasing popularity and improvements in terminal restriction fragment length polymorphism (T-RFLP) and other microbial community fingerprinting techniques, there are still numerous obstacles that hamper the analysis of these datasets. Many steps are required to process raw data into a format ready for analysis and interpretation. These steps can be time-intensive, error-prone, and can introduce unwanted variability into the analysis. Accordingly, we developed T-REX, free, online software for the processing and analysis of T-RFLP data. Results Analysis of T-RFLP data generated from a multiple-factorial study was performed with T-REX. With this software, we were able to (i) label raw data with attributes related to the experimental design of the samples, (ii) determine a baseline threshold for identification of true peaks over noise, (iii) align terminal restriction fragments (T-RFs) in all samples (i.e., bin T-RFs), (iv) construct a two-way data matrix from labeled data and process the matrix in a variety of ways, (v) produce several measures of data matrix complexity, including the distribution of variance between main and interaction effects and sample heterogeneity, and (vi) analyze a data matrix with the additive main effects and multiplicative interaction (AMMI) model. Conclusion T-REX provides a free, platform-independent tool to the research community that allows for an integrated, rapid, and more robust analysis of T-RFLP data.

  14. OpenFLUX: efficient modelling software for 13C-based metabolic flux analysis

    Directory of Open Access Journals (Sweden)

    Nielsen Lars K

    2009-05-01

    Full Text Available Abstract Background The quantitative analysis of metabolic fluxes, i.e., in vivo activities of intracellular enzymes and pathways, provides key information on biological systems in systems biology and metabolic engineering. It is based on a comprehensive approach combining (i) tracer cultivation on 13C substrates, (ii) 13C labelling analysis by mass spectrometry and (iii) mathematical modelling for experimental design, data processing, flux calculation and statistics. Whereas the cultivation and the analytical part are fairly advanced, a lack of appropriate modelling software solutions for all modelling aspects in flux studies is limiting the application of metabolic flux analysis. Results We have developed OpenFLUX as a user-friendly, yet flexible software application for small- and large-scale 13C metabolic flux analysis. The application is based on the new Elementary Metabolite Unit (EMU) framework, significantly enhancing computation speed for flux calculation. From simple notation of metabolic reaction networks defined in a spreadsheet, the OpenFLUX parser automatically generates MATLAB-readable metabolite and isotopomer balances, thus strongly facilitating model creation. The model can be used to perform experimental design, parameter estimation and sensitivity analysis using either the built-in gradient-based search or Monte Carlo algorithms, or user-defined algorithms. Exemplified for a microbial flux study with 71 reactions, 8 free flux parameters and mass isotopomer distributions of 10 metabolites, OpenFLUX automatically compiled the EMU-based model from an Excel file containing metabolic reactions and carbon transfer mechanisms, demonstrating its user-friendliness. It reliably reproduced the published data, and optimum flux distributions for the network under study were found quickly. Conclusion We have developed a fast, accurate application to perform steady-state 13C metabolic flux analysis. OpenFLUX will strongly facilitate and…

  15. ANATI QUANTI: quantitative analysis software for plant anatomy studies

    Directory of Open Access Journals (Sweden)

    T.V. Aguiar

    2007-12-01

    Full Text Available In several interdisciplinary studies in which Plant Anatomy is used, complementary quantitative analyses are necessary. Generally, micromorphometric evaluation is performed manually and/or using non-specific image analysis software. This work aimed to develop a program specific to quantitative Plant Anatomy and to test its efficiency and acceptance by users. The solution was written in the Java language, for greater portability across operating systems. The software, named ANATI QUANTI, was tested by students, researchers and professors of the Plant Anatomy Laboratory of the Federal University of Viçosa (UFV). All respondents received photos on which to make measurements in ANATI QUANTI and compared the results with those obtained using the software already available. Through previously formulated questionnaires, the volunteers highlighted the main advantages and disadvantages of the developed program relative to the available software. Besides being more specific, simpler and faster than the available software, ANATI QUANTI is reliable, meeting the expectations of the respondents. However, additional resources, such as the insertion of new scales, would widen its range of users. ANATI QUANTI is already in use in research carried out by users at UFV. As free, open-source software, it will be made freely available on the internet.

  16. Featureous: A Tool for Feature-Centric Analysis of Java Software

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    Feature-centric comprehension of source code is necessary for incorporating user-requested modifications during software evolution and maintenance. However, such comprehension is difficult to achieve in the case of large object-oriented programs due to the size, complexity, and implicit character of mappings between features and source code. To support programmers in overcoming these difficulties, we present a feature-centric analysis tool, Featureous. Our tool extends the NetBeans IDE with mechanisms for efficient location of feature implementations in legacy source code, and an extensive analysis…

  17. Algal Functional Annotation Tool: a web-based analysis suite to functionally interpret large gene lists using integrated annotation and expression data

    Directory of Open Access Journals (Sweden)

    Merchant Sabeeha S

    2011-07-01

    Full Text Available Abstract Background Progress in genome sequencing is proceeding at an exponential pace, and several new algal genomes are becoming available every year. One of the challenges facing the community is the association of protein sequences encoded in the genomes with biological function. While most genome assembly projects generate annotations for predicted protein sequences, they are usually limited and integrate functional terms from a limited number of databases. Another challenge is the use of annotations to interpret large lists of 'interesting' genes generated by genome-scale datasets. Previously, these gene lists had to be analyzed across several independent biological databases, often on a gene-by-gene basis. In contrast, several annotation databases, such as DAVID, integrate data from multiple functional databases and reveal underlying biological themes of large gene lists. While several such databases have been constructed for animals, none is currently available for the study of algae. Due to renewed interest in algae as potential sources of biofuels and the emergence of multiple algal genome sequences, a significant need has arisen for such a database to process the growing compendiums of algal genomic data. Description The Algal Functional Annotation Tool is a web-based comprehensive analysis suite integrating annotation data from several pathway, ontology, and protein family databases. The current version provides annotation for the model alga Chlamydomonas reinhardtii, and in the future will include additional genomes. The site allows users to interpret large gene lists by identifying associated functional terms, and their enrichment. Additionally, expression data for several experimental conditions were compiled and analyzed to provide an expression-based enrichment search. A tool to search for functionally-related genes based on gene expression across these conditions is also provided. Other features include dynamic visualization of

  18. Orion ECLSS/Suit System Intermediate Pressure Integrated Suit Test

    Science.gov (United States)

    Barido, Richard A.

    2014-01-01

    The Intermediate Pressure Integrated Suit Test (IPIST) phase of the integrated system testing of the Orion Vehicle Atmosphere Revitalization System (ARS) technology was conducted for the Multipurpose Crew Vehicle (MPCV) Program within the National Aeronautics and Space Administration (NASA) Exploration Systems Mission Directorate. This test was performed in the eleven-foot human-rated vacuum chamber at the NASA Johnson Space Center by the Crew and Thermal Systems Division. This testing is the second phase of suit loop testing to demonstrate the viability of the Environmental Control and Life Support System (ECLSS) being developed for Orion. The IPIST configuration consisted of development hardware that included the CAMRAS, air revitalization loop fan and suit loop regulator. Two test subjects were in pressure suits at varying suit pressures. Follow-on testing, to be conducted in 2014, will utilize the same hardware with human test subjects in pressure suits at vacuum. This paper will discuss the results and findings of IPIST and will also discuss future testing.

  19. ACQ4: an open-source software platform for data acquisition and analysis in neurophysiology research.

    Science.gov (United States)

    Campagnola, Luke; Kratz, Megan B; Manis, Paul B

    2014-01-01

    The complexity of modern neurophysiology experiments requires specialized software to coordinate multiple acquisition devices and analyze the collected data. We have developed ACQ4, an open-source software platform for performing data acquisition and analysis in experimental neurophysiology. This software integrates the tasks of acquiring, managing, and analyzing experimental data. ACQ4 has been used primarily for standard patch-clamp electrophysiology, laser scanning photostimulation, multiphoton microscopy, intrinsic imaging, and calcium imaging. The system is highly modular, which facilitates the addition of new devices and functionality. The modules included with ACQ4 provide for rapid construction of acquisition protocols, live video display, and customizable analysis tools. Position-aware data collection allows automated construction of image mosaics and registration of images with 3-dimensional anatomical atlases. ACQ4 uses free and open-source tools including Python, NumPy/SciPy for numerical computation, PyQt for the user interface, and PyQtGraph for scientific graphics. Supported hardware includes cameras, patch clamp amplifiers, scanning mirrors, lasers, shutters, Pockels cells, motorized stages, and more. ACQ4 is available for download at http://www.acq4.org.

  20. MyView2, a new visualization software tool for analysis of LHD data

    Energy Technology Data Exchange (ETDEWEB)

    Moon, Chanho, E-mail: moon@nifs.ac.jp; Yoshinuma, Mikirou; Emoto, Masahiko; Ida, Katsumi

    2016-03-15

    The Large Helical Device (LHD) at the National Institute for Fusion Science (NIFS) is the world's largest superconducting helical fusion device, providing a scientific research center for elucidating important physics topics such as plasma transport and turbulence dynamics. Many types of advanced diagnostic devices are used to measure the confinement plasma characteristics, and these valuable physical data are registered over the 131,000 discharges in the LHD database. However, it is difficult to investigate the experimental data efficiently, even though a large amount of physical data has been registered. In order to improve the efficiency of investigating plasma physics in LHD, we have developed new data visualization software, MyView2, which consists of Python-based modules that can be easily set up and updated. MyView2 provides immediate access to experimental results, cross-shot analysis, and a collaboration point for scientific research. In particular, MyView2 provides a portable framework for making LHD experimental data viewable on on-site and off-site web servers, a capability not previously available in any general-use tool. We also discuss the benefits of using the MyView2 software for in-depth analysis of LHD experimental data.

  1. plusTipTracker: Quantitative image analysis software for the measurement of microtubule dynamics.

    Science.gov (United States)

    Applegate, Kathryn T; Besson, Sebastien; Matov, Alexandre; Bagonis, Maria H; Jaqaman, Khuloud; Danuser, Gaudenz

    2011-11-01

    Here we introduce plusTipTracker, a Matlab-based open source software package that combines automated tracking, data analysis, and visualization tools for movies of fluorescently-labeled microtubule (MT) plus end binding proteins (+TIPs). Although +TIPs mark only phases of MT growth, the plusTipTracker software allows inference of additional MT dynamics, including phases of pause and shrinkage, by linking collinear, sequential growth tracks. The algorithm underlying the reconstruction of full MT trajectories relies on the spatially and temporally global tracking framework described in Jaqaman et al. (2008). Post-processing of track populations yields a wealth of quantitative phenotypic information about MT network architecture that can be explored using several visualization modalities and bioinformatics tools included in plusTipTracker. Graphical user interfaces enable novice Matlab users to track thousands of MTs in minutes. In this paper, we describe the algorithms used by plusTipTracker and show how the package can be used to study regional differences in the relative proportion of MT subpopulations within a single cell. The strategy of grouping +TIP growth tracks for the analysis of MT dynamics has been introduced before (Matov et al., 2010). The numerical methods and analytical functionality incorporated in plusTipTracker substantially advance this previous work in terms of flexibility and robustness. To illustrate the enhanced performance of the new software we thus compare computer-assembled +TIP-marked trajectories to manually-traced MT trajectories from the same movie used in Matov et al. (2010).
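
    The linking idea — classifying the unlabeled interval between two sequential growth tracks — can be sketched from track endpoints. plusTipTracker's actual criteria follow the cost-based framework of Jaqaman et al. (2008); the rule and thresholds below are simplified, hypothetical stand-ins.

```python
# Sketch of the gap-classification idea: the interval between two
# sequential growth tracks of the same microtubule is a pause if the
# plus end barely moves and a shrinkage if it moves backward. The rule
# and thresholds are hypothetical simplifications.
import numpy as np

def classify_gap(track_end, track_dir, restart, pause_tol=0.5):
    """Classify the interval between two sequential growth tracks.

    track_end : (x, y) where the first growth track terminates
    track_dir : unit vector of the first track's growth direction
    restart   : (x, y) where the next collinear growth track begins
    """
    along = np.dot(np.asarray(restart) - np.asarray(track_end), track_dir)
    if along < -pause_tol:
        return "shrinkage"   # plus end moved backward
    if abs(along) <= pause_tol:
        return "pause"       # plus end roughly stationary
    return "growth gap"      # forward motion missed by detection

print(classify_gap((10.0, 5.0), np.array([1.0, 0.0]), (7.5, 5.0)))  # shrinkage
```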

  2. Plume Ascent Tracker: Interactive Matlab software for analysis of ascending plumes in image data

    Science.gov (United States)

    Valade, S. A.; Harris, A. J. L.; Cerminara, M.

    2014-05-01

    This paper presents Matlab-based software designed to track and analyze an ascending plume as it rises above its source, in image data. It reads data recorded in various formats (video files, image files, or web-camera image streams), and at various wavelengths (infrared, visible, or ultra-violet). Using a set of filters which can be set interactively, the plume is first isolated from its background. A user-friendly interface then allows tracking of plume ascent and various parameters that characterize plume evolution during emission and ascent. These include records of plume height, velocity, acceleration, shape, volume, ash (fine-particle) loading, spreading rate, entrainment coefficient and inclination angle, as well as axial and radial profiles for radius and temperature (if data are radiometric). Image transformations (dilatation, rotation, resampling) can be performed to create new images with a vent-centered metric coordinate system. Applications may interest both plume observers (monitoring agencies) and modelers. For the first group, the software is capable of providing quantitative assessments of plume characteristics from image data, for post-event analysis or in near real-time analysis. For the second group, extracted data can serve as benchmarks for plume ascent models, and as inputs for cloud dispersal models. We here describe the software's tracking methodology and main graphical interfaces, using thermal infrared image data of an ascending volcanic ash plume at Santiaguito volcano.

  3. ACQ4: an open-source software platform for data acquisition and analysis in neurophysiology research

    Directory of Open Access Journals (Sweden)

Luke Campagnola

    2014-01-01

    Full Text Available The complexity of modern neurophysiology experiments requires specialized software to coordinate multiple acquisition devices and analyze the collected data. We have developed ACQ4, an open-source software platform for performing data acquisition and analysis in experimental neurophysiology. This software integrates the tasks of acquiring, managing, and analyzing experimental data. ACQ4 has been used primarily for standard patch-clamp electrophysiology, laser scanning photostimulation, multiphoton microscopy, intrinsic imaging, and calcium imaging. The system is highly modular, which facilitates the addition of new devices and functionality. The modules included with ACQ4 provide for rapid construction of acquisition protocols, live video display, and customizable analysis tools. Position-aware data collection allows automated construction of image mosaics and registration of images with 3-dimensional anatomical atlases. ACQ4 uses free and open-source tools including Python, NumPy/SciPy for numerical computation, PyQt for the user interface, and PyQtGraph for scientific graphics. Supported hardware includes cameras, patch clamp amplifiers, scanning mirrors, lasers, shutters, Pockels cells, motorized stages, and more. ACQ4 is available for download at http://www.acq4.org.

  4. Ignominy:a Tool for Software Dependency and Metric Analysis with Examples from Large HEP Packages

    Institute of Scientific and Technical Information of China (English)

Lassi A. Tuura; Lucas Taylor

    2001-01-01

    Ignominy is a tool developed in the CMS IGUANA project to analyse the structure of software systems. Its primary component is a dependency scanner that distills information into human-usable forms. It also includes several tools to visualise the collected data in the form of graphical views and numerical metrics. Ignominy was designed to adapt to almost any reasonable structure, and it has been used to analyse several large projects. The original purpose of Ignominy was to help us better ensure the quality of our own software, and in particular to warn us about possible structural problems early on. As part of this activity it is now used as a standard part of our release procedure. We also use it to evaluate and study the quality of external packages we plan to make use of. We describe what Ignominy can find out, and how it can be used to visualise and assess a software structure. We also discuss the inherent problems of the analysis as well as the different approaches to modularity the tool makes quite evident. The focus is the illustration of these issues through the analysis results for several sizable HEP software projects.

  5. Software Analysis of Uncorrelated MS1 Peaks for Discovery of Post-Translational Modifications.

    Science.gov (United States)

    Pascal, Bruce D; West, Graham M; Scharager-Tapia, Catherina; Flefil, Ricardo; Moroni, Tina; Martinez-Acedo, Pablo; Griffin, Patrick R; Carvalloza, Anthony C

    2015-12-01

    The goal in proteomics to identify all peptides in a complex mixture has been largely addressed using various LC MS/MS approaches, such as data-dependent acquisition, SRM/MRM, and data-independent acquisition instrumentation. Despite these developments, many peptides remain unsequenced, often due to low abundance, poor fragmentation patterns, or data analysis difficulties. Many of the unidentified peptides exhibit strong evidence in high-resolution MS1 data and are frequently post-translationally modified, playing a significant role in biological processes. Proteomics Workbench (PWB) software was developed to automate the detection and visualization of all possible peptides in MS1 data, reveal candidate peptides not initially identified, and build inclusion lists for subsequent MS2 analysis to uncover new identifications. We used this software on existing data on the autophagy-regulating kinase Ulk1 as a proof of concept for this method, as we had already manually identified a number of phosphorylation sites (Dorsey, F. C., et al., J. Proteome Res. 8(11), 5253-5263 (2009)). PWB found all previously identified sites of phosphorylation. The software has been made freely available at http://www.proteomicsworkbench.com .

  6. Evaluation of a Game to Teach Requirements Collection and Analysis in Software Engineering at Tertiary Education Level

    Science.gov (United States)

    Hainey, Thomas; Connolly, Thomas M.; Stansfield, Mark; Boyle, Elizabeth A.

    2011-01-01

    A highly important part of software engineering education is requirements collection and analysis which is one of the initial stages of the Database Application Lifecycle and arguably the most important stage of the Software Development Lifecycle. No other conceptual work is as difficult to rectify at a later stage or as damaging to the overall…

  7. ELAN: a software package for analysis and visualization of MEG, EEG, and LFP signals.

    Science.gov (United States)

    Aguera, Pierre-Emmanuel; Jerbi, Karim; Caclin, Anne; Bertrand, Olivier

    2011-01-01

    The recent surge in computational power has led to extensive methodological developments and advanced signal processing techniques that play a pivotal role in neuroscience. In particular, the field of brain signal analysis has witnessed a strong trend towards multidimensional analysis of large data sets, for example, single-trial time-frequency analysis of high spatiotemporal resolution recordings. Here, we describe the freely available ELAN software package which provides a wide range of signal analysis tools for electrophysiological data including scalp electroencephalography (EEG), magnetoencephalography (MEG), intracranial EEG, and local field potentials (LFPs). The ELAN toolbox is based on 25 years of methodological developments at the Brain Dynamics and Cognition Laboratory in Lyon and was used in many papers including the very first studies of time-frequency analysis of EEG data exploring evoked and induced oscillatory activities in humans. This paper provides an overview of the concepts and functionalities of ELAN, highlights its specificities, and describes its complementarity and interoperability with other toolboxes.

  8. Introduction to the KWALON Experiment: Discussions on Qualitative Data Analysis Software by Developers and Users

    Directory of Open Access Journals (Sweden)

    Jeanine C. Evers

    2010-11-01

    Full Text Available In this introduction to the KWALON Experiment and related conference, we describe the motivations of the collaborating European networks in organising this joint endeavour. The KWALON Experiment consisted of five developers of Qualitative Data Analysis (QDA) software analysing a dataset regarding the financial crisis in the time period 2008-2009, provided by the conference organisers. Besides this experiment, researchers were invited to present their reflective papers on the use of QDA software. This introduction gives a description of the experiment, the "rules", research questions and reflective points, as well as a full description of the dataset and search rules used, and our reflection on the lessons learned. The related conference is described, as are the papers which are included in this FQS issue. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1101405

  9. Versatile Software Package For Near Real-Time Analysis of Experimental Data

    Science.gov (United States)

    Wieseman, Carol D.; Hoadley, Sherwood T.

    1998-01-01

    This paper provides an overview of a versatile software package developed for time- and frequency-domain analyses of experimental wind-tunnel data. This package, originally developed for analyzing data in the NASA Langley Transonic Dynamics Tunnel (TDT), is applicable for analyzing any time-domain data. A Matlab-based software package, TDT-analyzer, provides a compendium of commonly-required dynamic analysis functions in a user-friendly interactive and batch processing environment. TDT-analyzer has been used extensively to provide on-line near real-time and post-test examination and reduction of measured data acquired during wind tunnel tests of aeroelastically-scaled models of aircraft and rotorcraft as well as a flight test of the NASA High Alpha Research Vehicle (HARV) F-18. The package provides near real-time results in an informative and timely manner far exceeding prior methods of data reduction at the TDT.

  10. Madagascar: open-source software project for multidimensional data analysis and reproducible computational experiments

    Directory of Open Access Journals (Sweden)

    Sergey Fomel

    2013-11-01

    Full Text Available The Madagascar software package is designed for analysis of large-scale multidimensional data, such as those occurring in exploration geophysics. Madagascar provides a framework for reproducible research. By “reproducible research” we refer to the discipline of attaching software codes and data to computational results reported in publications. The package contains a collection of (a) computational modules, (b) data-processing scripts, and (c) research papers. Madagascar is distributed on SourceForge under a GPL v2 license (https://sourceforge.net/projects/rsf/). By October 2013, more than 70 people from different organizations around the world had contributed to the project, with increasing year-to-year activity. The Madagascar website is http://www.ahay.org/.

  11. Value Benefit Analysis Software and Its Application in Bolu-Lake Abant Natural Park

    Directory of Open Access Journals (Sweden)

    Omer Lutfu Corbaci

    2008-09-01

    Full Text Available Value benefit analysis (VBA) is a psychometric instrument for finding the best compromise in forestry multiple-use planning when the multiple objectives cannot be expressed in the same physical or monetary unit. It ensures a systematic assessment of the consequences of proposed alternatives and thoroughly documents the decision process. The method leads to a ranking of alternatives based upon weighting of the objectives and evaluation of the contribution of each alternative to these objectives. The use of the method is illustrated with hypothetical data about Bolu-Lake Abant Natural Park (BLANP). In addition, computer software that checks the confidence of the results was created in this study. This software puts into practice the method proposed by Churchman and Ackoff, and determines the significance of the alternatives quickly and accurately.
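
    The weighting-and-ranking step of value benefit analysis reduces to a weighted sum of objective scores per alternative; a minimal sketch with hypothetical weights and scores:

```python
# Weighted-score ranking of alternatives in the spirit of value benefit
# analysis; the objectives, weights and scores below are hypothetical.
import numpy as np

objectives = ["recreation", "conservation", "revenue"]
weights = np.array([0.5, 0.3, 0.2])           # must sum to 1
alternatives = ["plan A", "plan B", "plan C"]
scores = np.array([[7, 4, 6],                 # rows: alternatives
                   [5, 8, 5],                 # cols: objectives, 0-10 scale
                   [6, 6, 8]], dtype=float)

value = scores @ weights                      # weighted value per alternative
for name, v in sorted(zip(alternatives, value), key=lambda p: -p[1]):
    print(f"{name}: {v:.2f}")
```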

  12. SOFTWARE METRICS VALIDATION METHODOLOGIES IN SOFTWARE ENGINEERING

    Directory of Open Access Journals (Sweden)

    K.P. Srinivasan

    2014-12-01

    Full Text Available In software measurement, assessing the validity of software metrics is a difficult task, owing to the lack of both theoretical and empirical methodology [41, 44, 45]. In recent years a number of researchers have addressed the issue of validating software metrics; at present, software metrics are validated theoretically using properties of measures. Software measurement plays an important role in understanding and controlling software development practices and products. The major requirement in software measurement is that the measures must accurately represent the attributes they purport to quantify, and validation is therefore critical to the success of software measurement. Validation is a collection of analysis and testing activities across the full life cycle that complements the efforts of other quality engineering functions, and it is a critical task in any engineering project. Its objective is to discover defects in a system and to assess whether or not the system is useful and usable in an operational situation. In software engineering, validation is one of the disciplines that helps build quality into software: the major objective of the software validation process is to determine that the software performs its intended functions correctly and to provide information about its quality and reliability. This paper discusses the validation methodologies, techniques and properties of measures that are used for software metrics validation in software engineering [1-50].

  13. Adiposoft: automated software for the analysis of white adipose tissue cellularity in histological sections.

    Science.gov (United States)

    Galarraga, Miguel; Campión, Javier; Muñoz-Barrutia, Arrate; Boqué, Noemí; Moreno, Haritz; Martínez, José Alfredo; Milagro, Fermín; Ortiz-de-Solórzano, Carlos

    2012-12-01

    The accurate estimation of the number and size of cells provides relevant information on the kinetics of growth and the physiological status of a given tissue or organ. Here, we present Adiposoft, a fully automated open-source software for the analysis of white adipose tissue cellularity in histological sections. First, we describe the sequence of image analysis routines implemented by the program. Then, we evaluate our software by comparing it with other adipose tissue quantification methods, namely, with the manual analysis of cells in histological sections (used as gold standard) and with the automated analysis of cells in suspension, the most commonly used method. Our results show significant concordance between Adiposoft and the other two methods. We also demonstrate the ability of the proposed method to distinguish the cellular composition of three different rat fat depots. Moreover, we found high correlation and low disagreement between Adiposoft and the manual delineation of cells. We conclude that Adiposoft provides accurate results while considerably reducing the amount of time and effort required for the analysis.
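
    The core measurement Adiposoft automates — counting cells and estimating their sizes from a segmented section — can be sketched with connected-component labeling; the binary mask below is a random placeholder for a real segmentation.

```python
# Counting cells and measuring their areas from a binary segmentation,
# the kind of quantity Adiposoft automates. "mask" is a random
# placeholder for a real segmentation of cell interiors.
import numpy as np
from skimage import measure

rng = np.random.default_rng(1)
mask = rng.random((256, 256)) > 0.7            # placeholder segmentation

labels = measure.label(mask)                   # connected components
areas = [r.area for r in measure.regionprops(labels)]
print(f"{len(areas)} cells, mean area {np.mean(areas):.1f} px^2")
```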

  14. Analysis of Dental Etching Using the Metallographic Microscope and AnalySIS Software

    OpenAIRE

    Consuelo Flores-Yáñez; Javier Martínez-Juárez; Mario Palma-Guzmán; Jorge Yáñez-Santos

    2009-01-01

    It is shown that the metallographic microscope and the AnalySIS software are suitable techniques for the analysis of acid etching of teeth. The vestibular surface of 25 recently extracted, caries-free permanent human teeth was etched; the surfaces were observed under the microscope (100X) and processed with the AnalySIS software. The diameter of randomly selected pores was determined, and different concentrations of phosphoric acid were evaluated while varying the exposure time. The results…

  15. HORACE: software for the analysis of data from single crystal spectroscopy experiments at time-of-flight neutron instruments

    CERN Document Server

    Ewings, R A; Le, M D; van Duijn, J; Bustinduy, I; Perring, T G

    2016-01-01

    The HORACE suite of programs has been developed to work with large multiple-measurement data sets collected from time-of-flight neutron spectrometers equipped with arrays of position-sensitive detectors. The software allows exploratory studies of the four dimensions of reciprocal space and excitation energy to be undertaken, enabling multi-dimensional subsets to be visualized and algebraically manipulated, and models for the scattering to be simulated or fitted to the data. The software is designed to be an extensible framework, thus allowing user-customized operations to be performed on the data. Examples of the use of its features are given for measurements exploring the spin waves of the simple antiferromagnet RbMnF$_{3}$ and ferromagnetic iron, and the phonons in URu$_{2}$Si$_{2}$.

  16. HORACE: Software for the analysis of data from single crystal spectroscopy experiments at time-of-flight neutron instruments

    Science.gov (United States)

    Ewings, R. A.; Buts, A.; Le, M. D.; van Duijn, J.; Bustinduy, I.; Perring, T. G.

    2016-10-01

    The HORACE suite of programs has been developed to work with large multiple-measurement data sets collected from time-of-flight neutron spectrometers equipped with arrays of position-sensitive detectors. The software allows exploratory studies of the four dimensions of reciprocal space and excitation energy to be undertaken, enabling multi-dimensional subsets to be visualized and algebraically manipulated, and models for the scattering to be simulated or fitted to the data. The software is designed to be an extensible framework, thus allowing user-customized operations to be performed on the data. Examples of the use of its features are given for measurements exploring the spin waves of the simple antiferromagnet RbMnF3 and ferromagnetic iron, and the phonons in URu2Si2.

  17. Mainport planning suite software services to support mainport planning

    OpenAIRE

    2007-01-01

    Sustainable growth and the commercial success of "Mainport Holland", located in one of Europe’s most densely populated areas, is threatened by a lack of available land, a congested infrastructure, and an increasingly complex social, economic and political reality. To deal with these threats, mainports such as the Port of Rotterdam are reengineering their planning processes. Instead of making plans based on an extrapolation of current trends, the aim is now to find answers to ...

  18. Mainport planning suite software services to support mainport planning

    NARCIS (Netherlands)

    Chin, R.T.H.

    2007-01-01

    Sustainable growth and the commercial success of "Mainport Holland", located in one of Europe’s most densely populated areas, is threatened by a lack of available land, a congested infrastructure, and an increasingly complex social, economic and political reality. To deal with the

  19. HPC Benchmark Suite NMx Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In the phase II effort, Intelligent Automation Inc. (IAI) and the University of Central Florida (UCF) propose to develop a comprehensive numerical test suite for...

  20. Grid search in stellar parameters: a software for spectrum analysis of single stars and binary systems

    Science.gov (United States)

    Tkachenko, A.

    2015-09-01

    Context. The currently operating space missions, as well as those that will be launched in the near future, will deliver high-quality data for millions of stellar objects. Since the majority of stellar astrophysical applications still (at least partly) rely on spectroscopic data, an efficient tool for the analysis of medium- to high-resolution spectroscopy is needed. Aims: We aim at developing an efficient software package for the analysis of medium- to high-resolution spectroscopy of single stars and those in binary systems. The major requirements are that the code should have a high performance, represent the state-of-the-art analysis tool, and provide accurate determinations of atmospheric parameters and chemical compositions for different types of stars. Methods: We use the method of atmosphere models and spectrum synthesis, which is one of the most commonly used approaches for the analysis of stellar spectra. Our Grid Search in Stellar Parameters (gssp) code makes use of the Message Passing Interface (OpenMPI) implementation, which makes it possible to run in parallel mode. The method is first tested on the simulated data and is then applied to the spectra of real stellar objects. Results: The majority of test runs on the simulated data were successful in that we were able to recover the initially assumed sets of atmospheric parameters. We experimentally find the limits in signal-to-noise ratios of the input spectra, below which the final set of parameters is significantly affected by the noise. Application of the gssp package to the spectra of three Kepler stars, KIC 11285625, KIC 6352430, and KIC 4931738, was also largely successful. We found an overall agreement of the final sets of the fundamental parameters with the original studies. For KIC 6352430, we found that dependence of the light dilution factor on wavelength cannot be ignored, as it has a significant impact on the determination of the atmospheric parameters of this binary system. Conclusions: The
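
    The basic operation behind a grid search in stellar parameters is a chi-square comparison of the observed spectrum against synthetic spectra computed over a parameter grid. The sketch below uses a hypothetical stand-in for the spectrum-synthesis step and a one-dimensional Teff grid.

```python
# Minimal chi-square grid search over one atmospheric parameter. The
# model function and data are hypothetical stand-ins for real synthetic-
# spectrum computation; gssp itself searches a multi-parameter grid.
import numpy as np

def synthetic_spectrum(teff, wavelength):
    """Hypothetical stand-in for a real spectrum-synthesis call."""
    return 1.0 - 0.5 * np.exp(-((wavelength - 5000.0) / (teff / 2000.0)) ** 2)

wavelength = np.linspace(4990, 5010, 200)
sigma = 0.01
observed = (synthetic_spectrum(6200.0, wavelength)
            + np.random.default_rng(2).normal(0, sigma, wavelength.size))

grid = np.arange(5000.0, 7500.0, 100.0)   # Teff grid (K)
chi2 = [np.sum(((observed - synthetic_spectrum(t, wavelength)) / sigma) ** 2)
        for t in grid]
print("Best-fit Teff:", grid[int(np.argmin(chi2))], "K")
```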

  1. BROCCOLI: Software for Fast fMRI Analysis on Many-Core CPUs and GPUs

    Directory of Open Access Journals (Sweden)

Anders Eklund

    2014-03-01

    Full Text Available Analysis of functional magnetic resonance imaging (fMRI) data is becoming ever more computationally demanding as temporal and spatial resolutions improve, and large, publicly available data sets proliferate. Moreover, methodological improvements in the neuroimaging pipeline, such as non-linear spatial normalization, non-parametric permutation tests and Bayesian Markov Chain Monte Carlo approaches, can dramatically increase the computational burden. Despite these challenges, there do not yet exist any fMRI software packages which leverage inexpensive and powerful graphics processing units (GPUs) to perform these analyses. Here, we therefore present BROCCOLI, a free software package written in OpenCL (Open Computing Language) that can be used for parallel analysis of fMRI data on a large variety of hardware configurations. BROCCOLI has, for example, been tested with an Intel CPU, an Nvidia GPU and an AMD GPU. These tests show that parallel processing of fMRI data can lead to significantly faster analysis pipelines. This speedup can be achieved on relatively standard hardware, but further, dramatic speed improvements require only a modest investment in GPU hardware. BROCCOLI (running on a GPU) can perform non-linear spatial normalization to a 1 mm3 brain template in 4-6 seconds, and run a second-level permutation test with 10,000 permutations in about a minute. These non-parametric tests are generally more robust than their parametric counterparts, and can also enable more sophisticated analyses by estimating complicated null distributions. Additionally, BROCCOLI includes support for Bayesian first-level fMRI analysis using a Gibbs sampler. The new software is freely available under GNU GPL3 and can be downloaded from github (https://github.com/wanderine/BROCCOLI/).
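
    A second-level permutation test of the kind BROCCOLI accelerates can be sketched on synthetic data: for a one-sample test, the null distribution of the group mean is built by randomly sign-flipping subject contrasts.

```python
# One-sample permutation test by random sign flipping of subject
# contrasts (synthetic data, 10,000 permutations), illustrating the
# second-level test described above for a single voxel.
import numpy as np

rng = np.random.default_rng(3)
contrasts = rng.normal(0.3, 1.0, size=20)   # one contrast per subject

observed = contrasts.mean()
n_perm = 10_000
null = np.empty(n_perm)
for i in range(n_perm):
    signs = rng.choice([-1.0, 1.0], size=contrasts.size)
    null[i] = (signs * contrasts).mean()

p = (np.sum(null >= observed) + 1) / (n_perm + 1)
print(f"observed mean {observed:.3f}, permutation p = {p:.4f}")
```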

  2. Development and validation of MIX: comprehensive free software for meta-analysis of causal research data

    Directory of Open Access Journals (Sweden)

    Ikeda Noriaki

    2006-10-01

    Full Text Available Abstract Background Meta-analysis has become a well-known method for synthesis of quantitative data from previously conducted research in applied health sciences. So far, meta-analysis has been particularly useful in evaluating and comparing therapies and in assessing causes of disease. Consequently, the number of software packages that can perform meta-analysis has increased over the years. Unfortunately, it can take a substantial amount of time to get acquainted with some of these programs, and most contain little or no interactive educational material. We set out to create and validate an easy-to-use and comprehensive meta-analysis package that would be simple enough programming-wise to remain available as a free download. We specifically aimed at students and researchers who are new to meta-analysis, with important parts of the development oriented towards creating internal interactive tutoring tools and designing features that would facilitate usage of the software as a companion to existing books on meta-analysis. Results We took an unconventional approach and created a program that uses Excel as a calculation and programming platform. The main programming language was Visual Basic, as implemented in Visual Basic 6 and Visual Basic for Applications in Excel 2000 and higher. The development took approximately two years and resulted in the 'MIX' program, which can be downloaded from the program's website free of charge. Next, we set out to validate the MIX output with two major software packages as reference standards, namely STATA (metan, metabias, and metatrim) and Comprehensive Meta-Analysis Version 2. Eight meta-analyses that had been published in major journals were used as data sources. All numerical and graphical results from analyses with MIX were identical to their counterparts in STATA and CMA. The MIX program distinguishes itself from most other programs by the extensive graphical output, the click-and-go (Excel) interface, and the…
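
    The core calculation of a basic meta-analysis is fixed-effect inverse-variance pooling: each study is weighted by the inverse of its variance, and the pooled standard error is the square root of the reciprocal of the summed weights. A minimal sketch with hypothetical study data:

```python
# Fixed-effect inverse-variance pooling, the core calculation of a
# basic meta-analysis; the study effects and variances are hypothetical.
import numpy as np

effects = np.array([0.42, 0.31, 0.55, 0.18])   # per-study effect sizes
variances = np.array([0.04, 0.09, 0.06, 0.05])

w = 1.0 / variances                            # inverse-variance weights
pooled = np.sum(w * effects) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
print(f"pooled effect {pooled:.3f} "
      f"(95% CI {pooled - 1.96*se:.3f} to {pooled + 1.96*se:.3f})")
```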

  3. A new parallel-vector finite element analysis software on distributed-memory computers

    Science.gov (United States)

    Qin, Jiangning; Nguyen, Duc T.

    1993-01-01

    A new parallel-vector finite element analysis software package MPFEA (Massively Parallel-vector Finite Element Analysis) is developed for large-scale structural analysis on massively parallel computers with distributed-memory. MPFEA is designed for parallel generation and assembly of the global finite element stiffness matrices as well as parallel solution of the simultaneous linear equations, since these are often the major time-consuming parts of a finite element analysis. Block-skyline storage scheme along with vector-unrolling techniques are used to enhance the vector performance. Communications among processors are carried out concurrently with arithmetic operations to reduce the total execution time. Numerical results on the Intel iPSC/860 computers (such as the Intel Gamma with 128 processors and the Intel Touchstone Delta with 512 processors) are presented, including an aircraft structure and some very large truss structures, to demonstrate the efficiency and accuracy of MPFEA.

  4. The GEMPAK Barnes interactive objective map analysis scheme. [General Meteorological Software Package

    Science.gov (United States)

    Koch, S. E.; Kocin, P. J.; Desjardins, M.

    1983-01-01

    The analysis scheme and meteorological applications of the GEMPAK data analysis and display software system developed by NASA are described. The program was devised to permit objective, versatile, and practical analysis of satellite meteorological data using a minicomputer and a display system with graphics capability. A data area can be selected within the data file for the globe, and data-sparse regions can be avoided. Distances between observations and the nearest observation points are calculated in order to avoid errors when determining synoptic weather conditions. The Barnes (1973) successive correction method is employed to restore the amplitude of small yet resolvable wavelengths suppressed in an initial filtering pass. The rms deviation is then calculated in relation to available measured data. Examples are provided of treatment of VISSR data from the GOES satellite and a study of the impact of incorrect cloud height data on synoptic weather field analysis.
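
    The Barnes successive correction scheme interpolates station data to a grid with Gaussian weights w_i = exp(-r_i^2 / kappa) and then adds a second pass over the residuals with a reduced length scale (gamma * kappa, gamma < 1) to restore small resolvable wavelengths. A minimal sketch for a single grid point, with hypothetical station data:

```python
# Two-pass Barnes objective analysis at one grid point: a Gaussian-
# weighted first guess plus a sharper correction pass on the station
# residuals. Station locations, values, kappa and gamma are hypothetical.
import numpy as np

obs_xy = np.array([[0.0, 0.0], [1.0, 0.5], [0.5, 1.5], [2.0, 1.0]])
obs_val = np.array([10.0, 12.0, 11.0, 14.0])
kappa, gamma = 1.0, 0.3

def barnes_pass(target_xy, xy, values, kappa):
    """Gaussian-weighted average of station values at one target point."""
    w = np.exp(-np.sum((xy - target_xy) ** 2, axis=1) / kappa)
    return np.sum(w * values) / np.sum(w)

grid_point = np.array([1.0, 1.0])
first = barnes_pass(grid_point, obs_xy, obs_val, kappa)
# Residuals at the stations after the first pass, then a sharper pass:
resid = obs_val - np.array([barnes_pass(p, obs_xy, obs_val, kappa)
                            for p in obs_xy])
analysis = first + barnes_pass(grid_point, obs_xy, resid, gamma * kappa)
print(f"analysed value at {grid_point}: {analysis:.2f}")
```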

  5. A software framework for the analysis of complex microscopy image data.

    Science.gov (United States)

    Chao, Jerry; Ward, E Sally; Ober, Raimund J

    2010-07-01

    Technological advances in both hardware and software have made possible the realization of sophisticated biological imaging experiments using the optical microscope. As a result, modern microscopy experiments are capable of producing complex image datasets. For a given data analysis task, the images in a set are arranged, based on the requirements of the task, by attributes such as the time and focus levels at which they were acquired. Importantly, different tasks performed over the course of an analysis are often facilitated by the use of different arrangements of the images. We present a software framework that supports the use of different logical image arrangements to analyze a physical set of images. This framework, called the Microscopy Image Analysis Tool (MIATool), realizes the logical arrangements using arrays of pointers to the images, thereby removing the need to replicate and manipulate the actual images in their storage medium. In order that they may be tailored to the specific requirements of disparate analysis tasks, these logical arrangements may differ in size and dimensionality, with no restrictions placed on the number of dimensions and the meaning of each dimension. MIATool additionally supports processing flexibility, extensible image processing capabilities, and data storage management.
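
    The pointer-array idea can be illustrated in a few lines. The sketch below (Python/NumPy, all names hypothetical; MIATool itself is a full application) shows how two logical arrangements of different shape and dimensionality can reference one physical image store without copying image data.

```python
import numpy as np

# One physical store of images (e.g., a time-lapse z-stack acquisition),
# kept on disk or in memory exactly once.
store = [np.random.rand(64, 64) for _ in range(12)]  # 12 acquired frames

# Logical arrangements are arrays of *indices* into the store, so the same
# pixels can be viewed in different shapes without replicating the images.
# Here: a 4 time points x 3 focus levels view for one task...
by_time_and_focus = np.arange(12).reshape(4, 3)
# ...and a flat chronological view of the same images for another task.
chronological = np.arange(12)

def view(arrangement, *idx):
    """Fetch the image a logical arrangement points to at a given position."""
    return store[int(arrangement[idx])]

img = view(by_time_and_focus, 2, 1)   # time point 2, focus level 1
same = view(chronological, 7)         # 8th frame overall (same store)
```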

  6. A Visual Basic simulation software tool for performance analysis of a membrane-based advanced water treatment plant.

    Science.gov (United States)

    Pal, P; Kumar, R; Srivastava, N; Chowdhury, J

    2014-02-01

    A Visual Basic simulation software package (WATTPPA) has been developed to analyse the performance of an advanced wastewater treatment plant. This user-friendly and menu-driven software is based on a dynamic mathematical model for an industrial wastewater treatment scheme that integrates chemical, biological and membrane-based unit operations. The software-predicted results agree closely with the experimental findings, as indicated by an overall correlation coefficient on the order of 0.99. The software permits pre-analysis and manipulation of input data, helps in optimization and displays the performance of an integrated plant visually on a graphical platform. It allows quick performance analysis of the whole system as well as of the individual units. The software, the first of its kind in its domain and built in the well-known Microsoft Excel environment, is likely to be very useful in the successful design, optimization and operation of an advanced hybrid treatment plant for hazardous wastewater.

  7. Software-Assisted Depth Analysis of Optic Nerve Stereoscopic Images in Telemedicine

    Directory of Open Access Journals (Sweden)

    Tian Xia

    2016-01-01

    Background. Software guided optic nerve assessment can assist in process automation and reduce interobserver disagreement. We tested depth analysis software (DAS) in assessing optic nerve cup-to-disc ratio (VCD) from stereoscopic optic nerve images (SONI) of normal eyes. Methods. In a prospective study, simultaneous SONI from normal subjects were collected during telemedicine screenings using a Kowa 3Wx nonmydriatic simultaneous stereoscopic retinal camera (Tokyo, Japan). VCD was determined from SONI pairs and proprietary pixel DAS (Kowa Inc., Tokyo, Japan) after disc and cup contour line placement. A nonstereoscopic VCD was determined using the right channel of a stereo pair. Mean, standard deviation, t-test, and the intraclass correlation coefficient (ICCC) were calculated. Results. 32 patients had a mean age of 40±14 years. Mean VCD on SONI was 0.36±0.09, with DAS 0.38±0.08, and with nonstereoscopic 0.29±0.12. The difference between stereoscopic and DAS-assisted was not significant (p=0.45). ICCC showed agreement between stereoscopic and software VCD assessment. Mean VCD difference was significant between nonstereoscopic and stereoscopic (p<0.05) and nonstereoscopic and DAS (p<0.005) recordings. Conclusions. DAS successfully assessed SONI and showed a high degree of correlation to physician-determined stereoscopic VCD.

  8. GENES - a software package for analysis in experimental statistics and quantitative genetics

    Directory of Open Access Journals (Sweden)

    Cosme Damião Cruz

    2013-06-01

    GENES is a software package used for data analysis and processing with different biometric models and is essential in genetic studies applied to plant and animal breeding. It allows parameter estimation to analyze biological phenomena and is fundamental for the decision-making process and predictions of success and viability of selection strategies. The program can be downloaded from the Internet (http://www.ufv.br/dbg/genes/genes.htm or http://www.ufv.br/dbg/biodata.htm) and is available in Portuguese, English and Spanish. Specific literature (http://www.livraria.ufv.br/) and a set of sample files are also provided, making GENES easy to use. The software is integrated into the programs MS Word, MS Excel and Paint, ensuring simplicity and effectiveness in data import and export of results, figures and data. It is also compatible with the free software R and Matlab, through the supply of useful scripts available for complementary analyses in different areas, including genome-wide selection, prediction of breeding values and use of neural networks in genetic improvement.

  9. Software-Assisted Depth Analysis of Optic Nerve Stereoscopic Images in Telemedicine.

    Science.gov (United States)

    Xia, Tian; Patel, Shriji N; Szirth, Ben C; Kolomeyer, Anton M; Khouri, Albert S

    2016-01-01

    Background. Software guided optic nerve assessment can assist in process automation and reduce interobserver disagreement. We tested depth analysis software (DAS) in assessing optic nerve cup-to-disc ratio (VCD) from stereoscopic optic nerve images (SONI) of normal eyes. Methods. In a prospective study, simultaneous SONI from normal subjects were collected during telemedicine screenings using a Kowa 3Wx nonmydriatic simultaneous stereoscopic retinal camera (Tokyo, Japan). VCD was determined from SONI pairs and proprietary pixel DAS (Kowa Inc., Tokyo, Japan) after disc and cup contour line placement. A nonstereoscopic VCD was determined using the right channel of a stereo pair. Mean, standard deviation, t-test, and the intraclass correlation coefficient (ICCC) were calculated. Results. 32 patients had a mean age of 40 ± 14 years. Mean VCD on SONI was 0.36 ± 0.09, with DAS 0.38 ± 0.08, and with nonstereoscopic 0.29 ± 0.12. The difference between stereoscopic and DAS-assisted was not significant (p = 0.45). ICCC showed agreement between stereoscopic and software VCD assessment. Mean VCD difference was significant between nonstereoscopic and stereoscopic (p < 0.05) and between nonstereoscopic and DAS (p < 0.005) recordings. Conclusions. DAS successfully assessed SONI and showed a high degree of correlation to physician-determined stereoscopic VCD.

  10. Ten years of software sustainability at the Infrared Processing and Analysis Center.

    Science.gov (United States)

    Berriman, G Bruce; Good, John; Deelman, Ewa; Alexov, Anastasia

    2011-08-28

    This paper presents a case study of an approach to sustainable software architecture that has been successfully applied over a period of 10 years to astronomy software services at the NASA Infrared Processing and Analysis Center (IPAC), Caltech (http://www.ipac.caltech.edu). The approach was developed in response to the need to build and maintain the NASA Infrared Science Archive (http://irsa.ipac.caltech.edu), NASA's archive node for infrared astronomy datasets. When the archive opened for business in 1999 serving only two datasets, it was understood that the holdings would grow rapidly in size and diversity, and consequently in the number of queries and volume of data download. It was also understood that platforms and browsers would be modernized, that user interfaces would need to be replaced and that new functionality outside of the scope of the original specifications would be needed. The changes in scientific functionality over time are largely driven by the archive user community, whose interests are represented by a formal user panel. The approach has been extended to support four more major astronomy archives, which today host data from more than 40 missions and projects, to support a complete modernization of a powerful and unique legacy astronomy application for co-adding survey data, and to support deployment of Montage, a powerful image mosaic engine for astronomy. The approach involves using a component-based architecture, designed from the outset to support sustainability, extensibility and portability. Although successful, the approach demands careful assessment of new and emerging technologies before adopting them, and attention to a disciplined approach to software engineering and maintenance. The paper concludes with a list of best practices for software sustainability that are based on 10 years of experience at IPAC.

  11. Introduction of Aesthetic Analyzer Software: Computer-aided Linear and Angular Analysis of Facial Profile Photographs

    Directory of Open Access Journals (Sweden)

    Moshkelgosha V.

    2012-06-01

    Statement of Problem: Evaluation of diagnostic records as a supplement to direct examination has an important role in the treatment planning of orthodontic patients with aesthetic needs. Photogrammetry as a quantitative tool has recently attracted the attention of researchers again. Purpose: The purpose of this study was to design computer software to analyze orthodontic patients' facial profile photographic images and to estimate the reliability and validity of its measurements. Materials and Method: Profile photographic images of 20 volunteer students were taken in the natural head position with a standard technique. Manual linear and angular measurements were used as a gold standard and compared with the results obtained from the Aesthetic Analyzer software (designed for that purpose). Dahlberg's method error and the Intraclass Correlation Coefficient (ICC) were used to estimate validity, reliability and inter-examiner errors. Results: Almost all the measurements showed a high correlation between the manual and computerized methods (ICC>0.75). The maximum method errors computed from Dahlberg's formula were 1.345 mm in linear and 3.294 degrees in angular measurements. At the highest levels, inter-examiner errors were 1.684 mm and 3.741 degrees in linear and angular measurements, respectively. Conclusion: Although a low budget was allocated for the design of the Aesthetic Analyzer software, its features are comparable with commercially available products, and the software's capabilities can be extended. The results of the current study indicated that the software is accurate and repeatable in the photographic analysis of orthodontic patients.
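
    For readers unfamiliar with the statistics used here, the sketch below shows, in plain Python, Dahlberg's method error sqrt(sum(d^2)/2n) for paired measurements and a one-way random-effects intraclass correlation coefficient; this is a generic illustration with invented data, not the study's analysis code.

```python
import numpy as np

def dahlberg(x1, x2):
    """Dahlberg's method error: sqrt(sum(d^2) / 2n) for paired measurements."""
    d = np.asarray(x1, float) - np.asarray(x2, float)
    return np.sqrt(np.sum(d ** 2) / (2 * d.size))

def icc_oneway(ratings):
    """One-way random-effects ICC(1,1) for an (n subjects x k raters) matrix."""
    ratings = np.asarray(ratings, float)
    n, k = ratings.shape
    grand = ratings.mean()
    ms_between = k * np.sum((ratings.mean(axis=1) - grand) ** 2) / (n - 1)
    ms_within = np.sum((ratings - ratings.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

manual = [102.1, 98.4, 110.3, 95.0, 101.7]     # manual linear measurements (mm)
software = [101.8, 99.0, 109.9, 95.4, 101.2]   # software-derived measurements
print(dahlberg(manual, software))
print(icc_oneway(np.column_stack([manual, software])))
```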

  12. Comparison between ASI, CNES and JAXA CCD analysis software for optical space debris monitoring

    Science.gov (United States)

    Paolillo, Fabrizio; Laas-Bourez, Myrtille; Yanagisawa, Toshifumi; Cappelletti, Chantal; Graziani, Filippo; Vidal, Bruno

    Since the nineties the Italian Space Agency (ASI), the Centre National d'Etudes Spatiales (CNES) and the Japan Aerospace Exploration Agency (JAXA) have played an important role in Inter-Agency Space Debris Coordination Committee (IADC) activities. Respectively, the Group of Astrodynamics of University Sapienza of Rome (GAUSS), the TAROT team (Télescope à Action Rapide pour les Objets Transitoires) and the Institute of Aerospace Technology (IAT) participate in optical space debris monitoring activities (WG1 at IADC) with the following facilities: 1. SpaDE observatory of ASI/GAUSS in Collepardo (Fr.), Italy. 2. TAROT observatories of CNES: one in Chile (ESO La Silla) and one in France (Observatoire de la Côte d'Azur, at Calern). 3. Nyukasayama Observatory of IAT/JAXA, Japan. Due to the large amount of data collected during the IADC coordinated observation campaigns and the autonomous campaigns, these research groups developed three different software packages for image processing automation and for the correlation of the detected objects with the catalogue. Using these packages the three different observatories are improving the knowledge of the space debris population, in particular in the so-called geostationary belt (AI23.4 IADC International 2007 optical observation campaigns in higher Earth orbits and AI23.2 Investigation of high A/m ratio debris in higher Earth orbits), but they use different space debris monitoring techniques. With the aim to improve the CCD analysis capabilities of each research group, during the 27th IADC meeting ASI, CNES and JAXA started a cooperation in this field on the comparison between the image processing software. The objectives of this activity are: 1. Test of ASI, CNES and JAXA CCD analysis software on real images taken in the 3 different observation strategies (each observatory uses a particular object-extraction procedure). 2. Results comparison: number of bad detections, number of good detections, processing

  13. PDBStat: a universal restraint converter and restraint analysis software package for protein NMR

    Energy Technology Data Exchange (ETDEWEB)

    Tejero, Roberto [Rutgers, The State University of New Jersey, Center for Advanced Biotechnology and Medicine (United States); Snyder, David [William Paterson University, Department of Chemistry (United States); Mao, Binchen; Aramini, James M.; Montelione, Gaetano T., E-mail: guy@cabm.rutgers.edu [Rutgers, The State University of New Jersey, Center for Advanced Biotechnology and Medicine (United States)

    2013-08-15

    The heterogeneous array of software tools used in the process of protein NMR structure determination presents organizational challenges in the structure determination and validation processes, and creates a learning curve that limits the broader use of protein NMR in biology. These challenges, including accurate use of data in different data formats required by software carrying out similar tasks, continue to confound the efforts of novices and experts alike. These important issues need to be addressed robustly in order to standardize protein NMR structure determination and validation. PDBStat is a C/C++ computer program originally developed as a universal coordinate and protein NMR restraint converter. Its primary function is to provide a user-friendly tool for interconverting between protein coordinate and protein NMR restraint data formats. It also provides an integrated set of computational methods for protein NMR restraint analysis and structure quality assessment, relabeling of prochiral atoms with correct IUPAC names, as well as multiple methods for analysis of the consistency of atomic positions indicated by their convergence across a protein NMR ensemble. In this paper we provide a detailed description of the PDBStat software, and highlight some of its valuable computational capabilities. As an example, we demonstrate the use of the PDBStat restraint converter for restrained CS-Rosetta structure generation calculations, and compare the resulting protein NMR structure models with those generated from the same NMR restraint data using more traditional structure determination methods. These results demonstrate the value of a universal restraint converter in allowing the use of multiple structure generation methods with the same restraint data for consensus analysis of protein NMR structures and the underlying restraint data.

  14. Utilizing a Photo-Analysis Software for Content Identifying Method (CIM

    Directory of Open Access Journals (Sweden)

    Nejad Nasim Sahraei

    2015-01-01

    Content Identifying Methodology (CIM) was developed to measure public preferences in order to reveal the common characteristics of landscapes and aspects of underlying perceptions, including the individual's reactions to content and spatial configuration; it can therefore assist with the identification of factors that influence preference. Regarding the analysis of landscape photographs through CIM, there are several studies utilizing image analysis software, such as Adobe Photoshop, in order to identify the physical contents in the scenes. This study evaluates public preferences for the aesthetic qualities of pedestrian bridges in urban areas through a photo-questionnaire survey, in which respondents evaluated images of pedestrian bridges in urban areas. Two groups of images were evaluated, the most and least preferred scenes, corresponding to the highest and lowest mean scores respectively. These two groups were analyzed by CIM and also evaluated based on the respondents' descriptions of each group, to reveal the pattern of preferences and the factors that may affect them. Digimizer software was employed to triangulate the two approaches and to determine the role of these factors in people's preferences. This study introduces useful software for image analysis that can measure the physical contents and also their spatial organization in the scenes. According to the findings, Digimizer could be a useful tool in CIM approaches to preference studies that utilize photographs in place of the actual landscape, in order to determine the most important factors in public preferences for pedestrian bridges in urban areas.

  15. Statistical analysis of probabilistic models of software product lines with quantitative constraints

    DEFF Research Database (Denmark)

    Beek, M.H. ter; Legay, A.; Lluch Lafuente, Alberto

    2015-01-01

    We investigate the suitability of statistical model checking for the analysis of probabilistic models of software product lines with complex quantitative constraints and advanced feature installation options. Such models are specified in the feature-oriented language QFLan, a rich process algebra … of certain behaviour to the expected average cost of products. This is supported by a Maude implementation of QFLan, integrated with the SMT solver Z3 and the distributed statistical model checker MultiVeStA. Our approach is illustrated with a bikes product line case study.

  16. PREDICTIVE ANALYSIS SOFTWARE FOR MODELING THE ALTMAN Z-SCORE FINANCIAL DISTRESS STATUS OF COMPANIES

    Directory of Open Access Journals (Sweden)

    ILIE RĂSCOLEAN

    2012-10-01

    The literature describes several bankruptcy methods for determining the financial distress status of companies; from these we chose Altman's statistical model because it has been used extensively in the past and has thus become a benchmark for other methods. Based on this financial analysis flowchart, software was developed that allows the calculation and determination of the bankruptcy probability for a certain Z-score failure rate, corresponding to a given interval, equal to the ratio of the number of bankrupt companies to the total number of companies (bankrupt and healthy) in that interval.
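
    Altman's original (1968) Z-score for publicly traded manufacturing firms is a fixed linear combination of five financial ratios, with the conventional cut-offs at 1.81 and 2.99 separating the distress, grey and safe zones. The Python sketch below illustrates the computation such software automates; the input figures are invented.

```python
def altman_z(working_capital, retained_earnings, ebit,
             market_value_equity, sales, total_assets, total_liabilities):
    """Altman's original (1968) Z-score for public manufacturing firms."""
    x1 = working_capital / total_assets
    x2 = retained_earnings / total_assets
    x3 = ebit / total_assets
    x4 = market_value_equity / total_liabilities
    x5 = sales / total_assets
    return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 1.0 * x5

def zone(z):
    """Classify a Z-score into Altman's distress/grey/safe zones."""
    if z < 1.81:
        return "distress"
    return "grey" if z < 2.99 else "safe"

# Invented balance-sheet figures for one company
z = altman_z(1.2e6, 0.8e6, 0.9e6, 3.0e6, 7.5e6, 6.0e6, 2.5e6)
print(f"Z = {z:.2f} -> {zone(z)} zone")
```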

  17. rosettR: protocol and software for seedling area and growth analysis

    DEFF Research Database (Denmark)

    Tomé, Filipa; Jansseune, Karel; Saey, Bernadette

    2017-01-01

    Growth is an important parameter to consider when studying the impact of treatments or mutations on plant physiology. Leaf area and growth rates can be estimated efficiently from images of plants, but the experiment setup, image analysis, and statistical evaluation can be laborious, often requiring … differences among different genotypes and in response to light regimes and osmotic stress. rosettR is implemented as a package for the statistical computing software R and provides easy to use functions to design an experiment, analyze the images, and generate reports on quality control as well as a final …

  18. Software Tool for Automated Failure Modes and Effects Analysis (FMEA) of Hydraulic Systems

    DEFF Research Database (Denmark)

    Stecki, J. S.; Conrad, Finn; Oh, B.

    2002-01-01

    Offshore, marine, aircraft and other complex engineering systems operate in harsh environmental and operational conditions and must meet stringent requirements of reliability, safety and maintainability. To reduce the high costs of development of new systems in these fields, improved design … management techniques and a vast array of computer-aided techniques are applied during the design and testing stages. The paper presents and discusses the research and development of a software tool for automated failure mode and effects analysis (FMEA) of hydraulic systems. The paper explains the underlying …

  19. An Interactive Software for Conceptual Wing Flutter Analysis and Parametric Study

    Science.gov (United States)

    Mukhopadhyay, Vivek

    1996-01-01

    An interactive computer program was developed for wing flutter analysis in the conceptual design stage. The objective was to estimate the flutter instability boundary of a flexible cantilever wing, when well-defined structural and aerodynamic data are not available, and then study the effect of change in Mach number, dynamic pressure, torsional frequency, sweep, mass ratio, aspect ratio, taper ratio, center of gravity, and pitch inertia, to guide the development of the concept. The software was developed for Macintosh or IBM compatible personal computers, on MathCad application software with integrated documentation, graphics, data base and symbolic mathematics. The analysis method was based on non-dimensional parametric plots of two primary flutter parameters, namely Regier number and Flutter number, with normalization factors based on torsional stiffness, sweep, mass ratio, taper ratio, aspect ratio, center of gravity location and pitch inertia radius of gyration. The parametric plots were compiled in a Vought Corporation report from a vast data base of past experiments and wind-tunnel tests. The computer program was utilized for flutter analysis of the outer wing of a Blended-Wing-Body concept, proposed by McDonnell Douglas Corp. Using a set of assumed data, preliminary flutter boundary and flutter dynamic pressure variation with altitude, Mach number and torsional stiffness were determined.

  20. Strainer: software for analysis of population variation in community genomic datasets

    Directory of Open Access Journals (Sweden)

    Tyson Gene W

    2007-10-01

    Background: Metagenomic analyses of microbial communities that are comprehensive enough to provide multiple samples of most loci in the genomes of the dominant organism types will also reveal patterns of genetic variation within natural populations. New bioinformatic tools will enable visualization and comprehensive analysis of this sequence variation and inference of recent evolutionary and ecological processes. Results: We have developed a software package for analysis and visualization of genetic variation in populations and reconstruction of strain variants from otherwise co-assembled sequences. Sequencing reads can be clustered by matching patterns of single nucleotide polymorphisms to generate predicted gene and protein variant sequences, identify conserved intergenic regulatory sequences, and determine the quantity and distribution of recombination events. Conclusion: The Strainer software, a first generation metagenomic bioinformatics tool, facilitates comprehension and analysis of heterogeneity intrinsic in natural communities. The program reveals the degree of clustering among closely related sequence variants and provides a rapid means to generate gene and protein sequences for functional, ecological, and evolutionary analyses.
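
    Strainer's actual algorithms are not detailed in this abstract; as a purely illustrative sketch of the underlying idea of clustering reads by matching SNP patterns, the following naive Python fragment greedily groups reads that agree at every SNP position they share. All names are hypothetical.

```python
def compatible(read_a, read_b):
    """Two reads are strain-compatible if they agree at every SNP position
    they both cover (each read maps SNP index -> observed base)."""
    shared = read_a.keys() & read_b.keys()
    return bool(shared) and all(read_a[p] == read_b[p] for p in shared)

def cluster_reads(reads):
    """Greedy single-linkage grouping of reads into strain variants."""
    clusters = []
    for read in reads:
        for cluster in clusters:
            if any(compatible(read, member) for member in cluster):
                cluster.append(read)
                break
        else:
            clusters.append([read])
    return clusters

# Each read reports its base at the SNP columns it overlaps
reads = [{1: "A", 2: "G"}, {2: "G", 3: "T"}, {1: "C", 2: "A"}, {3: "T", 4: "C"}]
print(cluster_reads(reads))  # two strain groups emerge
```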

  1. A Quantitative Analysis of Open Source Software's Acceptability as Production-Quality Code

    Science.gov (United States)

    Fischer, Michael

    2011-01-01

    The difficulty in writing defect-free software has long been acknowledged both by academia and industry. A constant battle occurs as developers seek to craft software that works within aggressive business schedules and deadlines. Many tools and techniques are used in an attempt to manage these software projects. Software metrics are a tool that has…

  2. Structured Integration Test Suite Generation Process for Multi-Agent System

    Directory of Open Access Journals (Sweden)

    Z. Houhamdi

    2011-01-01

    Problem statement: In recent years, Agent-Oriented Software Engineering (AOSE) methodologies have been proposed to develop complex distributed systems based upon the agent paradigm. The implementation of such systems usually takes the form of Multi-Agent Systems (MAS). Testing of MAS is a challenging task because these systems are often programmed to be autonomous and deliberative, and they operate in an open world, which requires context awareness. Approach: We introduce a novel approach for goal-oriented software integration testing. It specifies an integration testing process that complements the goal-oriented methodology Tropos and strengthens the mutual relationship between goal analysis and testing. Results: The test suites derived from the system goals can be used to observe emergent properties resulting from agent interactions and to make sure that a group of agents and contextual resources work correctly together. Conclusion: This approach defines a structured and comprehensive integration test suite derivation process for engineering software agents by providing a systematic way of deriving test cases from goal analysis.

  3. Software Practicalization for Analysis of Wind-Induced Vibrations of Large Span Roof Structures

    Institute of Scientific and Technical Information of China (English)

    ZHANG Enuo; YANG Weiguo; ZHEN Wei; NA Xiangqian

    2005-01-01

    Wind loads are key considerations in the structural design of large-span structures, since wind loads can be more important than earthquake loads, especially for large flexible structures. The analysis of wind loads on large span roof structures (LSRS) requires large amounts of calculation. Due to the combined effects of horizontal and vertical winds, the wind-induced vibrations of LSRS are analyzed in this paper with the frequency-domain method, as a first application of this method to the wind-response analysis of LSRS. A program is developed to analyze the wind-induced vibrations due to a combination of wind vibration modes. The program, which predicts the wind vibration coefficient and the wind pressure acting on the LSRS, interfaces with other finite element software to facilitate the analysis of wind loads in the design of LSRS. The effectiveness and accuracy of the frequency-domain method have been verified by numerical analyses of practical projects.
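
    The essence of a frequency-domain wind-response calculation is that the response power spectral density of each mode is |H(f)|^2 times the modal force PSD, and the RMS response follows by integrating over frequency. The sketch below illustrates this for a single mode with invented numbers and an invented force spectrum; it is not the paper's program.

```python
import numpy as np

# Minimal frequency-domain gust response of one structural mode.
m, c, k = 2.0e5, 4.0e4, 8.0e6          # modal mass, damping, stiffness (SI)
f = np.linspace(0.01, 5.0, 2000)       # frequency axis [Hz]
w = 2 * np.pi * f

H = 1.0 / (k - m * w ** 2 + 1j * c * w)         # receptance of the mode
S_force = 1.0e9 / (1.0 + (f / 0.1) ** (5 / 3))  # illustrative decaying force PSD

S_resp = np.abs(H) ** 2 * S_force      # response displacement PSD
sigma = np.sqrt(np.trapz(S_resp, f))   # RMS modal displacement
print(f"RMS response: {sigma * 1e3:.2f} mm")
```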

  4. NEuronMOrphological analysis tool: open-source software for quantitative morphometrics

    Directory of Open Access Journals (Sweden)

    Lucia eBilleci

    2013-02-01

    Morphometric analysis of neurons and brain tissue is relevant to the study of neuron circuitry development during the first phases of brain growth or for probing the link between microstructural morphology and degenerative diseases. As neural imaging techniques become ever more sophisticated, so does the amount and complexity of data generated. The NEuronMOrphological analysis tool (NEMO) was purposely developed to handle and process large numbers of optical microscopy image files of neurons in culture or slices in order to automatically run batch routines, store data and apply multivariate classification and feature extraction using 3-way principal component analysis. Here we describe the software's main features, underlining the differences between NEMO and other commercial and non-commercial image processing tools, and show an example of how NEMO can be used to classify neurons from wild-type mice and from animal models of autism.
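
    Full 3-way PCA (Tucker-style) is richer than ordinary PCA, but the flavor of multiway analysis can be suggested by unfolding a 3-way data array along one mode and taking truncated SVD scores, as in this illustrative NumPy sketch; the dimensions and data are invented and this is not NEMO's implementation.

```python
import numpy as np

# Minimal multiway sketch: unfold a (cells x features x conditions) array
# along the cell mode and use truncated SVD scores as classification inputs.
rng = np.random.default_rng(1)
X = rng.normal(size=(30, 12, 4))   # 30 neurons, 12 morphometric features, 4 conditions

X1 = X.reshape(X.shape[0], -1)     # mode-1 unfolding: cells x (features*conditions)
X1 = X1 - X1.mean(axis=0)          # center each unfolded variable
U, s, Vt = np.linalg.svd(X1, full_matrices=False)
scores = U[:, :2] * s[:2]          # first two component scores per neuron
print(scores.shape)                # (30, 2) -> ready for a classifier
```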

  5. Advanced functionality for radio analysis in the Offline software framework of the Pierre Auger Observatory

    CERN Document Server

    Abreu, P; Ahn, E J; Albuquerque, I F M; Allard, D; Allekotte, I; Allen, J; Allison, P; Castillo, J Alvarez; Alvarez-Muñiz, J; Ambrosio, M; Aminaei, A; Anchordoqui, L; Andringa, S; Antičić, T; Aramo, C; Arganda, E; Arqueros, F; Asorey, H; Assis, P; Aublin, J; Ave, M; Avenier, M; Avila, G; Bäcker, T; Balzer, M; Barber, K B; Barbosa, A F; Bardenet, R; Barroso, S L C; Baughman, B; Beatty, J J; Becker, B R; Becker, K H; Bellido, J A; BenZvi, S; Berat, C; Bertou, X; Biermann, P L; Billoir, P; Blanco, F; Blanco, M; Bleve, C; Blümer, H; Boháčová, M; Boncioli, D; Bonifazi, C; Bonino, R; Borodai, N; Brack, J; Brogueira, P; Brown, W C; Bruijn, R; Buchholz, P; Bueno, A; Burton, R E; Caballero-Mora, K S; Caramete, L; Caruso, R; Castellina, A; Cataldi, G; Cazon, L; Cester, R; Chauvin, J; Chiavassa, A; Chinellato, J A; Chou, A; Chudoba, J; Clay, R W; Coluccia, M R; Conceição, R; Contreras, F; Cook, H; Cooper, M J; Coppens, J; Cordier, A; Cotti, U; Coutu, S; Covault, C E; Creusot, A; Criss, A; Cronin, J; Curutiu, A; Dagoret-Campagne, S; Dallier, R; Dasso, S; Daumiller, K; Dawson, B R; de Almeida, R M; De Domenico, M; De Donato, C; de Jong, S J; De La Vega, G; Junior, W J M de Mello; Neto, J R T de Mello; De Mitri, I; de Souza, V; de Vries, K D; Decerprit, G; del Peral, L; Deligny, O; Dembinski, H; Denkiewicz, A; Di Giulio, C; Diaz, J C; Castro, M L Díaz; Diep, P N; Dobrigkeit, C; D'Olivo, J C; Dong, P N; Dorofeev, A; Anjos, J C dos; Dova, M T; D'Urso, D; Dutan, I; Ebr, J; Engel, R; Erdmann, M; Escobar, C O; Etchegoyen, A; Luis, P Facal San; Falcke, H; Farrar, G; Fauth, A C; Fazzini, N; Ferguson, A P; Ferrero, A; Fick, B; Filevich, A; Filipčič, A; Fliescher, S; Fracchiolla, C E; Fraenkel, E D; Fröhlich, U; Fuchs, B; Gamarra, R F; Gambetta, S; García, B; Gámez, D García; Garcia-Pinto, D; Gascon, A; Gemmeke, H; Gesterling, K; Ghia, P L; Giaccari, U; Giller, M; Glass, H; Gold, M S; Golup, G; Albarracin, F Gomez; Berisso, M Gómez; Gonçalves, P; Gonzalez, D; Gonzalez, J G; Gookin, B; Góra, D; Gorgi, A; Gouffon, P; Gozzini, S R; Grashorn, E; Grebe, S; Griffith, N; Grigat, M; Grillo, A F; Guardincerri, Y; Guarino, F; Guedes, G P; Hague, J D; Hansen, P; Harari, D; Harmsma, S; Harton, J L; Haungs, A; Hebbeker, T; Heck, D; Herve, A E; Hojvat, C; Holmes, V C; Homola, P; Hörandel, J R; Horneffer, A; Hrabovský, M; Huege, T; Insolia, A; Ionita, F; Italiano, A; Jiraskova, S; Kadija, K; Kampert, K H; Karhan, P; Karova, T; Kasper, P; Kégl, B; Keilhauer, B; Keivani, A; Kelley, J L; Kemp, E; Kieckhafer, R M; Klages, H O; Kleifges, M; Kleinfeller, J; Knapp, J; Koang, D -H; Kotera, K; Krohm, N; Krömer, O; Kruppke-Hansen, D; Kuehn, F; Kuempel, D; Kulbartz, J K; Kunka, N; La Rosa, G; Lachaud, C; Lautridou, P; Leão, M S A B; Lebrun, D; Lebrun, P; de Oliveira, M A Leigui; Lemiere, A; Letessier-Selvon, A; Lhenry-Yvon, I; Link, K; López, R; Agüera, A Lopez; Louedec, K; Bahilo, J Lozano; Lucero, A; Ludwig, M; Lyberis, H; Macolino, C; Maldera, S; Mandat, D; Mantsch, P; Mariazzi, A G; Marin, V; Maris, I C; Falcon, H R Marquez; Marsella, G; Martello, D; Martin, L; Bravo, O Martínez; Mathes, H J; Matthews, J; Matthews, J A J; Matthiae, G; Maurizio, D; Mazur, P O; Medina-Tanco, G; Melissas, M; Melo, D; Menichetti, E; Menshikov, A; Mertsch, P; Meurer, C; Mičanović, S; Micheletti, M I; Miller, W; Miramonti, L; Mollerach, S; Monasor, M; Ragaigne, D Monnier; Montanet, F; Morales, B; Morello, C; Moreno, E; Moreno, J C; Morris, C; Mostafá, M; Moura., C A; Mueller, S; Muller, M A; Müller, G; Münchmeyer, M; Mussa, R; Navarra, G; Navarro, J L; 
Navas, S; Necesal, P; Nellen, L; Nelles, A; Nhung, P T; Nierstenhoefer, N; Nitz, D; Nosek, D; Nožka, L; Nyklicek, M; Oehlschläger, J; Olinto, A; Oliva, P; Olmos-Gilbaja, V M; Ortiz, M; Pacheco, N; Selmi-Dei, D Pakk; Palatka, M; Pallotta, J; Palmieri, N; Parente, G; Parizot, E; Parra, A; Parrisius, J; Parsons, R D; Pastor, S; Paul, T; Pech, M; Pȩkala, J; Pelayo, R; Pepe, I M; Perrone, L; Pesce, R; Petermann, E; Petrera, S; Petrinca, P; Petrolini, A; Petrov, Y; Petrovic, J; Pfendner, C; Phan, N; Piegaia, R; Pierog, T; Pieroni, P; Pimenta, M; Pirronello, V; Platino, M; Ponce, V H; Pontz, M; Privitera, P; Prouza, M; Quel, E J; Rautenberg, J; Ravel, O; Ravignani, D; Revenu, B; Ridky, J; Risse, M; Ristori, P; Rivera, H; Rivière, C; Rizi, V; Robledo, C; de Carvalho, W Rodrigues; Rodriguez, G; Martino, J Rodriguez; Rojo, J Rodriguez; Rodriguez-Cabo, I; Rodríguez-Frías, M D; Ros, G; Rosado, J; Rossler, T; Roth, M; Rouillé-d'Orfeuil, B; Roulet, E; Rovero, A C; Rühle, C; Salamida, F; Salazar, H; Salina, G; Sánchez, F; Santander, M; Santo, C E; Santos, E; Santos, E M; Sarazin, F; Sarkar, S; Sato, R; Scharf, N; Scherini, V; Schieler, H; Schiffer, P; Schmidt, A; Schmidt, F; Schmidt, T; Scholten, O; Schoorlemmer, H; Schovancova, J; Schovánek, P; Schroeder, F; Schulte, S; Schuster, D; Sciutto, S J; Scuderi, M; Segreto, A; Semikoz, D; Settimo, M; Shadkam, A; Shellard, R C; Sidelnik, I; Sigl, G; Śmiałkowski, A; Šmída, R; Snow, G R; Sommers, P; Sorokin, J; Spinka, H; Squartini, R; Stapleton, J; Stasielak, J; Stephan, M; Stutz, A; Suarez, F; Suomijärvi, T; Supanitsky, A D; Šuša, T; Sutherland, M S; Swain, J; Szadkowski, Z; Szuba, M; Tamashiro, A; Tapia, A; Taşcău, O; Tcaciuc, R; Tegolo, D; Thao, N T; Thomas, D; Tiffenberg, J; Timmermans, C; Tiwari, D K; Tkaczyk, W; Peixoto, C J Todero; Tomé, B; Tonachini, A; Travnicek, P; Tridapalli, D B; Tristram, G; Trovato, E; Tueros, M; Ulrich, R; Unger, M; Urban, M; Galicia, J F Valdés; Valiño, I; Valore, L; Berg, A M van den; Cárdenas, B Vargas; Vázquez, J R; Vázquez, R A; Veberič, D; Verzi, V; Videla, M; Villaseñor, L; Wahlberg, H; Wahrlich, P; Wainberg, O; Warner, D; Watson, A A; Weber, M; Weidenhaupt, K; Weindl, A; Westerhoff, S; Whelan, B J; Wieczorek, G; Wiencke, L; Wilczyńska, B; Wilczyński, H; Will, M; Williams, C; Winchen, T; Winders, L; Winnick, M G; Wommer, M; Wundheiler, B; Yamamoto, T; Younk, P; Yuan, G; Zamorano, B; Zas, E; Zavrtanik, D; Zavrtanik, M; Zaw, I; Zepeda, A; Ziolkowski, M; 10.1016/j.nima.2011.01.049

    2011-01-01

    The advent of the Auger Engineering Radio Array (AERA) necessitates the development of a powerful framework for the analysis of radio measurements of cosmic ray air showers. As AERA performs "radio-hybrid" measurements of air shower radio emission in coincidence with the surface particle detectors and fluorescence telescopes of the Pierre Auger Observatory, the radio analysis functionality had to be incorporated in the existing hybrid analysis solutions for fluorescence and surface detector data. This goal has been achieved in a natural way by extending the existing Auger Offline software framework with radio functionality. In this article, we lay out the design, highlights and features of the radio extension implemented in the Auger Offline framework. Its functionality has achieved a high degree of sophistication and offers advanced features such as vectorial reconstruction of the electric field, advanced signal processing algorithms, a transparent and efficient handling of FFTs, a very detailed simulation of...

  6. Development of a gamma ray spectrometry software for neutron activation analysis using the open source concept; Desenvolvimento de um software de espectrometria gama para analise por ativacao com neutrons utilizando o conceito de codigo livre

    Energy Technology Data Exchange (ETDEWEB)

    Lucia, Silvio Rogerio de

    2008-07-01

    This study developed a specific software package for gamma-ray spectrum analysis for researchers of the Neutron Activation Laboratory (LAN), named SAANI (Instrumental Neutron Activation Analysis Software). The LAN laboratory of the Institute for Research and Nuclear Energy (IPEN-CNEN/SP) uses a multielemental analytical technique based on irradiation of a sample by a flux of neutrons from a nuclear reactor, which induces radioactivity. The sample is then placed in a gamma-ray spectrometer to obtain the spectrum. With the free software philosophy in mind, this software will replace the existing software VISPECT/VERSION 2. The new software's main features are: a friendlier interface; an easier standardization procedure carried out by LAN staff and researchers; use of plug-in technology; multi-platform operation; and free code. The software was developed using the Python programming language, the Trolltech Qt graphics library and some of their scientific extensions. Preliminary results using the SAANI software were compared to those obtained with the existing software and were considered good. Some accuracy errors were encountered during the implementation of the software. The SAANI software has been installed in selected computers to be used for routine analysis in order to verify its strength, accuracy and usability. (author)

  7. Extravehicular activity space suit interoperability.

    Science.gov (United States)

    Skoog, A I; McBarron JW 2nd; Severin, G I

    1995-10-01

    The European Space Agency (ESA) and the Russian Space Agency (RKA) are jointly developing a new space suit system for improved extravehicular activity (EVA) capabilities in support of the MIR Space Station Programme, the EVA Suit 2000. Recent national policy agreements between the U.S. and Russia on planned cooperation in manned space flight also include joint extravehicular activity (EVA). With an increased number of space suit systems and a higher operational frequency towards the end of this century, an improved interoperability for both routine and emergency operations is of eminent importance. It is thus timely to report the current status of ongoing work on international EVA interoperability being conducted by the Committee on EVA Protocols and Operations of the International Academy of Astronautics, initiated in 1991. This paper summarises the current EVA interoperability issues to be harmonised and presents quantified vehicle interface requirements for the current U.S. Shuttle EMU, the Russian MIR Orlan DMA, and the new European/Russian EVA Suit 2000 extravehicular systems. Major critical/incompatible interfaces for suit/mother-craft combinations are discussed, and recommendations for standardisation given.

  8. Extravehicular activity space suit interoperability

    Science.gov (United States)

    Skoog, A. Ingemar; McBarron, James W.; Severin, Guy I.

    1995-10-01

    The European Space Agency (ESA) and the Russian Space Agency (RKA) are jointly developing a new space suit system for improved extravehicular activity (EVA) capabilities in support of the MIR Space Station Programme, the EVA Suit 2000. Recent national policy agreements between the U.S. and Russia on planned cooperation in manned space flight also include joint extravehicular activity (EVA). With an increased number of space suit systems and a higher operational frequency towards the end of this century, an improved interoperability for both routine and emergency operations is of eminent importance. It is thus timely to report the current status of ongoing work on international EVA interoperability being conducted by the Committee on EVA Protocols and Operations of the International Academy of Astronautics, initiated in 1991. This paper summarises the current EVA interoperability issues to be harmonised and presents quantified vehicle interface requirements for the current U.S. Shuttle EMU, the Russian MIR Orlan DMA, and the new European/Russian EVA Suit 2000 extravehicular systems. Major critical/incompatible interfaces for suit/mother-craft combinations are discussed, and recommendations for standardisation given.

  9. BuddySuite: Command-line toolkits for manipulating sequences, alignments, and phylogenetic trees.

    Science.gov (United States)

    Bond, Stephen R; Keat, Karl E; Barreira, Sofia N; Baxevanis, Andreas D

    2017-02-25

    The ability to manipulate sequence, alignment, and phylogenetic tree files has become an increasingly important skill in the life sciences, whether to generate summary information or to prepare data for further downstream analysis. The command line can be an extremely powerful environment for interacting with these resources, but only if the user has the appropriate general-purpose tools on hand. BuddySuite is a collection of four independent yet interrelated command-line toolkits that facilitate each step in the workflow of sequence discovery, curation, alignment, and phylogenetic reconstruction. Most common sequence, alignment, and tree file formats are automatically detected and parsed, and over 100 tools have been implemented for manipulating these data. The project has been engineered to easily accommodate the addition of new tools; it is written in the popular programming language Python and is hosted on the Python Package Index and GitHub to maximize accessibility. Documentation for each BuddySuite tool, including usage examples, is available at the BuddySuite wiki (http://tiny.cc/buddysuite). All software is open source and freely available through http://research.nhgri.nih.gov/software/BuddySuite.

  10. CAVASS: a computer-assisted visualization and analysis software system - image processing aspects

    Science.gov (United States)

    Udupa, Jayaram K.; Grevera, George J.; Odhner, Dewey; Zhuge, Ying; Souza, Andre; Mishra, Shipra; Iwanaga, Tad

    2007-03-01

    The development of the concepts within 3DVIEWNIX, and of the software system 3DVIEWNIX itself, dates back to the 1970s. Since then, a series of software packages for Computer Assisted Visualization and Analysis (CAVA) of images has come out of our group, 3DVIEWNIX, released in 1993, being the most recent; all were distributed with source code. CAVASS, an open source system, is the latest in this series and represents the next major incarnation of 3DVIEWNIX. It incorporates four groups of operations: IMAGE PROCESSING (including ROI, interpolation, filtering, segmentation, registration, morphological, and algebraic operations), VISUALIZATION (including slice display, reslicing, MIP, surface rendering, and volume rendering), MANIPULATION (for modifying structures and surgery simulation), and ANALYSIS (various ways of extracting quantitative information). CAVASS is designed to work on all platforms. Its key features are: (1) most major CAVA operations incorporated; (2) very efficient algorithms and their highly efficient implementations; (3) parallelized algorithms for computationally intensive operations; (4) parallel implementation via distributed computing on a cluster of PCs; (5) interfaces to other systems such as CAD/CAM software, ITK, and statistical packages; (6) an easy to use GUI. In this paper, we focus on the image processing operations and compare the performance of CAVASS with that of ITK. Our conclusions, based on assessing performance using a regular (6 MB), a large (241 MB), and a super (873 MB) 3D image data set, are as follows: CAVASS is considerably more efficient than ITK, especially in those operations which are computationally intensive. It can handle considerably larger data sets than ITK. It is easy and ready to use in applications since it provides an easy to use GUI. Users can easily build a cluster from ordinary inexpensive PCs and reap the full power of CAVASS inexpensively, compared to expensive multiprocessing systems which are less

  11. SOFTWARE PACKAGE FOR SOLVING THE PROBLEMS OF ANALYSIS AND SYNTHESIS OF NETWORKED CONTROL SYSTEMS

    Directory of Open Access Journals (Sweden)

    A. E. Emelyanov

    2015-01-01

    Summary. Modern control systems exchange data packets through network channels; such systems are called networked control systems. One of the promising directions of development of networked control systems is the use of common computer networks in the control loop for the exchange of information between elements of the system. Such a construction of control systems leads to new problems, so their design and study need to combine methods from different scientific fields, above all control theory and communication theory. However, not every developer has equally deep knowledge of both areas. To solve the engineering problems and ensure the required quality of operation, methods were developed for the analysis and synthesis of networked control systems with data transmission over a channel with competing access methods. These techniques allow the calculation of the probability-time characteristics of the stochastic data-transmission process in a channel with competing access methods, the construction of transients of the considered control systems and the calculation of their qualitative characteristics, the determination of stability conditions of networked control systems, and the tuning of digital controller parameters for optimization against the respective criterion. These techniques are the basis for the developed software. The proposed software system allows the analysis and synthesis of the network through which the information data exchange takes place, as well as the study of the networked system under a variety of control laws. The package is structured according to the principles of modularity, hierarchy and nesting of modules. An easy-to-use interface allows the software to be used without special training.

  12. Analysis of mice tumor models using dynamic MRI data and a dedicated software platform

    Energy Technology Data Exchange (ETDEWEB)

    Alfke, H.; Maurer, E.; Klose, K.J. [Philipps Univ. Marburg (Germany). Dept. of Radiology; Kohle, S.; Rascher-Friesenhausen, R.; Behrens, S.; Peitgen, H.O. [MeVis - Center for Medical Diagnostic Systems and Visualization, Bremen (Germany); Celik, I. [Philipps Univ. Marburg (Germany). Inst. for Theoretical Surgery; Heverhagen, J.T. [Philipps Univ. Marburg (Germany). Dept. of Radiology; Ohio State Univ., Columbus (United States). Dept. of Radiology

    2004-09-01

    Purpose: To implement a software platform (DynaVision) dedicated to analyzing data from functional imaging of tumors with different mathematical approaches, and to test the software platform on pancreatic carcinoma xenografts in mice with severe combined immunodeficiency disease (SCID). Materials and Methods: A software program was developed for the extraction and visualization of tissue perfusion parameters from dynamic contrast-enhanced images. This includes regional parameter calculation from enhancement curves, parametric images (e.g., blood flow), animation, 3D visualization, two-compartment modeling, a mode for comparing different datasets (e.g., therapy monitoring), and motion correction. We analyzed xenograft tumors from two pancreatic carcinoma cell lines (BxPC3 and ASPC1) implanted in 14 SCID mice after injection of Gd-DTPA into the tail vein. These data were correlated with histopathological findings. Results: Image analysis was completed in approximately 15 minutes per data set. The possibility of drawing and editing ROIs within the whole data set makes it easy to obtain quantitative data from the intensity-time curves. In one animal, motion artifacts markedly reduced the image quality, but data analysis was still possible after motion correction. Dynamic MRI of mouse tumor models revealed a highly heterogeneous distribution of the contrast-enhancement curves and derived parameters, which correlated with differences in histopathology. ASPC1 tumors showed a more hypervascular type of curve, with a faster and higher signal enhancement rate (wash-in) and a faster signal decrease (wash-out). BxPC3 tumors showed a more hypovascular type with slower wash-in and wash-out. This correlated with the biological properties of the tumors. (orig.)
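
    Semi-quantitative analysis of enhancement curves of the kind described here typically reduces each ROI's intensity-time curve to features such as peak relative enhancement and wash-in/wash-out slopes. The following Python sketch (not the DynaVision code; all names and data are illustrative) computes such features for a synthetic curve.

```python
import numpy as np

def enhancement_features(signal, t):
    """Simple semi-quantitative DCE-MRI curve features for one ROI:
    peak relative enhancement, wash-in slope (baseline to peak) and
    wash-out slope (peak to end of acquisition)."""
    signal = np.asarray(signal, float)
    base = signal[:3].mean()              # pre-contrast baseline
    rel = (signal - base) / base          # relative enhancement
    p = int(np.argmax(rel))               # index of peak enhancement
    wash_in = rel[p] / (t[p] - t[0]) if p > 0 else 0.0
    wash_out = (rel[-1] - rel[p]) / (t[-1] - t[p]) if p < len(t) - 1 else 0.0
    return rel[p], wash_in, wash_out

t = np.arange(0, 300, 15)  # acquisition times [s]
# Synthetic ROI curve: uptake after arrival at t = 45 s, slow wash-out
dt = np.maximum(t - 45, 0)
roi = 100 + 80 * (1 - np.exp(-dt / 40)) * np.exp(-dt / 400)
print(enhancement_features(roi, t))
```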

  13. Quantifying Astronaut Tasks: Robotic Technology and Future Space Suit Design

    Science.gov (United States)

    Newman, Dava

    2003-01-01

    The primary aim of this research effort was to advance the current understanding of astronauts' capabilities and limitations in space-suited EVA by developing models of the constitutive and compatibility relations of a space suit, based on experimental data gained from human test subjects as well as a 12 degree-of-freedom human-sized robot, and utilizing these fundamental relations to estimate a human factors performance metric for space suited EVA work. The three specific objectives are to: 1) Compile a detailed database of torques required to bend the joints of a space suit, using realistic, multi- joint human motions. 2) Develop a mathematical model of the constitutive relations between space suit joint torques and joint angular positions, based on experimental data and compare other investigators' physics-based models to experimental data. 3) Estimate the work envelope of a space suited astronaut, using the constitutive and compatibility relations of the space suit. The body of work that makes up this report includes experimentation, empirical and physics-based modeling, and model applications. A detailed space suit joint torque-angle database was compiled with a novel experimental approach that used space-suited human test subjects to generate realistic, multi-joint motions and an instrumented robot to measure the torques required to accomplish these motions in a space suit. Based on the experimental data, a mathematical model is developed to predict joint torque from the joint angle history. Two physics-based models of pressurized fabric cylinder bending are compared to experimental data, yielding design insights. The mathematical model is applied to EVA operations in an inverse kinematic analysis coupled to the space suit model to calculate the volume in which space-suited astronauts can work with their hands, demonstrating that operational human factors metrics can be predicted from fundamental space suit information.
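
    As a toy version of the constitutive-relation idea (deliberately ignoring the joint-angle history and hysteresis that the full model captures), one can fit a static torque-angle curve to measurements and integrate it along a planned motion to estimate the mechanical work done against the suit. The sketch below uses invented data and is only an illustration of the workflow.

```python
import numpy as np

# Illustrative constitutive fit: represent measured suit-joint resistance as a
# polynomial torque-angle relation, then evaluate it along a planned motion.
angle = np.deg2rad(np.array([0, 15, 30, 45, 60, 75, 90]))     # joint angle [rad]
torque = np.array([0.0, 2.1, 4.6, 8.0, 12.9, 19.5, 28.4])     # measured torque [N*m]

coeffs = np.polyfit(angle, torque, deg=3)     # cubic stiffening fit
model = np.poly1d(coeffs)

motion = np.deg2rad(np.linspace(0, 90, 50))   # a candidate EVA arm sweep
work = np.trapz(model(motion), motion)        # work against the suit [J]
print(f"predicted work for the sweep: {work:.1f} J")
```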

  14. Optical granulometric analysis of sedimentary deposits by color segmentation-based software: OPTGRAN-CS

    Science.gov (United States)

    Chávez, G. Moreno; Sarocchi, D.; Santana, E. Arce; Borselli, L.

    2015-12-01

    The study of grain size distribution is fundamental for understanding sedimentological environments. Through these analyses, clast erosion, transport and deposition processes can be interpreted and modeled. However, grain size distribution analysis can be difficult in some outcrops due to the number and complexity of the arrangement of clasts and matrix and their physical size. Despite various technological advances, it is almost impossible to obtain the full grain size distribution (blocks to sand grain size) with a single method or instrument of analysis. For this reason, development in this area continues to be fundamental. In recent years, various methods of particle size analysis by automatic image processing have been developed, due to their potential advantages with respect to classical ones: speed and a more detailed final content of information (virtually for each analyzed particle). In this framework, we have developed a novel algorithm and software for grain size distribution analysis, based on color image segmentation using an entropy-controlled quadratic Markov measure field algorithm and the Rosiwal method for counting intersections between clasts and linear transects in the images. We tested the novel algorithm on different sedimentary deposit types from 14 varieties of sedimentological environments. The results of the new algorithm were compared with grain counts performed manually by experts using the same Rosiwal method. The new algorithm has the same accuracy as the classical manual count process, but the application of this innovative methodology is much easier and dramatically less time-consuming. The final productivity of the new software for the analysis of clast deposits after recording field outcrop images can be increased significantly.
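
    The Rosiwal intercept step can be illustrated independently of the segmentation stage: given a label image (0 = matrix, positive labels = clast classes), one measures chord lengths of clasts along evenly spaced transects. The Python sketch below is illustrative only, not OPTGRAN-CS itself; all names are hypothetical.

```python
import numpy as np

def rosiwal_intercepts(labels, n_transects=20, axis=0):
    """Measure intercept (chord) lengths of segmented clasts along linear
    transects through a label image (0 = matrix, >0 = clast classes)."""
    lines = np.linspace(0, labels.shape[axis] - 1, n_transects).astype(int)
    chords = []
    for i in lines:
        row = labels[i, :] if axis == 0 else labels[:, i]
        run_val, run_len = row[0], 1
        for v in row[1:]:
            if v == run_val:
                run_len += 1
            else:
                if run_val > 0:
                    chords.append((run_val, run_len))  # (class, intercept px)
                run_val, run_len = v, 1
        if run_val > 0:
            chords.append((run_val, run_len))
    return chords

# Toy segmented image: two clasts (labels 1 and 2) in a matrix of zeros
img = np.zeros((10, 10), int)
img[2:5, 1:6] = 1
img[6:9, 4:9] = 2
print(rosiwal_intercepts(img, n_transects=5))
```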

  15. BASE - 2nd generation software for microarray data management and analysis

    Directory of Open Access Journals (Sweden)

    Nordborg Nicklas

    2009-10-01

    Background: Microarray experiments are increasing in size and samples are collected asynchronously over a long time. Available data are re-analysed as more samples are hybridized. Systematic use of collected data requires tracking of biomaterials, array information, raw data, and assembly of annotations. To meet the information tracking and data analysis challenges in microarray experiments we reimplemented and improved BASE version 1.2. Results: The new BASE presented in this report is a comprehensive annotatable local microarray data repository and analysis application providing researchers with an efficient information management and analysis tool. The information management system tracks all material from biosource, via sample and through extraction and labelling, to raw data and analysis. All items in BASE can be annotated and the annotations can be used as experimental factors in downstream analysis. BASE stores all microarray experiment related data regardless of whether analysis tools for specific techniques or data formats are readily available. The BASE team is committed to continue improving and extending BASE to make it usable for even more experimental setups and techniques, and we encourage other groups to target their specific needs leveraging on the infrastructure provided by BASE. Conclusion: BASE is a comprehensive management application for information, data, and analysis of microarray experiments, available as free open source software at http://base.thep.lu.se under the terms of the GPLv3 license.

  16. Detailed analysis of complex single molecule FRET data with the software MASH

    Science.gov (United States)

    Hadzic, Mélodie C. A. S.; Kowerko, Danny; Börner, Richard; Zelger-Paulus, Susann; Sigel, Roland K. O.

    2016-04-01

    The processing and analysis of surface-immobilized single molecule FRET (Förster resonance energy transfer) data follow systematic steps (e.g. single molecule localization, clearance of different sources of noise, selection of the conformational and kinetic model, etc.) that require solid knowledge of optics, photophysics, signal processing and statistics. The present proceeding aims at standardizing and facilitating procedures for single molecule detection by guiding the reader through an optimization protocol for a particular experimental data set. Relevant features were determined from single molecule movies (SMM) imaging synthetically recreated Cy3- and Cy5-labeled Sc.ai5γ group II intron molecules, in order to test the performance of four different detection algorithms. Up to 120 different parameterizations per method were routinely evaluated to finally establish an optimum detection procedure. The present protocol is adaptable to any movie displaying surface-immobilized molecules, and can be easily reproduced with our home-written software MASH (multifunctional analysis software for heterogeneous data) and script routines (both available in the download section of www.chem.uzh.ch/rna).
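
    Single molecule localization, the first step named above, often amounts to finding local maxima that rise significantly above the background. The following SciPy-based sketch is a generic toy detector, not one of the four algorithms evaluated in the paper; thresholds and names are illustrative.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

def detect_molecules(frame, sigma=1.0, snr=4.0, box=5):
    """Toy spot detector: smooth, then keep local maxima that stand a given
    number of robust standard deviations above the background."""
    smooth = gaussian_filter(frame.astype(float), sigma)
    bg = np.median(smooth)
    noise = 1.4826 * np.median(np.abs(smooth - bg))  # robust sigma via MAD
    peaks = (smooth == maximum_filter(smooth, size=box)) & (smooth > bg + snr * noise)
    return np.argwhere(peaks)  # (row, col) coordinates of candidate molecules

# Synthetic frame with two bright spots on a noisy background
rng = np.random.default_rng(0)
frame = rng.normal(100, 5, (64, 64))
frame[20, 20] += 300
frame[40, 45] += 280
print(detect_molecules(frame))
```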

  17. An open source cryostage and software analysis method for detection of antifreeze activity

    DEFF Research Database (Denmark)

    Lørup Buch, Johannes; Ramløv, H

    2016-01-01

    The aim of this study is to provide the reader with a simple setup that can detect antifreeze proteins (AFP) by inhibition of ice recrystallisation in very small sample sizes. This includes an open source cryostage, a method for preparing and loading samples, as well as a software analysis method. The entire setup was tested using hyperactive AFP from the cerambycid beetle, Rhagium mordax. Samples containing AFP were compared to buffer samples, and the results are visualised as crystal radius evolution over time and in absolute change over 30 min. Statistical analysis showed that samples containing AFP could reliably be told apart from controls after only two minutes of recrystallisation. The goal of providing a fast, cheap and easy method for detecting antifreeze proteins in solution was met, and further development of the system can be followed at https://github.com/pechano/cryostage.
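
    Crystal radius evolution of the kind plotted by such an analysis can be quantified from thresholded frames by labelling crystals and converting areas to equivalent radii; under classical ripening kinetics the mean cubed radius grows roughly linearly in time, so its slope is a convenient recrystallisation rate. The sketch below is illustrative Python, not the published analysis script.

```python
import numpy as np
from scipy.ndimage import label

def mean_crystal_radius(binary_mask, pixel_size_um=1.0):
    """Mean equivalent circular radius of ice crystals in a thresholded frame."""
    labelled, n = label(binary_mask)
    if n == 0:
        return 0.0
    areas = np.bincount(labelled.ravel())[1:]        # pixels per crystal
    radii = np.sqrt(areas / np.pi) * pixel_size_um   # r = sqrt(A / pi)
    return radii.mean()

def recrystallisation_rate(masks, times_min):
    """Slope of mean r^3 versus time; ~0 for strongly AFP-inhibited samples."""
    r = np.array([mean_crystal_radius(m) for m in masks])
    slope, _ = np.polyfit(times_min, r ** 3, 1)
    return slope

# Example: rate = recrystallisation_rate([mask0, mask10, mask20], [0, 10, 20])
```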

  18. The MGDO software library for data analysis in Ge neutrinoless double-beta decay experiments

    CERN Document Server

    Agostini, M; Finnerty, P; Kröninger, K; Lenz, D; Liu, J; Marino, M G; Martin, R; Nguyen, K D; Pandola, L; Schubert, A G; Volynets, O; Zavarise, P

    2011-01-01

    The GERDA and Majorana experiments will search for neutrinoless double-beta decay of germanium-76 using isotopically enriched high-purity germanium detectors. Although the experiments differ in conceptual design, they share many aspects in common, and in particular will employ similar data analysis techniques. The collaborations are jointly developing a C++ software library, MGDO, which contains a set of data objects and interfaces to encapsulate, store and manage physical quantities of interest, such as waveforms and high-purity germanium detector geometries. These data objects define a common format for persistent data, whether it is generated by Monte Carlo simulations or an experimental apparatus, to reduce code duplication and to ease the exchange of information between detector systems. MGDO also includes general-purpose analysis tools that can be used for the processing of measured or simulated digital signals. The MGDO design is based on the Object-Oriented programming paradigm and is very flexible, a...

  19. ICAS-PAT: A Software for Design, Analysis and Validation of PAT Systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2010-01-01

    In chemicals based product manufacturing, as in the pharmaceutical, food and agrochemical industries, efficient and consistent process monitoring and analysis systems (PAT systems) have a very important role. These PAT systems ensure that the chemicals based product is manufactured with the specified … end product qualities. In an earlier article, Singh et al. [Singh, R., Gernaey, K. V., Gani, R. (2009). Model-based computer-aided framework for design of process monitoring and analysis systems. Computers & Chemical Engineering, 33, 22–42] proposed the use of a systematic model and data based … (consisting of process knowledge as well as knowledge on measurement methods and tools) and a generic model library (consisting of process operational models). Through a tablet manufacturing process example, the application of ICAS-PAT is illustrated, highlighting as well the main features of the software.

  20. An open source cryostage and software analysis method for detection of antifreeze activity.

    Science.gov (United States)

    Buch, J L; Ramløv, H

    2016-06-01

    The aim of this study is to provide the reader with a simple setup that can detect antifreeze proteins (AFP) by inhibition of ice recrystallisation in very small sample sizes. This includes an open source cryostage, a method for preparing and loading samples as well as a software analysis method. The entire setup was tested using hyperactive AFP from the cerambycid beetle, Rhagium mordax. Samples containing AFP were compared to buffer samples, and the results are visualised as crystal radius evolution over time and in absolute change over 30 min. Statistical analysis showed that samples containing AFP could reliably be told apart from controls after only two minutes of recrystallisation. The goal of providing a fast, cheap and easy method for detecting antifreeze proteins in solution was met, and further development of the system can be followed at https://github.com/pechano/cryostage.