WorldWideScience

Sample records for suite integrated software

  1. TypingSuite: Integrated Software for Presenting Stimuli, and Collecting and Analyzing Typing Data

    Science.gov (United States)

    Mazerolle, Erin L.; Marchand, Yannick

    2015-01-01

    Research into typing patterns has broad applications in both psycholinguistics and biometrics (i.e., improving security of computer access via each user's unique typing patterns). We present a new software package, TypingSuite, which can be used for presenting visual and auditory stimuli, collecting typing data, and summarizing and analyzing the…

  2. An Integrated Software Suite for Surface-based Analyses of Cerebral Cortex

    Science.gov (United States)

    Van Essen, David C.; Drury, Heather A.; Dickson, James; Harwell, John; Hanlon, Donna; Anderson, Charles H.

    2001-01-01

    The authors describe and illustrate an integrated trio of software programs for carrying out surface-based analyses of cerebral cortex. The first component of this trio, SureFit (Surface Reconstruction by Filtering and Intensity Transformations), is used primarily for cortical segmentation, volume visualization, surface generation, and the mapping of functional neuroimaging data onto surfaces. The second component, Caret (Computerized Anatomical Reconstruction and Editing Tool Kit), provides a wide range of surface visualization and analysis options as well as capabilities for surface flattening, surface-based deformation, and other surface manipulations. The third component, SuMS (Surface Management System), is a database and associated user interface for surface-related data. It provides for efficient insertion, searching, and extraction of surface and volume data from the database. PMID:11522765

  3. An integrated software suite for surface-based analyses of cerebral cortex

    Science.gov (United States)

    Van Essen, D. C.; Drury, H. A.; Dickson, J.; Harwell, J.; Hanlon, D.; Anderson, C. H.

    2001-01-01

    The authors describe and illustrate an integrated trio of software programs for carrying out surface-based analyses of cerebral cortex. The first component of this trio, SureFit (Surface Reconstruction by Filtering and Intensity Transformations), is used primarily for cortical segmentation, volume visualization, surface generation, and the mapping of functional neuroimaging data onto surfaces. The second component, Caret (Computerized Anatomical Reconstruction and Editing Tool Kit), provides a wide range of surface visualization and analysis options as well as capabilities for surface flattening, surface-based deformation, and other surface manipulations. The third component, SuMS (Surface Management System), is a database and associated user interface for surface-related data. It provides for efficient insertion, searching, and extraction of surface and volume data from the database.

  4. The BTeV Software Tutorial Suite

    International Nuclear Information System (INIS)

    Kutschke, Robert K.

    2004-01-01

    The BTeV Collaboration is starting to develop its C++ based offline software suite, an integral part of which is a series of tutorials. These tutorials are targeted at a diverse audience, including new graduate students, experienced physicists with little or no C++ experience, those with just enough C++ to be dangerous, and experts who need only an overview of the available tools. The tutorials must teach both C++ in general and the BTeV-specific tools in particular. Finally, they must teach physicists how to find and use the detailed documentation. This report will review the status of the BTeV experiment, give an overview of the plans for and the state of the software, and will then describe the plans for the tutorial suite.

  5. The ESA's Space Trajectory Analysis software suite

    Science.gov (United States)

    Ortega, Guillermo

    The European Space Agency (ESA) initiated in 2005 an internal activity to develop an open source software suite involving university science departments and research institutions all over the world. This project is called the "Space Trajectory Analysis", or STA. This article describes the birth of STA and its present configuration. One of the aims of STA is to promote the exchange of technical ideas, and to raise knowledge and competence in the areas of applied mathematics, space engineering, and informatics at university level. Conceived as a research and education tool to support the analysis phase of a space mission, STA is able to visualize a wide range of space trajectories, including ascent, re-entry, descent and landing trajectories, orbits around planets and moons, interplanetary trajectories, and rendezvous trajectories. The article explains that the STA project is an original idea of the Technical Directorate of ESA, born in August 2005 to provide a framework for astrodynamics research at university level. As research and education software applicable to academia, a number of universities support this development by joining ESA in leading it. The partnership between ESA and the universities is expressed in the STA Steering Board: together with ESA, each university has a chair on the board, whose tasks are to develop, control, promote, maintain, and expand the software suite. The article describes that STA provides calculations in the fields of spacecraft tracking, attitude analysis, coverage and visibility analysis, orbit determination, and the position and velocity of solar system bodies. STA implements the concept of a "space scenario" composed of solar system bodies, spacecraft, ground stations, pads, etc., and is able to propagate the orbit of a spacecraft using the included orbital propagators. STA is able to compute communication links between objects of a scenario (coverage, line of sight), and to represent the trajectory computations and

  6. MODEL: A software suite for data acquisition

    Energy Technology Data Exchange (ETDEWEB)

    Sendall, D M; Boissat, C; Bozzoli, W; Burkimsher, P; Jones, R; Matheys, J P; Mornacchi, G; Nguyen, T; Vyvre, P vande; Vascotto, A; Weaver, D [European Organization for Nuclear Research, Geneva (Switzerland). DD Div.

    1989-12-01

    MODEL is a new suite of modular data-acquisition software. It is aimed at the needs of LEP experiments, and is also general enough to be more widely used. It can accommodate a variety of user styles. It runs on a set of loosely coupled processors, and makes use of the remote procedure call technique. Implemented originally for the VAX family, some of its services have already been extended to other systems, including embedded microprocessors. The software modules available include facilities for data-flow management, a framework for monitoring programs, a window-oriented human interface, an error message utility, a process control utility and a run control scheme. It is already in use in a variety of experiments, and is still under development in the light of user experience. (orig.).
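
    MODEL itself dates from 1989 and targeted VAX systems, so no original code survives in this record. Purely as a minimal, modern illustration of the remote procedure call pattern the abstract names, the following sketch uses Python's standard xmlrpc modules; the procedure name start_run is hypothetical.

        from xmlrpc.server import SimpleXMLRPCServer

        def start_run(run_number):
            """Stand-in for a data-acquisition control procedure."""
            return f"run {run_number} started"

        # Expose the procedure so loosely coupled processes can invoke it remotely.
        server = SimpleXMLRPCServer(("localhost", 8000), allow_none=True)
        server.register_function(start_run)
        # server.serve_forever()  # uncomment to serve requests

        # A client elsewhere on the network would call it as if it were local:
        #   from xmlrpc.client import ServerProxy
        #   print(ServerProxy("http://localhost:8000/").start_run(42))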

  7. Engineering Software Suite Validates System Design

    Science.gov (United States)

    2007-01-01

    EDAptive Computing Inc.'s (ECI) EDAstar engineering software tool suite, created to capture and validate system design requirements, was significantly funded by NASA's Ames Research Center through five Small Business Innovation Research (SBIR) contracts. These programs specifically developed Syscape, used to capture executable specifications of multi-disciplinary systems, and VectorGen, used to automatically generate tests to ensure system implementations meet specifications. According to the company, the VectorGen tests considerably reduce the time and effort required to validate implementation of components, thereby ensuring their safe and reliable operation. EDAshield, an additional product offering from ECI, can be used to diagnose, predict, and correct errors after a system has been deployed, using EDAstar-created models. Initial commercialization for EDAstar included application by a large prime contractor in a military setting, and customers include various branches within the U.S. Department of Defense, industry giants like the Lockheed Martin Corporation, Science Applications International Corporation, and Ball Aerospace and Technologies Corporation, as well as NASA's Langley and Glenn Research Centers.

  8. Improvements to the APBS biomolecular solvation software suite

    Energy Technology Data Exchange (ETDEWEB)

    Jurrus, Elizabeth [Pacific Northwest National Laboratory, Richland Washington; Engel, Dave [Pacific Northwest National Laboratory, Richland Washington; Star, Keith [Pacific Northwest National Laboratory, Richland Washington; Monson, Kyle [Pacific Northwest National Laboratory, Richland Washington; Brandi, Juan [Pacific Northwest National Laboratory, Richland Washington; Felberg, Lisa E. [University of California, Berkeley California; Brookes, David H. [University of California, Berkeley California; Wilson, Leighton [University of Michigan, Ann Arbor Michigan; Chen, Jiahui [Southern Methodist University, Dallas Texas; Liles, Karina [Pacific Northwest National Laboratory, Richland Washington; Chun, Minju [Pacific Northwest National Laboratory, Richland Washington; Li, Peter [Pacific Northwest National Laboratory, Richland Washington; Gohara, David W. [St. Louis University, St. Louis Missouri; Dolinsky, Todd [FoodLogiQ, Durham North Carolina; Konecny, Robert [University of California San Diego, San Diego California; Koes, David R. [University of Pittsburgh, Pittsburgh Pennsylvania; Nielsen, Jens Erik [Protein Engineering, Novozymes A/S, Copenhagen Denmark; Head-Gordon, Teresa [University of California, Berkeley California; Geng, Weihua [Southern Methodist University, Dallas Texas; Krasny, Robert [University of Michigan, Ann Arbor Michigan; Wei, Guo-Wei [Michigan State University, East Lansing Michigan; Holst, Michael J. [University of California San Diego, San Diego California; McCammon, J. Andrew [University of California San Diego, San Diego California; Baker, Nathan A. [Pacific Northwest National Laboratory, Richland Washington; Brown University, Providence Rhode Island

    2017-10-24

    The Adaptive Poisson-Boltzmann Solver (APBS) software was developed to solve the equations of continuum electrostatics for large biomolecular assemblages and has had an impact on the study of a broad range of chemical, biological, and biomedical applications. APBS addresses three key technology challenges for understanding solvation and electrostatics in biomedical applications: accurate and efficient models for biomolecular solvation and electrostatics, robust and scalable software for applying those theories to biomolecular systems, and mechanisms for sharing and analyzing biomolecular electrostatics data in the scientific community. To address new research applications and advances in computational capabilities, we have continually updated APBS and its suite of accompanying software since its release in 2001. In this manuscript, we discuss the models and capabilities that have recently been implemented within the APBS software package, including: a Poisson-Boltzmann analytical and a semi-analytical solver, an optimized boundary element solver, a geometry-based geometric flow solvation model, a graph theory based algorithm for determining pKa values, and an improved web-based visualization tool for viewing electrostatics.
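
    For context (our summary, not part of the record above): the continuum electrostatics model at the heart of APBS is the Poisson-Boltzmann equation, which in one common form for a symmetric 1:1 electrolyte reads

        \[ \nabla \cdot \left[ \epsilon(\mathbf{r}) \, \nabla u(\mathbf{r}) \right] - \bar{\kappa}^2(\mathbf{r}) \sinh u(\mathbf{r}) = -\frac{4 \pi e_c^2}{k_B T} \sum_i z_i \, \delta(\mathbf{r} - \mathbf{r}_i), \]

    where u is the reduced electrostatic potential, \epsilon(\mathbf{r}) the position-dependent dielectric coefficient, \bar{\kappa}^2(\mathbf{r}) the ion-accessibility-modified screening coefficient, and the right-hand side the fixed partial charges of the biomolecule. The solvers named in the abstract differ in how they discretize and solve this equation or its linearized form.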

  9. CAMEO (Computer-Aided Management of Emergency Operations) Software Suite

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — CAMEO is the umbrella name for a system of software applications used widely to plan for and respond to chemical emergencies. All of the programs in the suite work...

  10. Strengthening Software Authentication with the ROSE Software Suite

    International Nuclear Information System (INIS)

    White, G

    2006-01-01

    Many recent nonproliferation and arms control software projects include a software authentication regime. These include U.S. Government-sponsored projects both in the United States and in the Russian Federation (RF). This trend toward requiring software authentication is only accelerating. Demonstrating assurance that software performs as expected without hidden "backdoors" is crucial to a project's success. In this context, "authentication" is defined as determining that a software package performs only its intended purpose and performs said purpose correctly and reliably over the planned duration of an agreement. In addition to visual inspections by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and not extensible. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool has to be based on a complete language compiler. ROSE is precisely such a compiler infrastructure, developed within the Department of Energy (DOE) and targeted at the optimization of scientific applications and user-defined libraries within large-scale applications (typically applications of a million lines of code). ROSE is a robust, source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C and C++ (handling the full C, C99, and C++ languages, with current collaborations to support Fortran 90). We propose to extend ROSE to address a number of security-specific requirements and to apply it to software authentication for nonproliferation and arms control projects.
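
    ROSE itself is a C/C++ source-to-source infrastructure, so a real authentication rule would be written against its AST. Purely as a language-neutral sketch of the rule-based screening described above, here is an analogous check built on Python's standard ast module; the rule set is an invented example, not a ROSE rule.

        import ast

        SUSPECT_CALLS = {"eval", "exec", "system"}  # invented example rule

        class SuspiciousCallFinder(ast.NodeVisitor):
            """Collect calls that an authentication reviewer may want to inspect."""
            def __init__(self):
                self.findings = []

            def visit_Call(self, node):
                # Handle both bare names (eval) and attribute calls (os.system).
                name = getattr(node.func, "id", None) or getattr(node.func, "attr", None)
                if name in SUSPECT_CALLS:
                    self.findings.append((node.lineno, name))
                self.generic_visit(node)

        finder = SuspiciousCallFinder()
        finder.visit(ast.parse("import os\nos.system('ls')"))
        for lineno, name in finder.findings:
            print(f"line {lineno}: suspicious call to {name}()")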

  11. Knowledge Architect : A Tool Suite for Managing Software Architecture Knowledge

    NARCIS (Netherlands)

    Liang, Peng; Jansen, Anton; Avgeriou, Paris

    2009-01-01

    Management of software architecture knowledge (AK) is vital for improving an organization’s architectural capabilities. To support the architecting process within our industrial partner Astron, the Dutch radio astronomy institute, we implemented the Knowledge Architect (KA): a tool suite for

  12. The IMBA suite: integrated modules for bioassay analysis

    Energy Technology Data Exchange (ETDEWEB)

    Birchall, A.; Jarvis, N.S.; Peace, M.S.; Riddell, A.E.; Battersby, W.P

    1998-07-01

    The increasing complexity of models representing the biokinetic behaviour of radionuclides in the body following intake poses problems for people who are required to implement these models. The problem is exacerbated by the current paucity of suitable software. In order to remedy this situation, a collaboration between British Nuclear Fuels, Westlakes Research Institute and the National Radiological Protection Board has started with the aim of producing a suite of modules for estimating intakes and doses from bioassay measurements using the new ICRP models. Each module will have a single purpose (e.g. to calculate respiratory tract deposition) and will interface with other software using data files. The elements to be implemented initially are plutonium, uranium, caesium, iodine and tritium. It is intended to make the software available to other parties under terms yet to be decided. This paper describes the proposed suite of integrated modules for bioassay analysis, IMBA. (author)

  13. Integrated Instrument Simulator Suites for Earth Science

    Science.gov (United States)

    Tanelli, Simone; Tao, Wei-Kuo; Matsui, Toshihisa; Hostetler, Chris; Hair, John; Butler, Carolyn; Kuo, Kwo-Sen; Niamsuwan, Noppasin; Johnson, Michael P.; Jacob, Joseph C.; hide

    2012-01-01

    The NASA Earth Observing System Simulators Suite (NEOS3) is a modular framework of forward simulation tools for remote sensing of the Earth's atmosphere from space. It was initiated as the Instrument Simulator Suite for Atmospheric Remote Sensing (ISSARS) under the NASA Advanced Information Systems Technology (AIST) program of the Earth Science Technology Office (ESTO) to enable science users to perform simulations based on advanced atmospheric and simple land surface models, and to rapidly integrate into a broad framework any experimental or innovative tools that they may have developed in this context. The name was changed to NEOS3 when the project was expanded to include more advanced modeling tools for the surface contributions, accounting for scattering and emission properties of layered surfaces (e.g., soil moisture, vegetation, snow and ice, subsurface layers). NEOS3 relies on a web-based graphical user interface and a three-stage processing strategy to generate simulated measurements. The user has full control over a wide range of customizations, both in terms of a priori assumptions and in terms of the specific solvers or models used to calculate the measured signals. This presentation will demonstrate the general architecture and the configuration procedures, and will illustrate some sample products and the fundamental interface requirements for modules that are candidates for integration.

  14. Improvements to the APBS biomolecular solvation software suite.

    Science.gov (United States)

    Jurrus, Elizabeth; Engel, Dave; Star, Keith; Monson, Kyle; Brandi, Juan; Felberg, Lisa E; Brookes, David H; Wilson, Leighton; Chen, Jiahui; Liles, Karina; Chun, Minju; Li, Peter; Gohara, David W; Dolinsky, Todd; Konecny, Robert; Koes, David R; Nielsen, Jens Erik; Head-Gordon, Teresa; Geng, Weihua; Krasny, Robert; Wei, Guo-Wei; Holst, Michael J; McCammon, J Andrew; Baker, Nathan A

    2018-01-01

    The Adaptive Poisson-Boltzmann Solver (APBS) software was developed to solve the equations of continuum electrostatics for large biomolecular assemblages and has had an impact on the study of a broad range of chemical, biological, and biomedical applications. APBS addresses the three key technology challenges for understanding solvation and electrostatics in biomedical applications: accurate and efficient models for biomolecular solvation and electrostatics, robust and scalable software for applying those theories to biomolecular systems, and mechanisms for sharing and analyzing biomolecular electrostatics data in the scientific community. To address new research applications and advances in computational capabilities, we have continually updated APBS and its suite of accompanying software since its release in 2001. In this article, we discuss the models and capabilities that have recently been implemented within the APBS software package, including a Poisson-Boltzmann analytical and a semi-analytical solver, an optimized boundary element solver, a geometry-based geometric flow solvation model, a graph theory-based algorithm for determining pKa values, and an improved web-based visualization tool for viewing electrostatics. © 2017 The Protein Society.

  15. SSAGES: Software Suite for Advanced General Ensemble Simulations

    Science.gov (United States)

    Sidky, Hythem; Colón, Yamil J.; Helfferich, Julian; Sikora, Benjamin J.; Bezik, Cody; Chu, Weiwei; Giberti, Federico; Guo, Ashley Z.; Jiang, Xikai; Lequieu, Joshua; Li, Jiyuan; Moller, Joshua; Quevillon, Michael J.; Rahimi, Mohammad; Ramezani-Dakhel, Hadi; Rathee, Vikramjit S.; Reid, Daniel R.; Sevgen, Emre; Thapar, Vikram; Webb, Michael A.; Whitmer, Jonathan K.; de Pablo, Juan J.

    2018-01-01

    Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulation packages. SSAGES allows facile application of a variety of enhanced sampling techniques, including adaptive biasing force, string methods, and forward flux sampling, that extract meaningful free energy and transition path data from all-atom and coarse-grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite. The code may be found at: https://github.com/MICCoM/SSAGES-public.
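
    As background for the methods named above (a textbook relation, not a statement about SSAGES internals): enhanced sampling targets the free energy along a chosen collective variable \xi,

        \[ A(\xi) = -k_B T \ln P(\xi), \]

    where P(\xi) is the equilibrium probability of observing \xi. Adaptive biasing force, for example, applies a bias equal to the negative of a running estimate of the mean force, F_{\text{bias}}(\xi) \approx dA/d\xi, so the biased dynamics diffuse across an effectively flat free-energy landscape and sample \xi nearly uniformly.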

  16. SSAGES: Software Suite for Advanced General Ensemble Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Sidky, Hythem [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; Colón, Yamil J. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Institute for Molecular Engineering and Materials Science Division, Argonne National Laboratory, Lemont, Illinois 60439, USA; Helfferich, Julian [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Steinbuch Center for Computing, Karlsruhe Institute of Technology, Hermann-von-Helmholtz-Platz 1, 76344 Eggenstein-Leopoldshafen, Germany; Sikora, Benjamin J. [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; Bezik, Cody [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Chu, Weiwei [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Giberti, Federico [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Guo, Ashley Z. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Jiang, Xikai [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Lequieu, Joshua [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Li, Jiyuan [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Moller, Joshua [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Quevillon, Michael J. [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; Rahimi, Mohammad [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Ramezani-Dakhel, Hadi [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Department of Biochemistry and Molecular Biology, University of Chicago, Chicago, Illinois 60637, USA; Rathee, Vikramjit S. [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; Reid, Daniel R. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Sevgen, Emre [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Thapar, Vikram [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Webb, Michael A. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Institute for Molecular Engineering and Materials Science Division, Argonne National Laboratory, Lemont, Illinois 60439, USA; Whitmer, Jonathan K. [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; de Pablo, Juan J. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Institute for Molecular Engineering and Materials Science Division, Argonne National Laboratory, Lemont, Illinois 60439, USA

    2018-01-28

    Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods, and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulation packages. SSAGES allows facile application of a variety of enhanced sampling techniques, including adaptive biasing force, string methods, and forward flux sampling, that extract meaningful free energy and transition path data from all-atom and coarse-grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite.

  17. Tier-3 Monitoring Software Suite (T3MON) proposal

    CERN Document Server

    Andreeva, J; The ATLAS collaboration; Klimentov, A; Korenkov, V; Oleynik, D; Panitkin, S; Petrosyan, A

    2011-01-01

    The ATLAS Distributed Computing activities have so far concentrated on the "central" part of the computing system of the experiment, namely the first three tiers (the CERN Tier-0, the 10 Tier-1 centres and the 60+ Tier-2s). This is a coherent system to perform data processing and management on a global scale, hosting (re)processing and simulation activities down to group and user analysis. Many ATLAS institutes and national communities have built (or plan to build) Tier-3 facilities. The definition of the Tier-3 concept has been outlined (REFERENCE). Tier-3 centres consist of non-pledged resources, mostly dedicated to data analysis by geographically close or local scientific groups. Tier-3 sites comprise a range of architectures and many do not possess Grid middleware, which would render the application of Tier-2 monitoring systems useless. This document describes a strategy to develop a software suite for monitoring of Tier-3 sites. This software suite will enable local monitoring of the Tier-3 sites and the global vie...

  18. User Guide for the STAYSL PNNL Suite of Software Tools

    Energy Technology Data Exchange (ETDEWEB)

    Greenwood, Lawrence R.; Johnson, Christian D.

    2013-02-27

    The STAYSL PNNL software suite provides a set of tools for working with neutron activation rates measured in a nuclear fission reactor, an accelerator-based neutron source, or any neutron field to determine the neutron flux spectrum through a generalized least-squares approach. This process is referred to as neutron spectral adjustment since the preferred approach is to use measured data to adjust neutron spectra provided by neutron physics calculations. The input data consist of the reaction rates based on measured activities, an initial estimate of the neutron flux spectrum, neutron activation cross sections and their associated uncertainties (covariances), and relevant correction factors. The output consists of the adjusted neutron flux spectrum and associated covariance matrix, which is useful for neutron dosimetry and radiation damage calculations.
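
    The generalized least-squares adjustment referred to above can be stated compactly (a textbook formulation in our notation, not necessarily STAYSL PNNL's internal one). With measured reaction rates R, response (cross-section) matrix \Sigma, prior spectrum \phi_0, and covariances C_R and C_\phi, the adjusted spectrum and its covariance are

        \[ \phi' = \phi_0 + C_\phi \Sigma^{\mathsf{T}} \left( \Sigma C_\phi \Sigma^{\mathsf{T}} + C_R \right)^{-1} \left( R - \Sigma \phi_0 \right), \]
        \[ C_{\phi'} = C_\phi - C_\phi \Sigma^{\mathsf{T}} \left( \Sigma C_\phi \Sigma^{\mathsf{T}} + C_R \right)^{-1} \Sigma C_\phi, \]

    which are exactly the outputs the abstract lists: an adjusted neutron flux spectrum and its covariance matrix.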

  19. eXtended CASA Line Analysis Software Suite (XCLASS)

    Science.gov (United States)

    Möller, T.; Endres, C.; Schilke, P.

    2017-02-01

    The eXtended CASA Line Analysis Software Suite (XCLASS) is a toolbox for the Common Astronomy Software Applications package (CASA) containing new functions for modeling interferometric and single dish data. Among the tools is the myXCLASS program, which calculates synthetic spectra by solving the radiative transfer equation for an isothermal object in one dimension, while finite source size and dust attenuation are taken into account as well. Molecular data required by the myXCLASS program are taken from an embedded SQLite3 database containing entries from the Cologne Database for Molecular Spectroscopy (CDMS) and JPL, accessed using the Virtual Atomic and Molecular Data Center (VAMDC) portal. Additionally, the toolbox provides an interface for the model optimizer package Modeling and Analysis Generic Interface for eXternal numerical codes (MAGIX), which helps to find the best description of observational data using myXCLASS (or another external model program), that is, the parameter set that most closely reproduces the data. http://www.astro.uni-koeln.de/projects/schilke/myXCLASSInterface A copy of the code is available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/598/A7
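
    For orientation (a standard textbook result, not quoted from the record): for an isothermal object the one-dimensional radiative transfer equation has the closed-form solution

        \[ I_\nu = I_{\mathrm{bg}}(\nu) \, e^{-\tau(\nu)} + S_\nu(T_{\mathrm{ex}}) \left( 1 - e^{-\tau(\nu)} \right), \]

    where I_{\mathrm{bg}} is the background intensity, \tau(\nu) the optical depth (to which dust contributes), and S_\nu the source function at the excitation temperature. myXCLASS evaluates expressions of this kind per molecular component, additionally accounting for finite source size.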

  20. Freud: a software suite for high-throughput simulation analysis

    Science.gov (United States)

    Harper, Eric; Spellings, Matthew; Anderson, Joshua; Glotzer, Sharon

    Computer simulation is an indispensable tool for the study of a wide variety of systems. As simulations scale to fill petascale and exascale supercomputing clusters, so too does the size of the data produced, as well as the difficulty in analyzing these data. We present Freud, an analysis software suite for efficient analysis of simulation data. Freud makes no assumptions about the system being analyzed, allowing for general analysis methods to be applied to nearly any type of simulation. Freud includes standard analysis methods such as the radial distribution function, as well as new methods including the potential of mean force and torque and local crystal environment analysis. Freud combines a Python interface with fast, parallel C++ analysis routines to run efficiently on laptops, workstations, and supercomputing clusters. Data analysis on clusters reduces data transfer requirements, a prohibitive cost for petascale computing. Used in conjunction with simulation software, Freud allows for smart simulations that adapt to the current state of the system, enabling the study of phenomena such as nucleation and growth, intelligent investigation of phases and phase transitions, and determination of effective pair potentials.
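
    As a concrete taste of the Python-plus-C++ design described above, here is a minimal sketch of one standard analysis the abstract mentions, the radial distribution function, written against the freud v2 API as we understand it; treat the exact signatures as assumptions rather than documentation.

        import numpy as np
        import freud

        # A periodic cubic box and random points stand in for real trajectory data.
        box = freud.box.Box.cube(10.0)
        points = np.random.uniform(-5.0, 5.0, size=(1000, 3)).astype(np.float32)

        # Histogram g(r); the heavy lifting runs in parallel C++ under the hood.
        rdf = freud.density.RDF(bins=100, r_max=4.0)
        rdf.compute(system=(box, points))

        for r, g in zip(rdf.bin_centers, rdf.rdf):
            print(f"r = {r:.2f}  g(r) = {g:.2f}")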

  1. SUIT

    DEFF Research Database (Denmark)

    Algreen-Ussing, Gregers; Wedebrunn, Ola

    2003-01-01

    Leaflet about the SUIT project, published by the European Commission. The leaflet briefly explains the results of the SUIT project: cultural values in environmental questions, and the assessment of projects and their impact on the environment.

  2. Controlatron Neutron Tube Test Suite Software Manual - Operation Manual (V2.2)

    CERN Document Server

    Noel, W P; Hertrich, R J; Martinez, M L; Wallace, D L

    2002-01-01

    The Controlatron Software Suite is a custom-built application for performing automated testing of Controlatron neutron tubes. The software package was designed to allow users to design tests and to run a series of test suites on a tube. The data are output to ASCII files of a pre-defined format for data analysis and viewing with the Controlatron Data Viewer application. This manual discusses the operation of the Controlatron Test Suite software and gives a brief discussion of state machine theory, as a state machine is the functional basis of the software.

  3. Interface-based software integration

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2016-07-01

    Enterprise architecture frameworks define the goals of enterprise architecture in order to make business processes and IT operations more effective, and to reduce the risk of future investments. These enterprise architecture frameworks offer different architecture development methods that help in building enterprise architecture. In practice, the larger organizations become, the larger their enterprise architecture and IT become. This leads to an increasingly complex system of enterprise architecture development and maintenance. Application software architecture is one type of architecture that, along with business architecture, data architecture and technology architecture, composes enterprise architecture. From the perspective of integration, enterprise architecture can be considered a system of interaction between multiple examples of application software. Effective software integration is therefore a very important basis for the future success of the enterprise architecture in question. This article presents interface-based integration practice in order to help simplify the process of building such a software integration system, as illustrated in the sketch below. The main goal of interface-based software integration is to solve problems that may arise with software integration requirements and with developing software integration architecture.
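
    To make the idea concrete (our own illustrative sketch; the article's examples are not reproduced in this record), interface-based integration has each system depend on a declared interface rather than on another system's internals. In Python this might look like the following, where every name is hypothetical:

        from abc import ABC, abstractmethod

        class CustomerDirectory(ABC):
            """The agreed integration interface; consumers depend only on this."""
            @abstractmethod
            def lookup_email(self, customer_id: str) -> str: ...

        class LegacyCrmAdapter(CustomerDirectory):
            """Adapts an existing CRM system to the interface without changing it."""
            def __init__(self, crm_client):
                self._crm = crm_client

            def lookup_email(self, customer_id: str) -> str:
                return self._crm.fetch(customer_id)["email_address"]

        class FakeCrm:
            """Stands in for the legacy system so the sketch is self-contained."""
            def fetch(self, customer_id):
                return {"email_address": f"{customer_id}@example.com"}

        def send_invoice(directory: CustomerDirectory, customer_id: str) -> None:
            # The consumer never sees the legacy system's shape, only the interface.
            print(f"mailing invoice to {directory.lookup_email(customer_id)}")

        send_invoice(LegacyCrmAdapter(FakeCrm()), "c42")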

  4. NGSUtils: a software suite for analyzing and manipulating next-generation sequencing datasets

    OpenAIRE

    Breese, Marcus R.; Liu, Yunlong

    2013-01-01

    Summary: NGSUtils is a suite of software tools for manipulating data common to next-generation sequencing experiments, such as FASTQ, BED and BAM format files. These tools provide a stable and modular platform for data management and analysis.

  5. Modular Algorithm Testbed Suite (MATS): A Software Framework for Automatic Target Recognition

    Science.gov (United States)

    2017-01-01

    Naval Surface Warfare Center Panama City Division, Panama City, FL 32407-7001. Technical Report NSWC PCD TR-2017-004, 31-01-2017: Modular Algorithm Testbed Suite (MATS): A Software Framework for Automatic Target Recognition. ... flexible platform to facilitate the development and testing of ATR algorithms. To that end, NSWC PCD has created the Modular Algorithm Testbed Suite (MATS).

  6. Robotic Software Integration Using MARIE

    Directory of Open Access Journals (Sweden)

    Carle Côté

    2006-03-01

    This paper presents MARIE, a middleware framework oriented towards developing and integrating new and existing software for robotic systems. By using a generic communication framework, MARIE aims to create a flexible distributed component system that allows robotics developers to share software programs and algorithms, and to design prototypes rapidly based on their own integration needs. The use of MARIE is illustrated with the design of a socially interactive autonomous mobile robot platform capable of map building, localization, navigation, task scheduling, sound source localization, tracking and separation, speech recognition and generation, visual tracking, message reading and graphical interaction using a touch screen interface.

  7. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.
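
    SAGE itself is not shown in this record, so as a generic illustration of the monitoring idea (checking an executing program against prespecified requirement constraints), consider this hypothetical Python decorator; it is our sketch, not SAVAnT's actual mechanism.

        import functools

        def monitored(constraint, description):
            """Wrap a function so every result is checked against a requirement."""
            def decorate(fn):
                @functools.wraps(fn)
                def wrapper(*args, **kwargs):
                    result = fn(*args, **kwargs)
                    if not constraint(result):
                        raise AssertionError(f"requirement violated: {description}")
                    return result
                return wrapper
            return decorate

        @monitored(lambda p: 0.0 <= p <= 1.0, "output must be a probability")
        def failure_probability(cycles: int) -> float:
            return 1.0 - 0.999 ** cycles

        print(failure_probability(100))  # constraint checked on every call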

  8. Technical Note: DIRART - A software suite for deformable image registration and adaptive radiotherapy research

    International Nuclear Information System (INIS)

    Yang Deshan; Brame, Scott; El Naqa, Issam; Apte, Aditya; Wu Yu; Murty Goddu, S.; Mutic, Sasa; Deasy, Joseph O.; Low, Daniel A.

    2011-01-01

    Purpose: Recent years have witnessed tremendous progress in image-guided radiotherapy technology and a growing interest in the possibilities for adapting treatment planning and delivery over the course of treatment. One obstacle faced by the research community has been the lack of a comprehensive open-source software toolkit dedicated to adaptive radiotherapy (ART). To address this need, the authors have developed a software suite called the Deformable Image Registration and Adaptive Radiotherapy Toolkit (DIRART). Methods: DIRART is an open-source toolkit developed in MATLAB. It is designed in an object-oriented style with a focus on user-friendliness, features, and flexibility. It contains four classes of DIR algorithms, including the newer inverse consistency algorithms that provide consistent displacement vector fields in both directions. It also contains common ART functions, an integrated graphical user interface, a variety of visualization and image-processing features, dose metric analysis functions, and interface routines. These interface routines make DIRART a powerful complement to the Computational Environment for Radiotherapy Research (CERR) and popular image-processing toolkits such as ITK. Results: DIRART provides a set of image processing/registration algorithms and postprocessing functions to facilitate the development and testing of DIR algorithms. It also offers a wide range of options for DIR result visualization, evaluation, and validation. Conclusions: By exchanging data with treatment planning systems via DICOM-RT files and CERR, and by bringing image registration algorithms closer to radiotherapy applications, DIRART is potentially a convenient and flexible platform that may facilitate ART and DIR research.
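
    The inverse consistency property mentioned above has a compact statement (standard in the deformable registration literature; notation ours). If T_{AB} maps image A onto image B and T_{BA} maps back, consistency requires

        \[ T_{BA}\left( T_{AB}(\mathbf{x}) \right) \approx \mathbf{x} \quad \text{for all voxels } \mathbf{x}, \]

    and inverse-consistent algorithms penalize or eliminate the residual \lVert T_{BA}(T_{AB}(\mathbf{x})) - \mathbf{x} \rVert over the image domain, yielding displacement vector fields that agree in both directions.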

  9. Salvus: A scalable software suite for full-waveform modelling & inversion

    Science.gov (United States)

    Afanasiev, M.; Boehm, C.; van Driel, M.; Krischer, L.; Fichtner, A.

    2017-12-01

    Full-waveform inversion (FWI), whether at the lab, exploration, or planetary scale, requires the cooperation of five principal components. (1) The geometry of the domain needs to be properly discretized and an initial guess of the model parameters must be projected onto it; (2) Large volumes of recorded waveform data must be collected, organized, and processed; (3) Synthetic waveform data must be efficiently and accurately computed through complex domains; (4) Suitable misfit functions and optimization techniques must be used to relate discrepancies in data space to perturbations in the model; and (5) Some form of workflow management must be employed to schedule and run (1)-(4) in the correct order. Each one of these components can represent a formidable technical challenge which redirects energy from the true task at hand: using FWI to extract new information about some underlying continuum. In this presentation we give an overview of the current status of the Salvus software suite, which was introduced to address the challenges listed above. Specifically, we touch on (1) salvus_mesher, which eases the discretization of complex Earth models into hexahedral meshes; (2) salvus_seismo, which integrates with LASIF and ObsPy to streamline the processing and preparation of seismic data; (3) salvus_wave, a high-performance and scalable spectral-element solver capable of simulating waveforms through general unstructured 2- and 3-D domains, and (4) salvus_opt, an optimization toolbox specifically designed for full-waveform inverse problems. Tying everything together, we also discuss (5) salvus_flow: a workflow package designed to orchestrate and manage the rest of the suite. It is our hope that these developments represent a step towards the automation of large-scale seismic waveform inversion, while also lowering the barrier of entry for new applications. We include several examples of Salvus' use in (extra-) planetary seismology, non-destructive testing, and medical

  10. Software for pipeline integrity administration

    Energy Technology Data Exchange (ETDEWEB)

    Soula, Gerardo; Perona, Lucas Fernandez [Gie SA., Buenos Aires (Argentina); Martinich, Carlos [Refinaria do Norte S. A. (REFINOR), Tartagal, Provincia de Salta (Argentina)

    2009-07-01

    Software for 'pipeline integrity management' was developed. It allows users to deal with geographical information and a PODS (Pipeline Open Data Standard) database simultaneously, in a simple and reliable way. The premises for the design were the following: didactic, geo-referenced, and supporting multiple reference systems. Program capabilities: 1. PODS+GIS: the PODS database on which the software is based is completely integrated with the GIS module. 2. Management of different kinds of information: it allows management of information on facilities, repairs, interventions, physical inspections, geographical characteristics, compliance with regulations, training, offline events, and operation measures, supports O and M information treatment, and imports specific data and studies in bulk. It also assures the integrity of the loaded information. 3. Right of way survey: it allows verification of the class location and ROW occupation, identification of sensitive areas, and management of landowners. 4. Risk analysis: performed in a qualitative way, depending on the entered data, allowing the user to identify the riskiest stretches of the system. Both the results from risk analysis and the data and queries made against the database can be exported to standard formats. (author)

  11. Model-integrating software components engineering flexible software systems

    CERN Document Server

    Derakhshanmanesh, Mahdi

    2015-01-01

    In his study, Mahdi Derakhshanmanesh builds on the state of the art in modeling by proposing to integrate models into running software at the component level without translating them to code. Such so-called model-integrating software exploits all the advantages of models: models implicitly support a good separation of concerns, they are self-documenting and thus improve understandability and maintainability, and in contrast to model-driven approaches there is no longer a synchronization problem between the models and the code generated from them. Using model-integrating components, software will be

  12. A comprehensive software suite for protein family construction and functional site prediction.

    Directory of Open Access Journals (Sweden)

    David Renfrew Haft

    In functionally diverse protein families, conservation in short signature regions may outperform full-length sequence comparisons for identifying proteins that belong to a subgroup within which one specific aspect of their function is conserved. The SIMBAL workflow (Sites Inferred by Metabolic Background Assertion Labeling) is a data-mining procedure for finding such signature regions. It begins by using clues from genomic context, such as co-occurrence or conserved gene neighborhoods, to build a useful training set from a large number of uncharacterized but mutually homologous proteins. When training set construction is successful, the YES partition is enriched in proteins that share function with the user's query sequence, while the NO partition is depleted. A selected query sequence is then mined for short signature regions whose closest matches overwhelmingly favor proteins from the YES partition. High-scoring signature regions typically contain key residues critical to functional specificity, so proteins with the highest sequence similarity across these regions tend to share the same function. The SIMBAL algorithm was described previously, but significant manual effort, expertise, and a supporting software infrastructure were required to prepare the requisite training sets. Here, we describe a new, distributable software suite that speeds up and simplifies the process of using SIMBAL, most notably by providing tools that automate training set construction. These tools have broad utility for comparative genomics, allowing for flexible collection of proteins or protein domains based on genomic context as well as homology, a capability that can greatly assist in protein family construction. Armed with this new software suite, SIMBAL can serve as a fast and powerful in silico alternative to direct experimentation for characterizing proteins and their functional interactions.
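
    The window-scoring idea described above (slide a window along the query and score each window by how strongly its closest matches favor the YES partition) can be sketched in a few lines of Python. This toy version substitutes exact substring matching for a real similarity search, and every name in it is hypothetical; it is not the SIMBAL implementation.

        def window_score(window, yes_seqs, no_seqs):
            """Laplace-smoothed fraction of matches that come from the YES set."""
            yes_hits = sum(window in s for s in yes_seqs)
            no_hits = sum(window in s for s in no_seqs)
            return (yes_hits + 1) / (yes_hits + no_hits + 2)

        def scan_query(query, yes_seqs, no_seqs, width=12):
            # Peaks in this profile mark candidate signature regions.
            return [(i, query[i:i + width],
                     window_score(query[i:i + width], yes_seqs, no_seqs))
                    for i in range(len(query) - width + 1)]

        yes_seqs = ["MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"]  # toy training partitions
        no_seqs = ["MSDNLQKVLIVDDHPLVREGL"]
        for pos, win, score in scan_query("AKQRQISFVKSH", yes_seqs, no_seqs):
            print(pos, win, round(score, 2))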

  13. Integrated Power, Avionics, and Software (IPAS) Flexible Systems Integration

    Data.gov (United States)

    National Aeronautics and Space Administration — The Integrated Power, Avionics, and Software (IPAS) facility is a flexible, multi-mission hardware and software design environment. This project will develop a...

  14. Automated integration of lidar into the LANDFIRE product suite

    Science.gov (United States)

    Birgit Peterson; Kurtis J. Nelson; Carl Seielstad; Jason Stoker; W. Matt Jolly; Russell Parsons

    2015-01-01

    Accurate information about three-dimensional canopy structure and wildland fuel across the landscape is necessary for fire behaviour modelling system predictions. Remotely sensed data are invaluable for assessing these canopy characteristics over large areas; lidar data, in particular, are uniquely suited for quantifying three-dimensional canopy structure. Although...

  15. A Software Suite for Testing SpaceWire Devices and Networks

    Science.gov (United States)

    Mills, Stuart; Parkes, Steve

    2015-09-01

    SpaceWire is a data-handling network for use on-board spacecraft, which connects together instruments, mass-memory, processors, downlink telemetry, and other on-board sub-systems. SpaceWire is simple to implement and has some specific characteristics that help it support data-handling applications in space: high-speed, low-power, simplicity, relatively low implementation cost, and architectural flexibility making it ideal for many space missions. SpaceWire provides high-speed (2 Mbits/s to 200 Mbits/s), bi-directional, full-duplex data-links, which connect together SpaceWire enabled equipment. Data-handling networks can be built to suit particular applications using point-to-point data-links and routing switches. STAR-Dundee’s STAR-System software stack has been designed to meet the needs of engineers designing and developing SpaceWire networks and devices. This paper describes the aims of the software and how those needs were met.

  16. CLMSVault: A Software Suite for Protein Cross-Linking Mass-Spectrometry Data Analysis and Visualization.

    Science.gov (United States)

    Courcelles, Mathieu; Coulombe-Huntington, Jasmin; Cossette, Émilie; Gingras, Anne-Claude; Thibault, Pierre; Tyers, Mike

    2017-07-07

    Protein cross-linking mass spectrometry (CL-MS) enables the sensitive detection of protein interactions and the inference of protein complex topology. The detection of chemical cross-links between protein residues can identify intra- and interprotein contact sites or provide physical constraints for molecular modeling of protein structure. Recent innovations in cross-linker design, sample preparation, mass spectrometry, and software tools have significantly improved CL-MS approaches. Although a number of algorithms now exist for the identification of cross-linked peptides from mass spectral data, a dearth of user-friendly analysis tools represent a practical bottleneck to the broad adoption of the approach. To facilitate the analysis of CL-MS data, we developed CLMSVault, a software suite designed to leverage existing CL-MS algorithms and provide intuitive and flexible tools for cross-platform data interpretation. CLMSVault stores and combines complementary information obtained from different cross-linkers and search algorithms. CLMSVault provides filtering, comparison, and visualization tools to support CL-MS analyses and includes a workflow for label-free quantification of cross-linked peptides. An embedded 3D viewer enables the visualization of quantitative data and the mapping of cross-linked sites onto PDB structural models. We demonstrate the application of CLMSVault for the analysis of a noncovalent Cdc34-ubiquitin protein complex cross-linked under different conditions. CLMSVault is open-source software (available at https://gitlab.com/courcelm/clmsvault.git), and a live demo is available at http://democlmsvault.tyerslab.com/.

  17. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique. The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc., all wrapped up in the scientific method. See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698). Incorporates more than 1,000 engaging problems with answers; includes more than 300 solved examples; uses varied problem-solving methods.

  18. The application of formal software engineering methods to the unattended and remote monitoring software suite at Los Alamos National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Determan, John Clifford [Los Alamos National Laboratory; Longo, Joseph F [Los Alamos National Laboratory; Michel, Kelly D [Los Alamos National Laboratory

    2009-01-01

    The Unattended and Remote Monitoring (UNARM) system is a collection of specialized hardware and software used by the International Atomic Energy Agency (IAEA) to institute nuclear safeguards at many nuclear facilities around the world. The hardware consists of detectors, instruments, and networked computers for acquiring various forms of data, including but not limited to radiation data, global position coordinates, camera images, isotopic data, and operator declarations. The software provides two primary functions: the secure and reliable collection of this data from the instruments and the ability to perform an integrated review and analysis of the disparate data sources. Several years ago the team responsible for maintaining the software portion of the UNARM system began the process of formalizing its operations. These formal operations include a configuration management system, a change control board, an issue tracking system, and extensive formal testing, for both functionality and reliability. Functionality is tested with formal test cases chosen to fully represent the data types and methods of analysis that will be commonly encountered. Reliability is tested with iterative, concurrent testing where up to five analyses are executed simultaneously for thousands of cycles. Iterative concurrent testing helps ensure that there are no resource conflicts or leaks when multiple system components are in use simultaneously. The goal of this work is to provide a high quality, reliable product, commensurate with the criticality of the application. Testing results will be presented that demonstrate that this goal has been achieved, along with the impact that the introduction of a formal software engineering framework has had on the UNARM product.

  19. CCP4 Software Suite: history, evolution, content, challenges and future developments

    Directory of Open Access Journals (Sweden)

    Krissinel, Eugene

    2015-04-01

    Collaborative Computational Project Number 4 (CCP4) in Protein Crystallography is a public resource for producing and supporting a world-leading, integrated suite of programs that allows researchers to determine macromolecular structures by X-ray crystallography and other biophysical techniques. CCP4 supports the widest possible researcher community, embracing academic, not-for-profit, and for-profit research. The primary aims of CCP4 include the development, and support of the development, of cutting-edge approaches to the experimental determination and analysis of protein structure, and their integration into the suite for worldwide dissemination. In addition, CCP4 plays an important role in the education and training of scientists in experimental structural biology. In this paper, we overview CCP4's 35-year-long history and the (technical) milestones of its evolution. We will also consider how the particular structure of the CCP4 Suite and Collaboration has emerged, its main functionality, and its current state and plans for the future.

  20. Personal computer security: part 1. Firewalls, antivirus software, and Internet security suites.

    Science.gov (United States)

    Caruso, Ronald D

    2003-01-01

    Personal computer (PC) security in the era of the Health Insurance Portability and Accountability Act of 1996 (HIPAA) involves two interrelated elements: safeguarding the basic computer system itself and protecting the information it contains and transmits, including personal files. HIPAA regulations have toughened the requirements for securing patient information, requiring every radiologist with such data to take further precautions. Security starts with physically securing the computer. Account passwords and a password-protected screen saver should also be set up. A modern antivirus program can easily be installed and configured. File scanning and updating of virus definitions are simple processes that can largely be automated and should be performed at least weekly. A software firewall is also essential for protection from outside intrusion, and an inexpensive hardware firewall can provide yet another layer of protection. An Internet security suite yields additional safety. Regular updating of the security features of installed programs is important. Obtaining a moderate degree of PC safety and security is somewhat inconvenient but is necessary and well worth the effort. Copyright RSNA, 2003

  1. Automated integration of lidar into the LANDFIRE product suite

    Science.gov (United States)

    Peterson, Birgit; Nelson, Kurtis; Seielstad, Carl; Stoker, Jason M.; Jolly, W. Matt; Parsons, Russell

    2015-01-01

    Accurate information about three-dimensional canopy structure and wildland fuel across the landscape is necessary for fire behaviour modelling system predictions. Remotely sensed data are invaluable for assessing these canopy characteristics over large areas; lidar data, in particular, are uniquely suited for quantifying three-dimensional canopy structure. Although lidar data are increasingly available, they have rarely been applied to wildland fuels mapping efforts, mostly due to two issues. First, the Landscape Fire and Resource Planning Tools (LANDFIRE) program, which has become the default source of large-scale fire behaviour modelling inputs for the US, does not currently incorporate lidar data into the vegetation and fuel mapping process because spatially continuous lidar data are not available at the national scale. Second, while lidar data are available for many land management units across the US, these data are underutilized for fire behaviour applications. This is partly due to a lack of local personnel trained to process and analyse lidar data. This investigation addresses these issues by developing the Creating Hybrid Structure from LANDFIRE/lidar Combinations (CHISLIC) tool. CHISLIC allows individuals to automatically generate a suite of vegetation structure and wildland fuel parameters from lidar data and infuse them into existing LANDFIRE data sets. CHISLIC will become available for wider distribution to the public through a partnership with the U.S. Forest Service’s Wildland Fire Assessment System (WFAS) and may be incorporated into the Wildland Fire Decision Support System (WFDSS) with additional design and testing. WFAS and WFDSS are the primary systems used to support tactical and strategic wildland fire management decisions.

  2. EpiTools, A software suite for presurgical brain mapping in epilepsy: Intracerebral EEG.

    Science.gov (United States)

    Medina Villalon, S; Paz, R; Roehri, N; Lagarde, S; Pizzo, F; Colombet, B; Bartolomei, F; Carron, R; Bénar, C-G

    2018-03-29

    In pharmacoresistant epilepsy, exploration with depth electrodes can be needed to precisely define the epileptogenic zone. Accurate location of these electrodes is thus essential for the interpretation of stereotaxic EEG (SEEG) signals. As SEEG analysis increasingly relies on signal processing, it is crucial to link these results to the patient's anatomy. Our aims were thus to develop a suite of software tools, called "EpiTools", able to i) precisely and automatically localize the position of each SEEG contact and ii) display the results of signal analysis within each patient's anatomy. The first tool, GARDEL (GUI for Automatic Registration and Depth Electrode Localization), automatically localizes SEEG contacts and labels each contact according to a pre-specified nomenclature (for instance that of FreeSurfer or MarsAtlas). The second tool, 3Dviewer, enables visualization, within the patient's 3D anatomy, of the origin of signal-processing results such as rates of biomarkers, connectivity graphs or the Epileptogenicity Index. GARDEL was validated in 30 patients by clinicians and proved highly reliable in determining the actual location of contacts within each patient's individual anatomy. GARDEL is a fully automatic electrode localization tool needing limited user interaction (only for electrode naming or contact correction). The 3Dviewer reads signal-processing results and displays them in relation to the patient's anatomy. EpiTools can help speed up the interpretation of SEEG data and improve its precision. Copyright © 2018 Elsevier B.V. All rights reserved.

  3. Software extension and integration with type classes

    DEFF Research Database (Denmark)

    Lämmel, Ralf; Ostermann, Klaus

    2006-01-01

    expressiveness, by using the language concept of type classes, as it is available in the functional programming language Haskell. A detailed comparison with related work shows that type classes provide a powerful framework in which solutions to known software extension and integration problems can be provided. We also pinpoint several limitations of type classes in this context.

  4. Integrating existing software toolkits into VO system

    Science.gov (United States)

    Cui, Chenzhou; Zhao, Yong-Heng; Wang, Xiaoqian; Sang, Jian; Luo, Ze

    2004-09-01

    Virtual Observatory (VO) is a collection of interoperating data archives and software tools. Taking advantage of the latest information technologies, it aims to provide a data-intensive online research environment for astronomers all around the world. A large number of high-quality astronomical software packages and libraries are powerful and easy to use, and have been widely used by astronomers for many years. Integrating those toolkits into the VO system is a necessary and important task for VO developers. VO architecture greatly depends on Grid and Web services; consequently, the general VO integration route is "Java Ready - Grid Ready - VO Ready". In the paper, we discuss the importance of VO integration for existing toolkits and the possible solutions. We introduce two efforts in this field from the China-VO project, "gImageMagick" and "Galactic abundance gradients statistical research under grid environment". We also discuss what additional work should be done to convert a Grid service into a VO service.
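
    A minimal sketch of the wrapping idea behind such integration efforts: expose a legacy command-line toolkit through a web interface so it can later be registered as a Grid or VO service. The use of ImageMagick's convert command echoes the gImageMagick effort mentioned above; the port and the exact endpoint are assumptions.

      # Sketch: wrap an existing command-line tool behind a tiny HTTP service,
      # the first step of a "Java Ready - Grid Ready - VO Ready" migration.
      # Assumes ImageMagick's "convert" is installed; any CLI tool would do.
      import subprocess
      from http.server import BaseHTTPRequestHandler, HTTPServer

      class ToolkitHandler(BaseHTTPRequestHandler):
          def do_GET(self):
              # Run the legacy tool and return its version string as proof of life.
              result = subprocess.run(["convert", "-version"],
                                      capture_output=True, text=True)
              self.send_response(200)
              self.send_header("Content-Type", "text/plain")
              self.end_headers()
              self.wfile.write(result.stdout.encode())

      if __name__ == "__main__":
          HTTPServer(("localhost", 8080), ToolkitHandler).serve_forever()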

  5. Model-driven design of simulation support for the TERRA robot software tool suite

    NARCIS (Netherlands)

    Lu, Zhou; Bezemer, M.M.; Broenink, Johannes F.

    2015-01-01

    Model-Driven Development (MDD) – based on the concepts of model, meta-model and model transformation – is an approach to develop predictable and reliable software for Cyber-Physical Systems (CPS). The work presented here concerns a methodology to design simulation software based on MDD techniques,

  6. EarthCollab, building geoscience-centric implementations of the VIVO semantic software suite

    Science.gov (United States)

    Rowan, L. R.; Gross, M. B.; Mayernik, M. S.; Daniels, M. D.; Krafft, D. B.; Kahn, H. J.; Allison, J.; Snyder, C. B.; Johns, E. M.; Stott, D.

    2017-12-01

    EarthCollab, an EarthCube Building Block project, is extending an existing open-source semantic web application, VIVO, to enable the exchange of information about scientific researchers and resources across institutions. EarthCollab is a collaboration between UNAVCO, a geodetic facility and consortium that supports diverse research projects informed by geodesy, The Bering Sea Project, an interdisciplinary field program whose data archive is hosted by NCAR's Earth Observing Laboratory, and Cornell University. VIVO has been implemented by more than 100 universities and research institutions to highlight research and institutional achievements. This presentation will discuss benefits and drawbacks of working with and extending open source software. Some extensions include plotting georeferenced objects on a map, a mobile-friendly theme, integration of faceting via Elasticsearch, extending the VIVO ontology to capture geoscience-centric objects and relationships, and the ability to cross-link between VIVO instances. Most implementations of VIVO gather information about a single organization. The EarthCollab project created VIVO extensions to enable cross-linking of VIVO instances to reduce the amount of duplicate information about the same people and scientific resources and to enable dynamic linking of related information across VIVO installations. As the list of customizations grows, so does the effort required to maintain compatibility between the EarthCollab forks and the main VIVO code. For example, dozens of libraries and dependencies were updated prior to the VIVO v1.10 release, which introduced conflicts in the EarthCollab cross-linking code. The cross-linking code has, however, been developed to enable sharing of data across different versions of VIVO, using a JSON output schema standardized across versions. We will outline lessons learned in working with VIVO and its open source dependencies, which include Jena, Solr, Freemarker, and jQuery, and discuss future

  7. Integrating interface slicing into software engineering processes

    Science.gov (United States)

    Beck, Jon

    1993-01-01

    Interface slicing is a tool which was developed to facilitate software engineering. As previously presented, it was described in terms of its techniques and mechanisms. The integration of interface slicing into specific software engineering activities is considered by discussing a number of potential applications of interface slicing. The applications discussed specifically address the problems, issues, or concerns raised in a previous project. Because a complete interface slicer is still under development, these applications must be phrased in the future tense. Nonetheless, the interface slicing techniques which were presented can be implemented using current compiler and static analysis technology. Whether implemented as a standalone tool or as a module in an integrated development or reverse engineering environment, they require analysis no more complex than that required for current system development environments. By contrast, conventional slicing is a methodology which, while showing much promise and intuitive appeal, has yet to be fully implemented in a production language environment despite 12 years of development.

  8. Integration of software for scenario exploration

    International Nuclear Information System (INIS)

    Oyamada, Kiyoshi; Ikeda, Takao

    1999-03-01

    The scenario exploration methodology using shadow models is a variation of the environmental simulation method. A key aspect of scenario exploration is the use of shadow models, which do not correspond to any specific assumptions about physical processes but instead abstract, in a general manner, the features relevant to their effects on nuclide transport, so that the benefit of the simulation approach can be maximized. In developing the shadow models, all the modelling options that have not yet been ruled out by the experts are kept and parametrized in a very general framework. This, in turn, enables one to treat the various types of uncertainty in performance assessment, i.e., scenario uncertainty, conceptual model uncertainty, mathematical model uncertainty and parameter uncertainty, in a common framework of uncertainty/sensitivity analysis. The objective of the current study is to review and modify the tools, which had been developed separately and were therefore not fully consistent with one another, and to integrate them into a unified methodology and software. The tasks are: 1. modification and integration of the tools for scenario exploration of nuclide transport in the EBS and the near-field host rock; 2. verification of the modified and integrated software; 3. installation of the software at JNC. (author)

  9. APMS: An Integrated Suite of Tools for Measuring Performance and Safety

    Science.gov (United States)

    Statler, Irving C.; Lynch, Robert E.; Connors, Mary M. (Technical Monitor)

    1997-01-01

    statistical evaluation of the performance of large groups of flights. This paper describes the integrated suite of tools that will assist analysts in evaluating the operational performance and safety of the national air transport system, the air carrier, and the air crew.

  10. The Aviation Performance Measuring System (APMS): An Integrated Suite of Tools for Measuring Performance and Safety

    Science.gov (United States)

    Statler, Irving C.; Connor, Mary M. (Technical Monitor)

    1998-01-01

    statistical evaluation of the performance of large groups of flights. This paper describes the integrated suite of tools that will assist analysts in evaluating the operational performance and safety of the national air transport system, the air carrier, and the aircrew.

  11. MASH Suite: a user-friendly and versatile software interface for high-resolution mass spectrometry data interpretation and visualization.

    Science.gov (United States)

    Guner, Huseyin; Close, Patrick L; Cai, Wenxuan; Zhang, Han; Peng, Ying; Gregorich, Zachery R; Ge, Ying

    2014-03-01

    The rapid advancements in mass spectrometry (MS) instrumentation, particularly in Fourier transform (FT) MS, have made the acquisition of high-resolution and high-accuracy mass measurements routine. However, the software tools for the interpretation of high-resolution MS data are underdeveloped. Although several algorithms for the automatic processing of high-resolution MS data are available, there is still an urgent need for a user-friendly interface with functions that allow users to visualize and validate the computational output. Therefore, we have developed MASH Suite, a user-friendly and versatile software interface for processing high-resolution MS data. MASH Suite contains a wide range of features that allow users to easily navigate through data analysis, visualize complex high-resolution MS data, and manually validate automatically processed results. Furthermore, it provides easy, fast, and reliable interpretation of top-down, middle-down, and bottom-up MS data. MASH Suite is convenient, easily operated, and freely available. It can greatly facilitate the comprehensive interpretation and validation of high-resolution MS data with high accuracy and reliability.

  12. DelPhi: a comprehensive suite for DelPhi software and associated resources

    Directory of Open Access Journals (Sweden)

    Li Lin

    2012-05-01

    Background: Accurate modeling of electrostatic potential and corresponding energies becomes increasingly important for understanding properties of biological macromolecules and their complexes. However, this is not an easy task due to the irregular shape of biological entities and the presence of water and mobile ions. Results: Here we report a comprehensive suite for the well-known Poisson-Boltzmann solver, DelPhi, enriched with additional features to facilitate DelPhi usage. The suite allows for easy download of both DelPhi executable files and source code along with a makefile for local installations. Users can obtain the DelPhi manual and the parameter files required for the corresponding investigation. Non-experienced researchers can download examples containing all necessary data to carry out DelPhi runs on a set of selected examples illustrating various DelPhi features and demonstrating DelPhi's accuracy against analytical solutions. Conclusions: The DelPhi suite offers not only the DelPhi executable and source files, examples and parameter files, but also provides links to third-party resources that either utilize DelPhi or provide plugins for DelPhi. In addition, users and developers are offered a forum to share ideas, resolve issues, report bugs and seek help with respect to the DelPhi package. The resource is available free of charge for academic users from URL: http://compbio.clemson.edu/DelPhi.php.
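
    For illustration, the finite-difference idea at the core of Poisson solvers such as DelPhi can be sketched on the plain 2-D Poisson equation with Jacobi iteration. A real Poisson-Boltzmann solver adds spatially varying dielectrics, mobile-ion terms and molecular surfaces, so this is a didactic simplification under those stated assumptions, not DelPhi's algorithm.

      # Didactic sketch: Jacobi iteration for  laplacian(phi) = -charge  on a
      # square grid with zero Dirichlet boundaries and a uniform dielectric.
      import numpy as np

      def solve_poisson(charge, h=1.0, iterations=2000):
          phi = np.zeros_like(charge)
          for _ in range(iterations):
              # Each interior point becomes the average of its neighbours
              # plus the local source term.
              phi[1:-1, 1:-1] = 0.25 * (phi[:-2, 1:-1] + phi[2:, 1:-1] +
                                        phi[1:-1, :-2] + phi[1:-1, 2:] +
                                        h * h * charge[1:-1, 1:-1])
          return phi

      rho = np.zeros((65, 65))
      rho[32, 32] = 1.0                  # a single point charge in the centre
      phi = solve_poisson(rho)
      print(phi[32, 28:37].round(4))     # potential falls off around the charge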

  13. Asset management: integrated software optimizes production performance

    Energy Technology Data Exchange (ETDEWEB)

    Polczer, S.

    1998-06-01

    Two new multi-dimensional databases, which expand the 'row and column' concept of spreadsheets into multiple categories of data called dimensions, are described. These integrated software packages provide the foundation for industry players such as Poco Petroleum Ltd and Numac Energy Inc to gain a competitive advantage, by overhauling their respective data collection and retrieval systems to allow for timely cost analysis and financial reporting. Energy Warehouse, an on-line analytical processing product marketed by SysGold Ltd, is one of the software products described. It gathers various sources of information, allows advanced searches and generates reports previously unavailable in other conventional financial accounting systems. The second product discussed - the Canadian Upstream Energy System (CUES) - is an on-line analytical processing system developed by Oracle Corporation and Calgary-based Applied Terravision Systems (ATS) Inc. CUES combines Oracle's universal data server and software development tools with ATS's upstream financial, land, geotechnical and production applications. The software also allows for optimization of facilities, analysis of production efficiencies and comparison of performance against industry standards.

  14. Integration and validation of a data grid software

    Science.gov (United States)

    Carenton-Madiec, Nicolas; Berger, Katharina; Cofino, Antonio

    2014-05-01

    The Earth System Grid Federation (ESGF) Peer-to-Peer (P2P) is a software infrastructure for the management, dissemination, and analysis of model output and observational data. The ESGF grid is composed of several types of nodes, which have different roles. About 40 data nodes host model outputs and datasets using THREDDS catalogs. About 25 compute nodes offer remote visualization and analysis tools. About 15 index nodes crawl data node catalogs and implement faceted and federated search in a web interface. About 15 identity provider nodes manage accounts, authentication and authorization. Here we present a full-size test federation spread across different institutes in different countries, together with a Python test suite, both started in December 2013. The first objective of the test suite is to provide a simple tool that helps to test and validate a single data node and its closest index, compute and identity provider peers. The next objective will be to run this test suite on every data node of the federation and therefore test and validate every single node of the whole federation. The suite already uses the nosetests, requests, myproxy-logon, subprocess, selenium and fabric Python libraries in order to test web front ends, back ends and security services. The goal of this project is to improve the quality of deliverables in the context of a small development team whose members are widely spread around the world, working collaboratively and without hierarchy. This working organization highlighted the need for a federated integration, test and validation process.
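
    A sketch of the kind of node check such a suite performs, using the requests library named above. The node URL is a placeholder assumption; real deployments would point the test at an actual data or index node.

      # Sketch of a nose-style node check: a data node should serve its
      # THREDDS catalog over HTTPS. The host name below is hypothetical.
      import requests

      DATA_NODE = "https://esgf-data.example.org"   # placeholder node URL

      def test_thredds_catalog_reachable():
          response = requests.get(DATA_NODE + "/thredds/catalog.xml", timeout=30)
          assert response.status_code == 200
          assert "catalog" in response.text.lower()

      if __name__ == "__main__":
          test_thredds_catalog_reachable()
          print("data node catalog check passed")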

  15. Real-time fluoroscopic needle guidance in the interventional radiology suite using navigational software for percutaneous bone biopsies in children

    Energy Technology Data Exchange (ETDEWEB)

    Shellikeri, Sphoorti; Srinivasan, Abhay; Krishnamurthy, Ganesh; Vatsky, Seth; Zhu, Xiaowei; Keller, Marc S.; Cahill, Anne Marie [The Children's Hospital of Philadelphia, Department of Radiology, Philadelphia, PA (United States); Setser, Randolph M. [Siemens Medical Solutions USA, Inc., Hoffman Estates, IL (United States); Hwang, Tiffany J. [University of Southern California, Keck School of Medicine, Los Angeles, CA (United States); Girard, Erin [Siemens Medical Solutions USA, Inc., Princeton, NJ (United States)

    2017-07-15

    Navigational software provides real-time fluoroscopic needle guidance for percutaneous procedures in the Interventional Radiology (IR) suite. We describe our experience with navigational software for pediatric percutaneous bone biopsies in the IR suite and compare technical success, diagnostic accuracy, radiation dose and procedure time with those of CT-guided biopsies. Pediatric bone biopsies performed using navigational software (Syngo iGuide, Siemens Healthcare) from 2011 to 2016 were prospectively included and anatomically matched CT-guided bone biopsies from 2008 to 2016 were retrospectively reviewed with institutional review board approval. C-arm CT protocols used for navigational software-assisted cases included institution-developed low-dose (0.1/0.17 μGy/projection), regular-dose (0.36 μGy/projection), or a combination of low-dose/regular-dose protocols. Estimated effective radiation dose and procedure times were compared between software-assisted and CT-guided biopsies. Twenty-six patients (15 male; mean age: 10 years) underwent software-assisted biopsies (15 pelvic, 7 lumbar and 4 lower extremity) and 33 patients (13 male; mean age: 9 years) underwent CT-guided biopsies (22 pelvic, 7 lumbar and 4 lower extremity). Biopsies with both modalities had a 100% technical success rate. Twenty-five of 26 (96%) software-assisted and 29/33 (88%) CT-guided biopsies were diagnostic. Overall, the effective radiation dose was significantly lower in software-assisted than CT-guided cases (3.0±3.4 vs. 6.6±7.7 mSv, P=0.02). The effective dose difference was most dramatic in software-assisted cases using low-dose C-arm CT (1.2±1.8 vs. 6.6±7.7 mSv, P=0.001) or combined low-dose/regular-dose C-arm CT (1.9±2.4 vs. 6.6±7.7 mSv, P=0.04), whereas effective dose was comparable in software-assisted cases using regular-dose C-arm CT (6.0±3.5 vs. 6.6±7.7 mSv, P=0.7). Mean procedure time was significantly lower for software-assisted cases (91±54 vs. 141±68 min, P=0

  16. A Methodology for Integrating Maintainability Using Software Metrics

    OpenAIRE

    Lewis, John A.; Henry, Sallie M.

    1989-01-01

    Maintainability must be integrated into software early in the development process. But for practical use, the techniques used must be as unobtrusive to the existing software development process as possible. This paper defines a methodology for integrating maintainability into large-scale software and describes an experiment which implemented the methodology into a major commercial software development environment.

  17. An integrative variant analysis suite for whole exome next-generation sequencing data

    Directory of Open Access Journals (Sweden)

    Challis Danny

    2012-01-01

    Background: Whole exome capture sequencing allows researchers to cost-effectively sequence the coding regions of the genome. Although exome capture sequencing methods have become routine and well established, there is currently a lack of tools specialized for variant calling in this type of data. Results: Using statistical models trained on validated whole-exome capture sequencing data, the Atlas2 Suite is an integrative variant analysis pipeline optimized for variant discovery on all three of the widely used next-generation sequencing platforms (SOLiD, Illumina, and Roche 454). The suite employs logistic regression models in conjunction with user-adjustable cutoffs to accurately separate true SNPs and INDELs from sequencing and mapping errors with high sensitivity (96.7%). Conclusion: We have implemented the Atlas2 Suite and applied it to 92 whole exome samples from the 1000 Genomes Project. The Atlas2 Suite is available for download at http://sourceforge.net/projects/atlas2/. In addition to a command-line version, the suite has been integrated into the Genboree Workbench, allowing biomedical scientists with minimal informatics expertise to remotely call, view, and further analyze variants through a simple web interface. The existing genomic databases displayed via the Genboree browser also streamline the process from variant discovery to functional genomics analysis, resulting in an off-the-shelf toolkit for the broader community.
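
    To illustrate the logistic-regression filtering described above, here is a minimal sketch in which a logistic model scores each candidate variant from a few site-level features and a user-adjustable cutoff separates calls from errors. The feature names and weights are invented for illustration; they are not the published Atlas2 coefficients.

      # Sketch of logistic-regression variant filtering with an adjustable
      # cutoff. Features per site: read depth, allele fraction, strand bias.
      import numpy as np

      WEIGHTS = np.array([0.08, 2.5, -1.2])   # invented illustrative weights
      INTERCEPT = -3.0

      def variant_probability(features):
          z = INTERCEPT + features @ WEIGHTS
          return 1.0 / (1.0 + np.exp(-z))     # logistic function

      def call_variants(candidates, cutoff=0.5):
          """Keep candidates whose probability exceeds the user's cutoff."""
          probs = variant_probability(candidates)
          return probs, probs >= cutoff

      sites = np.array([[40, 0.48, 0.1],      # well-supported heterozygote
                        [12, 0.10, 0.9]])     # likely sequencing/mapping error
      print(call_variants(sites, cutoff=0.5)) # first kept, second rejected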

  18. The PyRosetta Toolkit: a graphical user interface for the Rosetta software suite.

    Science.gov (United States)

    Adolf-Bryfogle, Jared; Dunbrack, Roland L

    2013-01-01

    The Rosetta Molecular Modeling suite is a command-line-only collection of applications that enable high-resolution modeling and design of proteins and other molecules. Although extremely useful, Rosetta can be difficult to learn for scientists with little computational or programming experience. To that end, we have created a Graphical User Interface (GUI) for Rosetta, called the PyRosetta Toolkit, for creating and running protocols in Rosetta for common molecular modeling and protein design tasks and for analyzing the results of Rosetta calculations. The program is highly extensible so that developers can add new protocols and analysis tools to the PyRosetta Toolkit GUI.

  19. Integrating open-source software applications to build molecular dynamics systems.

    Science.gov (United States)

    Allen, Bruce M; Predecki, Paul K; Kumosa, Maciej

    2014-04-05

    Three open-source applications, NanoEngineer-1, packmol, and mis2lmp are integrated using an open-source file format to quickly create molecular dynamics (MD) cells for simulation. The three software applications collectively make up the open-source software (OSS) suite known as MD Studio (MDS). The software is validated through software engineering practices and is verified through simulation of the diglycidyl ether of bisphenol-a and isophorone diamine (DGEBA/IPD) system. Multiple simulations are run using the MDS software to create MD cells, and the data generated are used to calculate density, bulk modulus, and glass transition temperature of the DGEBA/IPD system. Simulation results compare well with published experimental and numerical results. The MDS software prototype confirms that OSS applications can be analyzed against real-world research requirements and integrated to create a new capability. Copyright © 2014 Wiley Periodicals, Inc.

  20. SAGES: a suite of freely-available software tools for electronic disease surveillance in resource-limited settings.

    Directory of Open Access Journals (Sweden)

    Sheri L Lewis

    Public health surveillance is undergoing a revolution driven by advances in the field of information technology. Many countries have experienced vast improvements in the collection, ingestion, analysis, visualization, and dissemination of public health data. Resource-limited countries have lagged behind due to challenges in information technology infrastructure, public health resources, and the costs of proprietary software. The Suite for Automated Global Electronic bioSurveillance (SAGES) is a collection of modular, flexible, freely-available software tools for electronic disease surveillance in resource-limited settings. One or more SAGES tools may be used in concert with existing surveillance applications, or the SAGES tools may be used en masse for an end-to-end biosurveillance capability. This flexibility allows for the development of an inexpensive, customized, and sustainable disease surveillance system. The ability to rapidly assess anomalous disease activity may lead to more efficient use of limited resources and better compliance with World Health Organization International Health Regulations.

  1. Asset management -- Integrated software optimizes production performance

    International Nuclear Information System (INIS)

    Polczer, S.

    1998-01-01

    Developments in data collection and retrieval systems to allow timely cost analysis, financial reporting and production management are discussed. One of the most important new OLAP (on-line analytical processing) products is Energy Warehouse which gathers field information from various sources, allows advanced searches, and generates reports previously unavailable in other conventional financial accounting systems. Another OLAP-based system, the Canadian Upstream Energy System (CUES), was developed by the Oracle Corporation and the Calgary-based Applied Terravision Systems (ATS) Inc. CUES combines Oracle's universal data server software development tools with ATS's upstream financial, land, geotechnical and production applications. ATS also developed a product called IDPMARS (Integrated Daily Production Management Accounting Reporting System). It interfaces with CUES to link working interests, government royalties, administration, facility charges, lifting costs, transportation tooling, and customers by integrating field data collection systems with financial accounting

  2. Asset management -- Integrated software optimizes production performance

    Energy Technology Data Exchange (ETDEWEB)

    Polczer, S.

    1998-10-01

    Developments in data collection and retrieval systems to allow timely cost analysis, financial reporting and production management are discussed. One of the most important new OLAP (on-line analytical processing) products is Energy Warehouse which gathers field information from various sources, allows advanced searches, and generates reports previously unavailable in other conventional financial accounting systems. Another OLAP-based system, the Canadian Upstream Energy System (CUES), was developed by the Oracle Corporation and the Calgary-based Applied Terravision Systems (ATS) Inc. CUES combines Oracle's universal data server software development tools with ATS's upstream financial, land, geotechnical and production applications. ATS also developed a product called IDPMARS (Integrated Daily Production Management Accounting Reporting System). It interfaces with CUES to link working interests, government royalties, administration, facility charges, lifting costs, transportation tooling, and customers by integrating field data collection systems with financial accounting.

  3. Approaches in highly parameterized inversion—PEST++ Version 3, a Parameter ESTimation and uncertainty analysis software suite optimized for large environmental models

    Science.gov (United States)

    Welter, David E.; White, Jeremy T.; Hunt, Randall J.; Doherty, John E.

    2015-09-18

    The PEST++ Version 1 object-oriented parameter estimation code is here extended to Version 3 to incorporate additional algorithms and tools to further improve support for large and complex environmental modeling problems. PEST++ Version 3 includes the Gauss-Marquardt-Levenberg (GML) algorithm for nonlinear parameter estimation, Tikhonov regularization, integrated linear-based uncertainty quantification, options for integrated TCP/IP-based parallel run management or external independent run management through a Version 2 update of the GENIE Version 1 software code, and utilities for global sensitivity analyses. The Version 3 code design is consistent with PEST++ Version 1 and continues to be designed to lower the barriers to entry for users as well as developers while providing efficient and optimized algorithms capable of accommodating large, highly parameterized inverse problems. As such, this effort continues the original focus of (1) implementing the most popular and powerful features of the PEST software suite in a fashion that is easy for novice or experienced modelers to use and (2) developing a software framework that is easy to extend.
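
    The GML iteration at the heart of such codes can be sketched on a toy exponential model. The model, data and fixed Marquardt damping below are illustrative assumptions; PEST++ itself wraps arbitrary environmental models and adds regularization and parallel run management.

      # Compact sketch of a Gauss-Marquardt-Levenberg iteration for nonlinear
      # least squares, fitting y = a * exp(b * x) to noiseless synthetic data.
      import numpy as np

      def gml_fit(x, obs, theta, lam=1e-2, steps=50):
          for _ in range(steps):
              pred = theta[0] * np.exp(theta[1] * x)
              resid = obs - pred
              # Jacobian of predictions with respect to the two parameters
              J = np.column_stack([np.exp(theta[1] * x),
                                   theta[0] * x * np.exp(theta[1] * x)])
              A = J.T @ J + lam * np.eye(2)   # Marquardt-damped normal matrix
              theta = theta + np.linalg.solve(A, J.T @ resid)
          return theta

      x = np.linspace(0, 1, 20)
      obs = 2.0 * np.exp(-1.5 * x)            # "observations" from (2, -1.5)
      print(gml_fit(x, obs, theta=np.array([1.0, -0.5])))  # converges near truth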

  4. Integrated modeling of software cost and quality

    International Nuclear Information System (INIS)

    Rone, K.Y.; Olson, K.M.

    1994-01-01

    In modeling the cost and quality of software systems, the relationship between cost and quality must be considered. This explicit relationship is dictated by the criticality of the software being developed. The balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and the developers with respect to the processes being employed

  5. Software features and applications in process design, integration and operation

    Energy Technology Data Exchange (ETDEWEB)

    Dhole, V. [Aspen Tech Limited, Warrington (United Kingdom)

    1999-02-01

    Process engineering technologies and tools have evolved rapidly over the last twenty years. Process simulation/modeling, advanced process control, on-line optimisation, production planning and supply chain management are some examples of technologies that have rapidly matured from early commercial prototypes and concepts to established tools with a significant impact on the profitability of the process industry today. Process Synthesis or Process Integration (PI), in comparison, is yet to create its impact and still remains largely in the domain of a few expert users. One of the key reasons why PI has not taken off is that PI tools have not become integral components of the standard process engineering environments. Over the last 15 years AspenTech has grown from a small process simulation tool provider to a large multinational company providing a complete suite of process engineering technologies and services covering process design, operation, planning and supply chain management. Throughout this period, AspenTech has acquired experience in taking rapidly evolving technologies from their early prototype stage to mature products and services. The paper outlines AspenTech's strategy of integrating PI with other more established process design and operational improvement technologies. The paper illustrates the key elements of AspenTech's strategy via examples of software development initiatives and services projects. The paper also outlines AspenTech's future vision of the role of PI in process engineering. (au)

  6. Integrated Suit Test 1 - A Study to Evaluate Effects of Suit Weight, Pressure, and Kinematics on Human Performance during Lunar Ambulation

    Science.gov (United States)

    Gernhardt, Michael L.; Norcross, Jason; Vos, Jessica R.

    2008-01-01

    In an effort to design the next generation Lunar suit, NASA has initiated a series of tests aimed at understanding the human physiological and biomechanical effects of space suits under a variety of conditions. The first of these tests was the EVA Walkback Test (ICES 2007-01-3133). NASA-JSC assembled a multi-disciplinary team to conduct the second test of the series, titled Integrated Suit Test 1 (IST-1), from March 6 through July 24, 2007. Similar to the Walkback Test, this study was performed with the Mark III (MKIII) EVA Technology Demonstrator suit, a treadmill, and the Partial Gravity Simulator in the Space Vehicle Mock-Up Facility at Johnson Space Center. The data collected for IST-1 included metabolic rates, ground reaction forces, biomechanics, and subjective workload and controllability feedback on both suited and unsuited (shirt-sleeve) astronaut subjects. For IST-1 the center of gravity was controlled to a nearly perfect position while the weight, pressure and biomechanics (waist locked vs. unlocked) were varied individually to evaluate the effects of each on the ability to perform level (0 degree incline) ambulation in simulated Lunar gravity. The detailed test methodology and preliminary key findings of IST-1 are summarized in this report.

  7. On the Prospects and Concerns of Integrating Open Source Software Environment in Software Engineering Education

    Science.gov (United States)

    Kamthan, Pankaj

    2007-01-01

    Open Source Software (OSS) has introduced a new dimension in software community. As the development and use of OSS becomes prominent, the question of its integration in education arises. In this paper, the following practices fundamental to projects and processes in software engineering are examined from an OSS perspective: project management;…

  8. Integrating Behaviour in Software Models: An Event Coordination Notation

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2011-01-01

    One of the main problems in model-based software engineering is modelling behaviour in such a way that the behaviour models can be easily integrated with each other, with the structural software models and with pre-existing software. In this paper, we propose an event coordination notation (ECNO)...

  9. Integrating Usability Evaluations into the Software Development Process

    DEFF Research Database (Denmark)

    Lizano, Fulvio

    This thesis addresses the integration of usability evaluations into the software development process. The integration here is contextualized in terms of how to include usability evaluation as an activity in the software development lifecycle. Even though usability evaluations are considered relevant and strategic human–computer interaction (HCI) activities in the software development process, there are obstacles that limit the complete, effective and efficient integration of this kind of testing into the software development process. Two main obstacles are the cost of usability evaluations and the software developers' resistance to accepting users' opinions regarding the lack of usability in their software systems. The 'cost obstacle' refers to the constraint of conducting usability evaluations in the software process due to the significant amount of resources required by this type of testing.

  10. Requirements Engineering for Software Integrity and Safety

    Science.gov (United States)

    Leveson, Nancy G.

    2002-01-01

    Requirements flaws are the most common cause of errors and software-related accidents in operational software. Most aerospace firms list requirements as one of their most important outstanding software development problems, and all of the recent NASA spacecraft losses related to software (including the highly publicized Mars Program failures) can be traced to requirements flaws. In light of these facts, it is surprising that relatively little research is devoted to requirements in contrast with other software engineering topics. The research proposed built on our previous work, including both criteria for determining whether a requirements specification is acceptably complete and a new approach to structuring system specifications called Intent Specifications. This grant was to fund basic research on how these ideas could be extended to leverage innovative approaches to the problems of (1) reducing the impact of changing requirements, (2) finding requirements specification flaws early through formal and informal analysis, and (3) avoiding common flaws entirely through appropriate requirements specification language design.

  11. Producing software by integration: challenges and research directions (keynote)

    OpenAIRE

    Inverardi, Paola; Autili, Marco; Di Ruscio, Davide; Pelliccione, Patrizio; Tivoli, Massimo

    2013-01-01

    Software is increasingly produced according to a certain goal and by integrating existing software produced by third parties, typically black-box, and often provided without machine-readable documentation. This implies that development processes of the near future will have to explicitly deal with an inherent incompleteness of information about existing software, notably about its behaviour. Therefore, on one side a software producer will less and less know the precise behav...

  12. An integrated low-voltage rated HTS DC power system with multifunctions to suit smart grids

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Jian Xun, E-mail: jxjin@uestc.edu.cn [Center of Applied Superconductivity, School of Electrical Engineering and Automation, Tianjin University, Tianjin 300072 (China); Center of Applied Superconductivity and Electrical Engineering, School of Automation Engineering, University of Electronic Science and Technology of China, Chengdu 611731 (China); Chen, Xiao Yuan [School of Engineering, Sichuan Normal University, Chengdu 610101 (China); Qu, Ronghai; Fang, Hai Yang [School of Electrical and Electronic Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China); Xin, Ying [Center of Applied Superconductivity, School of Electrical Engineering and Automation, Tianjin University, Tianjin 300072 (China)

    2015-03-15

    Highlights: • A novel LVDC HTS power transmission network is presented. • An integrated power system is achieved by using HTS DC cable and SMES. • The DC superconducting cable is verified to achieve self-acting fault current limitation. • The SMES is verified to achieve a fast-response buffering effect under power fluctuations. • The SMES is verified to achieve a favorable load-voltage protection effect under a fault. - Abstract: A low-voltage rated DC power transmission network integrated with superconducting cables (SCs) and superconducting magnetic energy storage (SMES) devices has been studied, with analytic results presented. In addition to loss-less transmission and high current transportation capacity, the effectively integrated system combines the self-acting fault current limitation feature of the SC with the buffering effect of the SMES against power fluctuations. The results obtained show that the integrated system can achieve high-quality power transmission under common power fluctuation conditions, with an advanced self-protection feature under short-circuit conditions, which makes it especially suitable for smart grid applications.

  13. An integrated framework for software vulnerability detection ...

    Indian Academy of Sciences (India)

    Manoj Kumar

    2017-07-15

    Jul 15, 2017 ... concern and intelligent framework and provides more secured ... In the present scenario, the software systems are being .... human. In human body, the autonomic nervous system ..... such as artificial neural networks, genetic algorithm, grey ..... [8] Bansiya J 1997 A hierarchical model for quality assessment.

  14. CHECWORKS integrated software for corrosion control

    International Nuclear Information System (INIS)

    Schefski, C.; Pietralik; Hazelton, T.

    1997-01-01

    CHECWORKS, a comprehensive software package for managing Flow-Accelerated Corrosion (FAC, also called erosion-corrosion and flow-assisted corrosion) concerns, is expanding to include other systems and other aspects of corrosion control in CANDU reactors. This paper will outline CHECWORKS applications at various CANDU stations and further plans for CHECWORKS to become a code for comprehensive corrosion control management. (author)

  15. Architecture of a consent management suite and integration into IHE-based Regional Health Information Networks.

    Science.gov (United States)

    Heinze, Oliver; Birkle, Markus; Köster, Lennart; Bergh, Björn

    2011-10-04

    The University Hospital Heidelberg is implementing a Regional Health Information Network (RHIN) in the Rhine-Neckar-Region in order to establish a shared-care environment, which is based on established Health IT standards and in particular Integrating the Healthcare Enterprise (IHE). Similar to all other Electronic Health Record (EHR) and Personal Health Record (PHR) approaches, the chosen Personal Electronic Health Record (PEHR) architecture relies on the patient's consent in order to share documents and medical data with other care delivery organizations, with the additional requirement that German legislation explicitly demands a patient's opt-in and does not allow opt-out solutions. This creates two issues: firstly, the current IHE consent profile does not address this approach properly; and secondly, none of the employed intra- and inter-institutional information systems, like almost all systems on the market, offers consent management solutions at all. Hence, the objective of our work is to develop and introduce an extensible architecture for creating, managing and querying patient consents in an IHE-based environment. Based on the features offered by the IHE profile Basic Patient Privacy Consent (BPPC) and the literature, the functionalities and components to meet the requirements of a centralized opt-in consent management solution compliant with German legislation have been analyzed. Two services have been developed and integrated into the Heidelberg PEHR. The standard-based Consent Management Suite consists of two services. The Consent Management Service is able to receive and store consent documents. It can receive queries concerning a dedicated patient consent, process them and return an answer. It represents a centralized policy enforcement point. The Consent Creator Service allows patients to create their consents electronically. Interfaces to a Master Patient Index (MPI) and a provider index allow XACML-based policies to be generated dynamically, which are
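
    To make the policy-generation step concrete, here is a heavily simplified sketch that turns an opt-in choice into a small XACML-like document. The element names only follow the general shape of XACML and the identifiers are hypothetical; the actual suite generates full standard-conformant XACML policies.

      # Sketch: build a minimal XACML-like consent policy for one patient.
      # Element names and identifiers are simplified/hypothetical.
      import xml.etree.ElementTree as ET

      def build_consent_policy(patient_id, provider_id, permit=True):
          policy = ET.Element("Policy", PolicyId=f"consent-{patient_id}")
          rule = ET.SubElement(policy, "Rule",
                               Effect="Permit" if permit else "Deny")
          target = ET.SubElement(rule, "Target")
          ET.SubElement(target, "Subject").text = provider_id
          ET.SubElement(target, "Resource").text = f"ehr/documents/{patient_id}"
          return ET.tostring(policy, encoding="unicode")

      print(build_consent_policy("patient-0042", "hospital-heidelberg"))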

  16. Reverse Engineering in Data Integration Software

    Directory of Open Access Journals (Sweden)

    Vlad DIACONITA

    2013-05-01

    Integrated applications are complex solutions that help build better consolidated and standardized systems from existing (usually transactional) systems. Their complexity is determined by the economic processes they implement, the amount of data employed (millions of records grouped in hundreds of tables, databases of hundreds of GB) and the number of users [11]. Oracle, once mainly known for its database and e-business solutions, has been constantly expanding its product portfolio, providing solutions for SOA, BPA, Warehousing, Big Data and Cloud Computing. In this article I review the facilities and the power of using a dedicated integration tool in an environment with multiple data sources and a target data mart.

  17. Integrated software tool automates MOV diagnosis

    International Nuclear Information System (INIS)

    Joshi, B.D.; Upadhyaya, B.R.

    1996-01-01

    This article reports that researchers at the University of Tennessee have developed digital signal processing software that takes the guesswork out of motor current signature analysis (MCSA). The federal testing regulations for motor-operated valves (MOV) used in nuclear power plants have recently come under critical scrutiny by the Nuclear Regulatory Commission (NRC) and the American Society of Mechanical Engineers (ASME). New ASME testing specifications mandate that all valves performing a safety function are to be tested -- not just ASME Code 1, 2 and 3 valves. The NRC will likely endorse the ASME regulations in the near future. Because of these changes, several utility companies have voluntarily expanded the scope of their in-service testing programs for MOVs, in spite of the additional expense
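
    The signal-processing core of MCSA can be illustrated with a short spectrum computation: inspect the motor current spectrum for sidebands around the supply frequency. The 60 Hz supply and the 53/67 Hz sideband amplitudes are invented test values under those assumptions, not output of the University of Tennessee software.

      # Sketch of MCSA-style digital signal processing: FFT the current and
      # pick out spectral lines, including fault sidebands around 60 Hz.
      import numpy as np

      fs = 2000.0                               # sampling rate, Hz
      t = np.arange(0, 2.0, 1.0 / fs)
      current = (np.sin(2 * np.pi * 60 * t)             # supply component
                 + 0.05 * np.sin(2 * np.pi * 53 * t)    # lower fault sideband
                 + 0.05 * np.sin(2 * np.pi * 67 * t))   # upper fault sideband

      spectrum = np.abs(np.fft.rfft(current)) / t.size
      freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
      peaks = freqs[spectrum > 0.01]            # crude peak pick above noise floor
      print(peaks)                              # lines near 53, 60 and 67 Hz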

  18. Data processing software suite SITENNO for coherent X-ray diffraction imaging using the X-ray free-electron laser SACLA

    International Nuclear Information System (INIS)

    Sekiguchi, Yuki; Oroguchi, Tomotaka; Takayama, Yuki; Nakasako, Masayoshi

    2014-01-01

    The software suite SITENNO has been developed for processing diffraction data collected in coherent X-ray diffraction imaging experiments of non-crystalline particles using an X-ray free-electron laser. Coherent X-ray diffraction imaging is a promising technique for visualizing the structures of non-crystalline particles with dimensions of micrometers to sub-micrometers. Recently, X-ray free-electron laser sources have enabled efficient experiments in the 'diffraction before destruction' scheme. Diffraction experiments have been conducted at the SPring-8 Angstrom Compact free-electron LAser (SACLA) using the custom-made diffraction apparatus KOTOBUKI-1 and two multiport CCD detectors. In the experiments, tens of thousands of single-shot diffraction patterns can be collected within several hours. Then, diffraction patterns with significant levels of intensity suitable for structural analysis must be found, direct-beam positions in diffraction patterns determined, diffraction patterns from the two CCD detectors merged, and phase-retrieval calculations for structural analyses performed. A software suite named SITENNO has been developed to semi-automatically apply the four-step processing to a huge number of diffraction data. Here, details of the algorithm used in the suite are described and its performance for approximately 9000 diffraction patterns collected from cuboid-shaped copper oxide particles is reported.

  19. SALOME. A software integration platform for multi-physics, pre-processing and visualisation

    International Nuclear Information System (INIS)

    Bergeaud, Vincent; Lefebvre, Vincent

    2010-01-01

    In order to ease the development of applications integrating simulation codes, CAD modelers and post-processing tools, CEA and EDF R&D have invested in the SALOME platform, a tool dedicated to the environment of scientific codes. The platform comes in the shape of a toolbox which offers functionalities for CAD, meshing, code coupling, visualization and GUI development. These tools can be combined to create integrated applications that make the scientific codes easier to use and well-interfaced with their environment, be it other codes, CAD and meshing tools or visualization software. Many projects in CEA and EDF R&D now use SALOME, bringing technical coherence to the software suites of our institutions. (author)

  20. Integrating and Managing Bim in GIS, Software Review

    Science.gov (United States)

    El Meouche, R.; Rezoug, M.; Hijazi, I.

    2013-08-01

    Since the advent of Computer-Aided Design (CAD) and Geographical Information System (GIS) tools, project participants have been increasingly leveraging these tools throughout the different phases of a civil infrastructure project. In recent years the number of GIS packages that provide tools to enable the integration of building information in a geo context has risen sharply. More and more GIS packages are adding tools for this purpose, and other software projects are regularly extending these tools. However, each package has its own strengths, weaknesses and intended purpose. This paper provides a thorough review to investigate these capabilities and clarify their purposes. For this study, Autodesk Revit 2012, a BIM editor, was used to create BIMs. In the first step, three building models were created; the resulting models were converted to BIM format and then the software was used to integrate them. For the evaluation of the software, general characteristics were studied, such as the user interface, the supported formats (import/export), and the way building information is imported.

  1. IM (Integrity Management) software must show flexibility to local codes

    Energy Technology Data Exchange (ETDEWEB)

    Brors, Markus [ROSEN Technology and Research Center GmbH (Germany); Diggory, Ian [Macaw Engineering Ltd., Northumberland (United Kingdom)

    2009-07-01

    There are many internationally recognized codes and standards, such as API 1160 and ASME B31.8S, which help pipeline operators to manage and maintain the integrity of their pipeline networks. However, operators in many countries still use local codes that often reflect the history of pipeline developments in their region and are based on direct experience and research on their pipelines. As pipeline companies come under increasing regulatory and financial pressures to maintain the integrity of their networks, it is important that operators using regional codes are able to benchmark their integrity management schemes against these international standards. Any comprehensive Pipeline Integrity Management System (PIMS) software package should therefore not only incorporate industry standards for pipeline integrity assessment but also be capable of implementing regional codes for comparison purposes. This paper describes the challenges and benefits of incorporating one such set of regional pipeline standards into ROSEN Asset Integrity Management Software (ROAIMS). (author)

  2. Exploring the organizational impact of software-as-a-Service on software vendors the role of organizational integration in software-as-a-Service development and operation

    CERN Document Server

    Stuckenberg, Sebastian

    2014-01-01

    Software-as-a-Service has gained momentum as a software delivery and pricing model within the software industry. Existing practices of software vendors are challenged by a potential paradigm shift. This book analyzes the implications of Software-as-a-Service on software vendors using a business model and value chain perspective. The analysis of qualitative data from software vendors highlights the role of organizational integration within software vendors. By providing insights regarding the impact of Software-as-a-Service on organizational structures and processes of software vendors, this st

  3. Integrated FASTBUS, VME and CAMAC diagnostic software at Fermilab

    International Nuclear Information System (INIS)

    Anderson, J.; Forster, R.; Franzen, J.; Wilcer, N.

    1992-10-01

    A fully integrated system for the diagnosis and repair of data acquisition hardware in FASTBUS, VME and CAMAC is described. A short cost/benefit analysis of using a distributed network of personal computers for diagnosis is presented. The SPUDS (Single Platform Uniting Diagnostic Software) software package developed at Fermilab by the authors is introduced. Examples of how SPUDS is currently used in the Fermilab equipment repair facility, as an evaluation tool and for field diagnostics are given

  4. Metabolic and Subjective Results Review of the Integrated Suit Test Series

    Science.gov (United States)

    Norcross, J.R.; Stroud, L.C.; Klein, J.; Desantis, L.; Gernhardt, M.L.

    2009-01-01

    Crewmembers will perform a variety of exploration and construction activities on the lunar surface. These activities will be performed while inside an extravehicular activity (EVA) spacesuit. In most cases, human performance is compromised while inside an EVA suit as compared to a crewmember's unsuited performance baseline. Subjects completed different EVA type tasks, ranging from ambulation to geology and construction activities, in different lunar analog environments including overhead suspension, underwater and 1-g lunar-like terrain, in both suited and unsuited conditions. In the suited condition, the Mark III (MKIII) EVA technology demonstrator suit was used and suit pressure and suit weight were the parameters tested. In the unsuited conditions, weight, mass, center of gravity (CG), terrain type and navigation were the parameters. To the extent possible, one parameter was varied while all others were held constant. Tests were not fully crossed, but rather one parameter was varied while all others were left in the most nominal setting. Oxygen consumption (VO2), modified Cooper-Harper (CH) ratings of operator compensation and ratings of perceived exertion (RPE) were measured for each trial. For each variable, a lower value correlates to more efficient task performance. Due to a low sample size, statistical significance was not attainable. Initial findings indicate that suit weight, CG and the operational environment can have a large impact on human performance during EVA. Systematic, prospective testing series such as those performed to date will enable a better understanding of the crucial interactions of the human and the EVA suit system and their environment. However, work remains to be done to confirm these findings. These data have been collected using only unsuited subjects and one EVA suit prototype that is known to fit poorly on a large demographic of the astronaut population. Key findings need to be retested using an EVA suit prototype better suited to a

  5. Medical technology integration: CT, angiography, imaging-capable OR-table, navigation and robotics in a multifunctional sterile suite.

    Science.gov (United States)

    Jacob, A L; Regazzoni, P; Bilecen, D; Rasmus, M; Huegli, R W; Messmer, P

    2007-01-01

    Technology integration is an enabling technological prerequisite to achieve a major breakthrough in sophisticated intra-operative imaging, navigation and robotics in minimally invasive and/or emergency diagnosis and therapy. Without a high degree of integration and reliability comparable to that achieved in the aircraft industry image guidance in its different facets will not ultimately succeed. As of today technology integration in the field of image-guidance is close to nonexistent. Technology integration requires inter-departmental integration of human and financial resources and of medical processes in a dialectic way. This expanded techno-socio-economic integration has profound consequences for the administration and working conditions in hospitals. At the university hospital of Basel, Switzerland, a multimodality multifunction sterile suite was put into operation after a substantial pre-run. We report the lessons learned during our venture into the world of medical technology integration and describe new possibilities for similar integration projects in the future.

  6. Quantifying multiple telecouplings using an integrated suite of spatially-explicit tools

    Science.gov (United States)

    Tonini, F.; Liu, J.

    2016-12-01

    Telecoupling is an interdisciplinary research umbrella concept that enables natural and social scientists to understand and generate information for managing how humans and nature can sustainably coexist worldwide. To systematically study telecoupling, it is essential to build a comprehensive set of spatially-explicit tools for describing and quantifying multiple reciprocal socioeconomic and environmental interactions between a focal area and other areas. Here we introduce the Telecoupling Toolbox, a new free and open-source set of tools developed to map and identify the five major interrelated components of the telecoupling framework: systems, flows, agents, causes, and effects. The modular design of the toolbox allows the integration of existing tools and software (e.g. InVEST) to assess synergies and tradeoffs associated with policies and other local to global interventions. We show applications of the toolbox using a number of representative studies that address a variety of scientific and management issues related to telecouplings throughout the world. The results suggest that the toolbox can thoroughly map and quantify multiple telecouplings under various contexts while providing users with an easy-to-use interface. It provides a powerful platform to address globally important issues, such as land use and land cover change, species invasion, migration, flows of ecosystem services, and international trade of goods and products.

  7. A graphical user interface for RAId, a knowledge integrated proteomics analysis suite with accurate statistics.

    Science.gov (United States)

    Joyce, Brendan; Lee, Danny; Rubio, Alex; Ogurtsov, Aleksey; Alves, Gelio; Yu, Yi-Kuo

    2018-03-15

    RAId is a software package that has been actively developed for the past 10 years for computationally and visually analyzing MS/MS data. Founded on rigorous statistical methods, RAId's core program computes accurate E-values for peptides and proteins identified during database searches. Making this robust tool readily accessible to the proteomics community by developing a graphical user interface (GUI) is our main goal here. We have constructed a graphical user interface to facilitate the use of RAId on users' local machines. Written in Java, RAId_GUI not only makes it easy to run RAId but also provides tools for data/spectra visualization, MS-product analysis, molecular isotopic distribution analysis, and graphing retrieval versus the proportion of false discoveries. The results viewer displays the analysis results and allows users to download them. Both the knowledge-integrated organismal databases and the code package (containing source code, the graphical user interface, and a user manual) are available for download at https://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads/raid.html.

  8. A Roadmap to Continuous Integration for ATLAS Software Development

    Science.gov (United States)

    Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration

    2017-10-01

    The ATLAS software infrastructure facilitates the efforts of more than 1000 developers working on a code base of 2200 packages with 4 million lines of C++ and 1.4 million lines of Python code. The ATLAS offline code management system is the powerful, flexible framework for processing new package version requests, probing code changes in the Nightly Build System, migrating to new platforms and compilers, deploying production releases for worldwide access, and supporting physicists with tools and interfaces for efficient software use. It maintains a multi-stream, parallel development environment with about 70 multi-platform branches of nightly releases and provides vast opportunities for testing new packages, for verifying patches to existing software and for migrating to new platforms and compilers. The system's evolution is currently aimed at the adoption of modern continuous integration (CI) practices focused on building nightly releases early and often, with rigorous unit and integration testing. This paper describes the CI incorporation program for the ATLAS software infrastructure. It brings modern open-source tools such as Jenkins and GitLab into the ATLAS Nightly System, rationalizes hardware resource allocation and administrative operations, and provides improved feedback and the means for developers to fix broken builds promptly. Once adopted, ATLAS CI practices will improve and accelerate innovation cycles and result in increased confidence in new software deployments. The paper reports the status of Jenkins integration with the ATLAS Nightly System as well as short- and long-term plans for the incorporation of CI practices.
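
    For illustration only, a CI gate of the kind described (build, then unit tests, failing fast so a broken nightly can be flagged promptly) might be scripted as follows; the build commands and project layout are assumptions, not ATLAS's actual Jenkins/GitLab configuration.

```python
# Illustrative sketch only -- not ATLAS's actual CI setup. It mimics the CI
# gate idea: build a branch, run unit tests, and report promptly so broken
# builds can be fixed early.
import subprocess
import sys

def run(step: str, cmd: list) -> bool:
    """Run one pipeline step, echoing its outcome for fast feedback."""
    result = subprocess.run(cmd, capture_output=True, text=True)
    ok = result.returncode == 0
    print(f"[{'PASS' if ok else 'FAIL'}] {step}")
    if not ok:
        print(result.stdout[-2000:])  # tail of the log for diagnosis
    return ok

def ci_pipeline() -> int:
    steps = [
        ("configure", ["cmake", "-S", ".", "-B", "build"]),  # hypothetical layout
        ("build", ["cmake", "--build", "build", "-j8"]),
        ("unit tests", ["ctest", "--test-dir", "build"]),
    ]
    for name, cmd in steps:
        if not run(name, cmd):
            return 1  # fail fast: notify developers, keep the branch green
    return 0

if __name__ == "__main__":
    sys.exit(ci_pipeline())
```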

  9. A roadmap to continuous integration for ATLAS software development

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00132984; The ATLAS collaboration; Elmsheuser, Johannes; Obreshkov, Emil; Krasznahorkay, Attila

    2017-01-01

    The ATLAS software infrastructure facilitates the efforts of more than 1000 developers working on a code base of 2200 packages with 4 million C++ and 1.4 million Python lines. The ATLAS offline code management system is the powerful, flexible framework for processing new package version requests, probing code changes in the Nightly Build System, migrating to new platforms and compilers, deploying production releases for worldwide access, and supporting physicists with tools and interfaces for efficient software use. It maintains a multi-stream, parallel development environment with about 70 multi-platform branches of nightly releases and provides vast opportunities for testing new packages, for verifying patches to existing software and for migrating to new platforms and compilers. The system's evolution is currently aimed at the adoption of modern continuous integration (CI) practices focused on building nightly releases early and often, with rigorous unit and integration testing. This paper describes the CI ...

  10. A Roadmap to Continuous Integration for ATLAS Software Development

    CERN Document Server

    Elmsheuser, Johannes; The ATLAS collaboration; Obreshkov, Emil; Undrus, Alexander

    2016-01-01

    The ATLAS software infrastructure facilitates the efforts of more than 1000 developers working on a code base of 2200 packages with 4 million C++ and 1.4 million Python lines. The ATLAS offline code management system is the powerful, flexible framework for processing new package version requests, probing code changes in the Nightly Build System, migrating to new platforms and compilers, deploying production releases for worldwide access, and supporting physicists with tools and interfaces for efficient software use. It maintains a multi-stream, parallel development environment with about 70 multi-platform branches of nightly releases and provides vast opportunities for testing new packages, for verifying patches to existing software and for migrating to new platforms and compilers. The system's evolution is currently aimed at the adoption of modern continuous integration (CI) practices focused on building nightly releases early and often, with rigorous unit and integration testing. This presentation describes t...

  11. A Posteriori Integration of University CAPE Software Developments

    DEFF Research Database (Denmark)

    Tolksdorf, Gregor; Fillinger, Sandra; Wozny, Guenter

    2015-01-01

    This contribution deals with the mutual integration of existing CAPE software products developed at different universities in Germany, Denmark, and Italy. After the motivation, MOSAIC is presented as the bridge building the connection between the modelling tool ICAS-MoT and the numerical processin...

  12. Software for the occupational health and safety integrated management system

    International Nuclear Information System (INIS)

    Vătăsescu, Mihaela

    2015-01-01

    This paper presents the design and production of software for the Occupational Health and Safety Integrated Management System, with a view to the rapid drawing up of the system documents in the field of occupational health and safety.

  13. Software for the occupational health and safety integrated management system

    Energy Technology Data Exchange (ETDEWEB)

    Vătăsescu, Mihaela [University Politehnica Timisoara, Department of Engineering and Management, 5 Revolutiei street, 331128 Hunedoara (Romania)

    2015-03-10

    This paper presents the design and production of software for the Occupational Health and Safety Integrated Management System, with a view to the rapid drawing up of the system documents in the field of occupational health and safety.

  14. Integrated management software files action in case of fire

    International Nuclear Information System (INIS)

    Moreno-Ventas Garcia, V.; Gimeno Serrano, F.

    2010-01-01

    The proper management of emergencies is a challenge for which it is essential to be prepared. The integrated management software for action files in case of fire, together with rapid access to information, makes this application essential for effectively managing any fire emergency at a nuclear facility.

  15. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed software to help manage large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; analyses of core photos and images; waveforms and NMR; and external file documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam-assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board. The product includes emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software, which features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In

  16. USE OF SOFTWARES FOR POSTURE ASSESSMENT: INTEGRATIVE REVIEW

    Directory of Open Access Journals (Sweden)

    Edyla Maria Porto de Freitas Camelo

    2015-09-01

    To carry out an integrative literature review of the postural analysis software available today. It is an integrative-narrative review of a qualitative and methodological nature, performed during April-July 2014. As inclusion criteria, the articles should be bibliographical or original research and available with full access. At first, we proceeded to identify the keywords related to the software commonly used for postural assessment in the health field, namely "posture", "software", and "postural assessment". The search was narrowed by publication date, from 2002 to 2014. Through the information acquired from the articles and from the software developers, information was obtained on 12 programs that assist postural evaluation - Alcimage, All Body Scan 3D, Aplob, APPID, Biotonix, Corporis Pro, Fisimetrix, Fisiometer Posturograma, Physical Fisio, Physio Easy, Posture Print and SAPO. However, only one tool has more information and studies, namely SAPO. There are many postural analysis programs available on the internet today; however, these differ widely in the analyses they offer and are still not widespread as research tools.

  17. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction.

    Science.gov (United States)

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O; Ellis, Heidi J C; Gryk, Michael R

    2015-07-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses.

  18. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert [UConn Health, Department of Molecular Biology and Biophysics (United States); Martyn, Timothy O. [Rensselaer at Hartford, Department of Engineering and Science (United States); Ellis, Heidi J. C. [Western New England College, Department of Computer Science and Information Technology (United States); Gryk, Michael R., E-mail: gryk@uchc.edu [UConn Health, Department of Molecular Biology and Biophysics (United States)

    2015-07-15

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses.

  19. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction

    International Nuclear Information System (INIS)

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O.; Ellis, Heidi J. C.; Gryk, Michael R.

    2015-01-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses

  20. Debris Examination Using Ballistic and Radar Integrated Software

    Science.gov (United States)

    Griffith, Anthony; Schottel, Matthew; Lee, David; Scully, Robert; Hamilton, Joseph; Kent, Brian; Thomas, Christopher; Benson, Jonathan; Branch, Eric; Hardman, Paul

    2012-01-01

    The Debris Examination Using Ballistic and Radar Integrated Software (DEBRIS) program was developed to provide rapid and accurate analysis of debris observed by the NASA Debris Radar (NDR). This software provides a greatly improved analysis capacity over earlier manual processes, allowing for up to four times as much data to be analyzed by one-quarter of the personnel required by earlier methods. There are two applications that comprise the DEBRIS system: the Automated Radar Debris Examination Tool (ARDENT) and the primary DEBRIS tool.

  1. Collaboration in Global Software Engineering Based on Process Description Integration

    Science.gov (United States)

    Klein, Harald; Rausch, Andreas; Fischer, Edward

    Globalization is one of the big trends in software development. Development projects need a variety of different resources with appropriate expert knowledge to be successful. More and more of these resources are nowadays obtained from specialized organizations and countries all over the world, varying in development approaches, processes, and culture. As seen with early outsourcing attempts, collaboration may fail due to these differences. Hence, the major challenge in global software engineering is to streamline collaborating organizations towards a successful conjoint development. Based on typical collaboration scenarios, this paper presents a structured approach to integrate processes in a comprehensible way.

  2. Comprehensive understandings of energy confinement in LHD plasmas through extensive application of the integrated transport analysis suite

    International Nuclear Information System (INIS)

    Yokoyama, M.; Seki, R.; Suzuki, C.; Ida, K.; Osakabe, M.; Satake, S.; Yamada, H.; Murakami, S.

    2014-10-01

    The integrated transport analysis suite, TASK3D-a, has enhanced energy transport analyses in LHD. It has clearly elucidated (1) the systematic dependence of ion and electron energy confinement on a wide variation of plasma parameters, and (2) statistically derived fitting expressions for the ion and electron heat diffusivities (χ_i and χ_e) separately, also taking radial-profile information into account. In particular, the latter approach can outstrip the conventional scaling laws for the global confinement time (τ_E) in terms of its consideration of profiles (temperature, density, heating depositions, etc.). This has been made possible with the analysis database accumulated through the extensive application of the integrated transport analysis suite to experiment data. In this proceeding, the TASK3D-a analysis database for high-ion-temperature (high-T_i) plasmas in LHD (Large Helical Device) is exemplified. This approach should be applicable to any other combination of integrated transport analysis suites and fusion experiments. (author)
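
    The abstract does not give the functional form of the fitting expressions; a common choice for such statistically derived expressions is a power law fitted by least squares in log space. The sketch below, with entirely synthetic data and placeholder predictors, shows the mechanics of deriving one:

```python
# Minimal sketch of deriving a power-law fitting expression
# chi ~ C * n^a * T^b * P^c by least squares in log space. The predictor
# choice and data are placeholders, not the actual TASK3D-a database.
import numpy as np

rng = np.random.default_rng(0)
n_e = rng.uniform(0.5, 5.0, 200)   # density (1e19 m^-3), synthetic
T_i = rng.uniform(1.0, 8.0, 200)   # ion temperature (keV), synthetic
P = rng.uniform(1.0, 30.0, 200)    # heating power (MW), synthetic
chi = 0.3 * n_e**-0.5 * T_i**1.2 * P**0.4 * rng.lognormal(0, 0.1, 200)

# log(chi) = log(C) + a*log(n_e) + b*log(T_i) + c*log(P)
X = np.column_stack([np.ones_like(n_e), np.log(n_e), np.log(T_i), np.log(P)])
coef, *_ = np.linalg.lstsq(X, np.log(chi), rcond=None)
logC, a, b, c = coef
print(f"chi ~ {np.exp(logC):.2f} * n^{a:.2f} * T^{b:.2f} * P^{c:.2f}")
```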

  3. Integration of Simulink Models with Component-based Software Models

    Directory of Open Access Journals (Sweden)

    MARIAN, N.

    2008-06-01

    Model-based development aims to facilitate the development of embedded control systems by emphasizing the separation of the design level from the implementation level. Model-based design involves the use of multiple models that represent different views of a system, having different semantics of abstract system descriptions. Usually, in mechatronics systems, design proceeds by iterating model construction, model analysis, and model transformation. Constructing a MATLAB/Simulink model, a plant and controller behavior is simulated using graphical blocks to represent mathematical and logical constructs and process flow; then software code is generated. A Simulink model is a representation of the design or implementation of a physical system that satisfies a set of requirements. A software component-based system aims to organize system architecture and behavior as a means of computation, communication and constraints, using computational blocks and aggregates for both discrete and continuous behavior, different interconnection and execution disciplines for event-based and time-based controllers, and so on, to encompass the demands for more functionality, at even lower prices, and with opposite constraints. COMDES (Component-based Design of Software for Distributed Embedded Systems) is such a component-based system framework developed by the software engineering group of the Mads Clausen Institute for Product Innovation (MCI), University of Southern Denmark. Once specified, the software model has to be analyzed. One way of doing that is to integrate in wrapper files the model back into Simulink S-functions, and use its extensive simulation features, thus allowing an early exploration of the possible design choices over multiple disciplines. The paper describes a safe translation of a restricted set of MATLAB/Simulink blocks to COMDES software components, both for continuous and discrete behavior, and the transformation of the software system into the S
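
    As a hypothetical illustration of the block-to-component translation idea (the class names and block set below are invented, not the actual COMDES framework), a restricted set of Simulink block types can be mapped onto software components with explicit ports:

```python
# Hypothetical illustration of the translation idea (not the actual COMDES
# toolchain): map a restricted set of Simulink block types onto software
# components with explicit input/output ports.
from typing import Callable, Dict, List

class Component:
    """A software component with named inputs and a compute function."""
    def __init__(self, name: str, inputs: List[str], fn: Callable[..., float]):
        self.name, self.inputs, self.fn = name, inputs, fn

    def step(self, signals: Dict[str, float]) -> float:
        return self.fn(*(signals[i] for i in self.inputs))

# Translation table: Simulink block type -> component factory
BLOCK_MAP = {
    "Gain": lambda name, k: Component(name, ["in"], lambda x: k * x),
    "Sum": lambda name: Component(name, ["a", "b"], lambda a, b: a + b),
}

gain = BLOCK_MAP["Gain"]("g1", k=2.0)
adder = BLOCK_MAP["Sum"]("s1")
signals = {"in": 3.0, "a": 0.0, "b": 1.0}
signals["a"] = gain.step(signals)  # g1 feeds s1's first input
print(adder.step(signals))         # -> 7.0
```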

  4. The emerging Web 2.0 social software: an enabling suite of sociable technologies in health and health care education.

    Science.gov (United States)

    Kamel Boulos, Maged N; Wheeler, Steve

    2007-03-01

    Web 2.0 sociable technologies and social software are presented as enablers in health and health care, for organizations, clinicians, patients and laypersons. They include social networking services, collaborative filtering, social bookmarking, folksonomies, social search engines, file sharing and tagging, mashups, instant messaging, and online multi-player games. The more popular Web 2.0 applications in education, namely wikis, blogs and podcasts, are but the tip of the social software iceberg. Web 2.0 technologies represent a quite revolutionary way of managing and repurposing/remixing online information and knowledge repositories, including clinical and research information, in comparison with the traditional Web 1.0 model. The paper also offers a glimpse of future software, touching on Web 3.0 (the Semantic Web) and how it could be combined with Web 2.0 to produce the ultimate architecture of participation. Although the tools presented in this review look very promising and potentially fit for purpose in many health care applications and scenarios, careful thinking, testing and evaluation research are still needed in order to establish 'best practice models' for leveraging these emerging technologies to boost our teaching and learning productivity, foster stronger 'communities of practice', and support continuing medical education/professional development (CME/CPD) and patient education.

  5. The defendant in a medical malpractice suit: an integral part of the defense team

    International Nuclear Information System (INIS)

    Petrek, F.R. Jr.; Slovis, M.R.

    1998-01-01

    This article explains the litigation process of a medical malpractice suit and offers suggestions to help pediatric radiologists cope with the stress of being sued. It provides tangible ways in which the pediatric radiologist can become an important part of the defense team. Our goal is to enable the pediatric radiologist to place the lawsuit in a proper perspective and demonstrate the importance of providing medical insight to aid in forming legal strategy. (orig.)

  6. Managing Risks in Distributed Software Projects: An Integrative Framework

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars; Boeg, Jesper

    2009-01-01

    Software projects are increasingly geographically distributed with limited face-to-face interaction between participants. These projects face particular challenges that need careful managerial attention. While risk management has been adopted with success to address other challenges within software development, there are currently no frameworks available for managing risks related to geographical distribution. On this background, we systematically review the literature on geographically distributed software projects. Based on the review, we synthesize what we know about risks and risk resolution techniques into an integrative framework for managing risks in distributed contexts. Subsequent implementation of a Web-based tool helped us refine the framework based on empirical evaluation of its practical usefulness. We conclude by discussing implications for both research and practice.

  7. An integrated infrastructure in support of software development

    International Nuclear Information System (INIS)

    Antonelli, S; Bencivenni, M; De Girolamo, D; Giacomini, F; Longo, S; Manzali, M; Veraldi, R; Zani, S

    2014-01-01

    This paper describes the design and the current state of implementation of an infrastructure made available to software developers within the Italian National Institute for Nuclear Physics (INFN) to support and facilitate their daily activity. The infrastructure integrates several tools, each providing a well-identified function: project management, version control system, continuous integration, dynamic provisioning of virtual machines, efficiency improvement, knowledge base. When applicable, access to the services is based on the INFN-wide Authentication and Authorization Infrastructure. The system is being installed and progressively made available to INFN users belonging to tens of sites and laboratories and will represent a solid foundation for the software development efforts of the many experiments and projects that see the involvement of the Institute. The infrastructure will be beneficial especially for small- and medium-size collaborations, which often cannot afford the resources, in particular in terms of know-how, needed to set up such services.

  8. Integrating commercial software in accelerator control- case study

    International Nuclear Information System (INIS)

    Pace, Alberto

    1994-01-01

    Using existing commercial software is the dream of any control system engineer, for the development cost reduction can reach one order of magnitude. This dream often vanishes when the requirement appears for a uniform and consistent architecture across a wide number of components and applications. This makes it difficult to integrate several commercial packages that often impose different user interface and communication standards. This paper will describe the approach and standards that have been chosen for the CERN ISOLDE control system, which have allowed several commercial packages to be integrated in the system as they are, permitting the software development cost to be reduced to a minimum. (author). 10 refs., 2 tabs., 9 figs

  9. Business Intelligence Applied to the ALMA Software Integration Process

    Science.gov (United States)

    Zambrano, M.; Recabarren, C.; González, V.; Hoffstadt, A.; Soto, R.; Shen, T.-C.

    2012-09-01

    Software quality assurance and planning of an astronomy project is a complex task, especially if it is a distributed collaborative project such as ALMA, where the development centers are spread across the globe. When you execute a software project, there is much valuable information about the process itself that you might be able to collect. One of the ways you can receive this input is via an issue tracking system that gathers the problem reports relative to software bugs captured during the testing of the software, during the integration of the different components or, even worse, problems that occurred during production time. Usually little time is spent on analyzing them, but with some multidimensional processing you can extract valuable information from them, and it might help with long-term planning and resource allocation. We present an analysis of the information collected at ALMA from a collection of key unbiased indicators. We describe here the extraction, transformation and load process and how the data was processed. The main goal is to assess a software process and get insights from this information.
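
    As a toy sketch of the extraction-transformation-load idea described (not ALMA's actual pipeline), problem reports can be aggregated along two dimensions, subsystem and project phase, to produce simple planning indicators:

```python
# Toy sketch of the idea (not ALMA's actual pipeline): take problem
# reports from an issue-tracker export and aggregate them along two
# dimensions, subsystem and phase, to build simple indicators.
from collections import Counter

# Hypothetical extract step: records as (subsystem, phase) tuples.
reports = [
    ("correlator", "integration"),
    ("correlator", "production"),
    ("archive", "testing"),
    ("archive", "integration"),
    ("correlator", "integration"),
]

# Transform/aggregate: counts per (subsystem, phase) cell.
cube = Counter(reports)
for (subsystem, phase), count in sorted(cube.items()):
    print(f"{subsystem:<12} {phase:<12} {count}")

# An indicator for planning: which subsystem generates most reports?
by_subsystem = Counter(s for s, _ in reports)
print("hotspot:", by_subsystem.most_common(1)[0])
```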

  10. Distributed software framework and continuous integration in hydroinformatics systems

    Science.gov (United States)

    Zhou, Jianzhong; Zhang, Wei; Xie, Mengfei; Lu, Chengwei; Chen, Xiao

    2017-08-01

    When encountering multiple and complicated models, multisource structured and unstructured data, and complex requirements analysis, the platform design and integration of hydroinformatics systems become a challenge. To properly solve these problems, we describe a distributed software framework and its continuous integration process in hydroinformatics systems. This distributed framework mainly consists of a server cluster for models, a distributed database, GIS (Geographic Information System) servers, a master node, and clients. Based on it, a GIS-based decision support system for the joint regulation of water quantity and water quality of a group of lakes in Wuhan, China, was established.

  11. Integrated software system for low level waste management

    International Nuclear Information System (INIS)

    Worku, G.

    1995-01-01

    In the continually changing and uncertain world of low level waste management, many generators in the US are faced with the prospect of having to store their waste on site for the indefinite future. This consequently increases the set of tasks performed by the generators in the areas of packaging, characterizing, classifying, screening (if a set of acceptance criteria applies), and managing the inventory for the duration of onsite storage. When disposal sites become available, it is expected that the work will require re-evaluating the waste packages, including possible re-processing, re-packaging, or re-classifying in preparation for shipment for disposal under the regulatory requirements of the time. In this day and age, when there is wide use of computers and computer literacy is at high levels, an important waste management tool would be an integrated software system that aids waste management personnel in conducting these tasks quickly and accurately. It has become evident that such an integrated radwaste management software system offers great benefits to radwaste generators both in the US and other countries. This paper discusses one such approach to integrated radwaste management utilizing some globally accepted radiological assessment software applications

  12. CyberGIS software: a synthetic review and integration roadmap

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Shaowen [University of Illinois, Urbana-Champaign]; Anselin, Luc [Arizona State University]; Bhaduri, Budhendra L [ORNL]; Cosby, Christopher [University Navstar Consortium, Boulder, CO]; Goodchild, Michael [University of California, Santa Barbara]; Liu, Yan [University of Illinois, Urbana-Champaign]; Nygers, Timothy L. [University of Washington, Seattle]

    2013-01-01

    CyberGIS, defined as cyberinfrastructure-based geographic information systems (GIS), has emerged as a new generation of GIS representing an important research direction for both cyberinfrastructure and geographic information science. This study introduces a 5-year effort funded by the US National Science Foundation to advance the science and applications of CyberGIS, particularly for enabling the analysis of big spatial data, computationally intensive spatial analysis and modeling (SAM), and collaborative geospatial problem-solving and decision-making, simultaneously conducted by a large number of users. Several fundamental research questions are raised and addressed while a set of CyberGIS challenges and opportunities are identified from scientific perspectives. The study reviews several key CyberGIS software tools that are used to elucidate a vision and roadmap for CyberGIS software research. The roadmap focuses on software integration and synthesis of cyberinfrastructure, GIS, and SAM by defining several key integration dimensions and strategies. CyberGIS, based on this holistic integration roadmap, exhibits the following key characteristics: high-performance and scalable, open and distributed, collaborative, service-oriented, user-centric, and community-driven. As a major result of the roadmap, two key CyberGIS modalities, gateway and toolkit, combined with a community-driven and participatory approach, have laid a solid foundation to achieve scientific breakthroughs across many geospatial communities that would be otherwise impossible.

  13. A new software suite for NO2 vertical profile retrieval from ground-based zenith-sky spectrometers

    International Nuclear Information System (INIS)

    Denis, L.; Roscoe, H.K.; Chipperfield, M.P.; Roozendael, M. van; Goutail, F.

    2005-01-01

    Here we present an operational method to improve accuracy and information content of ground-based measurements of stratospheric NO2. The motive is to improve the investigation of trends in NO2, and is important because the current trend in NO2 appears to contradict the trend in its source, suggesting that the stratospheric circulation has changed. To do so, a new software package for retrieving NO2 vertical profiles from slant columns measured by zenith-sky spectrometers has been created. It uses a Rodgers optimal linear inverse method coupled with a radiative transfer model for calculations of transfer functions between profiles and columns, and a chemical box model for taking into account the NO2 variations during twilight and during the day. Each model has parameters that vary according to season and location. Forerunners of each model have been previously validated. The scheme maps random errors in the measurements and systematic errors in the models and their parameters on to the retrieved profiles. Initialisation for models is derived from well-established climatologies. The software has been tested by comparing retrieved profiles to simultaneous balloon-borne profiles at mid-latitudes in spring
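
    The Rodgers optimal linear inverse method combines the measured slant columns with an a priori profile, weighted by their error covariances. A minimal numpy sketch of that retrieval step, using synthetic matrices in place of the package's radiative-transfer weighting functions, is:

```python
# Minimal numpy sketch of a Rodgers-style optimal linear retrieval,
# x_hat = x_a + G (y - K x_a), with synthetic matrices standing in for
# the radiative-transfer weighting functions used by the real package.
import numpy as np

n_layers, n_obs = 5, 3
rng = np.random.default_rng(1)

K = rng.uniform(0.1, 1.0, (n_obs, n_layers))  # slant-column weighting functions
x_a = np.full(n_layers, 2.0)                  # a priori profile (climatology)
S_a = np.diag([1.0] * n_layers)               # a priori covariance
S_e = np.diag([0.05] * n_obs)                 # measurement-error covariance
x_true = np.array([1.5, 2.5, 3.0, 2.0, 1.0])
y = K @ x_true + rng.normal(0, 0.05, n_obs)   # noisy slant columns

# Gain matrix and retrieval; A = GK is the averaging kernel, whose trace
# gives the degrees of freedom for signal (information beyond the prior).
G = S_a @ K.T @ np.linalg.inv(K @ S_a @ K.T + S_e)
x_hat = x_a + G @ (y - K @ x_a)
A = G @ K
print("retrieved profile:", np.round(x_hat, 2))
print("degrees of freedom for signal:", round(np.trace(A), 2))
```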

  14. A new software suite for NO2 vertical profile retrieval from ground-based zenith-sky spectrometers

    Energy Technology Data Exchange (ETDEWEB)

    Denis, L. [British Antarctic Survey/NERC, Madingley Road, Cambridge CB3 0ET (United Kingdom); Roscoe, H.K. [British Antarctic Survey/NERC, Madingley Road, Cambridge CB3 0ET (United Kingdom)]. E-mail: h.roscoe@bas.ac.uk; Chipperfield, M.P. [Environment Centre, University of Leeds, Leeds LS2 9JT (United Kingdom); Roozendael, M. van [Belgian Institute for Space Aeronomy (BIRA/IASB), 1180 Brussels (Belgium); Goutail, F. [Service d' Aeronomie du CNRS, BP3, 91271 Verrieres le Buisson (France)

    2005-05-15

    Here we present an operational method to improve accuracy and information content of ground-based measurements of stratospheric NO2. The motive is to improve the investigation of trends in NO2, and is important because the current trend in NO2 appears to contradict the trend in its source, suggesting that the stratospheric circulation has changed. To do so, a new software package for retrieving NO2 vertical profiles from slant columns measured by zenith-sky spectrometers has been created. It uses a Rodgers optimal linear inverse method coupled with a radiative transfer model for calculations of transfer functions between profiles and columns, and a chemical box model for taking into account the NO2 variations during twilight and during the day. Each model has parameters that vary according to season and location. Forerunners of each model have been previously validated. The scheme maps random errors in the measurements and systematic errors in the models and their parameters on to the retrieved profiles. Initialisation for models is derived from well-established climatologies. The software has been tested by comparing retrieved profiles to simultaneous balloon-borne profiles at mid-latitudes in spring.

  15. Do You Need ERP? In the Business World, Enterprise Resource Planning Software Keeps Costs down and Productivity up. Should Districts Follow Suit?

    Science.gov (United States)

    Careless, James

    2007-01-01

    Enterprise resource planning (ERP) software does what school leaders have always wanted their computer systems to do: It sees all. By integrating every IT application an organization has--from purchasing and inventory control to payroll--ERPs create a single unified system. Not only does this give IT managers a holistic view to what is happening…

  16. Do You Need ERP? In the Business World, Enterprise Resource Planning Software Keeps Costs down and Productivity up. Should Districts Follow Suit?

    Science.gov (United States)

    Careless, James

    2007-01-01

    Enterprise resource planning software does what school leaders have always wanted their computer systems to do: It sees all. By integrating every IT application an organization has--from purchasing and inventory control to payroll--ERPs create a single unified system. Not only does this give IT managers a holistic view to what is happening in the…

  17. Semi-automated software service integration in virtual organisations

    Science.gov (United States)

    Afsarmanesh, Hamideh; Sargolzaei, Mahdi; Shadi, Mahdieh

    2015-08-01

    To enhance their business opportunities, organisations involved in many service industries are increasingly active in the pursuit of both online provision of their business services (BSs) and collaboration with others. Collaborative Networks (CNs) in the service industry sector, however, face many challenges related to the sharing and integration of their collections of provided BSs and the corresponding software services. Therefore, the topic of service interoperability, for which this article introduces a framework, is gaining momentum in research for supporting CNs. It contributes to the generation of formal machine-readable specifications for business processes, aimed at providing their unambiguous definitions, as needed for developing their equivalent software services. The framework provides a model and implementation architecture for the discovery and composition of shared services, to support the semi-automated development of integrated value-added services. In support of service discovery, a main contribution of this research is the formal representation of services' behaviour and the application of desired service behaviour specified by users for automated matchmaking with other existing services. Furthermore, to support service integration, mechanisms are developed for the automated selection of the most suitable service(s) according to a number of service quality aspects. Two scenario cases are presented, which exemplify several specific features related to service discovery and service integration aspects.
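
    As one hedged illustration of behaviour-based matchmaking (the matching rule and service entries below are invented, not the framework's actual algorithm), a desired behaviour can be treated as an operation sequence that must appear, in order, within a candidate service's advertised behaviour, with matches then ranked by a quality score:

```python
# Hypothetical matchmaking sketch (not the framework's actual algorithm):
# a desired behaviour is an ordered list of operations; a candidate service
# matches if its advertised behaviour contains them as a subsequence, and
# matches are ranked by a quality score.
def is_subsequence(wanted, advertised):
    it = iter(advertised)
    return all(op in it for op in wanted)  # consumes 'it' in order

services = {
    "PayFast": {"behaviour": ["quote", "reserve", "pay", "confirm"], "quality": 0.9},
    "BookNow": {"behaviour": ["search", "reserve", "confirm"], "quality": 0.8},
    "QuickPay": {"behaviour": ["pay", "refund"], "quality": 0.7},
}

desired = ["reserve", "confirm"]
matches = [(name, s["quality"]) for name, s in services.items()
           if is_subsequence(desired, s["behaviour"])]
matches.sort(key=lambda m: m[1], reverse=True)
print(matches)  # [('PayFast', 0.9), ('BookNow', 0.8)]
```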

  18. A graphical user interface for RAId, a knowledge integrated proteomics analysis suite with accurate statistics

    OpenAIRE

    Joyce, Brendan; Lee, Danny; Rubio, Alex; Ogurtsov, Aleksey; Alves, Gelio; Yu, Yi-Kuo

    2018-01-01

    Objective: RAId is a software package that has been actively developed for the past 10 years for computationally and visually analyzing MS/MS data. Founded on rigorous statistical methods, RAId's core program computes accurate E-values for peptides and proteins identified during database searches. Making this robust tool readily accessible for the proteomics community by developing a graphical user interface (GUI) is our main goa...

  19. Software for the Integration of Multiomics Experiments in Bioconductor.

    Science.gov (United States)

    Ramos, Marcel; Schiffer, Lucas; Re, Angela; Azhar, Rimsha; Basunia, Azfar; Rodriguez, Carmen; Chan, Tiffany; Chapman, Phil; Davis, Sean R; Gomez-Cabrero, David; Culhane, Aedin C; Haibe-Kains, Benjamin; Hansen, Kasper D; Kodali, Hanish; Louis, Marie S; Mer, Arvind S; Riester, Markus; Morgan, Martin; Carey, Vince; Waldron, Levi

    2017-11-01

    Multiomics experiments are increasingly commonplace in biomedical research and add layers of complexity to experimental design, data integration, and analysis. R and Bioconductor provide a generic framework for statistical analysis and visualization, as well as specialized data classes for a variety of high-throughput data types, but methods are lacking for integrative analysis of multiomics experiments. The MultiAssayExperiment software package, implemented in R and leveraging Bioconductor software and design principles, provides for the coordinated representation of, storage of, and operation on multiple diverse genomics data. We provide the unrestricted multiple 'omics data for each cancer tissue in The Cancer Genome Atlas as ready-to-analyze MultiAssayExperiment objects and demonstrate in these and other datasets how the software simplifies data representation, statistical analysis, and visualization. The MultiAssayExperiment Bioconductor package reduces major obstacles to efficient, scalable, and reproducible statistical analysis of multiomics data and enhances data science applications of multiple omics datasets. Cancer Res; 77(21); e39-42. ©2017 American Association for Cancer Research.

  20. [Development of integrated support software for clinical nutrition].

    Science.gov (United States)

    Siquier Homar, Pedro; Pinteño Blanco, Manel; Calleja Hernández, Miguel Ángel; Fernández Cortés, Francisco; Martínez Sotelo, Jesús

    2015-09-01

    To develop an integrated computer software application for specialized nutritional support, integrated in the electronic clinical record, which automatically and early detects undernourished patients or those at risk of developing undernourishment, determining points of opportunity for improvement and evaluation of the results. The quality standards published by the Nutrition Work Group of the Spanish Society of Hospital Pharmacy (SEFH) and the recommendations by the Pharmacy Group of the Spanish Society of Parenteral and Enteral Nutrition (SENPE) have been taken into account. According to these quality standards, nutritional support has to include the following healthcare stages or sub-processes: nutritional screening, nutritional assessment, plan for nutritional care, prescription, preparation and administration. This software makes it possible to conduct, in an automated way, a specific nutritional assessment of those patients at nutritional risk, implementing, if necessary, a nutritional treatment plan, conducting follow-up and traceability of the outcomes derived from the implementation of improvement actions, and quantifying to what extent our practice is close to the established standard. This software makes it possible to standardize specialized nutritional support from a multidisciplinary point of view, introducing the concept of quality control per process, and including the patient as the main customer. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.

  1. Development of integrated support software for clinical nutrition

    Directory of Open Access Journals (Sweden)

    Pedro Siquier Homar

    2015-09-01

    Objectives: to develop an integrated computer software application for specialized nutritional support, integrated in the electronic clinical record, which automatically and early detects undernourished patients or those at risk of developing undernourishment, determining points of opportunity for improvement and evaluation of the results. Methods: the quality standards published by the Nutrition Work Group of the Spanish Society of Hospital Pharmacy (SEFH) and the recommendations by the Pharmacy Group of the Spanish Society of Parenteral and Enteral Nutrition (SENPE) have been taken into account. According to these quality standards, nutritional support has to include the following healthcare stages or sub-processes: nutritional screening, nutritional assessment, plan for nutritional care, prescription, preparation and administration. Results: this software makes it possible to conduct, in an automated way, a specific nutritional assessment of those patients at nutritional risk, implementing, if necessary, a nutritional treatment plan, conducting follow-up and traceability of the outcomes derived from the implementation of improvement actions, and quantifying to what extent our practice is close to the established standard. Conclusions: this software makes it possible to standardize specialized nutritional support from a multidisciplinary point of view, introducing the concept of quality control per process, and including the patient as the main customer.

  2. Integration of Simulink Models with Component-based Software Models

    DEFF Research Database (Denmark)

    Marian, Nicolae; Top, Søren

    2008-01-01

    A software component-based system aims to organize system architecture and behaviour as a means of computation, communication and constraints, using computational blocks and aggregates for both discrete and continuous behaviour, different interconnection and execution disciplines for event-based and time-based controllers, and so on, to encompass the demands for more functionality, at even lower prices, and with opposite constraints. Once specified, the software model has to be analyzed. One way of doing that is to integrate in wrapper files the model back into Simulink S-functions, and use its extensive simulation features, thus allowing an early exploration of the possible design choices over multiple disciplines. The paper describes a safe translation of a restricted set of MATLAB/Simulink blocks to COMDES software components, both for continuous and discrete behaviour, and the transformation of the software system into the S-functions. The general aim of this work is the improvement of multi-disciplinary development of embedded systems with the focus on the relation

  3. Large scale continuous integration and delivery : Making great software better and faster

    NARCIS (Netherlands)

    Stahl, Daniel

    2017-01-01

    Since the inception of continuous integration, and later continuous delivery, the methods of producing software in the industry have changed dramatically over the last two decades. Automated, rapid and frequent compilation, integration, testing, analysis, packaging and delivery of new software

  4. Integrating manufacturing softwares for intelligent planning execution: a CIIMPLEX perspective

    Science.gov (United States)

    Chu, Bei Tseng B.; Tolone, William J.; Wilhelm, Robert G.; Hegedus, M.; Fesko, J.; Finin, T.; Peng, Yun; Jones, Chris H.; Long, Junshen; Matthews, Mike; Mayfield, J.; Shimp, J.; Su, S.

    1997-01-01

    Recent developments have made it possible to interoperate complex business applications at much lower costs. Application interoperation, along with business process re-engineering, can result in significant savings by eliminating work created by disconnected business processes due to isolated business applications. However, we believe much greater productivity benefits can be achieved by facilitating timely decision-making that utilizes information from multiple enterprise perspectives. The CIIMPLEX enterprise integration architecture is designed to enable such productivity gains by helping people to carry out integrated enterprise scenarios. An enterprise scenario is typically triggered by some external event. The goal of an enterprise scenario is to make the right decisions considering the full context of the problem. Enterprise scenarios are difficult for people to carry out because of the interdependencies among various actions. One can easily be overwhelmed by the large amount of information. We propose the use of software agents to help gather relevant information and present it in the appropriate context of an enterprise scenario. The CIIMPLEX enterprise integration architecture is based on the FAIME methodology for application interoperation and plug-and-play. It also explores the use of software agents in application plug-and-play.

  5. YPED: an integrated bioinformatics suite and database for mass spectrometry-based proteomics research.

    Science.gov (United States)

    Colangelo, Christopher M; Shifman, Mark; Cheung, Kei-Hoi; Stone, Kathryn L; Carriero, Nicholas J; Gulcicek, Erol E; Lam, TuKiet T; Wu, Terence; Bjornson, Robert D; Bruce, Can; Nairn, Angus C; Rinehart, Jesse; Miller, Perry L; Williams, Kenneth R

    2015-02-01

    We report a significantly enhanced bioinformatics suite and database for proteomics research called the Yale Protein Expression Database (YPED) that is used by investigators at more than 300 institutions worldwide. YPED meets the data management, archival, and analysis needs of high-throughput mass spectrometry-based proteomics research, ranging from a single laboratory, to a group of laboratories within and beyond an institution, to the entire proteomics community. The current version is a significant improvement over the first version in that it contains new modules for liquid chromatography-tandem mass spectrometry (LC-MS/MS) database search results, label-based and label-free quantitative proteomic analysis, and several scoring outputs for phosphopeptide site localization. In addition, we have added both peptide and protein comparative analysis tools to enable pairwise analysis of distinct peptides/proteins in each sample and of overlapping peptides/proteins between all samples in multiple datasets. We have also implemented a targeted proteomics module for automated multiple reaction monitoring (MRM)/selected reaction monitoring (SRM) assay development. We have linked YPED's database search results and both label-based and label-free fold-change analyses to the Skyline Panorama repository for online spectra visualization. In addition, we have built enhanced functionality to curate peptide identifications into an MS/MS peptide spectral library for all of our protein database search identification results. Copyright © 2015 The Authors. Production and hosting by Elsevier Ltd. All rights reserved.

  6. Integrated risk assessment for spent fuel transportation using developed software

    International Nuclear Information System (INIS)

    Yun, Mi Rae; Christian, Robby; Kim, Bo Gyung; Almomani, Belal; Ham, Jae Hyun; Kang, Gook Hyun; Lee, Sang hoon

    2016-01-01

    As on-site spent fuel storage reaches the limits of its capacity, spent fuel needs to be transported elsewhere. In this research, the risks of two transportation methods, maritime transportation and on-site transportation, as well as of an interim storage facility, were analyzed. Easier and integrated risk assessment for spent fuel transportation will be possible by applying this software. Risk assessment for spent fuel transportation has received little research attention, and this work presents a case for analysis. By using this analysis method and the developed software, regulators can get some insights into spent fuel transportation. For example, they can restrict specific regions to prevent ocean accidents, and they can also arrange spent fuel in the interim storage facility so as to avoid the most risky region, which has a high risk from an aircraft engine shaft. Finally, they can apply soft material to the floor for specific stages of on-site transportation. In this software, because we targeted Korea, we needed to use Korean reference data. However, there were few Korean reference data. In particular, there were no food chain data for the Korean ocean. In MARINRAD, a steady-state food chain model was used, but it is far from reality. Therefore, to obtain realistic Korean reference data, a dynamic food chain model for the Korean ocean needs to be developed

  7. Integrated risk assessment for spent fuel transportation using developed software

    Energy Technology Data Exchange (ETDEWEB)

    Yun, Mi Rae; Christian, Robby; Kim, Bo Gyung; Almomani, Belal; Ham, Jae Hyun; Kang, Gook Hyun [KAIST, Daejeon (Korea, Republic of); Lee, Sang hoon [Keimyung University, Daegu (Korea, Republic of)

    2016-05-15

    As on-site spent fuel storage reaches the limits of its capacity, spent fuel needs to be transported elsewhere. In this research, the risks of two transportation methods, maritime transportation and on-site transportation, as well as of an interim storage facility, were analyzed. Easier and integrated risk assessment for spent fuel transportation will be possible by applying this software. Risk assessment for spent fuel transportation has received little research attention, and this work presents a case for analysis. By using this analysis method and the developed software, regulators can get some insights into spent fuel transportation. For example, they can restrict specific regions to prevent ocean accidents, and they can also arrange spent fuel in the interim storage facility so as to avoid the most risky region, which has a high risk from an aircraft engine shaft. Finally, they can apply soft material to the floor for specific stages of on-site transportation. In this software, because we targeted Korea, we needed to use Korean reference data. However, there were few Korean reference data. In particular, there were no food chain data for the Korean ocean. In MARINRAD, a steady-state food chain model was used, but it is far from reality. Therefore, to obtain realistic Korean reference data, a dynamic food chain model for the Korean ocean needs to be developed.

  8. Pegasys: software for executing and integrating analyses of biological sequences

    Directory of Open Access Journals (Sweden)

    Lett Drew

    2004-04-01

    Background: We present Pegasys – a flexible, modular and customizable software system that facilitates the execution and data integration from heterogeneous biological sequence analysis tools. Results: The Pegasys system includes numerous tools for pair-wise and multiple sequence alignment, ab initio gene prediction, RNA gene detection, masking repetitive sequences in genomic DNA as well as filters for database formatting and processing raw output from various analysis tools. We introduce a novel data structure for creating workflows of sequence analyses and a unified data model to store its results. The software allows users to dynamically create analysis workflows at run-time by manipulating a graphical user interface. All non-serial dependent analyses are executed in parallel on a compute cluster for efficiency of data generation. The uniform data model and backend relational database management system of Pegasys allow for results of heterogeneous programs included in the workflow to be integrated and exported into General Feature Format for further analyses in GFF-dependent tools, or GAME XML for import into the Apollo genome editor. The modularity of the design allows for new tools to be added to the system with little programmer overhead. The database application programming interface allows programmatic access to the data stored in the backend through SQL queries. Conclusions: The Pegasys system enables biologists and bioinformaticians to create and manage sequence analysis workflows. The software is released under the Open Source GNU General Public License. All source code and documentation is available for download at http://bioinformatics.ubc.ca/pegasys/.
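
    The parallel execution of all non-serially-dependent analyses can be pictured with a small scheduling sketch; the workflow below is hypothetical and stands in for Pegasys's actual cluster back end:

```python
# Toy illustration of the scheduling idea (not Pegasys itself): run all
# analyses whose dependencies are satisfied in parallel, wave by wave.
from concurrent.futures import ThreadPoolExecutor

# Hypothetical workflow: analysis -> set of analyses it depends on.
dag = {
    "mask_repeats": set(),
    "gene_predict": {"mask_repeats"},
    "blast": {"mask_repeats"},
    "merge_gff": {"gene_predict", "blast"},
}

def run(analysis: str) -> str:
    print("running", analysis)  # a real system would invoke the tool here
    return analysis

done: set = set()
with ThreadPoolExecutor() as pool:
    while len(done) < len(dag):
        ready = [a for a, deps in dag.items() if a not in done and deps <= done]
        # all non-serial-dependent analyses in this wave execute in parallel
        done.update(pool.map(run, ready))
print("all analyses completed:", done == set(dag))
```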

  9. Framework programmable platform for the advanced software development workstation. Integration mechanism design document

    Science.gov (United States)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Reddy, Uday; Ackley, Keith; Futrell, Mike

    1991-01-01

    The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by this model, this system development framework will take advantage of an integrated operating environment to effectively automate the management of the software development process so that costly mistakes during the development phase can be eliminated.

  10. Re-engineering software systems in the Department of Defense using integrated computer aided software engineering tools

    OpenAIRE

    Jennings, Charles A.

    1992-01-01

    Approved for public release; distribution is unlimited. The Department of Defense (DoD) is plagued with severe cost overruns and delays in developing software systems. Existing software within DoD, some of it developed 15 to 20 years ago, requires continual maintenance and modification. Major difficulties arise with maintaining older systems due to cryptic source code and a lack of adequate documentation. To remedy this situation, the DoD is pursuing the integrated computer aided software engi...

  11. Integration of Optical Coherence Tomography Scan Patterns to Augment Clinical Data Suite

    Science.gov (United States)

    Mason, S.; Patel, N.; Van Baalen, M.; Tarver, W.; Otto, C.; Samuels, B.; Koslovsky, M.; Schaefer, C.; Taiym, W.; Wear, M.

    2018-01-01

    Vision changes identified in long-duration spaceflight astronauts have led Space Medicine at NASA to adopt a more comprehensive clinical monitoring protocol. Optical Coherence Tomography (OCT) was recently implemented at NASA, including on board the International Space Station in 2013. NASA is collaborating with Heidelberg Engineering to increase the fidelity of the current OCT data set by integrating the traditional circumpapillary OCT image with radial and horizontal block images at the optic nerve head. The retinal nerve fiber layer was segmented by two experienced individuals. Intra-rater (N=4 subjects and 70 images) and inter-rater (N=4 subjects and 221 images) agreement analyses were performed. The results of this analysis and the potential benefits will be presented.
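
    The abstract does not name the agreement statistic used; as one plausible illustration, Cohen's kappa for two raters assigning categorical quality labels to the same images could be computed as follows:

```python
# Illustrative only: the abstract does not say which agreement statistic
# was used. Cohen's kappa for two raters assigning categorical labels
# (e.g. acceptable/repeat) to the same set of images.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n  # observed agreement
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    labels = set(rater_a) | set(rater_b)
    p_e = sum((counts_a[l] / n) * (counts_b[l] / n) for l in labels)  # chance
    return (p_o - p_e) / (1 - p_e)

# Made-up ratings for eight images from two raters.
a = ["ok", "ok", "repeat", "ok", "repeat", "ok", "ok", "ok"]
b = ["ok", "repeat", "repeat", "ok", "repeat", "ok", "ok", "repeat"]
print(round(cohens_kappa(a, b), 2))  # -> 0.5
```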

  12. Computer software design description for the integrated control and data acquisition system LDUA system

    International Nuclear Information System (INIS)

    Aftanas, B.L.

    1998-01-01

    This Computer Software Design Description (CSDD) document provides the overview of the software design for all the software that is part of the integrated control and data acquisition system of the Light Duty Utility Arm System (LDUA). It describes the major software components and how they interface. It also references the documents that contain the detailed design description of the components

  13. Propulsion/flight control integration technology (PROFIT) software system definition

    Science.gov (United States)

    Carlin, C. M.; Hastings, W. J.

    1978-01-01

    The Propulsion Flight Control Integration Technology (PROFIT) program is designed to develop a flying testbed dedicated to controls research. The control software for PROFIT is defined. Maximum flexibility, needed for long term use of the flight facility, is achieved through a modular design. The Host program processes inputs from the telemetry uplink, aircraft central computer, cockpit computer control and plant sensors to form an input data base for use by the control algorithms. The control algorithms, programmed as application modules, process the input data to generate an output data base. The Host program formats the data for output to the telemetry downlink, the cockpit computer control, and the control effectors. Two application modules are defined - the bill of materials F-100 engine control and the bill of materials F-15 inlet control.
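    The Host/module split described above is essentially a blackboard pattern: each frame, the Host assembles an input data base, the application modules map it to an output data base, and the Host dispatches the result. A minimal Python sketch, with invented signal names and toy control laws (the real PROFIT modules are flight control code):

```python
from typing import Callable, Dict

Database = Dict[str, float]

def host_cycle(sources: Dict[str, Callable[[], Database]],
               modules: Dict[str, Callable[[Database], Database]]) -> Database:
    """One control frame: gather inputs, run application modules, merge outputs."""
    input_db: Database = {}
    for read in sources.values():
        input_db.update(read())          # telemetry uplink, sensors, cockpit, ...
    output_db: Database = {}
    for run in modules.values():
        output_db.update(run(input_db))  # e.g. engine control, inlet control
    return output_db                     # Host formats this for downlink/effectors

# Invented stand-ins for the F-100 engine and F-15 inlet control modules.
outputs = host_cycle(
    sources={"sensors": lambda: {"n1_rpm": 9800.0, "mach": 0.9}},
    modules={"engine": lambda db: {"fuel_flow_cmd": 0.1 * db["n1_rpm"]},
             "inlet": lambda db: {"ramp_angle_cmd": 5.0 * db["mach"]}},
)
print(outputs)
```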

  14. Integrated software system for improving medical equipment management.

    Science.gov (United States)

    Bliznakov, Z; Pappous, G; Bliznakova, K; Pallikarakis, N

    2003-01-01

    The evolution of biomedical technology has led to an extraordinary use of medical devices in health care delivery. During the last decade, clinical engineering departments (CEDs) turned toward computerization and application of specific software systems for medical equipment management in order to improve their services and monitor outcomes. Recently, much emphasis has been given to patient safety. Through its Medical Device Directives, the European Union has required all member nations to use a vigilance system to prevent the reoccurrence of adverse events that could lead to injuries or death of patients or personnel as a result of equipment malfunction or improper use. The World Health Organization also has made this issue a high priority and has prepared a number of actions and recommendations. In the present work, a new integrated, Windows-oriented system is proposed, addressing all tasks of CEDs but also offering a global approach to their management needs, including vigilance. The system architecture is based on a star model, consisting of a central core module and peripheral units. Its development has been based on the integration of 3 software modules, each one addressing specific predefined tasks. The main features of this system include equipment acquisition and replacement management, inventory archiving and monitoring, follow up on scheduled maintenance, corrective maintenance, user training, data analysis, and reports. It also incorporates vigilance monitoring and information exchange for adverse events, together with a specific application for quality-control procedures. The system offers clinical engineers the ability to monitor and evaluate the quality and cost-effectiveness of the service provided by means of quality and cost indicators. Particular emphasis has been placed on the use of harmonized standards with regard to medical device nomenclature and classification. The system's practical applications have been demonstrated through a pilot

  15. Software Application Profile: RVPedigree: a suite of family-based rare variant association tests for normally and non-normally distributed quantitative traits.

    Science.gov (United States)

    Oualkacha, Karim; Lakhal-Chaieb, Lajmi; Greenwood, Celia Mt

    2016-04-01

    RVPedigree (Rare Variant association tests in Pedigrees) implements a suite of programs facilitating genome-wide analysis of association between a quantitative trait and autosomal region-based genetic variation. The main features here are the ability to appropriately test for association of rare variants with non-normally distributed quantitative traits, and also to appropriately adjust for related individuals, either from families or from population structure and cryptic relatedness. RVPedigree is available as an R package. The package includes calculation of kinship matrices, various options for coping with non-normality, three different ways of estimating statistical significance incorporating triaging to enable efficient use of the most computationally-intensive calculations, and a parallelization option for genome-wide analysis. The software is available from the Comprehensive R Archive Network [CRAN.R-project.org] under the name 'RVPedigree' and at [https://github.com/GreenwoodLab]. It has been published under General Public License (GPL) version 3 or newer. © The Author 2016; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.

  16. Software Testbed for Developing and Evaluating Integrated Autonomous Subsystems

    Science.gov (United States)

    Ong, James; Remolina, Emilio; Prompt, Axel; Robinson, Peter; Sweet, Adam; Nishikawa, David

    2015-01-01

    To implement fault tolerant autonomy in future space systems, it will be necessary to integrate planning, adaptive control, and state estimation subsystems. However, integrating these subsystems is difficult, time-consuming, and error-prone. This paper describes Intelliface/ADAPT, a software testbed that helps researchers develop and test alternative strategies for integrating planning, execution, and diagnosis subsystems more quickly and easily. The testbed's architecture, graphical data displays, and implementations of the integrated subsystems support easy plug and play of alternate components to support research and development in fault-tolerant control of autonomous vehicles and operations support systems. Intelliface/ADAPT controls NASA's Advanced Diagnostics and Prognostics Testbed (ADAPT), which comprises batteries, electrical loads (fans, pumps, and lights), relays, circuit breakers, inverters, and sensors. During plan execution, an experimenter can inject faults into the ADAPT testbed by tripping circuit breakers, changing fan speed settings, and closing valves to restrict fluid flow. The diagnostic subsystem, based on NASA's Hybrid Diagnosis Engine (HyDE), detects and isolates these faults to determine the new state of the plant, ADAPT. Intelliface/ADAPT then updates its model of the ADAPT system's resources and determines whether the current plan can be executed using the reduced resources. If not, the planning subsystem generates a new plan that reschedules tasks, reconfigures ADAPT, and reassigns the use of ADAPT resources as needed to work around the fault. The resource model, planning domain model, and planning goals are expressed using NASA's Action Notation Modeling Language (ANML). Parts of the ANML model are generated automatically, and other parts are constructed by hand using the Planning Model Integrated Development Environment, a visual Eclipse-based IDE that accelerates ANML model development. Because native ANML planners are currently
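    The integration loop the abstract sketches is: execute, diagnose, shrink the resource model on a fault, and replan when the current plan no longer fits. A schematic Python rendering of that cycle follows, with all HyDE/ANML specifics abstracted behind caller-supplied functions; the names here are placeholders, not Intelliface/ADAPT APIs:

```python
def autonomy_loop(plan, resources, execute_step, diagnose, feasible, replan):
    """Skeleton of the execute -> diagnose -> replan-on-fault cycle."""
    while plan:
        step = plan.pop(0)
        observation = execute_step(step)
        fault = diagnose(observation)      # e.g. isolate a tripped breaker
        if fault is not None:
            resources.discard(fault)       # reduced resource model
            if not feasible(plan, resources):
                plan = replan(resources)   # reschedule/reconfigure around the fault
    return resources
```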

  17. Dietary intake assessment using integrated sensors and software

    Science.gov (United States)

    Shang, Junqing; Pepin, Eric; Johnson, Eric; Hazel, David; Teredesai, Ankur; Kristal, Alan; Mamishev, Alexander

    2012-02-01

    The area of dietary assessment is becoming increasingly important as obesity rates soar, but valid measurement of the food intake in free-living persons is extraordinarily challenging. Traditional paper-based dietary assessment methods have limitations due to bias, user burden and cost, and therefore improved methods are needed to address important hypotheses related to diet and health. In this paper, we will describe the progress of our mobile Diet Data Recorder System (DDRS), where an electronic device is used for objective measurement of dietary intake in real time and at moderate cost. The DDRS consists of (1) a mobile device that integrates a smartphone and an integrated laser package, (2) software on the smartphone for data collection and laser control, (3) an algorithm to process acquired data for food volume estimation, which is the largest source of error in calculating dietary intake, and (4) database and interface for data storage and management. The estimated food volume, together with direct entries of food questionnaires and voice recordings, could provide dietitians and nutritional epidemiologists with more complete food descriptions and more accurate food portion sizes. In this paper, we will describe the system design of DDRS and initial results of dietary assessment.
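    The abstract identifies food volume as the dominant error source and the target of the laser-based measurement. One common way to turn scanned surface points into a volume estimate is a convex hull, shown below as a hedged illustration; the abstract does not state which algorithm DDRS actually uses, and a convex hull over-estimates concave foods:

```python
import numpy as np
from scipy.spatial import ConvexHull

# Toy stand-in for 3D points reconstructed from the laser scan of a food item (cm).
rng = np.random.default_rng(0)
points = rng.uniform(low=[0, 0, 0], high=[6, 4, 3], size=(500, 3))

hull = ConvexHull(points)
print(f"estimated volume: {hull.volume:.1f} cm^3")
```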

  18. Outcomes from the First Wingman Software in the Loop Integration Event: January 2017

    Science.gov (United States)

    2017-06-28

    ARL-TN-0830 ● June 2017 ● US Army Research Laboratory. Technical Note; dates covered: January 2017–September 2017. Outcomes from the First Wingman Software-in-the-Loop Integration Event: January 2017.

  19. Scrum2Kanban: Integrating Kanban and Scrum in a University Software Engineering Capstone Course

    OpenAIRE

    Matthies, Christoph

    2018-01-01

    Using university capstone courses to teach agile software development methodologies has become commonplace, as agile methods have gained support in professional software development. This usually means students are introduced to and work with the currently most popular agile methodology: Scrum. However, as the agile methods employed in the industry change and are adapted to different contexts, university courses must follow suit. A prime example of this is the Kanban method, which has recentl...

  20. The use of software agents and distributed objects to integrate enterprises: Compatible or competing technologies?

    Energy Technology Data Exchange (ETDEWEB)

    Pancerella, C.M.

    1998-04-01

    Distributed object and software agent technologies are two integration methods for connecting enterprises. The two technologies have overlapping goals--interoperability and architectural support for integrating software components--though to date little or no integration of the two technologies has been made at the enterprise level. The primary difference between these two technologies is that distributed object technologies focus on the problems inherent in connecting distributed heterogeneous systems whereas software agent technologies focus on the problems involved with coordination and knowledge exchange across domain boundaries. This paper addresses the integration of these technologies in support of enterprise integration across organizational and geographic boundaries. The authors discuss enterprise integration issues, review their experiences with both technologies, and make recommendations for future work. Neither technology is a panacea. Good software engineering techniques must be applied to integrate an enterprise because scalability and a distributed software development team are realities.

  1. High-integrity software, computation and the scientific method

    International Nuclear Information System (INIS)

    Hatton, L.

    2012-01-01

    Computation rightly occupies a central role in modern science. Datasets are enormous and the processing implications of some algorithms are equally staggering. With the continuing difficulties in quantifying the results of complex computations, it is of increasing importance to understand the role of computation in the essentially Popperian scientific method. In this paper, some of the problems with computation, for example the long-term unquantifiable presence of undiscovered defects, problems with programming languages and process issues, will be explored with numerous examples. One of the aims of the paper is to understand the implications of trying to produce high-integrity software and the limitations which still exist. Unfortunately, Computer Science itself suffers from an inability to be suitably critical of its practices and has operated in a largely measurement-free vacuum since its earliest days. Within computer science itself, this has not been so damaging in that it simply leads to unconstrained creativity and a rapid turnover of new technologies. In the applied sciences, however, which have to depend on computational results, such unquantifiability significantly undermines trust. It is time this particular demon was put to rest. (author)

  2. Integrated Software Environment for Pressurized Thermal Shock Analysis

    Directory of Open Access Journals (Sweden)

    Dino Araneo

    2011-01-01

    Full Text Available The present paper describes the main features and an application to a real Nuclear Power Plant (NPP) of an Integrated Software Environment (in the following referred to as “platform”) developed at University of Pisa (UNIPI) to perform Pressurized Thermal Shock (PTS) analysis. The platform is written in Java for portability and it implements all the steps foreseen in the methodology developed at UNIPI for the deterministic analysis of PTS scenarios. The methodology starts with the thermal hydraulic analysis of the NPP with a system code (such as Relap5-3D and Cathare2) during a selected transient scenario. The results so obtained are then processed to provide boundary conditions for the next step, that is, a CFD calculation. Once the system pressure and the RPV wall temperature are known, the stresses inside the RPV wall can be calculated by means of a Finite Element (FE) code. The last step of the methodology is the Fracture Mechanics (FM) analysis, using weight functions, aimed at evaluating the stress intensity factor (KI) at the crack tip, to be compared with the critical stress intensity factor (KIc). The platform automates all these steps foreseen in the methodology once the user specifies a number of boundary conditions at the beginning of the simulation.
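    The final FM step reduces to one quadrature: with the weight-function method, KI = ∫ σ(x)·h(x, a) dx over 0 ≤ x ≤ a, compared against KIc. A small Python sketch of that step follows (the platform itself is written in Java); the uniform stress profile and the generic weight function below are placeholders, whereas a real RPV analysis would take σ(x) from the FE step and use tabulated weight functions:

```python
import numpy as np

def stress_intensity_factor(sigma, weight, a, n=20000):
    """K_I = integral_0^a sigma(x) * h(x, a) dx (weight-function method).
    Crude trapezoidal quadrature; the integrand is singular at x = a,
    so the grid stops just short of the crack tip."""
    x = np.linspace(0.0, a * (1.0 - 1e-9), n)
    return np.trapz(sigma(x) * weight(x, a), x)

sigma = lambda x: 200.0e6 * np.ones_like(x)            # Pa, placeholder profile
weight = lambda x, a: 2.0 / np.sqrt(np.pi * (a - x))   # generic placeholder h(x, a)

a = 0.01                       # crack depth, m
KI = stress_intensity_factor(sigma, weight, a)
KIc = 100.0e6                  # Pa*sqrt(m), illustrative toughness
print(f"K_I = {KI / 1e6:.1f} MPa*sqrt(m); acceptable: {KI < KIc}")
```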

  3. Extending and Enhancing SAS (Static Analysis Suite)

    CERN Document Server

    Ho, David

    2016-01-01

    The Static Analysis Suite (SAS) is an open-source software package used to perform static analysis on C and C++ code, helping to ensure safety, readability and maintainability. In this Summer Student project, SAS was enhanced to improve ease of use and user customisation. A straightforward method of integrating static analysis into a project at compilation time was provided using the automated build tool CMake. The process of adding checkers to the suite was streamlined and simplified by developing an automatic code generator. To make SAS more suitable for continuous integration, a reporting mechanism summarising results was added. This suitability has been demonstrated by inclusion of SAS in the Future Circular Collider Software nightly build system. Scalability of the improved package was demonstrated by using the tool to analyse the ROOT code base.

  4. Experiences with Integrating Simulation into a Software Engineering Curriculum

    Science.gov (United States)

    Bollin, Andreas; Hochmuller, Elke; Mittermeir, Roland; Samuelis, Ladislav

    2012-01-01

    Software Engineering education must account for a broad spectrum of knowledge and skills software engineers will be required to apply throughout their professional life. Covering all the topics in depth within a university setting is infeasible due to curricular constraints as well as due to the inherent differences between educational…

  5. Integrated analysis software for bulk power system stability

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, T; Nagao, T; Takahashi, K [Central Research Inst. of Electric Power Industry, Tokyo (Japan)

    1994-12-31

    This paper presents three software packages developed in-house by the Central Research Institute of Electric Power Industry (CRIEPI) for bulk power network analysis, together with a user support system that manages, easily and reliably, the large volumes of data these packages require. (author) 3 refs., 7 figs., 2 tabs.

  6. Direct Integration: Training Software Developers to Conduct Usability Evaluations

    DEFF Research Database (Denmark)

    Skov, Mikael B.; Stage, Jan

    2008-01-01

    Many improvements of the interplay between usability evaluation and software development rely either on better methods for conducting usability evaluations or on better formats for presenting evaluation results in ways that are useful for software designers and developers. Both approaches involve a complete division of work between developers and evaluators, which is an undesirable complexity for many software development projects. This paper takes a different approach by exploring to what extent software developers and designers can be trained to carry out their own usability evaluations. The paper is based on an empirical study where 36 teams with a total of 234 first-year university students on software development and design educations were trained in a simple approach for user-based website usability testing that was taught in a 40 hour course. This approach supported them in planning, conducting...

  7. Optical Beam Deflection Based AFM with Integrated Hardware and Software Platform for an Undergraduate Engineering Laboratory

    Directory of Open Access Journals (Sweden)

    Siu Hong Loh

    2017-02-01

    Full Text Available Atomic force microscopy (AFM) has been used extensively in nanoscience research since its invention. Recently, many teaching laboratories in colleges, undergraduate institutions, and even high schools incorporate AFM as an effective teaching tool for nanoscience education. This paper presents an optical beam deflection (OBD) based atomic force microscope, designed specifically for the undergraduate engineering laboratory as a teaching instrument. An electronic module for signal conditioning was built with components that are commonly available in an undergraduate electronic laboratory. In addition to off-the-shelf mechanical parts and optics, the design of custom-built mechanical parts was kept as simple as possible. Hence, the overall cost for the setup is greatly reduced. The AFM controller was developed using National Instruments Educational Laboratory Virtual Instrumentation Suite (NI ELVIS), an integrated hardware and software platform which can be programmed in LabVIEW. A simple yet effective control algorithm for scanning and feedback control was developed. Despite the use of an educational platform and low-cost components from the undergraduate laboratory, the developed AFM is capable of performing imaging in constant-force mode with submicron resolution and at reasonable scanning speed (approximately 18 min per image). Therefore, the AFM is suitable to be used as an educational tool for nanoscience. Moreover, the construction of the system can be a valuable educational experience for electronic and mechanical engineering students.
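    In constant-force mode the feedback task is simple in outline: a PI loop drives the z-piezo so the photodiode deflection signal holds a setpoint while the tip scans. The Python sketch below mimics that loop against a toy sample model; gains, units, and signal names are invented and do not reflect the authors' LabVIEW implementation:

```python
def run_feedback(read_deflection, set_z, setpoint, kp=0.05, ki=100.0, dt=1e-4, steps=3000):
    """PI loop: the z-piezo command tracks the deflection setpoint."""
    integral, z = 0.0, 0.0
    for _ in range(steps):
        error = setpoint - read_deflection(z)
        integral += error * dt
        z = kp * error + ki * integral   # PI output = z-piezo command
        set_z(z)                         # analog output to the piezo driver
    return z

# Toy plant: deflection grows linearly once the tip touches the surface.
surface_height = 0.3
deflection = lambda z: max(0.0, z - surface_height)
final_z = run_feedback(deflection, lambda z: None, setpoint=0.1)
print(f"z command settles near {final_z:.3f} (surface height + setpoint = 0.4)")
```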

  8. Method for critical software event execution reliability in high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Kidd, M.E. [Sandia National Labs., Albuquerque, NM (United States)

    1997-11-01

    This report contains viewgraphs on a method called SEER, which provides a high level of confidence that critical software-driven event execution sequences faithfully execute in the face of transient computer architecture failures, in both normal and abnormal operating environments.

  9. Experiences from the formal specification of the integration platform and the synthesis of SDT with the software bus

    International Nuclear Information System (INIS)

    Thunem, Harald; Mohn, Peter; Sandmark, Haakon; Stoelen, Ketil

    1999-04-01

    The three year programme 1997-1999 for the OECD Halden Reactor Project (HRP) identifies the need to gain experience from applying formal techniques in real-life system developments. This motivated the initiation of the HRP research activity Integration of Formal Specification in the Development of HAMMLAB 2000 (INT-FS). The principal objective was to experiment with formal techniques in system developments at the HRP; in particular, system developments connected to HAMMLAB 2000 - the computerised laboratory for man-machine-interaction experiments currently under construction. It was hoped that this experimentation with formal techniques should result in a better understanding of how such techniques should be utilised in a more industrial setting. To obtain more knowledge with respect to the practical effects and consequences of an increased level of formalization was another objective. This report summarises experiences, results and conclusions from a pre-study addressing INT-FS related issues connected to the development of the HAMMLAB 2000 Integration Platform (IP). The report starts by giving a brief overview of the IP. Then it describes and summarises experiences from the formalization of a top-level requirements specification for the IP. Finally, it discusses various approaches for the integration of applications generated automatically through the CASE-tool SDT and the Software Bus on which the communication within HAMMLAB 2000 will be based. The report concludes that the selected formalisms and tools are well-suited to describe IP-like systems. The report also concludes that the integration of SDT applications with the Software Bus will not be a major obstacle, and finally that a monitoring component for the IP is well-suited for development within INT-FS (author) (ml)

  10. Software-Programmed Optical Networking with Integrated NFV Service Provisioning

    DEFF Research Database (Denmark)

    Mehmeri, Victor; Wang, Xi; Basu, Shrutarshi

    2017-01-01

    We showcase demonstrations of “program & compile” styled optical networking as well as open platforms & standards based NFV service provisioning using a proof-of-concept implementation of the Software-Programmed Networking Operating System (SPN OS).

  11. Continuous integration and quality control for scientific software

    Science.gov (United States)

    Neidhardt, A.; Ettl, M.; Brisken, W.; Dassing, R.

    2013-08-01

    Modern software has to be stable, portable, fast and reliable. This is becoming more and more important for scientific software as well. But it requires a sophisticated way to inspect, check and evaluate the quality of source code with a suitable, automated infrastructure. A centralized server with a software repository and a version control system is one essential part, to manage the code base and to control the different development versions. While each project can be compiled separately, the whole code base can also be compiled with one central “Makefile”. This is used to create automated, nightly builds. Additionally, all sources are inspected automatically with static code analysis and inspection tools, which check for well-known error situations, memory and resource leaks, performance issues, or style issues. In combination with an automatic documentation generator it is possible to create the developer documentation directly from the code and the inline comments. All reports and generated information are presented as HTML pages on a Web server. Because this environment increased the stability and quality of the software of the Geodetic Observatory Wettzell tremendously, it is now also available for scientific communities. One regular customer is already the developer group of the DiFX software correlator project.
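    A nightly-build driver of the kind described can be a very small script: update the sources, run the central Makefile, run the analyzers, and publish an HTML summary. A hedged Python sketch follows; the paths are invented, and git/make/cppcheck are merely examples of the version control, build, and static analysis tools such an infrastructure might call:

```python
import datetime, html, pathlib, subprocess

def run(cmd, cwd):
    p = subprocess.run(cmd, cwd=cwd, capture_output=True, text=True)
    return p.returncode, p.stdout + p.stderr

def nightly(repo="/srv/code", report_dir="/srv/www/reports"):
    steps = [("checkout", ["git", "pull"]),
             ("build",    ["make", "-k"]),               # one central Makefile
             ("analysis", ["cppcheck", "--enable=all", "."])]
    stamp = datetime.date.today().isoformat()
    page = [f"<h1>Nightly build {stamp}</h1>"]
    for name, cmd in steps:
        rc, log = run(cmd, repo)
        page.append(f"<h2>{name}: {'OK' if rc == 0 else 'FAILED'}</h2>")
        page.append(f"<pre>{html.escape(log)}</pre>")    # publish on the web server
    pathlib.Path(report_dir, f"{stamp}.html").write_text("\n".join(page))

if __name__ == "__main__":
    nightly()
```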

  12. Integrating Design Decision Management with Model-based Software Development

    DEFF Research Database (Denmark)

    Könemann, Patrick

    Design decisions are continuously made during the development of software systems and are important artifacts for design documentation. Dedicated decision management systems are often used to capture such design knowledge. Most such systems are, however, separated from the design artifacts...... of the system. In model-based software development, where design models are used to develop a software system, outcomes of many design decisions have big impact on design models. The realization of design decisions is often manual and tedious work on design models. Moreover, keeping design models consistent......, or by ignoring the causes. This substitutes manual reviews to some extent. The concepts, implemented in a tool, have been validated with design patterns, refactorings, and domain level tests that comprise a replay of a real project. This proves the applicability of the solution to realistic examples...

  13. Integration of Simulink Models with Component-based Software Models

    DEFF Research Database (Denmark)

    Marian, Nicolae

    2008-01-01

    Model based development aims to facilitate the development of embedded control systems by emphasizing the separation of the design level from the implementation level. Model based design involves the use of multiple models that represent different views of a system, having different semantics...... of abstract system descriptions. Usually, in mechatronics systems, design proceeds by iterating model construction, model analysis, and model transformation. Constructing a MATLAB/Simulink model, a plant and controller behavior is simulated using graphical blocks to represent mathematical and logical...... constraints. COMDES (Component-based Design of Software for Distributed Embedded Systems) is such a component-based system framework developed by the software engineering group of Mads Clausen Institute for Product Innovation (MCI), University of Southern Denmark. Once specified, the software model has...

  14. Integrating Dynamic Mathematics Software into Cooperative Learning Environments in Mathematics

    Science.gov (United States)

    Zengin, Yilmaz; Tatar, Enver

    2017-01-01

    The aim of this study was to evaluate the implementation of the cooperative learning model supported with dynamic mathematics software (DMS), that is a reflection of constructivist learning theory in the classroom environment, in the teaching of mathematics. For this purpose, a workshop was conducted with the volunteer teachers on the…

  15. Concrete containment integrity software: Procedure manual and guidelines

    International Nuclear Information System (INIS)

    Dameron, R.A.; Dunham, R.S.; Rashid, Y.R.

    1990-06-01

    This report is an executive summary describing the concrete containment analysis methodology and software that were developed in the EPRI-sponsored research to predict the overpressure behavior and leakage of concrete containments. A set of guidelines has been developed for performing reliable 2D axisymmetric concrete containment analysis with a cracking concrete constitutive model developed by ANATECH. The software package developed during this research phase is designed for use in conjunction with ABAQUS-EPGEN; it provides the concrete model and automates axisymmetric grid preparation and rebar generation for 2D and 3D grids. The software offers the option of generating pre-programmed axisymmetric grids that can be tailored to a specific containment by input of a few geometry parameters. The goal of simplified axisymmetric analysis within the framework of the containment leakage prediction methodology is to compute global liner strain histories at various locations within the containment. A simplified approach for generating peak liner strains at structural discontinuities as a function of the global liner strains has been presented in a separate leakage criteria document; the curves for strain magnification factors and liner stress triaxiality factors found in that document are intended to be applied to the global liner strain histories developed through global 2D analysis. This report summarizes the procedures for global 2D analysis and gives an overview of the constitutive model and the special purpose concrete containment analysis software developed in this research phase. 8 refs., 10 figs

  16. Integration of the MUSE Software Pipeline into the Astro-WISE System

    NARCIS (Netherlands)

    Pizagno, J.; Streicher, O.; Vriend, W.-J.; Ballester, P.; Egret, D.; Lorente, N.P.F.

    We discuss the current state of integrating the Multi Unit Spectroscopic Explorer (hereafter: MUSE) software pipeline (Weilbacher et al. 2006) into the Astro-WISE system (Valentijn et al. 2007a; Vriend et al. 2012). MUSE is a future integral-field spectrograph for the VLT, consisting of 24 Integral Field Units.

  17. Optimal integration and test plans for software releases of lithographic systems

    NARCIS (Netherlands)

    Boumen, R.; Jong, de I.S.M.; Mortel - Fronczak, van de J.M.; Rooda, J.E.

    2007-01-01

    This paper describes a method to determine the optimal integration and test plan for embedded systems software releases. The method consists of four steps: 1) describe the integration and test problem in an integration and test model, which is introduced in this paper; 2) determine possible test

  18. Integrated Software Development System/Higher Order Software Conceptual Description (ISDS/HOS)

    Science.gov (United States)

    1976-11-01

    The document covers structured flowchart conventions and design diagram notation (Higher Order Software, Inc., 843 Massachusetts Avenue, Cambridge, Massachusetts). HIPO diagrams are associated with the process steps and reference other HIPO diagrams as well as non-HIPO documentation such as flowcharts or decision tables. The environment must offer a syntax that is easy to learn and must provide the novice with some prompting to help avoid classic beginner errors, together with desirable editing capabilities.

  19. On integrating modeling software for application to total-system performance assessment

    International Nuclear Information System (INIS)

    Lewis, L.C.; Wilson, M.L.

    1994-05-01

    We examine the processes and methods used to facilitate collaboration in software development between two organizations at separate locations -- Lawrence Livermore National Laboratory (LLNL) in California and Sandia National Laboratories (SNL) in New Mexico. Our software development process integrated the efforts of these two laboratories. Software developed at LLNL to model corrosion and failure of waste packages and subsequent releases of radionuclides was incorporated as a source term into SNL's computer models for fluid flow and radionuclide transport through the geosphere

  20. State of the Art : Integrated Management of Requirements in Model-Based Software Engineering

    OpenAIRE

    Thörn, Christer

    2006-01-01

    This report describes the background and future of research concerning integrated management of requirements in model-based software engineering. The focus is on describing the relevant topics and existing theoretical backgrounds that form the basis for the research. The report describes the fundamental difficulties of requirements engineering for software projects, and proposes that the results and methods of models in software engineering can help leverage those problems. Taking inspiration...

  1. Evaluation and selection of open-source EMR software packages based on integrated AHP and TOPSIS.

    Science.gov (United States)

    Zaidan, A A; Zaidan, B B; Al-Haiqi, Ahmed; Kiah, M L M; Hussain, Muzammil; Abdulnabi, Mohamed

    2015-02-01

    Evaluating and selecting software packages that meet the requirements of an organization are difficult aspects of the software engineering process. Selecting the wrong open-source EMR software package can be costly and may adversely affect business processes and the functioning of the organization. This study aims to evaluate and select open-source EMR software packages based on multi-criteria decision-making. A hands-on study was performed and a set of open-source EMR software packages were implemented locally on separate virtual machines to examine the systems more closely. Several measures were specified as the evaluation basis, and the systems were selected based on a set of metric outcomes using the integrated Analytic Hierarchy Process (AHP) and TOPSIS. The experimental results showed that the GNUmed and OpenEMR software provide a better basis for ranking than the other open-source EMR software packages. Copyright © 2014 Elsevier Inc. All rights reserved.
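    TOPSIS itself is a short computation: normalize the decision matrix, weight it (this is where the AHP-derived weights enter), find the ideal and anti-ideal alternatives, and rank by relative closeness. A compact Python sketch with made-up scores and weights (the paper's actual criteria and values are not reproduced here):

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """matrix: alternatives x criteria; benefit[j] is True if larger is better.
    Returns the closeness coefficient of each alternative (higher = better)."""
    m = matrix / np.linalg.norm(matrix, axis=0)   # vector-normalize each criterion
    v = m * weights                               # weights, e.g. derived via AHP
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    worst = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_best = np.linalg.norm(v - ideal, axis=1)
    d_worst = np.linalg.norm(v - worst, axis=1)
    return d_worst / (d_best + d_worst)

# Illustrative scores for three EMR packages on usability, security, cost.
scores = np.array([[7.0, 8.0, 3.0],
                   [6.0, 9.0, 4.0],
                   [8.0, 6.0, 5.0]])
weights = np.array([0.5, 0.3, 0.2])
rank = topsis(scores, weights, benefit=np.array([True, True, False]))
print(rank.argsort()[::-1])   # indices of the packages, best to worst
```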

  2. Integrating HCI Specialists into Open Source Software Development Projects

    Science.gov (United States)

    Hedberg, Henrik; Iivari, Netta

    Typical open source software (OSS) development projects are organized around technically talented developers, whose communication is based on technical aspects and source code. Decision-making power is gained through proven competence and activity in the project, and non-technical end-user opinions are too many times neglected. In addition, also human-computer interaction (HCI) specialists have encountered difficulties in trying to participate in OSS projects, because there seems to be no clear authority and responsibility for them. In this paper, based on HCI and OSS literature, we introduce an extended OSS development project organization model that adds a new level of communication and roles for attending human aspects of software. The proposed model makes the existence of HCI specialists visible in the projects, and promotes interaction between developers and the HCI specialists in the course of a project.

  3. A framework to integrate software behavior into dynamic probabilistic risk assessment

    International Nuclear Information System (INIS)

    Zhu Dongfeng; Mosleh, Ali; Smidts, Carol

    2007-01-01

    Software plays an increasingly important role in modern safety-critical systems. Although research has been done to integrate software into the classical probabilistic risk assessment (PRA) framework, current PRA practice overwhelmingly neglects the contribution of software to system risk. Dynamic probabilistic risk assessment (DPRA) is considered to be the next generation of PRA techniques. DPRA is a set of methods and techniques in which simulation models that represent the behavior of the elements of a system are exercised in order to identify risks and vulnerabilities of the system. The fact remains, however, that modeling software for use in the DPRA framework is also quite complex and very little has been done to address the question directly and comprehensively. This paper develops a methodology to integrate software contributions in the DPRA environment. The framework includes a software representation, and an approach to incorporate the software representation into the DPRA environment SimPRA. The software representation is based on multi-level objects and the paper also proposes a framework to simulate the multi-level objects in the simulation-based DPRA environment. This is a new methodology to address the state explosion problem in the DPRA environment. This study is the first systematic effort to integrate software risk contributions into DPRA environments

  4. RAGE Reusable Game Software Components and Their Integration into Serious Game Engines

    NARCIS (Netherlands)

    Van der Vegt, Wim; Nyamsuren, Enkhbold; Westera, Wim

    2016-01-01

    This paper presents and validates a methodology for integrating reusable software components in diverse game engines. While conforming to the RAGE component-based architecture described elsewhere, the paper explains how the interactions and data exchange processes between a reusable software

  5. LearnWeb 2.0. Integrating Social Software for Lifelong Learning.

    NARCIS (Netherlands)

    Marenzi, Ivana; Demidova, Elena; Nejdl, Wolfgang

    2008-01-01

    Marenzi, I., Demidova, E., & Nejdl, W. (2008). LearnWeb 2.0. Integrating Social Software for Lifelong Learning. Proceedings of the ED-Media 2008. World Conference on Educational Multimedia, Hypermedia & Telecommunications. June, 30 - July, 4, 2008, Austria, Vienna.

  6. Integrated management software files action in case of fire; Software de gestion integral de fichas de actuacion en caso de incendio

    Energy Technology Data Exchange (ETDEWEB)

    Moreno-Ventas Garcia, V.; Gimeno Serrano, F.

    2010-07-01

    The proper management of emergencies is a challenge for which it is essential to be prepared. Integrated management software for fire action sheets, combined with rapid access to information, makes this application a must for effectively handling any fire emergency at a nuclear facility.

  7. Sensitivity Analysis for Design Optimization Integrated Software Tools, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this proposed project is to provide a new set of sensitivity analysis theory and codes, the Sensitivity Analysis for Design Optimization Integrated...

  8. Hermeneutics framework: integration of design rationale and optimizing software modules

    NARCIS (Netherlands)

    Aksit, Mehmet; Malakuti Khah Olun Abadi, Somayeh

    To tackle the evolution challenges of adaptive systems, this paper argues for the necessity of hermeneutic approaches that help to avoid too early elimination of design alternatives. This visionary paper proposes the Hermeneutics Framework, which computationally integrates a design rationale

  9. Analytical Tools for Space Suit Design

    Science.gov (United States)

    Aitchison, Lindsay

    2011-01-01

    As indicated by the implementation of multiple small project teams within the agency, NASA is adopting a lean approach to hardware development that emphasizes quick product realization and rapid response to shifting program and agency goals. Over the past two decades, space suit design has been evolutionary in approach with emphasis on building prototypes then testing with the largest practical range of subjects possible. The results of these efforts show continuous improvement but make scaled design and performance predictions almost impossible with limited budgets and little time. Thus, in an effort to start changing the way NASA approaches space suit design and analysis, the Advanced Space Suit group has initiated the development of an integrated design and analysis tool. It is a multi-year, if not decadal, development effort that, when fully implemented, is envisioned to generate analysis of any given space suit architecture or, conversely, predictions of ideal space suit architectures given specific mission parameters. The master tool will exchange information to and from a set of five sub-tool groups in order to generate the desired output. The basic functions of each sub-tool group, the initial relationships between the sub-tools, and a comparison to state of the art software and tools are discussed.

  10. A MODEL FOR INTEGRATED SOFTWARE TO IMPROVE COMMUNICATION POLICY IN DENTAL TECHNICAL LABS

    Directory of Open Access Journals (Sweden)

    Minko M. Milev

    2017-06-01

    Full Text Available Introduction: Integrated marketing communications (IMC) are all kinds of communications between organisations and customers, partners, other organisations and society. Aim: To develop and present an integrated software model which can improve the effectiveness of communications in dental technical services. Material and Methods: The model of integrated software is based on recommendations of a total of 700 respondents (students of dental technology, dental physicians, dental technicians and patients of dental technical laboratories in Northeastern Bulgaria). Results and Discussion: We present the benefits of future integrated software for improving the communication policy in the dental technical laboratory, meeting the need for fast cooperation and a well-built communication network between dental physicians, dental technicians, patients and students. Conclusion: The use of integrated communications could be a powerful unified approach to improving the communication policy between all players in the market of dental technical services.

  11. RAJA Performance Suite

    Energy Technology Data Exchange (ETDEWEB)

    2017-09-01

    The RAJA Performance Suite is designed to evaluate performance of the RAJA performance portability library on a wide variety of important high performance computing (HPC) algorithmic kernels. These kernels assess compiler optimizations and various parallel programming model backends accessible through RAJA, such as OpenMP, CUDA, etc. The initial version of the suite contains 25 computational kernels, each of which appears in 6 variants: Baseline Sequential, RAJA Sequential, Baseline OpenMP, RAJA OpenMP, Baseline CUDA, RAJA CUDA. All variants of each kernel perform essentially the same mathematical operations and the loop body code for each kernel is identical across all variants. There are a few kernels, such as those that contain reduction operations, that require CUDA-specific coding for their CUDA variants. Actual computer instructions executed and how they run in parallel differ depending on the parallel programming model backend used and which optimizations are performed by the compiler used to build the Performance Suite executable. The Suite will be used primarily by RAJA developers to perform regular assessments of RAJA performance across a range of hardware platforms and compilers as RAJA features are being developed. It will also be used by LLNL hardware and software vendor partners for defining requirements for future computing platform procurements and acceptance testing. In particular, the RAJA Performance Suite will be used for compiler acceptance testing of the upcoming CORAL Sierra machine (initial LLNL delivery expected in late 2017/early 2018) and the CORAL-2 procurement. The Suite will also be used to generate concise source code reproducers of compiler and runtime issues we uncover so that we may provide them to relevant vendors to be fixed.

  12. Integrating environmental component models. Development of a software framework

    NARCIS (Netherlands)

    Schmitz, O.

    2014-01-01

    Integrated models consist of interacting component models that represent various natural and social systems. They are important tools to improve our understanding of environmental systems, to evaluate cause–effect relationships of human–natural interactions, and to forecast the behaviour of

  13. SIMPATIQCO: a server-based software suite which facilitates monitoring the time course of LC-MS performance metrics on Orbitrap instruments.

    Science.gov (United States)

    Pichler, Peter; Mazanek, Michael; Dusberger, Frederico; Weilnböck, Lisa; Huber, Christian G; Stingl, Christoph; Luider, Theo M; Straube, Werner L; Köcher, Thomas; Mechtler, Karl

    2012-11-02

    While the performance of liquid chromatography (LC) and mass spectrometry (MS) instrumentation continues to increase, applications such as analyses of complete or near-complete proteomes and quantitative studies require constant and optimal system performance. For this reason, research laboratories and core facilities alike are recommended to implement quality control (QC) measures as part of their routine workflows. Many laboratories perform sporadic quality control checks. However, successive and systematic longitudinal monitoring of system performance would be facilitated by dedicated automatic or semiautomatic software solutions that aid an effortless analysis and display of QC metrics over time. We present the software package SIMPATIQCO (SIMPle AuTomatIc Quality COntrol) designed for evaluation of data from LTQ Orbitrap, Q-Exactive, LTQ FT, and LTQ instruments. A centralized SIMPATIQCO server can process QC data from multiple instruments. The software calculates QC metrics supervising every step of data acquisition from LC and electrospray to MS. For each QC metric the software learns the range indicating adequate system performance from the uploaded data using robust statistics. Results are stored in a database and can be displayed in a comfortable manner from any computer in the laboratory via a web browser. QC data can be monitored for individual LC runs as well as plotted over time. SIMPATIQCO thus assists the longitudinal monitoring of important QC metrics such as peptide elution times, peak widths, intensities, total ion current (TIC) as well as sensitivity, and overall LC-MS system performance; in this way the software also helps identify potential problems. The SIMPATIQCO software package is available free of charge.
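    Learning an "adequate performance" band from historical runs can be done with robust location/scale estimates that tolerate occasional bad runs. The abstract does not specify which robust statistics SIMPATIQCO uses; median ± k·MAD is one standard choice, sketched below in Python:

```python
import numpy as np

def robust_limits(history, k=3.0):
    """Band of adequate performance learned from past runs (median +/- k*MAD)."""
    x = np.asarray(history, dtype=float)
    med = np.median(x)
    mad = 1.4826 * np.median(np.abs(x - med))   # scaled MAD estimates sigma
    return med - k * mad, med + k * mad

# Toy history of one QC metric (chromatographic peak width, seconds).
peak_widths = [12.1, 11.8, 12.4, 12.0, 30.5, 12.2, 11.9]
lo, hi = robust_limits(peak_widths)
flagged = [w for w in peak_widths if not lo <= w <= hi]
print(f"acceptable range: {lo:.1f}-{hi:.1f} s; flagged runs: {flagged}")
```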

  14. Integration of drinking water treatment plant process models and emulated process automation software

    NARCIS (Netherlands)

    Worm, G.I.M.

    2012-01-01

    The objective of this research is to limit the risks of fully automated operation of drinking water treatment plants and to improve their operation by using an integrated system of process models and emulated process automation software. This thesis contains the design of such an integrated system.

  15. Experience with Intel's Many Integrated Core Architecture in ATLAS Software

    CERN Document Server

    Fleischmann, S; The ATLAS collaboration; Lavrijsen, W; Neumann, M; Vitillo, R

    2014-01-01

    Intel recently released the first commercial boards of its Many Integrated Core (MIC) Architecture. MIC is Intel's solution for the domain of throughput computing, currently dominated by general purpose programming on graphics processors (GPGPU). MIC allows the use of the more familiar x86 programming model and supports standard technologies such as OpenMP, MPI, and Intel's Threading Building Blocks. This should make it possible to develop for both throughput and latency devices using a single code base.

  16. Experience with Intel's Many Integrated Core Architecture in ATLAS Software

    CERN Document Server

    Fleischmann, S; The ATLAS collaboration; Lavrijsen, W; Neumann, M; Vitillo, R

    2013-01-01

    Intel recently released the first commercial boards of its Many Integrated Core (MIC) Architecture. MIC is Intel's solution for the domain of throughput computing, currently dominated by general purpose programming on graphics processors (GPGPU). MIC allows the use of the more familiar x86 programming model and supports standard technologies such as OpenMP, MPI, and Intel's Threading Building Blocks. This should make it possible to develop for both throughput and latency devices using a single code base.

  17. A method for establishing integrity in software-based systems

    International Nuclear Information System (INIS)

    Staple, B.D.; Berg, R.S.; Dalton, L.J.

    1997-01-01

    In this paper, the authors present a digital system requirements specification method that has demonstrated a potential for improving the completeness of requirements while reducing ambiguity. It assists with making proper digital system design decisions, including the defense against specific digital system failures modes. It also helps define the technical rationale for all of the component and interface requirements. This approach is a procedural method that abstracts key features that are expanded in a partitioning that identifies and characterizes hazards and safety system function requirements. The key system features are subjected to a hierarchy that progressively defines their detailed characteristics and components. This process produces a set of requirements specifications for the system and all of its components. Based on application to nuclear power plants, the approach described here uses two ordered domains: plant safety followed by safety system integrity. Plant safety refers to those systems defined to meet the safety goals for the protection of the public. Safety system integrity refers to systems defined to ensure that the system can meet the safety goals. Within each domain, a systematic process is used to identify hazards and define the corresponding means of defense and mitigation. In both domains, the approach and structure are focused on the completeness of information and eliminating ambiguities in the generation of safety system requirements that will achieve the plant safety goals

  18. Development, Validation and Integration of the ATLAS Trigger System Software in Run 2

    CERN Document Server

    Keyes, Robert; The ATLAS collaboration

    2016-01-01

    The trigger system of the ATLAS detector at the LHC is a combination of hardware, firmware and software, associated to various sub-detectors that must seamlessly cooperate in order to select 1 collision of interest out of every 40,000 delivered by the LHC every millisecond. This talk will discuss the challenges, workflow and organization of the ongoing trigger software development, validation and deployment. This development, from the top level integration and configuration to the individual components responsible for each sub system, is done to ensure that the most up to date algorithms are used to optimize the performance of the experiment. This optimization hinges on the reliability and predictability of the software performance, which is why validation is of the utmost importance. The software adheres to a hierarchical release structure, with newly validated releases propagating upwards. Integration tests are carried out on a daily basis to ensure that the releases deployed to the online trigger farm duri...

  19. Integration testing through reusing representative unit test cases for high-confidence medical software.

    Science.gov (United States)

    Shin, Youngsul; Choi, Yunja; Lee, Woo Jin

    2013-06-01

    As medical software is getting larger-sized, complex, and connected with other devices, finding faults in integrated software modules gets more difficult and time consuming. Existing integration testing typically takes a black-box approach, which treats the target software as a black box and selects test cases without considering internal behavior of each software module. Though it could be cost-effective, this black-box approach cannot thoroughly test interaction behavior among integrated modules and might leave critical faults undetected, which should not happen in safety-critical systems such as medical software. This work anticipates that information on internal behavior is necessary even for integration testing to define thorough test cases for critical software and proposes a new integration testing method by reusing test cases used for unit testing. The goal is to provide a cost-effective method to detect subtle interaction faults at the integration testing phase by reusing the knowledge obtained from unit testing phase. The suggested approach notes that the test cases for the unit testing include knowledge on internal behavior of each unit and extracts test cases for the integration testing from the test cases for the unit testing for a given test criteria. The extracted representative test cases are connected with functions under test using the state domain and a single test sequence to cover the test cases is produced. By means of reusing unit test cases, the tester has effective test cases to examine diverse execution paths and find interaction faults without analyzing complex modules. The produced test sequence can have test coverage as high as the unit testing coverage and its length is close to the length of optimal test sequences. Copyright © 2013 Elsevier Ltd. All rights reserved.
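    The core idea, reusing unit-level test cases (which encode each module's internal behavior) to drive a single integration sequence, can be illustrated loosely in Python. This is not the paper's extraction algorithm, only the flavor of it: representative unit cases are chained so that each module's output feeds the next, and the expectations from unit testing are checked along the way:

```python
def chain_cases(unit_cases, modules, functions):
    """unit_cases: {module: [(input, expected), ...]}; modules: execution order.
    Reuses one representative unit case per module to build a test sequence."""
    sequence, value = [], None
    for name in modules:
        inp, expected = unit_cases[name][0]      # representative unit case
        value = inp if value is None else value  # first module gets the unit input
        result = functions[name](value)          # run the real module under test
        sequence.append((name, value, result, expected))
        value = result                           # output drives the next module
    return sequence

def normalize(x): return x / 100.0
def classify(x):  return "alarm" if x > 0.5 else "ok"

cases = {"normalize": [(60.0, 0.6)], "classify": [(0.6, "alarm")]}
funcs = {"normalize": normalize, "classify": classify}
for name, inp, out, expected in chain_cases(cases, ["normalize", "classify"], funcs):
    assert out == expected, f"{name}({inp}) -> {out}, expected {expected}"
print("integration sequence covered all representative unit cases")
```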

  20. Dexterity: A MATLAB-based analysis software suite for processing and visualizing data from tasks that measure arm or forelimb function.

    Science.gov (United States)

    Butensky, Samuel D; Sloan, Andrew P; Meyers, Eric; Carmel, Jason B

    2017-07-15

    Hand function is critical for independence, and neurological injury often impairs dexterity. To measure hand function in people or forelimb function in animals, sensors are employed to quantify manipulation. These sensors make assessment easier and more quantitative and allow automation of these tasks. While automated tasks improve objectivity and throughput, they also produce large amounts of data that can be burdensome to analyze. We created software called Dexterity that simplifies data analysis of automated reaching tasks. Dexterity is MATLAB software that enables quick analysis of data from forelimb tasks. Through a graphical user interface, files are loaded and data are identified and analyzed. These data can be annotated or graphed directly. Analysis is saved, and the graph and corresponding data can be exported. For additional analysis, Dexterity provides access to custom scripts created by other users. To determine the utility of Dexterity, we performed a study to evaluate the effects of task difficulty on the degree of impairment after injury. Dexterity analyzed two months of data and allowed new users to annotate the experiment, visualize results, and save and export data easily. Previous analysis of tasks was performed with custom data analysis, requiring expertise with analysis software. Dexterity made the tools required to analyze, visualize and annotate data easy to use by investigators without data science experience. Dexterity increases accessibility to automated tasks that measure dexterity by making analysis of large data intuitive, robust, and efficient. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. An integrated computer design environment for the development of micro-computer critical software

    International Nuclear Information System (INIS)

    De Agostino, E.; Massari, V.

    1986-01-01

    The paper deals with the development of micro-computer software for Nuclear Safety Systems. More specifically, it describes an experimental work in the field of software development methodologies to be used for the implementation of micro-computer based safety systems. An investigation of technological improvements that are provided by state-of-the-art integrated packages for micro-based systems development has been carried out. The work has aimed to assess a suitable automated tools environment for the whole software life-cycle. The main safety functions of a nuclear power reactor, such as DNBR and kW/ft, have been implemented in a host-target approach. A prototype test-bed microsystem has been implemented to run the safety functions in order to derive a concrete evaluation of the feasibility of critical software according to new technological trends of ''Software Factories''. (author)

  2. User-driven integrated software lives: ``Paleomag'' paleomagnetics analysis on the Macintosh

    Science.gov (United States)

    Jones, Craig H.

    2002-12-01

    "PaleoMag," a paleomagnetics analysis package originally developed for the Macintosh operating system in 1988, allows examination of demagnetization of individual samples and analysis of directional data from collections of samples. Prior to recent reinvigorated development of the software for both Macintosh and Windows, it was widely used despite not running properly on machines and operating systems sold after 1995. This somewhat surprising situation demonstrates that there is a continued need for integrated analysis software within the earth sciences, in addition to well-developed scripting and batch-mode software. One distinct advantage of software like PaleoMag is in the ability to combine quality control with analysis within a unique graphical environment. Because such demands are frequent within the earth sciences, means of nurturing the development of similar software should be found.

  3. Software quality assurance plan for the National Ignition Facility integrated computer control system

    International Nuclear Information System (INIS)

    Woodruff, J.

    1996-11-01

    Quality achievement is the responsibility of the line organizations of the National Ignition Facility (NIF) Project. This Software Quality Assurance Plan (SQAP) applies to the activities of the Integrated Computer Control System (ICCS) organization and its subcontractors. The Plan describes the activities implemented by the ICCS section to achieve quality in the NIF Project's controls software and implements the NIF Quality Assurance Program Plan (QAPP, NIF-95-499, L-15958-2) and the Department of Energy's (DOE's) Order 5700.6C. This SQAP governs the quality affecting activities associated with developing and deploying all control system software during the life cycle of the NIF Project

  4. FEATUREOUS: AN INTEGRATED ENVIRONMENT FOR FEATURE-CENTRIC ANALYSIS AND MODIFICATION OF OBJECT-ORIENTED SOFTWARE

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2011-01-01

    The decentralized nature of collaborations between objects in object-oriented software makes it difficult to understand the implementations of user-observable program features and their respective interdependencies. As feature-centric program understanding and modification are essential during...... software maintenance and evolution, this situation needs to change. In this paper, we present Featureous, an integrated development environment built on top of the NetBeans IDE that facilitates feature-centric analysis of object-oriented software. Our integrated development environment encompasses...... a lightweight feature location mechanism, a number of reusable analytical views, and necessary APIs for supporting future extensions. The base of the integrated development environment is a conceptual framework comprising three complementary dimensions of comprehension: perspective, abstraction...

  5. Integrated Payload Data Handling Systems Using Software Partitioning

    Science.gov (United States)

    Taylor, Alun; Hann, Mark; Wishart, Alex

    2015-09-01

    An integrated Payload Data Handling System (I-PDHS) is one in which multiple instruments share a central payload processor for their on-board data processing tasks. This offers a number of advantages over the conventional decentralised architecture. Savings in payload mass and power can be realised because the total processing resource is matched to the requirements, as opposed to the decentralised architecture, where the processing resource is in effect the sum of all the applications. Overall development cost can be reduced using a common processor. At individual instrument level the potential benefits include a standardised application development environment, and the opportunity to run the instrument data handling application on a fully redundant and more powerful processing platform [1]. This paper describes a joint program by SCISYS UK Limited, Airbus Defence and Space, Imperial College London and RAL Space to implement a realistic demonstration of an I-PDHS using engineering models of flight instruments (a magnetometer and camera) and a laboratory demonstrator of a central payload processor which is functionally representative of a flight design. The objective is to raise the Technology Readiness Level of the centralised data processing technique by addressing the key areas of task partitioning to prevent fault propagation and the use of a common development process for the instrument applications. The project is supported by a UK Space Agency grant awarded under the National Space Technology Program SpaceCITI scheme.

  6. The dynamic of modern software development project management and the software crisis of quality. An integrated system dynamics approach towards software quality improvement

    OpenAIRE

    Nasirikaljahi, Armindokht

    2012-01-01

    The software industry is plagued by cost-overruns, delays, poor customer satisfaction and quality issues that are costing clients and customers world-wide billions of dollars each year. The phenomenon is coined "The Software Crisis", and poses a huge challenge for software project management. This thesis addresses one of the core issues of the software crisis, namely software quality. The challenges of software quality are central for understanding the other symptoms of the software crisis. Th...

  7. Assessment of the integration capability of system architectures from a complex and distributed software systems perspective

    Science.gov (United States)

    Leuchter, S.; Reinert, F.; Müller, W.

    2014-06-01

    Procurement and design of system architectures capable of network centric operations demand an assessment scheme in order to compare different alternative realizations. In this contribution an assessment method for system architectures targeted at the C4ISR domain is presented. The method addresses the integration capability of software systems from a complex and distributed software system perspective, focusing on communication, interfaces and software. The aim is to evaluate the capability to integrate a system or its functions within a system-of-systems network. This method uses approaches from software architecture quality assessment and applies them at the system architecture level. It features a specific goal tree of several dimensions that are relevant for enterprise integration. These dimensions have to be weighted against each other and aggregated using methods from normative decision theory in order to reflect the intention of the particular enterprise integration effort. The indicators and measurements for many of the considered quality features rely on a model-based view of systems, networks, and the enterprise. That means the method is applicable to system-of-systems specifications based on enterprise architectural frameworks relying on defined meta-models or domain ontologies for defining views and viewpoints. In the defense context we use the NATO Architecture Framework (NAF) to ground the respective system models. The proposed assessment method allows evaluating and comparing competing system designs with regard to their future integration potential. It is a contribution to the system-of-systems engineering methodology.
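
    The weighting-and-aggregation step described above is, at its core, a multi-attribute scoring exercise. The following minimal sketch illustrates that idea; the dimension names, scores, and weights are invented for illustration and are not taken from the paper:

        # Minimal sketch of aggregating goal-tree dimension scores with a
        # weighted sum, as in normative (multi-attribute) decision theory.
        # Dimension names, weights, and scores are illustrative only.

        def aggregate(scores, weights):
            """Totalize per-dimension scores in [0, 1] into one figure of merit."""
            assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
            return sum(scores[dim] * w for dim, w in weights.items())

        weights = {"communication": 0.4, "interfaces": 0.35, "software": 0.25}
        design_a = {"communication": 0.8, "interfaces": 0.6, "software": 0.7}
        design_b = {"communication": 0.6, "interfaces": 0.9, "software": 0.8}

        for name, scores in [("A", design_a), ("B", design_b)]:
            print(name, round(aggregate(scores, weights), 3))

    Swapping in a different weight vector models a different enterprise integration intention, which is the role the paper assigns to the normative-decision step.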

  8. Software FMEA analysis for safety-related application software

    International Nuclear Information System (INIS)

    Park, Gee-Yong; Kim, Dong Hoon; Lee, Dong Young

    2014-01-01

    Highlights: • We develop a modified FMEA analysis suited to software architectures. • A template for failure modes in a specific software language is established. • A detailed-level software FMEA analysis of nuclear safety software is presented. - Abstract: A method of software safety analysis is described in this paper for safety-related application software. The target software system is the software code installed in an Automatic Test and Interface Processor (ATIP) in a digital reactor protection system (DRPS). For the ATIP software safety analysis, first, an overall safety or hazard analysis is performed over the software architecture and modules, and then a detailed safety analysis based on the software FMEA (Failure Modes and Effects Analysis) method is applied to the ATIP program. For an efficient analysis, the software FMEA analysis is carried out based on a so-called failure-mode template extracted from the function blocks used in the function block diagram (FBD) for the ATIP software. The software safety analysis by software FMEA, applied to the ATIP software code, which has been integrated and passed through a very rigorous system test procedure, is proven to be able to provide very valuable results (i.e., software defects) that could not be identified during the various system tests.
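
    A failure-mode template of this kind can be represented as a simple mapping from function-block type to candidate failure modes, instantiated for every block in the diagram. The sketch below illustrates the data structure only; the block types and failure modes are invented examples, not the paper's template:

        from dataclasses import dataclass

        # failure-mode template keyed by FBD function-block type;
        # block types and failure modes below are invented examples
        FAILURE_MODE_TEMPLATE = {
            "AND":   ["output stuck TRUE", "output stuck FALSE"],
            "TIMER": ["timer never expires", "timer expires early"],
            "CMP":   ["comparison inverted", "setpoint corrupted"],
        }

        @dataclass
        class BlockInstance:
            name: str
            block_type: str

        def enumerate_failure_modes(blocks):
            """Instantiate the template for every block; each pair is one FMEA row."""
            for blk in blocks:
                for mode in FAILURE_MODE_TEMPLATE.get(blk.block_type, []):
                    yield blk.name, mode

        diagram = [BlockInstance("trip_vote", "AND"), BlockInstance("delay_1", "TIMER")]
        for name, mode in enumerate_failure_modes(diagram):
            print(f"{name}: {mode}")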

  9. The MSRC Ab Initio Methods Benchmark Suite: A measurement of hardware and software performance in the area of electronic structure methods

    Energy Technology Data Exchange (ETDEWEB)

    Feller, D.F.

    1993-07-01

    This collection of benchmark timings represents a snapshot of the hardware and software capabilities available for ab initio quantum chemical calculations at Pacific Northwest Laboratory's Molecular Science Research Center in late 1992 and early 1993. The "snapshot" nature of these results should not be underestimated, because of the speed with which both hardware and software are changing. Even during the brief period of this study, we were presented with newer, faster versions of several of the codes. However, the deadline for completing this edition of the benchmarks precluded updating all the relevant entries in the tables. As will be discussed below, a similar situation occurred with the hardware. The timing data included in this report are subject to all the normal failures, omissions, and errors that accompany any human activity. In an attempt to mimic the manner in which calculations are typically performed, we have run the calculations with the maximum number of defaults provided by each program and a near minimum amount of memory. This approach may not produce the fastest performance that a particular code can deliver. It is not known to what extent improved timings could be obtained for each code by varying the run parameters. If sufficient interest exists, it might be possible to compile a second list of timing data corresponding to the fastest observed performance from each application, using an unrestricted set of input parameters. Improvements in I/O might have been possible by fine tuning the Unix kernel, but we resisted the temptation to make changes to the operating system. Due to the large number of possible variations in levels of operating system, compilers, speed of disks and memory, versions of applications, etc., readers of this report may not be able to exactly reproduce the times indicated. Copies of the output files from individual runs are available if questions arise about a particular set of timings.

  10. An integrated software testing framework for FGA-based controllers in nuclear power plants

    International Nuclear Information System (INIS)

    Kim, Jae Yeob; Kim, Eun Sub; Yoo, Jun Beom; Lee, Young Jun; Choi, Jong Gyun

    2016-01-01

    Field-programmable gate arrays (FPGAs) have received much attention from the nuclear industry as an alternative platform to programmable logic controllers for digital instrumentation and control. The software aspect of FPGA development consists of several steps of synthesis and refinement, and also requires verification activities, such as simulations, that are performed individually at each step. This study proposed an integrated software-testing framework for simulating all artifacts of the FPGA software development simultaneously and evaluating whether all artifacts work correctly using common oracle programs. The method also generates a massive number of meaningful simulation scenarios that reflect reactor shutdown logics. The experiment, which was performed on two FPGA software implementations, showed that the framework can dramatically save both time and cost.
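
    The common-oracle idea above can be pictured as wrapping every development artifact (behavioral model, post-synthesis netlist simulation, and so on) behind the same interface and checking each against one reference program over generated scenarios. The sketch below assumes such callable wrappers and an invented shutdown-logic oracle; it illustrates the testing pattern, not the paper's framework:

        import random

        def oracle(temp, pressure):
            # reference shutdown logic (invented): trip when either is out of range
            return temp > 350 or pressure > 150

        def run_all(artifacts, n_scenarios=1000, seed=0):
            rng = random.Random(seed)
            for i in range(n_scenarios):
                temp, pressure = rng.uniform(200, 400), rng.uniform(50, 200)
                expected = oracle(temp, pressure)
                for name, artifact in artifacts.items():
                    if artifact(temp, pressure) != expected:
                        print(f"scenario {i}: {name} disagrees with oracle")

        # stand-in wrappers; real ones would drive an HDL simulator per artifact
        artifacts = {
            "behavioral_model": lambda t, p: t > 350 or p > 150,
            "netlist_sim":      lambda t, p: t > 350 or p > 150,
        }
        run_all(artifacts)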

  11. Using MDA for integration of heterogeneous components in software supply chains

    NARCIS (Netherlands)

    Hartmann, Johan Herman; Keren, Mila; Matsinger, Aart; Rubin, Julia; Trew, Tim; Yatzkar-Haham, Tali

    2013-01-01

    Software product lines are increasingly built using components from specialized suppliers. A company that is in the middle of a supply chain has to integrate components from its suppliers and offer (partially configured) products to its customers. To satisfy both the variability required by each

  12. On Integrating Student Empirical Software Engineering Studies with Research and Teaching Goals

    NARCIS (Netherlands)

    Galster, Matthias; Tofan, Dan; Avgeriou, Paris

    2012-01-01

    Background: Many empirical software engineering studies use students as subjects and are conducted as part of university courses. Aim: We aim at reporting our experiences with using guidelines for integrating empirical studies with our research and teaching goals. Method: We document our experience

  13. Integrating communication protocol selection with partitioning in hardware/software codesign

    DEFF Research Database (Denmark)

    Knudsen, Peter Voigt; Madsen, Jan

    1998-01-01

    frequencies of system components such as buses, CPU's, ASIC's, software code size, hardware area, and component prices. A distinct feature of the model is the modeling of driver processing of data (packing, splitting, compression, etc.) and its impact on communication throughput. The integration...

  14. The Rapid Integration and Test Environment - A Process for Achieving Software Test Acceptance

    OpenAIRE

    Jack, Rick

    2010-01-01

    Proceedings Paper (for Acquisition Research Program) Approved for public release; distribution unlimited. The Rapid Integration and Test Environment (RITE) initiative, implemented by the Program Executive Office, Command, Control, Communications, Computers and Intelligence, Command and Control Program Office (PMW-150), was born of necessity. Existing processes for requirements definition and management, as well as those for software development, did not consistently deliver high-qualit...

  15. The impact of continuous integration on other software development practices: a large-scale empirical study

    NARCIS (Netherlands)

    Zhao, Y.; Serebrenik, A.; Zhou, Y.; Filkov, V.; Vasilescu, B.N.

    2017-01-01

    Continuous Integration (CI) has become a disruptive innovation in software development: with proper tool support and adoption, positive effects have been demonstrated for pull request throughput and scaling up of project sizes. As any other innovation, adopting CI implies adapting existing practices

  16. LipiDex: An Integrated Software Package for High-Confidence Lipid Identification.

    Science.gov (United States)

    Hutchins, Paul D; Russell, Jason D; Coon, Joshua J

    2018-04-17

    State-of-the-art proteomics software routinely quantifies thousands of peptides per experiment with minimal need for manual validation or processing of data. For the emerging field of discovery lipidomics via liquid chromatography-tandem mass spectrometry (LC-MS/MS), comparably mature informatics tools do not exist. Here, we introduce LipiDex, a freely available software suite that unifies and automates all stages of lipid identification, reducing hands-on processing time from hours to minutes for even the most expansive datasets. LipiDex utilizes flexible in silico fragmentation templates and lipid-optimized MS/MS spectral matching routines to confidently identify and track hundreds of lipid species and unknown compounds from diverse sample matrices. Unique spectral and chromatographic peak purity algorithms accurately quantify co-isolation and co-elution of isobaric lipids, generating identifications that match the structural resolution afforded by the LC-MS/MS experiment. During final data filtering, ionization artifacts are removed to significantly reduce dataset redundancy. LipiDex interfaces with several LC-MS/MS software packages, enabling robust lipid identification to be readily incorporated into pre-existing data workflows. Copyright © 2018 Elsevier Inc. All rights reserved.
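
    The spectral-matching step described above is, generically, a similarity score between an acquired MS/MS spectrum and an in silico template. As a hedged sketch of that generic scheme (LipiDex's lipid-optimized routines are more elaborate; the peak lists and m/z tolerance below are invented), a square-root-weighted dot product can score a query spectrum against a library entry:

        import math

        def dot_product_score(library, query, tol=0.01):
            """library, query: lists of (mz, intensity). Cosine-like score in [0, 1]."""
            shared = 0.0
            for mz_l, il in library:
                # pair each template peak with the closest query peak within tolerance
                best = max((iq for mz_q, iq in query if abs(mz_l - mz_q) <= tol),
                           default=0.0)
                shared += math.sqrt(il * best)
            sum_l = sum(i for _, i in library)
            sum_q = sum(i for _, i in query)
            return shared ** 2 / (sum_l * sum_q) if sum_l and sum_q else 0.0

        library = [(184.073, 100.0), (104.107, 35.0)]   # in silico template peaks
        query = [(184.072, 90.0), (104.108, 30.0), (60.081, 5.0)]
        print(round(dot_product_score(library, query), 3))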

  17. An Integrated Platform for Dynamic Software Updating and its Application in Self-* systems

    DEFF Research Database (Denmark)

    Gregersen, Allan Raundahl; Jørgensen, Bo Nørregaard; Hadaytullah

    2012-01-01

    Practical dynamic updating of modern Java applications requires tool support to become an integral part of the software development and maintenance lifecycle. In this paper we present Javeleon, an easy-to-use tool for dynamic updates of Java applications. To support integration with specific...... frameworks, component systems and application servers, Javeleon currently provides tight integration with the NetBeans Platform, facilitating dynamic updating for applications built on top of the NetBeans Platform in an unconstrained manner. Javeleon supports state-preserving unanticipated runtime evolution...

  18. An intelligent and integrated V and V environment design for NPP I and C software systems

    International Nuclear Information System (INIS)

    Koo, Seo Ryong; Son Han Seong; Seong, Poong Hyun

    2001-01-01

    Nuclear power plants (NPPs) are safety-critical systems. Since nuclear instrumentation and control (I and C) systems, including the plant protection system, play the role of the brain of the plant, they have a direct influence on the safety and operation of an NPP. Essentially, software V and V should be performed for safety-critical systems that are based on software. This is also very important from a technical standpoint because of the problems concerning license acquisition. In this work, an intelligent and integrated V and V environment supporting the automation of V and V was designed. The environment consists of an intelligent controller part, a components part, an interface part, and a GUI part. These parts were integrated systematically, while retaining their own independent functions

  19. Power, Avionics and Software - Phase 1.0: Subsystem Integration Test Report

    Science.gov (United States)

    Ivancic, William D.; Sands, Obed S.; Bakula, Casey J.; Oldham, Daniel R.; Wright, Ted; Bradish, Martin A.; Klebau, Joseph M.

    2014-01-01

    This report describes Power, Avionics and Software (PAS) 1.0 subsystem integration testing and test results that occurred in August and September of 2013. This report covers the capabilities of each PAS assembly to meet integration test objectives for non-safety critical, non-flight, non-human-rated hardware and software development. This test report is the outcome of the first integration of the PAS subsystem and is meant to provide data for subsequent designs, development and testing of the future PAS subsystems. The two main objectives were to assess the ability of the PAS assemblies to exchange messages and to perform audio testing of both inbound and outbound channels. This report describes each test performed, defines the test, the data, and provides conclusions and recommendations.

  20. Westinghouse integrated protection system. An overview of the software design and maintenance features

    International Nuclear Information System (INIS)

    Gibson, R.J.

    1995-01-01

    The Westinghouse Integrated Protection System was designed with the goal of providing a system which can be easily verified, validated, and maintained. The software design and structure promote the ease of translation from functional requirements to applications function software while also improving the ability to verify and maintain the applications function software. The use of independent, reusable, common-function software modules focuses the design, verification, and validation of the software and reduces the likelihood of errors occurring during the application and maintenance of the software. The simple continuous-loop method of operation used throughout the IPS provides a standard, deterministic mode of operation. The IPS design also incorporates embedded self-diagnostics to perform continuous hardware-oriented tests of the system and an independent subsystem to automatically perform a functional test of the system. Maintenance interfaces also exist to readily identify and locate faults, as well as providing other maintenance capabilities. These testing and maintenance features enhance the overall reliability and availability of the system. (orig.) (2 refs., 2 figs.)
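
    The continuous-loop mode of operation mentioned above is a classic deterministic scan-loop pattern: read inputs, evaluate the protection logic, run a self-diagnostic slice, and pad out a fixed cycle period. The sketch below is a generic illustration of that pattern (cycle time and function hooks are invented), not Westinghouse code:

        import time

        CYCLE_S = 0.05  # fixed 50 ms scan period (illustrative)

        def scan_loop(read_inputs, protection_logic, write_outputs, self_test):
            """Generic deterministic scan loop: same work, same period, every cycle."""
            while True:
                start = time.monotonic()
                write_outputs(protection_logic(read_inputs()))
                self_test()  # one slice of the continuous hardware-oriented checks
                # sleep out the remainder so every cycle occupies the full period
                time.sleep(max(0.0, CYCLE_S - (time.monotonic() - start)))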

  1. Integrated software health management for aerospace guidance, navigation, and control systems: A probabilistic reasoning approach

    Science.gov (United States)

    Mbaya, Timmy

    Embedded Aerospace Systems have to perform safety and mission critical operations in a real-time environment where timing and functional correctness are extremely important. Guidance, Navigation, and Control (GN&C) systems substantially rely on complex software interfacing with hardware in real-time; any faults in software or hardware, or their interaction, could result in fatal consequences. Integrated Software Health Management (ISWHM) provides an approach for detection and diagnosis of software failures while the software is in operation. The ISWHM approach is based on probabilistic modeling of software and hardware sensors using a Bayesian network. To meet memory and timing constraints of real-time embedded execution, the Bayesian network is compiled into an Arithmetic Circuit, which is used for on-line monitoring. This type of system monitoring, using an ISWHM, provides automated reasoning capabilities that compute diagnoses in a timely manner when failures occur. This reasoning capability enables time-critical mitigating decisions and relieves the human agent from the time-consuming and arduous task of foraging through a multitude of isolated, and often contradictory, diagnosis data. For the purpose of demonstrating the relevance of ISWHM, modeling and reasoning is performed on a simple simulated aerospace system running on a real-time operating system emulator, the OSEK/Trampoline platform. Models for a small satellite and an F-16 fighter jet GN&C (Guidance, Navigation, and Control) system have been implemented. Analysis of the ISWHM is then performed by injecting faults and analyzing the ISWHM's diagnoses.
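
    A minimal way to see the probabilistic-reasoning idea is a tiny network with one health node and two monitor nodes, queried by exact enumeration. The priors and conditional probabilities below are invented for illustration; an operational ISWHM compiles the full network into an arithmetic circuit so the query runs in bounded time on the embedded target:

        P_FAULT = 0.01                                  # prior on software fault

        # P(monitor fires | fault state); invented conditional probabilities
        P_TIMING_ANOMALY = {True: 0.9, False: 0.05}     # watchdog/timing monitor
        P_RANGE_VIOLATION = {True: 0.7, False: 0.02}    # output range monitor

        def posterior_fault(timing_anomaly: bool, range_violation: bool) -> float:
            """P(fault | evidence) by enumerating the two fault states."""
            def joint(fault: bool) -> float:
                p_t = P_TIMING_ANOMALY[fault] if timing_anomaly else 1 - P_TIMING_ANOMALY[fault]
                p_r = P_RANGE_VIOLATION[fault] if range_violation else 1 - P_RANGE_VIOLATION[fault]
                prior = P_FAULT if fault else 1 - P_FAULT
                return prior * p_t * p_r
            return joint(True) / (joint(True) + joint(False))

        print(round(posterior_fault(True, True), 4))    # both monitors fire
        print(round(posterior_fault(False, False), 6))  # quiescent system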

  2. CDApps: integrated software for experimental planning and data processing at beamline B23, Diamond Light Source.

    Science.gov (United States)

    Hussain, Rohanah; Benning, Kristian; Javorfi, Tamas; Longo, Edoardo; Rudd, Timothy R; Pulford, Bill; Siligardi, Giuliano

    2015-03-01

    The B23 Circular Dichroism beamline at Diamond Light Source has been operational since 2009 and has seen visits from more than 200 user groups, who have generated large amounts of data. Based on the experience of overseeing the users' progress at B23, four key areas requiring the most assistance are identified: planning of experiments and note-keeping; designing titration experiments; processing and analysis of the collected data; and production of experimental reports. To streamline these processes an integrated software package has been developed and made available for the users. The subsequent article summarizes the main features of the software.

  3. GRO/EGRET data analysis software: An integrated system of custom and commercial software using standard interfaces

    Science.gov (United States)

    Laubenthal, N. A.; Bertsch, D.; Lal, N.; Etienne, A.; Mcdonald, L.; Mattox, J.; Sreekumar, P.; Nolan, P.; Fierro, J.

    1992-01-01

    The Energetic Gamma Ray Telescope Experiment (EGRET) on the Compton Gamma Ray Observatory has been in orbit for more than a year and is being used to map the full sky for gamma rays in a wide energy range from 30 to 20,000 MeV. Already these measurements have resulted in a wide range of exciting new information on quasars, pulsars, galactic sources, and diffuse gamma ray emission. The central part of the analysis is done with sky maps that typically cover an 80 x 80 degree section of the sky for an exposure time of several days. Specific software developed for this program generates the counts, exposure, and intensity maps. The analysis is done on a network of UNIX based workstations and takes full advantage of a custom-built user interface called X-dialog. The maps that are generated are stored in the FITS format for a collection of energies. These, along with similar diffuse emission background maps generated from a model calculation, serve as input to a maximum likelihood program that produces maps of likelihood with optional contours that are used to evaluate regions for sources. Likelihood also evaluates the background corrected intensity at each location for each energy interval from which spectra can be generated. Being in a standard FITS format permits all of the maps to be easily accessed by the full complement of tools available in several commercial astronomical analysis systems. In the EGRET case, IDL is used to produce graphics plots in two and three dimensions and to quickly implement any special evaluation that might be desired. Other custom-built software, such as the spectral and pulsar analyses, take advantage of the XView toolkit for display and Postscript output for the color hard copy. This poster paper outlines the data flow and provides examples of the user interfaces and output products. It stresses the advantages that are derived from the integration of the specific instrument-unique software and powerful commercial tools for graphics and
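
    The map arithmetic at the heart of the pipeline (counts and exposure maps in, an intensity map out, all in FITS) is easy to illustrate. The sketch below uses astropy and placeholder file names; the real EGRET chain also tracks energy intervals and feeds the maps to the maximum likelihood program:

        import numpy as np
        from astropy.io import fits

        # placeholder file names; real maps cover ~80 x 80 degree sky sections
        with fits.open("counts.fits") as c, fits.open("exposure.fits") as e:
            counts = c[0].data.astype(float)
            exposure = e[0].data.astype(float)
            header = c[0].header

        # intensity = counts / exposure, guarding against unexposed pixels
        intensity = np.divide(counts, exposure,
                              out=np.zeros_like(counts), where=exposure > 0)
        fits.writeto("intensity.fits", intensity, header, overwrite=True)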

  4. PET-Tool: a software suite for comprehensive processing and managing of Paired-End diTag (PET) sequence data

    Directory of Open Access Journals (Sweden)

    Wei Chia-Lin

    2006-08-01

    Background: We recently developed the Paired End diTag (PET) strategy for efficient characterization of mammalian transcriptomes and genomes. The paired-end nature of short PET sequences derived from long DNA fragments raised a new set of bioinformatics challenges, including how to extract PETs from raw sequence reads, and how to correctly yet efficiently map PETs to reference genome sequences. To accommodate and streamline data analysis of the large volume of PET sequences generated from each PET experiment, an automated PET data-processing pipeline is desirable. Results: We designed an integrated computation program package, PET-Tool, to automatically process PET sequences and map them to the genome sequences. The Tool was implemented as a web-based application composed of four modules: the Extractor module for PET extraction; the Examiner module for analytic evaluation of PET sequence quality; the Mapper module for locating PET sequences in the genome sequences; and the ProjectManager module for data organization. The performance of PET-Tool was evaluated through the analyses of 2.7 million PET sequences. It was demonstrated that PET-Tool is accurate and efficient in extracting PET sequences and removing artifacts from large-volume datasets. Using optimized mapping criteria, over 70% of quality PET sequences were mapped specifically to the genome sequences. On a 2.4 GHz Linux machine, it takes approximately six hours to process one million PETs from extraction to mapping. Conclusion: The speed, accuracy, and comprehensiveness have proved that PET-Tool is an important and useful component in PET experiments, and it can be extended to accommodate other related analyses of paired-end sequences. The Tool also provides user-friendly functions for data quality checks and a system for multi-layer data management.
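
    The Extractor's job, recovering the two tags that flank an internal spacer in each raw read, can be sketched in a few lines of string processing. The spacer sequence and tag length below are placeholders (the real protocol fixes both), so this only illustrates the idea:

        # Hedged sketch of the Extractor idea: recover the paired 5'/3' tags
        # flanking an internal spacer. SPACER and TAG_LEN are invented values.
        SPACER = "GTCGGAGGCC"
        TAG_LEN = 18

        def extract_pet(read: str):
            """Return (tag5, tag3) or None if the read has no usable spacer."""
            pos = read.find(SPACER)
            if pos < TAG_LEN or len(read) < pos + len(SPACER) + TAG_LEN:
                return None  # spacer missing or flanks too short -> artifact
            tag5 = read[pos - TAG_LEN:pos]
            tag3 = read[pos + len(SPACER):pos + len(SPACER) + TAG_LEN]
            return tag5, tag3

        read = "ACGTACGTACGTACGTAC" + SPACER + "TTGGCCAATTGGCCAATT" + "AAAA"
        print(extract_pet(read))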

  5. The Software Reliability of Large Scale Integration Circuit and Very Large Scale Integration Circuit

    OpenAIRE

    Artem Ganiyev; Jan Vitasek

    2010-01-01

    This article describes a method for evaluating the faultless function of large scale integration (LSI) and very large scale integration (VLSI) circuits. The article presents a comparative analysis of the factors that determine the faultlessness of integrated circuits, an analysis of existing methods, and a model for evaluating the faultless function of LSI and VLSI. The main part describes a proposed algorithm and a program for the analysis of fault rates in LSI and VLSI circuits.

  6. Adobe Creative Suite 4 Bible

    CERN Document Server

    Padova, Ted

    2009-01-01

    As one of the few books to cover integration and workflow issues between Photoshop, Illustrator, InDesign, GoLive, Acrobat, and Version Cue, this comprehensive reference is the one book that Creative Suite users need; Two well-known and respected authors cover topics such as developing consistent color-managed workflows, moving files among the Creative Suite applications, preparing files for print or the Web, repurposing documents, and using the Creative Suite with Microsoft Office documents; More than 1,200 pages are packed with valuable advice and techniques for tackling common everyday issu

  7. RSYST: From nuclear reactor calculations towards a highly sophisticated scientific software integration environment

    International Nuclear Information System (INIS)

    Noack, M.; Seybold, J.; Ruehle, R.

    1996-01-01

    The software environment RSYST was originally used to solve problems of reactor physics. The consideration of advanced scientific simulation requirements and the strict application of modern software design principles led to a system which is perfectly suited to solving problems in various complex scientific problem domains. Starting with a review of the early days of RSYST, we describe its straight evolution, driven by the need for a software environment which combines the advantages of a high-performance database system with the capability to integrate sophisticated scientific and technical applications. The RSYST architecture is presented and the data modelling capabilities are described. To demonstrate the powerful possibilities and flexibility of the RSYST environment, we describe a wide range of RSYST applications, e.g., mechanical simulations of multibody systems, which are used in biomechanical research, civil engineering and robotics. In addition, a hypermedia system which is used for scientific technical training and documentation is presented. (orig.)

  8. Integrated software environment dedicated for implementation of control systems based on PLC controllers

    Directory of Open Access Journals (Sweden)

    Szymon SURMA

    2007-01-01

    Industrial process control systems based on PLC controllers play a very important role today in all fields of transport, including sea transport. The construction of control systems is a field of engineering that has been continuously evolving towards maximum simplification of the system design path. Up to now, the time needed for system construction, from design to commissioning, had to be divided into a few stages. A mistake made in an earlier stage meant that, in most cases, the subsequent stages had to be restarted. Available debugging systems allow defect detection at an early stage of project implementation. The paper presents the general characteristics of integrated software for the implementation of complex control systems. The issues related to using the software for programming the visualisation environment and the control computer, selecting the transmission medium and transmission protocol, as well as configuring, programming and controlling PLC controllers have been analysed.

  9. Software quality assurance plan for the National Ignition Facility integrated computer control system

    Energy Technology Data Exchange (ETDEWEB)

    Woodruff, J.

    1996-11-01

    Quality achievement is the responsibility of the line organizations of the National Ignition Facility (NIF) Project. This Software Quality Assurance Plan (SQAP) applies to the activities of the Integrated Computer Control System (ICCS) organization and its subcontractors. The Plan describes the activities implemented by the ICCS section to achieve quality in the NIF Project's controls software and implements the NIF Quality Assurance Program Plan (QAPP, NIF-95-499, L-15958-2) and the Department of Energy's (DOE's) Order 5700.6C. This SQAP governs the quality affecting activities associated with developing and deploying all control system software during the life cycle of the NIF Project.

  10. A combined approach of AHP and TOPSIS methods applied in the field of integrated software systems

    Science.gov (United States)

    Berdie, A. D.; Osaci, M.; Muscalagiu, I.; Barz, C.

    2017-05-01

    Adopting the most appropriate technology for developing applications on an integrated software system for enterprises may result in great savings in both cost and hours of work. This paper proposes a research study for the determination of a hierarchy between three SAP (System Applications and Products in Data Processing) technologies. The technologies Web Dynpro (WD), Floorplan Manager (FPM) and CRM WebClient UI (CRM WCUI) are evaluated against multiple criteria, in terms of the performance obtained through the implementation of the same web business application. To establish the hierarchy, a multi-criteria analysis model that combines the AHP (Analytic Hierarchy Process) and TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) methods was proposed. This model was built with the help of the SuperDecision software. This software is based on the AHP method and determines the weights for the selected sets of criteria. The TOPSIS method was used to obtain the final ranking and the hierarchy of the technologies.
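
    The AHP-plus-TOPSIS combination can be made concrete in a few lines of numpy. In the sketch below, all numbers are invented stand-ins (the paper's criteria, judgments and measurements are not reproduced): AHP weights come from the geometric-mean approximation of a pairwise comparison matrix, and TOPSIS then ranks the three technologies by closeness to the ideal solution, assuming all criteria are benefit criteria:

        import numpy as np

        # AHP: pairwise comparison of three criteria (invented judgments)
        pairwise = np.array([[1.0, 3.0, 5.0],
                             [1/3, 1.0, 2.0],
                             [1/5, 1/2, 1.0]])
        gm = pairwise.prod(axis=1) ** (1 / pairwise.shape[1])
        weights = gm / gm.sum()

        # TOPSIS over alternatives x criteria (rows: WD, FPM, CRM WCUI)
        X = np.array([[0.70, 0.60, 0.80],
                      [0.80, 0.70, 0.60],
                      [0.60, 0.90, 0.70]])
        R = X / np.linalg.norm(X, axis=0)           # vector-normalize each criterion
        V = R * weights                             # weighted normalized matrix
        ideal, anti = V.max(axis=0), V.min(axis=0)  # benefit criteria only
        d_pos = np.linalg.norm(V - ideal, axis=1)
        d_neg = np.linalg.norm(V - anti, axis=1)
        closeness = d_neg / (d_pos + d_neg)
        for tech, c in zip(["WD", "FPM", "CRM WCUI"], closeness):
            print(tech, round(float(c), 3))

    The closeness coefficients give the final ranking; substituting real judgments and measurements for the invented numbers would reproduce a hierarchy of the paper's kind.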

  11. pcircle - A Suite of Scalable Parallel File System Tools

    Energy Technology Data Exchange (ETDEWEB)

    2015-10-01

    Most software related to file systems is written for conventional local file systems; it is serial and cannot take advantage of a large-scale parallel file system. The "pcircle" software builds on top of ubiquitous MPI in a cluster computing environment and the "work-stealing" pattern to provide a scalable, high-performance suite of file system tools. In particular, it implements parallel data copy and parallel data checksumming, with advanced features such as asynchronous progress reporting, checkpoint and restart, as well as integrity checking.
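
    pcircle distributes file chunks over MPI ranks with work stealing; a much simpler single-node stand-in conveys the chunking idea behind parallel checksumming. The sketch below hashes fixed-size chunks in a process pool and combines the chunk digests (chunk size, digest scheme and file name are illustrative, not pcircle's format):

        import hashlib
        import os
        from multiprocessing import Pool

        CHUNK = 4 * 1024 * 1024  # 4 MiB chunks (illustrative)

        def hash_chunk(args):
            path, offset = args
            with open(path, "rb") as f:
                f.seek(offset)
                return offset, hashlib.sha1(f.read(CHUNK)).hexdigest()

        def parallel_checksum(path):
            size = os.path.getsize(path)
            tasks = [(path, off) for off in range(0, size, CHUNK)]
            with Pool() as pool:
                digests = sorted(pool.map(hash_chunk, tasks))  # order by offset
            combined = hashlib.sha1("".join(d for _, d in digests).encode())
            return combined.hexdigest()

        if __name__ == "__main__":
            print(parallel_checksum("big.dat"))  # placeholder input file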

  12. ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra

    Science.gov (United States)

    Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, the spectra must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single or few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present a novel software package, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs, such as Matlab. The software is written in Java, and thus it should run on any platform capable of providing a Java Runtime Environment version 1.6 or newer; however, it has currently been tested only with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request.
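
    Batch integration of this kind reduces to applying the same pre-defined integration regions to every spectrum and writing one CSV row per spectrum. The sketch below (in Python rather than ImatraNMR's Java, with invented region bounds, and assuming each 1D spectrum is a pair of ppm and intensity arrays with the ppm axis stored from high to low) shows the core loop:

        import csv
        import numpy as np

        REGIONS = {"anomeric": (5.4, 4.9), "methyl": (1.1, 0.7)}  # ppm (high, low)

        def integrate_region(ppm, intensity, hi, lo):
            mask = (ppm <= hi) & (ppm >= lo)
            # negate because the ppm axis is assumed to run high -> low
            return -float(np.trapz(intensity[mask], ppm[mask]))

        def batch_integrate(spectra, out_csv="integrals.csv"):
            """spectra: dict name -> (ppm array, intensity array)."""
            with open(out_csv, "w", newline="") as f:
                writer = csv.writer(f)
                writer.writerow(["spectrum"] + list(REGIONS))
                for name, (ppm, inten) in spectra.items():
                    row = [integrate_region(ppm, inten, hi, lo)
                           for hi, lo in REGIONS.values()]
                    writer.writerow([name] + row)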

  13. A Comparison of Various Software Development Methodologies: Feasibility and Methods of Integration

    Directory of Open Access Journals (Sweden)

    Samir Abou El-Seoud

    2016-12-01

    System development methodologies that have been used in academic and commercial environments during the last two decades have advantages and disadvantages. Researchers have tried to identify the objectives, scope, etc., of the methodologies by following different approaches. Each approach has its limitations, specific interests, and coverage. In this paper, we performed a comparative study of those methodologies which are popular and commonly used in banking and commercial environments. In our study we tried to determine the objectives, scope, tools and other features of the methodologies. We also tried to determine how, and to what extent, the methodologies incorporate facilities such as project management, cost-benefit analysis, and documentation. One of the most important aspects of our study was how to integrate the methodologies and develop a global methodology which covers the complete span of the software development life cycle. A prototype system which integrates the selected methodologies has been developed. The developed system helps analysts and designers choose suitable tools and obtain guidelines on what to do in a particular situation. The prototype system was tested during the development of software for an ATM (Automated Teller Machine) by selecting and applying the SASD methodology during software development. This resulted in the development of a high-quality and well-documented software system.

  14. Semantic integration of gene expression analysis tools and data sources using software connectors

    Science.gov (United States)

    2013-01-01

    Background: The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results: We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions: The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools
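
    A software connector in this sense wraps access to one data source and applies declared transformation rules so that downstream tools see a common representation. The toy sketch below invents a field-renaming rule and an in-memory source purely to illustrate the shape of such a connector:

        from typing import Callable, Dict, Iterable, List

        class Connector:
            """Wraps one data source; applies transformation rules to each record."""

            def __init__(self, fetch: Callable[[], Iterable[Dict]],
                         rules: List[Callable[[Dict], Dict]]):
                self.fetch, self.rules = fetch, rules

            def records(self):
                for rec in self.fetch():
                    for rule in self.rules:
                        rec = rule(rec)
                    yield rec

        def rename_probe_to_gene(rec):
            # invented rule: expose a source-specific probe id under a shared key
            rec = dict(rec)
            rec["gene"] = rec.pop("probe_id")
            return rec

        source = lambda: [{"probe_id": "AFFX-001", "value": 7.2}]
        for rec in Connector(source, [rename_probe_to_gene]).records():
            print(rec)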

  15. Kedalion: NASA's Adaptable and Agile Hardware/Software Integration and Test Lab

    Science.gov (United States)

    Mangieri, Mark L.; Vice, Jason

    2011-01-01

    NASA's Kedalion engineering analysis lab at Johnson Space Center is on the forefront of validating and using many contemporary avionics hardware/software development and integration techniques, which represent new paradigms to heritage NASA culture. Kedalion has validated many of the Orion hardware/software engineering techniques borrowed from the adjacent commercial aircraft avionics solution space, with the intention to build upon such techniques to better align with today's aerospace market. Using agile techniques, commercial products, early rapid prototyping, in-house expertise and tools, and customer collaboration, Kedalion has demonstrated that cost effective contemporary paradigms hold the promise to serve future NASA endeavors within a diverse range of system domains. Kedalion provides a readily adaptable solution for medium/large scale integration projects. The Kedalion lab is currently serving as an in-line resource for the project and the Multipurpose Crew Vehicle (MPCV) program.

  16. SITEGI Project: Applying Geotechnologies to Road Inspection. Sensor Integration and software processing

    Directory of Open Access Journals (Sweden)

    J. Martínez-Sánchez

    2013-10-01

    Infrastructure management is a critical economic activity. The current decision-making process in infrastructure rehabilitation is essentially based on qualitative parameters obtained from visual inspections and subject to the ability of technicians. In order to increase both efficiency and productivity in infrastructure management, this work addresses the integration of different instrumentation and sensors in a mobile mapping vehicle. This vehicle allows the continuous recording of quantitative data suitable for roadside inspection. The geometric integration and synchronization of these sensors is achieved through hardware and/or software strategies that permit the georeferencing of the data obtained with each sensor. In addition, visualization software for simpler data management was implemented using the Qt framework, the PCL library and C++. As a result, the developed system supports decision-making in road inspection, providing quantitative information suitable for sophisticated analysis systems.

  17. Development, validation and integration of the ATLAS Trigger System software in Run 2

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00377077; The ATLAS collaboration

    2017-01-01

    The trigger system of the ATLAS detector at the LHC is a combination of hardware, firmware, and software, associated to various sub-detectors that must seamlessly cooperate in order to select one collision of interest out of every 40,000 delivered by the LHC every millisecond. These proceedings discuss the challenges, organization and work flow of the ongoing trigger software development, validation, and deployment. The goal of this development is to ensure that the most up-to-date algorithms are used to optimize the performance of the experiment. The goal of the validation is to ensure the reliability and predictability of the software performance. Integration tests are carried out to ensure that the software deployed to the online trigger farm during data-taking runs as desired. Trigger software is validated by emulating online conditions using a benchmark run and mimicking the reconstruction that occurs during normal data-taking. This exercise is computationally demanding and thus runs on the ATLAS high per...

  18. Development, Validation and Integration of the ATLAS Trigger System Software in Run 2

    Science.gov (United States)

    Keyes, Robert; ATLAS Collaboration

    2017-10-01

    The trigger system of the ATLAS detector at the LHC is a combination of hardware, firmware, and software, associated to various sub-detectors that must seamlessly cooperate in order to select one collision of interest out of every 40,000 delivered by the LHC every millisecond. These proceedings discuss the challenges, organization and work flow of the ongoing trigger software development, validation, and deployment. The goal of this development is to ensure that the most up-to-date algorithms are used to optimize the performance of the experiment. The goal of the validation is to ensure the reliability and predictability of the software performance. Integration tests are carried out to ensure that the software deployed to the online trigger farm during data-taking runs as desired. Trigger software is validated by emulating online conditions using a benchmark run and mimicking the reconstruction that occurs during normal data-taking. This exercise is computationally demanding and thus runs on the ATLAS high performance computing grid with high priority. Performance metrics ranging from low-level memory and CPU requirements, to distributions and efficiencies of high-level physics quantities are visualized and validated by a range of experts. This is a multifaceted critical task that ties together many aspects of the experimental effort and thus directly influences the overall performance of the ATLAS experiment.

  19. The Environment for Application Software Integration and Execution (EASIE), version 1.0. Volume 2: Program integration guide

    Science.gov (United States)

    Jones, Kennie H.; Randall, Donald P.; Stallcup, Scott S.; Rowell, Lawrence F.

    1988-01-01

    The Environment for Application Software Integration and Execution, EASIE, provides a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. EASIE was designed to meet the needs of conceptual design engineers who face the task of integrating many stand-alone engineering analysis programs. Using EASIE, programs are integrated through a relational data base management system. In volume 2, a SYSTEM LIBRARY PROCESSOR is used to construct a DATA DICTIONARY describing all relations defined in the data base, and a TEMPLATE LIBRARY. A TEMPLATE is a description of all subsets of relations (including conditional selection criteria and sorting specifications) to be accessed as input or output for a given application. Together, these form the SYSTEM LIBRARY which is used to automatically produce the data base schema, FORTRAN subroutines to retrieve/store data from/to the data base, and instructions to a generic REVIEWER program providing review/modification of data for a given template. Automation of these functions eliminates much of the tedious, error-prone work required by the usual approach to data base integration.

  20. Molecular radiotherapy: The NUKFIT software for calculating the time-integrated activity coefficient

    Energy Technology Data Exchange (ETDEWEB)

    Kletting, P.; Schimmel, S.; Luster, M. [Klinik für Nuklearmedizin, Universität Ulm, Ulm 89081 (Germany); Kestler, H. A. [Research Group Bioinformatics and Systems Biology, Institut für Neuroinformatik, Universität Ulm, Ulm 89081 (Germany); Hänscheid, H.; Fernández, M.; Lassmann, M. [Klinik für Nuklearmedizin, Universität Würzburg, Würzburg 97080 (Germany); Bröer, J. H.; Nosske, D. [Bundesamt für Strahlenschutz, Fachbereich Strahlenschutz und Gesundheit, Oberschleißheim 85764 (Germany); Glatting, G. [Medical Radiation Physics/Radiation Protection, Medical Faculty Mannheim, Heidelberg University, Mannheim 68167 (Germany)

    2013-10-15

    Purpose: Calculation of the time-integrated activity coefficient (residence time) is a crucial step in dosimetry for molecular radiotherapy. However, available software is deficient in that it is either not tailored for the use in molecular radiotherapy and/or does not include all required estimation methods. The aim of this work was therefore the development and programming of an algorithm which allows for an objective and reproducible determination of the time-integrated activity coefficient and its standard error. Methods: The algorithm includes the selection of a set of fitting functions from predefined sums of exponentials and the choice of an error model for the used data. To estimate the values of the adjustable parameters an objective function, depending on the data, the parameters of the error model, the fitting function and (if required and available) Bayesian information, is minimized. To increase reproducibility and user-friendliness the starting values are automatically determined using a combination of curve stripping and random search. Visual inspection, the coefficient of determination, the standard error of the fitted parameters, and the correlation matrix are provided to evaluate the quality of the fit. The functions which are most supported by the data are determined using the corrected Akaike information criterion. The time-integrated activity coefficient is estimated by analytically integrating the fitted functions. Its standard error is determined assuming Gaussian error propagation. The software was implemented using MATLAB. Results: To validate the proper implementation of the objective function and the fit functions, the results of NUKFIT and SAAM numerical, a commercially available software tool, were compared. The automatic search for starting values was successfully tested for reproducibility. The quality criteria applied in conjunction with the Akaike information criterion allowed the selection of suitable functions. Function fit
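
    The recipe above (fit competing sums of exponentials, select the model by the corrected Akaike information criterion, then integrate the winner analytically, using the fact that the integral of A*exp(-lambda*t) from zero to infinity is A/lambda) can be reproduced in a few lines. The sketch below uses invented time-activity data and scipy rather than NUKFIT's MATLAB implementation:

        import numpy as np
        from scipy.optimize import curve_fit

        t = np.array([0.5, 1, 2, 4, 8, 24, 48, 72])                      # time (h)
        a = np.array([0.42, 0.40, 0.36, 0.30, 0.22, 0.08, 0.03, 0.015])  # invented data

        def mono(t, A, lam):
            return A * np.exp(-lam * t)

        def bi(t, A1, l1, A2, l2):
            return A1 * np.exp(-l1 * t) + A2 * np.exp(-l2 * t)

        def aicc(resid, k, n):
            # corrected Akaike information criterion for least-squares fits
            return n * np.log(np.sum(resid ** 2) / n) + 2 * k + 2 * k * (k + 1) / (n - k - 1)

        fits = {}
        for name, f, p0 in [("mono", mono, [0.5, 0.1]), ("bi", bi, [0.3, 1.0, 0.3, 0.05])]:
            popt, _ = curve_fit(f, t, a, p0=p0, maxfev=10000)
            fits[name] = (popt, aicc(a - f(t, *popt), len(popt), len(t)))

        best = min(fits, key=lambda name: fits[name][1])
        popt = fits[best][0]
        pairs = popt.reshape(-1, 2)                     # rows of (A_i, lambda_i)
        # analytic integral, assuming the fitted rate constants are positive
        tia = float(np.sum(pairs[:, 0] / pairs[:, 1]))  # sum of A_i / lambda_i
        print(best, "time-integrated activity coefficient ~", round(tia, 2), "h")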

  1. The EGSE science software of the IBIS instrument on-board INTEGRAL satellite

    International Nuclear Information System (INIS)

    La Rosa, Giovanni; Fazio, Giacomo; Segreto, Alberto; Gianotti, Fulvio; Stephen, John; Trifoglio, Massimo

    2000-01-01

    IBIS (Imager on Board INTEGRAL Satellite) is one of the key instruments on board the INTEGRAL satellite, the follow-up mission to the high-energy missions CGRO and Granat. The EGSE of IBIS is composed of a Satellite Interface Simulator, a Control Station and a Science Station. The solutions adopted for the architectural design of the software running on the Science Station are described here. Some preliminary results are used to show the science functionality, which made it possible to understand the instrument behavior throughout the test and calibration campaigns of the Engineering Model of IBIS

  2. Integrating optical, mechanical, and test software (with applications to freeform optics)

    Science.gov (United States)

    Genberg, Victor; Michels, Gregory; Myer, Brian

    2017-10-01

    Optical systems must perform under environmental conditions including thermal and mechanical loading. To predict the performance in the field, integrated analysis combining optical and mechanical software is required. Freeform and conformal optics offer many new opportunities for optical design. The unconventional geometries can lead to unconventional, and therefore unintuitive, mechanical behavior. Finite element (FE) analysis offers the ability to predict the deformations of freeform optics under various environments and load conditions. To understand the impact on optical performance, the deformations must be brought into optical analysis codes. This paper discusses several issues related to the integrated optomechanical analysis of freeform optics.

  3. Integrating R and Java for Enhancing Interactivity of Algorithmic Data Analysis Software Solutions

    Directory of Open Access Journals (Sweden)

    Titus Felix FURTUNĂ

    2016-06-01

    Conceiving software solutions for statistical processing and algorithmic data analysis involves handling diverse data, fetched from various sources and in different formats, and presenting the results in a suggestive, tailorable manner. Our ongoing research aims to design programming techniques for integrating the R development environment with the Java programming language for interoperability at the source code level. The goal is to combine the intensive data processing capabilities of the R programming language, along with its multitude of statistical function libraries, with the flexibility offered by the Java programming language and platform in terms of graphical user interfaces and mathematical function libraries. Both development environments are multiplatform oriented, and can complement each other through interoperability. R is a comprehensive and concise programming language, benefiting from a continuously expanding and evolving set of packages for statistical analysis, developed by the open source community. While it is a very efficient environment for statistical data processing, the R platform lacks support for developing user-friendly, interactive graphical user interfaces (GUIs). Java, on the other hand, is a high-level object-oriented programming language which supports designing and developing performant and interactive frameworks for general-purpose software solutions, through the Java Foundation Classes, JavaFX and various graphical libraries. In this paper we treat both aspects of integration and interoperability: integrating Java code into R applications, and bringing R processing sequences into Java-driven software solutions. Our research has been conducted focusing on case studies concerning pattern recognition and cluster analysis.

  4. NuSEE: an integrated environment of software specification and V and V for PLC based safety-critical systems

    International Nuclear Information System (INIS)

    Koo, Seo Ryong; Seong, Poong Hyun; Yoo, Jun Beom; Cha, Sung Deok; Youn, Cheong; Han, Hyun Chul

    2006-01-01

    As the use of digital systems becomes more prevalent, adequate techniques for software specification and analysis have become increasingly important in Nuclear Power Plant (NPP) safety-critical systems. Additionally, the importance of software Verification and Validation (V and V) based on adequate specification has received greater emphasis in view of improving software quality. For thorough V and V of safety-critical systems, V and V should be performed throughout the software lifecycle. However, systematic V and V is difficult, as it involves many manually oriented tasks. Tool support is needed in order to perform software V and V more conveniently. In response, we developed four kinds of Computer Aided Software Engineering (CASE) tools to support system specification for formal-based analysis according to the software lifecycle. In this work, we achieved an optimized integration of the tools. The toolset, NuSEE, is an integrated environment for software specification and V and V for PLC based safety-critical systems. In accordance with the software lifecycle, NuSEE consists of NuSISRT for the concept phase, NuSRS for the requirements phase, NuSDS for the design phase and NuSCM for configuration management. It is believed that after further development our integrated environment will be a unique and promising software specification and analysis toolset that will support the entire software lifecycle for the development of PLC based NPP safety-critical systems

  5. SU-F-J-72: A Clinical Usable Integrated Contouring Quality Evaluation Software for Radiotherapy

    International Nuclear Information System (INIS)

    Jiang, S; Dolly, S; Cai, B; Mutic, S; Li, H

    2016-01-01

    Purpose: To introduce the Auto Contour Evaluation (ACE) software, a clinically usable, user-friendly, efficient, all-in-one toolbox for automatically identifying common contouring errors in radiotherapy treatment planning using supervised machine learning techniques. Methods: ACE is developed in C# using the Microsoft .Net framework and Windows Presentation Foundation (WPF) for elegant GUI design and smooth GUI transition animations, through the integration of graphics engines and high dots per inch (DPI) settings on modern high-resolution monitors. The industry-standard software design pattern, the Model-View-ViewModel (MVVM) pattern, is chosen as the major architecture of ACE for its neat coding structure, deep modularization, easy maintainability and seamless communication with other clinical software. ACE consists of 1) a patient data importing module integrated with the clinical patient database server, 2) a module for simultaneously displaying 2D DICOM images and RT structures, 3) a 3D RT structure visualization module using the Visualization Toolkit (VTK) library and 4) a contour evaluation module using supervised pattern recognition algorithms to detect contouring errors and display detection results. ACE relies on supervised learning algorithms to handle all image processing and data processing jobs. Implementations of the related algorithms are powered by the Accord.Net scientific computing library for better efficiency and effectiveness. Results: ACE can take a patient's CT images and RT structures from commercial treatment planning software via direct user input or from the patient database. All functionalities, including 2D and 3D image visualization and RT contour error detection, have been demonstrated with real clinical patient cases. Conclusion: ACE implements supervised learning algorithms and combines image processing and graphical visualization modules for RT contour verification. ACE has great potential for automated radiotherapy contouring quality verification

  6. SU-F-J-72: A Clinical Usable Integrated Contouring Quality Evaluation Software for Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, S; Dolly, S; Cai, B; Mutic, S; Li, H [Washington University School of Medicine, Saint Louis, MO (United States)

    2016-06-15

    Purpose: To introduce the Auto Contour Evaluation (ACE) software, a clinically usable, user-friendly, efficient, all-in-one toolbox for automatically identifying common contouring errors in radiotherapy treatment planning using supervised machine learning techniques. Methods: ACE is developed in C# using the Microsoft .Net framework and Windows Presentation Foundation (WPF) for elegant GUI design and smooth GUI transition animations, through the integration of graphics engines and high dots per inch (DPI) settings on modern high-resolution monitors. The industry-standard software design pattern, the Model-View-ViewModel (MVVM) pattern, is chosen as the major architecture of ACE for its neat coding structure, deep modularization, easy maintainability and seamless communication with other clinical software. ACE consists of 1) a patient data importing module integrated with the clinical patient database server, 2) a module for simultaneously displaying 2D DICOM images and RT structures, 3) a 3D RT structure visualization module using the Visualization Toolkit (VTK) library and 4) a contour evaluation module using supervised pattern recognition algorithms to detect contouring errors and display detection results. ACE relies on supervised learning algorithms to handle all image processing and data processing jobs. Implementations of the related algorithms are powered by the Accord.Net scientific computing library for better efficiency and effectiveness. Results: ACE can take a patient's CT images and RT structures from commercial treatment planning software via direct user input or from the patient database. All functionalities, including 2D and 3D image visualization and RT contour error detection, have been demonstrated with real clinical patient cases. Conclusion: ACE implements supervised learning algorithms and combines image processing and graphical visualization modules for RT contour verification. ACE has great potential for automated radiotherapy contouring quality verification

  7. Molecular radiotherapy: the NUKFIT software for calculating the time-integrated activity coefficient.

    Science.gov (United States)

    Kletting, P; Schimmel, S; Kestler, H A; Hänscheid, H; Luster, M; Fernández, M; Bröer, J H; Nosske, D; Lassmann, M; Glatting, G

    2013-10-01

    Calculation of the time-integrated activity coefficient (residence time) is a crucial step in dosimetry for molecular radiotherapy. However, available software is deficient in that it is either not tailored for use in molecular radiotherapy and/or does not include all required estimation methods. The aim of this work was therefore the development and programming of an algorithm which allows an objective and reproducible determination of the time-integrated activity coefficient and its standard error. The algorithm includes the selection of a set of fitting functions from predefined sums of exponentials and the choice of an error model for the data. To estimate the values of the adjustable parameters, an objective function, depending on the data, the parameters of the error model, the fitting function and (if required and available) Bayesian information, is minimized. To increase reproducibility and user-friendliness, the starting values are determined automatically using a combination of curve stripping and random search. Visual inspection, the coefficient of determination, the standard errors of the fitted parameters, and the correlation matrix are provided to evaluate the quality of the fit. The functions that are most supported by the data are determined using the corrected Akaike information criterion. The time-integrated activity coefficient is estimated by analytically integrating the fitted functions. Its standard error is determined assuming Gaussian error propagation. The software was implemented in MATLAB. To validate the proper implementation of the objective function and the fit functions, the results of NUKFIT were compared with those of SAAM numerical, a commercially available software tool. The automatic search for starting values was successfully tested for reproducibility. The quality criteria, applied in conjunction with the Akaike information criterion, allowed the selection of suitable functions. Function fit parameters and their standard
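    The core loop the abstract describes (fit competing sums of exponentials, rank them by the corrected Akaike information criterion, integrate the winner analytically) is easy to sketch. The Python fragment below is a toy re-implementation of that idea, not the MATLAB NUKFIT code; the time-activity data and starting values are invented.

```python
# Minimal sketch of the NUKFIT idea: fit mono- and bi-exponential models
# to time-activity data, select by corrected AIC (AICc), and integrate
# the winner analytically. All data values below are made up.
import numpy as np
from scipy.optimize import curve_fit

t = np.array([0.5, 1, 2, 4, 24, 48, 96, 144])                    # h after administration
a = np.array([0.362, 0.330, 0.282, 0.225, 0.124, 0.077, 0.029, 0.011])  # fraction of A0

def mono(t, a1, l1):
    return a1 * np.exp(-l1 * t)

def bi(t, a1, l1, a2, l2):
    return a1 * np.exp(-l1 * t) + a2 * np.exp(-l2 * t)

def aicc(resid, k):
    """Least-squares form of the corrected Akaike information criterion."""
    n = len(resid)
    return n * np.log(np.sum(resid**2) / n) + 2 * k + 2 * k * (k + 1) / (n - k - 1)

fits = {}
p_mono, _ = curve_fit(mono, t, a, p0=(0.4, 0.05))
fits["mono"] = (aicc(a - mono(t, *p_mono), 2), p_mono[0] / p_mono[1])
p_bi, _ = curve_fit(bi, t, a, p0=(0.2, 0.5, 0.2, 0.02), maxfev=10000)
fits["bi"] = (aicc(a - bi(t, *p_bi), 4), p_bi[0] / p_bi[1] + p_bi[2] / p_bi[3])

best = min(fits, key=lambda m: fits[m][0])
# The integral of sum(a_i * exp(-lambda_i * t)) over [0, inf) is sum(a_i / lambda_i),
# which gives the time-integrated activity coefficient directly.
print(best, "TIAC =", fits[best][1], "h")
```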

  8. Omics Informatics: From Scattered Individual Software Tools to Integrated Workflow Management Systems.

    Science.gov (United States)

    Ma, Tianle; Zhang, Aidong

    2017-01-01

    Omic data analyses pose great informatics challenges. As an emerging subfield of bioinformatics, omics informatics focuses on analyzing multi-omic data efficiently and effectively, and is gaining momentum. There are two underlying trends in the expansion of the omics informatics landscape: the explosion of scattered individual omics informatics tools, each of which focuses on a specific task in both single- and multi-omic settings, and the fast-evolving integrated software platforms, such as workflow management systems, that can assemble multiple tools into pipelines and streamline integrative analysis for complicated tasks. In this survey, we give a holistic view of omics informatics, from scattered individual informatics tools to integrated workflow management systems. We not only outline the landscape and challenges of omics informatics, but also sample a number of widely used and cutting-edge algorithms in omics data analysis to give readers a fine-grained view. We survey various workflow management systems (WMSs), classify them into three levels, from simple software toolkits to integrated multi-omic analytical platforms, and point out the emerging need for intelligent workflow management systems. We also discuss the challenges, strategies and some existing work in the systematic evaluation of omics informatics tools. We conclude by providing future perspectives on emerging fields and new frontiers in omics informatics.

  9. Integrated management tool for controls software problems, requests and project tasking at SLAC

    International Nuclear Information System (INIS)

    Rogind, D.; Allen, W.; Colocho, W.; DeContreras, G.; Gordon, J.; Pandey, P.; Shoaee, H.

    2012-01-01

    The Accelerator Directorate (AD) Instrumentation and Controls (ICD) Software (SW) Department at SLAC, with its service center model, continuously receives engineering requests to design, build and support controls for accelerator systems lab-wide. Each customer request can vary in complexity from a small software engineering change to a major enhancement. SLAC's Accelerator Improvement Projects (AIPs), along with DOE construction projects, also contribute heavily to the workload. The various customer requests and projects, paired with ongoing operational maintenance and problem reports, place a demand on the department that consistently exceeds the capacity of available resources. A centralized repository - comprising all requests, project tasks, and problems - available to physicists, operators, managers, and engineers alike, is essential to capture, communicate, prioritize, assign, schedule, track, and finally commission all work components. The Software Department has recently integrated request/project tasking into SLAC's custom online problem tracking tool, the 'Comprehensive Accelerator Tool for Enhancing Reliability' (CATER). This paper discusses the newly implemented software request management tool - the workload it helps to track, its structure, features, reports, workflow and its many usages. (authors)

  10. Integrated structural analysis tool using the linear matching method part 1 – Software development

    International Nuclear Information System (INIS)

    Ure, James; Chen, Haofeng; Tipping, David

    2014-01-01

    A number of direct methods based upon the Linear Matching Method (LMM) framework have been developed to address structural integrity issues for components subjected to cyclic thermal and mechanical load conditions. This paper presents a new integrated structural analysis tool using the LMM framework for the assessment of load carrying capacity, shakedown limit, ratchet limit and steady state cyclic response of structures. First, the development of the LMM for the evaluation of design limits in plasticity is introduced. Second, preliminary considerations for the development of the LMM into a tool which can be used on a regular basis by engineers are discussed. After the re-structuring of the LMM subroutines for multiple central processing unit (CPU) solution, the LMM software tool for the assessment of design limits in plasticity is implemented by developing an Abaqus CAE plug-in with graphical user interfaces. Further demonstration of this new LMM analysis tool including practical application and verification is presented in an accompanying paper. - Highlights: • A new structural analysis tool using the Linear Matching Method (LMM) is developed. • The software tool is able to evaluate the design limits in plasticity. • Able to assess limit load, shakedown, ratchet limit and steady state cyclic response. • Re-structuring of the LMM subroutines for multiple CPU solution is conducted. • The software tool is implemented by developing an Abaqus CAE plug-in with GUI

  11. Integrated software package for nuclear material safeguards in a MOX fuel fabrication facility

    International Nuclear Information System (INIS)

    Schreiber, H.J.; Piana, M.; Moussalli, G.; Saukkonen, H.

    2000-01-01

    Since computerized data processing was introduced to Safeguards at large bulk handling facilities, a large number of individual software applications have been developed for nuclear material Safeguards implementation. Facility inventory and flow data are provided in computerized format for performing stratification, sample size calculation and selection of samples for destructive and non-destructive assay. Data are collected from nuclear measurement systems running in attended or unattended mode and, more recently, from remotely controlled monitoring systems. Data sets from various sources (raw data, processed data, and conclusions drawn from data evaluation results) have to be evaluated for Safeguards purposes. They are reported in computerized format to the International Atomic Energy Agency headquarters, and feedback from the Agency's mainframe computer system is used to prepare and support Safeguards inspection activities. The integration of all such data originating from various sources cannot be ensured without the existence of a common data format and a database system. This paper describes the fundamental relations between data streams, individual data processing tools, data evaluation results and the requirements for an integrated software solution to facilitate nuclear material Safeguards at a bulk handling facility. The paper also explains the basis for designing a software package to manage data streams from various data sources and to incorporate diverse data processing tools that until now have been used independently of each other and under different computer operating systems. (author)

  12. Error-Free Software

    Science.gov (United States)

    1989-01-01

    001 is an integrated tool suite for automatically developing ultrareliable models, simulations and software systems. Developed and marketed by Hamilton Technologies, Inc. (HTI), it has been applied in engineering, manufacturing, banking and software tools development. The software provides the ability to simplify the complex. A system developed with 001 can be a prototype or fully developed with production-quality code. It is free of interface errors, consistent, logically complete and has no data or control flow errors. Systems can be designed, developed and maintained with maximum productivity. Margaret Hamilton, President of Hamilton Technologies, also directed the research and development of USE.IT, an earlier product which was the first computer-aided software engineering product in the industry to concentrate on automatically supporting the development of an ultrareliable system throughout its life cycle. Both products originated in NASA technology developed under a Johnson Space Center contract.

  13. Design of energy efficient optical networks with software enabled integrated control plane

    DEFF Research Database (Denmark)

    Wang, Jiayuan; Yan, Ying; Dittmann, Lars

    2015-01-01

    …energy consumption by proposing a new integrated control plane structure utilising Software Defined Networking technologies. The integrated control plane increases the efficiency of exchanging control information across different network domains, while introducing new possibilities for routing … methods and for control over quality of service (QoS). The structure is defined as an overlay generalised multi-protocol label switching (GMPLS) control model. With the defined structure, the integrated control plane is able to gather information from different domains (i.e. optical core network …'s) routing behaviours. With the flexibility of the routing structure, results show that the energy efficiency of the network can be improved without compromising the QoS for delay/blocking-sensitive services.

  14. A COTS RF Optical Software Defined Radio for the Integrated Radio and Optical Communications Test Bed

    Science.gov (United States)

    Nappier, Jennifer M.; Zeleznikar, Daniel J.; Wroblewski, Adam C.; Tokars, Roger P.; Schoenholz, Bryan L.; Lantz, Nicholas C.

    2016-01-01

    The Integrated Radio and Optical Communications (iROC) project at the National Aeronautics and Space Administration (NASA) is investigating the merits of a hybrid radio frequency (RF) and optical communication system for deep space missions. In an effort to demonstrate the feasibility and advantages of a hybrid RF/optical software defined radio (SDR), a laboratory prototype was assembled from primarily commercial-off-the-shelf (COTS) hardware components. This COTS platform has been used to demonstrate simultaneous transmission of the radio and optical communications waveforms through to the physical layer (telescope and antenna). This paper details the hardware and software used in the platform and various measures of its performance. A laboratory optical receiver platform has also been assembled in order to demonstrate hybrid free space links in combination with the transmitter.

  15. Integrated State Estimation and Contingency Analysis Software Implementation using High Performance Computing Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yousu; Glaesemann, Kurt R.; Rice, Mark J.; Huang, Zhenyu

    2015-12-31

    Power system simulation tools have traditionally been developed in sequential mode, with codes optimized for single-core computing only. However, the increasing complexity of power grid models requires more intensive computation, and traditional simulation tools will soon be unable to meet grid operation requirements. Power system simulation tools therefore need to evolve accordingly to provide faster and better results for grid operations. This paper presents an integrated state estimation and contingency analysis software implementation using high performance computing techniques. The software is able to solve large state estimation problems within one second and achieves a near-linear speedup of 9,800 with 10,000 cores for the contingency analysis application. A performance evaluation is presented to show its effectiveness.
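    Contingency analysis parallelizes naturally: each postulated outage is an independent solve, which is what makes the near-linear scaling reported above achievable. The sketch below shows the pattern at toy scale with Python's multiprocessing; the three-branch network and the stand-in "solver" are invented for illustration and are not part of the paper's software.

```python
# Minimal sketch of parallel "N-1" contingency screening: each contingency
# (one removed branch) is an independent case, so cases can be farmed out
# to worker processes. The toy network and overload check are illustrative.
from multiprocessing import Pool

BRANCHES = [("bus1", "bus2"), ("bus2", "bus3"), ("bus1", "bus3")]

def screen_contingency(outaged_branch):
    """Re-solve the network with one branch removed and report overloads (toy model)."""
    remaining = [b for b in BRANCHES if b != outaged_branch]
    # Stand-in for a real power-flow solve on the reduced network:
    overloaded = ("bus1", "bus2") not in remaining
    return outaged_branch, overloaded

if __name__ == "__main__":
    with Pool() as pool:                     # one worker per core by default
        for branch, overloaded in pool.map(screen_contingency, BRANCHES):
            print(branch, "->", "overload" if overloaded else "secure")
```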

  16. IEEE Computer Society/Software Engineering Institute Watts S. Humphrey Software Process Achievement Award 2016: Raytheon Integrated Defense Systems Design for Six Sigma Team

    Science.gov (United States)

    2017-04-01

    …worldwide; $23 billion in sales for 2015. Raytheon Integrated Defense Systems (IDS) is one of five businesses within Raytheon Company and is headquartered… The Raytheon Integrated Defense Systems DFSS team has developed and implemented numerous leading-edge improvement and optimization methodologies resulting in… our software systems. In this section, we explain the first methodology, the application of statistical test optimization (STO) using Design of…

  17. Increasing quality and managing complexity in neuroinformatics software development with continuous integration

    Directory of Open Access Journals (Sweden)

    Yury V. Zaytsev

    2013-01-01

    Full Text Available High quality neuroscience research requires accurate, reliable and well maintained neuroinformatics applications. As software projects become larger, offering more functionality and developing a denser web of interdependence between their component parts, we need more sophisticated methods to manage their complexity. If complexity is allowed to get out of hand, either the quality of the software or the speed of development suffers, and in many cases both. To address this issue, here we develop a scalable, low-cost and open source solution for continuous integration (CI), a technique which ensures the quality of changes to the code base during the development procedure, rather than relying on a pre-release integration phase. We demonstrate that a CI-based workflow, due to rapid feedback about code integration problems and tracking of code health measures, enabled substantial increases in productivity for a major neuroinformatics project and additional benefits for three further projects. Beyond the scope of the current study, we identify multiple areas in which CI can be employed to further increase the quality of neuroinformatics projects by improving development practices and incorporating appropriate development tools. Finally, we discuss what measures can be taken to lower the barrier for developers of neuroinformatics applications to adopt this useful technique.

  18. Increasing quality and managing complexity in neuroinformatics software development with continuous integration.

    Science.gov (United States)

    Zaytsev, Yury V; Morrison, Abigail

    2012-01-01

    High quality neuroscience research requires accurate, reliable and well maintained neuroinformatics applications. As software projects become larger, offering more functionality and developing a denser web of interdependence between their component parts, we need more sophisticated methods to manage their complexity. If complexity is allowed to get out of hand, either the quality of the software or the speed of development suffers, and in many cases both. To address this issue, here we develop a scalable, low-cost and open source solution for continuous integration (CI), a technique which ensures the quality of changes to the code base during the development procedure, rather than relying on a pre-release integration phase. We demonstrate that a CI-based workflow, due to rapid feedback about code integration problems and tracking of code health measures, enabled substantial increases in productivity for a major neuroinformatics project and additional benefits for three further projects. Beyond the scope of the current study, we identify multiple areas in which CI can be employed to further increase the quality of neuroinformatics projects by improving development practices and incorporating appropriate development tools. Finally, we discuss what measures can be taken to lower the barrier for developers of neuroinformatics applications to adopt this useful technique.
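    In practice, the CI server's job reduces to running a gate on every push: build, test and static checks, failing fast on any error. A minimal sketch of such a gate script follows; the specific commands (pytest, flake8) are illustrative assumptions, not the toolchain used by the projects in the paper.

```python
# Minimal sketch of the kind of gate a CI server runs on every push:
# execute the project's checks and fail the build on any error, so that
# integration problems surface immediately rather than before a release.
# The check commands are assumptions; substitute the project's own runners.
import subprocess
import sys

def run_checks():
    steps = [
        ["python", "-m", "pytest", "-q"],        # unit tests
        ["python", "-m", "flake8", "."],         # style / static checks
    ]
    for cmd in steps:
        result = subprocess.run(cmd)
        if result.returncode != 0:
            print("CI gate failed at:", " ".join(cmd))
            return result.returncode
    print("all CI checks passed")
    return 0

if __name__ == "__main__":
    sys.exit(run_checks())
```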

  19. Algal Functional Annotation Tool: a web-based analysis suite to functionally interpret large gene lists using integrated annotation and expression data

    Directory of Open Access Journals (Sweden)

    Merchant Sabeeha S

    2011-07-01

    Full Text Available Abstract. Background: Progress in genome sequencing is proceeding at an exponential pace, and several new algal genomes are becoming available every year. One of the challenges facing the community is the association of protein sequences encoded in the genomes with biological function. While most genome assembly projects generate annotations for predicted protein sequences, they are usually limited and integrate functional terms from a limited number of databases. Another challenge is the use of annotations to interpret large lists of 'interesting' genes generated by genome-scale datasets. Previously, these gene lists had to be analyzed across several independent biological databases, often on a gene-by-gene basis. In contrast, several annotation databases, such as DAVID, integrate data from multiple functional databases and reveal underlying biological themes of large gene lists. While several such databases have been constructed for animals, none is currently available for the study of algae. Due to renewed interest in algae as potential sources of biofuels and the emergence of multiple algal genome sequences, a significant need has arisen for such a database to process the growing compendiums of algal genomic data. Description: The Algal Functional Annotation Tool is a web-based comprehensive analysis suite integrating annotation data from several pathway, ontology, and protein family databases. The current version provides annotation for the model alga Chlamydomonas reinhardtii, and in the future will include additional genomes. The site allows users to interpret large gene lists by identifying associated functional terms and their enrichment. Additionally, expression data for several experimental conditions were compiled and analyzed to provide an expression-based enrichment search. A tool to search for functionally related genes based on gene expression across these conditions is also provided. Other features include dynamic visualization of
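    The term-enrichment search described above typically boils down to a hypergeometric test: given how often an annotation term occurs genome-wide, how surprising is its count within the submitted gene list? The sketch below illustrates the calculation in Python with SciPy; the counts are invented, and the tool's actual statistics may differ in detail.

```python
# Minimal sketch of a functional-term enrichment test: compare a term's
# frequency in a user gene list with its genome-wide frequency using the
# hypergeometric distribution. All counts below are invented.
from scipy.stats import hypergeom

genome_size = 15000       # annotated genes in the genome
term_genes = 120          # genes carrying the term genome-wide
list_size = 300           # genes in the user's "interesting" list
term_in_list = 12         # of which carry the term

# P(X >= term_in_list): probability of at least this much overlap by chance
p = hypergeom.sf(term_in_list - 1, genome_size, term_genes, list_size)
fold = (term_in_list / list_size) / (term_genes / genome_size)
print(f"fold enrichment = {fold:.1f}, p = {p:.2e}")
```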

  20. A Development Framework for Software Security in Nuclear Safety Systems: Integrating Secure Development and System Security Activities

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jaekwan; Suh, Yongsuk [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-02-15

    The protection of nuclear safety software is essential in that a failure can result in significant economic loss and physical damage to the public. However, software security has often been ignored in nuclear safety software development. To enforce security considerations, nuclear regulatory commissions have recently issued and revised security regulations for nuclear computer-based systems. Complying with these security requirements is a great challenge for nuclear developers, yet there is still no clear software development process covering security activities. This paper proposes an integrated development process that satisfies both the secure development requirements and the system security requirements described by various regulatory bodies. It provides a three-stage framework with eight security activities as the software development process. The detailed descriptions are useful for software developers and licensees seeking to understand the regulatory requirements and to establish a detailed activity plan for software design and engineering.

  1. Development of a new model to predict indoor daylighting: Integration in CODYRUN software and validation

    Energy Technology Data Exchange (ETDEWEB)

    Fakra, A.H., E-mail: fakra@univ-reunion.f [Physics and Mathematical Engineering Laboratory for Energy and Environment (PIMENT), University of La Reunion, 117 rue du General Ailleret, 97430 Le Tampon (French Overseas Dpt.), Reunion (France); Miranville, F.; Boyer, H.; Guichard, S. [Physics and Mathematical Engineering Laboratory for Energy and Environment (PIMENT), University of La Reunion, 117 rue du General Ailleret, 97430 Le Tampon (French Overseas Dpt.), Reunion (France)

    2011-07-15

    Research highlights: → This study presents a new model capable of simulating indoor daylighting. → The model was introduced into the research software CODYRUN. → The code was validated against a large number of test cases. -- Abstract: Many models exist in the scientific literature for determining indoor daylighting values. They are classified in three categories: numerical, simplified and empirical models. Nevertheless, none of these categories of models is convenient for every application: numerical models require high calculation times, the conditions of use of simplified models are limited, and experimental models need not only substantial financial resources but also perfect control of the experimental devices (e.g. scale models) and of the climatic characteristics of the location (e.g. in situ experiments). In this article, a new model based on a combination of multiple simplified models is established. The objective is to improve this category of model. The originality of our paper lies in the coupling of several simplified models of indoor daylighting calculation. The accuracy of the simulation code, introduced into the CODYRUN software to correctly simulate indoor illuminance, is then verified. The software consists of a numerical building simulation code developed in the Physics and Mathematical Engineering Laboratory for Energy and Environment (PIMENT) at the University of Reunion. Initially dedicated to thermal, airflow and hydrous phenomena in buildings, the software has been extended to the calculation of indoor daylighting. The new models and algorithms - which rely on a semi-detailed approach - are presented in this paper. In order to validate the accuracy of the integrated models, many test cases have been considered, including analytical tests, inter-software comparisons and experimental comparisons. In order to prove the accuracy of the new model - which can properly simulate the illuminance - a

  2. CASSys: an integrated software-system for the interactive analysis of ChIP-seq data

    Directory of Open Access Journals (Sweden)

    Alawi Malik

    2011-06-01

    Full Text Available The mapping of DNA-protein interactions is crucial for a full understanding of transcriptional regulation. Chromatin immunoprecipitation followed by massively parallel sequencing (ChIP-seq) has become the standard technique for analyzing these interactions on a genome-wide scale. We have developed a software system called CASSys (ChIP-seq data Analysis Software System) spanning all steps of ChIP-seq data analysis. It supersedes the laborious application of several single command line tools. CASSys provides functionality ranging from quality assessment and control of short reads, through the mapping of reads against a reference genome (read mapping) and the detection of enriched regions (peak detection), to various follow-up analyses. The latter are accessible via a state-of-the-art web interface and can be performed interactively by the user. The follow-up analyses allow for flexible user-defined association of putative interaction sites with genes, visualization of their genomic context with an integrated genome browser, the detection of putative binding motifs, the identification of over-represented Gene Ontology terms, pathway analysis and the visualization of interaction networks. The system is client-server based, accessible via a web browser and does not require any software installation on the client side. To demonstrate CASSys's functionality we used the system for the complete data analysis of a publicly available ChIP-seq study that investigated the role of the transcription factor estrogen receptor-α in breast cancer cells.

  3. Integrating Multimedia ICT Software in Language Curriculum: Students’ Perception, Use, and Effectiveness

    Directory of Open Access Journals (Sweden)

    Nikolai Penner

    2014-03-01

    Full Text Available Information and Communication Technologies (ICT) constitute an integral part of the teaching and learning environment in present-day educational institutions and play an increasingly important role in the modern second language classroom. In this study, the online language learning tool Tell Me More (TMM) was introduced as a supplementary tool in first- and second-year French and German university language classes. At the end of the academic year, the students completed a questionnaire exploring their TMM usage behaviour and perception of the software. The survey also addressed aspects of the respondents' readiness for self-directed language learning. The data were then imported into SPSS and underwent statistical analysis. The results of the study show that 1) relatively few of today's university students are open to the idea of voluntarily using ICT for independent language practice; 2) grade, price, and availability of alternative means of language practice are the most important factors affecting the students' decision to purchase and use ICT software; and 3) there is a relationship between the students' decision to buy and use ICT software and their readiness for self-directed learning.

  4. Experience Supporting the Integration of LHC Experiments Software Framework with the LCG Middleware

    CERN Document Server

    Santinelli, Roberto

    2006-01-01

    The LHC experiments are currently preparing for data acquisition in 2007 and, because of the large amount of required computing and storage resources, they have decided to embrace the grid paradigm. The LHC Computing Grid project (LCG) provides and operates a computing infrastructure suitable for data handling, Monte Carlo production and analysis. While LCG offers a set of high level services, intended to be generic enough to accommodate the needs of different Virtual Organizations, the LHC experiments' software frameworks and applications are very specific and focused on their computing and data models. The LCG Experiment Integration Support (EIS) team works in close contact with the experiments, the middleware developers and the LCG certification and operations teams to integrate the underlying grid middleware with the experiment-specific components. This strategic position between the experiments and the middleware suppliers allows the EIS team to play a key role at the communications level between the customers and the service providers...

  5. Integration of XNAT/PACS, DICOM, and research software for automated multi-modal image analysis

    Science.gov (United States)

    Gao, Yurui; Burns, Scott S.; Lauzon, Carolyn B.; Fong, Andrew E.; James, Terry A.; Lubar, Joel F.; Thatcher, Robert W.; Twillie, David A.; Wirt, Michael D.; Zola, Marc A.; Logan, Bret W.; Anderson, Adam W.; Landman, Bennett A.

    2013-03-01

    Traumatic brain injury (TBI) is an increasingly important public health concern. While there are several promising avenues of intervention, clinical assessments are relatively coarse and comparative quantitative analysis is an emerging field. Imaging data provide potentially useful information for evaluating TBI across functional, structural, and microstructural phenotypes. Integration and management of disparate data types are major obstacles. In a multi-institution collaboration, we are collecting electroencephalography (EEG), structural MRI, diffusion tensor MRI (DTI), and single photon emission computed tomography (SPECT) from a large cohort of US Army service members exposed to mild or moderate TBI who are undergoing experimental treatment. We have constructed a robust informatics backbone for this project centered on the DICOM standard and the eXtensible Neuroimaging Archive Toolkit (XNAT) server. Herein, we discuss (1) optimization of data transmission, validation and storage, (2) quality assurance and workflow management, and (3) integration of high performance computing with research software.

  6. Software Configuration Management Plan for the K West Basin Integrated Water Treatment System (IWTS) - Project A.9

    International Nuclear Information System (INIS)

    GREEN, J.W.

    2000-01-01

    This document provides a configuration control plan for the software associated with the operation and control of the Integrated Water Treatment System (IWTS). It establishes requirements for ensuring configuration item identification, configuration control, configuration status accounting, defect reporting and resolution of computer software. It is written to comply with HNF-SD-SNF-CM-001, Spent Nuclear Fuel Configuration Management Plan (Forehand 1998) and HNF-PRO-309 Computer Software Quality Assurance Requirements, and applicable sections of administrative procedure CM-6-037-00, SNF Project Process Automation Software and Equipment

  7. Cytoscape: a software environment for integrated models of biomolecular interaction networks.

    Science.gov (United States)

    Shannon, Paul; Markiel, Andrew; Ozier, Owen; Baliga, Nitin S; Wang, Jonathan T; Ramage, Daniel; Amin, Nada; Schwikowski, Benno; Ideker, Trey

    2003-11-01

    Cytoscape is an open source software project for integrating biomolecular interaction networks with high-throughput expression data and other molecular states into a unified conceptual framework. Although applicable to any system of molecular components and interactions, Cytoscape is most powerful when used in conjunction with large databases of protein-protein, protein-DNA, and genetic interactions that are increasingly available for humans and model organisms. Cytoscape's software Core provides basic functionality to layout and query the network; to visually integrate the network with expression profiles, phenotypes, and other molecular states; and to link the network to databases of functional annotations. The Core is extensible through a straightforward plug-in architecture, allowing rapid development of additional computational analyses and features. Several case studies of Cytoscape plug-ins are surveyed, including a search for interaction pathways correlating with changes in gene expression, a study of protein complexes involved in cellular recovery to DNA damage, inference of a combined physical/functional interaction network for Halobacterium, and an interface to detailed stochastic/kinetic gene regulatory models.
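    Cytoscape's central abstraction (a molecular interaction network carrying expression and other molecular state as node attributes) is easy to mimic at toy scale. The Python sketch below uses networkx to illustrate that unified view; the interactions and expression values are invented, and Cytoscape itself is a Java application with its own plug-in API.

```python
# Minimal sketch of Cytoscape's core idea using networkx: represent the
# interaction network as a graph and overlay molecular state (here an
# invented expression log2 fold-change) as node attributes for joint queries.
import networkx as nx

g = nx.Graph()
g.add_edges_from([("TP53", "MDM2"), ("TP53", "CDKN1A"), ("MDM2", "UBE2D1")])

expression = {"TP53": -0.2, "MDM2": 1.8, "CDKN1A": 2.3, "UBE2D1": 0.1}
nx.set_node_attributes(g, expression, name="log2fc")

# Query: interaction partners of TP53 that are strongly up-regulated
hits = [n for n in g.neighbors("TP53") if g.nodes[n]["log2fc"] > 1.0]
print("up-regulated TP53 partners:", hits)
```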

  8. Design and Implementation of Integrated Software Research and Community Service at State Polytechnic of Manado

    Science.gov (United States)

    Saroinsong, T.; A. S Kondoj, M.; Kandiyoh, G.; Pontoh, G.

    2018-01-01

    The State Polytechnic of Manado (Polimdo) was one of the first institutions in North Sulawesi to implement ISO 9001. However, its institutional accreditation has not yet been satisfactory, meaning there is still much to be prepared to achieve the expected target. One of the criteria in institutional accreditation concerns research and community service activities, in accordance with standard seven. The data documentation systems related to research and community service activities are not well integrated or well documented across the existing work units. This makes the process of gathering information on the activities and results of research and community service in support of institutional accreditation inefficient. This study aims to build integrated software across all work units in Polimdo to provide documentation and data synchronization in support of the reporting of accreditation documents in accordance with standard seven, specifically the submission of research and community service proposals. The software is developed using the RUP method, with analysis based on data flow diagrams and entity-relationship modelling (ERM), so that the result of this research is the documentation and synchronization of data and information on research and community service activities that can be used in preparing report documents for institutional accreditation.

  9. Integrated software system for seismic evaluation of nuclear power plant structures

    International Nuclear Information System (INIS)

    Xu, J.; Graves, H.L.

    1993-01-01

    The computer software CARES (Computer Analysis for Rapid Evaluation of Structures) was developed by Brookhaven National Laboratory for the U.S. Nuclear Regulatory Commission. It represents an effort to utilize established numerical methodologies commonly employed by industry for structural safety evaluations of nuclear power plant facilities and to incorporate them into an integrated computer software package operated on personal computers. CARES was developed with the objective of including all aspects of seismic performance evaluation of nuclear power structures. It can be used to evaluate the validity and accuracy of analysis methodologies used for structural safety evaluations of nuclear power plants by various utilities. CARES has a modular format, with each module performing a specific type of analysis. The seismic module integrates all the steps of a complete seismic analysis into a single package with many user-friendly features such as interactivity and quick turnaround. Linear structural theory and pseudo-linear convolution theory are utilized as the bases for the development, with special emphasis on the nuclear regulatory requirements for structural safety of nuclear plants. The seismic module is organized into eight options, each performing a specific step of the analysis, with most input/output interfacing processed by the general manager. Finally, CARES provides comprehensive post-processing capability for displaying results graphically or in tabular form so that direct comparisons can easily be made. (author)

  10. Primer3_masker: integrating masking of template sequence with primer design software.

    Science.gov (United States)

    Kõressaar, Triinu; Lepamets, Maarja; Kaplinski, Lauris; Raime, Kairi; Andreson, Reidar; Remm, Maido

    2018-06-01

    Designing PCR primers for amplifying regions of eukaryotic genomes is a complicated task because the genomes contain a large number of repeat sequences and other regions unsuitable for amplification by PCR. We have developed a novel k-mer based masking method that uses a statistical model to detect and mask failure-prone regions on the DNA template prior to primer design. We implemented the software as a standalone software primer3_masker and integrated it into the primer design program Primer3. The standalone version of primer3_masker is implemented in C. The source code is freely available at https://github.com/bioinfo-ut/primer3_masker/ (standalone version for Linux and macOS) and at https://github.com/primer3-org/primer3/ (integrated version). Primer3 web application that allows masking sequences of 196 animal and plant genomes is available at http://primer3.ut.ee/. maido.remm@ut.ee. Supplementary data are available at Bioinformatics online.
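    The masking idea is simple to sketch: positions covered by k-mers that are overly frequent in the target genome are soft-masked (lowercased) before primer design. The Python fragment below illustrates this with a tiny invented k-mer table and cutoff; primer3_masker's actual statistical model of PCR failure is more sophisticated, and the real tool is implemented in C.

```python
# Minimal sketch of k-mer based template masking: positions covered by
# k-mers that are too frequent in a precomputed genome k-mer table are
# lowercased so a downstream primer design step avoids them.
# The tiny k-mer table and cutoff below are invented for illustration.
K = 4
GENOME_KMER_COUNTS = {"ATAT": 50000, "TATA": 48000, "GCGC": 90}
FREQ_CUTOFF = 1000     # k-mers above this count are deemed failure-prone

def mask_template(seq):
    seq = seq.upper()
    masked = list(seq)
    for i in range(len(seq) - K + 1):
        if GENOME_KMER_COUNTS.get(seq[i:i + K], 0) > FREQ_CUTOFF:
            for j in range(i, i + K):          # soft-mask the whole window
                masked[j] = masked[j].lower()
    return "".join(masked)

print(mask_template("GGATATATACCGCGCTT"))   # the AT-repeat region gets masked
```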

  11. An integrated environment of software development and V and V for PLC based safety-critical systems

    International Nuclear Information System (INIS)

    Koo, Seo Ryong

    2005-02-01

    To develop and implement a safety-critical system, the requirements of the system must be analyzed thoroughly during the phases of the software development life cycle, because a single error in the requirements can generate serious software faults. We therefore propose an Integrated Environment (IE) approach for requirements, which enables easy inspection by combining requirement traceability with the effective use of a formal method. For the V and V tasks of the requirements phase, our approach uses software inspection, requirement traceability, and formal specification with structural decomposition. Software inspection and the analysis of requirements traceability are the most effective methods of software V and V. Although formal methods are also considered an effective V and V activity, they are difficult to use properly in the nuclear field, as well as in other fields, because of their mathematical nature. We also propose another Integrated Environment (IE) for the design and implementation of safety-critical systems. In this study, a nuclear FED-style design specification and analysis (NuFDS) approach was proposed for PLC based safety-critical systems. The NuFDS approach is suggested in a straightforward manner for the effective and formal specification and analysis of software designs. Accordingly, the proposed NuFDS approach comprises one technique for specifying the software design and another for analyzing it. In addition, with the NuFDS approach, we can analyze the safety of software on the basis of fault tree synthesis. To analyze the design phase more effectively, we propose a technique for fault tree synthesis, along with a universal fault tree template for the architecture modules of nuclear software. Various tools are needed to make software V and V more convenient. We therefore developed four kinds of computer-aided software engineering tools that could be used in accordance with the software's life cycle to
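    Once a fault tree has been synthesized for a software architecture, evaluating it is mechanical: basic-event probabilities propagate through AND/OR gates under an independence assumption. The Python sketch below shows that evaluation on an invented three-event tree; it illustrates the general technique only and is not taken from the NuFDS tooling.

```python
# Minimal sketch of evaluating a synthesized fault tree: basic events
# carry failure probabilities and AND/OR gates combine them assuming
# independence. The tree structure and numbers are illustrative only.
def p_and(*ps):
    out = 1.0
    for p in ps:
        out *= p                # all inputs must fail
    return out

def p_or(*ps):
    out = 1.0
    for p in ps:
        out *= (1.0 - p)        # survives only if no input fails
    return 1.0 - out

# Hypothetical top event:
# "safety function fails" = (input error OR logic error) AND watchdog fails
p_input_error = 1e-3
p_logic_error = 5e-4
p_watchdog_fail = 1e-2
p_top = p_and(p_or(p_input_error, p_logic_error), p_watchdog_fail)
print(f"top event probability ~ {p_top:.2e}")
```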

  12. The community-driven BiG CZ software system for integration and analysis of bio- and geoscience data in the critical zone

    Science.gov (United States)

    Aufdenkampe, A. K.; Mayorga, E.; Horsburgh, J. S.; Lehnert, K. A.; Zaslavsky, I.; Valentine, D. W., Jr.; Richard, S. M.; Cheetham, R.; Meyer, F.; Henry, C.; Berg-Cross, G.; Packman, A. I.; Aronson, E. L.

    2014-12-01

    Here we present the prototypes of a new scientific software system designed around the new Observations Data Model version 2.0 (ODM2, https://github.com/UCHIC/ODM2) to substantially enhance integration of biological and Geological (BiG) data for Critical Zone (CZ) science. The CZ science community takes as its charge the effort to integrate theory, models and data from the multitude of disciplines collectively studying processes on the Earth's surface. The central scientific challenge of the CZ science community is to develop a "grand unifying theory" of the critical zone through a theory-model-data fusion approach, for which the key missing need is a cyberinfrastructure for seamless 4D visual exploration of the integrated knowledge (data, model outputs and interpolations) from all the bio and geoscience disciplines relevant to critical zone structure and function, similar to today's ability to easily explore historical satellite imagery and photographs of the earth's surface using Google Earth. This project takes the first "BiG" steps toward answering that need. The overall goal of this project is to co-develop with the CZ science and broader community, including natural resource managers and stakeholders, a web-based integration and visualization environment for joint analysis of cross-scale bio and geoscience processes in the critical zone (BiG CZ), spanning experimental and observational designs. We will: (1) Engage the CZ and broader community to co-develop and deploy the BiG CZ software stack; (2) Develop the BiG CZ Portal web application for intuitive, high-performance map-based discovery, visualization, access and publication of data by scientists, resource managers, educators and the general public; (3) Develop the BiG CZ Toolbox to enable cyber-savvy CZ scientists to access BiG CZ Application Programming Interfaces (APIs); and (4) Develop the BiG CZ Central software stack to bridge data systems developed for multiple critical zone domains into a single

  13. Development and use of mathematical models and software frameworks for integrated analysis of agricultural systems and associated water use impacts

    Science.gov (United States)

    Fowler, K. R.; Jenkins, E.W.; Parno, M.; Chrispell, J.C.; Colón, A. I.; Hanson, Randall T.

    2016-01-01

    The development of appropriate water management strategies requires, in part, a methodology for quantifying and evaluating the impact of water policy decisions on regional stakeholders. In this work, we describe the framework we are developing to enhance the body of resources available to policy makers, farmers, and other community members in their efforts to understand, quantify, and assess the often competing objectives water consumers have with respect to usage. The foundation for the framework is the construction of a simulation-based optimization software tool using two existing software packages. In particular, we couple a robust optimization software suite (DAKOTA) with the USGS MF-OWHM water management simulation tool to provide a flexible software environment that enables the evaluation of one or multiple (possibly competing) user-defined (or stakeholder) objectives. We introduce the individual software components and outline the communication strategy we defined for the coupled development. We present numerical results for case studies related to crop portfolio management with several defined objectives. The objectives are not optimally satisfied for any single user class, demonstrating the capability of the software tool to aid in the evaluation of a variety of competing interests.
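    The coupling pattern described here (an optimizer repeatedly invoking a simulation code inside its objective function) can be sketched compactly. The Python fragment below imitates it with SciPy's optimizer and a stand-in "simulator"; the crop, profit, and water-use numbers are invented, and the real framework uses DAKOTA and MF-OWHM rather than these toys.

```python
# Minimal sketch of simulation-based optimization: an optimizer proposes
# crop acreages, a mock "simulator" returns water use, and the objective
# trades farm profit against pumping cost. All numbers are invented.
import numpy as np
from scipy.optimize import minimize

PROFIT = np.array([900.0, 400.0])      # $ per acre for two crops
WATER = np.array([3.5, 1.2])           # acre-feet per acre

def simulate_water_use(acres):
    """Stand-in for a water-management model run: total pumping (acre-feet)."""
    return float(WATER @ acres)

def objective(acres):
    profit = float(PROFIT @ acres)
    pumping = simulate_water_use(acres)
    return -(profit - 250.0 * pumping)  # maximize profit net of water cost

res = minimize(objective, x0=[50.0, 50.0],
               bounds=[(0.0, 100.0), (0.0, 100.0)],
               constraints=[{"type": "ineq",
                             "fun": lambda a: 100.0 - a.sum()}])  # land limit
print("optimal acreage:", res.x, "objective:", -res.fun)
```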

  14. Integrating Testing into Software Engineering Courses Supported by a Collaborative Learning Environment

    Science.gov (United States)

    Clarke, Peter J.; Davis, Debra; King, Tariq M.; Pava, Jairo; Jones, Edward L.

    2014-01-01

    As software becomes more ubiquitous and complex, the cost of software bugs continues to grow at a staggering rate. To remedy this situation, there needs to be major improvement in the knowledge and application of software validation techniques. Although there are several software validation techniques, software testing continues to be one of the…

  15. IClinfMRI Software for Integrating Functional MRI Techniques in Presurgical Mapping and Clinical Studies.

    Science.gov (United States)

    Hsu, Ai-Ling; Hou, Ping; Johnson, Jason M; Wu, Changwei W; Noll, Kyle R; Prabhu, Sujit S; Ferguson, Sherise D; Kumar, Vinodh A; Schomer, Donald F; Hazle, John D; Chen, Jyh-Horng; Liu, Ho-Ling

    2018-01-01

    Task-evoked and resting-state (rs) functional magnetic resonance imaging (fMRI) techniques have been applied to the clinical management of neurological diseases, exemplified by presurgical localization of eloquent cortex, to assist neurosurgeons in maximizing resection while preserving brain functions. In addition, recent studies have recommended incorporating cerebrovascular reactivity (CVR) imaging into clinical fMRI to evaluate the risk of lesion-induced neurovascular uncoupling (NVU). Although each of these imaging techniques possesses its own advantage for presurgical mapping, a specialized clinical software that integrates the three complementary techniques and promptly outputs the analyzed results to radiology and surgical navigation systems in a clinical format is still lacking. We developed the Integrated fMRI for Clinical Research (IClinfMRI) software to facilitate these needs. Beyond the independent processing of task-fMRI, rs-fMRI, and CVR mapping, IClinfMRI encompasses three unique functions: (1) supporting the interactive rs-fMRI mapping while visualizing task-fMRI results (or results from published meta-analysis) as a guidance map, (2) indicating/visualizing the NVU potential on analyzed fMRI maps, and (3) exporting these advanced mapping results in a Digital Imaging and Communications in Medicine (DICOM) format that are ready to export to a picture archiving and communication system (PACS) and a surgical navigation system. In summary, IClinfMRI has the merits of efficiently translating and integrating state-of-the-art imaging techniques for presurgical functional mapping and clinical fMRI studies.

  16. The BRITNeY Suite Animation Tool

    DEFF Research Database (Denmark)

    Westergaard, Michael; Lassen, Kristian Bisgaard

    2006-01-01

    This paper describes the BRITNeY suite, a tool which enables users to create visualizations of formal models. The BRITNeY suite is integrated with CPN Tools, and we give an example of how to extend a simple stop-and-wait protocol with a visualization in the form of message sequence charts. We also show examples of animations created during industrial projects to give an impression of what is possible with the BRITNeY suite.

  17. Instant Spring Tool Suite

    CERN Document Server

    Chiang, Geoff

    2013-01-01

    Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks. A tutorial guide that walks you through the features of Spring Tool Suite, using well-defined sections for the different parts of Spring. Instant Spring Tool Suite is for novice to intermediate Java developers looking to get a head start in enterprise application development using Spring Tool Suite and the Spring framework. If you are looking for a guide to effective application development using Spring Tool Suite, then this book is for you.

  18. XplorSeq: a software environment for integrated management and phylogenetic analysis of metagenomic sequence data.

    Science.gov (United States)

    Frank, Daniel N

    2008-10-07

    Advances in automated DNA sequencing technology have accelerated the generation of metagenomic DNA sequences, especially environmental ribosomal RNA gene (rDNA) sequences. As the scale of rDNA-based studies of microbial ecology has expanded, need has arisen for software that is capable of managing, annotating, and analyzing the plethora of diverse data accumulated in these projects. XplorSeq is a software package that facilitates the compilation, management and phylogenetic analysis of DNA sequences. XplorSeq was developed for, but is not limited to, high-throughput analysis of environmental rRNA gene sequences. XplorSeq integrates and extends several commonly used UNIX-based analysis tools by use of a Macintosh OS-X-based graphical user interface (GUI). Through this GUI, users may perform basic sequence import and assembly steps (base-calling, vector/primer trimming, contig assembly), perform BLAST (Basic Local Alignment Search Tool) searches of NCBI and local databases, create multiple sequence alignments, build phylogenetic trees, assemble Operational Taxonomic Units, estimate biodiversity indices, and summarize data in a variety of formats. Furthermore, sequences may be annotated with user-specified meta-data, which then can be used to sort data and organize analyses and reports. A document-based architecture permits parallel analysis of sequence data from multiple clones or amplicons, with sequences and other data stored in a single file. XplorSeq should benefit researchers who are engaged in analyses of environmental sequence data, especially those with little experience using bioinformatics software. Although XplorSeq was developed for management of rDNA sequence data, it can be applied to most any sequencing project. The application is available free of charge for non-commercial use at http://vent.colorado.edu/phyloware.
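    One of the downstream steps mentioned (estimating biodiversity indices from assembled Operational Taxonomic Units) is simple to illustrate. The sketch below computes the Shannon index from per-OTU sequence counts in Python; the counts are invented, and XplorSeq itself is a Macintosh GUI application rather than a script.

```python
# Minimal sketch of one downstream XplorSeq-style step: estimating a
# biodiversity index (Shannon H') from per-OTU sequence counts after
# OTU assembly. The counts below are invented.
import math

otu_counts = {"OTU_1": 120, "OTU_2": 45, "OTU_3": 30, "OTU_4": 5}

def shannon_index(counts):
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

print(f"Shannon H' = {shannon_index(otu_counts.values()):.3f}")
```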

  19. Virtual pools for interactive analysis and software development through an integrated Cloud environment

    International Nuclear Information System (INIS)

    Grandi, C; Italiano, A; Salomoni, D; Melcarne, A K Calabrese

    2011-01-01

    WNoDeS, an acronym for Worker Nodes on Demand Service, is software developed at CNAF-Tier1, the National Computing Centre of the Italian Institute for Nuclear Physics (INFN) located in Bologna. WNoDeS provides on-demand, integrated access to both Grid and Cloud resources through virtualization technologies. Besides the traditional use of computing resources in batch mode, users need interactive and local access to a number of systems. WNoDeS can dynamically select these computers, instantiating Virtual Machines according to the users' requirements (computing, storage and network resources), through either the Open Cloud Computing Interface API or a web console. Interactive use is usually limited to activities in user space, i.e. where the machine configuration is not modified. In other instances the activity concerns the development and testing of services and thus implies modification of the system configuration (and, therefore, root access to the resource). The former use case is a simple extension of the WNoDeS approach, where the resource is provided in interactive mode. The latter implies saving the virtual image at the end of each user session so that it can be presented to the user at subsequent requests. This work describes how the LHC experiments at INFN-Bologna are testing and making use of these dynamically created ad-hoc machines via WNoDeS to support flexible, interactive analysis and software development at the INFN Tier-1 Computing Centre.

  20. Transformation as a Design Process and Runtime Architecture for High Integrity Software

    Energy Technology Data Exchange (ETDEWEB)

    Bespalko, S.J.; Winter, V.L.

    1999-04-05

    We have discussed two aspects of creating high integrity software that greatly benefit from the availability of transformation technology, which in this case is manifested by the requirement for a sophisticated backtracking parser. First, because of the potential for correctly manipulating programs via small changes, an automated non-procedural transformation system can be a valuable tool for constructing high assurance software. Second, modeling the process of translating data into information as a (perhaps context-dependent) grammar leads to an efficient, compact implementation. From a practical perspective, the transformation process should begin in the domain language in which a problem is initially expressed; thus, in order for a transformation system to be practical, it must be flexible with respect to domain-specific languages. We have argued that transformation applied to specifications results in a highly reliable system. We also attempted to briefly demonstrate that transformation technology applied to the runtime environment will result in a safe and secure system. We thus believe that sophisticated multi-lookahead backtracking parsing technology is central to the task of demonstrating the existence of high integrity software (HIS).

  1. A software tool for integrated risk assessment of spent fuel transportation and storage

    Energy Technology Data Exchange (ETDEWEB)

    Yun, Mi Rae; Almomani, Belal; Ham, Jae Hyun; Kang, Hyun Gook [Dept. of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Christian, Robby [Dept. of Mechanical, Aerospace, and Nuclear Engineering, Rensselaer Polytechnic Institute, Troy (United States); Kim, Bo Gyung [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of); Lee, Sang Hoon [Dept. of Mechanical and Automotive Engineering, Keimyung University, Daegu (Korea, Republic of)]

    2017-06-15

    When temporary spent fuel storage pools at nuclear power plants reach their capacity limit, the spent fuel must be moved to an alternative storage facility. However, radioactive materials must be handled and stored carefully to avoid severe consequences to the environment. In this study, the risks of three potential accident scenarios (i.e., maritime transportation, an aircraft crashing into an interim storage facility, and on-site transportation) associated with the spent fuel transportation process were analyzed using a probabilistic approach. For each scenario, the probabilities and the consequences were calculated separately to assess the risks: the probabilities were calculated using existing data and statistical models, and the consequences were calculated using computation models. Risk assessment software was developed to conveniently integrate the three scenarios. The risks were analyzed using the developed software according to the shipment route, building characteristics, and spent fuel handling environment. As a result of the risk analysis with varying accident conditions, transportation and storage strategies with relatively low risk were developed for regulators and licensees. The focus of this study was the risk assessment methodology; however, the applied model and input data have some uncertainties. Further research to reduce these uncertainties will improve the accuracy of this model.

  2. A software tool for integrated risk assessment of spent fuel transportation and storage

    International Nuclear Information System (INIS)

    Yun, Mi Rae; Almomani, Belal; Ham, Jae Hyun; Kang, Hyun Gook; Christian, Robby; Kim, Bo Gyung; Lee, Sang Hoon

    2017-01-01

    When temporary spent fuel storage pools at nuclear power plants reach their capacity limit, the spent fuel must be moved to an alternative storage facility. However, radioactive materials must be handled and stored carefully to avoid severe consequences to the environment. In this study, the risks of three potential accident scenarios (i.e., maritime transportation, an aircraft crashing into an interim storage facility, and on-site transportation) associated with the spent fuel transportation process were analyzed using a probabilistic approach. For each scenario, the probabilities and the consequences were calculated separately to assess the risks: the probabilities were calculated using existing data and statistical models, and the consequences were calculated using computation models. Risk assessment software was developed to conveniently integrate the three scenarios. The risks were analyzed using the developed software according to the shipment route, building characteristics, and spent fuel handling environment. As a result of the risk analysis with varying accident conditions, transportation and storage strategies with relatively low risk were developed for regulators and licensees. The focus of this study was the risk assessment methodology; however, the applied model and input data have some uncertainties. Further research to reduce these uncertainties will improve the accuracy of this model.
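    At its core, the integration the tool performs is probabilistic risk bookkeeping: each scenario contributes frequency times consequence, and the scenario risks are summed. A toy Python illustration follows; the frequencies and consequences are invented placeholders, not values from the study.

```python
# Minimal sketch of integrated probabilistic risk bookkeeping: for each
# accident scenario, risk = frequency x consequence, summed over the
# scenarios. All numbers below are invented placeholders.
scenarios = {
    # scenario: (frequency per shipment campaign, consequence in person-Sv)
    "maritime transport accident": (2.0e-6, 40.0),
    "aircraft crash on interim storage": (5.0e-8, 900.0),
    "on-site transport drop": (1.0e-5, 3.0),
}

total_risk = 0.0
for name, (freq, consequence) in scenarios.items():
    risk = freq * consequence
    total_risk += risk
    print(f"{name}: {risk:.2e} person-Sv per campaign")
print(f"integrated risk: {total_risk:.2e} person-Sv per campaign")
```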

  3. GMATA: An Integrated Software Package for Genome-Scale SSR Mining, Marker Development and Viewing.

    Science.gov (United States)

    Wang, Xuewen; Wang, Le

    2016-01-01

    Simple sequence repeats (SSRs), also referred to as microsatellites, are highly variable tandem DNAs that are widely used as genetic markers. The increasing availability of whole-genome and transcript sequences provides information resources for SSR marker development. However, efficient software is required to identify and display SSR information along with other gene features at a genome scale. We developed a novel software package, the Genome-wide Microsatellite Analyzing Tool Package (GMATA), which integrates SSR mining, statistical analysis and plotting, marker design, polymorphism screening and marker transferability assessment, and which enables SSR markers to be displayed simultaneously with other genome features. GMATA applies novel strategies for SSR analysis and primer design in large genomes, which allow it to calculate faster and provide more accurate results than existing tools. Our package is also capable of processing DNA sequences of any size on a standard computer. GMATA is user friendly, requires only mouse clicks or typed inputs on the command line, and is executable on multiple computing platforms. We demonstrated the application of GMATA in plant genomes and revealed a novel distribution pattern of SSRs in 15 grass genomes. The most abundant motifs are the GA/TC dimer, the A/T monomer and the GCG/CGC trimer, rather than motifs rich in G/C content. We also revealed that SSR count scales linearly with chromosome length in fully assembled grass genomes. GMATA represents a powerful application tool that facilitates genomic sequence analyses. GMATA is freely available at http://sourceforge.net/projects/gmata/?source=navbar.
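    The mining step at the heart of such tools can be approximated with a back-referencing regular expression that finds short motifs repeated in tandem. The Python sketch below shows the idea; the motif-length and repeat-count thresholds are illustrative, and GMATA's own scanning strategy is more elaborate.

```python
# Minimal sketch of SSR (microsatellite) mining: scan a sequence for
# tandem repeats of short motifs (here 1-3 bp, repeated 5+ times) with a
# back-referencing regular expression. Thresholds are illustrative only.
import re

SSR_RE = re.compile(r"([ACGT]{1,3})\1{4,}")   # motif of 1-3 bp repeated 5+ times

def find_ssrs(seq):
    for m in SSR_RE.finditer(seq.upper()):
        motif = m.group(1)
        yield motif, len(m.group(0)) // len(motif), m.start()

seq = "TTGAGAGAGAGAGACCGTATATATATATATGCAAAAAAAG"
for motif, n, pos in find_ssrs(seq):
    print(f"({motif}){n} at position {pos}")
```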

  4. Validation suite for MCNP

    International Nuclear Information System (INIS)

    Mosteller, Russell D.

    2002-01-01

    Two validation suites, one for criticality and another for radiation shielding, have been defined and tested for the MCNP Monte Carlo code. All of the cases in the validation suites are based on experiments so that calculated and measured results can be compared in a meaningful way. The cases in the validation suites are described, and results from those cases are discussed. For several years, the distribution package for the MCNP Monte Carlo code has included an installation test suite to verify that MCNP has been installed correctly. However, the cases in that suite have been constructed primarily to test options within the code and to execute quickly. Consequently, they do not produce well-converged answers, and many of them are physically unrealistic. To remedy these deficiencies, sets of validation suites are being defined and tested for specific types of applications. All of the cases in the validation suites are based on benchmark experiments. Consequently, the results from the measurements are reliable and quantifiable, and calculated results can be compared with them in a meaningful way. Currently, validation suites exist for criticality and radiation-shielding applications.

  5. Pharmacy settles suit.

    Science.gov (United States)

    1998-10-02

    A suit was filed by an HIV-positive man against a pharmacy that inadvertently disclosed his HIV status to his ex-wife and children. His ex-wife tried to use the information in a custody battle for their two children. The suit against the pharmacy was settled, but the terms of the settlement remain confidential.

  6. Software Uncertainty in Integrated Environmental Modelling: the role of Semantics and Open Science

    Science.gov (United States)

    de Rigo, Daniele

    2013-04-01

    Computational aspects increasingly shape environmental sciences [1]. Actually, transdisciplinary modelling of complex and uncertain environmental systems is challenging computational science (CS) and also the science-policy interface [2-7]. Large spatial-scale problems falling within this category - i.e. wide-scale transdisciplinary modelling for environment (WSTMe) [8-10] - often deal with factors (a) for which deep-uncertainty [2,11-13] may prevent usual statistical analysis of modelled quantities and which need different ways for providing policy-making with science-based support. Here, practical recommendations are proposed for tempering a peculiar - not infrequently underestimated - source of uncertainty. Software errors in complex WSTMe may subtly affect the outcomes with possible consequences even on collective environmental decision-making. Semantic transparency in CS [2,8,10,14,15] and free software [16,17] are discussed as possible mitigations (b). Software uncertainty, black-boxes and free software. Integrated natural resources modelling and management (INRMM) [29] frequently exploits chains of nontrivial data-transformation models (D-TM), each of them affected by uncertainties and errors. Those D-TM chains may be packaged as monolithic specialized models, maybe only accessible as black-box executables (if accessible at all) [50]. For end-users, black-boxes merely transform inputs into the final outputs, relying on classical peer-reviewed publications for describing the internal mechanism. While software tautologically plays a vital role in CS, it is often neglected in favour of more theoretical aspects. This paradox has been provocatively described as "the invisibility of software in published science. Almost all published papers required some coding, but almost none mention software, let alone include or link to source code" [51]. Recently, this primacy of theory over reality [52-54] has been challenged by new emerging hybrid approaches [55] and by the

  7. Open source projects as incubators of innovation: From niche phenomenon to integral part of the software industry

    OpenAIRE

    Schrape, Jan-Felix

    2017-01-01

    Over the last 20 years, open source development has become an integral part of the software industry and a key component of the innovation strategies of all major IT providers. Against this backdrop, this paper seeks to develop a systematic overview of open source communities and their socio-economic contexts. I begin with a reconstruction of the genesis of open source software projects and their changing relationships to established IT companies. This is followed by the identification of f...

  8. ASDA - Advanced Suit Design Analyzer computer program

    Science.gov (United States)

    Bue, Grant C.; Conger, Bruce C.; Iovine, John V.; Chang, Chi-Min

    1992-01-01

    An ASDA model developed to evaluate the heat and mass transfer characteristics of advanced pressurized suit design concepts for low pressure or vacuum planetary applications is presented. The model is based on a generalized 3-layer suit that uses the Systems Integrated Numerical Differencing Analyzer '85 in conjunction with a 41-node FORTRAN routine. The latter simulates the transient heat transfer and respiratory processes of a human body in a suited environment. The user options for the suit encompass a liquid cooled garment, a removable jacket, a CO2/H2O permeable layer, and a phase change layer.

  9. Problem of Office Suite Training at the University

    Directory of Open Access Journals (Sweden)

    Natalia A. Nastashchuk

    2013-01-01

    Full Text Available The paper considers the problem of office suite applications training, caused by the rapid change of their versions, the variety of software developers, and the rapid development of software and hardware platforms. The content of office suite applications training, based on a system of office suite notions, their basic functionality, and standards of information technology development (OpenDocument Format Standard, ISO 26300-200X), is presented.

  10. The CMS software performance at the start of data taking

    CERN Document Server

    Benelli, Gabriele

    2009-01-01

    The CMS software framework (CMSSW) is a complex project evolving very rapidly as the first LHC colliding beams approach. The computing requirements constrain performance in terms of CPU time, memory footprint and event size on disk to allow for planning and managing the computing infrastructure necessary to handle the needs of the experiment. A performance suite of tools has been developed to track all aspects of code performance, through the software release cycles, allowing for regression and guiding code development for optimization. In this talk, we describe the CMSSW performance suite tools used and present some sample performance results from the release integration process for the CMS software.
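
    A minimal sketch of the three metrics such a performance suite tracks per job (illustrative Python only; CMSSW's actual tooling is integrated into its C++/Python framework, and the function and argument names here are ours):

    ```python
    import os
    import resource  # Unix-only; ru_maxrss is KiB on Linux, bytes on macOS
    import time

    def measure_job(process_event, events, output_path):
        """Measure the three metrics named in the abstract for one job:
        CPU time, peak memory footprint, and event size on disk.
        `events` is assumed to be a sequence; `output_path` is the file
        the job wrote its events to."""
        cpu0 = time.process_time()
        for event in events:
            process_event(event)
        cpu_time = time.process_time() - cpu0
        peak_rss = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
        bytes_per_event = os.path.getsize(output_path) / max(len(events), 1)
        return {"cpu_s": cpu_time,
                "peak_rss": peak_rss,
                "bytes_per_event": bytes_per_event}
    ```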

  11. Measuring CMS Software Performance in the first years of LHC collisions

    CERN Document Server

    Benelli, Gabriele; Pfeiffer, Andreas; Piparo, Danilo; Zemleris, Vidmantas

    2011-01-01

    The CMSSW software framework is a complex project enabling the CMS collaboration to investigate the fast growing LHC collision data sample. A software performance suite of tools has been developed and integrated in CMSSW to keep track of CPU time, memory footprint and event size on disk. These three metrics are key constraints in software development in order to meet the computing requirements used in the planning and management of the CMS computing infrastructure. The performance suite allows the measurement and tracking of the performance across the framework, publishing the results in a dedicated database. A web application makes the results easily accessible to software release managers, allowing for automatic integration in the CMSSW release cycle quality assurance. The performance suite is also available to individual developers for dedicated code optimization, and the web application allows historic regression and comparisons across releases. The performance suite tools and the performance of the CMSSW frame...

  12. Integrated SCM/PDM/CRM and delivery of software products to 160.000 customers

    NARCIS (Netherlands)

    R.L. Jansen (Remy); G.C. Ballintijn (Gerco); S. Brinkkemper; A. van Nieuwland

    2004-01-01

    The release and deployment of enterprise application software is a potentially complex task for software vendors. This complexity can unfortunately result in a significant amount of work and risk. This paper presents a case study of a product software vendor that tries to reduce this

  13. COMSY - A software tool for PLIM + PLEX with integrated risk-informed approaches

    International Nuclear Information System (INIS)

    Zander, A.; Nopper, H.; Roessner, R.

    2004-01-01

    The majority of mechanical components and structures in a thermal power plant are designed to experience a service life which is far above the intended design life. In most cases, only a small percentage of mechanical components are subject to significant degradation which may affect the integrity or the function of the component. If plant life extension (PLEX) is considered as an option, a plant specific PLIM strategy needs to be developed. One of the most important tasks of such a PLIM strategy is to identify those components which (i) are relevant for the safety and/or availability of the plant and (ii) experience elevated degradation due to their operating and design conditions. For these components special life management strategies need to be established to reliably monitor their condition. FRAMATOME ANP GmbH has developed the software tool COMSY, which is designed to efficiently support a plant-wide lifetime management strategy for static mechanical components, providing the basis for plant life extension (PLEX) activities. The objective is the economical and safe operation of power plants over their design lifetime - and beyond. The tool provides the capability to establish a program guided technical documentation of the plant by utilizing a virtual plant data model. The software integrates engineering analysis functions and comprehensive material libraries to perform a lifetime analysis for various degradation mechanisms typically experienced in power plants (e.g. flow-accelerated corrosion, intergranular stress corrosion cracking, strain-induced cracking, material fatigue, cavitation erosion, droplet impingement erosion, pitting, etc.). A risk-based prioritization serves to focus inspection activities on safety or availability relevant locations, where a degradation potential exists. Trending functions support the comparison of the as-measured condition with the predicted progress of degradation while making allowance for measurement tolerances. The

  14. The IFPUG guide to IT and software measurement

    CERN Document Server

    IFPUG

    2012-01-01

    The widespread deployment of millions of current and emerging software applications has placed software economic studies among the most critical of any form of business analysis. Unfortunately, a lack of an integrated suite of metrics makes software economic analysis extremely difficult. The International Function Point Users Group (IFPUG), a nonprofit and member-governed organization, has become the recognized leader in promoting the effective management of application software development and maintenance activities. The IFPUG Guide to IT and Software Measurement brings together 52 leading so

  15. Integrated graphical user interface for the back-end software sub-system

    International Nuclear Information System (INIS)

    Badescu, E.; Caprini, M.

    2001-01-01

    The ATLAS data acquisition and Event Filter prototype '-1' project was intended to produce a prototype system for evaluating candidate technologies and architectures for the final ATLAS DAQ system on the LHC accelerator at CERN. Within the prototype project, the back-end sub-system encompasses the software for configuring, controlling and monitoring the data acquisition (DAQ). The back-end sub-system includes core components and detector integration components. One of the detector integration components is the Integrated Graphical User Interface (IGUI), which is intended to give a view of the status of the DAQ system and its sub-systems (Dataflow, Event Filter and Back-end) and to allow the user (general users, such as a shift operator at a test beam, or experts who need to control and debug the DAQ system) to control its operation. The IGUI is intended to be both a Status Display and a Control Interface, so there are three groups of functional requirements: display requirements (the information to be displayed); control requirements (the actions the IGUI shall perform on the DAQ components); and general requirements, applying to the general functionality of the IGUI. The constraint requirements include requirements related to access control (shift operator or expert user). The quality requirements are related to portability across different platforms. The IGUI has to interact with many components in a distributed environment. The following design guidelines have been considered in order to fulfil the requirements: use a modular design that can easily integrate different sub-systems; use the Java language for portability and powerful graphical features; use CORBA interfaces for communication with other components. The actual implementation of Back-end software components uses Inter-Language Unification (ILU) for inter-process communication. Different methods of access of Java applications to ILU C++ servers have been evaluated (native methods, ILU Java support

  16. Quantitative Assessment of Free Flap Viability with CEUS Using an Integrated Perfusion Software.

    Science.gov (United States)

    Geis, S; Klein, S; Prantl, L; Dolderer, J; Lamby, P; Jung, E-M

    2015-12-01

    New treatment strategies in oncology and trauma surgery lead to an increasing demand for soft tissue reconstruction with free tissue transfer. In previous studies, CEUS was proven to detect early flap failure. The aim of this study was to detect and quantify vascular disturbances after free flap transplantation using a fast integrated perfusion software tool. From 2011 to 2013, 33 patients were examined by one experienced radiologist using CEUS after a bolus injection of 1-2.4 ml of SonoVue®. Flap perfusion was analysed qualitatively regarding contrast defects or delayed wash-in. Additionally, an integrated semi-quantitative analysis using time-intensity curve (TIC) analysis was performed. TIC analysis of the transplant was conducted on a centimetre-by-centimetre basis up to a penetration depth of 4 cm. The two perfusion parameters "Time to PEAK" and "Area under the Curve" were compared in patients without complications vs. patients with minor complications or complete flap loss to identify significant differences. TtoPk is given in seconds (s) and Area is given in relative units (rU). Results: A regular postoperative course was observed in 26 (79%) patients. In contrast, 5 (15%) patients with partial superficial flap necrosis, 1 patient (3%) with complete flap loss and 1 patient (3%) with haematoma were observed. TtoPk revealed no significant differences, whereas Area revealed significantly lower perfusion values in the corresponding areas in patients with complications. The critical threshold for sufficient flap perfusion was set below 150 rU. In conclusion, CEUS is a mobile and cost-effective option for quantifying tissue perfusion and can be used almost without restriction even in multi-morbid patients with renal and hepatic failure. © Georg Thieme Verlag KG Stuttgart · New York.
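
    The two perfusion parameters can be read off a time-intensity curve in a few lines; this is a schematic sketch, not the integrated vendor software's algorithm:

    ```python
    import numpy as np

    def tic_parameters(t, intensity):
        """Derive the two perfusion parameters named in the abstract from a
        time-intensity curve: Time to PEAK (s) and Area under the Curve
        (relative units)."""
        t = np.asarray(t, dtype=float)
        y = np.asarray(intensity, dtype=float)
        time_to_peak = t[np.argmax(y)] - t[0]
        area = np.sum((y[1:] + y[:-1]) * np.diff(t)) / 2.0  # trapezoidal rule
        return time_to_peak, area

    # Synthetic bolus wash-in/wash-out curve for illustration (not study data).
    t = np.linspace(0.0, 60.0, 121)
    curve = 200.0 * np.exp(-((t - 15.0) ** 2) / 50.0)
    ttopk, auc = tic_parameters(t, curve)
    ```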

  17. SpecOp: Optimal Extraction Software for Integral Field Unit Spectrographs

    Science.gov (United States)

    McCarron, Adam; Ciardullo, Robin; Eracleous, Michael

    2018-01-01

    The Hobby-Eberly Telescope’s new low-resolution integral field spectrographs, LRS2-B and LRS2-R, each cover a 12”x6” area on the sky with 280 fibers and generate spectra with resolutions between R=1100 and R=1900. To extract 1-D spectra from the instrument’s 3D data cubes, a program is needed that is flexible enough to work for a wide variety of targets, including continuum point sources, emission line sources, and compact sources embedded in complex backgrounds. We therefore introduce SpecOp, a user-friendly Python program for optimally extracting spectra from integral-field unit spectrographs. As input, SpecOp takes a sky-subtracted data cube consisting of images at each wavelength increment set by the instrument’s spectral resolution, and an error file for each count measurement. All of these files are generated by the current LRS2 reduction pipeline. The program then collapses the cube in the image plane using the optimal extraction algorithm detailed by Keith Horne (1986). The various user-selected options include the fraction of the total signal enclosed in a contour-defined region, the wavelength range to analyze, and the precision of the spatial profile calculation. SpecOp can output the weighted counts and errors at each wavelength in various table formats using Python’s astropy package. We outline the algorithm used for extraction and explain how the software can be used to easily obtain high-quality 1-D spectra. We demonstrate the utility of the program by applying it to spectra of a variety of quasars and AGNs. In some of these targets, we extract the spectrum of a nuclear point source that is superposed on a spatially extended galaxy.
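
    The Horne (1986) weighting the abstract cites reduces, per wavelength slice, to inverse-variance weighting by the spatial profile; a minimal sketch (variable names are ours, not SpecOp's):

    ```python
    import numpy as np

    def optimal_extract_slice(data, variance, profile):
        """Horne (1986) optimal extraction for one wavelength slice.

        data     -- sky-subtracted counts per spatial element (1-D array)
        variance -- variance of each count measurement
        profile  -- spatial profile of the source (normalised below)

        Returns the optimally weighted flux and its variance.
        """
        p = np.asarray(profile, dtype=float)
        p = p / p.sum()                        # profile must sum to one
        v = np.asarray(variance, dtype=float)
        w = p * p / v                          # inverse-variance weights
        flux = np.sum(p * data / v) / np.sum(w)
        flux_var = 1.0 / np.sum(w)
        return flux, flux_var
    ```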

  18. HARVESTING, INTEGRATING AND DISTRIBUTING LARGE OPEN GEOSPATIAL DATASETS USING FREE AND OPEN-SOURCE SOFTWARE

    Directory of Open Access Journals (Sweden)

    R. Oliveira

    2016-06-01

    Full Text Available Federal, State and Local government agencies in the USA are investing heavily in the dissemination of Open Data sets produced by each of them. The main driver behind this thrust is to increase agencies’ transparency and accountability, as well as to improve citizens’ awareness. However, not all Open Data sets are easy to access and integrate with other Open Data sets available, even from the same agency. The City and County of Denver Open Data Portal distributes several types of geospatial datasets; one of them is the city parcels information, containing 224,256 records. Although this data layer contains many pieces of information, it is incomplete for some custom purposes. Open-source software was used first to collect data from diverse City of Denver Open Data sets, then to upload them to a repository in the cloud, where they were processed using a PostgreSQL installation on the cloud and Python scripts. Our method was able to extract non-spatial information from a ‘not-ready-to-download’ source that could then be combined with the initial data set to enhance its potential use.
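
    A minimal sketch of such a harvest-and-load step (the URL, table name and column names below are placeholders, not the Denver portal's actual schema):

    ```python
    import csv
    import io
    import urllib.request

    import psycopg2  # assumes a reachable PostgreSQL instance

    # Hypothetical harvest of one open data set into a cloud database.
    URL = "https://example.org/opendata/parcels.csv"

    reader = csv.DictReader(io.TextIOWrapper(urllib.request.urlopen(URL), "utf-8"))
    conn = psycopg2.connect("dbname=opendata")
    with conn, conn.cursor() as cur:
        cur.execute("CREATE TABLE IF NOT EXISTS parcels "
                    "(parcel_id TEXT PRIMARY KEY, owner TEXT)")
        for row in reader:
            cur.execute("INSERT INTO parcels VALUES (%s, %s) "
                        "ON CONFLICT (parcel_id) DO NOTHING",
                        (row["PARCEL_ID"], row["OWNER"]))
    conn.close()
    ```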

  19. Experience with Intel's many integrated core architecture in ATLAS software

    International Nuclear Information System (INIS)

    Fleischmann, S; Neumann, M; Kama, S; Lavrijsen, W; Vitillo, R

    2014-01-01

    Intel recently released the first commercial boards of its Many Integrated Core (MIC) Architecture. MIC is Intel's solution for the domain of throughput computing, currently dominated by general purpose programming on graphics processors (GPGPU). MIC allows the use of the more familiar x86 programming model and supports standard technologies such as OpenMP, MPI, and Intel's Threading Building Blocks (TBB). This should make it possible to develop for both throughput and latency devices using a single code base. In ATLAS Software, track reconstruction has been shown to be a good candidate for throughput computing on GPGPU devices. In addition, the newly proposed offline parallel event-processing framework, GaudiHive, uses TBB for task scheduling. The MIC is thus, in principle, a good fit for this domain. In this paper, we report our experiences of porting to and optimizing ATLAS tracking algorithms for the MIC, comparing the programmability and relative cost/performance of the MIC against those of current GPGPUs and latency-optimized CPUs.

  20. Infrastructure for Multiphysics Software Integration in High Performance Computing-Aided Science and Engineering

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, Michael T. [Illinois Rocstar LLC, Champaign, IL (United States); Safdari, Masoud [Illinois Rocstar LLC, Champaign, IL (United States); Kress, Jessica E. [Illinois Rocstar LLC, Champaign, IL (United States); Anderson, Michael J. [Illinois Rocstar LLC, Champaign, IL (United States); Horvath, Samantha [Illinois Rocstar LLC, Champaign, IL (United States); Brandyberry, Mark D. [Illinois Rocstar LLC, Champaign, IL (United States); Kim, Woohyun [Illinois Rocstar LLC, Champaign, IL (United States); Sarwal, Neil [Illinois Rocstar LLC, Champaign, IL (United States); Weisberg, Brian [Illinois Rocstar LLC, Champaign, IL (United States)

    2016-10-15

    The project described in this report constructed and exercised an innovative multiphysics coupling toolkit called the Illinois Rocstar MultiPhysics Application Coupling Toolkit (IMPACT). IMPACT is an open source, flexible, natively parallel infrastructure for coupling multiple uniphysics simulation codes into multiphysics computational systems. IMPACT works with codes written in several high-performance-computing (HPC) programming languages, and is designed from the beginning for HPC multiphysics code development. It is designed to be minimally invasive to the individual physics codes being integrated, and places few requirements on those physics codes for integration. The goal of IMPACT is to provide the support needed to enable coupling existing tools together in unique and innovative ways to produce powerful new multiphysics technologies without extensive modification and rewriting of the physics packages being integrated. There are three major outcomes from this project: 1) construction, testing, application, and open-source release of the IMPACT infrastructure, 2) production of example open-source multiphysics tools using IMPACT, and 3) identification and engagement of interested organizations in the tools and applications resulting from the project. This last outcome represents the incipient development of a user community and application ecosystem being built using IMPACT. Multiphysics coupling standardization can only come from organizations working together to define needs and processes that span the space of necessary multiphysics outcomes, which Illinois Rocstar plans to continue driving toward. The IMPACT system, including source code, documentation, and test problems, is now available through the public GitHub.org system to anyone interested in multiphysics code coupling. Many of the basic documents explaining the use and architecture of IMPACT are also attached as appendices to this document. Online HTML documentation is available through the GitHub site

  1. The dynamics of software development project management: An integrative systems dynamic perspective

    Science.gov (United States)

    Vandervelde, W. E.; Abdel-Hamid, T.

    1984-01-01

    Rather than continuing to focus on software development projects per se, the system dynamics modeling approach outlined is extended to investigate a broader set of issues pertaining to the software development organization. Rather than trace the life cycle(s) of one or more software projects, the focus is on the operations of a software development department as a continuous stream of software products is developed, placed into operation, and maintained. A number of research questions are "ripe" for investigation, including: (1) the efficacy of different organizational structures in different software development environments, (2) personnel turnover, (3) the impact of management approaches such as management by objectives, and (4) the organizational/environmental determinants of productivity.

  2. A Fuzzy Approach for Integrated Measure of Object-Oriented Software Testability

    OpenAIRE

    Vandana Gupta; K. K. Aggarwal; Yogesh Singh

    2005-01-01

    For large software systems, the testing phase seems to have a profound effect on the overall acceptability and quality of the final product. The success of this activity can be judged by measuring the testability of the software. A good measure of testability allows the testing effort and time to be better managed. Different object-oriented metrics are used in the measurement of object-oriented testability, but none of them alone is sufficient to give an overall reflection of software testabi...

  3. Behavior Tracking Software Enhancement and Integration of a Feedback Module, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Horizon Performance designed a Behavior Tracking Software System to collect crew member behavior throughout a mission, giving NASA the capability to monitor...

  4. NEMO. A novel techno-economic tool suite for simulating and optimizing solutions for grid integration of electric vehicles and charging stations

    Energy Technology Data Exchange (ETDEWEB)

    Erge, Thomas; Stillahn, Thies; Dallmer-Zerbe, Kilian; Wille-Haussmann, Bernhard [Frauenhofer Institut for Solar Energy Systems ISE, Freiburg (Germany)

    2013-07-01

    With the increasing use of electric vehicles (EV), grid operators need to predict energy flows depending on electromobility use profiles in order to adjust grid infrastructure and operation control accordingly. Tools and methodologies are required to characterize grid problems resulting from the interconnection of EV with the grid. The simulation and optimization tool suite NEMO (Novel E-MObility grid model) was developed within a European research project and is currently being tested using realistic showcases. It is a combination of three professional tools. One of the tools aims at a combined techno-economic design and operation, primarily modeling plants on contracts or the spot market, at the same time participating in balancing markets. The second tool is designed for planning grid extension or reinforcement, while the third tool is mainly used to quickly discover potential conflicts of grid operation approaches through load flow analysis. The tool suite is used to investigate real showcases in Denmark, Germany and the Netherlands. First studies show that significant alleviation of stress on distribution grid lines could be achieved by a few but intelligent restrictions on EV charging procedures.

  5. NEMO. A novel techno-economic tool suite for simulating and optimizing solutions for grid integration of electric vehicles and charging stations

    International Nuclear Information System (INIS)

    Erge, Thomas; Stillahn, Thies; Dallmer-Zerbe, Kilian; Wille-Haussmann, Bernhard

    2013-01-01

    With the increasing use of electric vehicles (EV), grid operators need to predict energy flows depending on electromobility use profiles in order to adjust grid infrastructure and operation control accordingly. Tools and methodologies are required to characterize grid problems resulting from the interconnection of EV with the grid. The simulation and optimization tool suite NEMO (Novel E-MObility grid model) was developed within a European research project and is currently being tested using realistic showcases. It is a combination of three professional tools. One of the tools aims at a combined techno-economic design and operation, primarily modeling plants on contracts or the spot market, at the same time participating in balancing markets. The second tool is designed for planning grid extension or reinforcement, while the third tool is mainly used to quickly discover potential conflicts of grid operation approaches through load flow analysis. The tool suite is used to investigate real showcases in Denmark, Germany and the Netherlands. First studies show that significant alleviation of stress on distribution grid lines could be achieved by a few but intelligent restrictions on EV charging procedures.

  6. Software component quality evaluation

    Science.gov (United States)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  7. Validation of an integrated software for the detection of rapid eye movement sleep behavior disorder.

    Science.gov (United States)

    Frauscher, Birgit; Gabelia, David; Biermayr, Marlene; Stefani, Ambra; Hackner, Heinz; Mitterling, Thomas; Poewe, Werner; Högl, Birgit

    2014-10-01

    Rapid eye movement sleep without atonia (RWA) is the polysomnographic hallmark of REM sleep behavior disorder (RBD). To partially overcome the disadvantages of manual RWA scoring, which is time-consuming but essential for the accurate diagnosis of RBD, we aimed to validate software specifically developed and integrated with polysomnography for RWA detection against the gold standard of manual RWA quantification. Setting: academic referral center sleep laboratory. Patients: polysomnographic recordings of 20 patients with RBD and 60 healthy volunteers were analyzed. Interventions: N/A. Motor activity during REM sleep was quantified manually and computer assisted (with and without artifact detection) according to Sleep Innsbruck Barcelona (SINBAR) criteria for the mentalis ("any," phasic, tonic electromyographic [EMG] activity) and the flexor digitorum superficialis (FDS) muscle (phasic EMG activity). Computer-derived indices (with and without artifact correction) for "any," phasic, tonic mentalis EMG activity, phasic FDS EMG activity, and the SINBAR index ("any" mentalis + phasic FDS) correlated well with the manually derived indices (all Spearman rhos 0.66-0.98). In contrast with computerized scoring alone, computerized scoring plus manual artifact correction (median duration 5.4 min) led to a significant reduction of false positives for "any" mentalis (40%), phasic mentalis (40.6%), and the SINBAR index (41.2%). Quantification of tonic mentalis and phasic FDS EMG activity was not influenced by artifact correction. The computer algorithm used here appears to be a promising tool for REM sleep behavior disorder detection in both research and clinical routine. A short check for plausibility of automatic detection should be a basic prerequisite for this and all other available computer algorithms. © 2014 Associated Professional Sleep Societies, LLC.
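
    The validation step amounts to rank-correlating computer-derived indices against manual scoring; a sketch with synthetic numbers (not study data):

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    # Synthetic illustration: manually scored vs. computer-derived
    # "any mentalis" EMG activity indices, one value per recording.
    manual    = np.array([12.1, 3.4, 25.0, 8.2, 17.6, 4.9])
    automatic = np.array([13.0, 2.9, 23.4, 9.1, 16.8, 5.3])

    rho, p_value = spearmanr(manual, automatic)
    print(f"Spearman rho = {rho:.2f}")  # the study reports rhos of 0.66-0.98
    ```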

  8. Digital Aquifer - Integrating modeling, technical, software and policy aspects to develop a groundwater management tool

    Science.gov (United States)

    Tirupathi, S.; McKenna, S. A.; Fleming, K.; Wambua, M.; Waweru, P.; Ondula, E.

    2016-12-01

    Groundwater management has traditionally been viewed as a matter of long-term policy measures to ensure that the water resource is sustainable. IBM Research, in association with the World Bank, extended this traditional analysis to include real-time groundwater management by building a context-aware water rights management and permitting system. As part of this effort, one of the primary objectives was to develop a groundwater flow model that can give the policy makers a visual overview of the current groundwater distribution. In addition, the system helps the policy makers simulate a range of scenarios and check the sustainability of the groundwater resource in a given region. The system also enables a license provider to check the effect of the introduction of a new well on the existing wells in the domain as well as on the groundwater resource in general. This process simplifies how an engineer will determine if a new well should be approved. Distance to the nearest well neighbors and the maximum decreases in water levels of nearby wells are continually assessed and presented as evidence for an engineer to make the final judgment on approving the permit. The system also provides updated insights on the amount of groundwater left in an area and advice on how water fees should be structured to balance conservation and economic development goals. In this talk, we will discuss the concept of Digital Aquifer, and the challenges in integrating modeling, technical and software aspects to develop a management system that helps policy makers and license providers with a robust decision-making tool. We will concentrate on the groundwater model developed using the analytic element method, which plays a very important role in the decision-making aspects. Finally, the efficiency of this system and methodology is shown through a case study in Laguna Province, Philippines, which was done in collaboration with the National Water Resource Board, Philippines and World
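
    The nearest-neighbour screening described above might look as follows (a toy sketch; the distance and drawdown thresholds are invented, not the system's policy values):

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def screen_new_well(new_xy, existing_xy, predicted_drawdown_m,
                        min_dist_m=500.0, max_drawdown_m=2.0):
        """Flag a permit application for engineer review when the proposed
        well is too close to an existing well or the predicted water-level
        decrease at that nearest well is too large. `predicted_drawdown_m`
        is indexed in the same order as `existing_xy`."""
        tree = cKDTree(np.asarray(existing_xy, dtype=float))
        dist, idx = tree.query(np.asarray(new_xy, dtype=float))
        approvable = (dist >= min_dist_m
                      and predicted_drawdown_m[idx] <= max_drawdown_m)
        return approvable, dist, idx
    ```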

  9. Integrating Free Computer Software in Chemistry and Biochemistry Instruction: An International Collaboration

    Science.gov (United States)

    Cedeno, David L.; Jones, Marjorie A.; Friesen, Jon A.; Wirtz, Mark W.; Rios, Luz Amalia; Ocampo, Gonzalo Taborda

    2010-01-01

    At the Universidad de Caldas, Manizales, Colombia, we used their new computer facilities to introduce chemistry graduate students to biochemical database mining and quantum chemistry calculations using freeware. These hands-on workshops allowed the students a strong introduction to easily accessible software and how to use this software to begin…

  10. Methodology and software to detect viral integration site hot-spots

    Science.gov (United States)

    2011-01-01

    Background: Modern gene therapy methods have limited control over where a therapeutic viral vector inserts into the host genome. Vector integration can activate local gene expression, which can cause cancer if the vector inserts near an oncogene. Viral integration hot-spots or 'common insertion sites' (CIS) are scrutinized to evaluate and predict patient safety. CIS are typically defined by a minimum density of insertions (such as 2-4 within a 30-100 kb region), which unfortunately depends on the total number of observed VIS. This is problematic for comparing hot-spot distributions across data sets and patients, where the VIS numbers may vary. Results: We develop two new methods for defining hot-spots that are relatively independent of data set size. Both methods operate on distributions of VIS across consecutive 1 Mb 'bins' of the genome. The first method, 'z-threshold', tallies the number of VIS per bin, converts these counts to z-scores, and applies a threshold to define high density bins. The second method, 'BCP', applies a Bayesian change-point model to the z-scores to define hot-spots. The novel hot-spot methods are compared with a conventional CIS method using simulated data sets and data sets from five published human studies, including the X-linked ALD (adrenoleukodystrophy), CGD (chronic granulomatous disease) and SCID-X1 (X-linked severe combined immunodeficiency) trials. The BCP analysis of the human X-linked ALD data for two patients separately (774 and 1627 VIS) and combined (2401 VIS) resulted in 5-6 hot-spots covering 0.17-0.251% of the genome and containing 5.56-7.74% of the total VIS. In comparison, the CIS analysis resulted in 12-110 hot-spots covering 0.018-0.246% of the genome and containing 5.81-22.7% of the VIS, corresponding to a greater number of hot-spots as the data set size increased. Our hot-spot methods enable one to evaluate the extent of VIS clustering, and formally compare data sets in terms of hot-spot overlap. Finally, we show that the
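
    The z-threshold method is straightforward to sketch (the z-score cutoff below is an assumed value; the paper defines its own threshold):

    ```python
    import numpy as np

    def zscore_hotspots(vis_positions, genome_length, bin_size=1_000_000,
                        z_cut=2.0):
        """The 'z-threshold' method described above: tally viral integration
        sites (VIS) per 1 Mb bin, convert counts to z-scores, and call bins
        above a cutoff as hot-spots."""
        n_bins = int(np.ceil(genome_length / bin_size))
        counts = np.bincount(np.asarray(vis_positions) // bin_size,
                             minlength=n_bins)
        z = (counts - counts.mean()) / counts.std()
        return np.flatnonzero(z > z_cut)  # indices of hot-spot bins
    ```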

  11. Capability Maturity Model Integration (CMMISM), Version 1.1 CMMISM for Systems Engineering, Software Engineering, Integrated Product and Process Development, and Supplier Sourcing (CMMI-SE/SW/IPPD/SS, V1.1). Staged Representation

    National Research Council Canada - National Science Library

    2002-01-01

    .... Concepts covered by this model include systems engineering, software engineering, integrated product and process development, and supplier sourcing as well as traditional CMM concepts such as process...

  12. Automated Structure Solution with the PHENIX Suite

    Energy Technology Data Exchange (ETDEWEB)

    Zwart, Peter H.; Zwart, Peter H.; Afonine, Pavel; Grosse-Kunstleve, Ralf W.; Hung, Li-Wei; Ioerger, Tom R.; McCoy, A.J.; McKee, Eric; Moriarty, Nigel; Read, Randy J.; Sacchettini, James C.; Sauter, Nicholas K.; Storoni, L.C.; Terwilliger, Tomas C.; Adams, Paul D.

    2008-06-09

    Significant time and effort are often required to solve and complete a macromolecular crystal structure. The development of automated computational methods for the analysis, solution and completion of crystallographic structures has the potential to produce minimally biased models in a short time without the need for manual intervention. The PHENIX software suite is a highly automated system for macromolecular structure determination that can rapidly arrive at an initial partial model of a structure without significant human intervention, given moderate resolution and good quality data. This achievement has been made possible by the development of new algorithms for structure determination, maximum-likelihood molecular replacement (PHASER), heavy-atom search (HySS), template and pattern-based automated model-building (RESOLVE, TEXTAL), automated macromolecular refinement (phenix.refine), and iterative model-building, density modification and refinement that can operate at moderate resolution (RESOLVE, AutoBuild). These algorithms are based on a highly integrated and comprehensive set of crystallographic libraries that have been built and made available to the community. The algorithms are tightly linked and made easily accessible to users through the PHENIX Wizards and the PHENIX GUI.

  13. Automated structure solution with the PHENIX suite

    Energy Technology Data Exchange (ETDEWEB)

    Terwilliger, Thomas C [Los Alamos National Laboratory; Zwart, Peter H [LBNL; Afonine, Pavel V [LBNL; Grosse - Kunstleve, Ralf W [LBNL

    2008-01-01

    Significant time and effort are often required to solve and complete a macromolecular crystal structure. The development of automated computational methods for the analysis, solution, and completion of crystallographic structures has the potential to produce minimally biased models in a short time without the need for manual intervention. The PHENIX software suite is a highly automated system for macromolecular structure determination that can rapidly arrive at an initial partial model of a structure without significant human intervention, given moderate resolution and good quality data. This achievement has been made possible by the development of new algorithms for structure determination, maximum-likelihood molecular replacement (PHASER), heavy-atom search (HySS), template- and pattern-based automated model-building (RESOLVE, TEXTAL), automated macromolecular refinement (phenix.refine), and iterative model-building, density modification and refinement that can operate at moderate resolution (RESOLVE, AutoBuild). These algorithms are based on a highly integrated and comprehensive set of crystallographic libraries that have been built and made available to the community. The algorithms are tightly linked and made easily accessible to users through the PHENIX Wizards and the PHENIX GUI.

  14. Servicing HEP experiments with a complete set of ready integrated and configured common software components

    International Nuclear Information System (INIS)

    Roiser, Stefan; Gaspar, Ana; Perrin, Yves; Kruzelecki, Karol

    2010-01-01

    The LCG Applications Area at CERN provides basic software components for the LHC experiments such as ROOT, POOL, COOL which are developed in house and also a set of 'external' software packages (70) which are needed in addition such as Python, Boost, Qt, CLHEP, etc. These packages target many different areas of HEP computing such as data persistency, math, simulation, grid computing, databases, graphics, etc. Other packages provide tools for documentation, debugging, scripting languages and compilers. All these packages are provided in a consistent manner on different compilers, architectures and operating systems. The Software Process and Infrastructure project (SPI) [1] is responsible for the continuous testing, coordination, release and deployment of these software packages. The main driving force for the actions carried out by SPI are the needs of the LHC experiments, but other HEP experiments could also profit from the set of consistent libraries provided and receive a stable and well-tested foundation to build their experiment software frameworks. This presentation will first provide a brief description of the tools and services provided for the coordination, testing, release, deployment and presentation of LCG/AA software packages and then focus on a second set of tools provided for outside LHC experiments to deploy a stable set of HEP-related software packages both as binary distribution or from source.

  15. Servicing HEP experiments with a complete set of ready integrated and configured common software components

    Energy Technology Data Exchange (ETDEWEB)

    Roiser, Stefan; Gaspar, Ana; Perrin, Yves [CERN, CH-1211 Geneva 23, PH Department, SFT Group (Switzerland); Kruzelecki, Karol, E-mail: stefan.roiser@cern.c, E-mail: ana.gaspar@cern.c, E-mail: yves.perrin@cern.c, E-mail: karol.kruzelecki@cern.c [CERN, CH-1211 Geneva 23, PH Department, LBC Group (Switzerland)

    2010-04-01

    The LCG Applications Area at CERN provides basic software components for the LHC experiments such as ROOT, POOL, COOL which are developed in house and also a set of 'external' software packages (70) which are needed in addition such as Python, Boost, Qt, CLHEP, etc. These packages target many different areas of HEP computing such as data persistency, math, simulation, grid computing, databases, graphics, etc. Other packages provide tools for documentation, debugging, scripting languages and compilers. All these packages are provided in a consistent manner on different compilers, architectures and operating systems. The Software Process and Infrastructure project (SPI) [1] is responsible for the continuous testing, coordination, release and deployment of these software packages. The main driving force for the actions carried out by SPI are the needs of the LHC experiments, but other HEP experiments could also profit from the set of consistent libraries provided and receive a stable and well-tested foundation to build their experiment software frameworks. This presentation will first provide a brief description of the tools and services provided for the coordination, testing, release, deployment and presentation of LCG/AA software packages and then focus on a second set of tools provided for outside LHC experiments to deploy a stable set of HEP-related software packages both as binary distribution or from source.

  16. Integrated conception of hardware/software mixed systems used in nuclear instrumentation

    International Nuclear Information System (INIS)

    Dias, Ailton F.; Sorel, Yves; Akil, Mohamed

    2002-01-01

    Hardware/software codesign addresses the design of systems composed of a hardware portion, with specific components, and a software portion, with a microprocessor-based architecture. This paper describes the Algorithm Architecture Adequation (AAA) design methodology - originally oriented to programmable multicomponent architectures - its extension to reconfigurable circuits, and its application to the design and development of nuclear instrumentation systems composed of programmable and configurable circuits. The AAA methodology uses a unified model, based on graph theory, to describe the algorithm, the architecture and the implementation. The great advantage of the AAA methodology is the use of the same model from specification to implementation of hardware/software systems, reducing design complexity and time. (author)

  17. ElectroMagnetoEncephalography software: overview and integration with other EEG/MEG toolboxes.

    Science.gov (United States)

    Peyk, Peter; De Cesarei, Andrea; Junghöfer, Markus

    2011-01-01

    EMEGS (electromagnetic encephalography software) is a MATLAB toolbox designed to provide novice as well as expert users in the field of neuroscience with a variety of functions to perform analysis of EEG and MEG data. The software consists of a set of graphical interfaces devoted to preprocessing, analysis, and visualization of electromagnetic data. Moreover, it can be extended using a plug-in interface. Here, an overview of the capabilities of the toolbox is provided, together with a simple tutorial for both a standard ERP analysis and a time-frequency analysis. Latest features and future directions of the software development are presented in the final section.

  18. Software-defined networking control plane for seamless integration of multiple silicon photonic switches in Datacom networks.

    Science.gov (United States)

    Shen, Yiwen; Hattink, Maarten H N; Samadi, Payman; Cheng, Qixiang; Hu, Ziyiz; Gazman, Alexander; Bergman, Keren

    2018-04-16

    Silicon photonics based switches offer an effective option for the delivery of dynamic bandwidth for future large-scale Datacom systems while maintaining scalable energy efficiency. The integration of a silicon photonics-based optical switching fabric within electronic Datacom architectures requires novel network topologies and arbitration strategies to effectively manage the active elements in the network. We present a scalable software-defined networking control plane to integrate silicon photonic based switches with conventional Ethernet or InfiniBand networks. Our software-defined control plane manages both electronic packet switches and multiple silicon photonic switches for simultaneous packet and circuit switching. We built an experimental Dragonfly network testbed with 16 electronic packet switches and 2 silicon photonic switches to evaluate our control plane. Observed latencies occupied by each step of the switching procedure demonstrate a total of 344 µs control plane latency for data-center and high performance computing platforms.

  19. ACHIEVING HIGH INTEGRITY OF PROCESS-CONTROL SOFTWARE BY GRAPHICAL DESIGN AND FORMAL VERIFICATION

    NARCIS (Netherlands)

    HALANG, WA; Kramer, B.J.

    The International Electrotechnical Commission is currently standardising four compatible languages for designing and implementing programmable logic controllers (PLCs). The language family includes a diagrammatic notation that supports the idea of software ICs to encourage graphical design

  20. Z-Plant material information tracking system (ZMITS) software development and integration project management plan

    International Nuclear Information System (INIS)

    IBSEN, T.G.

    1999-01-01

    This document plans for software and interface development governing the implementation of ZMITS and other supporting systems necessary to manage information for material stabilization needs of the Project Hanford Management Contract (PHMC)

  1. Increasing Open Source Software Integration on the Department of Defense Unclassified Desktop

    National Research Council Canada - National Science Library

    Schearer, Steven A

    2008-01-01

    .... While some of this expenditure goes to fund special-purpose military software, much of it is absorbed by license fees for computer operating systems and general-purpose office automation applications...

  2. Integrated Syntactic/Semantic XML Data Validation with a Reusable Software Component

    Science.gov (United States)

    Golikov, Steven

    2013-01-01

    Data integration is a critical component of enterprise system integration, and XML data validation is the foundation for sound data integration of XML-based information systems. Since B2B e-commerce relies on data validation as one of the critical components for enterprise integration, it is imperative for financial industries and e-commerce…

  3. Integration of biological responses from a suite of bioassays for the Venice Lagoon (Italy) through sediment toxicity index - Part A: Development and comparison of two methodological approaches

    International Nuclear Information System (INIS)

    Losso, Chiara; Novelli, Alessandra Arizzi; De Salvador, Davide; Ghetti, Pier Francesco; Ghirardini, Annamaria Volpi

    2010-01-01

    Marine and coastal quality assessment, based on test batteries involving a wide array of endpoints, organisms and test matrices, calls for toxicity indices that integrate multiple toxicological measures for decision-making processes and that classify the continuous toxicity response into discrete categories according to the European Water Framework Directive. Two toxicity indices were developed for lagoon environments such as the Venice Lagoon. The stepwise procedure included: the construction of a database that identified test-matrix pairs (indicators); the selection of a minimum number of ecotoxicological indicators, called toxicological core metrics (CMs-tox), on the basis of specific criteria; the development of toxicity scores for each CM-tox; and the integration of the CMs-tox into two indices, the Toxicity Effect Index (TEI), based on the transformation of Toxic Unit (TU) data that were integrated as a logarithmic sum, and the Weighted Average Toxicity Index (WATI), starting from toxicity classes integrated as a weighted mean. Results from the indices are compared; advantages and drawbacks of both approaches are discussed. - Two toxicity indices were set up and compared to integrate toxicity data from a battery of bioassays for the Venice lagoon.
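
    Schematically, the two integration rules can be sketched as follows (the exact TU transformation and weights are defined in the paper; the forms below are assumptions):

    ```python
    import numpy as np

    def tei(toxic_units):
        """Toxicity Effect Index: transformed Toxic Unit (TU) data combined
        as a logarithmic sum (schematic transformation only)."""
        tu = np.asarray(toxic_units, dtype=float)
        return np.sum(np.log10(1.0 + tu))

    def wati(toxicity_classes, weights):
        """Weighted Average Toxicity Index: discrete toxicity classes for
        each core metric combined as a weighted mean."""
        c = np.asarray(toxicity_classes, dtype=float)
        w = np.asarray(weights, dtype=float)
        return np.sum(c * w) / np.sum(w)
    ```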

  4. Integration of biological responses from a suite of bioassays for the Venice Lagoon (Italy) through sediment toxicity index - Part A: Development and comparison of two methodological approaches

    Energy Technology Data Exchange (ETDEWEB)

    Losso, Chiara, E-mail: closso@unive.i [Environmental Sciences Department, University Ca' Foscari of Venice, Campo della Celestia 2737/b, I-30122 Venice (Italy); Novelli, Alessandra Arizzi [Environmental Sciences Department, University Ca' Foscari of Venice, Campo della Celestia 2737/b, I-30122 Venice (Italy); De Salvador, Davide [Physics Department, Padova University, via Marzolo 8, 35131 Padova (Italy); Ghetti, Pier Francesco; Ghirardini, Annamaria Volpi [Environmental Sciences Department, University Ca' Foscari of Venice, Campo della Celestia 2737/b, I-30122 Venice (Italy)

    2010-12-15

    Marine and coastal quality assessment, based on test batteries involving a wide array of endpoints, organisms and test matrices, calls for toxicity indices that integrate multiple toxicological measures for decision-making processes and that classify the continuous toxicity response into discrete categories according to the European Water Framework Directive. Two toxicity indices were developed for lagoon environments such as the Venice Lagoon. The stepwise procedure included: the construction of a database that identified test-matrix pairs (indicators); the selection of a minimum number of ecotoxicological indicators, called toxicological core metrics (CMs-tox), on the basis of specific criteria; the development of toxicity scores for each CM-tox; and the integration of the CMs-tox into two indices, the Toxicity Effect Index (TEI), based on the transformation of Toxic Unit (TU) data that were integrated as a logarithmic sum, and the Weighted Average Toxicity Index (WATI), starting from toxicity classes integrated as a weighted mean. Results from the indices are compared; advantages and drawbacks of both approaches are discussed. - Two toxicity indices were set up and compared to integrate toxicity data from a battery of bioassays for the Venice lagoon.

  5. Integration of biological responses from a suite of bioassays for the Venice Lagoon (Italy) through sediment toxicity index - part A: development and comparison of two methodological approaches.

    Science.gov (United States)

    Losso, Chiara; Novelli, Alessandra Arizzi; De Salvador, Davide; Ghetti, Pier Francesco; Ghirardini, Annamaria Volpi

    2010-12-01

    Marine and coastal quality assessment, based on test batteries involving a wide array of endpoints, organisms and test matrices, calls for toxicity indices that integrate multiple toxicological measures for decision-making processes and that classify the continuous toxicity response into discrete categories according to the European Water Framework Directive. Two toxicity indices were developed for lagoon environments such as the Venice Lagoon. The stepwise procedure included: the construction of a database that identified test-matrix pairs (indicators); the selection of a minimum number of ecotoxicological indicators, called toxicological core metrics (CMs-tox), on the basis of specific criteria; the development of toxicity scores for each CM-tox; and the integration of the CMs-tox into two indices, the Toxicity Effect Index (TEI), based on the transformation of Toxic Unit (TU) data that were integrated as a logarithmic sum, and the Weighted Average Toxicity Index (WATI), starting from toxicity classes integrated as a weighted mean. Results from the indices are compared; advantages and drawbacks of both approaches are discussed. Copyright © 2010. Published by Elsevier Ltd.

  6. Learning DHTMLX suite UI

    CERN Document Server

    Geske, Eli

    2013-01-01

    A fast-paced, example-based guide to learning DHTMLX. "Learning DHTMLX Suite UI" is for web designers who have a basic knowledge of JavaScript and who are looking for powerful tools that will give them an extra edge in their own application development. This book is also useful for experienced developers who wish to get started with DHTMLX without going through the trouble of learning its quirks through trial and error. Readers are expected to have some knowledge of JavaScript, HTML, the Document Object Model, and the ability to install a local web server.

  7. Integrated smart bearings for next generation aero-engines Part 1: Development of a sensor suite for automatic bearing health monitoring

    OpenAIRE

    Bashir, Imran; Wang, Ling; Harvey, Terence; Zaghari, Bahareh; Weddell, Alexander; White, Neil

    2017-01-01

    The development of smart bearing solutions will contribute to increased aircraft engine reliability, allowing the early detection of bearing failure through robust health monitoring. This project aims to develop intelligent bearing systems for an Ultra High Propulsion Efficiency (UHPE) ground test demonstrator, where a fully integrated self-powered wireless sensing system will be developed for future aircraft. This paper provides a comprehensive review of the state-of-the-art smart bearing te...

  8. Dtest Testing Software

    Science.gov (United States)

    Jain, Abhinandan; Cameron, Jonathan M.; Myint, Steven

    2013-01-01

    This software runs a suite of arbitrary software tests spanning various software languages and types of tests (unit level, system level, or file comparison tests). The dtest utility can be set to automate periodic testing of large suites of software, as well as running individual tests. It supports distributing multiple tests over multiple CPU cores, if available. The dtest tool is a utility program (written in Python) that scans through a directory (and its subdirectories) and finds all directories that match a certain pattern and then executes any tests in that directory as described in simple configuration files.
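
    In the same spirit, a minimal directory-scanning test runner might look like this (the config-file name and runner command are assumptions for illustration, not dtest's actual conventions):

    ```python
    import os
    import subprocess

    def run_dtests(root, pattern="test"):
        """Walk a directory tree and execute the tests found in directories
        whose name contains `pattern` and which carry a simple configuration
        file, in the spirit of the dtest utility described above."""
        results = {}
        for dirpath, dirnames, filenames in os.walk(root):
            if pattern in os.path.basename(dirpath) and "dtest.cfg" in filenames:
                # Each matching directory is assumed to describe its tests in
                # a config file; here we simply invoke a runner script.
                proc = subprocess.run(["python", "run_test.py"], cwd=dirpath)
                results[dirpath] = (proc.returncode == 0)
        return results
    ```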

  9. Hardware Interface Description for the Integrated Power, Avionics, and Software (iPAS) Space Telecommunications Radio System (STRS) Radio

    Science.gov (United States)

    Shalkhauser, Mary Jo W.; Roche, Rigoberto

    2017-01-01

    The Space Telecommunications Radio System (STRS) provides a common, consistent framework for software defined radios (SDRs) to abstract the application software from the radio platform hardware. The STRS standard aims to reduce the cost and risk of using complex, configurable and reprogrammable radio systems across NASA missions. To promote the use of the STRS architecture for future NASA advanced exploration missions, NASA Glenn Research Center (GRC) developed an STRS-compliant SDR on a radio platform used by the Advance Exploration System program at the Johnson Space Center (JSC) in their Integrated Power, Avionics, and Software (iPAS) laboratory. The iPAS STRS Radio was implemented on the Reconfigurable, Intelligently-Adaptive Communication System (RIACS) platform, currently being used for radio development at JSC. The platform consists of a Xilinx ML605 Virtex-6 FPGA board, an Analog Devices FMCOMMS1-EBZ RF transceiver board, and an Embedded PC (Axiomtek eBox 620-110-FL) running the Ubuntu 12.04 operating system. Figure 1 shows the RIACS platform hardware. The result of this development is a very low-cost, STRS-compliant platform that can be used for waveform developments for multiple applications. The purpose of this document is to describe how to develop a new waveform using the RIACS platform and the Very High Speed Integrated Circuits (VHSIC) Hardware Description Language (VHDL) FPGA wrapper code and the STRS implementation on the Axiomtek processor.

  10. Waveform Developer's Guide for the Integrated Power, Avionics, and Software (iPAS) Space Telecommunications Radio System (STRS) Radio

    Science.gov (United States)

    Shalkhauser, Mary Jo W.; Roche, Rigoberto

    2017-01-01

    The Space Telecommunications Radio System (STRS) provides a common, consistent framework for software defined radios (SDRs) to abstract the application software from the radio platform hardware. The STRS standard aims to reduce the cost and risk of using complex, configurable and reprogrammable radio systems across NASA missions. To promote the use of the STRS architecture for future NASA advanced exploration missions, NASA Glenn Research Center (GRC) developed an STRS-compliant SDR on a radio platform used by the Advanced Exploration Systems program at the Johnson Space Center (JSC) in their Integrated Power, Avionics, and Software (iPAS) laboratory. The iPAS STRS Radio was implemented on the Reconfigurable, Intelligently-Adaptive Communication System (RIACS) platform, currently being used for radio development at JSC. The platform consists of a Xilinx ML605 Virtex-6 FPGA board, an Analog Devices FMCOMMS1-EBZ RF transceiver board, and an Embedded PC (Axiomtek eBox 620-110-FL) running the Ubuntu 12.04 operating system. The result of this development is a very low cost STRS-compliant platform that can be used for waveform developments for multiple applications. The purpose of this document is to describe how to develop a new waveform using the RIACS platform and the Very High Speed Integrated Circuits (VHSIC) Hardware Description Language (VHDL) FPGA wrapper code and the STRS implementation on the Axiomtek processor.

  11. Scientific Computation Application Partnerships in Materials and Chemical Sciences, Charge Transfer and Charge Transport in Photoactivated Systems, Developing Electron-Correlated Methods for Excited State Structure and Dynamics in the NWChem Software Suite

    Energy Technology Data Exchange (ETDEWEB)

    Cramer, Christopher J. [Univ. of Minnesota, Minneapolis, MN (United States)]

    2017-11-12

    Charge transfer and charge transport in photoactivated systems are fundamental processes that underlie solar energy capture, solar energy conversion, and photoactivated catalysis, both organometallic and enzymatic. We developed methods, algorithms, and software tools needed for reliable treatment of the underlying physics for charge transfer and charge transport, an undertaking with broad applicability to the goals of the fundamental-interaction component of the Department of Energy Office of Basic Energy Sciences and the exascale initiative of the Office of Advanced Scientific Computing Research.

  12. Integrating semantic web and software agents: Exchanging RIF and BDI rules

    NARCIS (Netherlands)

    Gong, Y.; Overbeek, S.J.

    2011-01-01

    Software agents and rules are both used for creating flexibility. Exchanging rules between Semantic Web and agents can ensure consistency in rules and support easy updating and changing of rules. The Rule Interchange Format (RIF) is a new W3C recommendation Semantic Web standard for exchanging rules

  13. An integrated approach for requirement selection and scheduling in software release planning

    NARCIS (Netherlands)

    Li, C.; van den Akker, Marjan; Brinkkemper, Sjaak; Diepen, Guido

    2010-01-01

    It is essential for product software companies to decide which requirements should be included in the next release and to make an appropriate time plan of the development project. Compared to the extensive research done on requirement selection, very little research has been performed on time

  14. High Technology Systems with Low Technology Failures: Some Experiences with Rockets on Software Quality and Integration

    Science.gov (United States)

    Craig, Larry G.

    2010-01-01

    This slide presentation reviews three failures of software and how the failures contributed to or caused the failure of a launch or payload insertion into orbit. In order to avoid these systematic failures in the future, failure mitigation strategies are suggested for use.

  15. Architecture-Driven Integration of Modeling Languages for the Design of Software-Intensive Systems

    NARCIS (Netherlands)

    Dos Santos Soares, M.

    2010-01-01

    In the research that led to this thesis a multi-disciplinary approach, combining Traffic Engineering and Software Engineering, was used. Traffic engineers come up with new control strategies and algorithms for improving traffic. Once new solutions are defined from a Traffic Engineering point of

  16. Integrated Biological Warfare Technology Platform (IBWTP). Intelligent Software Supporting Situation Awareness, Response, and Operations

    Science.gov (United States)

    2007-01-01

    phases of the technology, QLI used a common software development and maintenance environment, called the Quantum Leap Uber Build System (QLUBS). QLI ... May be used in internal tools and applications. Using internally developed code on internal applications ("eating your own dogfood") provides

  17. Integrated Development and Maintenance of Software Products to Support Efficient Updating of Customer Configurations: A Case Study in Mass Market ERP Software

    NARCIS (Netherlands)

    Jansen, S.R.L.; Brinkkemper, S.; Ballintijn, G.; Nieuwland, Arco van

    2006-01-01

    The maintenance of enterprise application software at a customer site is a potentially complex task for software vendors. This complexity can unfortunately result in a significant amount of work and risk. This paper presents a case study of a product software vendor that tries to reduce this

  18. Towards a validation of a cellular biomarker suite in native and transplanted zebra mussels: A 2-year integrative field study of seasonal and pollution-induced variations

    Energy Technology Data Exchange (ETDEWEB)

    Guerlet, Edwige [Laboratoire Ecotoxicite, Sante Environnementale, CNRS UMR 7146, Universite Paul Verlaine-Metz, Rue General Delestraint, F-57070 Metz (France)]; Ledy, Karine [Laboratoire Ecotoxicite, Sante Environnementale, CNRS UMR 7146, Universite Paul Verlaine-Metz, Rue General Delestraint, F-57070 Metz (France)]; Meyer, Antoinette [Laboratoire Ecotoxicite, Sante Environnementale, CNRS UMR 7146, Universite Paul Verlaine-Metz, Rue General Delestraint, F-57070 Metz (France)]; Giamberini, Laure [Laboratoire Ecotoxicite, Sante Environnementale, CNRS UMR 7146, Universite Paul Verlaine-Metz, Rue General Delestraint, F-57070 Metz (France)]. E-mail: giamb@univ-metz.fr

    2007-03-30

    Two of the questions raised in the validation process of biomarkers are their relevance in the identification and discrimination of environmental perturbations, and the influence of seasonal factors on these biological endpoints. Determining the advantages and restrictions associated with the use of native or transplanted animals and comparing their responses is also needed. To obtain this information, a 2-year integrative field study was conducted in the vicinity of a nuclear power plant in northeastern France. A station was located in the reservoir receiving the cooling waters of the plant, and two other sites were studied 2 km upstream and 5 km downstream from the reservoir's discharge in the Moselle river. Elevated temperatures, copper contamination and a 1.4-fold concentration factor of dissolved salts affected water quality of the reservoir. Native and transplanted zebra mussels (Dreissena polymorpha) were collected monthly and their digestive glands were processed for histochemical determinations of the lysosomal and peroxisomal systems and of the lipofuscin and neutral lipid contents. The responses were quantified using automated image analysis and stereology. Apart from neutral lipid contents, there were no systematic seasonal patterns in mussel populations or from one year to another. Principal Component Analyses showed a generally higher discrimination potential of biological responses in transplanted organisms compared to native ones. They also pointed out the relationships between the cellular and physiological markers and abiotic factors. The present multiple-biomarker integrative approach in transplanted D. polymorpha brings promising elements to their validation process as relevant biomonitoring tools.

  19. Towards a validation of a cellular biomarker suite in native and transplanted zebra mussels: A 2-year integrative field study of seasonal and pollution-induced variations

    International Nuclear Information System (INIS)

    Guerlet, Edwige; Ledy, Karine; Meyer, Antoinette; Giamberini, Laure

    2007-01-01

    Two of the questions raised in the validation process of biomarkers are their relevance in the identification and discrimination of environmental perturbations, and the influence of seasonal factors on these biological endpoints. Determining the advantages and restrictions associated with the use of native or transplanted animals and comparing their responses is also needed. To obtain this information, a 2-year integrative field study was conducted in the vicinity of a nuclear power plant in northeastern France. A station was located in the reservoir receiving the cooling waters of the plant, and two other sites were studied 2 km upstream and 5 km downstream from the reservoir's discharge in the Moselle river. Elevated temperatures, copper contamination and a 1.4-fold concentration factor of dissolved salts affected water quality of the reservoir. Native and transplanted zebra mussels (Dreissena polymorpha) were collected monthly and their digestive glands were processed for histochemical determinations of the lysosomal and peroxisomal systems and of the lipofuscin and neutral lipid contents. The responses were quantified using automated image analysis and stereology. Apart from neutral lipid contents, there were no systematic seasonal patterns in mussel populations or from one year to another. Principal Component Analyses showed a generally higher discrimination potential of biological responses in transplanted organisms compared to native ones. They also pointed out the relationships between the cellular and physiological markers and abiotic factors. The present multiple-biomarker integrative approach in transplanted D. polymorpha brings promising elements to their validation process as relevant biomonitoring tools

  20. A COTS RF/Optical Software Defined Radio for the Integrated Radio and Optical Communications Test Bed

    Science.gov (United States)

    Nappier, Jennifer M.; Zeleznikar, Daniel J.; Wroblewski, Adam C.; Tokars, Roger P.; Schoenholz, Bryan L.; Lantz, Nicholas C.

    2017-01-01

    The Integrated Radio and Optical Communications (iROC) project at the National Aeronautics and Space Administration (NASA) is investigating the merits of a hybrid radio frequency (RF) and optical communication system for deep space missions. In an effort to demonstrate the feasibility and advantages of a hybrid RF/Optical software defined radio (SDR), a laboratory prototype was assembled from primarily commercial-off-the-shelf (COTS) hardware components. This COTS platform has been used to demonstrate simultaneous transmission of the radio and optical communications waveforms through to the physical layer (telescope and antenna). This paper details the hardware and software used in the platform and various measures of its performance. A laboratory optical receiver platform has also been assembled in order to demonstrate hybrid free space links in combination with the transmitter.

  1. Clementine sensor suite

    Energy Technology Data Exchange (ETDEWEB)

    Ledebuhr, A.G. [Lawrence Livermore National Lab., CA (United States)]

    1994-11-15

    LLNL designed and built the suite of six miniaturized light-weight space-qualified sensors utilized in the Clementine mission. A major goal of the Clementine program was to demonstrate technologies originally developed for Ballistic Missile Defense Organization Programs. These sensors were modified to gather data from the moon. This overview presents each of these sensors and some preliminary on-orbit performance estimates. The basic subsystems of these sensors include optical baffles to reject off-axis stray light, light-weight ruggedized optical systems, filter wheel assemblies, radiation tolerant focal plane arrays, radiation hardened control and readout electronics and low mass and power mechanical cryogenic coolers for the infrared sensors. Descriptions of each sensor type are given along with design specifications, photographs and on-orbit data collected.

  2. Functional modelling for integration of human-software-hardware in complex physical systems

    International Nuclear Information System (INIS)

    Modarres, M.

    1996-01-01

    A framework describing the properties of complex physical systems composed of human-software-hardware interactions in terms of their functions is described. It is argued that such a framework is domain-general, so that functional primitives present a language that is more general than most other modeling methods such as mathematical simulation. The characteristics and types of functional models are described. Examples of uses of the framework in modeling physical systems composed of human-software-hardware (hereafter referred to simply as physical systems) are presented. It is concluded that a function-centered model of a physical system provides a capability for generating a high-level simulation of the system for intelligent diagnostic, control or other similar applications

  3. An approach to integrated design based on component software and OLE technology

    DEFF Research Database (Denmark)

    Bagger-Petersen, Susanne C; Emborg, Jørgen; Andersen, Tom

    1996-01-01

    The paper reports on a prototype developed to demonstrate the (dis)abilities of the OLE standard to integrate different design applications with a CAD system.

  4. Virtual reality devices integration in scientific visualization software in the VtkVRPN framework

    International Nuclear Information System (INIS)

    Journe, G.; Guilbaud, C.

    2005-01-01

    High-quality scientific visualization software relies on ergonomic navigation and exploration, which are essential for efficient data analysis. To help solve this issue, management of virtual reality devices has been developed inside the CEA 'VtkVRPN' framework. This framework is based on VTK, a 3D graphical library, and VRPN, a virtual reality devices management library. This document describes the developments done during a post-graduate training course. (authors)

  5. Viewport: An object-oriented approach to integrate workstation software for tile and stack mode display

    OpenAIRE

    Ghosh, Srinka; Andriole, Katherine P.; Avrin, David E.

    1997-01-01

    Diagnostic workstation design has migrated towards display presentation in one of two modes: tiled images or stacked images. It is our impression that the workstation setup or configuration in each of these two modes is rather distinct. We sought to establish a commonality to simplify software design, and to enable a single descriptor method to facilitate folder manager development of “hanging” protocols. All current workstation designs use a combination of “off-screen” and “on-screen” memory...

  6. A Reference Software Architecture to Support Unmanned Aircraft Integration in the National Airspace System

    Science.gov (United States)

    2012-07-01

    ... a reference architecture that provides data and software services to enable a set of Unmanned Aircraft (UA) platforms to operate in a wide range of air domains, which may ... It has been implemented by MIT Lincoln Laboratory in the form of a Sense and Avoid (SAA) testbed that provides some of the core services. This paper describes the general architecture and an SAA testbed implementation.

  7. Integrated navigation and control software system for MRI-guided robotic prostate interventions.

    Science.gov (United States)

    Tokuda, Junichi; Fischer, Gregory S; DiMaio, Simon P; Gobbi, David G; Csoma, Csaba; Mewes, Philip W; Fichtinger, Gabor; Tempany, Clare M; Hata, Nobuhiko

    2010-01-01

    A software system to provide intuitive navigation for MRI-guided robotic transperineal prostate therapy is presented. In the system, the robot control unit, the MRI scanner, and the open-source navigation software are connected together via Ethernet to exchange commands, coordinates, and images using an open network communication protocol, OpenIGTLink. The system has six states called "workphases" that provide the necessary synchronization of all components during each stage of the clinical workflow, and the user interface guides the operator linearly through these workphases. On top of this framework, the software provides the following features for needle guidance: interactive target planning; 3D image visualization with current needle position; treatment monitoring through real-time MR images of needle trajectories in the prostate. These features are supported by calibration of robot and image coordinates by fiducial-based registration. Performance tests show that the registration error of the system was 2.6 mm within the prostate volume. Registered real-time 2D images were displayed 1.97 s after the image location is specified.

  8. Integrated navigation and control software system for MRI-guided robotic prostate interventions

    Science.gov (United States)

    Tokuda, Junichi; Fischer, Gregory S.; DiMaio, Simon P.; Gobbi, David G.; Csoma, Csaba; Mewes, Philip W.; Fichtinger, Gabor; Tempany, Clare M.; Hata, Nobuhiko

    2010-01-01

    A software system to provide intuitive navigation for MRI-guided robotic transperineal prostate therapy is presented. In the system, the robot control unit, the MRI scanner, and the open-source navigation software are connected together via Ethernet to exchange commands, coordinates, and images using an open network communication protocol, OpenIGTLink. The system has six states called “workphases” that provide the necessary synchronization of all components during each stage of the clinical workflow, and the user interface guides the operator linearly through these workphases. On top of this framework, the software provides the following features for needle guidance: interactive target planning; 3D image visualization with current needle position; treatment monitoring through real-time MR images of needle trajectories in the prostate. These features are supported by calibration of robot and image coordinates by fiducial-based registration. Performance tests show that the registration error of the system was 2.6 mm within the prostate volume. Registered real-time 2D images were displayed 1.97 s after the image location is specified. PMID:19699057
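
    The linear progression through workphases described in this abstract can be sketched as a simple state machine. The following Python sketch is illustrative only; the phase names and the on_workphase callback are invented placeholders, not the interfaces of the actual system.

```python
# Minimal sketch of a linear workphase state machine; the phase names and
# the on_workphase callback are invented placeholders for illustration.
WORKPHASES = ["START_UP", "PLANNING", "CALIBRATION",
              "TARGETING", "MANUAL", "EMERGENCY"]

class WorkphaseController:
    def __init__(self, components):
        self.components = components   # e.g. robot controller, scanner, UI
        self.index = 0

    @property
    def phase(self):
        return WORKPHASES[self.index]

    def advance(self):
        """Step linearly to the next workphase and notify every component,
        keeping robot, scanner, and navigation software synchronized."""
        if self.index + 1 >= len(WORKPHASES):
            raise RuntimeError("already in the final workphase")
        self.index += 1
        for component in self.components:
            component.on_workphase(self.phase)

class LoggingComponent:
    def on_workphase(self, phase):
        print("entering workphase:", phase)

controller = WorkphaseController([LoggingComponent()])
controller.advance()   # START_UP -> PLANNING, all components notified
```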

  9. An Optical Receiver Post Processing System for the Integrated Radio and Optical Communications Software Defined Radio Test Bed

    Science.gov (United States)

    Nappier, Jennifer M.; Tokars, Roger P.; Wroblewski, Adam C.

    2016-01-01

    The Integrated Radio and Optical Communications (iROC) project at the National Aeronautics and Space Administration's (NASA) Glenn Research Center is investigating the feasibility of a hybrid radio frequency (RF) and optical communication system for future deep space missions. As a part of this investigation, a test bed for an RF and optical software defined radio (SDR) has been built. Receivers and modems for the NASA deep space optical waveform are not commercially available, so a custom ground optical receiver system has been built. This paper documents the ground optical receiver, which is used in order to test the RF and optical SDR in a free space optical communications link.

  10. An Optical Receiver Post-Processing System for the Integrated Radio and Optical Communications Software Defined Radio Test Bed

    Science.gov (United States)

    Nappier, Jennifer M.; Tokars, Roger P.; Wroblewski, Adam C.

    2016-01-01

    The Integrated Radio and Optical Communications (iROC) project at the National Aeronautics and Space Administration's (NASA) Glenn Research Center is investigating the feasibility of a hybrid radio frequency (RF) and optical communication system for future deep space missions. As a part of this investigation, a test bed for a radio frequency (RF) and optical software defined radio (SDR) has been built. Receivers and modems for the NASA deep space optical waveform are not commercially available so a custom ground optical receiver system has been built. This paper documents the ground optical receiver, which is used in order to test the RF and optical SDR in a free space optical communications link.

  11. Evaluation of System-Integrated Smart Grid Devices using Software- and Hardware-in-the-Loop

    Energy Technology Data Exchange (ETDEWEB)

    Lundstrom, Blake; Chakraborty, Sudipta; Lauss, Georg; Brundlinger, Roland; Conklin, Russell

    2016-12-12

    This paper presents a concise description of state-of-the-art real-time simulation-based testing methods and demonstrates how they can be used independently and/or in combination as an integrated development and validation approach for smart grid DERs and systems. A three-part case study demonstrating the application of this integrated approach at the different stages of development and validation of a system-integrated smart photovoltaic (PV) inverter is also presented. Laboratory testing results and perspectives from two international research laboratories are included in the case study.

  12. Requirement Volatility, Standardization and Knowledge Integration in Software Projects: An Empirical Analysis on Outsourced IS Development Projects

    Directory of Open Access Journals (Sweden)

    Rajesri Govindaraju

    2015-08-01

    Information systems development (ISD) projects are highly complex, with different groups of people having to collaborate and exchange their knowledge. Considering the intensity of knowledge exchange that takes place in outsourced ISD projects, in this study a conceptual model was developed, aiming to examine the influence of four antecedents, i.e. standardization, requirement volatility, internal integration, and external integration, on two dependent variables, i.e. process performance and product performance. Data were collected from 46 software companies in four big cities in Indonesia. The collected data were examined to verify the proposed theoretical model using the partial least squares structural equation modeling (PLS-SEM) technique. The results show that process performance is significantly influenced by internal integration and standardization, while product performance is significantly influenced by external integration and requirement volatility. This study contributes to a better understanding of how knowledge integration can be managed in outsourced ISD projects in view of increasing their success.

  13. Use of free software for learning the definite integral

    OpenAIRE

    Medina, Mabel Azucena; Rubio, Héctor Eduardo

    2013-01-01

    The experience of a teaching unit is developed within the framework of Brousseau's theory and of Teaching for Understanding. The Generative Topic is the definite integral. The Understanding Goals are the definition of the definite integral and the ways of evaluating the definite integral. The Understanding Performances are autonomous activities of evaluating definite integrals. The purpose of this activity is for students to understand that they can approximately calculate an integr...

  14. Sighten Final Technical Report DEEE0006690 Deploying an integrated and comprehensive solar financing software platform

    Energy Technology Data Exchange (ETDEWEB)

    O'Leary, Conlan [Sighten, Inc., San Francisco, CA (United States)]

    2017-10-15

    Over the project, Sighten built a comprehensive software-as-a-service (SaaS) platform to automate and streamline the residential solar financing workflow. Before the project period, significant time and money were spent by companies on front-end tools related to system design and proposal creation, but comparatively few resources were available to support the many back-end calculations and data management processes that underpin third-party financing. Without a tool like Sighten, the solar financing process involved passing information from the homeowner prospect into separate tools for system design and financing, and then later to reporting tools including Microsoft Excel, CRM software, in-house software, outside software, and offline, manual processes. Passing data between tools and attempting to connect disparate systems results in inefficiency and inaccuracy for the industry. Sighten was built to consolidate all financial and solar-related calculations in a single software platform. It significantly improves upon the accuracy of these calculations and exposes sophisticated new analysis tools, resulting in a rigorous, efficient and cost-effective toolset for scaling residential solar. Widely deploying a platform like Sighten’s significantly and immediately impacts the residential solar space in several important ways: 1) standardizing and improving the quality of all quantitative calculations involved in the residential financing process, most notably project finance, system production and reporting calculations; 2) representing a true step change in terms of reporting and analysis capabilities by maintaining more accurate data and exposing sophisticated tools around simulation, tranching, and financial reporting, among others, to all stakeholders in the space; 3) allowing a broader group of developers/installers/finance companies to access the capital markets by providing an out-of-the-box toolset that handles the execution of running investor capital through a

  15. Viewport: an object-oriented approach to integrate workstation software for tile and stack mode display.

    Science.gov (United States)

    Ghosh, S; Andriole, K P; Avrin, D E

    1997-08-01

    Diagnostic workstation design has migrated towards display presentation in one of two modes: tiled images or stacked images. It is our impression that the workstation setup or configuration in each of these two modes is rather distinct. We sought to establish a commonality to simplify software design, and to enable a single descriptor method to facilitate folder manager development of "hanging" protocols. All current workstation designs use a combination of "off-screen" and "on-screen" memory whether or not they use a dedicated display subsystem, or merely a video board. Most diagnostic workstations also have two or more monitors. Our central concept is that of a "logical" viewport that can be smaller than, the same size as, or larger than a single monitor. Each port "views" an image data sequence loaded into offscreen memory. Each viewport can display one or more images in sequence in a one-on-one or traditionally tiled presentation. Viewports can be assigned to the available monitor "real estate" in any manner that fits. For example, a single sequence computed tomography (CT) study could be displayed across all monitors in a tiled appearance by assigning a single large viewport to the monitors. At the other extreme, a multisequence magnetic resonance (MR) study could be compared with a similar previous study by assigning four viewports to each monitor, single image display per viewport, and assigning four of the sequences of the current study to the left monitor viewports, and four of the earlier study to the right monitor viewports. Ergonomic controls activate scrolling through the off-screen image sequence data. Workstation folder manager hanging protocols could then specify viewports, number of images per viewport, and the automatic assignment of appropriately named sequences of current and previous studies to the viewports on a radiologist-specific basis. Furthermore, software development is simplified by common base objects and methods of the tile and stack
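
    The "logical viewport" concept in this abstract lends itself to a compact sketch: a viewport that views an off-screen image sequence and exposes it either stacked (one image at a time) or tiled (a grid). The class and method names below are invented for illustration and are not taken from the authors' implementation.

```python
# Illustrative sketch of the "logical viewport" idea: each viewport views
# an off-screen image sequence and shows a rows x cols tile of it (1 x 1
# gives classic stack mode). Names are invented, not the authors' API.
class Viewport:
    def __init__(self, sequence, rows=1, cols=1):
        self.sequence = sequence            # off-screen image sequence
        self.rows, self.cols = rows, cols
        self.first = 0                      # first on-screen image index

    def visible(self):
        """Slice of the sequence currently mapped to on-screen memory."""
        n = self.rows * self.cols
        return self.sequence[self.first:self.first + n]

    def scroll(self, step=1):
        """Scroll through the off-screen sequence (stack or tile mode)."""
        n = self.rows * self.cols
        self.first = min(max(0, self.first + step),
                         max(0, len(self.sequence) - n))

stack = Viewport(list(range(120)))                  # one CT image at a time
tiles = Viewport(list(range(120)), rows=4, cols=5)  # 20 images tiled
stack.scroll(10)
print(stack.visible(), len(tiles.visible()))        # [10] 20
```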

  16. High integrity software for nuclear power plants: Candidate guidelines, technical basis and research needs. Executive summary: Volume 1

    International Nuclear Information System (INIS)

    Seth, S.; Bail, W.; Cleaves, D.; Cohen, H.; Hybertson, D.; Schaefer, C.; Stark, G.; Ta, A.; Ulery, B.

    1995-06-01

    The work documented in this report was performed in support of the US Nuclear Regulatory Commission to examine the technical basis for candidate guidelines that could be considered in reviewing and evaluating high integrity computer software used in the safety systems of nuclear power plants. The framework for the work consisted of the following software development and assurance activities: requirements specification; design; coding; verification and validation, including static analysis and dynamic testing; safety analysis; operation and maintenance; configuration management; quality assurance; and planning and management. Each activity (framework element) was subdivided into technical areas (framework subelements). The report describes the development of approximately 200 candidate guidelines that span the entire range of software life-cycle activities; the assessment of the technical basis for those candidate guidelines; and the identification, categorization and prioritization of research needs for improving the technical basis. The report has two volumes: Volume 1, Executive Summary, includes an overview of the framework and of each framework element, the complete set of candidate guidelines, the results of the assessment of the technical basis for each candidate guideline, and a discussion of research needs that support the regulatory function; Volume 2 is the main report

  17. Hardware/software co-design and optimization for cyberphysical integration in digital microfluidic biochips

    CERN Document Server

    Luo, Yan; Ho, Tsung-Yi

    2015-01-01

    This book describes a comprehensive framework for hardware/software co-design, optimization, and use of robust, low-cost, and cyberphysical digital microfluidic systems. Readers with a background in electronic design automation will find this book to be a valuable reference for leveraging conventional VLSI CAD techniques for emerging technologies, e.g., biochips or bioMEMS. Readers from the circuit/system design community will benefit from methods presented to extend design and testing techniques from microelectronics to mixed-technology microsystems. For readers from the microfluidics domain,

  18. An Integrated Software Development Framework for PLC and FPGA based Digital I and Cs

    International Nuclear Information System (INIS)

    Yoo, Jun Beom; Kim, Eui Sub; Lee, Dong Ah; Choi, Jong Gyun

    2014-01-01

    NuDE 2.0 (Nuclear Development Environment) is a model-based software development environment for safety-critical digital systems in nuclear power plants. It makes it possible to develop PLC-based systems as well as FPGA-based systems simultaneously from the same requirement or design specifications. The case study showed that the NuDE 2.0 can be adopted as an effective method of bridging the gap between the existing PLC and upcoming FPGA-based developments as well as a means of gaining diversity.

  19. An Integrated Software Development Framework for PLC and FPGA based Digital I and Cs

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Jun Beom; Kim, Eui Sub; Lee, Dong Ah [Konkuk University, Seoul (Korea, Republic of)]; Choi, Jong Gyun [KAERI, Daejeon (Korea, Republic of)]

    2014-08-15

    NuDE 2.0 (Nuclear Development Environment) is a model-based software development environment for safety-critical digital systems in nuclear power plants. It makes it possible to develop PLC-based systems as well as FPGA-based systems simultaneously from the same requirement or design specifications. The case study showed that the NuDE 2.0 can be adopted as an effective method of bridging the gap between the existing PLC and upcoming FPGA-based developments as well as a means of gaining diversity.

  20. The integration of Workload Management Systems for the ProtoDUNE Software and Computing cluster

    CERN Document Server

    Oniciuc, Oriana-Maria

    2017-01-01

    The protoDUNE experimental program is designed to test and validate the technologies for DUNE. All of the many elements in the chain of data acquisition, storage, distribution and processing are critically important to derive physics results from the data. To achieve these, a software stack has been chosen to implement automatic propagation of configurations across all the nodes in the NP cluster. This report presents the architecture of the system and the operations through which the cluster features can be scaled.

  1. Hardware and software architecture for the integration of the new EC waves launcher in FTU control system

    Energy Technology Data Exchange (ETDEWEB)

    Boncagni, L. [Associazione EURATOM-ENEA sulla Fusione – ENEA, Via Enrico Fermi, 45 00045 Frascati (RM) (Italy)]; Centioli, C., E-mail: cristina.centioli@enea.it [Associazione EURATOM-ENEA sulla Fusione – ENEA, Via Enrico Fermi, 45 00045 Frascati (RM) (Italy)]; Galperti, C.; Alessi, E.; Granucci, G. [Associazione EURATOM-ENEA-CNR sulla Fusione – IFP-CNR, Via Roberto Cozzi, 53 20125 Milano (Italy)]; Grosso, L.A. [Associazione EURATOM-ENEA sulla Fusione – ENEA, Via Enrico Fermi, 45 00045 Frascati (RM) (Italy)]; Marchetto, C. [Associazione EURATOM-ENEA-CNR sulla Fusione – IFP-CNR, Via Roberto Cozzi, 53 20125 Milano (Italy)]; Napolitano, M. [Associazione EURATOM-ENEA sulla Fusione – ENEA, Via Enrico Fermi, 45 00045 Frascati (RM) (Italy)]; Nowak, S. [Associazione EURATOM-ENEA-CNR sulla Fusione – IFP-CNR, Via Roberto Cozzi, 53 20125 Milano (Italy)]; Panella, M. [Associazione EURATOM-ENEA sulla Fusione – ENEA, Via Enrico Fermi, 45 00045 Frascati (RM) (Italy)]; Sozzi, C. [Associazione EURATOM-ENEA-CNR sulla Fusione – IFP-CNR, Via Roberto Cozzi, 53 20125 Milano (Italy)]; Tilia, B.; Vitale, V. [Associazione EURATOM-ENEA sulla Fusione – ENEA, Via Enrico Fermi, 45 00045 Frascati (RM) (Italy)]

    2013-10-15

    Highlights: ► The integration of a new ECRH launcher into the FTU legacy control system is reported. ► Fast control has been developed with a three-node RT cluster within the MARTe framework. ► Slow control was implemented with a Simatic S7 PLC and an EPICS IOC-CA application. ► The first results have assessed the feasibility of the launcher control architecture. -- Abstract: The role of high power electron cyclotron (EC) waves in controlling magnetohydrodynamic (MHD) instabilities in tokamaks has been assessed in several experiments, exploiting the physical effects induced by resonant heating and current drive. Recently a new EC launcher, whose main goal is controlling tearing modes and possibly preventing their onset, is being implemented on FTU. So far most of the components of the launcher control strategy have been realized and successfully tested in plasma experiments. Nevertheless, the operation of the new launcher must be completely integrated with the existing one and into the FTU control system. This work deals with this final step, proposing a hardware and software architecture that implements up-to-date technologies to achieve a modular and effective control strategy well integrated into a legacy system. The slow control system of the new EC launcher is based on a Siemens S7 Programmable Logic Controller (PLC), integrated into the FTU control system supervisor through an EPICS input output controller (IOC) and an in-house developed Channel Access client application creating an abstraction layer that decouples the IOC and the PLC from the FTU Supervisor software. This architecture could enable a smooth migration to an EPICS-only supervisory control system. The real-time component of the control system is based on the open-source MARTe framework relying on a Linux real-time cluster, devoted to the detection of MHD instabilities and the calculation of the injection angles and the time reference for the radiofrequency power enable commands for the EC launcher.

  2. The CMSSW benchmarking suite: Using HEP code to measure CPU performance

    International Nuclear Information System (INIS)

    Benelli, G

    2010-01-01

    The demanding computing needs of the CMS experiment require thoughtful planning and management of its computing infrastructure. A key factor in this process is the use of realistic benchmarks when assessing the computing power of the different architectures available. In recent years a discrepancy has been observed between the CPU performance estimates given by the reference benchmark for HEP computing (SPECint) and actual performances of HEP code. Making use of the CPU performance tools from the CMSSW performance suite, comparative CPU performance studies have been carried out on several architectures. A benchmarking suite has been developed and integrated in the CMSSW framework, to allow computing centers and interested third parties to benchmark architectures directly with CMSSW. The CMSSW benchmarking suite can be used out of the box, to test and compare several machines in terms of CPU performance and report with the wanted level of detail the different benchmarking scores (e.g. by processing step) and results. In this talk we describe briefly the CMSSW software performance suite, and in detail the CMSSW benchmarking suite client/server design, the performance data analysis and the available CMSSW benchmark scores. The experience in the use of HEP code for benchmarking will be discussed and CMSSW benchmark results presented.
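
    To make the idea of per-step benchmark scores concrete, here is a toy Python sketch that times named processing steps over many events and reports an events-per-second score for each. It only illustrates the reporting concept; the CMSSW suite's real client/server design and interfaces are not reproduced.

```python
# Toy per-step benchmark: time named processing steps over many events and
# report an events-per-second score for each. The real CMSSW suite's
# client/server design and interfaces are not reproduced here.
import time
from collections import defaultdict

def benchmark(steps, n_events=100):
    """steps maps a step name to a callable that processes one event."""
    totals = defaultdict(float)
    for _ in range(n_events):
        for name, step in steps.items():
            t0 = time.perf_counter()
            step()
            totals[name] += time.perf_counter() - t0
    return {name: n_events / elapsed for name, elapsed in totals.items()}

scores = benchmark({
    "generation": lambda: sum(x * x for x in range(10_000)),
    "reconstruction": lambda: sorted(range(20_000), reverse=True),
})
for name, events_per_s in scores.items():
    print(f"{name}: {events_per_s:.1f} events/s")
```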

  3. Supervision Software for the Integration of the Beam Interlock System with the CERN Accelerator Complex

    CERN Document Server

    Audrain, M; Dragu, M; Fuchsberger, K; Garnier, JC; Gorzawski, AA; Koza, M; Krol, K; Moscatelli, A; Puccio, B; Stamos, K; Zerlauth, M

    2014-01-01

    The Accelerator complex at the European Organisation for Nuclear Research (CERN) is composed of many systems which are required to function in a valid state to ensure safe beam operation. One key component of machine protection, the Beam Interlock System (BIS), was designed to interface critical systems around the accelerator chain, provide fast and reliable transmission of beam dump requests and trigger beam extraction in case of malfunctioning of equipment systems or beam losses. Numerous upgrades of accelerator and controls components during the Long Shutdown 1 (LS1) are followed by subsequent software updates that need to be thoroughly validated before the restart of beam operation in 2015. In parallel, the ongoing deployments of the BIS hardware in the PS booster (PSB) and the future LINAC4 give rise to new requirements for the related controls and monitoring software due to their fast cycle times. This paper describes the current status and ongoing work as well as the long-term vision for the integratio...

  4. Development of Soil Compaction Analysis Software (SCAN Integrating a Low Cost GPS Receiver and Compactometer

    Directory of Open Access Journals (Sweden)

    Dongha Lee

    2012-02-01

    A software package for soil compaction analysis (SCAN) has been developed for evaluating compaction states using data from a GPS receiver as well as a compactometer attached to the roller. SCAN is distinguished from previous software for intelligent compaction (IC) in that it can use the results from various types of GPS positioning methods, and it also has an optimal structure for remotely managing the large amounts of data gathered from numerous rollers. For this, several methods were developed: (1) improving the accuracy of a low-cost GPS receiver's positioning results; (2) modeling the trajectory of a moving roller using the GPS receiver's results and linking it with the data from the compactometer; and (3) extracting information regarding the compaction states of the ground from the modeled trajectory, using spatial analysis methods. SCAN was verified through various field compaction tests, and it has been confirmed that it can be a very effective tool for evaluating field compaction states.
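
    Step (2) above, linking the modeled roller trajectory with compactometer data, can be illustrated by interpolating GPS positions at the compactometer timestamps. The sampling rates and array layout below are assumptions for illustration, not SCAN's actual data model.

```python
# Sketch of step (2): interpolate the GPS trajectory at the compactometer
# timestamps to attach a position to every compaction reading. Sampling
# rates and array shapes below are assumptions for illustration.
import numpy as np

def link_compaction_to_trajectory(gps_t, gps_xy, cmv_t, cmv):
    """gps_t: (N,) GPS timestamps; gps_xy: (N, 2) roller positions;
    cmv_t: (M,) compactometer timestamps; cmv: (M,) compaction values.
    Returns an (M, 3) array of x, y, value samples along the trajectory."""
    x = np.interp(cmv_t, gps_t, gps_xy[:, 0])
    y = np.interp(cmv_t, gps_t, gps_xy[:, 1])
    return np.column_stack([x, y, cmv])

gps_t = np.arange(0.0, 10.0, 1.0)                 # 1 Hz GPS fixes
gps_xy = np.column_stack([0.5 * gps_t, np.zeros_like(gps_t)])
cmv_t = np.arange(0.0, 9.0, 0.1)                  # 10 Hz compactometer
cmv = 50.0 + 5.0 * np.sin(cmv_t)
print(link_compaction_to_trajectory(gps_t, gps_xy, cmv_t, cmv)[:3])
```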

  5. ParseCNV integrative copy number variation association software with quality tracking.

    Science.gov (United States)

    Glessner, Joseph T; Li, Jin; Hakonarson, Hakon

    2013-03-01

    A number of copy number variation (CNV) calling algorithms exist; however, comprehensive software tools for CNV association studies are lacking. We describe ParseCNV, unique software that takes CNV calls and creates probe-based statistics for CNV occurrence in both case-control designs and in family-based studies, addressing both de novo and inheritance events, which are then summarized based on CNV regions (CNVRs). CNVRs are defined in a dynamic manner to allow for complex CNV overlap while maintaining a precise association region. Using this approach, we avoid the failure-to-converge and non-monotonic curve-fitting weaknesses of programs such as CNVtools and CNVassoc; and although Plink is easy to use, it only provides combined CNV-state probe-based statistics, not state-specific CNVRs. Existing CNV association methods do not provide any quality tracking information to filter confident associations, a key issue which is fully addressed by ParseCNV. In addition, uncertainty in CNV calls underlying CNV associations is evaluated to verify significant results, including CNV overlap profiles, genomic context, number of probes supporting the CNV and single-probe intensities. When the optimal quality control parameters in ParseCNV are followed, 90% of CNVs validate by polymerase chain reaction, an often problematic stage because of inadequate review of significant associations. ParseCNV is freely available at http://parsecnv.sourceforge.net.
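
    The core idea of probe-based case/control statistics summarized into CNVRs can be sketched in a few lines. The following toy Python code uses a per-probe Fisher exact test and merges runs of adjacent significant probes; it is a simplified illustration of the concept, not ParseCNV's actual algorithm or quality-tracking logic.

```python
# Toy probe-based CNV association: Fisher exact test per probe, then merge
# runs of adjacent significant probes into regions (CNVRs). A simplified
# illustration of the concept, not ParseCNV's algorithm or QC tracking.
from scipy.stats import fisher_exact

def probe_pvalues(case_hits, n_cases, control_hits, n_controls):
    """Per-probe test on CNV carrier counts in cases vs. controls."""
    return [fisher_exact([[ca, n_cases - ca],
                          [co, n_controls - co]])[1]
            for ca, co in zip(case_hits, control_hits)]

def merge_cnvrs(positions, pvals, alpha=1e-3):
    """Collapse consecutive significant probes into (start, end) regions."""
    regions, start, prev = [], None, None
    for pos, p in zip(positions, pvals):
        if p < alpha:
            if start is None:
                start = pos
        elif start is not None:
            regions.append((start, prev))
            start = None
        prev = pos
    if start is not None:
        regions.append((start, prev))
    return regions

pvals = probe_pvalues([30, 28, 2], 100, [5, 6, 3], 100)
print(merge_cnvrs([1000, 2000, 3000], pvals))   # -> [(1000, 2000)]
```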

  6. The application of integrated photovoltaic technologies on building roofs using GIS software

    Directory of Open Access Journals (Sweden)

    Stefano Bonesso

    2013-08-01

    To plan the use of renewable-energy technologies across a territory, in particular solar technologies, GIS software (Geographic Information Systems) can be used to analyze and represent geo-referenced data. Potential of photovoltaic technologies on buildings' roofs using geographic information systems (GIS) - In order to plan the diffusion of renewable energy technologies, geographic information systems (GIS) can be useful. In this study photovoltaic technologies in urban environments were examined, considering the shadows of the urban context and of the territory orography, evaluated with GIS (ESRI ArcGIS). The results for potential photovoltaic technologies strongly depend on input data, but roof data are not always accurate. The aim of this work is to define a tool to improve the results of a GIS simulation on an urban scale. To validate the procedure, the results were compared with data monitored by the PERSIL project. The analysis was based on the use of geographic information systems, laser scanner data (LiDAR), and software for 3D reconstruction of the buildings.

  7. NeuroMatic: An Integrated Open-Source Software Toolkit for Acquisition, Analysis and Simulation of Electrophysiological Data

    Science.gov (United States)

    Rothman, Jason S.; Silver, R. Angus

    2018-01-01

    Acquisition, analysis and simulation of electrophysiological properties of the nervous system require multiple software packages. This makes it difficult to conserve experimental metadata and track the analysis performed. It also complicates certain experimental approaches such as online analysis. To address this, we developed NeuroMatic, an open-source software toolkit that performs data acquisition (episodic, continuous and triggered recordings), data analysis (spike rasters, spontaneous event detection, curve fitting, stationarity) and simulations (stochastic synaptic transmission, synaptic short-term plasticity, integrate-and-fire and Hodgkin-Huxley-like single-compartment models). The merging of a wide range of tools into a single package facilitates a more integrated style of research, from the development of online analysis functions during data acquisition, to the simulation of synaptic conductance trains during dynamic-clamp experiments. Moreover, NeuroMatic has the advantage of working within Igor Pro, a platform-independent environment that includes an extensive library of built-in functions, a history window for reviewing the user's workflow and the ability to produce publication-quality graphics. Since its original release, NeuroMatic has been used in a wide range of scientific studies and its user base has grown considerably. NeuroMatic version 3.0 can be found at http://www.neuromatic.thinkrandom.com and https://github.com/SilverLabUCL/NeuroMatic. PMID:29670519
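
    Of the model classes listed above, the leaky integrate-and-fire neuron is the simplest to reproduce. The Python sketch below is standalone and illustrative only; NeuroMatic itself runs inside Igor Pro, and the parameter values here are generic textbook choices, not NeuroMatic defaults.

```python
# A minimal leaky integrate-and-fire neuron, the simplest of the model
# classes NeuroMatic simulates (this standalone sketch only illustrates
# the model; NeuroMatic itself runs inside Igor Pro).
import numpy as np

def simulate_lif(i_inj, dt=1e-4, tau=20e-3, r_m=1e8,
                 v_rest=-70e-3, v_thresh=-54e-3, v_reset=-80e-3):
    """Euler integration of tau * dV/dt = -(V - V_rest) + R_m * I(t)."""
    v = np.full(len(i_inj), v_rest)
    spikes = []
    for t in range(1, len(i_inj)):
        dv = (-(v[t - 1] - v_rest) + r_m * i_inj[t - 1]) * dt / tau
        v[t] = v[t - 1] + dv
        if v[t] >= v_thresh:          # threshold crossing: spike and reset
            spikes.append(t * dt)
            v[t] = v_reset
    return v, spikes

current = np.zeros(5000)              # 0.5 s of input at dt = 0.1 ms
current[1000:4000] = 0.3e-9           # 0.3 nA current step
_, spike_times = simulate_lif(current)
print(f"{len(spike_times)} spikes, first at {spike_times[0]:.3f} s")
```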

  8. TU-C-17A-03: An Integrated Contour Evaluation Software Tool Using Supervised Pattern Recognition for Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Chen, H; Tan, J; Kavanaugh, J; Dolly, S; Gay, H; Thorstad, W; Anastasio, M; Altman, M; Mutic, S; Li, H [Washington University School of Medicine, Saint Louis, MO (United States)]

    2014-06-15

    Purpose: Radiotherapy (RT) contours delineated either manually or semiautomatically require verification before clinical usage. Manual evaluation is very time consuming. A new integrated software tool using supervised pattern contour recognition was thus developed to facilitate this process. Methods: The contouring tool was developed using the object-oriented programming language C# and application programming interfaces, e.g. the visualization toolkit (VTK). The C# language served as the tool design basis. The Accord.Net scientific computing libraries were utilized for the required statistical data processing and pattern recognition, while VTK was used to build and render 3-D mesh models of critical RT structures in real time with 360° visualization. Principal component analysis (PCA) was used to let the system self-update geometry variations of normal structures, based on physician-approved RT contours as a training dataset. The in-house supervised PCA-based contour recognition method was used for automatically evaluating contour normality/abnormality. The function for reporting the contour evaluation results was implemented using C# and the Windows Form Designer. Results: The software input was RT simulation images and RT structures from commercial clinical treatment planning systems. Several abilities were demonstrated: automatic assessment of RT contours, file loading/saving of various modality medical images and RT contours, and generation/visualization of 3-D images and anatomical models. Moreover, it supported 360° rendering of the RT structures in a multi-slice view, which allows physicians to visually check and edit abnormally contoured structures. Conclusion: This new software integrates the supervised learning framework with image processing and graphical visualization modules for RT contour verification. This tool has great potential for facilitating treatment planning with the assistance of an automatic contour evaluation module in avoiding
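
    The supervised PCA-based normality check can be sketched as follows: fit PCA on feature vectors from physician-approved contours, then flag a new contour whose reconstruction error exceeds a threshold derived from the training set. The feature vectors, component count, and threshold below are placeholder assumptions, not the authors' settings.

```python
# Sketch of a PCA-based contour normality check in the spirit of the
# abstract: approved contours define the normal subspace; a contour with
# large reconstruction error is flagged. All numbers are placeholders.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
approved = rng.normal(size=(200, 32))   # stand-in contour feature vectors

pca = PCA(n_components=8).fit(approved)

def reconstruction_error(X):
    """Distance between features and their projection onto the PCA subspace."""
    return np.linalg.norm(X - pca.inverse_transform(pca.transform(X)), axis=1)

threshold = np.percentile(reconstruction_error(approved), 99)

new_contour = 3.0 * rng.normal(size=(1, 32))   # deliberately abnormal
is_abnormal = reconstruction_error(new_contour)[0] > threshold
print("abnormal contour" if is_abnormal else "contour looks normal")
```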

  9. Integrated Reliability Estimation of a Nuclear Maintenance Robot including a Software

    Energy Technology Data Exchange (ETDEWEB)

    Eom, Heung Seop; Kim, Jae Hee; Jeong, Kyung Min [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)]

    2011-10-15

    Conventional reliability estimation techniques such as Fault Tree Analysis (FTA), Reliability Block Diagram (RBD), Markov Model, and Event Tree Analysis (ETA) have been used widely and are approved in some industries. However, there are limitations when using them for complicated robot systems that include software, such as intelligent reactor inspection robots. Therefore an expert's judgment plays an important role in estimating the reliability of a complicated system in practice, because experts can deal with diverse evidence related to the reliability and then perform an inference based on it. The method proposed in this paper combines qualitative and quantitative evidence and performs inference like experts. Furthermore, it does the work in a formal and quantitative way, unlike human experts, by the benefits of Bayesian Nets (BNs).

  10. Integration of control and building performance simulation software by run-time coupling

    NARCIS (Netherlands)

    Yahiaoui, A.; Hensen, J.L.M.; Soethout, L.L.

    2003-01-01

    This paper presents the background, approach and initial results of a project, which aims to achieve better integrated building and systems control modeling in building performance simulation by runtime coupling of distributed computer programs. This paper focuses on one of the essential steps

  11. Monitoring single-cell gene regulation under dynamically controllable conditions with integrated microfluidics and software

    NARCIS (Netherlands)

    Kaiser, Matthias; Jug, Florian; Julou, Thomas; Deshpande, S.R.; Pfohl, Thomas; Silander, Olin K.; Myers, Gene; Van Nimwegen, Erik

    2018-01-01

    Much is still not understood about how gene regulatory interactions control cell fate decisions in single cells, in part due to the difficulty of directly observing gene regulatory processes in vivo. We introduce here a novel integrated setup consisting of a microfluidic chip and accompanying

  12. Integrating Multimedia ICT Software in Language Curriculum: Students' Perception, Use, and Effectiveness

    Science.gov (United States)

    Penner, Nikolai; Grodek, Elzbieta

    2014-01-01

    Information and Communication Technologies (ICT) constitute an integral part of the teaching and learning environment in present-day educational institutions and play an increasingly important role in the modern second language classroom. In this study, an online language learning tool "Tell Me More" (TMM) has been introduced as a…

  13. Integration of a Robotic Arm with the Surgical Assistant Workstation Software Framework

    NARCIS (Netherlands)

    Young, J.; Elhawary, H.; Popovic, A.

    2012-01-01

    We have integrated the Philips Research robot arm with the Johns Hopkins University cisst library, an open-source platform for computerassisted surgical intervention. The development of a Matlab to C++ wrapper to abstract away servo-level details facilitates the rapid development of a

  14. MOIRA Software Framework - Integrated User-friendly Shell for The Environmental Decision Support Systems

    International Nuclear Information System (INIS)

    Hofman, Dmitry; Nordlinder, Sture

    2003-01-01

    MOIRA DSS is a model-based computerised system for the identification of optimal remedial strategies to restore radionuclide-contaminated fresh water environments. Examples of the questions which a decision-maker could address to the system are 'Is lake liming effective in reducing the radiocesium uptake by fish?', 'Can control of catchment run-off be an effective measure against further redistribution of radionuclides by a river?', 'Is sediment removal worthwhile to reduce further contamination of the aquatic environment?'. The MOIRA system could help the decision-maker to avoid implementation of inappropriate and expensive countermeasures. MOIRA gives the possibility to predict the effects of implementing different types of countermeasures and to evaluate both the 'ecological' and 'social' effects of the countermeasures. The decision support process using MOIRA DSS can be subdivided into the following steps: definition of the site-specific environmental and socio-economic parameters using GIS-based data (unknown site-specific data can be estimated using GIS-based models, default data for the socio-economic parameters, or data directly provided by the user); providing data about fallout of the radionuclides; definition of the time interval for which the prognosis will be made; definition of the alternative strategies of countermeasures; evaluation of the consequences of implementing the user-defined strategies and a 'no actions' strategy using predictive models; ranking strategies using the Multi-Attribute Analysis Module (MAA); and preparation of recommendations in the form of a report. This process requires usage of several computerised tools such as predictive models, multi-attribute analysis software, a geographical information system, and a database. The MOIRA software framework could be used as the basis for the creation of a wide range of user-friendly and easy-to-learn decision support systems. It can also provide the advanced graphical user interface and data checking system for the
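
    The ranking step performed by the MAA module can be illustrated with a toy weighted-sum multi-attribute score. Real multi-attribute analysis in MOIRA is considerably richer; the strategies, attributes, and weights below are invented for illustration only.

```python
# Toy weighted-sum ranking in the spirit of a multi-attribute analysis
# module; strategies, attributes, and weights are invented placeholders.
strategies = {
    "no actions":       {"dose_reduction": 0.0, "ecological": 0.2, "cost": 1.0},
    "lake liming":      {"dose_reduction": 0.6, "ecological": 0.7, "cost": 0.6},
    "sediment removal": {"dose_reduction": 0.8, "ecological": 0.5, "cost": 0.1},
}
weights = {"dose_reduction": 0.5, "ecological": 0.3, "cost": 0.2}

def score(attrs):
    # All attributes are utilities normalized to [0, 1], higher is better
    # (cost is assumed already inverted into a cheapness utility).
    return sum(weights[k] * v for k, v in attrs.items())

for name, attrs in sorted(strategies.items(), key=lambda kv: -score(kv[1])):
    print(f"{score(attrs):.2f}  {name}")
```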

  15. Apparatus for storing protective suits

    International Nuclear Information System (INIS)

    Englemann, H.J.; Koller, J.; Schrader, H.R.; Schade, G.; Pedrerol, J.

    1975-01-01

    Arrangements are described for storing one or more protective suits when contaminated on the outside. In order to permit a person wearing a contaminated suit to leave a contaminated area safely, and without contaminating the environment, it has hitherto been the practice for the suit to be passed through a 'lock' and cleansed under decontaminating showers whilst still being worn. This procedure is time wasting and not always completely effective, and it may be necessary to provide a second suit for use whilst the first suit is being decontaminated. Repeated decontamination may also result in undue wear and tear. The arrangements described provide a 'lock' chamber in which a contaminated suit may be stowed away without its interior becoming contaminated, thus allowing repeated use by persons donning and shedding it. (U.K.)

  16. Integration of a Robotic Arm with the Surgical Assistant Workstation Software Framework

    OpenAIRE

    Young, J.; Elhawary, H.; Popovic, A.

    2012-01-01

    We have integrated the Philips Research robot arm with the Johns Hopkins University cisst library, an open-source platform for computerassisted surgical intervention. The development of a Matlab to C++ wrapper to abstract away servo-level details facilitates the rapid development of a component-based framework with “plug and play” features. This allows the user to easily exchange the robot with an alternative manipulator while maintaining the same overall functionality.

  17. COMSY- A Software Tool For Aging And Plant Life Management With An Integrated Documentation Tool

    International Nuclear Information System (INIS)

    Baier, Roman; Zander, Andre

    2008-01-01

    For aging and plant life management, the integrity of mechanical components and structures is one of the key objectives. In order to ensure this integrity it is essential to implement a comprehensive aging management programme. This should be applied to all safety-relevant mechanical systems or components, civil structures, electrical systems as well as instrumentation and control (I and C). The following aspects should be covered: - Identification and assessment of relevant degradation mechanisms; - Verification and evaluation of the quality status of all safety relevant systems, structures and components (SSC's); - Verification and modernization of I and C and electrical systems; - Reliable and up-to-date documentation. To support this, AREVA NP GmbH has developed the computer program COMSY, which draws on more than 30 years of research results and operational experience. The program provides the option to perform a plant-wide screening for identifying system areas which are sensitive to specific degradation mechanisms. Another objective is the administration and evaluation of NDE measurements from different techniques. An integrated documentation tool makes document management and maintenance fast, reliable and independent of individual staff availability. (authors)

  18. Software integration for automated stability analysis and design optimization of a bearingless rotor blade

    Science.gov (United States)

    Gunduz, Mustafa Emre

    Many government agencies and corporations around the world have found the unique capabilities of rotorcraft indispensable. Incorporating such capabilities into rotorcraft design poses extra challenges because it is a complicated multidisciplinary process. The concept of applying several disciplines to the design and optimization processes may not be new, but it does not currently seem to be widely accepted in industry. The reason for this might be the lack of well-known tools for realizing a complete multidisciplinary design and analysis of a product. This study aims to propose a method that enables engineers in some design disciplines to perform a fairly detailed analysis and optimization of a design using commercially available software as well as codes developed at Georgia Tech. The ultimate goal is when the system is set up properly, the CAD model of the design, including all subsystems, will be automatically updated as soon as a new part or assembly is added to the design; or it will be updated when an analysis and/or an optimization is performed and the geometry needs to be modified. Designers and engineers will be involved in only checking the latest design for errors or adding/removing features. Such a design process will take dramatically less time to complete; therefore, it should reduce development time and costs. The optimization method is demonstrated on an existing helicopter rotor originally designed in the 1960's. The rotor is already an effective design with novel features. However, application of the optimization principles together with high-speed computing resulted in an even better design. The objective function to be minimized is related to the vibrations of the rotor system under gusty wind conditions. The design parameters are all continuous variables. Optimization is performed in a number of steps. First, the most crucial design variables of the objective function are identified. With these variables, Latin Hypercube Sampling method is used

  19. Software Quality Assurance in Software Projects: A Study of Pakistan

    OpenAIRE

    Faisal Shafique Butt; Sundus Shaukat; M. Wasif Nisar; Ehsan Ullah Munir; Muhammad Waseem; Kashif Ayyub

    2013-01-01

    Software quality is a specific property which tells what standard a software product should meet. In a software project, quality is a key factor in the success or decline of a software organization. Much research has been done regarding software quality. Software organizations follow standards introduced by Capability Maturity Model Integration (CMMI) to achieve good quality software. Quality is divided into three main layers, which are Software Quality Assurance (SQA), Software Qu...

  20. Integration of Web Technologies in Software Applications. Is Web 2.0 a Solution?

    Directory of Open Access Journals (Sweden)

    Cezar Liviu CERVINSCHI

    2010-12-01

    Full Text Available Starting from the idea that Web 2.0 represents “the era of the dynamic web”, the paper proposes to provide arguments (demonstrated by physical results) regarding the question that is at the foundation of this article. Based on the findings, we can affirm that Web 2.0 is a solution to building powerful and robust software, since the Internet has become more than just a simple presence on the users’ desktop that provides easy access to information, services, entertainment, online transactions, e-commerce, e-learning and so on; basically every kind of human or institutional interaction can happen online. This paper seeks to study the impact of two of these branches upon the user – e-commerce and e-testing. The statistical reports are based on different sets of people, while the conclusions are the results of a detailed research and study of the applications’ behaviour in the actual operating environment.

  1. Extension of the AMBER molecular dynamics software to Intel's Many Integrated Core (MIC) architecture

    Science.gov (United States)

    Needham, Perri J.; Bhuiyan, Ashraf; Walker, Ross C.

    2016-04-01

    We present an implementation of explicit solvent particle mesh Ewald (PME) classical molecular dynamics (MD) within the PMEMD molecular dynamics engine, which forms part of the AMBER v14 MD software package and which makes use of Intel Xeon Phi coprocessors by offloading portions of the PME direct summation and neighbor list build to the coprocessor. We refer to this implementation as pmemd MIC offload and in this paper present the technical details of the algorithm, including basic models for MPI and OpenMP configuration, and analyze the resultant performance. The algorithm provides the best performance improvement for large systems (>400,000 atoms), achieving a ∼35% performance improvement for satellite tobacco mosaic virus (1,067,095 atoms) when 2 Intel E5-2697 v2 processors (2 × 12 cores, 30M cache, 2.7 GHz) are coupled to an Intel Xeon Phi coprocessor (Model 7120P-1.238/1.333 GHz, 61 cores). The implementation utilizes a two-fold decomposition strategy: spatial decomposition using an MPI library and thread-based decomposition using OpenMP. We also present compiler optimization settings that improve the performance on Intel Xeon processors, while retaining simulation accuracy.
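
    As a rough illustration of the neighbor-list build mentioned in the abstract (not AMBER's implementation, which uses cell lists, runs in parallel and handles periodic images), a brute-force cutoff search can be written as:

        # Toy O(N^2) neighbor-list build within a direct-space cutoff.
        # Coordinates are random placeholders; production MD codes use cell lists.
        import numpy as np

        rng = np.random.default_rng(0)
        coords = rng.uniform(0.0, 30.0, size=(500, 3))   # hypothetical positions (Angstrom)
        cutoff = 8.0

        diff = coords[:, None, :] - coords[None, :, :]
        dist2 = (diff * diff).sum(axis=-1)
        idx = np.arange(len(coords))
        i, j = np.where((dist2 < cutoff ** 2) & (idx[:, None] < idx[None, :]))
        pairs = np.column_stack([i, j])                  # unique i < j pairs for the direct sum

    Offloading this pair search, and the direct-sum loop over the resulting pairs, is in spirit what the paper moves to the coprocessor.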

  2. OPTiM: Optical projection tomography integrated microscope using open-source hardware and software.

    Science.gov (United States)

    Watson, Thomas; Andrews, Natalie; Davis, Samuel; Bugeon, Laurence; Dallman, Margaret D; McGinty, James

    2017-01-01

    We describe the implementation of an OPT plate to perform optical projection tomography (OPT) on a commercial wide-field inverted microscope, using our open-source hardware and software. The OPT plate includes a tilt adjustment for alignment and a stepper motor for sample rotation as required by standard projection tomography. Depending on magnification requirements, three methods of performing OPT are detailed using this adaptor plate: a conventional direct OPT method requiring only the addition of a limiting aperture behind the objective lens; an external optical-relay method allowing conventional OPT to be performed at magnifications >4x; a remote focal scanning and region-of-interest method for improved spatial resolution OPT (up to ~1.6 μm). All three methods use the microscope's existing incoherent light source (i.e. arc-lamp) and all of its inherent functionality is maintained for day-to-day use. OPT acquisitions are performed on in vivo zebrafish embryos to demonstrate the implementations' viability.
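
    The paper concerns the acquisition hardware, but the downstream reconstruction step of standard OPT is filtered back-projection, which can be sketched with scikit-image (an illustrative use of a public library, not the authors' software; the input file name is a placeholder):

        # Filtered back-projection of an OPT projection series (illustrative).
        import numpy as np
        from skimage.transform import iradon

        sinogram = np.load("projections.npy")   # placeholder; shape (detector_pixels, n_angles)
        theta = np.linspace(0.0, 360.0, sinogram.shape[1], endpoint=False)

        slice_image = iradon(sinogram, theta=theta, filter_name="ramp")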

  3. Managing the CMS Online Software integrity through development and production cycles

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    The Data Acquisition system of the Compact Muon Solenoid experiment at CERN is a distributed system made of several different network technologies and computers that collect data from more than 600 custom detector Front-End Drivers. It assembles events at a rate of 100 kHz, transporting event data at an aggregate throughput of 100 GByte/s. The architecture takes advantage of the latest developments in the computing industry. For data concentration, 10/40 Gbit Ethernet technologies are used, while a 56 Gbps Infiniband FDR CLOS network has been chosen for the event builder, with a throughput of ~4 Tbps. The CMS Online Software (CMSOS) infrastructure is a complex product created specifically for the development of large distributed data acquisition systems, as well as of all the application components needed to achieve the CMS data acquisition task. It is designed to benefit from different networking technologies and from the parallelism available on processing platforms such as multi-core or multi-processor systems. It provides platform i...

  4. Performance evaluation of multi-stratum resources integrated resilience for software defined inter-data center interconnect.

    Science.gov (United States)

    Yang, Hui; Zhang, Jie; Zhao, Yongli; Ji, Yuefeng; Wu, Jialin; Lin, Yi; Han, Jianrui; Lee, Young

    2015-05-18

    Inter-data center interconnect with IP over elastic optical network (EON) is a promising scenario to meet the high burstiness and high-bandwidth requirements of data center services. In our previous work, we implemented multi-stratum resources integration among IP networks, optical networks and application stratum resources, which allows data center services to be accommodated. This study extends that work to consider service resilience in the case of an edge optical node failure. We propose a novel multi-stratum resources integrated resilience (MSRIR) architecture for the services in software defined inter-data center interconnect based on IP over EON. A global resources integrated resilience (GRIR) algorithm is introduced based on the proposed architecture. The MSRIR can enable cross stratum optimization and provide resilience using the multiple stratums resources, and enhance the data center service resilience responsiveness to the dynamic end-to-end service demands. The overall feasibility and efficiency of the proposed architecture is experimentally verified on the control plane of our OpenFlow-based enhanced SDN (eSDN) testbed. The performance of the GRIR algorithm under a heavy traffic load scenario is also quantitatively evaluated based on the MSRIR architecture in terms of path blocking probability, resilience latency and resource utilization, compared with other resilience algorithms.
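
    Stripped of the cross-stratum optimization, the core restoration step, recomputing an end-to-end path once an edge node fails, reduces to a shortest-path recomputation. The sketch below is a generic toy with an invented topology, not the paper's GRIR algorithm:

        # Toy path restoration after a node failure (generic sketch, not GRIR).
        import networkx as nx

        G = nx.Graph()
        G.add_weighted_edges_from([
            ("DC1", "A", 1), ("A", "B", 1), ("B", "DC2", 1),   # primary route
            ("DC1", "C", 2), ("C", "D", 2), ("D", "DC2", 2),   # backup route
        ])

        primary = nx.shortest_path(G, "DC1", "DC2", weight="weight")
        G.remove_node("B")                  # simulate an edge optical node failure
        restored = nx.shortest_path(G, "DC1", "DC2", weight="weight")
        print(primary, restored)            # ['DC1', 'A', 'B', 'DC2'] then ['DC1', 'C', 'D', 'DC2']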

  5. Army-NASA aircrew/aircraft integration program (A3I) software detailed design document, phase 3

    Science.gov (United States)

    Banda, Carolyn; Chiu, Alex; Helms, Gretchen; Hsieh, Tehming; Lui, Andrew; Murray, Jerry; Shankar, Renuka

    1990-01-01

    The capabilities and design approach of the MIDAS (Man-machine Integration Design and Analysis System) computer-aided engineering (CAE) workstation under development by the Army-NASA Aircrew/Aircraft Integration Program is detailed. This workstation uses graphic, symbolic, and numeric prototyping tools and human performance models as part of an integrated design/analysis environment for crewstation human engineering. Developed incrementally, the requirements and design for Phase 3 (Dec. 1987 to Jun. 1989) are described. Software tools/models developed or significantly modified during this phase included: an interactive 3-D graphic cockpit design editor; multiple-perspective graphic views to observe simulation scenarios; symbolic methods to model the mission decomposition, equipment functions, pilot tasking and loading, as well as control the simulation; a 3-D dynamic anthropometric model; an intermachine communications package; and a training assessment component. These components were successfully used during Phase 3 to demonstrate the complex interactions and human engineering findings involved with a proposed cockpit communications design change in a simulated AH-64A Apache helicopter/mission that maps to empirical data from a similar study and AH-1 Cobra flight test.

  6. Hardware and Software Integration in Project Development of Automated Controller System Using LABVIEW FPGA

    International Nuclear Information System (INIS)

    Mohd Khairulezwan Abd Manan; Mohd Sabri Minhat; Izhar Abu Hussin

    2014-01-01

    The Field-Programmable Gate Array (FPGA) is a semiconductor device that can be programmed after manufacturing. Instead of being restricted to a predetermined hardware function, an FPGA allows the user to program product features and functions, adapt to new standards, and reconfigure hardware for specific applications even after the product has been installed in the field, hence the name field-programmable. This project developed a control system using LabVIEW FPGA. LabVIEW FPGA is easier to use because it is programmed with drag-and-drop icons. The program is then integrated with the hardware inputs and outputs. (author)

  7. Present status of an integrated software system for HASP (Human Acts Simulation Program)

    International Nuclear Information System (INIS)

    Otani, Takayuki; Ebihara, Ken-ichi; Kambayashi, Shaw; Kume, Etsuo; Higuchi, Kenji; Fujii, Minoru; Akimoto, Masayuki

    1994-01-01

    In the Human Acts Simulation Program (HASP), human acts to be realized by a human-shaped intelligent robot in a nuclear power plant are simulated by computers. The major purpose of HASP is to develop basic and underlying design technologies for intelligent and automated power plants. The objective of this paper is to show the present status of HASP, with particular emphasis on activities targeted at the integration of the developed subsystems to simulate the important capabilities of the intelligent robot, such as planning, robot dynamics, and so on. (author)

  8. Advanced EVA Suit Camera System Development Project

    Science.gov (United States)

    Mock, Kyla

    2016-01-01

    The National Aeronautics and Space Administration (NASA) at the Johnson Space Center (JSC) is developing a new extra-vehicular activity (EVA) suit known as the Advanced EVA Z2 Suit. All of the improvements to the EVA Suit provide the opportunity to update the technology of the video imagery. My summer internship project involved improving the video streaming capabilities of the cameras that will be used on the Z2 Suit for data acquisition. To accomplish this, I familiarized myself with the architecture of the camera that is currently being tested so that I could make improvements on the design. Because there is a lot of benefit to saving space, power, and weight on the EVA suit, my job was to use Altium Designer to start designing a much smaller and simplified interface board for the camera's microprocessor and external components. This involved checking datasheets of various components and checking signal connections to ensure that this architecture could be used for both the Z2 suit and potentially other future projects. The Orion spacecraft is a specific project that may benefit from this condensed camera interface design. The camera's physical placement on the suit also needed to be determined and tested so that image resolution can be maximized. Many of the camera placement options may be tested along with other future suit testing. There are multiple teams that work on different parts of the suit, so the camera's placement could directly affect their research or design. For this reason, a big part of my project was initiating contact with other branches and setting up multiple meetings to learn more about the pros and cons of the potential camera placements we are analyzing. Collaboration with the multiple teams working on the Advanced EVA Z2 Suit is absolutely necessary and these comparisons will be used as further progress is made for the overall suit design. This prototype will not be finished in time for the scheduled Z2 Suit testing, so my time was

  9. Fostering successful scientific software communities

    Science.gov (United States)

    Bangerth, W.; Heister, T.; Hwang, L.; Kellogg, L. H.

    2016-12-01

    Developing sustainable open source software packages for the sciences appears at first to be primarily a technical challenge: How can one create stable and robust algorithms, appropriate software designs, sufficient documentation, quality assurance strategies such as continuous integration and test suites, or backward compatibility approaches that yield high-quality software usable not only by the authors, but also the broader community of scientists? However, our experience from almost two decades of leading the development of the deal.II software library (http://www.dealii.org, a widely-used finite element package) and the ASPECT code (http://aspect.dealii.org, used to simulate convection in the Earth's mantle) has taught us that technical aspects are not the most difficult ones in scientific open source software. Rather, it is the social challenge of building and maintaining a community of users and developers interested in answering questions on user forums, contributing code, and jointly finding solutions to common technical and non-technical challenges. These problems are posed in an environment where project leaders typically have no resources to reward the majority of contributors, where very few people are specifically paid for the work they do on the project, and with frequent turnover of contributors as project members rotate into and out of jobs. In particular, much software work is done by graduate students who may become fluent enough in a software only a year or two before they leave academia. We will discuss strategies we have found do and do not work in maintaining and growing communities around the scientific software projects we lead. Specifically, we will discuss the management style necessary to keep contributors engaged, ways to give credit where credit is due, and structuring documentation to decrease reliance on forums and thereby allow user communities to grow without straining those who answer questions.

  10. Applying integrated software to optimize corporate production performance: a case study at Suncor

    International Nuclear Information System (INIS)

    Masse, L.P.; Rhynes, P.

    1997-01-01

    The feasibility of and need for a central database of basic well data for use in the petroleum industry to enhance production performance were discussed. Suncor developed a central database of well data as the foundation for a future systems architecture for its own use. The perceived, current and future benefits of such a system were described. Suncor identified the need for a corporate repository which is accessible to multiple applications and provides the opportunity to upgrade the system to new technology that will benefit from integration. The objective was to document existing data sets, identify what additional data would be useful and document existing processes around this well data. The integrated set of data is supplied by multiple vendors and includes public land data, production budget, public well data, forecasting, economics, drilling, procurement system, fixed assets, maintenance, land administration, field data capture, production accounting and financial accounting. In addition to being able to access the current well data, significant added value is expected from the pro-active communication within the departments, and the additional time available for analysis and decisions as opposed to searching for data and comparing sources. 4 figs

  11. BioModels.net Web Services, a free and integrated toolkit for computational modelling software.

    Science.gov (United States)

    Li, Chen; Courtot, Mélanie; Le Novère, Nicolas; Laibe, Camille

    2010-05-01

    Exchanging and sharing scientific results are essential for researchers in the field of computational modelling. BioModels.net defines agreed-upon standards for model curation. A fundamental one, MIRIAM (Minimum Information Requested in the Annotation of Models), standardises the annotation and curation process of quantitative models in biology. To support this standard, MIRIAM Resources maintains a set of standard data types for annotating models, and provides services for manipulating these annotations. Furthermore, BioModels.net creates controlled vocabularies, such as SBO (Systems Biology Ontology), which strictly indexes, defines and links terms used in Systems Biology. Finally, BioModels Database provides a free, centralised, publicly accessible database for storing, searching and retrieving curated and annotated computational models. Each resource provides a web interface to submit, search, retrieve and display its data. In addition, the BioModels.net team provides a set of Web Services which allows the community to programmatically access the resources. A user is then able to perform remote queries, such as retrieving a model and resolving all its MIRIAM Annotations, as well as getting the details about the associated SBO terms. These web services use established standards. Communications rely on SOAP (Simple Object Access Protocol) messages and the available queries are described in a WSDL (Web Services Description Language) file. Several libraries are provided in order to simplify the development of client software. BioModels.net Web Services take researchers one step further towards simulating and understanding an entire biological system, by allowing them to retrieve biological models in their own tools, combine queries in workflows and efficiently analyse models.
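
    Programmatic access of the kind described, SOAP operations advertised by a WSDL, might look like the following sketch. The WSDL location and the operation name here are hypothetical placeholders standing in for the service's real ones:

        # Hedged SOAP-client sketch using the zeep library; the WSDL URL and the
        # operation name below are hypothetical, not the actual endpoints.
        from zeep import Client

        client = Client("https://example.org/biomodels/services?wsdl")  # placeholder WSDL
        model_xml = client.service.getModelById("BIOMD0000000001")      # hypothetical operation
        print(model_xml[:200])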

  12. FIGENIX: Intelligent automation of genomic annotation: expertise integration in a new software platform

    Directory of Open Access Journals (Sweden)

    Pontarotti Pierre

    2005-08-01

    Full Text Available Abstract Background Two of the main objectives of the genomic and post-genomic era are to structurally and functionally annotate genomes which consists of detecting genes' position and structure, and inferring their function (as well as of other features of genomes. Structural and functional annotation both require the complex chaining of numerous different software, algorithms and methods under the supervision of a biologist. The automation of these pipelines is necessary to manage huge amounts of data released by sequencing projects. Several pipelines already automate some of these complex chaining but still necessitate an important contribution of biologists for supervising and controlling the results at various steps. Results Here we propose an innovative automated platform, FIGENIX, which includes an expert system capable to substitute to human expertise at several key steps. FIGENIX currently automates complex pipelines of structural and functional annotation under the supervision of the expert system (which allows for example to make key decisions, check intermediate results or refine the dataset. The quality of the results produced by FIGENIX is comparable to those obtained by expert biologists with a drastic gain in terms of time costs and avoidance of errors due to the human manipulation of data. Conclusion The core engine and expert system of the FIGENIX platform currently handle complex annotation processes of broad interest for the genomic community. They could be easily adapted to new, or more specialized pipelines, such as for example the annotation of miRNAs, the classification of complex multigenic families, annotation of regulatory elements and other genomic features of interest.

  13. An integrated model for reliability estimation of digital nuclear protection system based on fault tree and software control flow methodologies

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Seong, Poong Hyun

    2000-01-01

    In the nuclear industry, the difficulty of proving the reliabilities of digital systems prohibits their widespread use in various nuclear applications such as plant protection systems. Even though there exist a few models which are used to estimate the reliabilities of digital systems, we develop a new integrated model which is more realistic than the existing models. We divide the process of estimating the reliability of a digital system into two phases, a high-level phase and a low-level phase, and the boundary between the two phases is the reliabilities of subsystems. We apply the software control flow method to the low-level phase and fault tree analysis to the high-level phase. The application of the model to the Dynamic Safety System (DSS) shows that the estimated reliability of the system is quite reasonable and realistic

  14. An integrated model for reliability estimation of digital nuclear protection system based on fault tree and software control flow methodologies

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Seong, Poong Hyun

    2000-01-01

    In the nuclear industry, the difficulty of proving the reliabilities of digital systems prohibits their widespread use in various nuclear applications such as plant protection systems. Even though there exist a few models which are used to estimate the reliabilities of digital systems, we develop a new integrated model which is more realistic than the existing models. We divide the process of estimating the reliability of a digital system into two phases, a high-level phase and a low-level phase, and the boundary between the two phases is the reliabilities of subsystems. We apply the software control flow method to the low-level phase and fault tree analysis to the high-level phase. The application of the model to the Dynamic Safety System (DSS) shows that the estimated reliability of the system is quite reasonable and realistic. (author)
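
    The high-level, fault-tree phase of such a model rests on standard gate algebra for independent events: an AND gate fails with the product of its input failure probabilities, and an OR gate with the complement of the product of the input reliabilities. A minimal sketch of that arithmetic (generic fault-tree math with invented numbers, not the authors' integrated model):

        # Generic fault-tree gate arithmetic for independent basic events.
        from math import prod

        def and_gate(ps):
            # Output fails only if every input fails.
            return prod(ps)

        def or_gate(ps):
            # Output fails if any input fails.
            return 1.0 - prod(1.0 - p for p in ps)

        # Hypothetical subsystem failure probabilities from a low-level phase.
        sensor, logic_a, logic_b, actuator = 1e-4, 5e-5, 5e-5, 2e-4
        system = or_gate([sensor, and_gate([logic_a, logic_b]), actuator])
        print(f"system failure probability ~ {system:.2e}")

    In the paper's scheme, the subsystem numbers fed into such a tree would come from the software control flow analysis of the low-level phase.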

  15. DIMP: an interoperable solution for software integration and product data exchange

    Science.gov (United States)

    Wang, Xi Vincent; Xu, Xun William

    2012-08-01

    Today, globalisation has become one of the main trends of manufacturing business that has led to a world-wide decentralisation of resources amongst not only individual departments within one company but also business partners. However, despite the development and improvement in the last few decades, difficulties in information exchange and sharing still exist in heterogeneous applications environments. This article is divided into two parts. In the first part, related research work and integrating solutions are reviewed and discussed. The second part introduces a collaborative environment called distributed interoperable manufacturing platform, which is based on a module-based, service-oriented architecture (SOA). In the platform, the STEP-NC data model is used to facilitate data-exchange among heterogeneous CAD/CAM/CNC systems.

  16. Free and open source software at CERN: integration of drivers in the Linux kernel

    International Nuclear Information System (INIS)

    Gonzalez Cobas, J.D.; Iglesias Gonsalvez, S.; Howard Lewis, J.; Serrano, J.; Vanga, M.; Cota, E.G.; Rubini, A.; Vaga, F.

    2012-01-01

    Most device drivers written for accelerator control systems suffer from a severe lack of portability due to the ad hoc nature of the code, often embodied with intimate knowledge of the particular machine it is deployed in. In this paper we challenge this practice by arguing for the opposite approach: development in the open, which in our case translates into the integration of our code within the Linux kernel. We make our case by describing the upstream merge effort of the tsi148 driver, a critical (and complex) component of the control system. The encouraging results from this effort have then led us to follow the same approach with two more ambitious projects, currently in the works: Linux support for the upcoming FMC boards and a new I/O subsystem. (authors)

  17. MODFLOW-OWHM v2: The next generation of fully integrated hydrologic simulation software

    Science.gov (United States)

    Boyce, S. E.; Hanson, R. T.; Ferguson, I. M.; Reimann, T.; Henson, W.; Mehl, S.; Leake, S.; Maddock, T.

    2016-12-01

    The One-Water Hydrologic Flow Model (One-Water) is a MODFLOW-based integrated hydrologic flow model designed for the analysis of a broad range of conjunctive-use and climate-related issues. One-Water fully links the movement and use of groundwater, surface water, and imported water for consumption by agriculture and natural vegetation on the landscape, and for potable and other uses within a supply-and-demand framework. One-Water includes linkages for deformation-, flow-, and head-dependent flows; additional observation and parameter options for higher-order calibrations; and redesigned code for facilitation of self-updating models and faster simulation run times. The next version of One-Water, currently under development, will include a new surface-water operations module that simulates dynamic reservoir operations, a new sustainability analysis package that facilitates the estimation and simulation of reduced storage depletion and captured discharge, a conduit-flow process for karst aquifers and leaky pipe networks, a soil zone process that adds an enhanced infiltration process, interflow, deep percolation and soil moisture, and a new subsidence and aquifer compaction package. It will also include enhancements to local grid refinement, and additional features to facilitate easier model updates, faster execution, better error messages, and more integration/cross communication between the traditional MODFLOW packages. By retaining and tracking the water within the hydrosphere, One-Water accounts for "all of the water everywhere and all of the time." This philosophy provides more confidence in the water accounting by the scientific community and provides the public a foundation needed to address wider classes of problems. Ultimately, more complex questions are being asked about water resources, so they require a more complete answer about conjunctive-use and climate-related issues.
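
    The supply-and-demand framework can be caricatured in a few lines: landscape demand is met first from surface water and imported deliveries, and any shortfall becomes groundwater pumping. This is a toy balance for intuition only, not One-Water's actual accounting, which tracks many more stores and flows:

        # Toy conjunctive-use allocation (illustration only).
        def allocate(demand, surface_water, imported):
            from_sw = min(demand, surface_water)
            from_imports = min(demand - from_sw, imported)
            pumping = demand - from_sw - from_imports   # groundwater makes up the shortfall
            return from_sw, from_imports, pumping

        print(allocate(demand=120.0, surface_water=70.0, imported=20.0))
        # (70.0, 20.0, 30.0): 30 units of demand met by pumping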

  18. An integrated development framework for rapid development of platform-independent and reusable satellite on-board software

    Science.gov (United States)

    Ziemke, Claas; Kuwahara, Toshinori; Kossev, Ivan

    2011-09-01

    finally, by providing generic functionalities compliant with the ECSS-E-70-41A standard, the proposed framework can provide a great boost in productivity. Together with open source tools such as the GNU tool-chain, the Eclipse SDK, the simulation framework OpenSimKit and the emulator QEMU, the proposed on-board software framework forms an integrated development framework. It is possible to design, code and build the on-board software together with the operating system and then run it on a simulated satellite for performance analysis and debugging purposes. This makes it possible to rapidly develop and deploy full-fledged satellite on-board software with minimal cost and in a limited time frame.

  19. VOLCWORKS: A suite for optimization of hazards mapping

    Science.gov (United States)

    Delgado Granados, H.; Ramírez Guzmán, R.; Villareal Benítez, J. L.; García Sánchez, T.

    2012-04-01

    Making hazards maps is a process linking basic science, applied science and engineering for the benefit of society. The methodologies for hazards map construction have evolved enormously, together with the tools that allow forecasting the behavior of the materials produced by different eruptive processes. However, in spite of the development of tools and the evolution of methodologies, the purpose of hazards maps has not changed: prevention and mitigation of volcanic disasters. Integration of different tools for simulating different processes at a single volcano is a challenge to be solved with software that includes processing, simulation and visualization techniques, and data structures, in order to build up a suite that supports the map-making process, starting from the integration of the geological data and simulations and the simplification of the output to design a hazards/scenario map. Scientific visualization is a powerful tool to explore and gain insight into complex data from instruments and simulations. The workflow from data collection, quality control and preparation for simulations to a visually appropriate presentation is usually disconnected, relying in most cases on a different application for each of the needed steps, because it requires many tools that are not built for the solution of a specific problem, or were developed by research groups to solve particular tasks but remain disconnected. In volcanology, due to its complexity, groups typically examine only one aspect of the phenomenon: ash dispersal, laharic flows, pyroclastic flows, lava flows, and ballistic projectile ejection, among others. However, when studying the hazards associated with the activity of a volcano, it is important to analyze all the processes comprehensively, especially for communication of results to the end users: decision makers and planners. In order to solve this problem and connect the different parts of a workflow we are developing the

  20. Web-Based Software Integration For Dissemination Of Archival Images: The Frontiers Of Science Website

    Directory of Open Access Journals (Sweden)

    Gary Browne

    2011-07-01

    Full Text Available The Frontiers of Science illustrated comic strip of 'science fact' ran from 1961 to 1982, syndicated worldwide through over 600 newspapers. The Rare Books and Special Collections Library at the University of Sydney, in association with Sydney eScholarship, digitized all 939 strips. We aimed to create a website that could disseminate these comic strips to scholars, enthusiasts and the general public. We wanted to enable users to search and browse through the images simply and effectively, with an intuitive and novel viewing platform. Time and resource constraints dictated the use of (mostly open source) code modules wherever possible and the integration and customisation of a range of web-based applications, code snippets and technologies (DSpace, eXtensible Text Framework (XTF), OmniFormat, jQuery Tools, Thickbox and Zoomify), stylistically pulled together using CSS. This approach allowed for a rapid development cycle (6 weeks) to deliver the site on time as well as provide us with a framework for similar projects.

  1. Integrated Design Software Predicts the Creep Life of Monolithic Ceramic Components

    Science.gov (United States)

    1996-01-01

    Significant improvements in propulsion and power generation for the next century will require revolutionary advances in high-temperature materials and structural design. Advanced ceramics are candidate materials for these elevated-temperature applications. As design protocols emerge for these material systems, designers must be aware of several innate features, including the degrading ability of ceramics to carry sustained load. Usually, time-dependent failure in ceramics occurs because of two different, delayed-failure mechanisms: slow crack growth and creep rupture. Slow crack growth initiates at a preexisting flaw and continues until a critical crack length is reached, causing catastrophic failure. Creep rupture, on the other hand, occurs because of bulk damage in the material: void nucleation and coalescence that eventually leads to macrocracks which then propagate to failure. Successful application of advanced ceramics depends on proper characterization of material behavior and the use of an appropriate design methodology. The life of a ceramic component can be predicted with the NASA Lewis Research Center's Ceramics Analysis and Reliability Evaluation of Structures (CARES) integrated design programs. CARES/CREEP determines the expected life of a component under creep conditions, and CARES/LIFE predicts the component life due to fast fracture and subcritical crack growth. The previously developed CARES/LIFE program has been used in numerous industrial and Government applications.
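
    The slow-crack-growth mode named above is commonly modeled with a power-law crack velocity, da/dt = A (K/KIc)^n with K = Y sigma sqrt(pi a), integrated until K reaches the fracture toughness KIc. The sketch below uses this textbook model with invented material constants; it is not the CARES/LIFE implementation:

        # Power-law slow crack growth to the critical crack length
        # (textbook model; all constants are invented for illustration).
        import math

        A, n = 1e-4, 20.0        # crack velocity parameters (m/s at K = KIc)
        KIc = 4.0e6              # fracture toughness, Pa*sqrt(m)
        Y, sigma = 1.12, 1.0e8   # geometry factor and applied stress (Pa)
        a, t, dt = 1.0e-4, 0.0, 1.0

        while True:
            K = Y * sigma * math.sqrt(math.pi * a)
            if K >= KIc:
                break            # critical crack size reached: fast fracture
            a += A * (K / KIc) ** n * dt
            t += dt

        print(f"time to failure ~ {t:.2e} s at crack length {a:.2e} m")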

  2. Social Sensors (S2ensors): A Kind of Hardware-Software-Integrated Mediators for Social Manufacturing Systems Under Mass Individualization

    Science.gov (United States)

    Ding, Kai; Jiang, Ping-Yu

    2017-09-01

    Currently, little work has been devoted to the mediators and tools for multi-role production interactions in the mass individualization environment. This paper proposes a kind of hardware-software-integrated mediators called social sensors (S2ensors) to facilitate the production interactions among customers, manufacturers, and other stakeholders in social manufacturing systems (SMS). The concept, classification, operational logics, and formalization of S2ensors are clarified. S2ensors collect objective data from physical sensors and subjective data from user input in mobile Apps, merge them into meaningful information for decision-making, and finally feed the decisions back for reaction and execution. Then, an S2ensors-Cloud platform is discussed to integrate different S2ensors to work for SMSs in an autonomous way. A demonstrative case is studied by developing a prototype system, and the results show that S2ensors and the S2ensors-Cloud platform can help multi-role stakeholders interact and collaborate on production tasks. It reveals the mediator-enabled mechanisms and methods for production interactions among stakeholders in SMS.

  3. DYNA3D, INGRID, and TAURUS: an integrated, interactive software system for crashworthiness engineering

    International Nuclear Information System (INIS)

    Benson, D.J.; Hallquist, J.O.; Stillman, D.W.

    1985-04-01

    Crashworthiness engineering has always been a high priority at Lawrence Livermore National Laboratory because of its role in the safe transport of radioactive material for the nuclear power industry and military. As a result, the authors have developed an integrated, interactive set of finite element programs for crashworthiness analysis. The heart of the system is DYNA3D, an explicit, fully vectorized, large deformation structural dynamics code. DYNA3D has the following four capabilities that are critical for the efficient and accurate analysis of crashes: (1) fully nonlinear solid, shell, and beam elements for representing a structure, (2) a broad range of constitutive models for representing the materials, (3) sophisticated contact algorithms for the impact interactions, and (4) a rigid body capability to represent the bodies away from the impact zones at a greatly reduced cost without sacrificing any accuracy in the momentum calculations. To generate the large and complex data files for DYNA3D, INGRID, a general purpose mesh generator, is used. It runs on everything from IBM PCs to CRAYS, and can generate 1000 nodes/minute on a PC. With its efficient hidden line algorithms and many options for specifying geometry, INGRID also doubles as a geometric modeller. TAURUS, an interactive post processor, is used to display DYNA3D output. In addition to the standard monochrome hidden line display, time history plotting, and contouring, TAURUS generates interactive color displays on 8 color video screens by plotting color bands superimposed on the mesh which indicate the value of the state variables. For higher quality color output, graphic output files may be sent to the DICOMED film recorders. We have found that color is every bit as important as hidden line removal in aiding the analyst in understanding his results. In this paper the basic methodologies of the programs are presented along with several crashworthiness calculations

  4. "Usability of data integration and visualization software for multidisciplinary pediatric intensive care: a human factors approach to assessing technology".

    Science.gov (United States)

    Lin, Ying Ling; Guerguerian, Anne-Marie; Tomasi, Jessica; Laussen, Peter; Trbovich, Patricia

    2017-08-14

    established or derived. Usability issues, observed through contextual use, provided directions for tangible design improvements of data integration software that may lessen use errors and promote safe use. Data-driven decision making can benefit from iterative interface redesign involving clinician-users in simulated environments. This study is a first step in understanding how software can support clinicians' decision making with integrated continuous monitoring data. Importantly, testing of similar platforms by all the different disciplines who may become clinician users is a fundamental step necessary to understand the impact on clinical outcomes of decision aids.

  5. Monitoring and reporting software for the coal industry

    Energy Technology Data Exchange (ETDEWEB)

    Okanovic, M. [Advanced Systems Integration Pty Ltd. (Australia)

    2001-08-01

    This paper explains the development and launch of MineSuite software, designed to facilitate report production in coal mines. Advanced Systems Integration (ASI) has developed a system that is generic to all mining operations. Mine personnel can define all processes, KPIs, equipment, delays, reports etc. that are vital in monitoring mining operations. Its capabilities have been realised in opencut, underground and preparation plants throughout Australia. Written in Java, MineSuite is a multi-user, multi-threaded, multi-tasking distributed application. 3 figs.

  6. EDL Sensor Suite, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Optical Air Data Systems (OADS) L.L.C. proposes a LIDAR based remote measurement sensor suite capable of satisfying a significant number of the desired sensing...

  7. Satellite Ocean Heat Content Suite

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This collection contains an operational Satellite Ocean Heat Content Suite (SOHCS) product generated by NOAA National Environmental Satellite, Data, and Information...

  8. EVA Suit Microbial Leakage Investigation

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this project is to collect microbial samples from various EVA suits to determine how much microbial contamination is typically released during...

  9. Software reengineering

    Science.gov (United States)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. A beta version of the environment was released in Mar. 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.

  10. SLIMarray: Lightweight software for microarray facility management

    Directory of Open Access Journals (Sweden)

    Marzolf Bruz

    2006-10-01

    Full Text Available Abstract Background Microarray core facilities are commonplace in biological research organizations, and need systems for accurately tracking various logistical aspects of their operation. Although these different needs could be handled separately, an integrated management system provides benefits in organization, automation and reduction in errors. Results We present SLIMarray (System for Lab Information Management of Microarrays), an open source, modular database web application capable of managing microarray inventories, sample processing and usage charges. The software allows modular configuration and is well suited for further development, providing users the flexibility to adapt it to their needs. SLIMarray Lite, a version of the software that is especially easy to install and run, is also available. Conclusion SLIMarray addresses the previously unmet need for free and open source software for managing the logistics of a microarray core facility.

  11. NDAS NASA Data Acquisition Software Suite- Version 2.0

    Data.gov (United States)

    National Aeronautics and Space Administration — Current NASA propulsion test facilities include Stennis Space Center in Mississippi, Marshall Space Flight Center in Alabama, Plum Brook Station in Ohio, and White...

  12. Space Suit Joint Torque Testing

    Science.gov (United States)

    Valish, Dana J.

    2011-01-01

    In 2009 and early 2010, a test was performed to quantify the torque required to manipulate joints in several existing operational and prototype space suits in an effort to develop joint torque requirements appropriate for a new Constellation Program space suit system. The same test method was levied on the Constellation space suit contractors to verify that their suit design meets the requirements. However, because the original test was set up and conducted by a single test operator there was some question as to whether this method was repeatable enough to be considered a standard verification method for Constellation or other future space suits. In order to validate the method itself, a representative subset of the previous test was repeated, using the same information that would be available to space suit contractors, but set up and conducted by someone not familiar with the previous test. The resultant data was compared using graphical and statistical analysis and a variance in torque values for some of the tested joints was apparent. Potential variables that could have affected the data were identified and re-testing was conducted in an attempt to eliminate these variables. The results of the retest will be used to determine if further testing and modification is necessary before the method can be validated.

  13. Open architecture of smart sensor suites

    Science.gov (United States)

    Müller, Wilmuth; Kuwertz, Achim; Grönwall, Christina; Petersson, Henrik; Dekker, Rob; Reinert, Frank; Ditzel, Maarten

    2017-10-01

    Experiences from recent conflicts show the strong need for smart sensor suites comprising different multi-spectral imaging sensors as core elements as well as additional non-imaging sensors. Smart sensor suites should be part of a smart sensor network - a network of sensors, databases, evaluation stations and user terminals. Its goal is to optimize the use of various information sources for military operations such as situation assessment, intelligence, surveillance, reconnaissance, target recognition and tracking. Such a smart sensor network will enable commanders to achieve higher levels of situational awareness. Within the study at hand, an open system architecture was developed in order to increase the efficiency of sensor suites. The open system architecture for smart sensor suites, based on a system-of-systems approach, enables combining different sensors in multiple physical configurations, such as distributed sensors, co-located sensors combined in a single package, tower-mounted sensors, sensors integrated in a mobile platform, and trigger sensors. The architecture was derived from a set of system requirements and relevant scenarios. Its mode of operation is adaptable to a series of scenarios with respect to relevant objects of interest, activities to be observed, available transmission bandwidth, etc. The presented open architecture is designed in accordance with the NATO Architecture Framework (NAF). The architecture allows smart sensor suites to be part of a surveillance network, linked e.g. to a sensor planning system and a C4ISR center, and to be used in combination with future RPAS (Remotely Piloted Aircraft Systems) for supporting a more flexible dynamic configuration of RPAS payloads.

  14. CMMI for Systems Engineering, Software Engineering, Integrated Product and Process Development, and Supplier Sourcing, Version 1.1 (CMMI-SE/SW/IPPD/SS, V1.1) Continuous Representation

    National Research Council Canada - National Science Library

    2002-01-01

    .... Concepts covered by this model include systems engineering, software engineering, integrated product and process development, and supplier sourcing, as well as traditional CMM concepts such as process...

  15. Software Engineering Support of the Third Round of Scientific Grand Challenge Investigations: An Earth Modeling System Software Framework Strawman Design that Integrates Cactus and UCLA/UCB Distributed Data Broker

    Science.gov (United States)

    Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn

    2002-01-01

    One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document proposes a strawman framework design for the climate community based on the integration of Cactus, from the relativistic physics community, and the UCLA/UCB Distributed Data Broker (DDB) from the climate community. This design is the result of an extensive survey of climate models and frameworks in the climate community as well as frameworks from many other scientific communities. The design addresses fundamental development and runtime needs using Cactus, a framework with interfaces for FORTRAN and C-based languages, and high-performance model communication needs using DDB. This document also specifically explores object-oriented design issues in the context of climate modeling as well as climate modeling issues in terms of object-oriented design.

  16. Use MACES IVA Suit for EVA Mobility Evaluations

    Science.gov (United States)

    Watson, Richard D.

    2014-01-01

    The use of an Intra-Vehicular Activity (IVA) suit for a spacewalk or Extra-Vehicular Activity (EVA) was evaluated for mobility and usability in the Neutral Buoyancy Lab (NBL) environment. The Space Shuttle Advanced Crew Escape Suit (ACES) has been modified (MACES) to integrate with the Orion spacecraft. The first several missions of the Orion MPCV spacecraft will not have mass available to carry an EVA-specific suit, so any EVA required will have to be performed by the MACES. Since the MACES was not designed with EVA in mind, it was unknown what mobility the suit would be able to provide for an EVA or if a person could perform useful tasks for an extended time inside the pressurized suit. The suit was evaluated in multiple NBL runs by a variety of subjects including crewmembers with significant EVA experience. Various functional mobility tasks performed included: translation, body positioning, carrying tools, body stabilization, equipment handling, and use of tools. Hardware configurations included with and without TMG, suit with IVA gloves and suit with EVA gloves. Most tasks were completed on ISS mockups with existing EVA tools. Some limited tasks were completed with prototype tools on a simulated rocky surface. Major findings include: demonstration of the ability to weigh-out the suit, understanding the need to have subjects perform multiple runs prior to getting feedback, determination of critical sizing factors, and the need for adjustment of the suit work envelope. The early testing has demonstrated the feasibility of EVAs of limited duration and limited scope. Further testing is required with more flight-like tasking and constraints to validate these early results. If the suit is used for EVA, it will require mission specific modifications for umbilical management or PLSS integration, safety tether attachment, and tool interfaces. These evaluations are continuing through calendar year 2014.

  17. Integrating Real-Time Room Acoustics Simulation into a CAD Modeling Software to Enhance the Architectural Design Process

    Directory of Open Access Journals (Sweden)

    Sönke Pelzer

    2014-04-01

    Full Text Available For architects, real-time 3D visual rendering of CAD-models is a valuable tool. The architect usually perceives the visual appearance of the building interior in a natural and realistic way during the design process. Unfortunately this only emphasizes the role of the visual appearance of a building, while the acoustics often remain disregarded. Controlling the room acoustics is not integrated into most architects’ workflows—due to a lack of tools. The present contribution describes a newly developed plug-in for adding an adequate 3D-acoustics feedback to the architect. To present intuitively the acoustical effect of the current design project, the plug-in uses real-time audio rendering and 3D-reproduction. The room acoustics of the design can be varied by modifying structural shapes as well as by changing the material selection. In addition to the audio feedback, also a visualization of important room acoustics qualities is provided by displaying color-coded maps inside the CAD software.
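
    A representative room-acoustics quantity that such a plug-in can feed back is reverberation time. The classical Sabine estimate, RT60 = 0.161 V / sum(S_i * alpha_i), depends on exactly what the architect is changing: room volume, surface areas and material absorption. A minimal sketch of the formula (standard textbook acoustics with invented surfaces, independent of the plug-in's real-time engine):

        # Sabine reverberation-time estimate (textbook formula; the room
        # surfaces and absorption coefficients are hypothetical examples).
        def rt60_sabine(volume_m3, surfaces):
            # surfaces: iterable of (area_m2, absorption_coefficient) pairs
            absorption = sum(area * alpha for area, alpha in surfaces)
            return 0.161 * volume_m3 / absorption

        room = [(100.0, 0.10),   # walls, plastered
                (40.0, 0.30),    # floor, carpeted
                (40.0, 0.05)]    # ceiling, concrete
        print(f"RT60 ~ {rt60_sabine(200.0, room):.2f} s")

    Swapping the carpet for a harder floor immediately lengthens the estimate, which is the kind of material-selection effect the plug-in renders audibly.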

  18. High integrity software for nuclear power plants: Candidate guidelines, technical basis and research needs. Main report, Volume 2

    International Nuclear Information System (INIS)

    Seth, S.; Bail, W.; Cleaves, D.; Cohen, H.; Hybertson, D.; Schaefer, C.; Stark, G.; Ta, A.; Ulery, B.

    1995-06-01

    The work documented in this report was performed in support of the US Nuclear Regulatory Commission to examine the technical basis for candidate guidelines that could be considered in reviewing and evaluating high integrity computer software. The candidate guidelines cover the following software development and assurance activities: Requirements specification; design; coding; verification and validation, including static analysis and dynamic testing; safety analysis; operation and maintenance; configuration management; quality assurance; and planning and management. Each activity (framework element) was subdivided into technical areas (framework subelements). The report describes the development of approximately 200 candidate guidelines that span the entire range of these activities; the identification, categorization and prioritization of the technical basis for those candidate guidelines; and the identification, categorization and prioritization of research needs for improving the technical basis. The report has two volumes: Volume 1, Executive Summary, includes an overview of the framework and of each framework element, the complete set of candidate guidelines, the results of the assessment of the technical basis for each candidate guideline, and a discussion of research needs that support the regulatory function; this document, Volume 2, is the main report

  19. High integrity software for nuclear power plants: Candidate guidelines, technical basis and research needs. Main report, Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Seth, S.; Bail, W.; Cleaves, D.; Cohen, H.; Hybertson, D.; Schaefer, C.; Stark, G.; Ta, A.; Ulery, B. [Mitre Corp., McLean, VA (United States)

    1995-06-01

    The work documented in this report was performed in support of the US Nuclear Regulatory Commission to examine the technical basis for candidate guidelines that could be considered in reviewing and evaluating high integrity computer software. The candidate guidelines cover the following software development and assurance activities: Requirements specification; design; coding; verification and validation, including static analysis and dynamic testing; safety analysis; operation and maintenance; configuration management; quality assurance; and planning and management. Each activity (framework element) was subdivided into technical areas (framework subelements). The report describes the development of approximately 200 candidate guidelines that span the entire range of these activities; the identification, categorization and prioritization of the technical basis for those candidate guidelines; and the identification, categorization and prioritization of research needs for improving the technical basis. The report has two volumes: Volume 1, Executive Summary, includes an overview of the framework and of each framework element, the complete set of candidate guidelines, the results of the assessment of the technical basis for each candidate guideline, and a discussion of research needs that support the regulatory function; this document, Volume 2, is the main report.

  20. Journal and Wave Bearing Impedance Calculation Software

    Science.gov (United States)

    Hanford, Amanda; Campbell, Robert

    2012-01-01

    The wave bearing software suite is a MATLAB application that computes bearing properties for user-specified wave bearing conditions, as well as for plain journal bearings. Wave bearings are fluid film journal bearings with multi-lobed wave patterns around the circumference of the bearing surface. In this software suite, the dynamic coefficients are output in a form that allows easy implementation in a finite element model used in rotor dynamics analysis. The software has a graphical user interface (GUI) for inputting bearing geometry parameters, and uses MATLAB's structure interface for ease of interpreting data. This innovation was developed to provide the stiffness and damping components of wave bearing impedances. The computational method for computing bearing coefficients was originally designed for plain journal bearings and tilting pad bearings. Modifications to include a wave bearing profile consisted of changing the film thickness profile given by an equation, and writing an algorithm to locate the integration limits for each fluid region. Careful consideration was needed to implement the correct integration limits while computing the dynamic coefficients, depending on the form of the input/output variables specified in the algorithm.
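
    The film-thickness change that distinguishes a wave bearing has a simple closed form: a sinusoidal waviness term added to the plain-journal clearance profile. The sketch below shows the two profiles with invented geometry; it follows the generic textbook form, not necessarily the suite's internal equation:

        # Circumferential film thickness: plain journal vs. three-wave bearing
        # (generic geometry; clearance, eccentricity and wave amplitude invented).
        import numpy as np

        theta = np.linspace(0.0, 2.0 * np.pi, 361)
        C, ecc = 50e-6, 0.5          # radial clearance (m), eccentricity ratio
        ew, n_waves = 0.2, 3         # wave amplitude ratio, number of waves

        h_plain = C * (1.0 + ecc * np.cos(theta))
        h_wave = C * (1.0 + ecc * np.cos(theta) + ew * np.cos(n_waves * theta))
        print(h_plain.min(), h_wave.min())   # minimum film thicknesses

    Locating the integration limits of each fluid region then amounts to finding where such a profile partitions the circumference, which is the algorithmic change the abstract describes.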

  1. Development of Power Assisting Suit

    Science.gov (United States)

    Yamamoto, Keijiro; Ishii, Mineo; Hyodo, Kazuhito; Yoshimitsu, Toshihiro; Matsuo, Takashi

    In order to realize a wearable power assisting suit for assisting a nurse to carry a patient in her arms, the power supply and control systems of the suit have to be miniaturized, and it has to be wireless and pipeline-less. The new wearable suit consists of shoulder, arm, back, waist and leg units to be fitted on the nurse's body. The arms, waist and legs have new pneumatic rotary actuators driven directly by micro air pumps supplied by portable Ni-Cd batteries. The muscle forces are sensed by a new muscle hardness sensor utilizing a sensing tip mounted on a force sensing film device. An embedded microcomputer is used for the calculation of control signals. The new wearable suit was fitted to a human subject, and a series of movement experiments was performed in which weights held in the arms were raised and lowered. Each unit of the suit could transmit assisting torque directly to each joint, verifying its practicability.

  2. Integrated Web-Based Immersive Exploration of the Coordinated Canyon Experiment Data using Open Source STOQS Software

    Science.gov (United States)

    McCann, M. P.; Gwiazda, R.; O'Reilly, T. C.; Maier, K. L.; Lundsten, E. M.; Parsons, D. R.; Paull, C. K.

    2017-12-01

    The Coordinated Canyon Experiment (CCE) in Monterey Submarine Canyon has produced a wealth of oceanographic measurements whose analysis will improve understanding of turbidity current processes. Exploration of this data set, consisting of over 60 parameters from 15 platforms, is facilitated by using the open source Spatial Temporal Oceanographic Query System (STOQS) software (https://github.com/stoqs/stoqs). The Monterey Bay Aquarium Research Institute (MBARI) originally developed STOQS to help manage and visualize upper water column oceanographic measurements, but the generality of its data model permits effective use for any kind of spatial/temporal measurement data. STOQS consists of a PostgreSQL database and server-side Python/Django software; the client-side is jQuery JavaScript supporting AJAX requests to update a single page web application. The User Interface (UI) is optimized to provide a quick overview of data in spatial and temporal dimensions, as well as in parameter, platform, and data value space. A user may zoom into any feature of interest and select it, initiating a filter operation that updates the UI with an overview of all the data in the new filtered selection. When details are desired, radio buttons and checkboxes are selected to generate a number of different types of visualizations. These include color-filled temporal section and line plots, parameter-parameter plots, 2D map plots, and interactive 3D spatial visualizations. The Extensible 3D (X3D) standard and X3DOM JavaScript library provide the technology for presenting animated 3D data directly within the web browser. Most of the oceanographic measurements from the CCE (e.g. mooring mounted ADCP and CTD data) are easily visualized using established methods. However, unified integration and multiparameter display of several concurrently deployed sensors across a network of platforms is a challenge we hope to solve. Moreover, STOQS also allows display of data from a new instrument - the
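
    The generality claimed for the data model can be pictured with a minimal Django-style schema in which each measurement is tied to a platform, a position and a time, and each data value is tied to a parameter. The sketch below is illustrative only; the model and field names are assumptions for this example and are not STOQS's actual schema.

      from django.db import models

      class Platform(models.Model):
          name = models.CharField(max_length=128)       # e.g. a mooring or AUV

      class Parameter(models.Model):
          name = models.CharField(max_length=128)       # e.g. "temperature"
          units = models.CharField(max_length=64)

      class Measurement(models.Model):
          # One sample in space and time taken by a platform.
          platform = models.ForeignKey(Platform, on_delete=models.CASCADE)
          timestamp = models.DateTimeField()
          depth = models.FloatField()
          latitude = models.FloatField()
          longitude = models.FloatField()

      class MeasuredParameter(models.Model):
          # The value of one parameter at one measurement point; any kind of
          # spatial/temporal measurement data fits this generic shape.
          measurement = models.ForeignKey(Measurement, on_delete=models.CASCADE)
          parameter = models.ForeignKey(Parameter, on_delete=models.CASCADE)
          datavalue = models.FloatField()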

  3. Physics Research Integrated Development Environment (PRIDE)

    International Nuclear Information System (INIS)

    Burton, J.; Cormell, L.

    1993-12-01

    Past efforts to implement a Software Engineering approach to High Energy Physics computing have been met with significant resistance and have been, in many cases, only marginally successful. At least a portion of the problem has been the lack of an integrated development environment, tailored to High Energy Physics and incorporating a suite of Computer Aided Software Engineering tools. The Superconducting Super Collider Physics Research Division Computing Department is implementing pilot projects to develop just such an environment

  4. Geophysical characterization from Itu intrusive suite

    International Nuclear Information System (INIS)

    Pascholati, M.E.

    1989-01-01

    The integrated use of geophysical, geological, geochemical, petrographical and remote sensing data resulted in a substantial increase in the knowledge of the Itu Intrusive Suite. The main geophysical method was gamma-ray spectrometry, together with fluorimetry and autoradiography. Three methods were used for the calculation of laboratory gamma-ray spectrometry data. For U, the regression method was the best one. For K and Th, the equation-system and absolute-calibration methods gave the best results. Surface gamma-ray spectrometry allowed comparison with laboratory data and made an important contribution to the study of environmental radiation. (author)

  5. Effectiveness of an Integrated Tertiary Software Mobile Information System for Student Registration and Admission at a University in Gauteng

    Directory of Open Access Journals (Sweden)

    Frans MASHABELA

    2017-07-01

    This study investigates whether the new online registration and admission system implemented at a tertiary institution in Gauteng, South Africa, was successful and effective. The institution under study is the first in South Africa to implement this new online registration system, from the 3rd of January 2013, using a system called the Integrated Tertiary Software (ITS) Mobile information system. The information system enables students to apply online without physically visiting the institution and provides the status of their registration and admission applications via their smartphones. A total of one hundred first-year students and ten staff members were sampled to respond to self-completed questionnaires. The efficiency of this new online system was evaluated using the Technology Acceptance Model (TAM), the Web of System Performance (WOSP) model and the DeLone and McLean IS Success model, as well as the indicators of system ineffectiveness and the attributes on the basis of which an information system is evaluated. Key findings emerging from the data analysis and interpretation show that the new online system met the expectations of most staff and students, with the exception of a few staff members and students. The findings show that the investment made in the new online registration system is benefiting the university and students. The implementation of the new online registration and admission system was a success to a large extent because the expectations of most users were met. The online system is effective, as its evaluation using the conventional measuring methods produced positive outcomes.

  6. Suited Contingency Ops Food - 2

    Science.gov (United States)

    Glass, J. W.; Leong, M. L.; Douglas, G. L.

    2014-01-01

    The contingency scenario for an emergency cabin depressurization event may require crewmembers to subsist in a pressurized suit for up to 144 hours. This scenario requires the capability for safe nutrition delivery through a helmet feed port against a 4 psi pressure differential to enable crewmembers to maintain strength and cognition to perform critical tasks. Two nutritional delivery prototypes were developed and analyzed for compatibility with the helmet feed port interface and for operational effectiveness against the pressure differential. The bag-in-bag (BiB) prototype, designed to equalize the suit pressure with the beverage pouch and enable a crewmember to drink normally, delivered water successfully to three different subjects in suits pressurized to 4 psi. The Boa restrainer pouch, designed to provide mechanical leverage to overcome the pressure differential, did not operate adequately. Guidelines were developed and compiled for contingency beverages that provide macro-nutritional requirements, a minimum one-year shelf life, and compatibility with the delivery hardware. Evaluation results and food product parameters have the potential to be used to improve future prototype designs and to develop complete nutritional beverages for contingency events. These feeding capabilities would also be useful on extended surface-mission EVAs, where the current in-suit drinking device may be insufficient.

  7. The ZPIC educational code suite

    Science.gov (United States)

    Calado, R.; Pardal, M.; Ninhos, P.; Helm, A.; Mori, W. B.; Decyk, V. K.; Vieira, J.; Silva, L. O.; Fonseca, R. A.

    2017-10-01

    Particle-in-Cell (PIC) codes are used in almost all areas of plasma physics, such as fusion energy research, plasma accelerators, space physics, ion propulsion, and plasma processing. In this work, we present the ZPIC educational code suite, a new initiative to foster training in plasma physics using computer simulations. Leveraging our expertise and experience from the development and use of the OSIRIS PIC code, we have developed a suite of 1D/2D fully relativistic electromagnetic PIC codes, as well as a 1D electrostatic code. These codes are self-contained and require only a standard laptop/desktop computer with a C compiler to run. The output files are written in a new file format called ZDF that can be easily read using the supplied routines in a number of languages, such as Python and IDL. The code suite also includes a number of example problems that can be used to illustrate several textbook and advanced plasma mechanisms, including instructions for parameter space exploration. We also invite contributions to this repository of test problems, which will be made freely available to the community provided the input files comply with the format defined by the ZPIC team. The code suite is freely available and hosted on GitHub at https://github.com/zambzamb/zpic. Work partially supported by PICKSC.
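
    For readers new to the method, the sketch below shows the core loop that an electrostatic PIC code of this kind iterates: deposit charge on a grid, solve Poisson's equation for the field, gather the field at the particle positions, and push the particles. It is a minimal 1D periodic example in normalized units, written here in Python for brevity; it illustrates the scheme only and is not ZPIC's actual code or API.

      import numpy as np

      ng, npart, L, dt = 64, 10000, 2 * np.pi, 0.1
      dx = L / ng
      rng = np.random.default_rng(0)
      x = rng.uniform(0, L, npart)              # particle positions
      v = 0.1 * rng.standard_normal(npart)      # particle velocities
      q = -L / npart                            # electron macro-charge (mean density -1)

      for step in range(100):
          # 1. Deposit charge on the grid (nearest-grid-point weighting),
          #    plus a uniform neutralizing ion background.
          rho = np.bincount((x / dx).astype(int) % ng, minlength=ng) * q / dx
          rho += 1.0
          # 2. Solve Poisson's equation d2(phi)/dx2 = -rho with an FFT.
          k = 2 * np.pi * np.fft.fftfreq(ng, d=dx)
          k[0] = 1.0                            # avoid 0/0; mean field set to zero below
          phi_k = np.fft.fft(rho) / k**2
          phi_k[0] = 0.0
          E = np.real(np.fft.ifft(-1j * k * phi_k))   # E = -d(phi)/dx
          # 3. Gather the field at the particles and push them (leapfrog).
          Ep = E[(x / dx).astype(int) % ng]
          v += -Ep * dt                         # electron charge-to-mass ratio = -1
          x = (x + v * dt) % L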

  8. Hybrid Enhanced Epidermal SpaceSuit Design Approaches

    Science.gov (United States)

    Jessup, Joseph M.

    A space suit that does not rely on gas pressurization poses a multi-faceted design problem that requires major stability controls to be incorporated during design and construction. The Hybrid Epidermal Enhancement (HEE) space suit concept integrates evolved human anthropomorphic and physiological adaptations into its functionality, using commercially available bio-medical technologies to address the shortcomings of conventional gas pressure suits and the impracticalities of mechanical counter-pressure (MCP) suits. The prototype HEE space suit explored integumentary homeostasis, thermal control and mobility using advanced bio-medical materials technology and construction concepts. The goal was a space suit that functions as an enhanced, multi-functional bio-mimic of the human epidermal layer and works in attunement with the wearer rather than as a separate system. In addressing human physiological requirements for the design and construction of the HEE suit, testing regimes were devised and integrated into the prototype, which was then subjected to a series of detailed tests using both anatomical reproduction methods and a human subject.

  9. BioWord: A sequence manipulation suite for Microsoft Word

    Directory of Open Access Journals (Sweden)

    Anzaldi Laura J

    2012-06-01

    Background The ability to manipulate, edit and process DNA and protein sequences has rapidly become a necessary skill for practicing biologists across a wide swath of disciplines. In spite of this, most everyday sequence manipulation tools are distributed across several programs and web servers, sometimes requiring installation and typically involving frequent switching between applications. To address this problem, here we have developed BioWord, a macro-enabled self-installing template for Microsoft Word documents that integrates an extensive suite of DNA and protein sequence manipulation tools. Results BioWord is distributed as a single macro-enabled template that self-installs with a single click. After installation, BioWord will open as a tab in the Office ribbon. Biologists can then easily manipulate DNA and protein sequences using a familiar interface and minimize the need to switch between applications. Beyond simple sequence manipulation, BioWord integrates functionality ranging from dyad search and consensus logos to motif discovery and pair-wise alignment. Written in Visual Basic for Applications (VBA as an open source, object-oriented project, BioWord allows users with varying programming experience to expand and customize the program to better meet their own needs. Conclusions BioWord integrates a powerful set of tools for biological sequence manipulation within a handy, user-friendly tab in a widely used word processing software package. The use of a simple scripting language and an object-oriented scheme facilitates customization by users and provides a very accessible educational platform for introducing students to basic bioinformatics algorithms.

  10. BioWord: A sequence manipulation suite for Microsoft Word

    Science.gov (United States)

    2012-01-01

    Background The ability to manipulate, edit and process DNA and protein sequences has rapidly become a necessary skill for practicing biologists across a wide swath of disciplines. In spite of this, most everyday sequence manipulation tools are distributed across several programs and web servers, sometimes requiring installation and typically involving frequent switching between applications. To address this problem, here we have developed BioWord, a macro-enabled self-installing template for Microsoft Word documents that integrates an extensive suite of DNA and protein sequence manipulation tools. Results BioWord is distributed as a single macro-enabled template that self-installs with a single click. After installation, BioWord will open as a tab in the Office ribbon. Biologists can then easily manipulate DNA and protein sequences using a familiar interface and minimize the need to switch between applications. Beyond simple sequence manipulation, BioWord integrates functionality ranging from dyad search and consensus logos to motif discovery and pair-wise alignment. Written in Visual Basic for Applications (VBA) as an open source, object-oriented project, BioWord allows users with varying programming experience to expand and customize the program to better meet their own needs. Conclusions BioWord integrates a powerful set of tools for biological sequence manipulation within a handy, user-friendly tab in a widely used word processing software package. The use of a simple scripting language and an object-oriented scheme facilitates customization by users and provides a very accessible educational platform for introducing students to basic bioinformatics algorithms. PMID:22676326

  11. BioWord: a sequence manipulation suite for Microsoft Word.

    Science.gov (United States)

    Anzaldi, Laura J; Muñoz-Fernández, Daniel; Erill, Ivan

    2012-06-07

    The ability to manipulate, edit and process DNA and protein sequences has rapidly become a necessary skill for practicing biologists across a wide swath of disciplines. In spite of this, most everyday sequence manipulation tools are distributed across several programs and web servers, sometimes requiring installation and typically involving frequent switching between applications. To address this problem, here we have developed BioWord, a macro-enabled self-installing template for Microsoft Word documents that integrates an extensive suite of DNA and protein sequence manipulation tools. BioWord is distributed as a single macro-enabled template that self-installs with a single click. After installation, BioWord will open as a tab in the Office ribbon. Biologists can then easily manipulate DNA and protein sequences using a familiar interface and minimize the need to switch between applications. Beyond simple sequence manipulation, BioWord integrates functionality ranging from dyad search and consensus logos to motif discovery and pair-wise alignment. Written in Visual Basic for Applications (VBA) as an open source, object-oriented project, BioWord allows users with varying programming experience to expand and customize the program to better meet their own needs. BioWord integrates a powerful set of tools for biological sequence manipulation within a handy, user-friendly tab in a widely used word processing software package. The use of a simple scripting language and an object-oriented scheme facilitates customization by users and provides a very accessible educational platform for introducing students to basic bioinformatics algorithms.
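
    To give a flavor of the kind of everyday operations such a suite bundles, the sketch below implements two of them, reverse complementation and a majority-rule consensus, in Python (BioWord itself is written in VBA; these stand-alone functions are illustrative equivalents, not BioWord's code).

      from collections import Counter

      COMPLEMENT = str.maketrans("ACGTacgt", "TGCAtgca")

      def reverse_complement(seq):
          """Reverse-complement a DNA sequence."""
          return seq.translate(COMPLEMENT)[::-1]

      def consensus(seqs):
          """Majority-rule consensus of equal-length aligned sequences."""
          return "".join(Counter(col).most_common(1)[0][0] for col in zip(*seqs))

      print(reverse_complement("ATGCGT"))            # ACGCAT
      print(consensus(["ATGC", "ATGA", "ATGC"]))     # ATGC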

  12. A Secure Communication Suite for Underwater Acoustic Sensor Networks

    Directory of Open Access Journals (Sweden)

    Angelica Lo Duca

    2012-11-01

    In this paper we describe a security suite for Underwater Acoustic Sensor Networks comprising both fixed and mobile nodes. The security suite is composed of a secure routing protocol and a set of cryptographic primitives aimed at protecting the confidentiality and the integrity of underwater communication while taking into account the unique characteristics and constraints of the acoustic channel. By means of experiments and simulations based on real data, we show that the suite is suitable for an underwater networking environment as it introduces limited, and sometimes negligible, communication and power consumption overhead.
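
    The combination of confidentiality and integrity protection the abstract describes is what an authenticated-encryption primitive provides. The Python sketch below illustrates the idea generically with AES-GCM from the cryptography package; the paper's own suite uses primitives tailored to the acoustic channel, so this shows the goal, not their protocol, and the packet fields are invented.

      import os
      from cryptography.hazmat.primitives.ciphers.aead import AESGCM

      key = AESGCM.generate_key(bit_length=128)
      aead = AESGCM(key)

      header = b"src=3,dst=7,seq=42"        # routed in the clear, but authenticated
      payload = b"sensor reading: 17.3 C"
      nonce = os.urandom(12)                # must never repeat for a given key

      # Encrypt the payload; the header is bound in as associated data.
      packet = nonce + aead.encrypt(nonce, payload, header)

      # Receiver side: tampering with header or ciphertext raises InvalidTag.
      recovered = aead.decrypt(packet[:12], packet[12:], header)
      assert recovered == payload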

  13. Software for Probabilistic Risk Reduction

    Science.gov (United States)

    Hensley, Scott; Michel, Thierry; Madsen, Soren; Chapin, Elaine; Rodriguez, Ernesto

    2004-01-01

    A computer program implements a methodology, denoted probabilistic risk reduction, that is intended to aid in planning the development of complex software and/or hardware systems. This methodology integrates two complementary prior methodologies: (1) that of probabilistic risk assessment and (2) a risk-based planning methodology, implemented in a prior computer program known as Defect Detection and Prevention (DDP), in which multiple requirements and the beneficial effects of risk-mitigation actions are taken into account. The present methodology and software are able to accommodate both process knowledge (notably of the efficacy of development practices) and product knowledge (notably of the logical structure of a system whose development one seeks to plan). Estimates of the costs and benefits of a planned development can be derived. Functional and non-functional aspects of software can be taken into account, and trades made among them. The planning process can thus be optimized, in the sense that one can select the suite of process steps and design choices that maximizes the expectation of success while remaining within budget.
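
    A toy version of the selection problem described above can be written in a few lines: given failure modes with likelihoods and impacts, and mitigation actions with costs and efficacies, choose the subset of actions that minimizes expected residual risk within a budget. The Python sketch below uses invented numbers and exhaustive search purely for illustration; it is not the DDP tool's algorithm.

      from itertools import combinations

      risks = {"req_defect": (0.30, 100.0),   # failure mode -> (likelihood, impact)
               "mem_leak": (0.20, 60.0)}
      actions = {  # action -> (cost, {failure mode: fraction of likelihood removed})
          "inspections": (10.0, {"req_defect": 0.5}),
          "static_analysis": (8.0, {"mem_leak": 0.7}),
          "stress_testing": (12.0, {"mem_leak": 0.4, "req_defect": 0.1}),
      }
      budget = 20.0

      def residual_risk(chosen):
          # Expected loss after applying the chosen mitigations (assumed independent).
          total = 0.0
          for mode, (p, impact) in risks.items():
              for a in chosen:
                  p *= 1.0 - actions[a][1].get(mode, 0.0)
              total += p * impact
          return total

      best = min((subset for r in range(len(actions) + 1)
                  for subset in combinations(actions, r)
                  if sum(actions[a][0] for a in subset) <= budget),
                 key=residual_risk)
      print(best, residual_risk(best))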

  14. Development of a software tool for the management of quality control in a helical tomotherapy unit; Desarrollo de una herramienta de software para la gestion integral del control de calidad en una unidad de tomoterapia helicoidal

    Energy Technology Data Exchange (ETDEWEB)

    Garcia Repiso, S.; Hernandez Rodriguez, J.; Martin Rincon, C.; Ramos Pacho, J. A.; Verde Velasco, J. M.; Delgado Aparacio, J. M.; Perez Alvarez, M. e.; Gomez Gonzalez, N.; Cons Perez, V.; Saez Beltran, M.

    2013-07-01

    The large amount of data and information managed in the quality control tests of an external radiotherapy unit makes it necessary to use tools that facilitate, on the one hand, the management of measurements and results in real time and, on the other, the filing, querying and reporting of stored data. This paper presents an in-house software application used for the integral management of a helical TomoTherapy unit in the aspects related to the roles and responsibilities of the hospital Radiophysics department. (Author)

  15. Formal methods in software development: A road less travelled

    Directory of Open Access Journals (Sweden)

    John A van der Poll

    2010-08-01

    An integration of traditional verification techniques and formal specifications in software engineering is presented. Advocates of such techniques claim that mathematical formalisms allow them to produce quality, verifiably correct, or at least highly dependable software, and that the testing and maintenance phases are shortened. Critics, on the other hand, maintain that software formalisms are hard to master, tedious to use and not well suited to the fast turnaround times demanded by industry. In this paper some popular formalisms and the advantages of using them during the early phases of the software development life cycle are presented. Employing the Floyd-Hoare verification principles during the formal specification phase facilitates reasoning about the properties of a specification. Some observations that may help to alleviate the formal-methods controversy are established, and a number of formal-methods successes are presented. Possible conditions for an increased acceptance of formalisms in software development are discussed.
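
    For readers unfamiliar with the notation, Floyd-Hoare reasoning works with triples {P} S {Q}: if precondition P holds and statement S terminates, then postcondition Q holds. A minimal example in LaTeX, using the assignment axiom and the rule of consequence (the standard textbook forms, not anything specific to this paper):

      \[
      \{\,x = n\,\}\;\; x := x + 1 \;\;\{\,x = n + 1\,\}
      \]
      \[
      \frac{P \Rightarrow P' \qquad \{P'\}\; S\; \{Q'\} \qquad Q' \Rightarrow Q}
           {\{P\}\; S\; \{Q\}}
      \quad\text{(consequence)}
      \]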

  16. Integrating a flexible modeling framework (FMF) with the network security assessment instrument to reduce software security risk

    Science.gov (United States)

    Gilliam, D. P.; Powell, J. D.

    2002-01-01

    This paper presents a portion of an overall research project on the generation of the network security assessment instrument to aid developers in assessing and assuring the security of software in the development and maintenance lifecycles.

  17. Application of an integrated multi-criteria decision making AHP-TOPSIS methodology for ETL software selection.

    Science.gov (United States)

    Hanine, Mohamed; Boutkhoum, Omar; Tikniouine, Abdessadek; Agouti, Tarik

    2016-01-01

    A range of ETL (Extract, Transform and Load) software is available today, constituting a major investment market. Each ETL tool uses its own techniques for extracting, transforming and loading data into a data warehouse, which makes evaluating ETL software very difficult. Yet choosing the right ETL software is critical to the success or failure of any Business Intelligence project. As there are many factors impacting the selection of ETL software, the selection process can be treated as a complex multi-criteria decision making (MCDM) problem. In this study, a decision-making methodology that employs two well-known MCDM techniques, the Analytic Hierarchy Process (AHP) and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), is designed. The aim of using AHP is to analyze the structure of the ETL software selection problem and obtain weights for the selected criteria; TOPSIS is then used to calculate the alternatives' ratings. An example is given to illustrate the proposed methodology. Finally, a software prototype demonstrating both methods is implemented.
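
    The two-stage pipeline is compact enough to sketch end to end: derive criteria weights from an AHP pairwise-comparison matrix (here via the common geometric-mean approximation of the principal eigenvector), then rank alternatives with TOPSIS. All matrices below are invented example data, not figures from the study.

      import numpy as np

      # AHP: pairwise comparisons of 3 criteria (invented judgments).
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])
      w = np.prod(A, axis=1) ** (1 / A.shape[0])   # geometric-mean approximation
      w /= w.sum()                                  # normalized criteria weights

      # TOPSIS: rows are ETL alternatives, columns are criteria scores.
      X = np.array([[7.0, 3.0, 4.0],
                    [6.0, 5.0, 5.0],
                    [8.0, 2.0, 6.0]])
      R = X / np.linalg.norm(X, axis=0)             # vector-normalize each criterion
      V = R * w                                     # weighted normalized matrix
      ideal, anti = V.max(axis=0), V.min(axis=0)    # assuming all criteria are benefits
      d_pos = np.linalg.norm(V - ideal, axis=1)
      d_neg = np.linalg.norm(V - anti, axis=1)
      closeness = d_neg / (d_pos + d_neg)           # TOPSIS closeness coefficient
      print("ranking (best first):", np.argsort(-closeness))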

  18. Performance evaluation of multi-stratum resources integration based on network function virtualization in software defined elastic data center optical interconnect.

    Science.gov (United States)

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; Tian, Rui; Han, Jianrui; Lee, Young

    2015-11-30

    Data center interconnection with elastic optical networks is a promising scenario for meeting the high burstiness and high-bandwidth requirements of data center services. In our previous work, we implemented multi-stratum resilience between IP and elastic optical networks to accommodate data center services. This study extends that work to consider resource integration that breaks the limits of individual network devices, which can enhance resource utilization. We propose a novel multi-stratum resources integration (MSRI) architecture based on network function virtualization in a software defined elastic data center optical interconnect. A resource integrated mapping (RIM) scheme for MSRI is introduced in the proposed architecture. MSRI can accommodate data center services through resource integration when a single function or resource is too scarce to provision the services, and it enables globally integrated optimization of optical network and application resources. The overall feasibility and efficiency of the proposed architecture are experimentally verified on the control plane of an OpenFlow-based enhanced software defined networking (eSDN) testbed. The performance of the RIM scheme under a heavy traffic load scenario is also quantitatively evaluated on the MSRI architecture in terms of path blocking probability, provisioning latency and resource utilization, and compared with other provisioning schemes.

  19. Final Report- "An Algorithmic and Software Framework for Applied Partial Differential Equations (APDEC): A DOE SciDAC Integrated Software Infrastructure Center (ISIC)

    Energy Technology Data Exchange (ETDEWEB)

    Elbridge Gerry Puckett

    2008-05-13

    Before this he had been a Deputy Section Head at the National Center for Atmospheric Research in Colorado. My understanding is that Chris Algieri is the first person that Bill hired after coming to LBNL. The plan is that Chris Algieri will finish his PhD thesis while employed as a staff scientist in Bill's group. Both Sarah and Chris were supported in part with funds from DE-FC02-01ER25473. In Sarah's case she received support both while at U.C. Davis (UCD) taking classes and writing an MS thesis and during some of the time she was living in Berkeley, working at LBNL and finishing her PhD thesis. In Chris' case he was at U.C. Davis during the entire time he received support from DE-FC02-01ER25473. More specific details of their work are included in the report below. Finally, my own research conducted under the auspices of DE-FC02-01ER25473 either involved direct collaboration with researchers at LBNL - Phil Colella and Peter Schwartz, a member of Phil's Applied Numerical Algorithms Group - or was on problems closely related to research that has been and continues to be conducted by researchers at LBNL. Specific details of this work can be found below. Finally, I would like to note that the work conducted by my students and me under the auspices of this contract is closely related to work that I have performed with funding from my DOE MICS contract DE-FC02-03ER25579, 'Development of High-Order Accurate Interface Tracking Algorithms and Improved Constitutive Models for Problems in Continuum Mechanics with Applications to Jetting', and with my CoPI on that grant, Professor Greg Miller of the Department of Applied Science at UCD. In theory I tried to use funds from the SciDAC grant DE-FC02-01ER25473 to support work that directly involved implementing algorithms developed by my research group at U.C. Davis in software that was developed and is maintained by my SciDAC CoPIs at LBNL.

  20. Results from Carbon Dioxide Washout Testing Using a Suited Manikin Test Apparatus with a Space Suit Ventilation Test Loop

    Science.gov (United States)

    Chullen, Cinda; Conger, Bruce; McMillin, Summer; Vonau, Walt; Kanne, Bryan; Korona, Adam; Swickrath, Mike

    2016-01-01

    NASA is developing an advanced portable life support system (PLSS) to meet the needs of a new NASA advanced space suit. The PLSS is one of the most critical aspects of the space suit providing the necessary oxygen, ventilation, and thermal protection for an astronaut performing a spacewalk. The ventilation subsystem in the PLSS must provide sufficient carbon dioxide (CO2) removal and ensure that the CO2 is washed away from the oronasal region of the astronaut. CO2 washout is a term used to describe the mechanism by which CO2 levels are controlled within the helmet to limit the concentration of CO2 inhaled by the astronaut. Accumulation of CO2 in the helmet or throughout the ventilation loop could cause the suited astronaut to experience hypercapnia (excessive carbon dioxide in the blood). A suited manikin test apparatus (SMTA) integrated with a space suit ventilation test loop was designed, developed, and assembled at NASA in order to experimentally validate adequate CO2 removal throughout the PLSS ventilation subsystem and to quantify CO2 washout performance under various conditions. The test results from this integrated system will be used to validate analytical models and augment human testing. This paper presents the system integration of the PLSS ventilation test loop with the SMTA including the newly developed regenerative Rapid Cycle Amine component used for CO2 removal and tidal breathing capability to emulate the human. The testing and analytical results of the integrated system are presented along with future work.

  1. Dedicated algorithm and software for the integrated analysis of AC and DC electrical outputs of piezoelectric vibration energy harvesters

    International Nuclear Information System (INIS)

    Kim, Jae Eum

    2014-01-01

    DC electrical outputs of a piezoelectric vibration energy harvester with nonlinear rectifying circuitry can hardly be obtained either by any of the mathematical models developed so far or by finite element analysis. To address the issue, this work used an equivalent electrical circuit model and newly developed an algorithm to efficiently identify the relevant circuit parameters of arbitrarily-shaped cantilevered piezoelectric energy harvesters. The developed algorithm was then realized as a dedicated software module by adopting the ANSYS finite element analysis software for parameter identification and the Tcl/Tk programming language for a graphical user interface and linkage with ANSYS. For verification, various AC electrical outputs computed by the developed software were compared with those from traditional finite element analysis. DC electrical outputs through rectifying circuitry were also examined for varying values of the smoothing capacitance and load resistance.
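
    The AC side of such an analysis reduces, in the standard equivalent-circuit picture, to a current source in parallel with the device capacitance feeding the load. The Python sketch below sweeps the load resistance at a fixed excitation frequency and recovers the familiar optimum near R = 1/(w*Cp); every parameter name and value here is a placeholder assumption, not data from the paper.

      import numpy as np

      # Piezo device as a phasor current source i = j*w*Theta*X in parallel
      # with the device capacitance Cp, feeding a resistive load R.
      Theta, Cp, X = 1e-3, 20e-9, 1e-4     # coupling (C/m), capacitance (F), tip amplitude (m)
      R = np.logspace(3, 7, 200)           # load resistance sweep (ohm)
      w = 2 * np.pi * 60.0                 # excitation frequency (rad/s)

      i_src = 1j * w * Theta * X           # phasor source current
      V = i_src * R / (1 + 1j * w * R * Cp)    # current source into R parallel Cp
      P = 0.5 * np.abs(V) ** 2 / R             # average power delivered to the load

      R_opt = R[np.argmax(P)]
      print(f"optimal load {R_opt:.3g} ohm (theory 1/(w*Cp) = {1 / (w * Cp):.3g} ohm)")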

  2. Dedicated algorithm and software for the integrated analysis of AC and DC electrical outputs of piezoelectric vibration energy harvesters

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Eum [Catholic University of Daegu, Gyeongsan (Korea, Republic of)

    2014-10-15

    DC electrical outputs of a piezoelectric vibration energy harvester with nonlinear rectifying circuitry can hardly be obtained either by any of the mathematical models developed so far or by finite element analysis. To address the issue, this work used an equivalent electrical circuit model and newly developed an algorithm to efficiently identify the relevant circuit parameters of arbitrarily-shaped cantilevered piezoelectric energy harvesters. The developed algorithm was then realized as a dedicated software module by adopting the ANSYS finite element analysis software for parameter identification and the Tcl/Tk programming language for a graphical user interface and linkage with ANSYS. For verification, various AC electrical outputs computed by the developed software were compared with those from traditional finite element analysis. DC electrical outputs through rectifying circuitry were also examined for varying values of the smoothing capacitance and load resistance.

  3. Integrating software reliability concepts into risk and reliability modeling of digital instrumentation and control systems used in nuclear power plants

    International Nuclear Information System (INIS)

    Arndt, S. A.

    2006-01-01

    As software-based digital systems are becoming more and more common in all aspects of industrial process control, including the nuclear power industry, it is vital that the current state of the art in quality, reliability, and safety analysis be advanced to support the quantitative review of these systems. Several research groups throughout the world are working on the development and assessment of software-based digital system reliability methods and their applications in the nuclear power, aerospace, transportation, and defense industries. However, these groups are hampered by the fact that software experts and probabilistic safety assessment experts view reliability engineering very differently. This paper discusses the characteristics of a common vocabulary and modeling framework. (authors)

  4. The Variable Vector Countermeasure Suit (V2Suit) for Space Habitation and Exploration

    Directory of Open Access Journals (Sweden)

    Kevin R Duda

    2015-04-01

    The Variable Vector Countermeasure Suit (V2Suit) for Space Habitation and Exploration is a novel system concept that provides a platform for integrating sensors and actuators with daily astronaut intravehicular activities to improve health and performance, while reducing the mass and volume of the physiologic adaptation countermeasure systems, as well as the required exercise time, during long-duration space exploration missions. The V2Suit system leverages wearable kinematic monitoring technology and uses inertial measurement units (IMUs) and control moment gyroscopes (CMGs) within miniaturized modules placed on body segments to provide a viscous resistance during movements against a specified direction of "down" – initially as a countermeasure to the sensorimotor adaptation performance decrements that manifest themselves while living and working in microgravity and during gravitational transitions during long-duration spaceflight, including post-flight recovery and rehabilitation. Several aspects of the V2Suit system concept were explored and simulated prior to developing a brassboard prototype for technology demonstration. This included a system architecture identifying the key components and their interconnects, initial identification of key human-system integration challenges, development of a simulation architecture for CMG selection and parameter sizing, and the detailed mechanical design and fabrication of a module. The brassboard prototype demonstrates closed-loop control from "down" initialization through CMG actuation, and provides a research platform for human performance evaluations to mitigate sensorimotor adaptation, as well as a tool for determining the performance requirements when used as a musculoskeletal deconditioning countermeasure. This type of countermeasure system also has Earth benefits, particularly in gait or movement stabilization and rehabilitation.
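
    The control idea, viscous resistance against a tracked "down" direction, can be stated in a few lines: project the IMU-measured angular rate of a body segment onto the down vector and command an opposing torque proportional to that component. The Python sketch below is a guess at the general form under stated assumptions (the gain value, frames, and signal names are all invented); it is not the V2Suit control law.

      import numpy as np

      def resistive_torque(omega, down, b=0.05):
          """omega: segment angular rate from the IMU (rad/s, body frame);
          down: tracked "down" direction (body frame); b: viscous gain (N*m*s/rad)."""
          down = down / np.linalg.norm(down)
          omega_along_down = np.dot(omega, down) * down   # component to resist
          return -b * omega_along_down                    # CMG torque command

      tau = resistive_torque(np.array([0.2, -0.1, 0.4]), np.array([0.0, 0.0, 1.0]))
      print(tau)    # opposes only the rotation component along "down"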

  5. Simulation software support (S3) system: a software testing and debugging tool

    International Nuclear Information System (INIS)

    Burgess, D.C.; Mahjouri, F.S.

    1990-01-01

    The largest percentage of technical effort in the software development process is accounted for by debugging and testing. It is not unusual for a software development organization to spend over 50% of the total project effort on testing. In the extreme, testing of human-rated software (e.g., nuclear reactor monitoring, training simulators) can cost three to five times as much as all other software engineering steps combined. The Simulation Software Support (S3) System, developed by the Link-Miles Simulation Corporation, is ideally suited for real-time simulation applications which involve a large database with models programmed in FORTRAN. This paper focuses on the testing elements of the S3 system. System support software utilities are provided which enable the loading and execution of modules in the development environment. These elements include the Linking/Loader (LLD) for dynamically linking program modules and loading them into memory, and the interactive executive (IEXEC) for controlling the execution of the modules. Features of the Interactive Symbolic Debugger (SD) and the Real Time Executive (RTEXEC) that support unit and integrated testing are explored

  6. Systematic Integration of Innovation in Process Improvement Projects Using the Enhanced Sigma-TRIZ Algorithm and Its Effective Use by Means of a Knowledge Management Software Platform

    Directory of Open Access Journals (Sweden)

    Mircea FULEA

    2009-01-01

    In an evolving, highly turbulent and uncertain socio-economic environment, organizations must consider strategies of systematic and continuous integration of innovation within their business systems, as a fundamental condition for sustainable development. Adequate methodologies are required in this respect. A mature framework for integrating innovative problem solving approaches within business process improvement methodologies is proposed in this paper. It considers a TRIZ-centred algorithm in the improvement phase of the DMAIC methodology. The new tool is called enhanced sigma-TRIZ. A case study reveals the practical application of the proposed methodology. The integration of enhanced sigma-TRIZ within a knowledge management software platform (KMSP) is further described. Specific developments to support processes of knowledge creation, knowledge storage and retrieval, knowledge transfer and knowledge application in a friendly and effective way within the KMSP are also highlighted.

  7. Pragmatic Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan; Jensen, Rikke Hagensby

    2014-01-01

    We understand software innovation as concerned with introducing innovation into the development of software intensive systems, i.e. systems in which software development and/or integration are dominant considerations. Innovation is key in almost any strategy for competitiveness in existing markets, for creating new markets, or for curbing rising public expenses, and software intensive systems are core elements in most such strategies. Software innovation therefore is vital for about every sector of the economy. Changes in software technologies over the last decades have opened up for experimentation, learning, and flexibility in ongoing software projects, but how can this change be used to facilitate software innovation? How can a team systematically identify and pursue opportunities to create added value in ongoing projects? In this paper, we describe Deweyan pragmatism as the philosophical foundation...

  8. Integrative structure modeling with the Integrative Modeling Platform.

    Science.gov (United States)

    Webb, Benjamin; Viswanath, Shruthi; Bonomi, Massimiliano; Pellarin, Riccardo; Greenberg, Charles H; Saltzberg, Daniel; Sali, Andrej

    2018-01-01

    Building models of a biological system that are consistent with the myriad data available is one of the key challenges in biology. Modeling the structure and dynamics of macromolecular assemblies, for example, can give insights into how biological systems work, evolved, might be controlled, and even designed. Integrative structure modeling casts the building of structural models as a computational optimization problem, for which information about the assembly is encoded into a scoring function that evaluates candidate models. Here, we describe our open source software suite for integrative structure modeling, Integrative Modeling Platform (https://integrativemodeling.org), and demonstrate its use. © 2017 The Protein Society.
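
    The casting of model building as optimization of a scoring function can be illustrated generically: encode the available information as restraints, score candidate configurations by their violations, and search for low-scoring configurations. The Python sketch below does this with three invented distance restraints and simulated annealing; it illustrates the concept only and does not use IMP's actual API.

      import math, random

      # restraints: (particle i, particle j, target distance) - invented example data
      restraints = [(0, 1, 5.0), (1, 2, 5.0), (0, 2, 8.0)]
      coords = [[random.uniform(-10, 10) for _ in range(3)] for _ in range(3)]

      def score(xyz):
          """Sum of squared violations of the distance restraints."""
          return sum((math.dist(xyz[i], xyz[j]) - d0) ** 2 for i, j, d0 in restraints)

      temperature = 1.0
      current = score(coords)
      for step in range(20000):
          i, k = random.randrange(3), random.randrange(3)
          old = coords[i][k]
          coords[i][k] += random.gauss(0, 0.3)       # propose a small move
          candidate = score(coords)
          # Metropolis criterion: always accept downhill, sometimes uphill.
          if candidate <= current or random.random() < math.exp((current - candidate) / temperature):
              current = candidate
          else:
              coords[i][k] = old                     # reject the move
          temperature = max(0.01, temperature * 0.9997)
      print("final score:", round(current, 4))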

  9. A coherent environment of software improvement tools for CMS

    International Nuclear Information System (INIS)

    Eulisse, G.; Muzaffar, S.; Osborne, I.; Taylor, L.; Tuura, L.A.

    2004-01-01

    CMS has developed approximately one million lines of C++ code and uses many more from HEP, Grid and public domain projects. We describe a suite of tools which help to manage this complexity by measuring software dependencies, quality metrics, and CPU and memory performance. This coherent environment integrates and extends existing open-source tools where possible and provides new in-house components where a suitable solution does not already exist. This is a freely available environment with a graphical user interface which can be run on any software without the need to recompile or instrument it. We have developed ignominy, which performs software dependency analysis of source code, binary products and external software. CPU profiling is provided based on oprofile, with added features such as profile snapshots, distributed profiling and aggregate profiles for farm systems, including server-side tools for collecting profile data. Finally, we have developed a low-overhead performance and memory profiling tool, MemProf, which can perform (gprof-style) hierarchical performance profiling in a way that works with multiple threads and dynamically loaded libraries (unlike gprof). It also gathers exact memory allocation profiles, including which code allocates most, in what sizes of chunks, for how long, where the memory is getting freed and where it is getting leaked. We describe this tool suite and how it has been used to enhance the quality of CMS software.

  10. Evaluation of automated decisionmaking methodologies and development of an integrated robotic system simulation, volume 2, part 1. Appendix A: Software documentation

    Science.gov (United States)

    Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.

    1982-01-01

    Documentation of the preliminary software developed as a framework for a generalized integrated robotic system simulation is presented. The program structure is composed of three major functions controlled by a program executive. The three major functions are: system definition, analysis tools, and post processing. The system definition function handles user input of system parameters and definition of the manipulator configuration. The analysis tools function handles the computational requirements of the program. The post processing function allows for more detailed study of the results of analysis tool function executions. Also documented is the manipulator joint model software to be used as the basis of the manipulator simulation which will be part of the analysis tools capability.

  11. The research of the test-class method based on interface object in the software integration test of the large container inspection system

    International Nuclear Information System (INIS)

    Sun Shaohua; Chen Zhiqiang; Zhang Li; Gao Wenhuan; Kang Kejun

    2000-01-01

    Software testing is an important stage in the software process. Mature theory, methods and models exist in practice for unit testing, but for integration testing there is no regular method to adhere to. The author presents a new method, developed during the development of the large container inspection system, named the test-class method based on interface objects. In this method a set of basic test-classes based on the concept of a class in the object-oriented method is established, and the combination of the interface graph and the class set is used to describe the test process. Strict control and scientific management of the test process are thus achieved. The concept of a test database is introduced in this method, improving the traceability and the repeatability of the test process

  12. The research of the test-class method based on interface object in the software integration test of the large container inspection system

    International Nuclear Information System (INIS)

    Sun Shaohua; Chen Zhiqiang; Zhang Li; Gao Wenhuan; Kang Kejun

    2001-01-01

    Software testing is an important stage in the software process. Mature theory, methods and models exist in practice for unit testing, but for integration testing there is no regular method to adhere to. The author presents a new method, developed during the development of the large container inspection system, named the test-class method based on interface objects. A set of basic test-classes based on the concept of a class in the object-oriented method is established, and the combination of the interface graph and the class set is used to describe the test process. Strict control and scientific management of the test process are thus achieved. The concept of a test database is introduced in this method, improving the traceability and the repeatability of the test process

  13. Metallogenic aspects of Itu intrusive suite

    International Nuclear Information System (INIS)

    Amaral, G.; Pascholati, E.M.

    1990-01-01

    The integrated use of geological, geochemical, geophysical and remote sensing data is providing interesting new information on the metallogenic characteristics of the Itu Intrusive Suite. From World War II up to 1959, a wolframite deposit was mined near the border of the northernmost body (Itupeva Granite). This deposit is formed by greisen veins associated with cassiterite and topaz, clearly linked with later phases of magmatic differentiation. Generally those veins are related to hydrothermal alteration of the granites and the above-mentioned shear zone. U, Th and K determinations by field and laboratory gamma-spectrometry were used for regional distribution analysis of those elements and their ratios and for calculation of radioactive heat production. In this respect, the Itupeva Granite is the hottest and presents several anomalies in the Th/U ratio, indicative of late- or post-magmatic oxidation processes. (author)

  14. Air Vehicle Technology Integration Program (AVTIP) Delivery Order 0015: Open Control Platform (OCP) Software Enabled Control (SEC) Hardware in the Loop Simulation - OCP Hardware Integration

    National Research Council Canada - National Science Library

    Paunicka, James L

    2005-01-01

    ...) project sponsored by the DARPA Software Enabled Control (SEC) Program. The purpose of this project is to develop the capability to be an OCP test-bed and to evaluate the OCP controls and simulation environment for a specific test case...

  15. DIALS: implementation and evaluation of a new integration package.

    Science.gov (United States)

    Winter, Graeme; Waterman, David G; Parkhurst, James M; Brewster, Aaron S; Gildea, Richard J; Gerstel, Markus; Fuentes-Montero, Luis; Vollmar, Melanie; Michels-Clark, Tara; Young, Iris D; Sauter, Nicholas K; Evans, Gwyndaf

    2018-02-01

    The DIALS project is a collaboration between Diamond Light Source, Lawrence Berkeley National Laboratory and CCP4 to develop a new software suite for the analysis of crystallographic X-ray diffraction data, initially encompassing spot finding, indexing, refinement and integration. The design, core algorithms and structure of the software are introduced, alongside results from the analysis of data from biological and chemical crystallography experiments.

  16. A Parametric Model of Shoulder Articulation for Virtual Assessment of Space Suit Fit

    Science.gov (United States)

    Kim, K. Han; Young, Karen S.; Bernal, Yaritza; Boppana, Abhishektha; Vu, Linh Q.; Benson, Elizabeth A.; Jarvis, Sarah; Rajulu, Sudhakar L.

    2016-01-01

    Suboptimal suit fit is a known risk factor for crewmember shoulder injury. Suit fit assessment is, however, prohibitively time-consuming and cannot be generalized across wide variations of body shapes and poses. In this work, we have developed a new design tool based on the statistical analysis of body shape scans. This tool is aimed at predicting the skin deformation and shape variations for any body size and shoulder pose in a target population. This new process, when incorporated into CAD software, will enable virtual suit fit assessments, predictively quantifying the contact volume and the clearance between the suit and the body surface at reduced time and cost.
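
    Statistical body-shape tools of this kind are typically built on principal component analysis of registered scan vertices: a mean shape plus a small number of variation modes that can be recombined to synthesize new bodies. The Python sketch below shows that pipeline on random placeholder data; the dimensions, and the assumption that plain PCA is the underlying model, are illustrative only.

      import numpy as np

      n_subjects, n_vertices = 40, 500
      # Each row is one registered scan flattened to (x, y, z) per vertex.
      scans = np.random.default_rng(1).normal(size=(n_subjects, n_vertices * 3))

      mean_shape = scans.mean(axis=0)
      U, s, Vt = np.linalg.svd(scans - mean_shape, full_matrices=False)
      components, stddev = Vt, s / np.sqrt(n_subjects - 1)   # modes and their spread

      def synthesize(scores):
          """Rebuild a body shape from the first len(scores) PCA mode scores."""
          k = len(scores)
          shape = mean_shape + (np.asarray(scores) * stddev[:k]) @ components[:k]
          return shape.reshape(n_vertices, 3)

      new_shape = synthesize([2.0, -1.0, 0.5])   # e.g. +2 SD along the first mode
      print(new_shape.shape)                      # (500, 3)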

  17. Integrating Visualization Applications, such as ParaView, into HEP Software Frameworks for In-situ Event Displays

    Science.gov (United States)

    Lyon, A. L.; Kowalkowski, J. B.; Jones, C. D.

    2017-10-01

    ParaView is a high performance visualization application not widely used in High Energy Physics (HEP). It is a long-standing open source project led by Kitware and involves several Department of Energy (DOE) and Department of Defense (DOD) laboratories. Furthermore, it has been adopted by many DOE supercomputing centers and other sites. ParaView achieves unique speed and efficiency by using state-of-the-art techniques developed by the academic visualization community that are often not found in applications written by the HEP community. In-situ visualization of events, where event details are visualized during processing/analysis, is a common task for experiment software frameworks. Kitware supplies Catalyst, a library that enables scientific software to serve visualization objects to client ParaView viewers, yielding a real-time event display. Connecting ParaView to the Fermilab art framework will be described and the capabilities it brings discussed.

  18. Basis of Estimate Software Tool (BEST) - a practical solution to part of the cost and schedule integration puzzle

    International Nuclear Information System (INIS)

    Murphy, L.; Bain, P.

    1997-01-01

    The Basis of Estimate Software Tool (BEST) was developed at the Rocky Flats Environmental Technology Site (Rocky Flats) to bridge the gap that exists in conventional project control systems between scheduled activities, their allocated or assigned resources, and the set of assumptions (basis of estimate) that correlate resources and activities. Having a documented and auditable basis of estimate (BOE) is necessary for budget validation, work scope analysis, change control, and a number of related management control functions. The uniqueness of BEST is demonstrated by the manner in which it responds to the diverse needs of the heavily regulated environmental workplace, containing many features not found in conventional off-the-shelf software products. However, even companies operating in relatively unregulated workplaces will find many attractive features in BEST. This product will be of particular interest to current Government contractors and contractors preparing proposals that may require subsequent validation. 2 figs

  19. Integrating Visualization Applications, such as ParaView, into HEP Software Frameworks for In-situ Event Displays

    Energy Technology Data Exchange (ETDEWEB)

    Lyon, A. L. [Fermilab; Kowalkowski, J. B. [Fermilab; Jones, C. D. [Fermilab

    2017-11-22

    ParaView is a high performance visualization application not widely used in High Energy Physics (HEP). It is a long-standing open source project led by Kitware and involves several Department of Energy (DOE) and Department of Defense (DOD) laboratories. Furthermore, it has been adopted by many DOE supercomputing centers and other sites. ParaView achieves unique speed and efficiency by using state-of-the-art techniques developed by the academic visualization community that are often not found in applications written by the HEP community. In-situ visualization of events, where event details are visualized during processing/analysis, is a common task for experiment software frameworks. Kitware supplies Catalyst, a library that enables scientific software to serve visualization objects to client ParaView viewers, yielding a real-time event display. Connecting ParaView to the Fermilab art framework will be described and the capabilities it brings discussed.

  20. The integration of automated knowledge acquisition with computer-aided software engineering for space shuttle expert systems

    Science.gov (United States)

    Modesitt, Kenneth L.

    1990-01-01

    A prediction was made that the terms expert systems and knowledge acquisition would begin to disappear over the next several years. This is not because they are falling into disuse; it is rather that practitioners are realizing that they are valuable adjuncts to software engineering, in terms of problem domains addressed, user acceptance, and in development methodologies. A specific problem was discussed, that of constructing an automated test analysis system for the Space Shuttle Main Engine. In this domain, knowledge acquisition was part of requirements systems analysis, and was performed with the aid of a powerful inductive ESBT in conjunction with a computer aided software engineering (CASE) tool. The original prediction is not a very risky one -- it has already been accomplished.

  1. Safety in the use of pressurized suits

    International Nuclear Information System (INIS)

    1984-01-01

    This Code of Practice describes the procedures relating to the safe operation of Pressurized Suit Areas and their supporting services. It is directed at personnel responsible for the design and/or operation of Pressurized Suit Areas. (author)

  2. Integrating Remote Sensing with Species Distribution Models; Mapping Tamarisk Invasions Using the Software for Assisted Habitat Modeling (SAHM)

    OpenAIRE

    West, Amanda M.; Evangelista, Paul H.; Jarnevich, Catherine S.; Young, Nicholas E.; Stohlgren, Thomas J.; Talbert, Colin; Talbert, Marian; Morisette, Jeffrey; Anderson, Ryan

    2016-01-01

    Early detection of invasive plant species is vital for the management of natural resources and protection of ecosystem processes. The use of satellite remote sensing for mapping the distribution of invasive plants is becoming more common; however, conventional imaging software and classification methods have been shown to be unreliable. In this study, we test and evaluate the use of five species distribution model techniques fit with satellite remote sensing data to map invasive tamarisk (Tama...

  3. IMS2 – An integrated medical software system for early lung cancer detection using ion mobility spectrometry data of human breath

    Directory of Open Access Journals (Sweden)

    Baumbach Jan

    2007-12-01

    IMS2 is an Integrated Medical Software system for the analysis of Ion Mobility Spectrometry (IMS) data. It assists medical staff with the following IMS data processing steps: acquisition, visualization, classification, and annotation. IMS2 provides data analysis and interpretation features on the one hand, and also helps to improve classification by increasing the number of pre-classified datasets on the other. It is designed to facilitate early detection of lung cancer, one of the most common cancer types, with one million deaths each year around the world.

  4. Transdisciplinary integration and interfacing software in mechatronic system for carbon sequestration and harvesting energy in the agricultural soils for rewarding farmers through green certificates

    Science.gov (United States)

    Pop, P. P.; Pop-Vadean, A.; Barz, C.; Latinovic, T.

    2017-01-01

    In this article we present a transdisciplinary approach to carbon sequestration in agricultural soils. The software implements a proposed method to measure the amount of carbon that can be captured for different soil types and different crops. The application integrates an intuitive interface, is portable, and calculates the number of green certificates that reward farmers with financial support for environmental protection. We plan to initiate a scientific approach to environmental protection through financial incentives for agriculture that fits EU rules, by taxing big polluters and rewarding those who maintain an environment suitable for the development of ecological and competitive agriculture.

  5. Integration of auto analysis program of gamma spectrum and software and determination of element content in sample by k-zero method

    International Nuclear Information System (INIS)

    Trinh Quang Vinh; Truong Thi Hong Loan; Mai Van Nhon; Huynh Truc Phuong

    2014-01-01

    Integrating a gamma spectrum auto-analysis program with elemental analysis software based on the k-zero method is an objective for many researchers. This work is the first step in building an auto-analysis program for gamma spectra, which includes modules for reading and displaying spectra, peak energy calibration, spectrum smoothing, peak area calculation and determination of the element content of a sample. The results from measurements of standard samples on a low-level spectrometer with an HPGe detector are then compared to those of other gamma spectrum auto-analysis programs. (author)
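
    Of the modules listed, peak area calculation is easy to sketch: sum the counts under the peak and subtract a linear background interpolated from channels flanking it. The Python example below does this on a synthetic spectrum; the channel ranges, background model and uncertainty estimate are illustrative textbook choices, not the program's actual algorithm.

      import numpy as np

      def net_peak_area(counts, lo, hi, nbg=5):
          """Total counts in channels [lo, hi] minus a linear background
          interpolated from `nbg` channels on each side of the peak."""
          left = counts[lo - nbg:lo].mean()
          right = counts[hi + 1:hi + 1 + nbg].mean()
          width = hi - lo + 1
          background = 0.5 * (left + right) * width      # trapezoidal estimate
          gross = counts[lo:hi + 1].sum()
          net = gross - background
          # Rough Poisson uncertainty: gross counts plus background-estimate variance.
          sigma = np.sqrt(gross + background * width / (2 * nbg))
          return net, sigma

      rng = np.random.default_rng(2)
      spectrum = rng.poisson(50, 1024)                   # flat synthetic continuum
      spectrum[500:511] += rng.poisson(200, 11)          # inject a synthetic peak
      print(net_peak_area(spectrum, 500, 510))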

  6. Data Analysis and Graphing in an Introductory Physics Laboratory: Spreadsheet versus Statistics Suite

    Science.gov (United States)

    Peterlin, Primoz

    2010-01-01

    Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared analysing data collected in three selected experiments taken from an introductory physics laboratory, which include a linear dependence, a nonlinear dependence and a histogram. The merits of each method are compared.

  7. Data analysis and graphing in an introductory physics laboratory: spreadsheet versus statistics suite

    International Nuclear Information System (INIS)

    Peterlin, Primoz

    2010-01-01

    Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared analysing data collected in three selected experiments taken from an introductory physics laboratory, which include a linear dependence, a nonlinear dependence and a histogram. The merits of each method are compared.

  8. OASYS (OrAnge SYnchrotron Suite): an open-source graphical environment for x-ray virtual experiments

    Science.gov (United States)

    Rebuffi, Luca; Sanchez del Rio, Manuel

    2017-08-01

    The evolution of hardware platforms, the modernization of software tools, the access of a large number of young people to the codes, and the popularization of open source software for scientific applications drove us to design OASYS (OrAnge SYnchrotron Suite), a completely new graphical environment for modelling X-ray experiments. The implemented software architecture provides not only an intuitive and very easy-to-use graphical interface, but also high flexibility and rapidity for interactive simulations, making it quick to change configurations and compare multiple beamline layouts. Its purpose is to integrate in a synergetic way the most powerful calculation engines available. OASYS integrates different simulation strategies via the implementation of adequate simulation tools for X-ray optics (e.g. ray tracing and wave optics packages). It provides a language that makes them communicate by sending and receiving encapsulated data. Python has been chosen as the main programming language because of its universality and popularity in scientific computing. The software Orange, developed at the University of Ljubljana (SLO), is the high-level workflow engine that provides the interaction with the user and the communication mechanisms.

  9. Core component integration tests for the back-end software sub-system in the ATLAS data acquisition and event filter prototype -1 project

    International Nuclear Information System (INIS)

    Badescu, E.; Caprini, M.; Niculescu, M.; Radu, A.

    2000-01-01

    The ATLAS data acquisition (DAQ) and Event Filter (EF) prototype -1 project was intended to produce a prototype system for evaluating candidate technologies and architectures for the final ATLAS DAQ system at the LHC accelerator at CERN. Within the prototype project, the back-end sub-system encompasses the software for configuring, controlling and monitoring the DAQ. The back-end sub-system includes core components and detector integration components. The core components provide the basic functionality and had priority in terms of development time-scale, in order to have a baseline sub-system that can be used for integration with the data-flow sub-system and event filter. The following components are considered to be the core of the back-end sub-system: - Configuration databases, describing a large number of parameters of the DAQ system architecture, hardware and software components, running modes and status; - Message reporting system (MRS), which allows all software components to report messages to other components in the distributed environment; - Information service (IS), which allows information exchange between software components; - Process manager (PMG), which performs basic job control of software components (start, stop, monitoring of status); - Run control (RC), which controls the data-taking activities by coordinating the operations of the DAQ sub-systems, back-end software and external systems. Performance and scalability tests have been made for the individual components. The back-end sub-system integration tests bring together all the core components and several trigger/DAQ/detector integration components to simulate the control and configuration of data-taking sessions. A test plan was provided for the back-end integration tests. The tests have been done using a shell script that goes through the following phases: - starting the back-end server processes to initialize communication services and PMG; - launching configuration-specific processes via the DAQ supervisor as
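
    The phased shell-script sequence described above can be pictured as a small driver that runs each phase in order and stops on the first failure; the Python sketch below is a generic illustration with placeholder commands, not the actual ATLAS back-end tooling.

        # Schematic driver for a phased integration test: run each phase in
        # order and stop at the first failure. Commands are echo placeholders,
        # not the real ATLAS back-end tooling.

        import subprocess

        PHASES = [
            ("start back-end servers (IS, MRS, PMG)", ["echo", "servers up"]),
            ("launch configuration-specific processes", ["echo", "launched"]),
            ("run controlled data-taking session", ["echo", "run complete"]),
        ]

        for label, cmd in PHASES:
            print(f"[phase] {label}")
            result = subprocess.run(cmd, capture_output=True, text=True)
            if result.returncode != 0:
                raise RuntimeError(f"phase failed: {label}")
            print(result.stdout.strip())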

  10. Software for microcircuit systems

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1978-10-01

    Modern Large Scale Integration (LSI) microcircuits are meant to be programmed in order to control the function that they perform. The basics of microprogramming and new microcircuits have already been discussed. In this course, the methods of developing software for these microcircuits are explored. This generally requires a package of support software in order to assemble the microprogram, and also some amount of support software to test the microprograms and to test the microprogrammed circuit itself. 15 figures, 2 tables

  11. Software quality management

    International Nuclear Information System (INIS)

    Bishop, D.C.; Pymm, P.

    1991-01-01

    As programmable electronic (software-based) systems are increasingly being proposed as design solutions for high integrity applications in nuclear power stations, the need to adopt suitable quality management arrangements is paramount. The authors describe Scottish Nuclear's strategy for software quality management and, using the main on-line monitoring system at Torness Power Station as an example, explain how this strategy is put into practice. Particular attention is given to the topics of software quality planning and change control. (author)

  12. Event driven software package for the database of Integrated Coastal and Marine Area Management (ICMAM) (Developed in 'C')

    Digital Repository Service at National Institute of Oceanography (India)

    Sadhuram, Y.; Murty, T.V.R.; Chandramouli, P.; Murthy, K.S.R.

    National Institute of Oceanography (NIO, RC, Visakhapatnam, India) had taken up the Integrated Coastal and Marine Area Management (ICMAM) project funded by Department of Ocean Development (DOD), New Delhi, India. The main objective of this project...

  13. Integrating Remote Sensing with Species Distribution Models; Mapping Tamarisk Invasions Using the Software for Assisted Habitat Modeling (SAHM).

    Science.gov (United States)

    West, Amanda M; Evangelista, Paul H; Jarnevich, Catherine S; Young, Nicholas E; Stohlgren, Thomas J; Talbert, Colin; Talbert, Marian; Morisette, Jeffrey; Anderson, Ryan

    2016-10-11

    Early detection of invasive plant species is vital for the management of natural resources and protection of ecosystem processes. The use of satellite remote sensing for mapping the distribution of invasive plants is becoming more common; however, conventional imaging software and classification methods have been shown to be unreliable. In this study, we test and evaluate the use of five species distribution model techniques fit with satellite remote sensing data to map invasive tamarisk (Tamarix spp.) along the Arkansas River in Southeastern Colorado. The models tested included boosted regression trees (BRT), Random Forest (RF), multivariate adaptive regression splines (MARS), generalized linear model (GLM), and Maxent. These analyses were conducted using a newly developed software package called the Software for Assisted Habitat Modeling (SAHM). All models were trained with 499 presence points, 10,000 pseudo-absence points, and predictor variables acquired from the Landsat 5 Thematic Mapper (TM) sensor over an eight-month period to distinguish tamarisk from native riparian vegetation using detection of phenological differences. From the Landsat scenes, we used individual bands and calculated Normalized Difference Vegetation Index (NDVI), Soil-Adjusted Vegetation Index (SAVI), and tasseled cap transformations. All five models identified current tamarisk distribution on the landscape successfully based on threshold-independent and threshold-dependent evaluation metrics with independent location data. To account for model-specific differences, we produced an ensemble of all five models with map output highlighting areas of agreement and areas of uncertainty. Our results demonstrate the usefulness of species distribution models in analyzing remotely sensed data and the utility of ensemble mapping, and showcase the capability of SAHM in pre-processing and executing multiple complex models.

  14. Integrating remote sensing with species distribution models; Mapping tamarisk invasions using the Software for Assisted Habitat Modeling (SAHM)

    Science.gov (United States)

    West, Amanda M.; Evangelista, Paul H.; Jarnevich, Catherine S.; Young, Nicholas E.; Stohlgren, Thomas J.; Talbert, Colin; Talbert, Marian; Morisette, Jeffrey; Anderson, Ryan

    2016-01-01

    Early detection of invasive plant species is vital for the management of natural resources and protection of ecosystem processes. The use of satellite remote sensing for mapping the distribution of invasive plants is becoming more common; however, conventional imaging software and classification methods have been shown to be unreliable. In this study, we test and evaluate the use of five species distribution model techniques fit with satellite remote sensing data to map invasive tamarisk (Tamarix spp.) along the Arkansas River in Southeastern Colorado. The models tested included boosted regression trees (BRT), Random Forest (RF), multivariate adaptive regression splines (MARS), generalized linear model (GLM), and Maxent. These analyses were conducted using a newly developed software package called the Software for Assisted Habitat Modeling (SAHM). All models were trained with 499 presence points, 10,000 pseudo-absence points, and predictor variables acquired from the Landsat 5 Thematic Mapper (TM) sensor over an eight-month period to distinguish tamarisk from native riparian vegetation using detection of phenological differences. From the Landsat scenes, we used individual bands and calculated Normalized Difference Vegetation Index (NDVI), Soil-Adjusted Vegetation Index (SAVI), and tasseled cap transformations. All five models identified current tamarisk distribution on the landscape successfully based on threshold-independent and threshold-dependent evaluation metrics with independent location data. To account for model-specific differences, we produced an ensemble of all five models with map output highlighting areas of agreement and areas of uncertainty. Our results demonstrate the usefulness of species distribution models in analyzing remotely sensed data and the utility of ensemble mapping, and showcase the capability of SAHM in pre-processing and executing multiple complex models.
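
    The NDVI and SAVI predictors named in these two records are simple functions of the red and near-infrared bands; a short sketch of the standard formulas, with invented reflectance values, follows.

        # Sketch: the two vegetation indices used as predictors above, computed
        # from red and near-infrared reflectance. Example values are invented.

        def ndvi(nir, red):
            return (nir - red) / (nir + red)

        def savi(nir, red, L=0.5):
            # L is a soil-brightness correction factor, commonly 0.5
            return (nir - red) / (nir + red + L) * (1 + L)

        nir, red = 0.45, 0.10  # reflectances of one Landsat TM pixel (invented)
        print(f"NDVI = {ndvi(nir, red):.3f}, SAVI = {savi(nir, red):.3f}")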

  15. Integration of interactive three-dimensional image post-processing software into undergraduate radiology education effectively improves diagnostic skills and visual-spatial ability.

    Science.gov (United States)

    Rengier, Fabian; Häfner, Matthias F; Unterhinninghofen, Roland; Nawrotzki, Ralph; Kirsch, Joachim; Kauczor, Hans-Ulrich; Giesel, Frederik L

    2013-08-01

    Integrating interactive three-dimensional post-processing software into undergraduate radiology teaching might be a promising approach to synergistically improve both visual-spatial ability and radiological skills, thereby reducing students' deficiencies in image interpretation. The purpose of this study was to test our hypothesis that a hands-on radiology course for medical students using interactive three-dimensional image post-processing software improves radiological knowledge, diagnostic skills and visual-spatial ability. A hands-on radiology course was developed using interactive three-dimensional image post-processing software. The course consisted of seven seminars held on a weekly basis. The 25 participating fourth- and fifth-year medical students learnt to systematically analyse cross-sectional imaging data and correlated the two-dimensional images with three-dimensional reconstructions. They were instructed by experienced radiologists and collegiate tutors. The improvement in radiological knowledge, diagnostic skills and visual-spatial ability was assessed immediately before and after the course by multiple-choice tests comprising 64 questions each. Wilcoxon signed rank test for paired samples was applied. The total number of correctly answered questions improved from 36.9±4.8 to 49.5±5.4 (p<0.001), which corresponded to a mean improvement of 12.6 (95% confidence interval 9.9–15.3) or 19.8%. Radiological knowledge improved by 36.0% (p<0.001), diagnostic skills for cross-sectional imaging by 38.7% (p<0.001), diagnostic skills for other imaging modalities – which were not included in the course – by 14.0% (p=0.001), and visual-spatial ability by 11.3% (p<0.001). The integration of interactive three-dimensional image post-processing software into undergraduate radiology education effectively improves radiological reasoning, diagnostic skills and visual-spatial ability, and thereby even diagnostic skills for imaging modalities not included in the course. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  16. Verification of operation of the actuator control system using the integration the B&R Automation Studio software with a virtual model of the actuator system

    Science.gov (United States)

    Herbuś, K.; Ociepka, P.

    2017-08-01

    This work analyses a sequential control system of a machine for separating and grouping workpieces for processing. The problem considered concerns verification of the operation of the actuator system of an electro-pneumatic control system equipped with a PLC controller, where the behaviour of the actuators is verified against the logic relationships assumed in the control system. The actuators of the considered control system were three linear-motion drives (pneumatic cylinders), and the logical structure of the control system's operation is based on a signal flow graph. The tested logical structure of the electro-pneumatic control system was implemented in the Automation Studio software of the B&R company, which is used to create programs for PLC controllers. Next, a model of the actuator system of the machine's control system was created in the FluidSIM software. To verify the PLC program by simulating the operation of the created model, the two programs were integrated using a data-exchange tool in the form of an OPC server.
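
    The OPC-based integration described above amounts to a cyclic exchange in which the control program writes actuator commands and the plant model returns sensor states; the sketch below illustrates that loop generically, assuming idealized cylinders, and does not use the actual Automation Studio/FluidSIM OPC interfaces.

        # Generic controller/plant co-simulation loop standing in for the OPC
        # data exchange described above. Three cylinders A, B, C extend in
        # sequence; the step logic and idealized plant are invented.

        def controller(step):
            # Extend one more cylinder on each scan cycle: A, then B, then C.
            order = ["A", "B", "C"]
            return {cyl: i <= step for i, cyl in enumerate(order)}

        def plant(commands):
            # Idealized cylinders: each end-position sensor follows its command.
            return dict(commands)

        sensors = {"A": False, "B": False, "C": False}
        for step in range(3):
            commands = controller(step)  # controller writes to the exchange
            sensors = plant(commands)    # plant model reports sensor states
            print(f"cycle {step}: {sensors}")
        assert all(sensors.values()), "sequence did not complete"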

  17. Z-2 Suit Support Stand and MKIII Suit Center of Gravity Test

    Science.gov (United States)

    Nguyen, Tuan Q.

    2014-01-01

    NASA's next-generation spacesuits are the Z-Series suits, made for a range of possible exploration missions in the near future. The prototype Z-1 suit has been developed and assembled to incorporate new technologies that have never been utilized before in the Apollo suits or the Extravehicular Mobility Unit (EMU). NASA engineers tested the Z-1 suit extensively in order to develop design requirements for the new Z-2 suit. At the end of 2014, NASA will be receiving the new Z-2 suit to perform more testing and to further develop the suit's new technologies. In order to do so, a suit support stand will be designed and fabricated to support the Z-2 suit during maintenance, sizing, and structural leakage testing. The Z-2 Suit Support Stand (Z2SSS) will be utilized for these purposes in the early testing stages of the Z-2 suit.

  18. Software didattico: integrazione scolastica

    Directory of Open Access Journals (Sweden)

    Lucia Ferlino

    1996-01-01

    Full Text Available A discussion of the use of educational software for school integration: it requires awareness of the software's potential effectiveness and the understanding that this effectiveness also lies in the choice of functional products.

  19. Software Quality Control at Belle II

    Science.gov (United States)

    Ritter, M.; Kuhr, T.; Hauth, T.; Gebard, T.; Kristof, M.; Pulvermacher, C.; Belle II Software Group

    2017-10-01

    Over the last seven years the software stack of the next-generation B factory experiment Belle II has grown to over one million lines of C++ and Python code, counting only the part included in offline software releases. There are several thousand commits to the central repository by about 100 individual developers per year. Keeping the software stack coherent and of high quality, so that it can be sustained and used efficiently for data acquisition, simulation, reconstruction, and analysis over the lifetime of the Belle II experiment, is a challenge. A set of tools is employed to monitor the quality of the software and provide fast feedback to the developers. They are integrated into a machinery that is controlled by a buildbot master and automates the quality checks. The tools include different compilers, cppcheck, the clang static analyzer, valgrind memcheck, doxygen, a geometry overlap checker, a check for missing or extra library links, unit tests, steering file level tests, a sophisticated high-level validation suite, and an issue tracker. The technological development infrastructure is complemented by organizational means to coordinate the development.

  20. Modern software approaches applied to a Hydrological model: the GEOtop Open-Source Software Project

    Science.gov (United States)

    Cozzini, Stefano; Endrizzi, Stefano; Cordano, Emanuele; Bertoldi, Giacomo; Dall'Amico, Matteo

    2017-04-01

    and the suite of tests (easily manageable by means of ctest tools) greatly reduces the burden of installation and allows us to enhance portability across different compilers and operating system platforms. The package was also complemented by several software tools which provide web-based visualization of results based on R plugins, in particular the "shiny" (Chang et al., 2016), "geotopbricks" and "geotopOptim2" (Cordano et al., 2016) packages, which allow rapid and efficient scientific validation of new examples and tests. The software re-engineering activities are still under development. However, our first results are promising enough to eventually reach a robust and stable software project that manages in a flexible way a complex state-of-the-art hydrological model like GEOtop and integrates it into wider workflows.

  1. A New Approach to Integrate Internet-of-Things and Software-as-a-Service Model for Logistic Systems: A Case Study

    Science.gov (United States)

    Chen, Shang-Liang; Chen, Yun-Yao; Hsu, Chiang

    2014-01-01

    Cloud computing is changing the ways software is developed and managed in enterprises, which is changing the way of doing business in that dynamically scalable and virtualized resources are regarded as services over the Internet. Traditional manufacturing systems such as supply chain management (SCM), customer relationship management (CRM), and enterprise resource planning (ERP) are often developed case by case. However, effective collaboration between different systems, platforms, programming languages, and interfaces has been suggested by researchers. In cloud-computing-based systems, distributed resources are encapsulated into cloud services and centrally managed, which allows high automation, flexibility, fast provision, and ease of integration at low cost. The integration between physical resources and cloud services can be improved by combining Internet of things (IoT) technology and Software-as-a-Service (SaaS) technology. This study proposes a new approach for developing cloud-based manufacturing systems based on a four-layer SaaS model. There are three main contributions of this paper: (1) enterprises can develop their own cloud-based logistic management information systems based on the approach proposed in this paper; (2) a case study based on literature reviews with experimental results is proposed to verify that the system performance is remarkable; (3) challenges encountered and feedback collected from T Company in the case study are discussed in this paper for the purpose of enterprise deployment. PMID:24686728

  2. A New Approach to Integrate Internet-of-Things and Software-as-a-Service Model for Logistic Systems: A Case Study

    Directory of Open Access Journals (Sweden)

    Shang-Liang Chen

    2014-03-01

    Full Text Available Cloud computing is changing the ways software is developed and managed in enterprises, which is changing the way of doing business in that dynamically scalable and virtualized resources are regarded as services over the Internet. Traditional manufacturing systems such as supply chain management (SCM), customer relationship management (CRM), and enterprise resource planning (ERP) are often developed case by case. However, effective collaboration between different systems, platforms, programming languages, and interfaces has been suggested by researchers. In cloud-computing-based systems, distributed resources are encapsulated into cloud services and centrally managed, which allows high automation, flexibility, fast provision, and ease of integration at low cost. The integration between physical resources and cloud services can be improved by combining Internet of things (IoT) technology and Software-as-a-Service (SaaS) technology. This study proposes a new approach for developing cloud-based manufacturing systems based on a four-layer SaaS model. There are three main contributions of this paper: (1) enterprises can develop their own cloud-based logistic management information systems based on the approach proposed in this paper; (2) a case study based on literature reviews with experimental results is proposed to verify that the system performance is remarkable; (3) challenges encountered and feedback collected from T Company in the case study are discussed in this paper for the purpose of enterprise deployment.

  3. A new approach to integrate Internet-of-things and software-as-a-service model for logistic systems: a case study.

    Science.gov (United States)

    Chen, Shang-Liang; Chen, Yun-Yao; Hsu, Chiang

    2014-03-28

    Cloud computing is changing the ways software is developed and managed in enterprises, which is changing the way of doing business in that dynamically scalable and virtualized resources are regarded as services over the Internet. Traditional manufacturing systems such as supply chain management (SCM), customer relationship management (CRM), and enterprise resource planning (ERP) are often developed case by case. However, effective collaboration between different systems, platforms, programming languages, and interfaces has been suggested by researchers. In cloud-computing-based systems, distributed resources are encapsulated into cloud services and centrally managed, which allows high automation, flexibility, fast provision, and ease of integration at low cost. The integration between physical resources and cloud services can be improved by combining Internet of things (IoT) technology and Software-as-a-Service (SaaS) technology. This study proposes a new approach for developing cloud-based manufacturing systems based on a four-layer SaaS model. There are three main contributions of this paper: (1) enterprises can develop their own cloud-based logistic management information systems based on the approach proposed in this paper; (2) a case study based on literature reviews with experimental results is proposed to verify that the system performance is remarkable; (3) challenges encountered and feedback collected from T Company in the case study are discussed in this paper for the purpose of enterprise deployment.

  4. A computational systems biology software platform for multiscale modeling and simulation: Integrating whole-body physiology, disease biology, and molecular reaction networks

    Directory of Open Access Journals (Sweden)

    Thomas Eissing

    2011-02-01

    Full Text Available Today, in silico studies and trial simulations already complement experimental approaches in pharmaceutical R&D and have become indispensable tools for decision making and communication with regulatory agencies. While biology is multi-scale by nature, project work and software tools usually focus on isolated aspects of drug action, such as pharmacokinetics at the organism scale or pharmacodynamic interaction on the molecular level. We present a modeling and simulation software platform consisting of PK-Sim® and MoBi® capable of building and simulating models that integrate across biological scales. A prototypical multiscale model for the progression of a pancreatic tumor and its response to pharmacotherapy is constructed and virtual patients are treated with a prodrug activated by hepatic metabolization. Tumor growth is driven by signal transduction leading to cell cycle transition and proliferation. Free tumor concentrations of the active metabolite inhibit Raf kinase in the signaling cascade and thereby cell cycle progression. In a virtual clinical study, the individual therapeutic outcome of the chemotherapeutic intervention is simulated for a large population with heterogeneous genomic background. Thereby, the platform allows efficient model building and integration of biological knowledge and prior data from all biological scales. Experimental in vitro model systems can be linked with observations in animal experiments and clinical trials. The interplay between patients, diseases, and drugs and topics with high clinical relevance such as the role of pharmacogenomics, drug-drug or drug-metabolite interactions can be addressed using this mechanistic, insight driven multiscale modeling approach.
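
    As an illustration of the kind of cross-scale link described above, a toy simulation can couple a one-compartment drug concentration to an Emax-type inhibition of tumor growth; all parameters and the model form below are invented and are not the PK-Sim/MoBi model.

        # Toy PK/PD sketch of a cross-scale link: a one-compartment drug
        # concentration drives Emax-type inhibition of tumor growth. All
        # parameters and the model form are invented; this is not the
        # PK-Sim/MoBi model.

        def simulate(days=30, dt=0.1, dose=10.0, ke=0.5, kg=0.2, ic50=2.0):
            steps_per_day = int(round(1 / dt))
            tumor, conc = 1.0, 0.0
            for step in range(days * steps_per_day):
                if step % steps_per_day == 0:   # once-daily dose
                    conc += dose
                conc -= ke * conc * dt          # first-order elimination
                effect = conc / (ic50 + conc)   # Emax-type inhibition
                tumor *= 1 + kg * (1 - effect) * dt
            return tumor

        print(f"relative tumor size after 30 days: {simulate():.2f}")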

  5. MoFi: A Software Tool for Annotating Glycoprotein Mass Spectra by Integrating Hybrid Data from the Intact Protein and Glycopeptide Level.

    Science.gov (United States)

    Skala, Wolfgang; Wohlschlager, Therese; Senn, Stefan; Huber, Gabriel E; Huber, Christian G

    2018-04-18

    Hybrid mass spectrometry (MS) is an emerging technique for characterizing glycoproteins, which typically display pronounced microheterogeneity. Since hybrid MS combines information from different experimental levels, it crucially depends on computational methods. Here, we describe a novel software tool, MoFi, which integrates hybrid MS data to assign glycans and other post-translational modifications (PTMs) in deconvoluted mass spectra of intact proteins. Its two-stage search algorithm first assigns monosaccharide/PTM compositions to each peak and then compiles a hierarchical list of glycan combinations compatible with these compositions. Importantly, the program only includes those combinations which are supported by a glycan library as derived from glycopeptide or released glycan analysis. By applying MoFi to mass spectra of rituximab, ado-trastuzumab emtansine, and recombinant human erythropoietin, we demonstrate how integration of bottom-up data may be used to refine information collected at the intact protein level. Accordingly, our software reveals that a single mass frequently can be explained by a considerable number of glycoforms. Yet, it simultaneously ranks proteoforms according to their probability, based on a score which is calculated from relative glycan abundances. Notably, glycoforms that comprise identical glycans may nevertheless differ in score if those glycans occupy different sites. Hence, MoFi exposes different layers of complexity that are present in the annotation of a glycoprotein mass spectrum.
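
    The first stage of the two-stage search described above, assigning monosaccharide/PTM compositions to a mass, is essentially a bounded search over integer combinations within a mass tolerance; a simplified sketch (rounded masses, invented count limits, not MoFi's actual implementation):

        # Simplified sketch of the first search stage: enumerate monosaccharide
        # compositions whose summed mass matches an observed mass shift within
        # a tolerance. Masses are rounded and count limits invented; this is
        # not MoFi's actual implementation.

        from itertools import product

        MASSES = {"Hex": 162.05, "HexNAc": 203.08, "Fuc": 146.06}  # Da
        LIMITS = {"Hex": 6, "HexNAc": 5, "Fuc": 2}

        def compositions(target, tol=0.5):
            names = list(MASSES)
            for counts in product(*(range(LIMITS[n] + 1) for n in names)):
                mass = sum(c * MASSES[n] for c, n in zip(counts, names))
                if abs(mass - target) <= tol:
                    yield dict(zip(names, counts)), mass

        # Mass shift corresponding to a G0F glycan (3 Hex, 4 HexNAc, 1 Fuc):
        for comp, mass in compositions(1444.53):
            print(comp, round(mass, 2))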

  6. Potential of a suite of robot/computer-assisted motivating systems for personalized, home-based, stroke rehabilitation

    Directory of Open Access Journals (Sweden)

    Feng Xin

    2007-03-01

    Full Text Available Abstract. Background: There is a need to improve semi-autonomous stroke therapy in home environments, which are often characterized by low supervision by clinical experts and low extrinsic motivation. Our distributed-device approach to this problem consists of an integrated suite of low-cost robotic/computer-assistive technologies driven by a novel universal-access software framework called UniTherapy. Our design strategy for personalizing the therapy, providing extrinsic motivation and assessing outcomes is presented and evaluated. Methods: Three studies were conducted to evaluate the potential of the suite. A conventional force-reflecting joystick, a modified joystick therapy platform (TheraJoy), and a steering wheel platform (TheraDrive) were tested separately with the UniTherapy software. Stroke subjects with hemiparesis and able-bodied subjects completed tracking activities with the devices in different positions. We quantified motor performance across subject groups and across device platforms, and muscle activation across devices at two positions in the arm workspace. Results: Trends in the assessment metrics were consistent across devices, with able-bodied and high-functioning stroke subjects being significantly more accurate and quicker in their motor performance than low-functioning subjects. Muscle activation patterns differed for shoulder and elbow across different devices and locations. Conclusion: The Robot/CAMR suite has potential for stroke rehabilitation. By manipulating hardware and software variables, we can create personalized therapy environments that engage patients, address their therapy needs, and track their progress. A larger longitudinal study is still needed to evaluate these systems in under-supervised environments such as the home.

  7. Software testing for evolutionary iterative rapid prototyping

    OpenAIRE

    Davis, Edward V., Jr.

    1990-01-01

    Approved for public release; distribution unlimited. Rapid prototyping is emerging as a promising software development paradigm. It provides a systematic and automatable means of developing a software system under circumstances where initial requirements are not well known or where requirements change frequently during development. Providing high software quality assurance requires sufficient software testing. The unique nature of evolutionary iterative prototyping is not well-suited for ...

  8. Dose - a software package for the calculation of integrated exposure resulting from an accident in a nuclear power plant

    International Nuclear Information System (INIS)

    Doron, E.; Ohaion, H.; Asculai, E.

    1985-05-01

    A software package intended for the assessment of risks resulting from an accidental release of radioactive materials from a nuclear power plant is presented. The models, and the various programs based on them, are described. The work includes detailed operating instructions for the various programs, as well as instructions for the preparation of the necessary input data. Various options are described for additions and changes to the programs, with the aim of extending their usefulness to more general cases in terms of meteorology and pollution sources. Finally, a sample calculation that enables the user to test the proper functioning of the whole package, as well as his own proficiency in its use, is given. (author)
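
    Dispersion codes of this kind are commonly built on a Gaussian plume model; the following textbook-style sketch of a ground-level centreline concentration, with invented parameter values, illustrates the idea and is not the DOSE package's own code.

        # Textbook Gaussian plume sketch: ground-level, centreline air
        # concentration downwind of a continuous point release. Illustrative
        # only; not the DOSE package's implementation. Values are invented,
        # and sigma_y = a*x, sigma_z = b*x crudely stand in for
        # stability-class dispersion curves.

        import math

        def centreline_concentration(Q, u, x, H, a=0.08, b=0.06):
            """chi [Bq/m^3] for release rate Q [Bq/s], wind speed u [m/s],
            downwind distance x [m] and effective release height H [m]."""
            sy, sz = a * x, b * x
            return Q / (math.pi * u * sy * sz) * math.exp(-H**2 / (2 * sz**2))

        Q, u, x, H, T = 1e9, 3.0, 1000.0, 50.0, 3600.0  # T: release time [s]
        chi = centreline_concentration(Q, u, x, H)
        print(f"time-integrated exposure: {chi * T:.3e} Bq*s/m^3")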

  9. Implementation of electronic medical records requires more than new software: Lessons on integrating and managing health technologies from Mbarara, Uganda.

    Science.gov (United States)

    Madore, Amy; Rosenberg, Julie; Muyindike, Winnie R; Bangsberg, David R; Bwana, Mwebesa B; Martin, Jeffrey N; Kanyesigye, Michael; Weintraub, Rebecca

    2015-12-01

    Implementation lessons: • Technology alone does not necessarily lead to improvement in health service delivery, in contrast to the common assumption that advanced technology goes hand in hand with progress. • Implementation of electronic medical record (EMR) systems is a complex, resource-intensive process that, in addition to software, hardware, and human resource investments, requires careful planning, change management skills, adaptability, and continuous engagement of stakeholders. • Research requirements and goals must be balanced with service delivery needs when determining how much information is essential to collect and who should be interfacing with the EMR system. • EMR systems require ongoing monitoring and regular updates to ensure they are responsive to evolving clinical use cases and research questions. • High-quality data and analyses are essential for EMRs to deliver value to providers, researchers, and patients. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. Integration of life cycle assessment software with tools for economic and sustainability analyses and process simulation for sustainable process design

    DEFF Research Database (Denmark)

    Kalakul, Sawitree; Malakul, Pomthong; Siemanond, Kitipat

    2014-01-01

    The sustainable future of the world challenges engineers to develop chemical process designs that are not only technically and economically feasible but also environmentally friendly. Life cycle assessment (LCA) is a tool for identifying and quantifying environmental impacts of the chemical product... with other process design tools such as sustainable design (SustainPro), economic analysis (ECON) and process simulation. The software framework contains four main tools: Tool-I is for life cycle inventory (LCI) knowledge management that enables easy maintenance and future expansion of the LCI database; Tool... and/or the process that makes it. It can be used in conjunction with process simulation and economic analysis tools to evaluate the design of any existing and/or new chemical-biochemical process and to propose improvement options in order to arrive at the best design among various alternatives...

  11. Software quality assurance

    CERN Document Server

    Laporte, Claude Y

    2018-01-01

    This book introduces Software Quality Assurance (SQA) and provides an overview of standards used to implement SQA. It defines ways to assess the effectiveness of how one approaches software quality across key industry sectors such as telecommunications, transport, defense, and aerospace. * Includes supplementary website with an instructor's guide and solutions * Applies IEEE software standards as well as the Capability Maturity Model Integration for Development (CMMI) * Illustrates the application of software quality assurance practices through the use of practical examples, quotes from experts, and tips from the authors

  12. Software engineering

    CERN Document Server

    Sommerville, Ian

    2010-01-01

    The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable, software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever.

  13. Software architecture evolution

    DEFF Research Database (Denmark)

    Barais, Olivier; Le Meur, Anne-Francoise; Duchien, Laurence

    2008-01-01

    Software architectures must frequently evolve to cope with changing requirements, and this evolution often implies integrating new concerns. Unfortunately, when the new concerns are crosscutting, existing architecture description languages provide little or no support for this kind of evolution. The software architect must modify multiple elements of the architecture manually, which risks introducing inconsistencies. This chapter provides an overview, comparison and detailed treatment of the various state-of-the-art approaches to describing and evolving software architectures. Furthermore, we discuss one particular framework named TranSAT, which addresses the above problems of software architecture evolution. TranSAT provides a new element in the software architecture description language, called an architectural aspect, for describing new concerns and their integration into an existing

  14. Integration of interactive three-dimensional image post-processing software into undergraduate radiology education effectively improves diagnostic skills and visual-spatial ability

    Energy Technology Data Exchange (ETDEWEB)

    Rengier, Fabian, E-mail: fabian.rengier@web.de [University Hospital Heidelberg, Department of Diagnostic and Interventional Radiology, Im Neuenheimer Feld 110, 69120 Heidelberg (Germany); Häfner, Matthias F. [University Hospital Heidelberg, Department of Radiation Oncology, Im Neuenheimer Feld 400, 69120 Heidelberg (Germany); Unterhinninghofen, Roland [Karlsruhe Institute of Technology (KIT), Institute for Anthropomatics, Department of Informatics, Adenauerring 2, 76131 Karlsruhe (Germany); Nawrotzki, Ralph; Kirsch, Joachim [University of Heidelberg, Institute of Anatomy and Cell Biology, Im Neuenheimer Feld 307, 69120 Heidelberg (Germany); Kauczor, Hans-Ulrich [University Hospital Heidelberg, Department of Diagnostic and Interventional Radiology, Im Neuenheimer Feld 110, 69120 Heidelberg (Germany); Giesel, Frederik L. [University of Heidelberg, Institute of Anatomy and Cell Biology, Im Neuenheimer Feld 307, 69120 Heidelberg (Germany); University Hospital Heidelberg, Department of Nuclear Medicine, Im Neuenheimer Feld 400, 69120 Heidelberg (Germany)

    2013-08-15

    Purpose: Integrating interactive three-dimensional post-processing software into undergraduate radiology teaching might be a promising approach to synergistically improve both visual-spatial ability and radiological skills, thereby reducing students’ deficiencies in image interpretation. The purpose of this study was to test our hypothesis that a hands-on radiology course for medical students using interactive three-dimensional image post-processing software improves radiological knowledge, diagnostic skills and visual-spatial ability. Materials and methods: A hands-on radiology course was developed using interactive three-dimensional image post-processing software. The course consisted of seven seminars held on a weekly basis. The 25 participating fourth- and fifth-year medical students learnt to systematically analyse cross-sectional imaging data and correlated the two-dimensional images with three-dimensional reconstructions. They were instructed by experienced radiologists and collegiate tutors. The improvement in radiological knowledge, diagnostic skills and visual-spatial ability was assessed immediately before and after the course by multiple-choice tests comprising 64 questions each. Wilcoxon signed rank test for paired samples was applied. Results: The total number of correctly answered questions improved from 36.9 ± 4.8 to 49.5 ± 5.4 (p < 0.001), which corresponded to a mean improvement of 12.6 (95% confidence interval 9.9–15.3) or 19.8%. Radiological knowledge improved by 36.0% (p < 0.001), diagnostic skills for cross-sectional imaging by 38.7% (p < 0.001), diagnostic skills for other imaging modalities – which were not included in the course – by 14.0% (p = 0.001), and visual-spatial ability by 11.3% (p < 0.001). Conclusion: The integration of interactive three-dimensional image post-processing software into undergraduate radiology education effectively improves radiological reasoning, diagnostic skills and visual-spatial ability, and thereby even diagnostic skills for imaging modalities not included in the course.

  15. Integration of interactive three-dimensional image post-processing software into undergraduate radiology education effectively improves diagnostic skills and visual-spatial ability

    International Nuclear Information System (INIS)

    Rengier, Fabian; Häfner, Matthias F.; Unterhinninghofen, Roland; Nawrotzki, Ralph; Kirsch, Joachim; Kauczor, Hans-Ulrich; Giesel, Frederik L.

    2013-01-01

    Purpose: Integrating interactive three-dimensional post-processing software into undergraduate radiology teaching might be a promising approach to synergistically improve both visual-spatial ability and radiological skills, thereby reducing students’ deficiencies in image interpretation. The purpose of this study was to test our hypothesis that a hands-on radiology course for medical students using interactive three-dimensional image post-processing software improves radiological knowledge, diagnostic skills and visual-spatial ability. Materials and methods: A hands-on radiology course was developed using interactive three-dimensional image post-processing software. The course consisted of seven seminars held on a weekly basis. The 25 participating fourth- and fifth-year medical students learnt to systematically analyse cross-sectional imaging data and correlated the two-dimensional images with three-dimensional reconstructions. They were instructed by experienced radiologists and collegiate tutors. The improvement in radiological knowledge, diagnostic skills and visual-spatial ability was assessed immediately before and after the course by multiple-choice tests comprising 64 questions each. Wilcoxon signed rank test for paired samples was applied. Results: The total number of correctly answered questions improved from 36.9 ± 4.8 to 49.5 ± 5.4 (p < 0.001), which corresponded to a mean improvement of 12.6 (95% confidence interval 9.9–15.3) or 19.8%. Radiological knowledge improved by 36.0% (p < 0.001), diagnostic skills for cross-sectional imaging by 38.7% (p < 0.001), diagnostic skills for other imaging modalities – which were not included in the course – by 14.0% (p = 0.001), and visual-spatial ability by 11.3% (p < 0.001). Conclusion: The integration of interactive three-dimensional image post-processing software into undergraduate radiology education effectively improves radiological reasoning, diagnostic skills and visual-spatial ability, and thereby even diagnostic skills for imaging modalities not included in the course.

  16. A user's guide to the GoldSim/BLT-MS integrated software package:a low-level radioactive waste disposal performance assessment model

    International Nuclear Information System (INIS)

    Knowlton, Robert G.; Arnold, Bill Walter; Mattie, Patrick D.

    2007-01-01

    Sandia National Laboratories (Sandia), a U.S. Department of Energy National Laboratory, has over 30 years of experience in the assessment of radioactive waste disposal and, at the time of this publication, is providing assistance internationally in a number of areas relevant to the safety assessment of radioactive waste disposal systems. In countries with small radioactive waste programs, international technology transfer program efforts are often hampered by small budgets, schedule constraints, and a lack of experienced personnel. In an effort to surmount these difficulties, Sandia has developed a system that utilizes a combination of commercially available software codes and existing legacy codes for probabilistic safety assessment modeling that facilitates the technology transfer and maximizes limited available funding. Numerous codes developed and endorsed by the United States Nuclear Regulatory Commission (NRC) and codes developed and maintained by the United States Department of Energy are generally available to foreign countries after addressing import/export control and copyright requirements. From a programmatic view, it is easier to utilize existing codes than to develop new codes. From an economic perspective, it is not possible for most countries with small radioactive waste disposal programs to maintain complex software which meets the rigors of both domestic regulatory requirements and international peer review. Therefore, revitalization of deterministic legacy codes, as well as an adaptation of contemporary deterministic codes, provides a credible and solid computational platform for constructing probabilistic safety assessment models. This document is a reference user's guide for the GoldSim/BLT-MS integrated modeling software package developed as part of a cooperative technology transfer project between Sandia National Laboratories and the Institute of Nuclear Energy Research (INER) in Taiwan for the preliminary assessment of several candidate low-level radioactive waste disposal sites.

  17. Developing Human-Computer Interface Models and Representation Techniques(Dialogue Management as an Integral Part of Software Engineering)

    OpenAIRE

    Hartson, H. Rex; Hix, Deborah; Kraly, Thomas M.

    1987-01-01

    The Dialogue Management Project at Virginia Tech is studying the poorly understood problem of human-computer dialogue development. This problem often leads to low usability in human-computer dialogues. The Dialogue Management Project approaches solutions to low usability in interfaces by addressing human-computer dialogue development as an integral and equal part of the total system development process. This project consists of two rather distinct, but dependent, parts. One is development of ...

  18. HPC Benchmark Suite NMx, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Intelligent Automation Inc., (IAI) and University of Central Florida (UCF) propose to develop a comprehensive numerical test suite for benchmarking current and...

  19. The Integration of Ecosystem Services in Planning: An Evaluation of the Nutrient Retention Model Using InVEST Software

    Directory of Open Access Journals (Sweden)

    Stefano Salata

    2017-07-01

    Full Text Available Mapping ecosystem services (ES) increases the awareness of natural capital value, leading to building sustainability into decision-making processes. Recently, many techniques to assess the value of ES delivered by different scenarios of land use/land cover (LULC) have become available, thus becoming important practices in mapping to support the land use planning process. The spatial analysis of the biophysical ES distribution allows a better comprehension of the environmental and social implications of planning, especially when ES concerns the management of risk (e.g., erosion, pollution). This paper investigates the nutrient retention model of the InVEST software through its spatial distribution and its quantitative value. The model was analyzed by testing its response to changes in input parameters: (1) the digital terrain elevation model (DEM); and (2) different LULC attribute configurations. The paper increases the level of attention to specific ES models that use water runoff as a proxy of nutrient delivery. It shows that the spatial distribution of biophysical values is highly influenced by many factors, among which are the characteristics of the DEM and its interaction with LULC. The results seem to confirm that the biophysical value of ES is still affected by a high degree of uncertainty and encourage an expert field campaign as the only solution for using ES mapping in a regulative land use framework.

  20. Advancing the integration of hospital IT. Pitfalls and perspectives when replacing specialized software for high-risk environments with enterprise system extensions.

    Science.gov (United States)

    Engelmann, Carsten; Ametowobla, Dzifa

    2017-05-17

    Planning and controlling surgical operations hugely impacts upon productivity, patient safety, and surgeons' careers. Established, specialized software for this task is being increasingly replaced by "Operating Room (OR)-modules" appended to enterprise-wide resource planning (ERP) systems. As a result, usability problems are re-emerging and require developers' attention. Systematic evaluation of the functionality and social repercussions of a global, market-leading IT business control system (SAP R3, Germany), adapted for real-time OR process steering. Field study involving document analyses, interviews, and a 73-item survey addressed to 77 qualified (> 1-year system experience) senior planning executives (end users; "planners") working in surgical departments of university hospitals. Planners reported that 57% of electronic operation requests contained contradictory information. Key screens contained clinically irrelevant areas (36 +/- 29%). Compared to the legacy system, users reported either no improvements or worse performance, in regard to co-ordination of OR stakeholders, intra-day program changes, and safety. Planners concluded that the ERP-planning module was "non-intuitive" (66%), increased planning work (56%, p=0.002), and did not impact upon either organizational mishap spectrum or frequency. Interviews evidenced intra-institutional power shifts due to increased system complexity. Planners resented e.g. a trend towards increased personal culpability for mishap. Highly complex enterprise system extensions may not be directly suited to specific process steering tasks in a high risk/low error-environment like the OR. In view of surgeons' high primary task load, the repeated call for simpler IT is an imperative for ERP extensions. System design should consider a) that current OR IT suffers from an input limitation regarding planning-relevant real-time data, and b) that there are social processes that strongly affect planning and particularly ERP use beyond

  1. NASA's Software Safety Standard

    Science.gov (United States)

    Ramsay, Christopher M.

    2007-01-01

    requirements. This allows the projects leeway to meet these requirements in many forms that best suit a particular project's needs and safety risk. In other words, it tells the project what to do, not how to do it. This update also incorporated advances in the state of the practice of software safety from academia and private industry. It addresses some of the more common issues now facing software developers in the NASA environment such as the use of Commercial-Off-the-Shelf Software (COTS), Modified OTS (MOTS), Government OTS (GOTS), and reused software. A team from across NASA developed the update and it has had both NASA-wide internal reviews by software engineering, quality, safety, and project management. It has also had expert external review. This presentation and paper will discuss the new NASA Software Safety Standard, its organization, and key features. It will start with a brief discussion of some NASA mission failures and incidents that had software as one of their root causes. It will then give a brief overview of the NASA Software Safety Process. This will include an overview of the key personnel responsibilities and functions that must be performed for safety-critical software.

  2. Analysis and databasing software for integrated tomographic gamma scanner (TGS) and passive-active neutron (PAN) assay systems

    International Nuclear Information System (INIS)

    Estep, R.J.; Melton, S.G.; Buenafe, C.

    2000-01-01

    The CTEN-FIT program, written for Windows 9x/NT in C++, performs databasing and analysis of combined thermal/epithermal neutron (CTEN) passive and active neutron assay data and integrates that with isotopics results and gamma-ray data from methods such as tomographic gamma scanning (TGS). The binary database is reflected in a companion Excel database that allows extensive customization via Visual Basic for Applications macros. Automated analysis options make the analysis of the data transparent to the assay system operator. Various record browsers and information displays simplify record keeping tasks

  3. Models, methods and software for distributed knowledge acquisition for the automated construction of integrated expert systems knowledge bases

    International Nuclear Information System (INIS)

    Dejneko, A.O.

    2011-01-01

    Based on an analysis of existing models, methods and means of acquiring knowledge, a base method of automated knowledge acquisition has been chosen. On the basis of this method, a new approach to integrating information acquired from knowledge sources of different typologies has been proposed, and the concept of distributed knowledge acquisition, with the aim of computerized formation of the most complete and consistent models of problem areas, has been introduced. An original algorithm for distributed knowledge acquisition from databases, based on the construction of binary decision trees, has been developed.
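
    Binary decision trees of the kind mentioned above are usually grown by splitting records on the attribute with the largest information gain; a compact sketch of that criterion, using invented toy records, follows.

        # Sketch: choosing a split attribute by information gain, the core step
        # in growing a decision tree from database records. Toy records invented.

        import math

        def entropy(labels):
            n = len(labels)
            return -sum((labels.count(v) / n) * math.log2(labels.count(v) / n)
                        for v in set(labels))

        def gain(records, attr):
            g = entropy([r["class"] for r in records])
            for value in {r[attr] for r in records}:
                subset = [r["class"] for r in records if r[attr] == value]
                g -= len(subset) / len(records) * entropy(subset)
            return g

        records = [
            {"humid": "high", "windy": True, "class": "no"},
            {"humid": "high", "windy": False, "class": "no"},
            {"humid": "low", "windy": True, "class": "yes"},
            {"humid": "low", "windy": False, "class": "yes"},
        ]
        best = max(["humid", "windy"], key=lambda a: gain(records, a))
        print("split on:", best)  # 'humid' separates the classes perfectly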

  4. SDN-NGenIA, a software defined next generation integrated architecture for HEP and data intensive science

    Science.gov (United States)

    Balcas, J.; Hendricks, T. W.; Kcira, D.; Mughal, A.; Newman, H.; Spiropulu, M.; Vlimant, J. R.

    2017-10-01

    The SDN Next Generation Integrated Architecture (SDN-NGenIA) project addresses some of the key challenges facing the present and next generations of science programs in HEP, astrophysics, and other fields, whose potential discoveries depend on the ability to distribute, process and analyze globally distributed petascale to exascale datasets. The SDN-NGenIA system under development by Caltech and partner HEP and network teams focuses on the coordinated use of network, computing and storage infrastructures, through a set of developments that build on the experience gained in previous and recently completed projects that use dynamic circuits with bandwidth guarantees to support major network flows, as demonstrated across the LHC Open Network Environment [1] and in large-scale demonstrations over the last three years, and as recently integrated with the PhEDEx and Asynchronous Stage Out data management applications of the CMS experiment at the Large Hadron Collider. In addition to the general program goal of supporting the network needs of the LHC and other science programs with similar needs, a recent focus is the use of the Leadership HPC facility at Argonne National Lab (ALCF) for data-intensive applications.

  5. The impact of new accelerator control software on LEP performance

    International Nuclear Information System (INIS)

    Bailey, R.; Belk, A.; Collier, P.; Lamont, M.; Rigk, G. de; Tarrant, M.

    1993-01-01

    After the first year of running LEP, it became apparent that a new generation of application software would be required for efficient long term exploitation of the accelerator. In response to this need, a suite of accelerator control software has been developed, which is new both in style and functionality. During 1992 this software has been extensively used for driving LEP in many different operational modes, which include several different optics, polarisation runs at different energies and 8 bunch operation with Pretzels. The software has performed well and has undoubtedly enhanced the efficiency of accelerator operations. In particular the turnaround time has been significantly reduced, giving an increase of around 20% in the integrated luminosity for the year. Furthermore the software has made the accelerator accessible to less experienced operators. After outlining the development strategy, the overall functionality and performance of the software is discussed, with particular emphasis on improvements in operating efficiency. Some evaluation of the performance and reliability of ORACLE as an on-line database is also given

  6. SOFTWARE OPEN SOURCE, SOFTWARE GRATIS?

    Directory of Open Access Journals (Sweden)

    Nur Aini Rakhmawati

    2006-01-01

    Full Text Available The entry into force of the Intellectual Property Rights law (HAKI) has opened up a new alternative: the use of open source software. The use of open source software is spreading along with current global issues in Information and Communication Technology (ICT). Some organizations and companies have begun to consider open source software. There are many conceptions of what open source software is, ranging from software that is free of charge to software without a license. These notions are not entirely accurate, so the concept of open source software needs to be introduced, covering its history, licensing and how to choose a license, as well as the considerations involved in selecting among the available open source software. Keywords: License, Open Source, HAKI

  7. Technical Reference Suite Addressing Challenges of Providing Assurance for Fault Management Architectural Design

    Science.gov (United States)

    Fitz, Rhonda; Whitman, Gerek

    2016-01-01

    Research into complexities of software systems Fault Management (FM) and how architectural design decisions affect safety, preservation of assets, and maintenance of desired system functionality has coalesced into a technical reference (TR) suite that advances the provision of safety and mission assurance. The NASA Independent Verification and Validation (IV&V) Program, with Software Assurance Research Program support, extracted FM architectures across the IV&V portfolio to evaluate robustness, assess visibility for validation and test, and define software assurance methods applied to the architectures and designs. This investigation spanned IV&V projects with seven different primary developers, a wide range of sizes and complexities, and encompassed Deep Space Robotic, Human Spaceflight, and Earth Orbiter mission FM architectures. The initiative continues with an expansion of the TR suite to include Launch Vehicles, adding the benefit of investigating differences intrinsic to model-based FM architectures and insight into complexities of FM within an Agile software development environment, in order to improve awareness of how nontraditional processes affect FM architectural design and system health management. The identification of particular FM architectures, visibility, and associated IV&V techniques provides a TR suite that enables greater assurance that critical software systems will adequately protect against faults and respond to adverse conditions. Additionally, the role FM has with regard to strengthened security requirements, with potential to advance overall asset protection of flight software systems, is being addressed with the development of an adverse conditions database encompassing flight software vulnerabilities. Capitalizing on the established framework, this TR suite provides assurance capability for a variety of FM architectures and varied development approaches. Research results are being disseminated across NASA, other agencies, and the

  8. NCEP BUFRLIB Software User Guide

    Science.gov (United States)

    This document set describes how to use the NCEP BUFRLIB software to encode or decode BUFR messages. It is not intended to be a primer on the basic concepts of BUFR and focuses solely on how to use the BUFRLIB software.

  9. Bioinformatics resource manager v2.3: an integrated software environment for systems biology with microRNA and cross-species analysis tools

    Directory of Open Access Journals (Sweden)

    Tilton Susan C

    2012-11-01

    Full Text Available Abstract Background MicroRNAs (miRNAs) are noncoding RNAs that direct post-transcriptional regulation of protein coding genes. Recent studies have shown miRNAs are important for controlling many biological processes, including nervous system development, and are highly conserved across species. Given their importance, computational tools are necessary for analysis, interpretation and integration of high-throughput (HTP) miRNA data in an increasing number of model species. The Bioinformatics Resource Manager (BRM) v2.3 is a software environment for data management, mining, integration and functional annotation of HTP biological data. In this study, we report recent updates to BRM for miRNA data analysis and cross-species comparisons across datasets. Results BRM v2.3 has the capability to query predicted miRNA targets from multiple databases, retrieve potential regulatory miRNAs for known genes, integrate experimentally derived miRNA and mRNA datasets, perform ortholog mapping across species, and retrieve annotation and cross-reference identifiers for an expanded number of species. Here we use BRM to show that developmental exposure of zebrafish to 30 uM nicotine from 6–48 hours post fertilization (hpf) results in behavioral hyperactivity in larval zebrafish and alteration of putative miRNA gene targets in whole embryos at developmental stages that encompass early neurogenesis. We show typical workflows for using BRM to integrate experimental zebrafish miRNA and mRNA microarray datasets with example retrievals for zebrafish, including pathway annotation and mapping to human ortholog. Functional analysis of differentially regulated (p<0.05) gene targets in BRM indicates that nicotine exposure disrupts genes involved in neurogenesis, possibly through misregulation of nicotine-sensitive miRNAs. Conclusions BRM provides the ability to mine complex data for identification of candidate miRNAs or pathways that drive phenotypic outcome and, therefore, is a useful hypothesis generation tool for systems biology. The miRNA workflow in BRM allows for efficient processing of multiple miRNA and mRNA datasets in a single

  10. Bioinformatics resource manager v2.3: an integrated software environment for systems biology with microRNA and cross-species analysis tools

    Science.gov (United States)

    2012-01-01

    Background MicroRNAs (miRNAs) are noncoding RNAs that direct post-transcriptional regulation of protein-coding genes. Recent studies have shown miRNAs are important for controlling many biological processes, including nervous system development, and are highly conserved across species. Given their importance, computational tools are necessary for analysis, interpretation and integration of high-throughput (HTP) miRNA data in an increasing number of model species. The Bioinformatics Resource Manager (BRM) v2.3 is a software environment for data management, mining, integration and functional annotation of HTP biological data. In this study, we report recent updates to BRM for miRNA data analysis and cross-species comparisons across datasets. Results BRM v2.3 has the capability to query predicted miRNA targets from multiple databases, retrieve potential regulatory miRNAs for known genes, integrate experimentally derived miRNA and mRNA datasets, perform ortholog mapping across species, and retrieve annotation and cross-reference identifiers for an expanded number of species. Here we use BRM to show that developmental exposure of zebrafish to 30 µM nicotine from 6–48 hours post fertilization (hpf) results in behavioral hyperactivity in larval zebrafish and alteration of putative miRNA gene targets in whole embryos at developmental stages that encompass early neurogenesis. We show typical workflows for using BRM to integrate experimental zebrafish miRNA and mRNA microarray datasets with example retrievals for zebrafish, including pathway annotation and mapping to human orthologs. Functional analysis of differentially regulated (p<0.05) gene targets in BRM indicates that nicotine exposure disrupts genes involved in neurogenesis, possibly through misregulation of nicotine-sensitive miRNAs. Conclusions BRM provides the ability to mine complex data for identification of candidate miRNAs or pathways that drive phenotypic outcome and, therefore, is a useful hypothesis generation tool for systems biology.
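
    The integration step of such a workflow can also be sketched outside BRM. The following Python/pandas example is a hedged illustration, with hypothetical file and column names and no connection to BRM's actual implementation: it joins significant miRNAs to their predicted gene targets and flags anti-correlated miRNA-mRNA pairs as candidate regulatory interactions.

      import pandas as pd

      # Hypothetical inputs: differential-expression tables and a predicted-target map.
      mirna = pd.read_csv("mirna_diffexp.csv")        # columns: mirna_id, log2fc, pval
      mrna = pd.read_csv("mrna_diffexp.csv")          # columns: gene_id, log2fc, pval
      targets = pd.read_csv("predicted_targets.csv")  # columns: mirna_id, gene_id

      # Keep significant features only (p < 0.05, as in the study above).
      mirna = mirna[mirna.pval < 0.05]
      mrna = mrna[mrna.pval < 0.05]

      # Join miRNAs to predicted gene targets, then to the measured mRNAs.
      merged = (targets.merge(mirna, on="mirna_id")
                       .merge(mrna, on="gene_id", suffixes=("_mirna", "_mrna")))

      # Candidate regulation: miRNA up while target mRNA down, or vice versa.
      candidates = merged[merged.log2fc_mirna * merged.log2fc_mrna < 0]
      print(candidates[["mirna_id", "gene_id"]].drop_duplicates())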

  11. The Community Intercomparison Suite (CIS)

    Science.gov (United States)

    Watson-Parris, Duncan; Schutgens, Nick; Cook, Nick; Kipling, Zak; Kershaw, Phil; Gryspeerdt, Ed; Lawrence, Bryan; Stier, Philip

    2017-04-01

    Earth observations (both remote and in-situ) create vast amounts of data, providing invaluable constraints for the climate science community. Efficient exploitation of these complex and highly heterogeneous datasets has, however, been limited by the lack of suitable software tools, particularly for comparison of gridded and ungridded data, reducing scientific productivity. CIS (http://cistools.net) is an open-source command-line tool and Python library that allows straightforward quantitative analysis, intercomparison and visualisation of remote-sensing, in-situ and model data. CIS reads gridded and ungridded remote-sensing, in-situ and model data 'out of the box', including the ESA Aerosol CCI and Cloud CCI products, MODIS, CloudSat and AERONET. Perhaps most importantly, CIS also employs a modular plugin architecture that allows a limitless variety of data types to be read: users can write their own plugins for the data sources with which they are familiar and share them within the community, allowing all to benefit from their expertise. To enable intercomparison of these data, CIS provides a number of operations, including: aggregation of ungridded and gridded datasets to coarser representations using a number of built-in averaging kernels; subsetting of data to reduce its extent or dimensionality; co-location of two distinct datasets onto a single set of co-ordinates; visualisation of the input or output data through a number of different plots and graphs; evaluation of arbitrary mathematical expressions against any number of datasets; and a number of supporting functions such as a statistical comparison of two co-located datasets. These operations can be performed efficiently on local machines or large computing clusters, and CIS is already available on the JASMIN computing facility. A case-study using the GASSP collection of in-situ aerosol observations
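
    A minimal sketch of such an intercomparison using CIS's Python interface is shown below. The read_data, subset, and collocated_onto calls follow the CIS documentation but should be verified against the installed version, and the file and variable names here are hypothetical.

      from cis import read_data

      model = read_data('model_output.nc', 'od550aer')  # gridded model aerosol optical depth
      obs = read_data('aeronet_site.lev20', 'AOT_500')  # ungridded in-situ observations

      # Subset the model field to a region of interest, then co-locate the
      # gridded data onto the ungridded observation points.
      model = model.subset(x=[-60, 60], y=[-30, 30])
      colocated = model.collocated_onto(obs)

      colocated[0].plot()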

  12. Software Epistemology

    Science.gov (United States)

    2016-03-01

    Fragmentary record text: "... in-vitro decision to incubate a startup, Lexumo [7], which is developing a commercial Software as a Service (SaaS) vulnerability assessment ..." Acronyms recovered from the report: LTS, Label Transition System; MUSE, Mining and Understanding Software Enclaves; RTEMS, Real-Time Executive for Multi-processor Systems; SaaS, Software as a Service; SSA, Static Single Assignment; SWE, Software Epistemology; UD/DU, Def-Use/Use-Def Chains (dataflow graph).

  13. Sibelius. Karelia Suite, Op. 11 / Robert Layton

    Index Scriptorium Estoniae

    Layton, Robert

    1996-01-01

    On the new recording "Sibelius. Karelia Suite, Op. 11. Luonnotar, Op. 70a. Andante festivo. The Oceanides, Op. 73. King Christian II, Op. 27 - Suite. Finlandia, Op. 26a. Gothenburg Symphony Orchestra, Neeme Järvi" DG 447 760-2GH (72 minutes: DDD)

  14. FPGAs for software programmers

    CERN Document Server

    Hannig, Frank; Ziener, Daniel

    2016-01-01

    This book makes powerful Field Programmable Gate Array (FPGA) and reconfigurable technology accessible to software engineers by covering different state-of-the-art high-level synthesis approaches (e.g., OpenCL and several C-to-gates compilers). It introduces FPGA technology, its programming model, and how various applications can be implemented on FPGAs without going through low-level hardware design phases. Readers will get a realistic sense of the problems that are suited to FPGAs and how to implement them from a software designer's point of view. The authors demonstrate that FPGAs and their programming model reflect the needs of stream processing problems much better than traditional CPU or GPU architectures, making them well-suited for a wide variety of systems, from embedded systems performing sensor processing to large setups for Big Data number crunching. This book serves as an invaluable tool for software designers and FPGA design engineers who are interested in high design productivity through behavi...

  15. Modeling the Impact of Space Suit Components and Anthropometry on the Center of Mass of a Seated Crewmember

    Science.gov (United States)

    Rajulu, Sudhakar; Blackledge, Christopher; Ferrer, Mike; Margerum, Sarah

    2009-01-01

    The designers of the Orion Crew Exploration Vehicle (CEV) use an intensive simulation program to predict the launch and landing characteristics of the Crew Impact Attenuation System (CIAS). The CIAS is the energy-absorbing strut concept that dampens loads to levels sustainable by the crew during landing; it consists of the crew module seat pallet that accommodates four to six seated astronauts. An important parameter required for proper dynamic modeling of the CIAS is knowledge of the suited center of mass (COM) variations within the crew population. Significant center of mass variations across suited crew configurations would amplify the inertial effects of the pallet and potentially create unacceptable crew loading during launch and landing. Established suited, whole-body, posture-based mass properties were not available because of the uncertainty of the final CEV seat posture and suit hardware configurations. While unsuited segmental center of mass values can be obtained via regression equations from previous studies, building them into a posture-dependent model with custom anthropometry and integrated suit components proved cumbersome and time-consuming. The objective of this study was therefore to quantify the effects of posture, suit components, and the expected range of anthropometry on the center of mass of a seated individual. Several elements are required for the COM calculation of a suited human in a seated position: anthropometry; body segment mass; suit component mass; suit component location relative to the body; and joint angles defining the seated posture. Anthropometry and body segment masses used in this study were taken from a selection of three-dimensional human body models, called boundary manikins, developed in a previous project. These boundary manikins represent the critical anthropometric dimension extremes for the anticipated astronaut population. Six male manikins and six female manikins, representing a
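
    The combined COM itself is a mass-weighted mean of the segment and suit-component COM locations in a common reference frame. A small illustrative Python sketch, with toy masses and seat-frame coordinates rather than boundary-manikin values:

      import numpy as np

      # (mass_kg, com_xyz_m) pairs in a common seat reference frame; values are toy numbers.
      segments = [
          (40.0, np.array([0.00, 0.00, 0.30])),   # torso
          (10.0, np.array([0.10, 0.00, 0.05])),   # thighs, seated posture
          (5.0,  np.array([0.35, 0.00, -0.20])),  # lower legs
      ]
      suit_components = [
          (12.0, np.array([0.00, -0.05, 0.35])),  # e.g. life-support pack
          (2.0,  np.array([0.05, 0.00, 0.55])),   # helmet
      ]

      # Composite center of mass: sum(m_i * r_i) / sum(m_i).
      masses = np.array([m for m, _ in segments + suit_components])
      locations = np.array([r for _, r in segments + suit_components])
      combined_com = (masses[:, None] * locations).sum(axis=0) / masses.sum()
      print(combined_com)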

  16. Software design practice using two SCADA software packages

    DEFF Research Database (Denmark)

    Basse, K.P.; Christensen, Georg Kronborg; Frederiksen, P. K.

    1996-01-01

    Typical software development for manufacturing control is done either by specialists with considerable real-time programming experience or by adapting standard software packages for manufacturing control. After investigation and test of two commercial software packages, "InTouch" and "Fix", it is argued that a more efficient software solution can be achieved by utilising an integrated specification for SCADA and PLC programming. Experience gained from process control is planned to be investigated for discrete parts manufacturing.

  17. Evaluating Suit Fit Using Performance Degradation

    Science.gov (United States)

    Margerum, Sarah E.; Cowley, Matthew; Harvill, Lauren; Benson, Elizabeth; Rajulu, Sudhakar

    2011-01-01

    The Mark III suit has multiple sizes of suit components (arm, leg, and gloves) as well as sizing inserts to tailor the fit of the suit to an individual. This study sought to identify the point at which an ideal suit fit degrades into a poor fit, and to quantify this breakdown using mobility-based physical performance data. It examined changes in human physical performance via degradation of the elbow and wrist range of motion of the planetary suit prototype (Mark III) with respect to changes in sizing, and how to apply that knowledge to suit sizing options and improvements in suit fit. The methods focused on changes in elbow and wrist mobility due to incremental suit sizing modifications, within a range that included both optimum and poor fit. Suited range-of-motion data were collected using a motion analysis system for nine isolated and functional tasks encompassing the elbow and wrist joints. A total of four subjects were tested, with motions involving both arms simultaneously as well as the right arm only, and the results were compared across sizing configurations. The results indicate that range of motion may be used as a viable parameter to quantify the stage at which suit sizing causes a detriment in performance; however, the human performance decrement appeared to be based on the interaction of multiple joints along a limb, not on a single joint angle. The study identified a preliminary method to quantify the impact of size on performance and developed a means to gauge tolerances around optimal size. More work is needed to improve the assessment of optimal fit and to compensate for multiple joint interactions.
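
    A hedged sketch of how such a mobility-based criterion might be tabulated is shown below; the angles are entirely hypothetical, and the 10% threshold is an arbitrary stand-in for the tolerance the study sought to define.

      # Hypothetical mean peak elbow flexion (degrees) per sizing configuration.
      rom = {
          "baseline": 135.0,     # nominally well-fitted suit
          "arm_plus_1": 131.0,
          "arm_plus_2": 118.0,
          "arm_plus_3": 96.0,
      }

      baseline = rom["baseline"]
      for config, angle in rom.items():
          decrement = 100.0 * (baseline - angle) / baseline
          flag = "  <-- fit breakdown?" if decrement > 10.0 else ""
          print(f"{config:>10}: {angle:6.1f} deg  ({decrement:4.1f}% loss){flag}")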

  18. Adobe Creative Suite 6 Design and Web Premium all-in-one for dummies

    CERN Document Server

    Smith, Jennifer; Gerantabee, Fred

    2012-01-01

    The must-have book on the leading suite of software for graphic and web designers. Fully revised and updated, this hands-on resource offers a one-stop learning opportunity through eight mini-books, one dedicated to each product inside Adobe's Design & Web Premium Suite. The mini-books cover Adobe Creative Suite Basics, InDesign, Illustrator, Photoshop, Acrobat, Dreamweaver, Flash, and Fireworks. The suite itself may contain new image enhancements to After Effects, 64-bit versions of Illustrator and Flash Professional, and a new tool, dubbed Helium, that will enable designers to create content using HTML5 and

  19. ROSMOD: A Toolsuite for Modeling, Generating, Deploying, and Managing Distributed Real-time Component-based Software using ROS

    Directory of Open Access Journals (Sweden)

    Pranav Srinivas Kumar

    2016-09-01

    Full Text Available This paper presents the Robot Operating System Model-driven development tool suite (ROSMOD), an integrated development environment for rapid prototyping of component-based software for the Robot Operating System (ROS) middleware. ROSMOD is well suited to the design, development and deployment of large-scale distributed applications on embedded devices. We present the various features of ROSMOD, including the modeling language, the graphical user interface, code generators, and deployment infrastructure. We demonstrate the utility of this tool with a real-world case study: an Autonomous Ground Support Equipment (AGSE) robot that was designed and prototyped using ROSMOD for the NASA Student Launch competition, 2014–2015.
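
    For context, a component in the ROS middleware that ROSMOD targets can be as small as the following hand-written rospy node. This is ordinary ROS code, not ROSMOD-generated output, and the node and topic names are hypothetical.

      import rospy
      from std_msgs.msg import String

      def talker():
          # Publish a status message once per second on the 'status' topic.
          pub = rospy.Publisher('status', String, queue_size=10)
          rospy.init_node('agse_status_node')
          rate = rospy.Rate(1)  # 1 Hz
          while not rospy.is_shutdown():
              pub.publish(String(data='nominal'))
              rate.sleep()

      if __name__ == '__main__':
          try:
              talker()
          except rospy.ROSInterruptException:
              pass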

  20. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo
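
    One classic example of the growth models such a book surveys is the Goel-Okumoto NHPP model, in which the expected cumulative failure count is m(t) = a(1 - exp(-b t)) and the failure intensity is lambda(t) = a b exp(-b t). A small Python sketch with illustrative, unfitted parameters:

      import numpy as np

      # Goel-Okumoto parameters (illustrative, not fitted to real failure data):
      # a = total expected faults, b = per-fault detection rate.
      a, b = 120.0, 0.05

      def mean_failures(t):
          """Expected cumulative failures by time t: m(t) = a * (1 - exp(-b*t))."""
          return a * (1.0 - np.exp(-b * t))

      def failure_intensity(t):
          """Instantaneous failure rate: lambda(t) = a * b * exp(-b*t)."""
          return a * b * np.exp(-b * t)

      for t in (10, 50, 100):
          print(f"t={t:>3}: m(t)={mean_failures(t):6.1f}, lambda(t)={failure_intensity(t):.2f}")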