WorldWideScience

Sample records for bartab software tools

  1. BARCRAWL and BARTAB: software tools for the design and implementation of barcoded primers for highly multiplexed DNA sequencing

    Science.gov (United States)

    Frank, Daniel N

    2009-01-01

    Background Advances in automated DNA sequencing technology have greatly increased the scale of genomic and metagenomic studies. An increasingly popular means of increasing project throughput is by multiplexing samples during the sequencing phase. This can be achieved by covalently linking short, unique "barcode" DNA segments to genomic DNA samples, for instance through incorporation of barcode sequences in PCR primers. Although several strategies have been described to ensure that barcode sequences are unique and robust to sequencing errors, these have not been integrated into the overall primer design process, thus potentially introducing bias into PCR amplification and/or sequencing steps. Results Barcrawl is a software program that facilitates the design of barcoded primers for multiplexed high-throughput sequencing. The program bartab can be used to deconvolute DNA sequence datasets produced by the use of multiple barcoded primers. This paper describes the functions implemented by barcrawl and bartab and presents a proof-of-concept case study of both programs in which barcoded rRNA primers were designed and validated by high-throughput sequencing. Conclusion Barcrawl and bartab can benefit researchers who are engaged in metagenomic projects that employ multiplexed specimen processing. The source code is released under the GNU general public license and can be accessed at http://www.phyloware.com. PMID:19874596
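
    To make the deconvolution step concrete, here is a minimal sketch of the idea, not bartab's actual implementation; the barcodes, sample names, and reads are invented:

    ```python
    # Hypothetical demultiplexing sketch: assign reads to samples by barcode
    # prefix, then trim the barcode. All names and sequences are illustrative.
    barcodes = {"ACGT": "sample_1", "TGCA": "sample_2"}  # barcode -> sample
    BARCODE_LEN = 4

    def demultiplex(reads):
        """Bin each read by its barcode prefix and strip the barcode."""
        bins = {sample: [] for sample in barcodes.values()}
        for read in reads:
            sample = barcodes.get(read[:BARCODE_LEN])
            if sample is not None:          # reads with unknown tags are dropped
                bins[sample].append(read[BARCODE_LEN:])
        return bins

    reads = ["ACGTGGATCCA", "TGCATTTACGG", "NNNNAAAA"]
    print(demultiplex(reads))
    ```

    A real tool must additionally tolerate sequencing errors in the barcode (e.g., by choosing barcodes with a minimum mutual edit distance), which is precisely the design constraint barcrawl folds into primer design.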

  2. BARCRAWL and BARTAB: software tools for the design and implementation of barcoded primers for highly multiplexed DNA sequencing

    Directory of Open Access Journals (Sweden)

    Frank, Daniel N

    2009-10-01

    Full Text Available Abstract Background Advances in automated DNA sequencing technology have greatly increased the scale of genomic and metagenomic studies. An increasingly popular means of increasing project throughput is by multiplexing samples during the sequencing phase. This can be achieved by covalently linking short, unique "barcode" DNA segments to genomic DNA samples, for instance through incorporation of barcode sequences in PCR primers. Although several strategies have been described to ensure that barcode sequences are unique and robust to sequencing errors, these have not been integrated into the overall primer design process, thus potentially introducing bias into PCR amplification and/or sequencing steps. Results Barcrawl is a software program that facilitates the design of barcoded primers for multiplexed high-throughput sequencing. The program bartab can be used to deconvolute DNA sequence datasets produced by the use of multiple barcoded primers. This paper describes the functions implemented by barcrawl and bartab and presents a proof-of-concept case study of both programs in which barcoded rRNA primers were designed and validated by high-throughput sequencing. Conclusion Barcrawl and bartab can benefit researchers who are engaged in metagenomic projects that employ multiplexed specimen processing. The source code is released under the GNU general public license and can be accessed at http://www.phyloware.com.

  3. Tools for software visualization

    OpenAIRE

    Stojanova, Aleksandra; Stojkovic, Natasa; Bikov, Dusan

    2015-01-01

    Software visualization is a kind of computer art and, at the same time, a science of generating visual representations of different software aspects and of the software development process. There are many tools for software visualization, but we focus on some of them. In this paper, four tools are examined in detail: Jeliot 3, SRec, jGrasp and DDD. The visualizations they produce are reviewed and analyzed, and possible places for their application are mentioned. A...

  4. Software engineering tools.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1994-01-01

    We have looked at general descriptions and illustrations of several software development tools, such as tools for prototyping, developing DFDs, testing, and maintenance. Many others are available, and new ones are being developed. However, you have at least seen some examples of powerful CASE tools for systems development. PMID:10131419

  5. Machine Tool Software

    Science.gov (United States)

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using Automatically Programmed Tool (APT) software since 1969 in his CAD/CAM (Computer Aided Design and Manufacturing) curriculum. Professor Hack teaches the use of the APT programming language for the control of metal cutting machines. Machine tool instructions are geometry definitions written in the APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  6. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed, and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has increased only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for a general evaluation of computer assisted software engineering (CASE) tools, from actual installation of and experimentation with some specific tools.

  7. Modern Tools for Modern Software

    Energy Technology Data Exchange (ETDEWEB)

    Kumfert, G; Epperly, T

    2001-10-31

    This is a proposal for a new software configure/build tool for building, maintaining, deploying, and installing software. At its completion, this new tool will replace current standard tool suites such as "autoconf", "automake", "libtool", and the de facto standard build tool, "make". This ambitious project is born out of the realization that as scientific software has grown in size and complexity over the years, the difficulty of configuring and building software has increased as well. For high performance scientific software, additional complexities often arise from the need for portability to multiple platforms (including many one-of-a-kind platforms), multilanguage implementations, use of third party libraries, and a need to adapt algorithms to the specific features of the hardware. Development of scientific software is being hampered by the quality of configuration and build tools commonly available. Inordinate amounts of time and expertise are required to develop and maintain the configure and build system for a moderately complex project. Better build and configure tools will increase developer productivity. This proposal is a first step in a process of shoring up the foundation upon which DOE software is created and used.

  8. Software Release Procedure and Tools

    OpenAIRE

    Giammatteo, Gabriele; Frosini, Luca; Laskaris, Nikolas

    2015-01-01

    Deliverable D4.1 - "Software Release Procedures and Tools" aims to provide a detailed description of the procedures applied and tools used to manage releases of the gCube System within Work Package 4. The gCube System is the software at the basis of all VRE applications, data management services and portals. Given the large size of the gCube system, its high degree of modularity and the number of developers involved in the implementation, a set of procedures that formalize and simplify the integ...

  9. RainTools Software Development

    OpenAIRE

    Van Luijtelaar, Dirk Jan

    2015-01-01

    The aim of this Bachelor's thesis was to develop the RainTools software package for the customer, Stichting RIONED, and to learn about the process of software development. The main aim of this Bachelor's thesis was to learn the capabilities and possibilities of C# in combination with WPF and XAML as opposed to regular WinForms. To achieve this, a brainstorming process began to develop a user interface for the customer to translate input data to XML, feed it to a third party application, and read the resu...

  10. Design of parametric software tools

    DEFF Research Database (Denmark)

    Sabra, Jakob Borrits; Mullins, Michael

    2011-01-01

    The studies investigate the field of evidence-based design used in architectural design practice and propose a method using 2D/3D CAD applications to: 1) enhance integration of evidence-based design knowledge in architectural design phases with a focus on lighting and interior design and 2) assess… fulfilment of evidence-based design criterion regarding light distribution and location in relation to patient safety in architectural health care design proposals. The study uses the 2D/3D CAD modelling software Rhinoceros 3D with the plug-in Grasshopper to create parametric tool prototypes to exemplify the… operations and functions of the design method. To evaluate the prototype potentials, surveys with architectural and healthcare design companies are conducted. Evaluation is done by the administration of questionnaires being part of the development of the tools. The results show that architects, designers and…

  11. Tools for Embedded Computing Systems Software

    Science.gov (United States)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis and the key figures of each workshop presentation, together with the chairmen's summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  12. Software Tools Streamline Project Management

    Science.gov (United States)

    2009-01-01

    Three innovative software inventions from Ames Research Center (NETMARK, Program Management Tool, and Query-Based Document Management) are finding their way into NASA missions as well as industry applications. The first, NETMARK, is a program that enables integrated searching of data stored in a variety of databases and documents, meaning that users no longer have to look in several places for related information. NETMARK allows users to search and query information across all of these sources in one step. This cross-cutting capability in information analysis has exponentially reduced the amount of time needed to mine data from days or weeks to mere seconds. NETMARK has been used widely throughout NASA, enabling this automatic integration of information across many documents and databases. NASA projects that use NETMARK include the internal reporting system and project performance dashboard, Erasmus, NASA's enterprise management tool, which enhances organizational collaboration and information sharing through document routing and review; the Integrated Financial Management Program; International Space Station Knowledge Management; the Mishap and Anomaly Information Reporting System; and management of the Mars Exploration Rovers. Approximately $1 billion worth of NASA's projects are currently managed using the Program Management Tool (PMT), which is based on NETMARK. PMT is a comprehensive, Web-enabled application tool used to assist program and project managers within NASA enterprises in monitoring, disseminating, and tracking the progress of program and project milestones and other relevant resources. The PMT consists of an integrated knowledge repository built upon advanced enterprise-wide database integration techniques and the latest Web-enabled technologies. The current system is in a pilot operational mode, allowing users to automatically manage, track, define, update, and view customizable milestone objectives and goals. The third software invention is Query-Based Document Management.

  13. Software tools for optical interferometry

    Science.gov (United States)

    Thureau, Nathalie D.; Ireland, Michael; Monnier, John D.; Pedretti, Ettore

    2006-06-01

    We describe a set of general purpose utilities for visualizing and manipulating optical interferometry data stored in the FITS-based OIFITS data format. This class of routines contains code like the OiPlot navigation/visualization tool, which allows the user to extract visibility, closure phase and UV-coverage information from the OIFITS files and to display the information in various ways. OiPlot also has basic data model fitting capabilities which can be used for a rapid first analysis of the scientific data. More advanced image reconstruction techniques are part of a dedicated utility. In addition, these routines allow data from multiple interferometers to be combined and used together. Part of our work also aims at developing software specific to the Michigan InfraRed Combiner (MIRC). Our experience designing a flexible and robust graphical user interface based on sockets using python libraries has wide applicability, and this paper will discuss practicalities.
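
    As a hedged illustration of what navigating OIFITS data involves (this is not OiPlot code; the filename is hypothetical, and we assume the file carries a standard OI_VIS2 binary table), astropy can read the FITS tables directly:

    ```python
    # Minimal sketch: pull squared visibilities and UV coverage from an OIFITS
    # file. "data.fits" is a placeholder; OI_VIS2/VIS2DATA/UCOORD/VCOORD are
    # standard OIFITS table and column names.
    from astropy.io import fits

    with fits.open("data.fits") as hdul:
        vis2 = hdul["OI_VIS2"].data            # one row per baseline/time sample
        v2, v2err = vis2["VIS2DATA"], vis2["VIS2ERR"]
        u, v = vis2["UCOORD"], vis2["VCOORD"]  # projected baselines in metres

    for i in range(min(3, len(v2))):
        print(f"baseline ({u[i]:.1f}, {v[i]:.1f}) m: V^2 = {v2[i]} +/- {v2err[i]}")
    ```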

  14. Software Architecture Risk Assessment (SARA) Tool

    Directory of Open Access Journals (Sweden)

    K. Shaik

    2008-07-01

    Full Text Available This paper presents the Software Architecture Risk Assessment (SARA) Tool to demonstrate the process of risk assessment at the software architecture level. The prototype tool accepts different types of inputs that define software architecture. It parses these input files and produces quantitative metrics that are used to estimate the required risk factors. The final result of this process is to discern the potentially high-risk components in the software system. By manipulating the data acquired from domain experts and measures obtained from Unified Modeling Language (UML) artifacts, the SARA Tool can be used at the architecture development phase, at the design phase, or at the implementation phase of the software development process to improve the quality of the software product.
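
    As a toy illustration of architecture-level risk estimation in the spirit of this approach (the components, metric values, and weighting scheme below are invented; SARA derives its inputs from UML artifacts and domain experts):

    ```python
    # Hypothetical sketch: rank components by a risk factor combining a
    # complexity metric with a failure-severity weight.
    components = {
        "controller": {"complexity": 0.9, "severity": 0.8},
        "logger":     {"complexity": 0.3, "severity": 0.2},
        "database":   {"complexity": 0.6, "severity": 0.9},
    }

    risk = {name: m["complexity"] * m["severity"] for name, m in components.items()}
    for name, r in sorted(risk.items(), key=lambda kv: -kv[1]):
        print(f"{name}: risk factor {r:.2f}")   # flag the high-risk components
    ```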

  15. A Software Tool for Legal Drafting

    Directory of Open Access Journals (Sweden)

    Daniel Gorín

    2011-09-01

    Full Text Available Although many attempts at automated aids for legal drafting have been made, they were based on the construction of a new tool, completely from scratch. This is at least curious, considering that a strong parallelism can be established between a normative document and a software specification: both describe what an entity should or should not do, can or cannot do. In this article we compare normative documents and software specifications to find out their similarities and differences. The comparison shows that there are distinctive particularities, but they are restricted to a very specific subclass of normative propositions. The rest, we postulate, can be dealt with using software tools. For such an enterprise the FormaLex tool set was devised: an LTL-based language and companion tools that utilize model checking to find out normative incoherences in regulations, contracts and other legal documents. A feature-rich case study is analyzed with the presented tools.
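
    As a hedged illustration of the LTL-based approach (invented clauses and propositions, not actual FormaLex syntax), a clause such as "whenever goods are delivered, payment must eventually follow" and a conflicting clause "payment is never made" can be rendered as:

    ```latex
    % Two normative clauses as LTL formulas (illustrative proposition names):
    \[
      \varphi_1 = \mathbf{G}\,(\mathit{delivered} \rightarrow \mathbf{F}\,\mathit{paid}),
      \qquad
      \varphi_2 = \mathbf{G}\,\neg\mathit{paid}
    \]
    ```

    A model checker asked for a trace satisfying both formulas together with F delivered finds none, which is exactly the kind of normative incoherence between clauses such tools are meant to expose.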

  16. A Software Tool for Legal Drafting

    CERN Document Server

    Gorín, Daniel; Schapachnik, Fernando. DOI: 10.4204/EPTCS.68.7

    2011-01-01

    Although many attempts at automated aids for legal drafting have been made, they were based on the construction of a new tool, completely from scratch. This is at least curious, considering that a strong parallelism can be established between a normative document and a software specification: both describe what an entity should or should not do, can or cannot do. In this article we compare normative documents and software specifications to find out their similarities and differences. The comparison shows that there are distinctive particularities, but they are restricted to a very specific subclass of normative propositions. The rest, we postulate, can be dealt with using software tools. For such an enterprise the FormaLex tool set was devised: an LTL-based language and companion tools that utilize model checking to find out normative incoherences in regulations, contracts and other legal documents. A feature-rich case study is analyzed with the presented tools.

  17. Some software tools for scientific programming

    International Nuclear Information System (INIS)

    A number of advanced software tools are described which have been used by Logica or its clients for scientific or technical software development. These are: RAPPORT, a Relational Database Management System; A Fortran Program Analyser, which is designed to answer those questions about large Fortran Programs which are not easily answered by examining listings; A Test Coverage Monitor, which measures how well the code and branches in a Fortran program have been exercised by a set of test runs; The UNIX operating system and the tools available with it. These tools will be described with examples of their use in practice. (orig.)

  18. A Software Tool for Legal Drafting

    OpenAIRE

    Daniel Gorín; Sergio Mera; Fernando Schapachnik

    2011-01-01

    Although many attempts at automated aids for legal drafting have been made, they were based on the construction of a new tool, completely from scratch. This is at least curious, considering that a strong parallelism can be established between a normative document and a software specification: both describe what an entity should or should not do, can or cannot do. In this article we compare normative documents and software specifications to find out their similarities and differences. The comp...

  19. Westinghouse waste simulation and optimization software tool

    International Nuclear Information System (INIS)

    The Westinghouse waste simulation and optimization software tool helps the final user to identify the best process setup to achieve the overall lowest life cycle cost by providing a better insight into, and understanding of, complex interacting and integrated processes and facilities. Any waste management facility that treats radioactive waste from operating NPPs, D&D activities or legacy waste can be a subject for the software tool. Sensitivity analyses in a complex environment and process bottleneck identification, as well as detailed cost analysis and cost driver identification, are key capabilities of the simulation. Using the simulation enables virtual trial and error without risk to identify the best applicable treatment technology. The Westinghouse waste simulation and optimization software tool is built to support the user with reliable data for mature decisions. (orig.)

  20. Tool Use Within NASA Software Quality Assurance

    Science.gov (United States)

    Shigeta, Denise; Port, Dan; Nikora, Allen P.; Wilf, Joel

    2013-01-01

    As space mission software systems become larger and more complex, it is increasingly important for the software assurance effort to have the ability to effectively assess both the artifacts produced during software system development and the development process itself. Conceptually, assurance is a straightforward idea - it is the result of activities carried out by an organization independent of the software developers to better inform project management of potential technical and programmatic risks, and thus increase management's confidence in the decisions they ultimately make. In practice, effective assurance for large, complex systems often entails assessing large, complex software artifacts (e.g., requirements specifications, architectural descriptions) as well as substantial amounts of unstructured information (e.g., anomaly reports resulting from testing activities during development). In such an environment, assurance engineers can benefit greatly from appropriate tool support. In order to do so, an assurance organization will need accurate and timely information on the tool support available for various types of assurance activities. In this paper, we investigate the current use of tool support for assurance organizations within NASA, and describe on-going work at JPL for providing assurance organizations with the information about tools they need to use them effectively.

  1. Software Tools Used for Continuous Assessment

    Directory of Open Access Journals (Sweden)

    Corina SBUGHEA

    2016-04-01

    Full Text Available The present paper addresses the subject of continuous evaluation and of the IT tools that support it. The approach starts from the main concepts and methods used in the teaching process, according to the assessment methodology, and then focuses on their implementation in the Wondershare QuizCreator software.

  2. OSIRIS Software: The Mask Designer Tool

    CERN Document Server

    González-Serrano, J I; Castaneda, H; Quirk, R; De Miguel, E D; Aguiar, M; Cepa, J

    2006-01-01

    OSIRIS is a Day One instrument that will be available at the 10m GTC telescope, which is being built at the La Palma observatory in the Canary Islands. This optical instrument is designed to obtain wide-field narrow-band images using tunable filters and to do low-resolution spectroscopy in both long-slit and multislit modes. For the multislit spectroscopy mode, we have developed software to assist observers in designing focal plane masks. In this paper we describe the characteristics of this Mask Designer tool. We discuss the main design concepts, the functionality and particular features of the software.

  3. SUSTAINABLE REMEDIATION SOFTWARE TOOL EXERCISE AND EVALUATION

    Energy Technology Data Exchange (ETDEWEB)

    Kohn, J.; Nichols, R.; Looney, B.

    2011-05-12

    The goal of this study was to examine two different software tools designed to account for the environmental impacts of remediation projects. Three case studies from the Savannah River Site (SRS) near Aiken, SC were used to exercise SiteWise (SW) and Sustainable Remediation Tool (SRT) by including both traditional and novel remediation techniques, contaminants, and contaminated media. This study combined retrospective analysis of implemented projects with prospective analysis of options that were not implemented. Input data were derived from engineering plans, project reports, and planning documents with a few factors supplied from calculations based on Life Cycle Assessment (LCA). Conclusions drawn from software output were generally consistent within a tool; both tools identified the same remediation options as the 'best' for a given site. Magnitudes of impacts varied between the two tools, and it was not always possible to identify the source of the disagreement. The tools differed in their quantitative approaches: SRT based impacts on specific contaminants, media, and site geometry and modeled contaminant removal. SW based impacts on processes and equipment instead of chemical modeling. While SW was able to handle greater variety in remediation scenarios, it did not include a measure of the effectiveness of the scenario.

  4. Software Engineering Tools for Scientific Models

    Science.gov (United States)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere-ocean model called ModelE, written in fixed format Fortran.

  5. ANTHROMEDA - Software Tool for Anthropometric Examinations

    Czech Academy of Sciences Publication Activity Database

    Hanzlíček, Petr; Rexová, Patrícia; Adášková, Jana; Bláha, P.

    Prague: Charles University, Faculty of Science, 2003 - (Vignerová, J.; Riedlová, J.; Bláha, P.). s. 54 ISBN 80-86561-06-2. [International Anthropological Congress "Anthropology and Society". 22.05.2003-24.05.2003, Prague - Humpolec] R&D Projects: GA MŠk LN00B107 Keywords: anthropometry * software tool * database Subject RIV: BD - Theory of Information

  6. A Software Tool for Robust PID Design

    OpenAIRE

    Garpinger, Olof; Hägglund, Tore

    2008-01-01

    This paper presents a fast, interactive and easily modifiable software tool for robust PID design. The Matlab-based program is intended to give people with moderate knowledge of PID control a possibility to learn more, and also to be a future part of an autotuner. The PID design is made by minimizing the integrated absolute error value during a load disturbance on the process input. The optimization is performed with H-infinity constraints on the sensitivity and complementary sensitivity function...

  7. A software tool for dataflow graph scheduling

    Science.gov (United States)

    Jones, Robert L., III

    1994-01-01

    A graph-theoretic design process and software tool is presented for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described using a dataflow graph and are intended to be executed repetitively on multiple processors. The dataflow paradigm is very useful in exposing the parallelism inherent in algorithms. It provides a graphical and mathematical model which describes a partial ordering of algorithm tasks based on data precedence.
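
    A minimal sketch of how data precedence yields a schedule (the task graph below is invented, and Python's graphlib stands in for the paper's graph-theoretic machinery):

    ```python
    # Illustrative dataflow scheduling: a task becomes ready as soon as all of
    # its data predecessors have run; tasks ready together can run in parallel.
    from graphlib import TopologicalSorter  # Python 3.9+

    graph = {"filter": {"read"}, "fft": {"read"},
             "combine": {"filter", "fft"}, "write": {"combine"}}  # task -> inputs

    ts = TopologicalSorter(graph)
    ts.prepare()
    step = 0
    while ts.is_active():
        ready = list(ts.get_ready())   # tasks runnable concurrently this step
        print(f"step {step}: run {ready} on separate processors")
        ts.done(*ready)
        step += 1
    ```

    For repetitive execution, the same partial order is simply replayed each iteration, which is what makes the dataflow model attractive for multiprocessor scheduling.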

  8. Software tools for microprocessor based systems

    International Nuclear Information System (INIS)

    After a short review of the hardware and/or software tools for the development of single-chip, fixed instruction set microprocessor-based systems, we focus on the software tools for designing systems based on microprogrammed bit-sliced microprocessors. Emphasis is placed on meta-microassemblers and simulation facilities at the register-transfer level and architecture level. We review available meta-microassemblers, giving their most important features, advantages and disadvantages. We also cover extensions to higher-level microprogramming languages and associated systems specifically developed for bit-slices. In the area of simulation facilities we first discuss the simulation objectives and the criteria for choosing the right simulation language. We concentrate on simulation facilities already used in bit-slice projects and discuss the experience gained. We conclude by describing the way the Signetics meta-microassembler and the ISPS simulation tool have been employed in the design of a fast microprogrammed machine, called MICE, made out of ECL bit-slices. (orig.)

  9. Intelligent Software Tools for Advanced Computing

    Energy Technology Data Exchange (ETDEWEB)

    Baumgart, C.W.

    2001-04-03

    Feature extraction and evaluation are two procedures common to the development of any pattern recognition application. These features are the primary pieces of information which are used to train the pattern recognition tool, whether that tool is a neural network, a fuzzy logic rulebase, or a genetic algorithm. Careful selection of the features to be used by the pattern recognition tool can significantly streamline the overall development and training of the solution for the pattern recognition application. This report summarizes the development of an integrated, computer-based software package called the Feature Extraction Toolbox (FET), which can be used for the development and deployment of solutions to generic pattern recognition problems. This toolbox integrates a number of software techniques for signal processing, feature extraction and evaluation, and pattern recognition, all under a single, user-friendly development environment. The toolbox has been developed to run on a laptop computer, so that it may be taken to a site and used to develop pattern recognition applications in the field. A prototype version of this toolbox has been completed and is currently being used for applications development on several projects in support of the Department of Energy.

  10. Towards an interoperability ontology for software development tools

    OpenAIRE

    Hasni, Neji.

    2003-01-01

    Approved for public release; distribution is unlimited. The automation of software development has long been a goal of software engineering, to increase the efficiency of the development effort and improve the software product. This efficiency (high productivity with fewer software faults) results from best practices in building, managing and testing software projects via the use of these automated tools and processes. However, each software development tool has its own characteristics, semantic...

  11. ATLAS software configuration and build tool optimisation

    International Nuclear Information System (INIS)

    The ATLAS software code base is over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories on 6 continents. To meet the challenge of configuring and building this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, and dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance, which was optimised through several approaches: reduction of the number of reads of requirements files, which are now read once per package by a CMT build command that generates cached requirements files for subsequent CMT build commands; introduction of more fine-grained build parallelism at package task level, i.e., dependent applications and libraries are compiled in parallel; code optimisation of CMT commands used for build; and introduction of package-level build parallelism, i.e., parallelising the build of independent packages. By default, CMT launches NUMBER-OF-PROCESSORS build commands in parallel. The other focus was on CMT command optimisation in general, which made the commands approximately 2 times faster. CMT can generate a cached requirements file for the environment setup command, which is especially useful for deployment on distributed file systems like AFS or CERN VMFS. The use of parallelism, caching and code optimisation significantly (by several times) reduced software build time and environment setup time, and increased the efficiency of
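
    A hedged sketch of the package-level build parallelism described above (the package names, dependency graph, and build stub are invented; CMT's actual mechanism is driven by the requirements files):

    ```python
    # Packages whose dependencies are already built are compiled concurrently.
    import os
    from concurrent.futures import ThreadPoolExecutor
    from graphlib import TopologicalSorter

    deps = {"Core": set(), "Event": {"Core"},
            "Reco": {"Core"}, "Analysis": {"Event", "Reco"}}  # package -> deps

    def build(pkg):
        print(f"building {pkg}")   # stand-in for invoking the real build command

    ts = TopologicalSorter(deps)
    ts.prepare()
    with ThreadPoolExecutor(max_workers=os.cpu_count()) as pool:
        while ts.is_active():
            ready = list(ts.get_ready())     # independent packages this round
            list(pool.map(build, ready))     # build them in parallel
            ts.done(*ready)
    ```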

  12. Software Tools for Stochastic Simulations of Turbulence

    Science.gov (United States)

    Kaufman, Ryan

    We present two software tools useful for the analysis of mesh-based physics application data, and specifically for turbulent mixing simulations. Each has a broader, but separate, scope, as we describe. Both tools play a key role as we push computational science to its limits, and thus the present work contributes to the frontier of research. The first tool is Wstar, a weak* comparison tool, which addresses the stochastic nature of turbulent flow. The goal is to compare underresolved turbulent data in convergence, parameter dependence, or validation studies. This is achieved by separating space-time data from state data (e.g. density, pressure, momentum, etc.) through coarsening and sampling. The collection of fine-grained data in a single coarse cell is treated as a random sample in state space, whose cumulative distribution function defines a measure within that cell. This set of measures, with the spatial dependence defined by the coarse grid, defines a Young measure solution to the PDE. The second tool is a front tracking application programming interface (API) called FTI. It has the capability to generate geometric surfaces (e.g. the location of interspecies boundaries) of high complexity, and track them dynamically. FTI also includes the ghost fluid method, which enables mesh-based fluid codes to maintain sharpness at interspecies boundaries by modifying solution stencils that cross such a boundary. FTI outlines and standardizes the methods involved in this model. FronTier, as developed here, is a software package which implements this standard. The client must implement the physics and grid interpolation routines outlined in the client interface to FTI. Specific client programs using this interface include the weather forecasting code WRF; the high energy physics code, FLASH; and two locally constructed fluid codes, cFluid and iFluid, for compressible and incompressible flow respectively.
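
    A minimal sketch of the coarsen-and-sample idea behind the weak* comparison (not Wstar's implementation; the random field, grid sizes, and coarsening factor are invented):

    ```python
    # Fine-grid state values falling in one coarse cell are treated as a random
    # sample in state space, summarized by an empirical CDF.
    import numpy as np

    fine = np.random.rand(64, 64)   # stand-in for a fine-grid density field
    COARSEN = 16                    # 16x16 fine cells per coarse cell

    def cell_cdfs(field, k):
        """Per coarse cell, return the sorted sample defining its empirical CDF."""
        n = field.shape[0] // k
        return {(i, j): np.sort(field[i*k:(i+1)*k, j*k:(j+1)*k].ravel())
                for i in range(n) for j in range(n)}

    cdfs = cell_cdfs(fine, COARSEN)
    sample = cdfs[(0, 0)]
    print("median state value in coarse cell (0,0):", sample[len(sample) // 2])
    ```

    Two simulations can then be compared cell by cell through distances between these empirical distributions rather than pointwise, which is what makes the comparison meaningful for underresolved turbulent data.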

  13. Tool support for software lookup table optimization.

    Science.gov (United States)

    Wilcox, Chris; Strout, Michelle Mills; Bieman, James M

    2011-12-01

    A number of scientific applications are performance-limited by expressions that repeatedly call costly elementary functions. Lookup table (LUT) optimization accelerates the evaluation of such functions by reusing previously computed results. LUT methods can speed up applications that tolerate an approximation of function results, thereby achieving a high level of fuzzy reuse. One problem with LUT optimization is the difficulty of controlling the tradeoff between performance and accuracy. The current practice of manual LUT optimization adds programming effort by requiring extensive experimentation to make this tradeoff, and such hand tuning can obfuscate algorithms. In this paper we describe a methodology and tool implementation to improve the application of software LUT optimization. Our Mesa tool implements source-to-source transformations for C or C++ code to automate the tedious and error-prone aspects of LUT generation such as domain profiling, error analysis, and code generation. We evaluate Mesa with five scientific applications. Our results show a performance improvement of 3.0× and 6.9× for two molecular biology algorithms, 1.4× for a molecular dynamics program, 2.1× to 2.8× for a neural network application, and 4.6× for a hydrology calculation. We find that Mesa enables LUT optimization with more control over accuracy and less effort than manual approaches. PMID:24532963
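
    To make the technique concrete, here is a minimal hand-written sketch of the LUT idea that Mesa automates; the function, domain, and table size are invented, and Mesa's generated code is more sophisticated (it also performs the error analysis):

    ```python
    # Precompute a costly function on a uniform grid over the expected input
    # domain, then answer later calls with a nearest-entry lookup.
    import math

    LO, HI, SIZE = 0.0, 10.0, 4096
    STEP = (HI - LO) / (SIZE - 1)
    TABLE = [math.exp(-x * x) for x in (LO + i * STEP for i in range(SIZE))]

    def fast_gauss(x):
        """Approximate exp(-x^2) for x in [LO, HI] with one table lookup."""
        i = int((x - LO) / STEP + 0.5)          # nearest grid index
        return TABLE[min(max(i, 0), SIZE - 1)]  # clamp to the table domain

    x = 3.14159
    print(fast_gauss(x), math.exp(-x * x))  # accuracy traded for speed
    ```

    The table size directly controls the performance/accuracy tradeoff discussed above: a larger table lowers the approximation error at the cost of memory and cache pressure.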

  14. Tool Support for Software Lookup Table Optimization

    Directory of Open Access Journals (Sweden)

    Chris Wilcox

    2011-01-01

    Full Text Available A number of scientific applications are performance-limited by expressions that repeatedly call costly elementary functions. Lookup table (LUT) optimization accelerates the evaluation of such functions by reusing previously computed results. LUT methods can speed up applications that tolerate an approximation of function results, thereby achieving a high level of fuzzy reuse. One problem with LUT optimization is the difficulty of controlling the tradeoff between performance and accuracy. The current practice of manual LUT optimization adds programming effort by requiring extensive experimentation to make this tradeoff, and such hand tuning can obfuscate algorithms. In this paper we describe a methodology and tool implementation to improve the application of software LUT optimization. Our Mesa tool implements source-to-source transformations for C or C++ code to automate the tedious and error-prone aspects of LUT generation such as domain profiling, error analysis, and code generation. We evaluate Mesa with five scientific applications. Our results show a performance improvement of 3.0× and 6.9× for two molecular biology algorithms, 1.4× for a molecular dynamics program, 2.1× to 2.8× for a neural network application, and 4.6× for a hydrology calculation. We find that Mesa enables LUT optimization with more control over accuracy and less effort than manual approaches.

  15. Sandia software guidelines: Volume 5, Tools, techniques, and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    1989-07-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. This volume describes software tools and methodologies available to Sandia personnel for the development of software, and outlines techniques that have proven useful within the Laboratories and elsewhere. References and evaluations by Sandia personnel are included. 6 figs.

  16. Tools and Behavioral Abstraction: A Direction for Software Engineering

    Science.gov (United States)

    Leino, K. Rustan M.

    As in other engineering professions, software engineers rely on tools. Such tools can analyze program texts and design specifications more automatically and in more detail than ever before. While many tools today are applied to find new defects in old code, I predict that more software-engineering tools of the future will be available to software authors at the time of authoring. If such analysis tools can be made to be fast enough and easy enough to use, they can help software engineers better produce and evolve programs.

  17. Testing automation tools for secure software development

    OpenAIRE

    Eatinger, Christopher J.

    2007-01-01

    Software testing is a crucial step in the development of any software system, large or small. Testing can reveal the presence of logic errors and other flaws in the code that could cripple the system's effectiveness. Many flaws common in software today can also be exploited to breach the security of the system on which the software is running. These flaws can be subtle and difficult to find. Frequently it takes a combination of multiple events to bring them out. Traditional testing techni...

  18. TSSR: A Proposed Tool for Secure Software Requirement Management

    OpenAIRE

    Mohammad Ubaidullah Bokhari; Shams Tabrez Siddiqui

    2014-01-01

    This paper provides a unified framework in which the entire design of the project can be captured right from the beginning of the software development. This paper discusses the requirements which should be included in the development of requirement management tools. Based on the requirements and criteria discussed, we introduce a requirement management tool known as TSSR (Tool for Secure Software Requirement). This tool manages risk analysis, system requirements, security of the ...

  19. Towards E-CASE Tools for Software Engineering

    Directory of Open Access Journals (Sweden)

    Nabil Arman

    2013-02-01

    Full Text Available CASE tools play an important role in all phases of software systems development and engineering. This is evident in the huge benefits obtained from using these tools, including their cost-effectiveness, rapid software application development, and improved possibility of software reuse, to name just a few. In this paper, the idea of moving towards E-CASE tools, rather than traditional CASE tools, is advocated, since these E-CASE tools have all the benefits and advantages of traditional CASE tools and add to that all the benefits of web technology. This is presented by focusing on the role of E-CASE tools in facilitating the trend of telecommuting and virtual workplaces among software engineering and information technology professionals. In addition, E-CASE tools integrate smoothly with the trend of E-learning in conducting software engineering courses. Finally, two surveys were conducted for a group of software engineering professionals and students of software engineering courses. The surveys show that E-CASE tools are of great value to both communities of students and professionals of software engineering.

  20. Herramientas libres para modelar software (Free tools to model software)

    Directory of Open Access Journals (Sweden)

    Mauro Callejas Cuervo; Óscar Yovany Baquero Moreno

    2010-11-01

    Full Text Available An observation on free software and its implication in software development processes with 4G tools, on the part of entities or individuals without astronomical capital and without the monopolising mentality of dominating the market with costly products that make them multimillionaires, products that offer no real guarantee, nor even the possibility of knowing the software one has paid for, much less of modifying it if it does not meet our expectations.

  1. Herramientas libres para modelar software (Free tools to model software)

    OpenAIRE

    Mauro Callejas Cuervo; Óscar Yovany Baquero Moreno

    2010-01-01

    An observation on free software and its implication in software development processes with 4G tools, on the part of entities or individuals without astronomical capital and without the monopolising mentality of dominating the market with costly products that make them multimillionaires, products that offer no real guarantee, nor even the possibility of knowing the software one has paid for, much less of modifying it if it does not meet our expectations.

  2. TAUS: A File-Based Software Understanding Tool

    Institute of Scientific and Technical Information of China (English)

    费翔林; 汪承藻; et al.

    1990-01-01

    A program called TAUS, a Tool for Analyzing and Understanding Software, was developed. It is designed to help the programmer analyze and understand the software interactively. Its aim is to reduce the dependence on human intelligence in software understanding and improve the programmer's understanding productivity. The design and implementation of TAUS and its applications are described.

  3. Hyperdev: Hypertext tool to support object-oriented software development

    International Nuclear Information System (INIS)

    The authors propose a software tool, based on hypertext techniques, to support the object-oriented development of scientific applications. Within HyperDev, all kinds of software information, such as plain text, formatted text, graphics and code, are connected through links, allowing for different views of the same object and, consequently, achieving a better understanding of the software components.

  4. Caesy: A software tool for computer-aided engineering

    Science.gov (United States)

    Wette, Matt

    1993-01-01

    A new software tool, Caesy, is described. This tool provides a strongly typed programming environment for research in the development of algorithms and software for computer-aided control system design. A description of the user language and its implementation as they currently stand are presented along with a description of work in progress and areas of future work.

  5. Estimation of toxicity using a Java based software tool

    Science.gov (United States)

    A software tool has been developed that will allow a user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be accessed using a web browser (or alternatively downloaded and run as a stand-alone applic...

  6. EISA 432 Energy Audits Best Practices: Software Tools

    Energy Technology Data Exchange (ETDEWEB)

    Maryl Fisher

    2014-11-01

    Five whole building analysis software tools that can aid an energy manager with fulfilling energy audit and commissioning/retro-commissioning requirements were selected for review in this best practices study. A description of each software tool is provided as well as a discussion of the user interface and level of expertise required for each tool, a review of how to use the tool for analyzing energy conservation opportunities, the format and content of reports generated by the tool, and a discussion on the applicability of the tool for commissioning.

  7. Software Tools for Fault Management Technologies Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Fault Management (FM) is a key requirement for safety, efficient onboard and ground operations, maintenance, and repair. QSI's TEAMS Software suite is a leading...

  8. Educational Software as a Learning Tool for Primary School Students

    OpenAIRE

    Vannucci, Marco; Colla, Valentina

    2010-01-01

    In this chapter the role and the significance of educational software as a learning tool is discussed. In particular, its features are highlighted, together with the advantages provided to the users (both learners and teachers) by the use of multimedia and interaction with the user, which are offered by the latest generations of these tools. The evolution of educational software is outlined, drawing attention to the technological development which, since the first appearance of learning tools ...

  9. Technology Transfer Challenges for High-Assurance Software Engineering Tools

    Science.gov (United States)

    Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.

    2003-01-01

    In this paper, we describe our experience with the challenges that we are currently facing in our effort to develop advanced software verification and validation tools. We categorize these challenges into several areas: cost benefits modeling, tool usability, customer application domain, and organizational issues. We provide examples of challenges in each area and identify open research issues in areas which limit our ability to transfer high-assurance software engineering tools into practice.

  10. TSSR: A Proposed Tool for Secure Software Requirement Management

    Directory of Open Access Journals (Sweden)

    Mohammad Ubaidullah Bokhari

    2014-12-01

    Full Text Available This paper provides a unified framework in which the entire design of the project can be captured right from the beginning of the software development. This paper discusses the requirements which should be included in the development of requirement management tools. Based on the requirements and criteria discussed, we introduce a requirement management tool known as TSSR (Tool for Secure Software Requirement). This tool manages risk analysis, system requirements, security of the system and project, users/group restriction, encrypted database, traceability and extension of the tool to interact with external requirement management tools. The aim of this paper is to describe the TSSR framework and its four components: Planner, Modeller, Prover and Documenter, which will be helpful in interacting with and managing requirements across an arbitrary number of external tools for secure software development.

  11. beSMART : a software tool to support the selection of decision software

    OpenAIRE

    Tereso, Anabela Pereira; Sampaio, Ana; Frade, Hugo; Costa, Miguel; Abreu, Tiago

    2011-01-01

    This paper presents a tool whose goal is to aid the user in choosing the best decision support software, from a set of software tools residing in a database, according to the desired features and using multicriteria decision methods. This application was developed using the C# programming language and allows the user to save all data to a file, such as the set of decision software tools under consideration and their features, for later use.

  12. Criteria and tools for scientific software quality measurements

    International Nuclear Information System (INIS)

    Not all software used in the nuclear industry needs the rigorous formal verification, reliability testing and quality assessment that are being applied to safety-critical software. Recently, however, there is increasing recognition that systematic and objective quality assessment of the scientific software used in design and safety analyses of nuclear facilities is necessary to support safety and licensing decisions. Because of the complexity and large size of these programs and the resource constraints faced by the AECB reviewer, it is desirable that appropriate automated tools are used wherever practical. To objectively assess the quality of software, a set of attributes of a software product by which its quality is described and evaluated must be established. These attributes must be relevant to the application domain of the software under evaluation. To effectively assess the quality of software, metrics defining a quantitative scale and a method appropriate to determine the value of attributes need to be applied. To cost-effectively perform the evaluation, the use of suitable automated tools is desirable. In this project, criteria for evaluating the quality of scientific software are presented; metrics by which those criteria can be evaluated are identified; a survey of automated tools to measure those metrics was conducted and the most appropriate tool (QA Fortran) was acquired; and the tool usage was demonstrated on three sample programs. (author) 5 refs
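
    As a hedged illustration of what such a quantitative metric looks like (an invented example; QA Fortran computes far richer metrics over real code):

    ```python
    # Toy metric: comment density of fixed-form Fortran source. The source
    # snippet is a stand-in, not from the project described above.
    SOURCE = """C     Solve for the flux
          REAL FUNCTION FLUX(X)
    C     Simple placeholder body
          FLUX = 2.0 * X
          END
    """

    lines = [l for l in SOURCE.splitlines() if l.strip()]
    comments = [l for l in lines
                if l[:1] in ("C", "c", "*") or l.lstrip().startswith("!")]
    print(f"comment density: {len(comments) / len(lines):.2f}")
    ```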

  13. Innovative Software Tools Measure Behavioral Alertness

    Science.gov (United States)

    2014-01-01

    To monitor astronaut behavioral alertness in space, Johnson Space Center awarded Philadelphia-based Pulsar Informatics Inc. SBIR funding to develop software to be used onboard the International Space Station. Now used by the government and private companies, the technology has increased revenues for the firm by an average of 75 percent every year.

  14. Innovative Software Algorithms and Tools parallel sessions summary

    International Nuclear Information System (INIS)

    A variety of results were presented in the poster and 5 parallel sessions of the Innovative Software, Algorithms and Tools (ISAT) sessions. I will briefly summarize these presentations and attempt to identify some unifying trends

  15. Computer- Aided Design in Power Engineering Application of Software Tools

    CERN Document Server

    Stojkovic, Zlatan

    2012-01-01

    This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation. It guides the reader through complex problems, presented on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations of power systems equipment design. Several design example calculations are carried out using engineering tools like MATLAB, EMTP/ATP, Excel & Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation, using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of a power system. In the fourth chapter, the application of software tools in project management in power systems ...

  16. VISAGE information software a new tool in exploration

    Energy Technology Data Exchange (ETDEWEB)

    Ross, Elsie

    2011-12-15

    VISAGE Information Solutions is a Calgary-based software company which went commercial 7 years ago. The company developed a tool which can reduce the initial exploration risk for oil and gas companies by providing access to, and analysis and interpretation of, hundreds of thousands of wells in Western Canada. This powerful software offers a very large data set and allows customers to look at many different plays in a few minutes; it is possible to group data by type of play, operator or drilling contractor to make accurate decisions. Elkhorn Resources Inc., a small entity, has used VISAGE's software to build its business plan and has benefited from the tool by examining the lessons other companies have learned in drilling their wells. VISAGE's tool is easy-to-use and affordable software which allows engineers to concentrate on analyzing and exploring data rather than gathering it; the tool is continually evolving and improving.

  17. ISWHM: Tools and Techniques for Software and System Health Management

    Science.gov (United States)

    Schumann, Johann; Mengshoel, Ole J.; Darwiche, Adnan

    2010-01-01

    This presentation presents status and results of research on Software Health Management done within the NRA "ISWHM: Tools and Techniques for Software and System Health Management." Topics include: Ingredients of a Guidance, Navigation, and Control System (GN and C); Selected GN and C Testbed example; Health Management of major ingredients; ISWHM testbed architecture; and Conclusions and next Steps.

  18. Developing a Decision Support System: The Software and Hardware Tools.

    Science.gov (United States)

    Clark, Phillip M.

    1989-01-01

    Describes some of the available software and hardware tools that can be used to develop a decision support system implemented on microcomputers. Activities that should be supported by software are discussed, including data entry, data coding, finding and combining data, and data compatibility. Hardware considerations include speed, storage…

  19. Researchers create free, downloadable software radio design tool

    OpenAIRE

    Crumbley, Liz

    2004-01-01

    The Mobile and Portable Radio Research Group (MPRG) in Virginia Tech's Bradley Department of Electrical and Computer Engineering has developed the fundamental software for use in designing software radios and is offering this tool free to other wireless communications researchers throughout the world.

  20. iPhone examination with modern forensic software tools

    Science.gov (United States)

    Höne, Thomas; Kröger, Knut; Luttenberger, Silas; Creutzburg, Reiner

    2012-06-01

    The aim of the paper is to show the usefulness of modern forensic software tools for iPhone examination. In particular, we focus on the new version of Elcomsoft iOS Forensic Toolkit and compare it with Oxygen Forensics Suite 2012 regarding functionality, usability and capabilities. It is shown how these software tools work and how capable they are of examining non-jailbroken and jailbroken iPhones.

  1. Software tool for xenon gamma-ray spectrometer control

    Science.gov (United States)

    Chernysheva, I. V.; Novikov, A. S.; Shustov, A. E.; Dmitrenko, V. V.; Pyae Nyein, Sone; Petrenko, D.; Ulin, S. E.; Uteshev, Z. M.; Vlasik, K. F.

    2016-02-01

    A software tool, "Acquisition and processing of gamma-ray spectra", was developed for the control of xenon gamma-ray spectrometers. It supports a multi-window interface. The tool provides acquisition of gamma-ray spectra from a xenon gamma-ray detector via USB or RS-485 interfaces, directly or via the TCP-IP protocol; energy calibration of gamma-ray spectra; and saving of gamma-ray spectra to disk.
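
    As a hedged illustration of one listed function, energy calibration, here is a minimal two-point linear calibration sketch; the channel/energy pairs are invented, not taken from the instrument:

    ```python
    # Map ADC channel number to energy (keV) from two reference photopeaks.
    def make_calibration(ch1, e1, ch2, e2):
        """Return a channel -> keV function fitted through two peaks."""
        gain = (e2 - e1) / (ch2 - ch1)
        offset = e1 - gain * ch1
        return lambda ch: gain * ch + offset

    # e.g. Cs-137 (661.7 keV) and Co-60 (1332.5 keV) peaks found at these
    # (hypothetical) channel positions:
    to_kev = make_calibration(662, 661.7, 1333, 1332.5)
    print(to_kev(1000))  # energy assigned to channel 1000 by this calibration
    ```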

  2. Some Interactive Aspects of a Software Design Schema Acquisition Tool

    Science.gov (United States)

    Lee, Hing-Yan; Harandi, Mehdi T.

    1991-01-01

    This paper describes a design schema acquisition tool which forms an important component of a hybrid software design system for reuse. The hybrid system incorporates schema-based approaches in supporting software design reuse activities and is realized by extensions to the IDeA system. The paper also examines some of the interactions that the tool requires with the domain analyst to accomplish its acquisition task.

  3. Flow sheeting software as a tool when teaching Chemical Engineering

    OpenAIRE

    Abbas, Asad

    2011-01-01

    The aim of this thesis is to design different chemical processes using flow sheeting software and to show the usefulness of flow sheeting software as an educational tool. The industries studied are hydrogen, sulfur, nitric acid and ethylene glycol production, and a model of a drying technique is also included. First, ChemCAD is introduced as a tool for teaching chemical processes, followed by an explanation of each industry selected for design. Various production methods for each...

  4. Generating DEM from LIDAR data - comparison of available software tools

    Science.gov (United States)

    Korzeniowska, K.; Lacka, M.

    2011-12-01

    In recent years many software tools and applications have appeared that offer procedures, scripts and algorithms to process and visualize ALS data. This variety of software tools and of "point cloud" processing methods contributed to the aim of this study: to assess algorithms available in various software tools that are used to classify LIDAR "point cloud" data, through a careful examination of Digital Elevation Models (DEMs) generated from LIDAR data on the basis of these algorithms. The work focused on the most important available software tools, both commercial and open source. Two sites in a mountain area were selected for the study. The area of each site is 0.645 sq km. DEMs generated with the analysed software tools were compared with a reference dataset, generated using manual methods to eliminate non-ground points. Surfaces were analysed using raster analysis. Minimum, maximum and mean differences between the reference DEM and the DEMs generated with the analysed software tools were calculated, together with the Root Mean Square Error. Differences between DEMs were also examined visually using transects along the grid axes in the test sites.
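
    The abstract does not give the exact computation, but the comparison it describes reduces to differencing two raster grids. A minimal sketch in Python/NumPy, assuming the reference and tool-generated DEMs are already loaded as equally-shaped 2D arrays (the synthetic rasters below are invented stand-ins):

        import numpy as np

        def dem_difference_stats(reference, generated):
            """Min/max/mean difference and RMSE between two equal-shape DEMs."""
            diff = generated - reference
            return {
                "min": float(np.nanmin(diff)),
                "max": float(np.nanmax(diff)),
                "mean": float(np.nanmean(diff)),
                "rmse": float(np.sqrt(np.nanmean(diff ** 2))),
            }

        # Synthetic rasters standing in for real LIDAR-derived DEMs
        reference = 50.0 * np.random.rand(100, 100)
        generated = reference + np.random.normal(0.0, 0.3, reference.shape)
        print(dem_difference_stats(reference, generated))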

  5. Claire, a simulation and testing tool for critical softwares

    International Nuclear Information System (INIS)

    The needs of the CEA and IPSN (Institute of Nuclear Protection and Safety) concerning the testing of critical software have led to the development of the CLAIRE tool, which is able to test software without modification. This tool allows the user to graphically model the system and its environment and to include components in the model which observe, but do not modify, the behaviour of the system to be tested. The executable codes are integrated in the model. The tool uses target machine simulators (microprocessors). The technique used (event simulation) allows actions to be associated with events such as the execution of an instruction, the access to a variable etc. The simulation results are exploited using graphic, state-search and test coverage measurement tools. In particular, this tool can help in the evaluation of critical software with pre-existing components. (J.S.)

  6. The evolution of CACSD tools-a software engineering perspective

    DEFF Research Database (Denmark)

    Ravn, Ole; Szymkat, Maciej

    1992-01-01

    The earlier evolution of computer-aided control system design (CACSD) tools is discussed from a software engineering perspective. A model of the design process is presented as the basis for principles and requirements of future CACSD tools. Combinability, interfacing in memory, and an open ... workspace are seen as important concepts in CACSD. Some points are made about the problem of buy or make when new software is required, and the idea of buy and make is put forward. Emphasis is put on the time perspective and the life cycle of the software...

  7. Tool for Measuring Coupling in Object- Oriented Java Software

    Directory of Open Access Journals (Sweden)

    Mr. V. S. Bidve

    2016-04-01

    The importance of object-oriented software metrics is increasing day by day in evaluating and predicting the quality of software. Coupling is one such object-oriented metric: the degree to which one program module depends on other modules. Coupling measures play a significant role in the quality of object-oriented software, from design up to maintenance. To correctly predict the quality factors of object-oriented software, coupling should be accurately measured. In the related literature we find many techniques to measure coupling, but no author has explained the implementation of his technique(s) in detail or made a tool available showing how exactly coupling has been measured. In this paper, we propose a tool for measurement of coupling among classes of Java software. Java source code is taken as input to the tool. The input Java code is parsed, and tokens are extracted. These tokens, along with the code, are used to measure different types of coupling in Java software. Coupling of different sample Java codes is measured with the tool to observe values of each coupling type.
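
    The paper does not publish its parser, but the token-based idea can be pictured in a few lines. In this rough Python sketch (a crude regex-based approximation, not the authors' implementation), a mention of class B inside the body of class A counts as one A -> B coupling link:

        import re
        from collections import defaultdict

        def crude_coupling(java_source):
            """Approximate inter-class coupling by counting class-name tokens."""
            class_names = re.findall(r'\bclass\s+(\w+)', java_source)
            # Split the source into per-class chunks at each class declaration
            bodies = re.split(r'\bclass\s+\w+', java_source)[1:]
            coupling = defaultdict(int)
            for name, body in zip(class_names, bodies):
                tokens = re.findall(r'\w+', body)
                for other in class_names:
                    if other != name:
                        coupling[(name, other)] += tokens.count(other)
            return dict(coupling)

        source = """
        class Engine { void run() {} }
        class Car { Engine e = new Engine(); void go() { e.run(); } }
        """
        print(crude_coupling(source))  # {('Engine', 'Car'): 0, ('Car', 'Engine'): 2}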

  8. Management of Astronomical Software Projects with Open Source Tools

    Science.gov (United States)

    Briegel, F.; Bertram, T.; Berwein, J.; Kittmann, F.

    2010-12-01

    In this paper we offer an innovative approach to managing the software development process with free open source tools: a system for building and automated testing that automates the compile/test cycle on a variety of platforms to validate code changes, virtualization to compile in parallel on various operating system platforms, version control and change management, an enhanced wiki and issue tracking system for online documentation and reporting, and groupware tools such as blog, discussion and calendar. Initially, starting with the Linc-Nirvana instrument, a new project and configuration management tool for developing astronomical software was sought. After evaluation of various systems of this kind, we are satisfied with the selection we are using now. Following the lead of Linc-Nirvana, most of the other software projects at the MPIA are now using it.

  9. Software engineering and data management for automated payload experiment tool

    Science.gov (United States)

    Maddux, Gary A.; Provancha, Anna; Chattam, David

    1994-01-01

    The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by UAH and provide versions of the software in a Macintosh and Windows compatible format.

  10. PAnalyzer: A software tool for protein inference in shotgun proteomics

    Directory of Open Access Journals (Sweden)

    Prieto Gorka

    2012-11-01

    Background: Protein inference from peptide identifications in shotgun proteomics must deal with ambiguities that arise due to the presence of peptides shared between different proteins, which is common in higher eukaryotes. Recently, data independent acquisition (DIA) approaches have emerged as an alternative to the traditional data dependent acquisition (DDA) in shotgun proteomics experiments. MSE is the term used to name one of the DIA approaches used in QTOF instruments. MSE data require specialized software to process acquired spectra and to perform peptide and protein identifications. However, the software available at the moment does not group the identified proteins in a transparent way by taking into account peptide evidence categories. Furthermore, the inspection, comparison and reporting of the obtained results require tedious manual intervention. Here we report a software tool to address these limitations for MSE data. Results: In this paper we present PAnalyzer, a software tool focused on the protein inference process of shotgun proteomics. Our approach considers all the identified proteins and groups them when necessary, indicating their confidence using different evidence categories. PAnalyzer can read protein identification files in the XML output format of the ProteinLynx Global Server (PLGS) software provided by Waters Corporation for their MSE data, and also in the mzIdentML format recently standardized by HUPO-PSI. Multiple files can also be read simultaneously and are considered as technical replicates. Results are saved to CSV, HTML and mzIdentML (in the case of a single mzIdentML input file) files. An MSE analysis of a real sample is presented to compare the results of PAnalyzer and ProteinLynx Global Server. Conclusions: We present a software tool to deal with the ambiguities that arise in the protein inference process. Key contributions are support for MSE data analysis by ProteinLynx Global Server and technical replicates
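
    The grouping step can be pictured with a toy example. The Python sketch below builds ambiguity groups from shared peptides; the rule shown (merging proteins whose peptide evidence is identical) is one simple convention, not necessarily PAnalyzer's exact evidence categories:

        from collections import defaultdict

        def group_proteins(peptide_to_proteins):
            """Group proteins that are indistinguishable by their peptide evidence."""
            # Invert the mapping: protein -> set of peptides identifying it
            protein_peptides = defaultdict(set)
            for peptide, proteins in peptide_to_proteins.items():
                for protein in proteins:
                    protein_peptides[protein].add(peptide)
            # Proteins with exactly the same peptide set form one ambiguity group
            groups = defaultdict(list)
            for protein, peptides in protein_peptides.items():
                groups[frozenset(peptides)].append(protein)
            return list(groups.values())

        evidence = {                    # invented peptide identifications
            "PEPTIDEA": ["P1", "P2"],   # shared between P1 and P2
            "PEPTIDEB": ["P1", "P2"],
            "PEPTIDEC": ["P3"],         # unique to P3
        }
        print(group_proteins(evidence))  # [['P1', 'P2'], ['P3']]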

  11. Automotive Software Engineering. Fundamentals, processes, methods, tools; Automotive Software Engineering. Grundlagen, Prozesse, Methoden und Werkzeuge

    Energy Technology Data Exchange (ETDEWEB)

    Schaeuffele, J.; Zurawka, T.

    2003-07-01

    The book presents fundamentals and practical examples of processes, methods and tools to ensure the safe operation of electronic systems and software in motor vehicles. The focus is on the electronic systems of the powertrain, suspension and car body. Contents: The overall system of car, driver and environment; Fundamentals; Processes for the development of electronic systems and software; Methods and tools for the development, production and servicing of electronic systems. The book addresses staff members of motor car producers and suppliers of electronic systems and software, as well as students of computer science, electrical and mechanical engineering specializing in car engineering, control engineering, mechatronics and software engineering.

  12. Validation of a software dependability tool via fault injection experiments

    OpenAIRE

    Tagliaferri, Luca; Benso, Alfredo; Di Carlo, Stefano; Di Natale, Giorgio; Prinetto, Paolo Ernesto

    2001-01-01

    Presents the validation of the strategies employed in the RECCO tool to analyze C/C++ software; the RECCO compiler scans C/C++ source code to extract information about the significance of the variables that populate the program and the code structure itself. Experimental results gathered on an Open Source Router are used to compare and correlate two sets of critical variables, one obtained by fault injection experiments, and the other by applying the RECCO tool. Then the two sets...

  13. The Systems Biology Research Tool: evolvable open-source software

    Directory of Open Access Journals (Sweden)

    Wright Jeremiah

    2008-06-01

    Background: Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system level. Results: We introduce a free, easy-to-use, open-source, integrated software platform called the Systems Biology Research Tool (SBRT) to facilitate the computational aspects of systems biology. The SBRT currently performs 35 methods for analyzing stoichiometric networks and 16 methods from fields such as graph theory, geometry, algebra, and combinatorics. New computational techniques can be added to the SBRT via process plug-ins, providing a high degree of evolvability and a unifying framework for software development in systems biology. Conclusion: The Systems Biology Research Tool represents a technological advance for systems biology. This software can be used to make sophisticated computational techniques accessible to everyone (including those with no programming ability), to facilitate cooperation among researchers, and to expedite progress in the field of systems biology.
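
    The process plug-in mechanism is described only at a high level; as a generic Python illustration of the pattern (not the SBRT's actual API), new analysis methods can register themselves with a core dispatcher so the platform grows without modifying existing code:

        # Minimal plug-in registry: analysis methods register under a name and
        # the core dispatches by that name, so new techniques can be added
        # without touching the core.
        PLUGINS = {}

        def register(name):
            def wrap(func):
                PLUGINS[name] = func
                return func
            return wrap

        @register("reaction_count")
        def reaction_count(network):
            return len(network["reactions"])

        @register("metabolite_count")
        def metabolite_count(network):
            return len(network["metabolites"])

        def run(name, network):
            return PLUGINS[name](network)

        toy_network = {"reactions": ["R1", "R2"], "metabolites": ["A", "B", "C"]}
        print(run("reaction_count", toy_network))    # 2
        print(run("metabolite_count", toy_network))  # 3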

  14. Software tools for quay crane exploitation and training

    OpenAIRE

    Dragomir Cristina; Breazu Alina

    2011-01-01

    One of the objectives of berth crane operations management in ports is the maximization of berth crane productivity, matched with the vessels' requirement of minimizing waiting times. This paper presents several management software tools for berth crane operations in ports that are used for achieving such an objective.

  15. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify the relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), an area of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
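
    The abstract does not reproduce the detection algorithm; purely as an illustration of the underlying idea, a profile-based detector can flag points where the along-stream slope changes abruptly (the threshold and the synthetic profile below are invented for the example):

        import numpy as np

        def find_knickpoints(distance, elevation, threshold=0.05):
            """Indices where the local slope of a stream profile changes abruptly.

            distance, elevation: 1D arrays sampled along the drainage profile.
            threshold: minimum slope change treated as a relief breakpoint.
            """
            slope = np.gradient(elevation, distance)
            slope_change = np.abs(np.diff(slope))
            return np.where(slope_change > threshold)[0] + 1

        # Synthetic profile with a sharp gradient break around x = 500 m
        distance = np.linspace(0.0, 1000.0, 101)
        elevation = np.where(distance < 500.0,
                             300.0 - 0.05 * distance,
                             275.0 - 0.20 * (distance - 500.0))
        print(find_knickpoints(distance, elevation))  # indices near the break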

  16. Proposing a Mathematical Software Tool in Physics Secondary Education

    Science.gov (United States)

    Baltzis, Konstantinos B.

    2009-01-01

    MathCad® is a very popular software tool for mathematical and statistical analysis in science and engineering. Its low cost, ease of use, extensive function library, and worksheet-like user interface distinguish it among other commercial packages. Its features are also well suited to educational process. The use of natural mathematical notation…

  17. Software simulation: a tool for enhancing control system design

    International Nuclear Information System (INIS)

    The creation, implementation and management of engineering design tools are important to the quality and efficiency of any large engineering project. Some of the most complicated tools to develop are system simulators. The development and implementation of system simulators to support replacement fuel handling control systems is of particular interest to the Canadian nuclear industry given the current age of installations and the risk of obsolescence to many utilities. The use of such simulator tools has been known to significantly improve successful deployment of new software packages and maintenance-related software changes while reducing the time required for their overall development. Moreover, these simulation systems can also serve as operator training stations and provide a virtual environment for site engineers to test operational changes before they are uploaded to the actual system. (author)

  18. Software simulation: a tool for enhancing control system design

    Energy Technology Data Exchange (ETDEWEB)

    Sze, B.; Ridgway, G.H., E-mail: beatrice.sze@ge.com, E-mail: guy.ridgway@ge.com [GE Hitachi Nuclear Energy Canada, Peterborough, Ontario (Canada)

    2008-07-01

    The creation, implementation and management of engineering design tools are important to the quality and efficiency of any large engineering project. Some of the most complicated tools to develop are system simulators. The development and implementation of system simulators to support replacement fuel handling control systems is of particular interest to the Canadian nuclear industry given the current age of installations and the risk of obsolescence to many utilities. The use of such simulator tools has been known to significantly improve successful deployment of new software packages and maintenance-related software changes while reducing the time required for their overall development. Moreover, these simulation systems can also serve as operator training stations and provide a virtual environment for site engineers to test operational changes before they are uploaded to the actual system. (author)

  19. A Brief Review of Software Tools for Pangenomics

    Institute of Scientific and Technical Information of China (English)

    Jingfa Xiao; Zhewen Zhang; Jiayan Wu; Jun Yu

    2015-01-01

    Since the proposal for pangenomic study, there have been a dozen software tools actively in use for pangenomic analysis. By the end of 2014, Panseq and the pan-genomes analysis pipeline (PGAP) ranked as the top two most popular packages according to cumulative citations of peer-reviewed scientific publications. The functions of the software packages and tools, albeit variable among them, include categorizing orthologous genes, calculating pangenomic profiles, integrating gene annotations, and constructing phylogenies. As epigenomic elements are being gradually revealed in prokaryotes, it is expected that pangenomic databases and toolkits have to be extended to handle information of detailed functional annotations for genes and non-protein-coding sequences including non-coding RNAs, insertion elements, and conserved structural elements. To develop better bioinformatic tools, user feedback and integration of novel features are both of essence.

  20. Software tools at the Rome CMS/ECAL Regional Center

    CERN Document Server

    Organtini, G

    2001-01-01

    The construction of the CMS electromagnetic calorimeter is under way in Rome and at CERN. For this purpose, two Regional Centers were set up at the two sites. In Rome, the project was entirely carried out using new software technologies such as object oriented programming, object databases, CORBA programming and Web tools. It can be regarded as a use case for the evaluation of the benefits of new software technologies in high energy physics. Our experience is positive and encouraging for the future. (10 refs).

  1. CLAIRE, an event-driven simulation tool for testing software

    International Nuclear Information System (INIS)

    CLAIRE is a software tool created to perform validations on executable codes or on specifications of distributed real-time applications for nuclear safety. CLAIRE can be used both to verify the safety properties by modelling the specifications, and also to validate the final code by simulating the behaviour of its equipment and software interfaces. It can be used to observe and provide dynamic control of the simulation process, and also to record changes to the simulated data for off-line analysis. (R.P.)

  2. COSTMODL: An automated software development cost estimation tool

    Science.gov (United States)

    Roush, George B.

    1991-01-01

    The cost of developing computer software continues to consume an increasing portion of many organizations' total budgets, both in the public and private sector. As this trend develops, the capability to produce reliable estimates of the effort and schedule required to develop a candidate software product takes on increasing importance. The COSTMODL program was developed to provide an in-house capability to perform development cost estimates for NASA software projects. COSTMODL is an automated software development cost estimation tool which incorporates five cost estimation algorithms, including the latest models for the Ada language and incrementally developed products. The principal characteristic which sets COSTMODL apart from other software cost estimation programs is its capacity to be completely customized to a particular environment. The estimation equations can be recalibrated to reflect the programmer productivity characteristics demonstrated by the user's organization, and the set of significant factors which affect software development costs can be customized to reflect any unique properties of the user's development environment. Careful use of a capability such as COSTMODL can significantly reduce the risk of cost overruns and failed projects.
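
    COSTMODL's own equations are not given in the abstract, but the recalibration idea can be shown with a generic COCOMO-style model, effort = a * KLOC^b, whose coefficients are refit to an organization's completed projects (the coefficients and history below are invented):

        import math

        def effort(kloc, a, b):
            """COCOMO-style estimate: person-months = a * KLOC ** b."""
            return a * kloc ** b

        def recalibrate(projects):
            """Refit (a, b) by least squares in log space.

            projects: list of (kloc, actual_person_months) pairs.
            """
            xs = [math.log(k) for k, _ in projects]
            ys = [math.log(pm) for _, pm in projects]
            n = len(projects)
            mx, my = sum(xs) / n, sum(ys) / n
            b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
                 / sum((x - mx) ** 2 for x in xs))
            a = math.exp(my - b * mx)
            return a, b

        history = [(10, 25), (50, 180), (120, 500)]  # invented past projects
        a, b = recalibrate(history)
        print(f"a={a:.2f}, b={b:.2f}; 80 KLOC -> {effort(80, a, b):.0f} person-months")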

  3. Software Tools to Support the Assessment of System Health

    Science.gov (United States)

    Melcher, Kevin J.

    2013-01-01

    This presentation provides an overview of three software tools that were developed by the NASA Glenn Research Center to support the assessment of system health: the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES), the Systematic Sensor Selection Strategy (S4), and the Extended Testability Analysis (ETA) tool. Originally developed to support specific NASA projects in aeronautics and space, these software tools are currently available to U.S. citizens through the NASA Glenn Software Catalog. The ProDiMES software tool was developed to support a uniform comparison of propulsion gas path diagnostic methods. Methods published in the open literature are typically applied to dissimilar platforms with different levels of complexity. They often address different diagnostic problems and use inconsistent metrics for evaluating performance. As a result, it is difficult to perform a one-to-one comparison of the various diagnostic methods. ProDiMES solves this problem by serving as a theme problem to aid in propulsion gas path diagnostic technology development and evaluation. The overall goal is to provide a tool that will serve as an industry standard, and will truly facilitate the development and evaluation of significant Engine Health Management (EHM) capabilities. ProDiMES has been developed under a collaborative project of The Technical Cooperation Program (TTCP) based on feedback provided by individuals within the aircraft engine health management community. The S4 software tool provides a framework that supports the optimal selection of sensors for health management assessments. S4 is structured to accommodate user-defined applications, diagnostic systems, search techniques, and system requirements/constraints, and identifies one or more sensor suites that maximize diagnostic performance while meeting other user-defined system requirements. S4 provides a systematic approach for evaluating combinations of sensors to determine the set or sets of

  4. Software Tools for Electrical Quality Assurance in the LHC

    CERN Document Server

    Bednarek, Mateusz

    2011-01-01

    There are over 1600 superconducting magnet circuits in the LHC machine. Many of them consist of a large number of components electrically connected in series. This enhances the sensitivity of the whole circuits to electrical faults of individual components. Furthermore, circuits are equipped with a large number of instrumentation wires, which are exposed to accidental damage or swapping. In order to ensure safe operation, an Electrical Quality Assurance (ELQA) campaign is needed after each thermal cycle. Due to the complexity of the circuits, as well as their distant geographical distribution (tunnel of 27km circumference divided in 8 sectors), suitable software and hardware platforms had to be developed. The software combines an Oracle database, LabView data acquisition applications and PHP-based web follow-up tools. This paper describes the software used for the ELQA of the LHC.

  5. The Open2-Innova8ion Tool - A Software Tool for Rating Organisational Innovation Performance

    OpenAIRE

    Caird, Sally; Hallett, Stephen; Potter, Stephen

    2013-01-01

    The Open2-Innova8ion Tool is an interactive, multi-media, web-based software tool for rating organisational innovation performance. This tool was designed for organisations to use as an adaptation of the European Commission’s work on developing empirical measures of national innovation performance with the Summary Innovation Index (SII). It is designed for users with experience of employment in an organisation, from senior managers to all types of employees, with an interest in rating the inn...

  6. Development to requirements for a procedures software tool

    International Nuclear Information System (INIS)

    In 1989, the Electric Power Research Institute (EPRI) and the Central Research Institute of the Electric Power Industry (CRIEPI) in Japan initiated a joint research program to investigate various interventions to reduce personnel errors and inefficiencies in the maintenance of nuclear power plants. This program, consisting of several interrelated projects, was initiated because of the mutual recognition of the importance of the human element in the efficient and safe operation of utilities and the continuing need to enhance personnel performance to sustain plant safety and availability. This paper summarizes one of the projects, jointly funded by EPRI and CRIEPI, to analyze the requirements for, and prepare a functional description of, a procedures software tool (PST). The primary objective of this project was to develop a description of the features and functions of a software tool that would help procedure writers to improve the quality of maintenance and testing procedures, thereby enhancing the performance of both procedure writers and maintenance personnel

  7. Proposing a Mathematical Software Tool in Physics Secondary Education

    Directory of Open Access Journals (Sweden)

    Konstantinos B. Baltzis

    2009-03-01

    MathCad® is a very popular software tool for mathematical and statistical analysis in science and engineering. Its low cost, ease of use, extensive function library, and worksheet-like user interface distinguish it among other commercial packages. Its features are also well suited to educational process. The use of natural mathematical notation and built-in measurement units are its two major advantages in teaching and learning. In this paper, its complementary use in the upper secondary physics education in Greece is explored. In order to demonstrate its application in the teaching process, a set of representative examples are presented. The main features and advantages of the software are also pointed out. The paper aims to present the benefits of the application of mathematical information technology tools in secondary physics education. In this effort, MathCad® is probably the most promising solution.

  8. Development of the software generation method using model driven software engineering tool

    Energy Technology Data Exchange (ETDEWEB)

    Jang, H. S.; Jeong, J. C.; Kim, J. H.; Han, H. W.; Kim, D. Y.; Jang, Y. W. [KOPEC, Taejon (Korea, Republic of); Moon, W. S. [NEXTech Inc., Seoul (Korea, Republic of)

    2003-10-01

    Methodologies to generate automated software design specifications and source code for nuclear I&C system software using a model driven language are developed in this work. For qualitative analysis of the algorithm, the activity diagram is modeled and generated using the Unified Modeling Language (UML), and then the sequence diagram is designed for automated source code generation. For validation of the generated code, code audits and module tests are performed using a test and QA tool. The code coverage and complexity of the example code are examined at this stage. The low pressure pressurizer reactor trip module of the Plant Protection System was programmed as the subject for this task. The test results show that errors in the generated source code were easily detected by the test tool. The accuracy of input/output processing by the execution modules was clearly identified.

  9. Software tool for assessment of food contamination and food bans regulations

    Czech Academy of Sciences Publication Activity Database

    Pecha, Petr; Hofman, Radek; Pechová, E.

    Freising, Germany: EUROSIS-ETI Publication; Fraunhofer IVV, Freising, 2012, s. 1-5. ISBN 978-90-77381-72-4. [FOODSIM 2012. Freising (DE), 18.06.2012-20.06.2012] R&D Projects: GA MV(CZ) VG20102013018 Institutional support: RVO:67985556 Keywords : INGESTION * DOSE * ESTIMATION Subject RIV: AQ - Safety, Health Protection, Human - Machine http://library.utia.cas.cz/separaty/2012/AS/pecha-software tool for assessmentof food contamination and food bans regulations.pdf

  10. Nucleonica: Web-based Software Tools for Simulations and Analysis

    OpenAIRE

    Magill, Joseph; DREHER Raymond; SOTI Zsolt; LASCHE George

    2012-01-01

    The authors present a description of a new web-based software portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data res...

  11. Handling Complex Configurations in Software Product Lines: a Tooled Approach

    OpenAIRE

    Urli, Simon; Blay-Fornarino, Mireille; Collet, Philippe

    2014-01-01

    As Software Product Lines (SPLs) are now more widely applied in new application fields such as IT or Web systems, complex and large-scale configurations have to be handled. In these fields, the strong domain orientation leads to the need to manage interrelated SPLs and multiple instances of configured sub-products, resulting in complex configurations that cannot be easily represented by simple sets of features. In this paper we propose a tooled approach to manage such SPLs through a domain ...

  12. Software Information Base(SIB)and Its Integration with Data Flow Diagram(DFD)Tool

    Institute of Scientific and Technical Information of China (English)

    董士海

    1989-01-01

    Software information base is the main technique for the integration of a software engineering environment. The data flow diagram tool is an important software tool to support the software requirement analysis phase. This article first introduces the functions and structures of a Software Information Base (SIB) and a Data Flow Diagram tool. The E-R data model of the SIB and its integration with the Data Flow Diagram tool are described in detail.

  13. Building a High-Level Process Model for Soliciting Requirements on Software Tools to Support Software Development : Experience Report

    OpenAIRE

    Bider, Ilia; Karapantelakis, Athanasios; Khadka, Nirjal

    2013-01-01

    Use of software tools to support business processes is both a possibility and a necessity for large and small enterprises of today. Given the variety of tools on the market, the question arises of how to choose the right tools for the process in question, or how to analyze the suitability of the tools already employed. The paper presents an experience report on using a high-level business process model for analyzing software tool suitability at a large ICT organization that recently transitioned ...

  14. Tool Support for Parametric Analysis of Large Software Simulation Systems

    Science.gov (United States)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
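
    As a toy version of the case-generation strategy described above (parameter names and levels are invented, and no attempt is made to minimize the covering array), the Python sketch below guarantees that every value pair of every two parameters is exercised, then pads the suite with pure Monte Carlo draws:

        import itertools
        import random

        random.seed(0)

        LEVELS = {  # invented discrete settings for three simulation parameters
            "controller_gain": [0.5, 1.0, 2.0],
            "sensor_noise": ["low", "high"],
            "actuator_lag": [0.0, 0.1],
        }

        def pairwise_cases(levels, extra_random=5):
            """Cover all value pairs of all parameter pairs, plus random cases."""
            names = list(levels)
            cases = []
            for p1, p2 in itertools.combinations(names, 2):
                for v1, v2 in itertools.product(levels[p1], levels[p2]):
                    case = {n: random.choice(levels[n]) for n in names}
                    case[p1], case[p2] = v1, v2
                    cases.append(case)
            for _ in range(extra_random):
                cases.append({n: random.choice(levels[n]) for n in names})
            return cases

        for case in pairwise_cases(LEVELS)[:4]:
            print(case)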

  15. COMSY - A Software Tool for Aging and Plant Life Management

    International Nuclear Information System (INIS)

    A plant-wide and systematic Aging and Plant Life Management is essential for the safe operation and/or availability of nuclear power plants. Aging Management (AM) has the objective to monitor and control degradation effects for safety relevant Systems, Structures and Components (SSCs) which may compromise the safety functions of the plant. The Plant Life Management (PLM) methodology also includes aging surveillance for availability relevant SSCs. AM and PLM cover mechanical components, electrical and I&C systems and civil structures. All Aging and Plant Life Management rules call for a comprehensive approach, requiring the systematic collection of various aging and safety relevant data on a plant-wide basis. This data needs to be serviced and periodically evaluated. Due to the complexity of the process, this activity needs to be supported by a qualified software tool for the management of aging relevant data and associated documents (approx. 30 000 SSCs). In order to support power plant operators, AREVA NP has developed the software tool COMSY. The COMSY software with its integrated AM modules enables the design and setup of a knowledge-based power plant model compatible with the requirements of international and national rules (e.g. IAEA Safety Guide NS-G-2.12, KTA 1403). In this process, a key task is to identify and monitor degradation mechanisms. For this purpose the COMSY tool provides prognosis and trending functions, which are based on more than 30 years of experience in the evaluation of degradation effects and numerous experimental studies. Since 1998 COMSY has been applied successfully in more than fifty reactor units in this field. The current version 3.0 was revised completely and offers additional AM functions. All aging-relevant component data are compiled and allocated via an integrated power plant model. Owing to existing interfaces to other software solutions and flexible import functions, COMSY is highly compatible with already existing data

  16. STARS software tool for analysis of reliability and safety

    International Nuclear Information System (INIS)

    This paper reports on the STARS (Software Tool for the Analysis of Reliability and Safety) project, which aims at developing an integrated set of Computer Aided Reliability Analysis tools for the various tasks involved in systems safety and reliability analysis, including hazard identification, qualitative analysis, logic model construction and evaluation. Expert system technology offers the most promising perspective for developing a Computer Aided Reliability Analysis tool. Combined with graphics and analysis capabilities, it can provide a natural engineering-oriented environment for computer assisted reliability and safety modelling and analysis. For hazard identification and fault tree construction, a frame/rule based expert system is used, in which the deductive (goal driven) reasoning and the heuristics applied during manual fault tree construction are modelled. Expert systems can explain their reasoning so that the analyst can become aware of why and how results are being obtained. Hence, the learning aspect involved in manual reliability and safety analysis can be maintained and improved.

  17. Classroom Live: a software-assisted gamification tool

    Science.gov (United States)

    de Freitas, Adrian A.; de Freitas, Michelle M.

    2013-06-01

    Teachers have come to rely on a variety of approaches in order to elicit and sustain student interest in the classroom. One particular approach, known as gamification, seeks to improve student engagement by transforming the traditional classroom experience into a competitive multiplayer game. Initial attempts at classroom gamification relied on the teacher manually tracking student progress. At the US Air Force Academy, we wanted to experiment with a software gamification tool. Our client/server suite, dubbed Classroom Live, streamlines the gamification process for the teacher by simplifying common tasks. Simultaneously, the tool provides students with an esthetically pleasing user interface that offers in game rewards in exchange for their participation. Classroom Live is still in development, but our initial experience using the tool has been extremely positive and confirms our belief that students respond positively to gamification, even at the undergraduate level.

  18. A NEO population generation and observation simulation software tool

    Science.gov (United States)

    Müller, Sven; Gelhaus, Johannes; Hahn, Gerhard; Franco, Raffaella

    One of the main targets of ESA's Space Situational Awareness (SSA) program is to build a wide knowledge base about objects that can potentially harm Earth (Near-Earth Objects, NEOs). An important part of this effort is to create the Small Bodies Data Centre (SBDC) which is going to aggregate measurement data from a fully-integrated NEO observation sensor network. Until this network is developed, artificial NEO measurement data is needed in order to validate SBDC algorithms. Moreover, to establish a functioning NEO observation sensor network, it has to be determined where to place sensors, what technical requirements have to be met in order to be able to detect NEOs and which observation strategies work best. Because of this, a sensor simulation software was needed. This paper presents a software tool which allows users to create and analyse NEO populations and to simulate and analyse population observations. It is a console program written in Fortran and comes with a Graphical User Interface (GUI) written in Java and C. The tool can be divided into the components "Population Generator" and "Observation Simulator". The Population Generator component is responsible for generating and analysing a NEO population. Users can choose between creating fictitious (random) and synthetic populations. The latter are based on one of two models describing the orbital and size distribution of observed NEOs: the existing so-called "Bottke Model" (Bottke et al. 2000, 2002) and the new "Granvik Model" (Granvik et al. 2014, in preparation) which has been developed in parallel to the tool. Generated populations can be analysed by defining 2D, 3D and scatter plots using various NEO attributes. As a result, the tool creates the appropriate files for the plotting tool "gnuplot". The tool's Observation Simulator component provides the Observation Simulation and Observation Analysis functions. Users can define sensor systems using ground- or space-based locations as well as

  19. Software tool for horizontal-axis wind turbine simulation

    Energy Technology Data Exchange (ETDEWEB)

    Vitale, A.J. [Instituto Argentino de Oceanografia, Camino La Carrindanga Km. 7, 5 CC 804, B8000FWB Bahia Blanca (Argentina); Universidad Tecnologica Nacional Facultad Regional Bahia Blanca, GESE, 11 de Abril 461, B8000LMI Bahia Blanca (Argentina); Dpto. de Ing. Electrica y de Computadoras, Universidad Nacional del Sur, Av. Alem 1253, 8000 Bahia Blanca (Argentina); Rossi, A.P. [Universidad Tecnologica Nacional Facultad Regional Bahia Blanca, GESE, 11 de Abril 461, B8000LMI Bahia Blanca (Argentina); Dpto. de Ing. Electrica y de Computadoras, Universidad Nacional del Sur, Av. Alem 1253, 8000 Bahia Blanca (Argentina)

    2008-07-15

    The main problem of a wind turbine generator design project is the design of the right blades capable of satisfying the specific energy requirement of an electric system with optimum performance. Once the blade has been designed for optimum operation at a particular rotor angular speed, it is necessary to determine the overall performance of the rotor under the range of wind speed that it will encounter. A software tool that simulates low-power, horizontal-axis wind turbines was developed for this purpose. With this program, the user can calculate the rotor power output for any combination of wind and rotor speeds, with definite blade shape and airfoil characteristics. The software also provides information about distribution of forces along the blade span, for different operational conditions. (author)
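
    The abstract omits the governing equations; the standard actuator-disc relation such a simulator typically builds on is P = 1/2 * rho * A * Cp(lambda) * v^3, where lambda is the tip-speed ratio. A minimal Python sketch with an invented power-coefficient curve:

        import math

        RHO = 1.225  # air density, kg/m^3

        def power_coefficient(tip_speed_ratio):
            """Invented Cp(lambda) curve peaking near lambda = 7, below the Betz limit."""
            return 0.45 * math.exp(-((tip_speed_ratio - 7.0) ** 2) / 8.0)

        def rotor_power(wind_speed, rotor_speed_rpm, radius):
            """Rotor power output (W) for a given wind speed and rotor speed."""
            omega = rotor_speed_rpm * 2.0 * math.pi / 60.0
            tsr = omega * radius / wind_speed        # tip-speed ratio (lambda)
            area = math.pi * radius ** 2             # swept area
            return 0.5 * RHO * area * power_coefficient(tsr) * wind_speed ** 3

        for v in (4.0, 6.0, 8.0, 10.0):  # wind speeds in m/s
            print(v, round(rotor_power(v, rotor_speed_rpm=160.0, radius=2.5)))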

  20. Northwestern University Schizophrenia Data and Software Tool (NUSDAST)

    Directory of Open Access Journals (Sweden)

    Lei Wang

    2013-11-01

    The schizophrenia research community has invested substantial resources on collecting, managing and sharing large neuroimaging datasets. As part of this effort, our group has collected high resolution magnetic resonance (MR) datasets from individuals with schizophrenia, their non-psychotic siblings, healthy controls and their siblings. This effort has resulted in a growing resource, the Northwestern University Schizophrenia Data and Software Tool (NUSDAST), an NIH-funded data sharing project to stimulate new research. This resource resides on XNAT Central, and it contains neuroimaging (MR scans, landmarks and surface maps for deep subcortical structures, and FreeSurfer cortical parcellation and measurement data), cognitive (cognitive domain scores for crystallized intelligence, working memory, episodic memory, and executive function), clinical (demographic, sibling relationship, SAPS and SANS psychopathology), and genetic (20 polymorphisms) data, collected from more than 450 subjects, most with 2-year longitudinal follow-up. A neuroimaging mapping, analysis and visualization software tool, CAWorks, is also part of this resource. Moreover, in making our existing neuroimaging data along with the associated meta-data and computational tools publicly accessible, we have established a web-based information retrieval portal that allows the user to efficiently search the collection. This research-ready dataset meaningfully combines neuroimaging data with other relevant information, and it can be used to help facilitate advancing neuroimaging research. It is our hope that this effort will help to overcome some of the commonly recognized technical barriers in advancing neuroimaging research such as lack of local organization and standard descriptions.

  1. Northwestern University Schizophrenia Data and Software Tool (NUSDAST).

    Science.gov (United States)

    Wang, Lei; Kogan, Alex; Cobia, Derin; Alpert, Kathryn; Kolasny, Anthony; Miller, Michael I; Marcus, Daniel

    2013-01-01

    The schizophrenia research community has invested substantial resources on collecting, managing and sharing large neuroimaging datasets. As part of this effort, our group has collected high resolution magnetic resonance (MR) datasets from individuals with schizophrenia, their non-psychotic siblings, healthy controls and their siblings. This effort has resulted in a growing resource, the Northwestern University Schizophrenia Data and Software Tool (NUSDAST), an NIH-funded data sharing project to stimulate new research. This resource resides on XNAT Central, and it contains neuroimaging (MR scans, landmarks and surface maps for deep subcortical structures, and FreeSurfer cortical parcellation and measurement data), cognitive (cognitive domain scores for crystallized intelligence, working memory, episodic memory, and executive function), clinical (demographic, sibling relationship, SAPS and SANS psychopathology), and genetic (20 polymorphisms) data, collected from more than 450 subjects, most with 2-year longitudinal follow-up. A neuroimaging mapping, analysis and visualization software tool, CAWorks, is also part of this resource. Moreover, in making our existing neuroimaging data along with the associated meta-data and computational tools publicly accessible, we have established a web-based information retrieval portal that allows the user to efficiently search the collection. This research-ready dataset meaningfully combines neuroimaging data with other relevant information, and it can be used to help facilitate advancing neuroimaging research. It is our hope that this effort will help to overcome some of the commonly recognized technical barriers in advancing neuroimaging research such as lack of local organization and standard descriptions. PMID:24223551

  2. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens

    2004-01-01

    This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended to be used as part of an industrial systems design cycle. Structural analysis is a graph-based technique where principal relations between variables express the system's properties. Measured and controlled quantities in the system are related to variables through functional relations, which need only be stated as names; their explicit composition need not be described to the tool. The user enters a list of these relations that together describe the entirety of the system; these relations constitute the nodes of the graph. SaTool analyses the structure graph to provide knowledge about fundamental properties of the system in normal and faulty conditions. Salient features of SaTool include rapid analysis of the possibility to diagnose faults and the ability to make autonomous recovery should faults occur.

  3. Advanced software tools for digital loose part monitoring systems

    International Nuclear Information System (INIS)

    The paper describes two software modules as analysis tools for digital loose part monitoring systems. The first module, called the acoustic module, utilizes the multi-media features of modern personal computers to replay the digitally stored short-time bursts with sufficient length and in good quality. This is possible due to the so-called puzzle technique developed at ISTec. The second module, called the classification module, calculates advanced burst parameters and classifies the acoustic events into pre-defined classes with the help of an artificial multi-layer perceptron neural network trained with the backpropagation algorithm. (author). 7 refs, 7 figs
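
    A sketch of the classification step using scikit-learn's multi-layer perceptron, which is trained with backpropagation (the burst features, class labels and network size below are invented, and the library stands in for the module's own network implementation):

        import numpy as np
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(0)

        # Invented burst parameters: [peak amplitude, duration (ms), dominant freq (kHz)]
        background = rng.normal([0.2, 5.0, 20.0], 0.1, size=(50, 3))
        loose_part = rng.normal([1.0, 2.0, 5.0], 0.1, size=(50, 3))
        X = np.vstack([background, loose_part])
        y = np.array([0] * 50 + [1] * 50)  # 0 = background noise, 1 = loose-part burst

        clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
        clf.fit(X, y)

        new_burst = [[0.9, 2.1, 5.4]]
        print(clf.predict(new_burst))  # -> [1], classified as a loose-part event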

  4. HARP - A Software Tool for Decision Support during Nuclear Emergencies

    Czech Academy of Sciences Publication Activity Database

    Pecha, Petr; Hofman, Radek

    Bologna : University og Bologna, 2009, s. 81-82. [Handling Complexity and Uncertainty in Environmental Studies. Bologna (IT), 05.07.2009-09.07.2009] R&D Projects: GA ČR(CZ) GA102/07/1596 Institutional research plan: CEZ:AV0Z10750506 Keywords : pollution propagation * uncertainty analysis * population protection Subject RIV: AQ - Safety, Health Protection, Human - Machine http://library.utia.cas.cz/separaty/2009/AS/pecha-harp-a software tool for decision support during nuclear emergenccies.pdf

  5. Object-Oriented Software Tools for the Construction of Preconditioners

    Directory of Open Access Journals (Sweden)

    Eva Mossberg

    1997-01-01

    In recent years, there has been considerable progress concerning preconditioned iterative methods for large and sparse systems of equations arising from the discretization of differential equations. Such methods are particularly attractive in the context of high-performance (parallel) computers. However, the implementation of a preconditioner is a nontrivial task. The focus of the present contribution is on a set of object-oriented software tools that support the construction of a family of preconditioners based on fast transforms. By combining objects of different classes, it is possible to conveniently construct any preconditioner within this family.
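
    To illustrate the composition idea in a few lines (this Python sketch mirrors the object-oriented style described, not the paper's actual class library, and uses a simple damped Jacobi preconditioner in place of the fast-transform family):

        import numpy as np

        class Jacobi:
            """Diagonal (Jacobi) preconditioner: M^{-1} r = r / diag(A)."""
            def __init__(self, A):
                self.inv_diag = 1.0 / np.diag(A)
            def apply(self, r):
                return self.inv_diag * r

        class Damped:
            """Scale the output of another preconditioner by a damping factor."""
            def __init__(self, inner, omega):
                self.inner, self.omega = inner, omega
            def apply(self, r):
                return self.omega * self.inner.apply(r)

        # Objects share one interface (apply), so they combine freely
        A = np.array([[4.0, 1.0],
                      [1.0, 3.0]])
        M = Damped(Jacobi(A), omega=0.9)
        print(M.apply(np.array([1.0, 1.0])))  # preconditioned residual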

  6. Commissioning software tools at the Advanced Photon Source

    International Nuclear Information System (INIS)

    A software tool-oriented approach has been adopted in the commissioning of the Advanced Photon Source (APS) at Argonne National Laboratory, particularly in the commissioning of the Positron Accumulator Ring (PAR). The general philosophy is to decompose a complicated procedure involving measurement, data processing, and control into a series of simpler steps, each accomplished by a generic toolkit program. The implementation is greatly facilitated by adopting SDDS (the self-describing data set protocol), which comes with its own toolkit. The combined toolkit has made accelerator physics measurements easier. For instance, the measurement of the optical functions of the PAR and the beamlines connected to it has been largely automated. Complicated measurements are feasible with a combination of tools running independently

  7. A software tool for graphically assembling damage identification algorithms

    Science.gov (United States)

    Allen, David W.; Clough, Joshua A.; Sohn, Hoon; Farrar, Charles R.

    2003-08-01

    At Los Alamos National Laboratory (LANL), various algorithms for structural health monitoring problems have been explored in the last 5 to 6 years. The original DIAMOND (Damage Identification And MOdal aNalysis of Data) software was developed as a package of modal analysis tools with some frequency domain damage identification algorithms included. Since the conception of DIAMOND, the Structural Health Monitoring (SHM) paradigm at LANL has been cast in the framework of statistical pattern recognition, promoting data driven damage detection approaches. To reflect this shift and to allow user-friendly analyses of data, a new piece of software, DIAMOND II is under development. The Graphical User Interface (GUI) of the DIAMOND II software is based on the idea of GLASS (Graphical Linking and Assembly of Syntax Structure) technology, which is currently being implemented at LANL. GLASS is a Java based GUI that allows drag and drop construction of algorithms from various categories of existing functions. In the platform of the underlying GLASS technology, DIAMOND II is simply a module specifically targeting damage identification applications. Users can assemble various routines, building their own algorithms or benchmark testing different damage identification approaches without writing a single line of code.

  8. Software Tools for In-Situ Documentation of Built Heritage

    Science.gov (United States)

    Smars, P.

    2013-07-01

    The paper presents open source software tools developed by the author to facilitate in-situ documentation of architectural and archæological heritage. The design choices are exposed and related to a general issue in conservation and documentation: taking decisions about a valuable object under threat. The question of the level of objectivity is central to the three steps of this process. It is our belief that in-situ documentation has to be favoured in this demanding context, full of potential discoveries. The very powerful surveying techniques in rapid development nowadays enhance our vision but often tend to bring a critical part of the documentation process back to the office. The software presented facilitates a direct treatment of the data on the site. Emphasis is given to flexibility, interoperability and simplicity. Key features of the software are listed and illustrated with examples (3D model of Gothic vaults, analysis of the shape of a column, deformation of a wall, direct interaction with AutoCAD).

  9. START: A software tool for robotized maintenance operations

    International Nuclear Information System (INIS)

    EDF is in charge of the operation and maintenance of 58 nuclear reactors in France. The company has carried out a project, named START, aiming at using robots for maintenance in order to reduce costs and exposure time and to improve operations quality. The main result of the START project is the development of a software tool used to rapidly design the control system of any robot or dedicated machine, so as to promptly answer unplanned maintenance demands. The START software allows to design a wide variety of control systems, dedicated to very different machines, and for a large range of applications. The first application of the START software has been to control a robot used to perform a welding on a reactor vessel penetration tube. Another application consisted in renewing the control system of a mobile robot, which is driven between the two vessels of the Super Phenix fast breeder reactor in order to remotely control the welds. The next application is expected to be the design of the control system of a robot used to decontaminate and repair the pressurizer of a PWR. (author)

  10. Software tools to aid Pascal and Ada program design

    Energy Technology Data Exchange (ETDEWEB)

    Jankowitz, H.T.

    1987-01-01

    This thesis describes a software tool which analyses the style and structure of Pascal and Ada programs by ensuring that some minimum design requirements are fulfilled. The tool is used in much the same way as a compiler is used to teach students the syntax of a language, only in this case issues related to the design and structure of the program are of paramount importance. The tool operates by analyzing the design and structure of a syntactically correct program, automatically generating a report detailing changes that need to be made in order to ensure that the program is structurally sound. The author discusses how the model gradually evolved from a plagiarism detection system which extracted several measurable characteristics of a program to a model that analyzed the style of Pascal programs. In order to incorporate more sophisticated concepts like data abstraction, information hiding and data protection, this model was then extended to analyze the composition of Ada programs. The Ada model takes full advantage of facilities offered in the language, and by using this tool the standard and quality of written programs is raised whilst the fundamental principles of program design are grasped through a process of self-tuition.

  11. The Software Improvement Process - Tools And Rules To Encourage Quality

    CERN Document Server

    Sigerud, K

    2011-01-01

    The Applications section of the CERN accelerator Controls group has decided to apply a systematic approach to quality assurance (QA), the “Software Improvement Process”, SIP. This process focuses on three areas: the development process itself, suitable QA tools, and how to practically encourage developers to do QA. For each stage of the development process we have agreed on the recommended activities and deliverables, and identified tools to automate and support the task. For example we do more code reviews. As peer reviews are resource-intensive, we only do them for complex parts of a product. As a complement, we are using static code checking tools, like FindBugs and Checkstyle. We also encourage unit testing and have agreed on a minimum level of test coverage recommended for all products, measured using Clover. Each of these tools is well integrated with our IDE (Eclipse) and give instant feedback to the developer about the quality of their code. The major challenges of SIP have been to 1) agree on com...

  12. Evaluating, selecting and assessing the relevance of software tools in technology monitoring

    Directory of Open Access Journals (Sweden)

    Óscar Fernando Castellanos Domínguez

    2010-07-01

    Full Text Available The current setting for industrial and entrepreneurial development has posed the need for incorporating differentiating elements into the production apparatus, leading to anticipating technological change. Technology monitoring (TM) emerges as a methodology focused on analysing these changes to identify challenges and opportunities, being mainly supported by information technology (IT) through the search for, capture and analysis of data and information. This article proposes criteria for choosing and efficiently using software tools having different characteristics, requirements, capacities and costs which could be used in monitoring. An approach is made to different TM models, emphasising the identification and analysis of different information sources for covering and supporting information access and monitoring. Some evaluation, selection and analysis criteria are given for using these types of tools according to each production system’s individual profile and needs. Some of the existing software packages available on the market for carrying out monitoring projects are described, relating them to their complexity, process characteristics and cost.

  13. Learning Photogrammetry with Interactive Software Tool PhoX

    Science.gov (United States)

    Luhmann, T.

    2016-06-01

    Photogrammetry is a complex topic in high-level university teaching, especially in the fields of geodesy, geoinformatics and metrology, where high-quality results are demanded. In addition, more and more black-box solutions for 3D image processing and point cloud generation are available that easily generate nice results, e.g. by structure-from-motion approaches. Within this context, the classical approach of teaching photogrammetry (e.g. focusing on aerial stereophotogrammetry) has to be reformed in order to educate students and professionals in new topics and provide them with more information on what happens behind the scenes. For around 20 years, photogrammetry courses at the Jade University of Applied Sciences in Oldenburg, Germany, have included the use of digital photogrammetry software that provides individual exercises, deep analysis of calculation results and a wide range of visualization tools for almost all standard tasks in photogrammetry. In recent years the software package PhoX has been developed as part of a new didactic concept in photogrammetry and related subjects. It also serves as an analysis tool in recent research projects. PhoX consists of a project-oriented data structure for images, image data, measured points and features, and 3D objects. It provides almost all basic photogrammetric measurement tools, image processing, calculation methods, graphical analysis functions, simulations and much more. Students use the program to conduct predefined exercises where they have the opportunity to analyse results in a high level of detail. This includes the analysis of statistical quality parameters but also the meaning of transformation parameters, rotation matrices, calibration and orientation data. As one specific advantage, PhoX allows for the interactive modification of single parameters and the direct view of the resulting effect in image or object space.
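
    One of the parameters whose interactive modification the record highlights is the rotation matrix. As a minimal sketch of the underlying computation, the following builds a 3D rotation from omega-phi-kappa angles, a convention common in photogrammetric orientation; the record does not state which convention PhoX itself uses, so this is illustrative only.

    ```python
    import numpy as np

    def rotation_matrix(omega: float, phi: float, kappa: float) -> np.ndarray:
        """3D rotation from omega-phi-kappa angles (radians), a convention
        commonly used for photogrammetric image orientation."""
        co, so = np.cos(omega), np.sin(omega)
        cp, sp = np.cos(phi), np.sin(phi)
        ck, sk = np.cos(kappa), np.sin(kappa)
        Rx = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])  # about X
        Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])  # about Y
        Rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])  # about Z
        return Rx @ Ry @ Rz

    # Vary a single angle and observe the effect on an object-space direction,
    # mimicking the kind of interactive parameter study described above.
    R = rotation_matrix(np.radians(5), np.radians(0), np.radians(90))
    print(R @ np.array([1.0, 0.0, 0.0]))
    ```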

  14. Hardware replacements and software tools for digital control computers

    International Nuclear Information System (INIS)

    computers which use 'Varian' technology. A new software program, Desk Top Tools, permits the designer greater flexibility in digital control computer software design and testing. This software development allows the user to emulate control of the CANDU reactor, system by system. All discussions will highlight the ability of the replacements and the new developments to enhance the operation of the existing and 'repeat' plant digital control computers, and will explore future applications of these developments. Examples of current use of all replacement components and software are provided. (author)

  15. A Survey Identifying Trends on Use of Software Development Tools in Different Indian SMEs

    Directory of Open Access Journals (Sweden)

    Nomi Baruah Ashima

    2012-10-01

    Full Text Available Software Process Improvement is the identification of the current state of practice of processes within an organization, followed by their improvement. Software Process Improvement is an everlasting, never-ending and ever-changing process. Some of the issues that force an organization to undergo software process improvement are customer dissatisfaction, inadequate software quality, inability to deliver on time and within budget, and excessive rework. SMEs use software process improvement models, but they are not able to follow all the processes due to the lack of resources and the cost required to improve the productivity and quality of their products. A survey of 18 SMEs catering to the software market was carried out to examine software development scenarios. The intent of the study was to find the prevailing tools and techniques the SMEs use to automate the software development process and to incorporate software project management. The survey identifies four different types of software development tools which are proving to be effective in the current software development scenario: Requirement Management Tools, Process Modelling Tools, Software Configuration Management Tools and Cost Estimating Tools. This paper summarizes the trends in the usage of software development tools in SMEs, presented graphically as well.

  16. A survey on open source software testing tools: a preliminary study in 2011

    Science.gov (United States)

    Emami, Seyed Amir; Sim, Jason Chin Lung; Sim, Kwan Yong

    2011-12-01

    Software testing is a costly and time-consuming process in software development. Therefore, software testing tools are often deployed to automate the process in order to reduce cost and improve efficiency. However, many of them are proprietary and expensive. Hence, open source software testing tools can be an appealing alternative. In this paper, we survey the current state of open source software testing tools from three aspects, namely their availability for different programming platforms and types of testing activities, the maintenance of the tools, and license limitations. From the 152 tools surveyed, we found that open source software testing tools are not only widely available for popular programming platforms but also support a wide range of testing activities. Furthermore, we found that more than half of the tools surveyed have been actively maintained and updated by the open source communities. Finally, these tools have very few licensing limitations for commercial use, customization and redistribution.

  17. User Guide for the STAYSL PNNL Suite of Software Tools

    Energy Technology Data Exchange (ETDEWEB)

    Greenwood, Lawrence R.; Johnson, Christian D.

    2013-02-27

    The STAYSL PNNL software suite provides a set of tools for working with neutron activation rates measured in a nuclear fission reactor, an accelerator-based neutron source, or any neutron field to determine the neutron flux spectrum through a generalized least-squares approach. This process is referred to as neutron spectral adjustment since the preferred approach is to use measured data to adjust neutron spectra provided by neutron physics calculations. The input data consist of the reaction rates based on measured activities, an initial estimate of the neutron flux spectrum, neutron activation cross sections and their associated uncertainties (covariances), and relevant correction factors. The output consists of the adjusted neutron flux spectrum and associated covariance matrix, which is useful for neutron dosimetry and radiation damage calculations.
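
    A minimal sketch of the generalized least-squares update at the heart of neutron spectral adjustment, under the usual linear-Gaussian assumptions. The variable names and toy numbers are invented for illustration; real STAYSL PNNL inputs also include correction factors and energy-group handling omitted here.

    ```python
    import numpy as np

    def gls_adjust(phi0, P, A, m, V):
        """Generalized least-squares adjustment of a prior spectrum phi0
        (covariance P) against measured reaction rates m (covariance V),
        where A holds activation cross sections mapping flux to rates."""
        S = A @ P @ A.T + V              # covariance of the predicted rates
        K = P @ A.T @ np.linalg.inv(S)   # gain matrix
        phi = phi0 + K @ (m - A @ phi0)  # adjusted spectrum
        P_adj = P - K @ A @ P            # adjusted (reduced) covariance
        return phi, P_adj

    # Toy 3-group spectrum constrained by two measured reactions.
    phi0 = np.array([1.0, 2.0, 0.5])
    P = np.diag([0.04, 0.16, 0.01])
    A = np.array([[0.10, 0.30, 0.05],
                  [0.20, 0.10, 0.40]])
    m = np.array([0.80, 0.65])
    V = np.diag([0.001, 0.001])
    phi, P_adj = gls_adjust(phi0, P, A, m, V)
    print(phi)
    ```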

  18. The software improvement process - tools and rules to encourage quality

    International Nuclear Information System (INIS)

    The Applications section of the CERN accelerator controls group has decided to apply a systematic approach to quality assurance (QA), the 'Software Improvement Process' - SIP. This process focuses on three areas: the development process itself, suitable QA tools, and how to practically encourage developers to do QA. For each stage of the development process we have agreed on the recommended activities and deliverables, and identified tools to automate and support the task. For example we do more code reviews. As peer reviews are resource-intensive, we only do them for complex parts of a product. As a complement, we are using static code checking tools, like FindBugs and Checkstyle. We also encourage unit testing and have agreed on a minimum level of test coverage recommended for all products, measured using Clover. Each of these tools is well integrated with our IDE (Eclipse) and gives instant feedback to the developer about the quality of their code. The major challenges of SIP have been to 1) agree on common standards and configurations, for example common code formatting and Javadoc documentation guidelines, and 2) how to encourage the developers to do QA. To address the second point, we have successfully implemented 'SIP days', i.e. one day dedicated to QA work in which the whole group of developers participates, and 'Top/Flop' lists, clearly indicating the best and worst products with regard to SIP guidelines and standards, for example test coverage. This paper presents the SIP initiative in more detail, summarizing our experience over the past two years and our future plans. (authors)

  19. Software tools for operations and maintenance planning at Jefferson Lab

    International Nuclear Information System (INIS)

    Time for maintenance, repair and testing is a precious commodity. A queue of pre-planned work is critical to taking advantage of unplanned repair downtime to squeeze in extra work. Jefferson Lab has a web-based tool called ATLis (Accelerator Task List) that collects work plans and routes them for review. By means of templates, requests for software installation and beam studies are also planned through ATLis. The entire list of ATLis work plans is available to all authenticated users, who may append comments and additional information to any task at any time. ATLis is also integrated into the Operations Electronic Logbook and Operations Reporting system, so it is possible to see at a glance if a work plan is pending in ATLis to address an outstanding problem report. Following the successful adoption of ATLis by the Operations department, the software has been installed for and adapted to the work planning of additional groups at Jefferson Lab, including Cryogenics and the Free Electron Laser.

  20. Software Development Of XML Parser Based On Algebraic Tools

    Science.gov (United States)

    Georgiev, Bozhidar; Georgieva, Adriana

    2011-12-01

    In this paper, a software development and implementation of an algebraic method for XML data processing is presented, which accelerates the XML parsing process. The nontraditional approach to fast XML navigation with algebraic tools proposed in this article thus contributes to ongoing efforts to create an easier, user-friendly API for XML transformations. The proposed software for XML document processing (a parser) is easy to use and can manage files with a strictly defined data structure. The purpose of the presented algorithm is to offer a new approach for searching and restructuring hierarchical XML data. This approach permits fast XML document processing, using an algebraic model developed in detail in previous works by the same authors. The proposed parsing mechanism is therefore easily accessible to the web consumer, who is able to control XML file processing, to search for different elements (tags) in it, and to delete and add new XML content as well. Various tests show higher speed and lower resource consumption in comparison with some existing commercial parsers.
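
    For reference only (the authors' algebraic engine is not reproduced here), the following minimal Python example shows the same three operations the abstract names - search, delete, and add - using the standard-library ElementTree parser; the document content is invented for illustration.

    ```python
    import xml.etree.ElementTree as ET

    doc = ET.fromstring(
        "<library><book id='1'><title>A</title></book>"
        "<book id='2'><title>B</title></book></library>")

    # Search: find every <title> element anywhere in the hierarchy.
    titles = [t.text for t in doc.iter("title")]

    # Restructure: delete one element and append new content.
    doc.remove(doc.find("book[@id='1']"))
    new = ET.SubElement(doc, "book", {"id": "3"})
    ET.SubElement(new, "title").text = "C"

    print(titles, ET.tostring(doc, encoding="unicode"))
    ```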

  1. MUST - An integrated system of support tools for research flight software engineering. [Multipurpose User-oriented Software Technology

    Science.gov (United States)

    Straeter, T. A.; Foudriat, E. C.; Will, R. W.

    1977-01-01

    The objectives of NASA's MUST (Multipurpose User-oriented Software Technology) program at Langley Research Center are to cut the cost of producing software which effectively utilizes digital systems for flight research. These objectives will be accomplished by providing an integrated system of support software tools for use throughout the research flight software development process. A description of the overall MUST program and its progress toward the release of a first MUST system will be presented. This release includes: a special interactive user interface, a library of subroutines, assemblers, a compiler, automatic documentation tools, and a test and simulation system.

  2. Evaluation of The Virtual Cells Software: a Teaching Tool

    Directory of Open Access Journals (Sweden)

    C.C.P. da Silva

    2005-07-01

    handling, having an accessible language, supporting the software as an education tool that is capable of facilitating the learning of fundamental concepts about the theme. Other workshops are planned with participants from different educational institutions in Sao Carlos city, with the goal of broadening our sample.

  3. Effective Implementation of Agile Practices - Object Oriented Metrics Tool to Improve Software Quality

    Directory of Open Access Journals (Sweden)

    K. Nageswara Rao

    2012-08-01

    Full Text Available Maintaining the quality of software is a major challenge in the process of software development. Software inspections, which use methods like structured walkthroughs and formal code reviews, involve careful examination of each and every aspect/stage of software development. In Agile software development, refactoring helps to improve software quality. Refactoring is a technique to improve the internal structure of software without changing its behaviour. After much study regarding ways to improve software quality, our research proposes an object-oriented software metric tool called “MetricAnalyzer”. This tool has been tested on different codebases and has proven to be highly useful.
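
    As an illustration of what an object-oriented metrics tool computes, here is a minimal sketch (not the authors' MetricAnalyzer, whose implementation is not given in the record) that derives one simple OO metric, methods per class, from Python source using the standard ast module.

    ```python
    import ast

    def methods_per_class(source: str) -> dict:
        """Count methods per class -- a simple object-oriented size metric
        of the kind an OO metrics tool might report."""
        tree = ast.parse(source)
        return {
            node.name: sum(isinstance(n, ast.FunctionDef) for n in node.body)
            for node in ast.walk(tree)
            if isinstance(node, ast.ClassDef)
        }

    code = """
    class Account:
        def deposit(self): ...
        def withdraw(self): ...
    """
    print(methods_per_class(code))   # {'Account': 2}
    ```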

  4. ELER software - a new tool for urban earthquake loss assessment

    Science.gov (United States)

    Hancilar, U.; Tuzun, C.; Yenidogan, C.; Erdik, M.

    2010-12-01

    ATC-55 (Yang, 2005). An urban loss assessment exercise for a scenario earthquake for the city of Istanbul is conducted, and physical and social losses are presented. Damage to the urban environment is compared to the results obtained from similar software, i.e. KOERILoss (KOERI, 2002) and DBELA (Crowley et al., 2004). The European rapid loss estimation tool is expected to help enable effective emergency response, at both the local and global levels, as well as public information.

  5. A software tool for soil clean-up technology selection

    International Nuclear Information System (INIS)

    Soil remediation is a difficult, time-consuming and expensive operation. A variety of mature and emerging soil remediation technologies is available, and future trends in remediation will include continued competition among environmental service companies and technology developers, which will definitely result in a further increase in clean-up options. Consequently, this demand has encouraged the development of decision support tools that could help decision makers select the most appropriate technology for a specific contaminated site, before costly remedial actions are taken. Therefore, a software tool for soil clean-up technology selection is currently being developed with the aim of working closely with human decision makers (site owners, local community representatives, environmentalists, regulators, etc.) to assess the available technologies and preliminarily select the preferred remedial options. The analysis for the identification of the best remedial options is based on technical, financial, environmental, and social criteria. These criteria are ranked by all involved parties to determine their relative importance for a particular project. (author)
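
    The record does not specify how the ranked criteria are aggregated; a common choice for this kind of multi-criteria ranking is a weighted sum, sketched below with hypothetical technologies, scores and stakeholder weights.

    ```python
    # Hypothetical technologies scored 0-10 against the four criteria named above.
    technologies = {
        "soil washing":      {"technical": 8, "financial": 5, "environmental": 6, "social": 7},
        "bioremediation":    {"technical": 6, "financial": 8, "environmental": 9, "social": 8},
        "excavate-and-haul": {"technical": 9, "financial": 4, "environmental": 3, "social": 5},
    }

    # Relative importance as ranked by the involved parties (weights sum to 1).
    weights = {"technical": 0.35, "financial": 0.25, "environmental": 0.25, "social": 0.15}

    def weighted_score(scores: dict) -> float:
        return sum(weights[c] * s for c, s in scores.items())

    ranking = sorted(technologies, key=lambda t: weighted_score(technologies[t]),
                     reverse=True)
    for t in ranking:
        print(f"{t}: {weighted_score(technologies[t]):.2f}")
    ```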

  6. SNPdetector: A Software Tool for Sensitive and Accurate SNP Detection.

    Directory of Open Access Journals (Sweden)

    2005-10-01

    Full Text Available Identification of single nucleotide polymorphisms (SNPs) and mutations is important for the discovery of genetic predisposition to complex diseases. PCR resequencing is the method of choice for de novo SNP discovery. However, manual curation of putative SNPs has been a major bottleneck in the application of this method to high-throughput screening. Therefore it is critical to develop a more sensitive and accurate computational method for automated SNP detection. We developed a software tool, SNPdetector, for automated identification of SNPs and mutations in fluorescence-based resequencing reads. SNPdetector was designed to model the process of human visual inspection and has a very low false positive and false negative rate. We demonstrate the superior performance of SNPdetector in SNP and mutation analysis by comparing its results with those derived by human inspection, PolyPhred (a popular SNP detection tool), and independent genotype assays in three large-scale investigations. The first study identified and validated inter- and intra-subspecies variations in 4,650 traces of 25 inbred mouse strains that belong to either the Mus musculus species or the M. spretus species. Unexpected heterozygosity in the CAST/Ei strain was observed in two out of 1,167 mouse SNPs. The second study identified 11,241 candidate SNPs in five ENCODE regions of the human genome covering 2.5 Mb of genomic sequence. Approximately 50% of the candidate SNPs were selected for experimental genotyping; the validation rate exceeded 95%. The third study detected ENU-induced mutations (at 0.04% allele frequency) in 64,896 traces of 1,236 zebrafish. Our analysis of three large and diverse test datasets demonstrated that SNPdetector is an effective tool for genome-scale research and for large-sample clinical studies. SNPdetector runs on the Unix/Linux platform and is available publicly (http://lpg.nci.nih.gov).

  7. Technical Data Exchange Software Tools Adapted to Distributed Microsatellite Design

    Science.gov (United States)

    Pache, Charly

    2002-01-01

    One critical issue concerning the distributed design of satellites is the collaborative work it requires. In particular, the exchange of data between the groups responsible for each subsystem can be complex and very time-consuming. The goal of this paper is to present a collaborative design tool, the SSETI Design Model (SDM), specifically developed to enable distributed satellite design. SDM is actually used in the ongoing Student Space Exploration & Technology (SSETI) initiative (www.sseti.net). SSETI is led by the European Space Agency (ESA) outreach office (http://www.estec.esa.nl/outreach), involving student groups from all over Europe in the design, construction and launch of a microsatellite. The first part of this paper presents the current version of the SDM tool, a collection of linked Microsoft Excel worksheets, one for each subsystem. An overview of the project framework/structure is given, explaining the different actors, the flows between them, as well as the different types of data and the links - formulas - between data sets. Unified Modeling Language (UML) diagrams give an overview of the different parts. Then the SDM's functionalities, developed in VBA scripts (Visual Basic for Applications), are introduced, as well as the interactive features, user interfaces and administration tools. The second part discusses the capabilities and limitations of SDM's current version. Taking into account these capabilities and limitations, the third part outlines the next version of SDM, a web-oriented, database-driven evolution of the current version. This new approach will enable real-time data exchange and processing between the different actors of the mission. Comprehensive UML diagrams will guide the audience through the entire modeling process of such a system. Trade-off simulation capabilities, security, reliability, hardware and software issues will also be thoroughly discussed.

  8. A software tool for rapid flood inundation mapping

    Science.gov (United States)

    Verdin, James; Verdin, Kristine; Mathis, Melissa; Magadzire, Tamuka; Kabuchanga, Eric; Woodbury, Mark; Gadain, Hussein

    2016-01-01

    The GIS Flood Tool (GFT) was developed by the U.S. Geological Survey with support from the U.S. Agency for International Development’s Office of U.S. Foreign Disaster Assistance to provide a means for production of reconnaissance-level flood inundation mapping for data-sparse and resource-limited areas of the world. The GFT has also attracted interest as a tool for rapid-assessment flood inundation mapping for the Flood Inundation Mapping Program of the U.S. Geological Survey. The GFT can fill an important gap for communities that lack flood inundation mapping by providing a first estimate of inundation zones, pending availability of resources to complete an engineering study. The tool can also help identify priority areas for application of scarce flood inundation mapping resources. The technical basis of the GFT is an application of the Manning equation for steady flow in an open channel, operating on specially processed digital elevation data. The GFT is implemented as a software extension in ArcGIS. Output maps from the GFT were validated at 11 sites against inundation maps produced previously by the Flood Inundation Mapping Program using standard one-dimensional hydraulic modeling techniques. In 80 percent of the cases, the GFT inundation patterns matched 75 percent or more of the one-dimensional hydraulic model inundation patterns. Lower rates of pattern agreement were seen at sites with low relief and subtle surface water divides. Although the GFT is simple to use, it should be applied with the oversight or review of a qualified hydraulic engineer who understands the simplifying assumptions of the approach.
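
    The technical basis named in the record is the Manning equation for steady open-channel flow; a direct transcription follows, with illustrative channel values (the GFT itself applies this over processed elevation data, not single cross-sections).

    ```python
    def manning_discharge(n: float, area: float, radius: float, slope: float) -> float:
        """Steady open-channel discharge Q = (1/n) * A * R^(2/3) * S^(1/2)
        (SI units: A in m^2, R in m, S dimensionless, Q in m^3/s)."""
        return (1.0 / n) * area * radius ** (2.0 / 3.0) * slope ** 0.5

    # Example: natural channel (n ~ 0.035), 50 m^2 cross-section,
    # 1.8 m hydraulic radius, 0.1% bed slope.
    print(f"Q = {manning_discharge(0.035, 50.0, 1.8, 0.001):.1f} m^3/s")
    ```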

  9. Development of software tools for exposure assessment in urban areas

    International Nuclear Information System (INIS)

    The main goal of this work is to present a software tool for radio frequency exposure assessment which integrates propagation models applied in different environments, dosimetric data sets recorded by personal exposure meters, geographic information systems and web technologies. This tool aims to complement the deployment of a network of electromagnetic field monitoring stations in the city of Valladolid for the control of E-field levels. Data sets of measurements were gathered in the city of Valladolid with a personal exposure meter in December 2006 and January 2007. A diary of measurements was maintained in which position, time, relatively close base stations and potential sources of interference were noted down. In addition, empirical propagation models were implemented by means of a cadastral map that was used to create 2D maps and 3D models of the city of Valladolid. A Java interface was developed to add simulation parameters and to manage the different layers of information. We have contrasted the simulated E-field with the dosimetric data, in both indoor and outdoor environments. Frequency-selective spot measurements were made with a triaxial isotropic probe and a portable spectrum analyzer. These measurements showed good agreement with the personal exposure meter and the electromagnetic field monitoring station in the 900 MHz band. In general, electromagnetic field exposure from base stations is low; the dosimeter's lowest threshold level (0.05 V/m) was generally reached only in regions of line-of-sight (LOS), near-line-of-sight and street canyons. Indoors, the main contributions were obtained in rooms with LOS to a base station. In conclusion, a better understanding of exposure to radio frequency can be reached by the integration of different sources: electromagnetic field monitoring stations installed on flat rooftops, PEM data gathered at street level and propagation modelling tools. (author)
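
    For orientation, a free-space far-field estimate of the kind such tools refine with empirical propagation models is E = sqrt(30 · EIRP) / d. The numbers below are invented; the study's actual models incorporate cadastral 2D/3D city data rather than free space.

    ```python
    import math

    def efield_far_field(eirp_watts: float, distance_m: float) -> float:
        """Free-space far-field estimate E = sqrt(30 * EIRP) / d  [V/m]."""
        return math.sqrt(30.0 * eirp_watts) / distance_m

    # Example: a 900 MHz base-station sector with 100 W EIRP seen at 200 m LOS;
    # the result sits well above the 0.05 V/m dosimeter threshold noted above.
    print(f"E = {efield_far_field(100.0, 200.0):.3f} V/m")
    ```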

  10. Performance Evaluation of a Software Engineering Tool for Automated Design of Cooling Systems in Injection Moulding

    DEFF Research Database (Denmark)

    Jauregui-Becker, Juan M.; Tosello, Guido; van Houten, Fred J.A.M.;

    2013-01-01

    This paper presents a software tool for automating the design of cooling systems for injection moulding and a validation of its performance. Cooling system designs were automatically generated by the proposed software tool and by applying a best practice tool engineering design approach. The two ...

  11. Can agile software tools bring the benefits of a task board to globally distributed teams?

    NARCIS (Netherlands)

    Katsma, Christiaan; Amrit, Chintan; Hillegersberg, van Jos; Sikkel, Klaas; Oshri, Ilan; Kotlarsky, Julia; Willcocks, Leslie P.

    2013-01-01

    Software-based tooling has become an essential part of globally distributed software development. In this study we focus on the usage of such tools, and task boards in particular. We investigate the deployment of these tools through field research in 4 different companies that feature agile and glo

  12. The Systems Biology Research Tool: evolvable open-source software

    OpenAIRE

    Wright Jeremiah; Wagner Andreas

    2008-01-01

    Abstract Background Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system-level. Results We introduce a free, easy-to-use, open-source, integrated software platform calle...

  13. Software Reuse in Agile Development Organizations - A Conceptual Management Tool

    OpenAIRE

    Spoelstra, Wouter; Iacob, Maria; Sinderen, van, Marten

    2011-01-01

    The reuse of knowledge is considered a major factor for increasing productivity and quality. In the software industry knowledge is embodied in software assets such as code components, functional designs and test cases. This kind of knowledge reuse is also referred to as software reuse. Although the benefits can be substantial, software reuse has never reached its full potential. Organizations are not aware of the different levels of reuse or do not know how to address reuse issues. This paper...

  14. A Decision Support Tool for Assessing the Maturity of Software Product Line Process

    OpenAIRE

    Ahmed, Faheem; Capretz, Luiz Fernando

    2015-01-01

    The software product line aims at the effective utilization of software assets, reducing the time required to deliver a product, improving the quality, and decreasing the cost of software products. Organizations trying to incorporate this concept require an approach to assess the current maturity level of the software product line process in order to make management decisions. A decision support tool for assessing the maturity of the software product line process is developed to implement the...

  15. Software tool for researching annotations of proteins: open-source protein annotation software with data visualization.

    Science.gov (United States)

    Bhatia, Vivek N; Perlman, David H; Costello, Catherine E; McComb, Mark E

    2009-12-01

    In order that biological meaning may be derived and testable hypotheses may be built from proteomics experiments, assignments of proteins identified by mass spectrometry or other techniques must be supplemented with additional notation, such as information on known protein functions, protein-protein interactions, or biological pathway associations. Collecting, organizing, and interpreting this data often requires the input of experts in the biological field of study, in addition to the time-consuming search for and compilation of information from online protein databases. Furthermore, visualizing this bulk of information can be challenging due to the limited availability of easy-to-use and freely available tools for this process. In response to these constraints, we have undertaken the design of software to automate annotation and visualization of proteomics data in order to accelerate the pace of research. Here we present the Software Tool for Researching Annotations of Proteins (STRAP), a user-friendly, open-source C# application. STRAP automatically obtains gene ontology (GO) terms associated with proteins in a proteomics results ID list using the freely accessible UniProtKB and EBI GOA databases. Summarized in an easy-to-navigate tabular format, STRAP results include meta-information on the protein in addition to complementary GO terminology. Additionally, this information can be edited by the user so that in-house expertise on particular proteins may be integrated into the larger data set. STRAP provides a sortable tabular view for all terms, as well as graphical representations of GO-term association data in pie charts (biological process, cellular component, and molecular function) and bar charts (cross comparison of sample sets) to aid in the interpretation of large data sets and differential analyses experiments. Furthermore, proteins of interest may be exported as a unique FASTA-formatted file to allow for customizable re-searching of mass spectrometry
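
    A hedged sketch of the kind of GO-term lookup STRAP automates. It uses the current UniProt REST endpoint (rest.uniprot.org), which postdates the paper; the URL and JSON field names are those of that API at the time of writing and should be verified before use.

    ```python
    import json
    import urllib.request

    def go_terms(accession: str) -> list:
        """Fetch GO cross-references for a UniProtKB accession via the
        UniProt REST API (terms come back in e.g. 'C:cytoplasm' style)."""
        url = f"https://rest.uniprot.org/uniprotkb/{accession}.json"
        with urllib.request.urlopen(url) as resp:
            entry = json.load(resp)
        return [
            prop["value"]
            for xref in entry.get("uniProtKBCrossReferences", [])
            if xref["database"] == "GO"
            for prop in xref.get("properties", [])
            if prop["key"] == "GoTerm"
        ]

    print(go_terms("P69905"))  # human hemoglobin subunit alpha
    ```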

  16. Systems and software variability management concepts, tools and experiences

    CERN Document Server

    Capilla, Rafael; Kang, Kyo-Chul

    2013-01-01

    The success of product line engineering techniques in the last 15 years has popularized the use of software variability as a key modeling approach for describing the commonality and variability of systems at all stages of the software lifecycle. Software product lines enable a family of products to share a common core platform, while allowing for product specific functionality being built on top of the platform. Many companies have exploited the concept of software product lines to increase the resources that focus on highly differentiating functionality and thus improve their competitiveness

  17. Development of Software for Analyzing Breakage of Cutting Tools Based on Image Processing

    Institute of Scientific and Technical Information of China (English)

    赵彦玲; 刘献礼; 王鹏; 王波; 王红运

    2004-01-01

    As present-day digital microsystems do not provide specialized microscopes that can detect cutting-tool breakage, analysis software has been developed using VC++. A module for edge testing and image segmentation is designed specifically for cutting tools. Known calibration relations and given postulates are used in scale measurements. Practical operation shows that the software can perform accurate detection.

  18. Software development tool for PicoBlaze multi-processor implementation

    OpenAIRE

    Claudiu Lung; Buchman Attila

    2012-01-01

    This paper presents a useful software tool for projects with multiple PicoBlaze microprocessors implemented in FPGA circuits. The application presented in this paper, which uses the PicoBlaze SDK tool for software development, is an Automatic Packet Report System (APRS) with three PicoBlaze microprocessors implemented in an FPGA circuit.

  19. KidTools: Self-Management, Problem-Solving, Organizational, and Planning Software for Children and Teachers

    Science.gov (United States)

    Miller, Kevin J.; Fitzgerald, Gail E.; Koury, Kevin A.; Mitchem, Herine J.; Hollingsead, Candice

    2007-01-01

    This article provides an overview of KidTools, an electronic performance software system designed for elementary and middle school children to use independently on classroom or home computers. The software system contains 30 computerized research-based strategy tools that can be implemented in a classroom or home environment. Through the…

  20. Measuring the development process: A tool for software design evaluation

    Science.gov (United States)

    Moy, S. S.

    1980-01-01

    The design metrics evaluator (DME), a component of an automated software design analysis system, is described. The DME quantitatively evaluates software design attributes. Its use directs attention to areas of a procedure, module, or complete program having a high potential for error.

  1. Tool Support for Distributed Software Development : The past - present - and future of gaps between user requirements and tool functionalities

    OpenAIRE

    Herrera, Miles; Hillegersberg, van, R.; Harmsen, Frank; Amrit, Chintan; Geisberger, Eva; Keil, Patrick; Kuhrmann, Marco

    2007-01-01

    This paper presents the past, present, and our view on future user requirements and tool functionalities supporting Globally Distributed Software Teams and highlights the changing emphasis in these user requirements.

  2. A Software Tool for Removing Patient Identifying Information from Clinical Documents

    OpenAIRE

    Friedlin, F. Jeff; McDonald, Clement J.

    2008-01-01

    We created a software tool that accurately removes all patient identifying information from various kinds of clinical data documents, including laboratory and narrative reports. We created the Medical De-identification System (MeDS), a software tool that de-identifies clinical documents, and performed 2 evaluations. Our first evaluation used 2,400 Health Level Seven (HL7) messages from 10 different HL7 message producers. After modifying the software based on the results of this first evaluati...
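
    As a toy illustration of pattern-based de-identification (far simpler than MeDS, which relies on richer dictionaries and HL7 field context), a few regular-expression scrubbing rules; all patterns and the sample note are invented.

    ```python
    import re

    # Illustrative patterns only -- real de-identification needs much more.
    PATTERNS = [
        (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),
        (re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"), "[DATE]"),
        (re.compile(r"\(?\d{3}\)?[ -]\d{3}-\d{4}\b"), "[PHONE]"),
        (re.compile(r"\bMRN[:# ]*\d+\b", re.IGNORECASE), "[MRN]"),
    ]

    def scrub(text: str) -> str:
        """Replace each matched identifier with a category placeholder."""
        for pattern, token in PATTERNS:
            text = pattern.sub(token, text)
        return text

    print(scrub("Pt seen 03/14/2008, MRN: 445829, call (317) 555-0142."))
    ```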

  3. Possibilities for using software tools in the process of security design

    OpenAIRE

    Ladislav Mariš; Andrej Veľas

    2013-01-01

    The authors deal with the use of software to support the process of security design. The article proposes a theoretical basis for the implementation of software tools in design activities, based on selected design standards for electrical security systems and applied design solutions, especially in drawing documentation. The article should serve the needs of project team members seeking to use selected software tools and thereby increase the degree of automation of design activities.

  4. Software Engineering Practices and Tool Support: an exploratory study in New Zealand

    Directory of Open Access Journals (Sweden)

    Chris Phillips

    2003-11-01

    Full Text Available This study was designed as a preliminary investigation of the practices of software engineers within New Zealand, including their use of development tools. The project involved a review of relevant literature on software engineering and CASE tools, the development and testing of an interview protocol, and structured interviews with five software engineers. This paper describes the project, presents the findings, examines the results in the context of the literature and outlines on-going funded work involving a larger survey.

  5. Possibilities for using software tools in the process of security design

    Directory of Open Access Journals (Sweden)

    Ladislav Mariš

    2013-07-01

    Full Text Available The authors deal with the use of software to support the process of security design. The article proposes a theoretical basis for the implementation of software tools in design activities, based on selected design standards for electrical security systems and applied design solutions, especially in drawing documentation. The article should serve the needs of project team members seeking to use selected software tools and thereby increase the degree of automation of design activities.

  6. Nik Software Captured The Complete Guide to Using Nik Software's Photographic Tools

    CERN Document Server

    Corbell, Tony L

    2011-01-01

    Learn all the features and functionality of the complete Nik family of products. Styled in such a way as to resemble the way photographers think, Nik Software Captured aims to help you learn to apply all the features and functionality of the Nik software products. With Nik Software Captured, authors and Nik Software, Inc. insiders Tony Corbell and Josh Haftel help you use after-capture software products more easily and more creatively. Their sole aim is to ensure that you can apply the techniques discussed in the book while gaining a thorough understanding of the capabilities of programs such as Dfi

  7. Concurrent Software Testing : A Systematic Review and an Evaluation of Static Analysis Tools

    OpenAIRE

    Mamun, Md. Abdullah al; Khanam, Aklima

    2009-01-01

    Verification and validation is one of the most important concerns in the area of software engineering for more reliable software development. Hence it is important to overcome the challenges of testing concurrent programs. The extensive use of concurrent systems warrants more attention to concurrent software testing, and the development of automatic tools for testing concurrent software is receiving increased focus. The first part of this study presents a systematic review that aims to explore th

  8. Module Testing Techniques for Nuclear Safety Critical Software Using LDRA Testing Tool

    International Nuclear Information System (INIS)

    The safety critical software in the I and C systems of nuclear power plants requires high functional integrity and reliability. To achieve those requirement goals, the safety critical software should be verified and tested according to related codes and standards through verification and validation (V and V) activities. The safety critical software testing is performed at various stages during the development of the software, and is generally classified as three major activities: module testing, system integration testing, and system validation testing. Module testing involves the evaluation of module level functions of hardware and software. System integration testing investigates the characteristics of a collection of modules and aims at establishing their correct interactions. System validation testing demonstrates that the complete system satisfies its functional requirements. In order to generate reliable software and reduce high maintenance cost, it is important that software testing is carried out at module level. Module testing for the nuclear safety critical software has rarely been performed by formal and proven testing tools because of its various constraints. LDRA testing tool is a widely used and proven tool set that provides powerful source code testing and analysis facilities for the V and V of general purpose software and safety critical software. Use of the tool set is indispensable where software is required to be reliable and as error-free as possible, and its use brings in substantial time and cost savings, and efficiency
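
    Module testing in the sense described - exercising one function's behaviour, including boundary values, in isolation - looks like the generic sketch below. This is a plain Python unittest illustration, not LDRA tool output (LDRA targets languages such as C and Ada); the trip-logic function and values are invented.

    ```python
    import unittest

    def trip_threshold(temperature_c: float, setpoint_c: float) -> bool:
        """Toy module-level function: trip when temperature exceeds setpoint."""
        return temperature_c > setpoint_c

    class TripThresholdTest(unittest.TestCase):
        def test_below_setpoint_does_not_trip(self):
            self.assertFalse(trip_threshold(299.0, 300.0))

        def test_boundary_does_not_trip(self):
            self.assertFalse(trip_threshold(300.0, 300.0))  # boundary value

        def test_above_setpoint_trips(self):
            self.assertTrue(trip_threshold(300.1, 300.0))

    if __name__ == "__main__":
        unittest.main()
    ```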

  9. An Approach to Building a Traceability Tool for Software Development

    Science.gov (United States)

    Delgado, Nelly; Watson, Tom

    1997-01-01

    It is difficult in a large, complex computer program to ensure that it meets the specified requirements. As the program evolves over time, all program constraints originally elicited during the requirements phase must be maintained. In addition, during the life cycle of the program, requirements typically change and the program must consistently reflect those changes. Imagine the following scenario. Company X wants to develop a system to automate its assembly line. With such a large system, there are many different stakeholders, e.g., managers, experts such as industrial and mechanical engineers, and end-users. Requirements would be elicited from all of the stakeholders involved in the system, with each stakeholder contributing their point of view to the requirements. For example, some of the requirements provided by an industrial engineer may concern the movement of parts through the assembly line. A point of view provided by the electrical engineer may be reflected in constraints concerning maximum power usage. End-users may be concerned with comfort and safety issues, whereas managers are concerned with the efficiency of the operation. With so many points of view affecting the requirements, it is difficult to manage them and communicate information to relevant stakeholders, and it is likely that conflicts in the requirements will arise. In the coding process, the implementors will make additional assumptions and interpretations on the design and the requirements of the system. During any stage of development, stakeholders may request that a requirement be added or changed. In such a dynamic environment, it is difficult to guarantee that the system will preserve the current set of requirements. Tracing, the mapping between objects in the artifacts of the system being developed, addresses this issue. Artifacts encompass documents such as the system definition, interview transcripts, memoranda, the software requirements specification, user's manuals, the functional

  10. Nmag micromagnetic simulation tool - software engineering lessons learned

    OpenAIRE

    Fangohr, Hans; Albert, Maximilian; Franchin, Matteo

    2016-01-01

    We review design and development decisions and their impact for the open source code Nmag from a software engineering in computational science point of view. We summarise lessons learned and recommendations for future computational science projects. Key lessons include that encapsulating the simulation functionality in a library of a general purpose language, here Python, provides great flexibility in using the software. The choice of Python for the top-level user interface was very well rece...

  11. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    Science.gov (United States)

    Monster, Marilyn; de Groot, Lennart; Dekkers, Mark

    2015-12-01

    The multispecimen protocol (MSP) is a method to estimate the Earth's magnetic field's past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.
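
    A minimal sketch of the shared numerical core: a least-squares line through per-specimen ratios versus laboratory field, whose zero-crossing estimates the palaeofield, with a specimen-level bootstrap for the confidence interval. The data values are invented, and the real tool computes three distinct ratios plus the alignment corrections omitted here.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical per-specimen results: laboratory field (microtesla) and an
    # MSP-type ratio Q; the regression line crosses Q = 0 at the palaeofield.
    h_lab = np.array([20.0, 30.0, 40.0, 50.0, 60.0])
    q = np.array([-0.48, -0.22, 0.02, 0.26, 0.49])

    def zero_crossing(h, q):
        slope, intercept = np.polyfit(h, q, 1)   # least-squares line q = a*h + b
        return -intercept / slope                # field at which q == 0

    boot = []
    for _ in range(2000):                        # bootstrap over specimens
        idx = rng.integers(0, len(q), len(q))
        if np.unique(h_lab[idx]).size > 1:       # skip degenerate resamples
            boot.append(zero_crossing(h_lab[idx], q[idx]))
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"palaeointensity ~ {zero_crossing(h_lab, q):.1f} uT "
          f"(95% CI {lo:.1f}-{hi:.1f} uT)")
    ```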

  12. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    Directory of Open Access Journals (Sweden)

    Marilyn Wilhelmina Leonora Monster

    2015-12-01

    Full Text Available The multispecimen protocol (MSP) is a method to estimate the Earth’s magnetic field’s past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.

  13. A Case Study of Black-Box Testing for Embedded Software using Test Automation Tool

    Directory of Open Access Journals (Sweden)

    Changhyun Baek

    2007-01-01

    Full Text Available This research presents a case study of black-box testing for a Temperature Controller (TC), a typical embedded system. A test automation tool, TEST, was developed, and several kinds of TCs were tested using the tool. We present a statistical analysis of the results of the automated testing and define the properties of software bugs in embedded systems. The main results of the study were the following: (a) a test case prioritization technique is needed because the review phase of the test process takes a long time; (b) there are three types of software defects in embedded software; (c) the complexity of the system configuration has an effect on software defects; (d) software defects are concentrated at vulnerable points of the software; and (e) testing activities reduce the number of software defects. The results can be useful in further study of test case prioritization.

  14. A Web-based Tool for Automatizing the Software Process Improvement Initiatives in Small Software Enterprises

    NARCIS (Netherlands)

    Garcia, I.; Pacheco, C.

    2010-01-01

    Top-down process improvement approaches provide a high-level model of what the process of a software development organization should be. Such models are based on the consensus of a designated working group on how software should be developed or maintained. They are very useful in that they provide g

  15. C++ Software Quality in the ATLAS Experiment: Tools and Experience

    CERN Document Server

    Kluth, Stefan; The ATLAS collaboration; Obreshkov, Emil; Roe, Shaun; Seuster, Rolf; Snyder, Scott; Stewart, Graeme

    2016-01-01

    The ATLAS experiment at CERN uses about six million lines of code and currently has about 420 developers whose backgrounds are largely in physics. In this paper we explain how the C++ code quality is managed using a range of tools, from compile time through to run-time testing, and reflect on the great progress made in the last year, largely through the use of static analysis tools such as Coverity®, an industry-standard tool which enables quality comparison with general open source C++ code. Other tools, including cppcheck, Include-What-You-Use and run-time 'sanitizers', are also discussed.

  16. Software engineering capability for Ada (GRASP/Ada Tool)

    Science.gov (United States)

    Cross, James H., II

    1995-01-01

    The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) has successfully created and prototyped a new algorithmic level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis has been on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada Source code. A new Motif compliant graphical user interface has been developed for the GRASP/Ada prototype.

  17. Vulnerability management tools for COTS software - A comparison

    NARCIS (Netherlands)

    Welberg, S.M.

    2008-01-01

    In this paper, we compare vulnerability management tools in two stages. In the first stage, we perform a global comparison involving thirty tools available in the market. A framework composed of several criteria based on scope and analysis is used for this comparison. From this global view of the to

  18. A SOFTWARE TOOL FOR EXPERIMENTAL STUDY LEAP MOTION

    Directory of Open Access Journals (Sweden)

    Georgi Krastev

    2015-12-01

    Full Text Available The paper presents a computer application that illustrates the Leap Motion controller’s abilities. The controller is a PC peripheral, with accompanying software, that enables control through a natural user interface based on gestures. The publication also describes how the controller works and its main advantages and disadvantages. Some applications using the Leap Motion controller are discussed.

  19. A Study of Collaborative Software Development Using Groupware Tools

    Science.gov (United States)

    Defranco-Tommarello, Joanna; Deek, Fadi P.

    2005-01-01

    The experimental results of a collaborative problem solving and program development model that takes into consideration the cognitive and social activities that occur during software development is presented in this paper. This collaborative model is based on the Dual Common Model that focuses on individual cognitive aspects of problem solving and…

  20. Calico: An Early-Phase Software Design Tool

    Science.gov (United States)

    Mangano, Nicolas Francisco

    2013-01-01

    When developers are faced with a design challenge, they often turn to the whiteboard. This is typical during the conceptual stages of software design, when no code is in existence yet. It may also happen when a significant code base has already been developed, for instance, to plan new functionality or discuss optimizing a key component. While…

  1. Effectiveness of AutoCAD 3D Software as a Learning Support Tool

    Directory of Open Access Journals (Sweden)

    Fatariah Zakaria

    2012-06-01

    Full Text Available The aim of this study is to test the effectiveness of AutoCAD 3D software in the learning of Engineering Drawing to enhance students’ understanding. Data were collected from a sample of students from a secondary school in Sungai Petani, Kedah. A quasi-experimental design was used to determine the effectiveness of the software in improving students’ achievement. The results of this study show an excellent increase in student achievement after using the software, indicating that it can improve school students’ visualization capability. This study suggests that teachers, school administrators and the government consider this software as a learning tool in Malaysian schools.

  2. Training, Quality Assurance Factors, and Tools Investigation: a Work Report and Suggestions on Software Quality Assurance

    Science.gov (United States)

    Lee, Pen-Nan

    1991-01-01

    Previously, several research tasks have been conducted, some observations were obtained, and several possible suggestions have been contemplated involving software quality assurance engineering at NASA Johnson. These research tasks are briefly described. Also, a brief discussion is given on the role of software quality assurance in software engineering along with some observations and suggestions. A brief discussion on a training program for software quality assurance engineers is provided. A list of assurance factors as well as quality factors are also included. Finally, a process model which can be used for searching and collecting software quality assurance tools is presented.

  3. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    CERN Document Server

    Habib, Salman; LeCompte, Tom; Marshall, Zach; Borgland, Anders; Viren, Brett; Nugent, Peter; Asai, Makoto; Bauerdick, Lothar; Finkel, Hal; Gottlieb, Steve; Hoeche, Stefan; Sheldon, Paul; Vay, Jean-Luc; Elmer, Peter; Kirby, Michael; Patton, Simon; Potekhin, Maxim; Yanny, Brian; Calafiura, Paolo; Dart, Eli; Gutsche, Oliver; Izubuchi, Taku; Lyon, Adam; Petravick, Don

    2015-01-01

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  4. AWG-Parameters: new software tool to design arrayed waveguide gratings

    Science.gov (United States)

    Seyringer, D.; Bielik, M.

    2013-03-01

    A new software tool and its application in the design of optical multiplexers/demultiplexers based on arrayed waveguide gratings is presented. The motivation for this work is the fact that, when designing arrayed waveguide gratings, a set of geometrical parameters must first be calculated. These parameters are the input for the AWG layout that will be created and simulated using commercial photonic design tools. It is important to point out that these parameters strongly influence correct AWG demultiplexing properties and therefore have to be calculated very carefully. However, most of the commercial photonic design tools do not support this fundamental calculation. To be able to design any AWG with any software tool, and particularly to save the time needed for AWG design, a new software tool was developed. The tool has already been applied in various AWG designs and is also technologically well proven.
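
    One of the geometrical parameters such a tool must calculate is the constant path-length increment between adjacent arrayed waveguides, delta_L = m * lambda_c / n_eff. A one-line transcription with illustrative values follows; the full AWG-Parameters calculation covers many more quantities not shown here.

    ```python
    def awg_path_length_increment(m: int, wavelength_c_um: float, n_eff: float) -> float:
        """Path-length difference between adjacent arrayed waveguides:
        delta_L = m * lambda_c / n_eff  (m: grating order, lambda_c: centre
        wavelength, n_eff: effective index of the arrayed waveguides)."""
        return m * wavelength_c_um / n_eff

    # Example: order 30 at 1.55 um in a waveguide with n_eff ~ 1.45.
    print(f"delta_L = {awg_path_length_increment(30, 1.55, 1.45):.3f} um")
    ```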

  5. A Web-based modeling tool for the SEMAT Essence theory of software engineering

    Directory of Open Access Journals (Sweden)

    Daniel Graziotin

    2013-09-01

    Full Text Available As opposed to more mature subjects, software engineering lacks general theories that establish its foundations as a discipline. The Essence Theory of software engineering (Essence) has been proposed by the Software Engineering Methods and Theory (SEMAT) initiative. The goal of Essence is to develop a theoretically sound basis for software engineering practice and its wide adoption. However, Essence is far from reaching academic- and industry-wide adoption. The reasons for this include a struggle to foresee its utilization potential and a lack of tools for implementation. SEMAT Accelerator (SematAcc) is a Web-positioning tool for a software engineering endeavor, which implements the SEMAT’s Essence kernel. SematAcc permits the use of Essence, thus helping to understand it. The tool enables the teaching, adoption, and research of Essence in controlled experiments and case studies.

  6. Simulation software support (S3) system a software testing and debugging tool

    International Nuclear Information System (INIS)

    The largest percentage of technical effort in the software development process is accounted for by debugging and testing. It is not unusual for a software development organization to spend over 50% of the total project effort on testing. In the extreme, testing of human-rated software (e.g., nuclear reactor monitoring, training simulators) can cost three to five times as much as all other software engineering steps combined. The Simulation Software Support (S3) System, developed by the Link-Miles Simulation Corporation, is ideally suited for real-time simulation applications which involve a large database with models programmed in FORTRAN. This paper focuses on the testing elements of the S3 system. System support software utilities are provided which enable the loading and execution of modules in the development environment. These elements include the Linking/Loader (LLD) for dynamically linking program modules and loading them into memory, and the interactive executive (IEXEC) for controlling the execution of the modules. Features of the Interactive Symbolic Debugger (SD) and the Real Time Executive (RTEXEC) that support unit and integrated testing will be explored.

  7. Software tool designed to detect characteristics of the malware objects

    OpenAIRE

    Bazylevych, Roman; Andriyenko, Volodymyr; Karioti, Mikael

    2013-01-01

    The computer virus is a phenomenon that emerged in the course of the evolution of computers and information technology. Its essence lies in the fact that virus programs are endowed with properties present in living organisms: they are born, reproduce and die. In order to ensure the information security of both individuals and entire organizations, it is necessary to raise awareness about potential threats from malicious software such as computer viruses. Simulation...

  8. Software development tools for the CDF MX scanner

    International Nuclear Information System (INIS)

    This paper discusses the design of the high-level assembler and diagnostic control program developed for the MX, a high-speed, custom-designed computer used in the CDF data acquisition system at Fermilab. These programs provide a friendly, productive environment for the development of software on the MX. Details of their implementation and special features, and some of the lessons learned during their development, are included.

  9. SIMPLE: a prototype software fault-injection tool

    OpenAIRE

    Acantilado, Christopher P.; Acantilado, Neil John P.

    2002-01-01

    Approved for public release; distribution is unlimited. Fault-injection techniques can be used to methodically assess the degree of fault tolerance afforded by a system. In this thesis, we introduce a Java-based, semi-automatic fault-injection test harness, called Software Fault Injection Mechanized Prototype Lightweight Engine (SIMPLE). SIMPLE employs a state-based fault injection approach designed to validate test suites. It also can assist developers to assess the properti...

  10. A software tool for 3D dose verification and analysis

    Science.gov (United States)

    Sa'd, M. Al; Graham, J.; Liney, G. P.

    2013-06-01

    The main recent developments in radiotherapy have focused on improved treatment techniques in order to generate further significant improvements in patient prognosis. There is now an internationally recognised need to improve the 3D verification of highly conformal radiotherapy treatments, because with the very high dose gradients used in modern treatment techniques, even a small error in the spatial dose distribution can lead to a serious complication. In order to gain the full benefits of using 3D dosimetric technologies (such as gel dosimetry), it is vital to use 3D evaluation methods and algorithms. We present in this paper a software solution that provides comprehensive 3D dose evaluation and analysis. The software is applied to gel dosimetry, which is based on magnetic resonance imaging (MRI) as a read-out method. The software can also be used to compare any two dose distributions, such as two distributions planned using different treatment planning systems, or different dose calculation algorithms.
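
    The paper does not state which comparison metric the software uses; one widely used metric for comparing two dose distributions is the gamma index, sketched here in one dimension as a minimal illustration (brute-force search; the 3 mm / 3% criteria are illustrative defaults, not the authors' settings):

        import numpy as np

        def gamma_index_1d(dose_ref, dose_eval, spacing, dta=3.0, tol=0.03):
            """Brute-force 1D gamma analysis: spacing and dta in mm,
            tol as a fraction of the maximum reference dose."""
            x = np.arange(len(dose_ref)) * spacing
            dd = tol * dose_ref.max()
            gammas = np.empty(len(dose_ref))
            for i, (xi, di) in enumerate(zip(x, dose_ref)):
                # Generalized distance in the combined space/dose domain.
                g2 = ((x - xi) / dta) ** 2 + ((dose_eval - di) / dd) ** 2
                gammas[i] = np.sqrt(g2.min())
            return gammas  # points with gamma <= 1 pass the criterion

        ref = np.exp(-np.linspace(-3, 3, 61) ** 2)   # synthetic dose profile
        ev = np.roll(ref, 1)                         # 1 mm spatial shift
        print(f"pass rate: {np.mean(gamma_index_1d(ref, ev, 1.0) <= 1):.1%}")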

  11. A software tool for 3D dose verification and analysis

    International Nuclear Information System (INIS)

    The main recent developments in radiotherapy have focused on improved treatment techniques in order to generate further significant improvements in patient prognosis. There is now an internationally recognised need to improve the 3D verification of highly conformal radiotherapy treatments, because with the very high dose gradients used in modern treatment techniques, even a small error in the spatial dose distribution can lead to a serious complication. In order to gain the full benefits of using 3D dosimetric technologies (such as gel dosimetry), it is vital to use 3D evaluation methods and algorithms. We present in this paper a software solution that provides comprehensive 3D dose evaluation and analysis. The software is applied to gel dosimetry, which is based on magnetic resonance imaging (MRI) as a read-out method. The software can also be used to compare any two dose distributions, such as two distributions planned using different treatment planning systems, or different dose calculation algorithms.

  12. Contingency Contractor Optimization Phase 3 Sustainment Third-Party Software List - Contingency Contractor Optimization Tool - Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa

    2016-05-01

    The Contingency Contractor Optimization Tool - Prototype (CCOT-P) requires several third-party software packages. These are documented below for each of the CCOT-P elements: client, web server, database server, solver, web application and polling application.

  13. Impact of libre software tools and methods in the robotics field

    OpenAIRE

    Barrera González, Pablo; Robles, Gregorio; Cañas, José María; Martín Rico, Francisco; Matellán Olivera, Vicente

    2005-01-01

    Software is one of the major components of robots; in fact, it is the main bottleneck for the proliferation of robotics in our everyday lives. In recent years the field of robotics has been an emerging application area of the libre (free/open source) software phenomenon. Libre software tools have been traditionally popular among the robotics research and teaching community. Even companies whose main business model is to sell robots have found it convenient to share the softw...

  14. STAMPS: software tool for automated MRI post-processing on a supercomputer

    OpenAIRE

    Bigler, Don C.; Aksu, Yaman; Yang, Qing X.

    2009-01-01

    This paper describes a Software Tool for Automated MRI Post-processing (STAMP) of multiple types of brain MRIs on a workstation and for parallel processing on a supercomputer (STAMPS). This software tool enables the automation of nonlinear registration for a large image set and for multiple MR image types. The tool uses standard brain MRI post-processing tools (such as SPM, FSL, and HAMMER) for multiple MR image types in a pipeline fashion. It also contains novel MRI post-processing features....

  15. Research flight software engineering and MUST, an integrated system of support tools

    Science.gov (United States)

    Straeter, T. A.; Foudriat, E. C.; Will, R. W.

    1977-01-01

    Consideration is given to software development to support NASA flight research. The Multipurpose User-Oriented Software Technology (MUST) program, designed to integrate digital systems into flight research, is discussed. Particular attention is given to the program's special interactive user interface, subroutine library, assemblers, compiler, automatic documentation tools, and test and simulation subsystems.

  16. Designing and Using Software Tools for Educational Purposes: FLAT, a Case Study

    Science.gov (United States)

    Castro-Schez, J. J.; del Castillo, E.; Hortolano, J.; Rodriguez, A.

    2009-01-01

    Educational software tools are considered to enrich teaching strategies, providing a more compelling means of exploration and feedback than traditional blackboard methods. Moreover, software simulators provide a more motivating link between theory and practice than pencil-paper methods, encouraging active and discovery learning in the students.…

  17. 76 FR 5832 - International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools QA...

    Science.gov (United States)

    2011-02-02

    ... Employment and Training Administration International Business Machines (IBM), Software Group Business Unit... November 17, 2010 (75 FR 70296). The negative determination of the TAA petition filed on behalf of workers at International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools...

  18. A Practical Comparison of De Novo Genome Assembly Software Tools for Next-Generation Sequencing Technologies

    OpenAIRE

    Zhang, Wenyu; Chen, Jiajia; Yang, Yang; Tang, Yifei; Shang, Jing; Shen, Bairong

    2011-01-01

    The advent of next-generation sequencing technologies has been accompanied by the development of many whole-genome sequence assembly methods and software, especially for de novo fragment assembly. Because knowledge about the applicability and performance of these software tools is limited, choosing a suitable assembler is a tough task. Here, we provide information on the adaptivity of each program and, above all, compare the performance of eight distinct tools against eight groups of simulat...
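
    The abstract is truncated before the evaluation criteria are listed; one metric that is standard in assembler comparisons, and simple to compute, is N50 (the helper below is a generic sketch, not code from the paper):

        def n50(contig_lengths):
            """N50: the largest length L such that contigs of length >= L
            cover at least half of the total assembly."""
            total = sum(contig_lengths)
            running = 0
            for length in sorted(contig_lengths, reverse=True):
                running += length
                if 2 * running >= total:
                    return length
            return 0

        print(n50([100, 80, 60, 40, 20]))  # -> 80, since 100 + 80 >= 300 / 2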

  19. The Use of Software Project Management Tools in Saudi Arabia: An Exploratory Survey

    OpenAIRE

    Nouf AlMobarak; Rawan AlAbdulrahman; Shahad AlHarbi; Wea’am AlRashed

    2013-01-01

    This paper reports the results of an online survey study, which was conducted to investigate the use of software project management tools in Saudi Arabia. The survey provides insights into project management in the local context of Saudi Arabia from the ten companies which participated in this study. The aim is to explore and specify the project management tools used by software project management teams and their managers, and to understand the supported features that might influence their se...

  20. Possibilities of Simulation of Fluid Flows Using the Modern CFD Software Tools

    CERN Document Server

    Kochevsky, A N

    2004-01-01

    The article reviews fluid flow models implemented in the leading CFD software tools and designed for simulation of multi-component and multi-phase flows, compressible flows, flows with heat transfer, cavitation and other phenomena. The article shows that these software tools (CFX, Fluent, STAR-CD, etc.) allow for adequate simulation of complex physical effects of different kinds, even for problems where performing a physical experiment is extremely difficult.

  1. Benchmarking of Optimization Modules for Two Wind Farm Design Software Tools

    OpenAIRE

    Yilmaz, Eftun

    2012-01-01

    Optimization of wind farm layout is an expensive and complex task involving several engineering challenges. The layout of any wind farm directly impacts its profitability and return on investment. Several software optimization modules in line with industrial wind farm design tools currently attempt to place the turbines in locations with good wind resources while adhering to the constraints of a defined objective function. Assessment of these software tools needs to be performed clearly fo...

  2. Comprehensive aging management using the software tool COMSY; Gewerkeuebergreifendes Alterungsmanagement mit dem Softwaretool COMSY

    Energy Technology Data Exchange (ETDEWEB)

    Baier, Roman; Nopper, Helmut [AREVA NP GmbH, Erlangen (Germany). Technical Center

    2011-07-01

    KTA 1403 requires an integrated concept for aging management in nuclear power plants. AREVA has developed the software tool COMSY PLIM with the attached modules COMSY-ELT (electrical and control systems) and COMSY-BAU (civil engineering), which allow the automatic integration of the respective databases. The tool is aimed at software-supported aging monitoring of reactor components, including integrated damage mechanism models, the evaluation of countermeasures, and transferability tests.

  3. Software tools for the analysis of video meteors emission spectra

    Science.gov (United States)

    Madiedo, J. M.; Toscano, F. M.; Trigo-Rodriguez, J. M.

    2011-10-01

    One of the goals of the SPanish Meteor Network (SPMN) is the study of the chemical composition of meteoroids by analyzing the emission spectra resulting from the ablation of these particles of interplanetary matter in the atmosphere. With this aim, some of the CCD video devices we employ to observe the night sky are endowed with holographic diffraction gratings, and a continuous monitoring of meteor activity is performed. We have recently developed new software to analyze these spectra. A description of this computer program is given, and some of the results obtained so far are presented here.

  4. Development of a Software Tool to Automate ADCO Flight Controller Console Planning Tasks

    Science.gov (United States)

    Anderson, Mark G.

    2011-01-01

    This independent study project covers the development of the International Space Station (ISS) Attitude Determination and Control Officer (ADCO) Planning Exchange APEX Tool. The primary goal of the tool is to streamline existing manual and time-intensive planning tools into a more automated, user-friendly application that interfaces with existing products and allows the ADCO to produce accurate products and timelines more effectively. This paper will survey the current ISS attitude planning process and its associated requirements, goals, documentation and software tools and how a software tool could simplify and automate many of the planning actions which occur at the ADCO console. The project will be covered from inception through the initial prototype delivery in November 2011 and will include development of design requirements and software as well as design verification and testing.

  5. The MineTool Software Suite: A Novel Data Mining Palette of Tools for Automated Modeling of Space Physics Data

    Science.gov (United States)

    Sipes, T.; Karimabadi, H.; Roberts, A.

    2009-12-01

    We present a new data mining software tool called MineTool for the analysis and modeling of space physics data. MineTool is a graphical user interface implementation that merges two data mining algorithms into an easy-to-use software tool: an algorithm for analysis and modeling of static data [Karimabadi et al, 2007] and MineTool-TS, an algorithm for data mining of time series data [Karimabadi et al, 2009]. By virtue of automating the modeling process and model evaluations, MineTool makes data mining and predictive modeling more accessible to non-experts. The software is written entirely in Java and is freeware. By ranking all inputs as predictors of the outcome before constructing a model, MineTool also enables the inclusion of only relevant variables. The technique aggregates the various stages of model building into a four-step process consisting of (i) data segmentation and sampling, (ii) variable pre-selection and transform generation, (iii) predictive model estimation and validation, and (iv) final model selection. Optimal strategies are chosen for each modeling step. A notable feature of the technique is that the final model is always in closed analytical form rather than the “black box” form characteristic of some other techniques. Having the analytical model enables deciphering the importance of the various variables in affecting the outcome. The MineTool suite also provides capabilities for preparing data for mining as well as for visualizing the datasets. MineTool has successfully been used to develop models for automated detection of flux transfer events (FTEs) at Earth’s magnetopause in Cluster spacecraft time series data and for 3D magnetopause modeling. In this presentation, we demonstrate the ease of use of the software through examples, including how it was used in the FTE problem.
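
    The four-step process can be illustrated with a generic, self-contained sketch of the same workflow (plain least squares in Python for brevity; MineTool itself is a Java GUI application and its internals are not shown here):

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 5))
        y = 2 * X[:, 0] - X[:, 2] + 0.1 * rng.normal(size=500)

        # (i) Data segmentation and sampling: hold out a validation split.
        X_tr, X_va, y_tr, y_va = X[:400], X[400:], y[:400], y[400:]

        # (ii) Variable pre-selection: rank inputs by |correlation| with the
        #      outcome and keep only the most relevant ones.
        rank = np.abs([np.corrcoef(X_tr[:, j], y_tr)[0, 1] for j in range(5)])
        keep = np.argsort(rank)[::-1][:2]

        # (iii) Model estimation and validation: least squares on the kept
        #       variables, scored on the held-out split.
        A = np.c_[X_tr[:, keep], np.ones(len(X_tr))]
        coef, *_ = np.linalg.lstsq(A, y_tr, rcond=None)
        pred = np.c_[X_va[:, keep], np.ones(len(X_va))] @ coef
        rmse = np.sqrt(np.mean((pred - y_va) ** 2))

        # (iv) Final model selection: the result is a closed analytical form.
        print(f"kept {keep}, coefficients {coef.round(2)}, RMSE {rmse:.3f}")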

  6. Computer Tools “Trace” and “Locus” in Dynamic Mathematics Software

    Directory of Open Access Journals (Sweden)

    Marina Drushlyak

    2014-12-01

    Full Text Available The article describes the results of using the tools "Trace" and "Locus" in dynamic mathematics software. Examples are given of solutions of stereometric locus problems constructing static traces in GeoGebra 5.0 and Cabri 3D, and of plane geometry problems constructing both static traces and traces that are perceived as independent dynamic objects in MathKit and GeoGebra. Attention is focused on the differences in the actions of these tools. The author notes the possibility of developing logical and constructive thinking through their use.

  7. Computer Tools “Trace” and “Locus” in Dynamic Mathematics Software

    OpenAIRE

    Marina Drushlyak

    2014-01-01

    The article describes the results of using the tools "Trace" and "Locus" in dynamic mathematics software. Examples are given of solutions of stereometric locus problems constructing static traces in GeoGebra 5.0 and Cabri 3D, and of plane geometry problems constructing both static traces and traces that are perceived as independent dynamic objects in MathKit and GeoGebra. Attention is focused on the differences in the actions of these tools. The author notes the possibility of forming a log...

  8. Collaborative Software Development in Support of Fast Adaptive AeroSpace Tools (FAAST)

    Science.gov (United States)

    Kleb, William L.; Nielsen, Eric J.; Gnoffo, Peter A.; Park, Michael A.; Wood, William A.

    2003-01-01

    A collaborative software development approach is described. The software product is an adaptation of proven computational capabilities combined with new capabilities to form the Agency's next generation aerothermodynamic and aerodynamic analysis and design tools. To efficiently produce a cohesive, robust, and extensible software suite, the approach uses agile software development techniques; specifically, project retrospectives, the Scrum status meeting format, and a subset of Extreme Programming's coding practices are employed. Examples are provided which demonstrate the substantial benefits derived from employing these practices. Also included is a discussion of issues encountered when porting legacy Fortran 77 code to Fortran 95 and a Fortran 95 coding standard.

  9. Fighting software piracy: Which governance tools matter in Africa?

    OpenAIRE

    Asongu, Simplice A; Antonio R. Andrés

    2012-01-01

    This article integrates previously missing components of government quality into the governance-piracy nexus, exploring the governance mechanisms by which global obligations for the treatment of IPRs are effectively transmitted from the international to the national level in the battle against piracy. It assesses the best governance tools in the fight against piracy and the upholding of Intellectual Property Rights (IPRs). The instrumentality of IPR laws (treaties) in tackling piracy through good gover...

  10. Nucleonica. Web-based software tools for simulation and analysis

    International Nuclear Information System (INIS)

    The authors present a description of the Nucleonica web-based portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, an intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data resources and some applications is given.

  11. Computer-aided design in power engineering. Application of software tools

    International Nuclear Information System (INIS)

    Demonstrates the use of software tools in the practice of design in the field of power systems. Presents many applications of design in the field of power systems. Useful for educative purposes and practical work. This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation. It guides the reader through complex problems, presented on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design. Several design example calculations are carried out using standard engineering software such as MATLAB, EMTP/ATP, Excel and Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation, using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools in project management in power systems is discussed. Here, the emphasis is put on the standard software MS Excel and MS Project.

  12. Computer-aided design in power engineering. Application of software tools

    Energy Technology Data Exchange (ETDEWEB)

    Stojkovic, Zlatan

    2012-07-01

    Demonstrates the use of software tools in the practice of design in the field of power systems. Presents many applications of design in the field of power systems. Useful for educative purposes and practical work. This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation. It guides the reader through complex problems, presented on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design. Several design example calculations are carried out using standard engineering software such as MATLAB, EMTP/ATP, Excel and Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation, using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools in project management in power systems is discussed. Here, the emphasis is put on the standard software MS Excel and MS Project.

  13. An evaluation of software tools for the design and development of cockpit displays

    Science.gov (United States)

    Ellis, Thomas D., Jr.

    1993-01-01

    The use of all-glass cockpits at the NASA Langley Research Center (LaRC) simulation facility has changed the means of design, development, and maintenance of instrument displays. The human-machine interface has evolved from a physical hardware device to a software-generated electronic display system. This has subsequently caused an increased workload at the facility. As computer processing power increases and the glass cockpit becomes predominant in facilities, software tools used in the design and development of cockpit displays are becoming both feasible and necessary for a more productive simulation environment. This paper defines LaRC requirements of a display software development tool and compares two available applications against these requirements. As a part of the software engineering process, these tools reduce development time, provide a common platform for display development, and produce exceptional real-time results.

  14. A Reference Architecture for Providing Tools as a Service to Support Global Software Development

    DEFF Research Database (Denmark)

    Chauhan, Aufeef

    2014-01-01

    Global Software Development (GSD) teams encounter challenges that are associated with the distribution of software development activities across multiple geographic regions. The limited support for performing collaborative development and engineering activities and the lack of sufficient support for ...-based solutions. The restricted ability of organizations to have the desired alignment of tools with software engineering and development processes results in administrative and managerial overhead that incurs increased development cost and poor product quality. Moreover, stakeholders involved in the projects have ... computing paradigm for addressing the above-mentioned issues by providing a framework to select appropriate tools as well as associated services, and a reference architecture of the cloud-enabled middleware platform that allows on-demand provisioning of software engineering Tools as a Service (TaaS) with focus on ...

  15. Development of tools for safety analysis of control software in advanced reactors

    International Nuclear Information System (INIS)

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described

  16. Development of tools for safety analysis of control software in advanced reactors

    Energy Technology Data Exchange (ETDEWEB)

    Guarro, S.; Yau, M.; Motamed, M. [Advanced Systems Concepts Associates, El Segundo, CA (United States)

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described.

  17. Development of a software tool for an internal dosimetry using MIRD method

    Science.gov (United States)

    Chaichana, A.; Tocharoenchai, C.

    2016-03-01

    Currently, many software packages for internal radiation dosimetry have been developed, but many of them do not provide sufficient tools to perform all of the necessary steps from nuclear medicine image analysis to dose calculation. For this reason, we developed the CALRADDOSE software, which can perform internal dosimetry using the MIRD method within a single environment. MATLAB version 2015a was used as the development tool. The calculation process of this software proceeds from collecting time-activity data from image data, followed by residence time calculation and absorbed dose calculation using the MIRD method. To evaluate the accuracy of this software, we calculated the residence times and absorbed doses of 5 Ga-67 studies and 5 I-131 MIBG studies and then compared the results with those obtained from the OLINDA/EXM software. The results showed no statistically significant differences between the residence times and absorbed doses obtained from the two software packages. The CALRADDOSE software is a user-friendly, graphical user interface-based software for internal dosimetry. It provides fast and accurate results, which may be useful for routine work.
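
    The MIRD formalism that underlies the calculation reduces to integrating the time-activity curve to obtain the residence time and multiplying the cumulated activity by an S-value; a minimal sketch (all numerical values below are hypothetical, and the S-value is illustrative rather than taken from OLINDA/EXM):

        import numpy as np

        def residence_time(t_hours, activity_mbq, injected_mbq):
            """tau = cumulated activity / injected activity (MIRD formalism);
            the time-activity curve is integrated by the trapezoidal rule."""
            cumulated = np.trapz(activity_mbq, t_hours)   # MBq * h
            return cumulated / injected_mbq               # hours

        # Hypothetical time-activity samples for one source organ.
        t = np.array([1.0, 4.0, 24.0, 48.0])              # h post-injection
        a = np.array([50.0, 40.0, 15.0, 5.0])             # MBq
        tau = residence_time(t, a, injected_mbq=185.0)

        # Absorbed dose D = A0 * tau * S (illustrative S-value).
        s_value = 2.0e-5                                  # mGy / (MBq * h)
        print(f"tau = {tau:.2f} h, dose = {185.0 * tau * s_value:.3f} mGy")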

  18. CHLOE: A Software Tool for Automatic Novelty Detection in Microscopy Image Datasets

    Directory of Open Access Journals (Sweden)

    Saundra Manning

    2014-09-01

    Full Text Available The recent advancements in automated microscopy and information systems allow the acquisition and storage of massive datasets of microscopy images. Here we describe CHLOE, a software tool for automatic extraction of novelty in microscopy image datasets. The tool is based on a comprehensive set of numerical image content descriptors reflecting image morphology, and can be used in combination with ROI detection and segmentation tools such as ITK. The rich feature set allows automatic detection of repetitive outlier images that are visually different from the common images in the dataset. The code and software are publicly available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/chloe.

  19. About new software and hardware tools in the education of 'Semiconductor Devices'

    International Nuclear Information System (INIS)

    This paper describes the new tools, used in the education of ”Semiconductor Devices”, developed at the Technological School “Electronic Systems”, Department of the Technical University, Sofia. The software and hardware tools give the opportunity to achieve the right balance between theory and practice, and the students are given the chance to accumulate valuable “hands-on” skills. The main purpose of the developed lab exercises is to demonstrate the use of some electronic components and practice with them. Keywords: semiconductors, media software tool, hardware, education

  20. Lessons learned applying CASE methods/tools to Ada software development projects

    Science.gov (United States)

    Blumberg, Maurice H.; Randall, Richard L.

    1993-01-01

    This paper describes the lessons learned from introducing CASE methods/tools into organizations and applying them to actual Ada software development projects. This paper will be useful to any organization planning to introduce a software engineering environment (SEE) or evolving an existing one. It contains management level lessons learned, as well as lessons learned in using specific SEE tools/methods. The experiences presented are from Alpha Test projects established under the STARS (Software Technology for Adaptable and Reliable Systems) project. They reflect the front end efforts by those projects to understand the tools/methods, initial experiences in their introduction and use, and later experiences in the use of specific tools/methods and the introduction of new ones.

  1. Nmag micromagnetic simulation tool - software engineering lessons learned

    CERN Document Server

    Fangohr, Hans; Franchin, Matteo

    2016-01-01

    We review design decisions and their impact for the open source code Nmag from a software engineering in computational science point of view. Key lessons include that the approach of encapsulating the simulation functionality in a library of a general-purpose language, here Python, eliminates the need for configuration files, provides the greatest flexibility in using the simulation, allows mixing of multiple simulations and pre- and post-processing in the same (Python) file, and allows users to benefit from the rich Python ecosystem of scientific packages. The choice of programming language (OCaml) for the computational core did not resonate with the users of the package (who are not computer scientists) and was suboptimal. The choice of Python for the top-level user interface was very well received by users from the science and engineering community. The from-source installation, in which key requirements were compiled from a tarball, was remarkably robust. In places, the code is a lot more ambitious than necessa...

  2. Towards a Reference Architecture to Provision Tools as a Service for Global Software Development

    DEFF Research Database (Denmark)

    Chauhan, Aufeef; Babar, Muhammad Ali

    2014-01-01

    Organizations involved in Global Software Development (GSD) face challenges in terms of having access to an appropriate set of tools for performing distributed engineering and development activities, integration between heterogeneous desktop and web-based tools, and management of artifacts developed and ... distributed environment. In this paper, we argue the need for a cloud-enabled platform for supporting GSD and propose a reference architecture of a cloud-based Platform for providing support to provision an ecosystem of Tools as a Service (PTaaS).

  3. CHLOE: A Software Tool for Automatic Novelty Detection in Microscopy Image Datasets

    OpenAIRE

    Saundra Manning; Lior Shamir

    2014-01-01

    The recent advancements in automated microscopy and information systems allow the acquisition and storage of massive datasets of microscopy images. Here we describe CHLOE, a software tool for automatic extraction of novelty in microscopy image datasets. The tool is based on a comprehensive set of numerical image content descriptors reflecting image morphology, and can be used in combination with ROI detection and segmentation tools such as ITK. The rich feature set allows automatic detection ...

  4. The Comprehensive Evaluation of Electronic Learning Tools and Educational Software (CEELTES)

    OpenAIRE

    Štefan KAROLČÍK; Elena ČIPKOVÁ; Roman HRUŠECKÝ; Milan VESELSKÝ

    2015-01-01

    Despite the fact that digital technologies are used more and more in the learning and education process, there is still a lack of professional evaluation tools capable of assessing the quality of digital teaching aids in a comprehensive and objective manner. Construction of the Comprehensive Evaluation of Electronic Learning Tools and Educational Software (CEELTES) tool was preceded by several surveys and by knowledge obtained in the course of creating digital learning and teaching aids an...

  5. Software Tools for Robust Analysis of High-Dimensional Data

    Directory of Open Access Journals (Sweden)

    Valentin Todorov

    2014-06-01

    Full Text Available The present work discusses robust multivariate methods specifically designed for high dimensions. Their implementation in R is presented and their application is illustrated on examples. The first group are algorithms for outlier detection, already introduced elsewhere and implemented in other packages. The value added of the new package is that all methods follow the same design pattern and thus can use the same graphical and diagnostic tools. The next topic covered is sparse principal components, including an object-oriented interface to the standard method proposed by Zou, Hastie, and Tibshirani (2006) and the robust one proposed by Croux, Filzmoser, and Fritz (2013). Robust partial least squares (see Hubert and Vanden Branden 2003) as well as partial least squares for discriminant analysis conclude the scope of the new package.
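
    The package described is for R; a comparable robust outlier-detection step for high-dimensional data (robust Mahalanobis distances from a minimum covariance determinant fit) can be sketched in Python with scikit-learn:

        import numpy as np
        from scipy.stats import chi2
        from sklearn.covariance import MinCovDet

        rng = np.random.default_rng(1)
        X = rng.normal(size=(200, 4))
        X[:5] += 6.0                                   # plant a few outliers

        # Robust location/scatter via the Minimum Covariance Determinant,
        # then squared robust Mahalanobis distances for every observation.
        mcd = MinCovDet(random_state=0).fit(X)
        d2 = mcd.mahalanobis(X)

        # Flag points beyond the 97.5% chi-squared quantile (4 dimensions).
        outliers = d2 > chi2.ppf(0.975, df=X.shape[1])
        print(f"{outliers.sum()} flagged; first five: {outliers[:5]}")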

  6. Software tools and frameworks in High Energy Physics

    CERN Document Server

    Brun, R

    2011-01-01

    In many fields of science and industry the computing environment has grown at an exponential speed in the past 30 years. From ad hoc solutions for each problem, the field has gradually evolved to use or reuse systems developed across the years for the same environment or coming from other fields with the same requirements. Several frameworks have emerged to solve common problems. In High Energy Physics (HEP) and Nuclear Physics, we have witnessed the emergence of common tools, packages and libraries that have gradually become the cornerstone of computing in these fields. The emergence of these systems has been complex because the computing field is evolving rapidly, the problems to be solved are more and more complex, and experiments now involve several thousand physicists from all over the world. This paper describes the emergence of these frameworks and their evolution from libraries of independent subroutines to task-oriented packages and to general experiment frameworks.

  7. Tools selection criteria in software-developing Small and Medium Enterprises

    OpenAIRE

    Rivas, Lornel; Pérez, María; Luis E. Mendoza; Grimán P., Anna C.

    2010-01-01

    Nowadays, it is well known that Small and Medium Enterprises (SMEs) make important contributions to the software industry. Their particular characteristics constitute a challenge for decision makers when selecting technologies such as Software Engineering Tools (SETs). Deciding in which SET to invest requires managing limited resources as well as productivity pressures. Additionally, changes in SETs also affect the selection process. This article proposes a set of criteria, which were formulat...

  8. A Software Tool for Optimal Sizing of PV Systems in Malaysia

    OpenAIRE

    K. Sopian; Azah Mohamed; Tamer Khatib

    2012-01-01

    This paper presents a MATLAB-based, user-friendly software tool called PV.MY for the optimal sizing of photovoltaic (PV) systems. The software has the capability of predicting meteorological variables such as solar energy, ambient temperature and wind speed using an artificial neural network (ANN); it optimizes the PV module/array tilt angle, optimizes the inverter size, and calculates the optimal capacities of the PV array, battery, wind turbine and diesel generator in hybrid PV systems. The ANN based mo...

  9. 10111 Executive Summary -- Practical Software Testing: Tool Automation and Human Factors

    OpenAIRE

    Harman, Mark; Muccini, Henry; Schulte, Wolfram; Xie, Tao

    2010-01-01

    The main goal of the seminar “Practical Software Testing: Tool Automation and Human Factors” was to bring together academics working on algorithms, methods, and techniques for practical software testing with practitioners interested in developing more soundly based and well-understood testing processes and practices. The seminar's purpose was to make researchers aware of industry's problems, and practitioners aware of research approaches. The seminar focused in particular on testing autom...

  10. PublicationHarvester: An Open-Source Software Tool for Science Policy Research

    OpenAIRE

    Pierre Azoulay; Andrew Stellman; Joshua Graff Zivin

    2006-01-01

    We present PublicationHarvester, an open-source software tool for gathering publication information on individual life scientists. The software interfaces with MEDLINE, and allows the end-user to specify up to four MEDLINE-formatted names for each researcher. Using these names along with a user-specified search query, PublicationHarvester generates yearly publication counts, optionally weighted by Journal Impact Factors. These counts are further broken down by order on the authorship list (fi...
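
    The core aggregation, yearly publication counts optionally weighted by impact factors, is straightforward to sketch (the record layout below is hypothetical and stands in for the tool's MEDLINE parsing):

        from collections import defaultdict

        # Hypothetical already-parsed records: (year, journal) per paper.
        records = [(2003, "Cell"), (2003, "Nature"),
                   (2004, "Cell"), (2005, "Science")]
        impact = {"Cell": 29.0, "Nature": 32.2, "Science": 30.0}  # toy JIFs

        counts, weighted = defaultdict(int), defaultdict(float)
        for year, journal in records:
            counts[year] += 1                           # raw yearly count
            weighted[year] += impact.get(journal, 0.0)  # JIF-weighted count

        for year in sorted(counts):
            print(year, counts[year], round(weighted[year], 1))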

  11. A software tool for design of process monitoring and analysis systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    A well-designed process monitoring and analysis system is necessary to consistently achieve any predefined end product quality. Systematic computer-aided methods and tools provide the means to design the necessary process monitoring and analysis systems and/or to validate any existing monitoring ... and analysis system. Software to achieve this has been developed. Two supporting tools for the design, a knowledge base (consisting of the process knowledge as well as the knowledge on measurement methods & tools) and a model library (consisting of the process operational models), have been extended ... rigorously and integrated with the user interface, which makes the software more generic and applicable to a wide range of problems. The software for the design of a process monitoring and analysis system is presented and illustrated with a tablet manufacturing process example....

  12. Data Mining for Secure Software Engineering – Source Code Management Tool Case Study

    Directory of Open Access Journals (Sweden)

    A.V.Krishna Prasad,

    2010-07-01

    Full Text Available As data mining for secure software engineering improves software productivity and quality, software engineers are increasingly applying data mining algorithms to various software engineering tasks. However, mining software engineering data poses several challenges, requiring various algorithms to effectively mine sequences, graphs and text from such data. Software engineering data includes code bases, execution traces, historical code changes, mailing lists and bug databases. These contain a wealth of information about a project's status, progress and evolution. Using well-established data mining techniques, practitioners and researchers can explore the potential of this valuable data in order to better manage their projects and to produce higher-quality software systems that are delivered on time and within budget. Data mining can be used for gathering and extracting latent security requirements, extracting algorithms and business rules from code, mining legacy applications for requirements and business rules for new projects, etc. Mining algorithms for software engineering fall into four main categories: frequent pattern mining – finding commonly occurring patterns; pattern matching – finding data instances for given patterns; clustering – grouping data into clusters; and classification – predicting labels of data based on already labeled data. In this paper, we discuss an overview of strategies for data mining for secure software engineering, with the implementation of a case study of text mining for a source code management tool.
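
    Of the four categories listed, frequent pattern mining is the easiest to illustrate in a few lines; the toy sketch below counts frequently co-changed file pairs in a hypothetical commit history (illustrative data, not from the paper's case study):

        from collections import Counter
        from itertools import combinations

        # Hypothetical version-control history: files touched per commit.
        commits = [{"db.c", "db.h", "util.c"},
                   {"db.c", "db.h"},
                   {"ui.c", "util.c"},
                   {"db.c", "db.h", "ui.c"}]

        pair_counts = Counter()
        for files in commits:
            # Count every unordered pair of files changed in the same commit.
            pair_counts.update(combinations(sorted(files), 2))

        min_support = 2
        print({p: n for p, n in pair_counts.items() if n >= min_support})
        # e.g. ('db.c', 'db.h') co-changed in 3 commits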

  13. Towards early software reliability prediction for computer forensic tools (case study).

    Science.gov (United States)

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component-based system. It is used, for instance, to analyze the reliability of the state machines of real-time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete-time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study. PMID:27386276
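
    Architecture-based reliability prediction of this kind typically follows the classic Cheung model, in which component reliabilities scale the transition probabilities of a discrete-time Markov chain and system reliability is read off the fundamental matrix. A numerical sketch (the architecture and numbers are illustrative, not from the Forensic Toolkit Imager case study):

        import numpy as np

        # Illustrative component reliabilities; component 0 is the entry
        # point of the tool's architecture, component 2 the exit.
        R = np.array([0.99, 0.97, 0.995])
        P = np.array([[0.0, 1.0, 0.0],    # 0 always calls 1
                      [0.3, 0.0, 0.7],    # 1 loops back to 0 or goes to 2
                      [0.0, 0.0, 0.0]])   # 2 terminates the execution

        # Cheung model: scale each transition by the reliability of the
        # departing component, then solve the absorbing Markov chain.
        Q = R[:, None] * P
        N = np.linalg.inv(np.eye(3) - Q)      # fundamental matrix
        # Probability of reaching the exit failure-free, times its own
        # reliability, gives the system reliability.
        print(f"system reliability = {N[0, 2] * R[2]:.4f}")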

  14. Managing the Testing Process Practical Tools and Techniques for Managing Hardware and Software Testing

    CERN Document Server

    Black, Rex

    2011-01-01

    New edition of one of the most influential books on managing software and hardware testing In this new edition of his top-selling book, Rex Black walks you through the steps necessary to manage rigorous testing programs of hardware and software. The preeminent expert in his field, Mr. Black draws upon years of experience as president of both the International and American Software Testing Qualifications boards to offer this extensive resource of all the standards, methods, and tools you'll need. The book covers core testing concepts and thoroughly examines the best test management practices

  15. REVEAL - A tool for rule driven analysis of safety critical software

    International Nuclear Information System (INIS)

    As the determination of ultra-high reliability figures for safety-critical software is hardly possible, national and international guidelines and standards mainly give requirements for the qualitative evaluation of software. An analysis of whether all these requirements are fulfilled is time- and effort-consuming and prone to errors if performed manually by analysts, and should instead be delegated to tools as far as possible. There are many general-purpose software analysis tools, both static and dynamic, which help in analyzing source code. However, they are not designed to assess adherence to the specific requirements of guidelines and standards in the nuclear field. Against the background of the development of I and C systems in the nuclear field which are based on digital techniques and implemented in a high-level language, it is essential that the assessor or licenser has a tool with which he can automatically and uniformly qualify as many aspects as possible of the high-level language software. For this purpose the software analysis tool REVEAL has been developed at ISTec and the Halden Reactor Project. (author)

  16. Design and implementation of a software tool intended for simulation and test of real time codes

    International Nuclear Information System (INIS)

    The objective of real-time software testing is to reveal processing errors and unmet functional requirements or timing constraints in a code. From the perspective of the safety analysis of nuclear power plant equipment, testing should be carried out independently of the physical process, both because the process is generally not available and because random hardware failures must be considered. We propose here a simulation and test tool, implemented entirely in software, with extensive interactive possibilities for testing assembly code running on a microprocessor. The OST (outil d'aide a la simulation et au Test de logiciels temps reel) simulates code execution and the behaviour of the hardware or software environment. Test execution is closely monitored and much useful information is automatically saved. After surveying methods and tools dedicated to real-time software, the present thesis work details the OST system. We show the internal mechanisms and objects of the system: particularly "events" (which describe evolutions of the system under test) and mnemonics (which describe the variables). Then, we detail the interactive means available to the user for constructing the test data and the environment of the tested software. Finally, a prototype implementation is presented along with the results of the tests carried out. This demonstrates the many advantages of using an automatic tool over a manual investigation. In conclusion, further developments necessary to complete the final tool are reviewed.

  17. Identify new Software Quality Assurance needs for the UK e-Science community and reintroduction for the right tools to improve evolved software engineering processes

    OpenAIRE

    Chang, Victor

    2008-01-01

    Software Quality Assurance (QA) is defined as the methodology and good practices for ensuring the quality of software in development. It involves handling bug reports, bug tracking, error investigation, verification of fixed bugs, test management, test case planning and design, as well as test case execution and records. Standards such as ISO 9001 are commonly followed for software QA; they recommend using a wide range of tools to improve the existing software engineering processes (SEP) for...

  18. A Vision on the Status and Evolution of HEP Physics Software Tools

    CERN Document Server

    Canal, P; Hatcher, R; Jun, S Y; Mrenna, S

    2013-01-01

    This paper represents the vision of the members of the Fermilab Scientific Computing Division's Computational Physics Department (SCD-CPD) on the status and the evolution of various HEP software tools such as the Geant4 detector simulation toolkit, the Pythia and GENIE physics generators, and the ROOT data analysis framework. The goal of this paper is to contribute ideas to the Snowmass 2013 process toward the composition of a unified document on the current status and potential evolution of the physics software tools which are essential to HEP.

  19. Activity Theory applied to Global Software Engineering: Theoretical Foundations and Implications for Tool Builders

    DEFF Research Database (Denmark)

    Tell, Paolo; Ali Babar, Muhammad

    2012-01-01

    Although a plethora of tools are available for Global Software Engineering (GSE) teams, it is increasingly being realized that the most prevalent desktop metaphor underpinning the majority of tools has several inherent limitations. We have proposed that Activity-Based Computing (ABC) can be a ... applying activity theory to GSE. We analyze and explain the fundamental concepts of activity theory, and how they can be applied by using examples of software architecture design and evaluation processes. We describe the kind of data model and architectural support required for applying activity theory in ...

  20. Data analysis software tools used during VIRGO engineering runs, review and future needs

    Energy Technology Data Exchange (ETDEWEB)

    Buskulic, D. E-mail: buskulic@lapp.in2p3.fr

    2003-04-21

    In recent years, the data flow and data storage needs of large gravitational-wave interferometric detectors have reached an order of magnitude similar to those of high energy physics experiments. Software tools have been developed to handle and analyze these large amounts of data, with the specificities associated with gravitational wave searches. We will review the experience acquired during engineering runs on the VIRGO detector with the currently used data analysis software tools, pointing out the peculiarities inherent in our type of experiment. We will also show the possible future needs for the offline analysis of VIRGO data.

  1. Integrated structural analysis tool using the linear matching method part 1 – Software development

    International Nuclear Information System (INIS)

    A number of direct methods based upon the Linear Matching Method (LMM) framework have been developed to address structural integrity issues for components subjected to cyclic thermal and mechanical load conditions. This paper presents a new integrated structural analysis tool using the LMM framework for the assessment of load carrying capacity, shakedown limit, ratchet limit and steady state cyclic response of structures. First, the development of the LMM for the evaluation of design limits in plasticity is introduced. Second, preliminary considerations for the development of the LMM into a tool which can be used on a regular basis by engineers are discussed. After the re-structuring of the LMM subroutines for multiple central processing unit (CPU) solution, the LMM software tool for the assessment of design limits in plasticity is implemented by developing an Abaqus CAE plug-in with graphical user interfaces. Further demonstration of this new LMM analysis tool including practical application and verification is presented in an accompanying paper. - Highlights: • A new structural analysis tool using the Linear Matching Method (LMM) is developed. • The software tool is able to evaluate the design limits in plasticity. • Able to assess limit load, shakedown, ratchet limit and steady state cyclic response. • Re-structuring of the LMM subroutines for multiple CPU solution is conducted. • The software tool is implemented by developing an Abaqus CAE plug-in with GUI

  2. Proceedings of the Workshop on software tools for distributed intelligent control systems

    Energy Technology Data Exchange (ETDEWEB)

    Herget, C.J. (ed.)

    1990-09-01

    The Workshop on Software Tools for Distributed Intelligent Control Systems was organized by Lawrence Livermore National Laboratory for the United States Army Headquarters Training and Doctrine Command and the Defense Advanced Research Projects Agency. The goals of the workshop were to identify the current state of the art in tools which support control systems engineering design and implementation, identify research issues associated with writing software tools which would provide a design environment to assist engineers in multidisciplinary control design and implementation, formulate a potential investment strategy to resolve the research issues and develop public-domain code which can form the core of more powerful engineering design tools, and recommend test cases to focus the software development process and test the associated performance metrics. Recognizing that the development of software tools for distributed intelligent control systems will require a multidisciplinary effort, experts in systems engineering, control systems engineering, and computer science were invited to participate in the workshop. In particular, experts who could address the following topics were selected: operating systems, engineering data representation and manipulation, emerging standards for manufacturing data, mathematical foundations, coupling of symbolic and numerical computation, user interfaces, system identification, system representation at different levels of abstraction, system specification, system design, verification and validation, automatic code generation, and integration of modular, reusable code.

  3. A multi-professional software tool for radiation therapy treatment verification

    International Nuclear Information System (INIS)

    Purpose: Verification of patient setup is important in conformal therapy because it provides a means of quality assurance for treatment delivery. Electronic portal imaging systems have led to software tools for performing digital comparison and verification of patient setup. However, these software tools are typically designed from a radiation oncologist's perspective even though treatment verification is a team effort involving oncologists, physicists, and therapists. A new software tool, the Treatment Verification Tool (TVT), has been developed as an interactive, multi-professional application for reviewing and verifying treatment plan setup using conventional personal computers. This study describes our approach to electronic treatment verification and demonstrates the features of TVT. Methods and Materials: TVT is an object-oriented software tool written in C++ for the PC-based Windows NT environment. The software allows selection of a patient's images from a database and presents a single-window interface to reduce the number of windows shown to the user. However, the user can select from four different views of the patient data. One of the views is a side-by-side comparison of portal images (on-line portal images or digitized port film) with a prescription image (digitized simulator film or digitally reconstructed radiograph), and another view is a textual summary of the grades of each portal image. The grades of a portal image are assigned by a radiation oncologist using an evaluation method, and the physicists and therapists may only review these results. All users of TVT can perform image enhancement, measure distances, and perform semi-automated registration. An electronic dialogue can be established through a set of annotations and notes among the radiation oncologists and the technical staff. Results: Features of TVT include: 1) side-by-side comparison of portal images and a prescription image; 2

  4. Contingency Contractor Optimization Phase 3 Sustainment Software Design Document - Contingency Contractor Optimization Tool - Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa; Jones, Katherine A

    2016-05-01

    This document describes the final software design of the Contingency Contractor Optimization Tool - Prototype. Its purpose is to provide the overall architecture of the software and the logic behind this architecture. Documentation for the individual classes is provided in the application Javadoc. The Contingency Contractor Optimization project is intended to address Department of Defense mandates by delivering a centralized strategic planning tool that allows senior decision makers to quickly and accurately assess the impacts, risks, and mitigation strategies associated with utilizing contract support. The Contingency Contractor Optimization Tool - Prototype was developed in Phase 3 of the OSD ATL Contingency Contractor Optimization project to support strategic planning for contingency contractors. The planning tool uses a model to optimize the Total Force mix by minimizing the combined total costs for selected mission scenarios. The model optimizes the match of personnel types (military, DoD civilian, and contractors) and capabilities to meet mission requirements as effectively as possible, based on risk, cost, and other requirements.

  5. Development of software tools for 4-D visualization and quantitative analysis of PHITS simulation results

    International Nuclear Information System (INIS)

    A suite of software tools has been developed to facilitate the development of apparatus using the radiation transport simulation code PHITS, by enabling 4D visualization (3D space and time) and quantitative analysis of so-called dieaway plots. To deliver usable tools as soon as possible, existing software was reused as much as possible: ParaView will be used for the 4D visualization of the results, whereas dieaway plots will be analyzed with the ROOT toolkit using a tool named “diana”. To enable 4D visualization in ParaView, a group of tools (angel2vtk, DispDCAS1, CamPos) has been developed to convert the data into a format that ParaView can read and to ease the visualization. (author)

  6. Management of an affiliated Physics Residency Program using a commercial software tool.

    Science.gov (United States)

    Zacarias, Albert S; Mills, Michael D

    2010-01-01

    A review of commercially available allied health educational management software tools was performed to evaluate their capacity to manage program data associated with a CAMPEP-accredited Therapy Physics Residency Program. Features of these software tools include: a) didactic course reporting and organization, b) competency reporting by topic, category and didactic course, c) student time management and accounting, and d) student patient case reporting by topic, category and course. The software package includes features for recording school administrative information; setting up lists of courses, faculty, clinical sites, categories, competencies, and time logs; and the inclusion of standardized external documents. There are provisions for developing evaluation and survey instruments. The mentors and program may be evaluated by residents, and residents may be evaluated by faculty members using this feature. Competency documentation includes the time spent on the problem or with the patient, time spent with the mentor, date of the competency, and approval by the mentor and program director. Course documentation includes course and lecture title, lecturer, topic information, date of lecture and approval by the Program Director. These software tools have the facility to include multiple clinical sites, with local subadministrators having the ability to approve competencies and attendance at clinical conferences. In total, these software tools have the capability of managing all components of a CAMPEP-accredited residency program. The application database lends the software to the support of multiple affiliated clinical sites within a single residency program. Such tools are a critical and necessary component if the medical physics profession is to meet the projected needs for qualified medical physicists in future years. PMID:20717075

  7. An upgrade of a computerized tool for managing agile software projects

    OpenAIRE

    Bačnar, Andrej

    2015-01-01

    The thesis describes the development of an upgrade for an agile project management software tool. In the first part, the thesis presents the basic characteristics of agile methodologies, with emphasis on the Scrum and Kanban methodologies. The next chapter gives a brief description of the existing tool and the upgrade requirements specification, which includes: workflow visualization using a board, elaboration of additional functionality to monitor the development teams, and creation ...

  8. Review of Ground Systems Development and Operations (GSDO) Tools for Verifying Command and Control Software

    Science.gov (United States)

    Aguilar, Michael L.; Bonanne, Kevin H.; Favretto, Jeffrey A.; Jackson, Maddalena M.; Jones, Stephanie L.; Mackey, Ryan M.; Sarrel, Marc A.; Simpson, Kimberly A.

    2014-01-01

    The Exploration Systems Development (ESD) Standing Review Board (SRB) requested the NASA Engineering and Safety Center (NESC) conduct an independent review of the plan developed by Ground Systems Development and Operations (GSDO) for identifying models and emulators to create a tool(s) to verify their command and control software. The NESC was requested to identify any issues or weaknesses in the GSDO plan. This document contains the outcome of the NESC review.

  9. TINA manual landmarking tool: software for the precise digitization of 3D landmarks

    OpenAIRE

    Schunke Anja C; Bromiley Paul A; Tautz Diethard; Thacker Neil A

    2012-01-01

    Abstract Background Interest in the placing of landmarks and subsequent morphometric analyses of shape for 3D data has increased with the increasing accessibility of computed tomography (CT) scanners. However, current computer programs for this task suffer from various practical drawbacks. We present here a free software tool that overcomes many of these problems. Results The TINA Manual Landmarking Tool was developed for the digitization of 3D data sets. It enables the generation of a modifi...

  10. SaTool - a Software Tool for Structural Analysis of Complex Automation Systems

    DEFF Research Database (Denmark)

    Blanke, Mogens; Lorentzen, Torsten

    2006-01-01

    The paper introduces SaTool, a tool for structural analysis. The use of the Matlab (R)-based implementation is presented, and special features motivated by industrial users are introduced. Salient features of the tool are presented, including the ability to specify the behavior of a complex system at a high level of functional abstraction, to analyze single and multiple fault scenarios, and to automatically generate parity relations for diagnosis of the system in normal and impaired conditions. User interface and algorithmic details are presented.

  11. IDA: A new software tool for INTEGRAL field spectroscopy Data Analysis

    CERN Document Server

    Lorenzo, B Garcia; Megias, E

    2016-01-01

    We present a software package, IDA, which can easily handle two-dimensional spectroscopy data. IDA is written in IDL and offers a window-based interface. The available tools can visualize an image recovered from the spectra at any desired wavelength interval, and can obtain velocity fields, velocity dispersion distributions, etc.

  12. Review of software tools for design and analysis of large scale MRM proteomic datasets.

    Science.gov (United States)

    Colangelo, Christopher M; Chung, Lisa; Bruce, Can; Cheung, Kei-Hoi

    2013-06-15

    Selective or Multiple Reaction Monitoring (SRM/MRM) is a liquid-chromatography (LC)/tandem-mass spectrometry (MS/MS) method that enables the quantitation of specific proteins in a sample by analyzing precursor ions and the fragment ions of their selected tryptic peptides. Instrumentation software has advanced to the point that thousands of transitions (pairs of primary and secondary m/z values) can be measured in a triple quadrupole instrument coupled to an LC, by a well-designed scheduling and selection of m/z windows. The design of a good MRM assay relies on the availability of peptide spectra from previous discovery-phase LC-MS/MS studies. The tedious aspect of manually developing and processing MRM assays involving thousands of transitions has spurred the development of software tools to automate this process. Software packages have been developed for project management, assay development, assay validation, data export, peak integration, quality assessment, and biostatistical analysis. No single tool provides a complete end-to-end solution; this article therefore reviews the current state and discusses future directions of these software tools, in order to enable researchers to combine them for a comprehensive targeted proteomics workflow. PMID:23702368
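    None of the reviewed packages' internals are shown here; as a hedged sketch of the data structures the review describes, an MRM transition list and a crude check of scheduling load could look like this (all names hypothetical):

        from dataclasses import dataclass

        @dataclass
        class Transition:
            peptide: str
            q1_mz: float   # precursor m/z selected in the first quadrupole
            q3_mz: float   # fragment m/z selected in the third quadrupole
            rt: float      # expected retention time (min)

        def concurrency(transitions, window=2.0):
            """For each transition, count how many transitions are active in the
            same retention-time window -- the quantity a scheduler must keep
            below the instrument's cycle-time limit."""
            return [(t, sum(abs(u.rt - t.rt) <= window / 2 for u in transitions))
                    for t in transitions]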

  13. Modern voxel based data and geometry analysis software tools for industrial CT

    International Nuclear Information System (INIS)

    Computer tomography has become a well recognized tool in industrial quality control. Modern computer tomography systems, ranging from micro-CT to huge multi-MeV systems, allow us to generate more and more detailed views of the interior of nearly any object. With scan resolutions becoming smaller and smaller, and image matrices at the same time becoming larger and larger, we are able to localize the smallest defects even in large-scale objects. With the same data set we can also measure the outer and inner geometry of an object with a measurement-point density never achieved before with classical tactile or optical techniques. However, scanning objects at high resolution generates huge amounts of data, easily exceeding two GByte per scan. These amounts of data have been a major obstacle to wider acceptance of CT technology in industrial use: either no software tools have been available at all, or the available software process chains have not been able to process these amounts of data in reasonable time. This presentation will introduce a new generation of 3D visualization and analysis software tools for industrial CT users. Interactive visualization of huge data sets, several GByte in size, has become possible on a standard PC. Automatic wall thickness analysis and internal defect/porosity analysis can be done within minutes. In addition, this presentation will also demonstrate the latest generation of software tools for highly accurate 3D geometry analysis based on voxel data. (author)
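    The presentation's algorithms are not given in the abstract; one standard ingredient of internal defect/porosity analysis, labeling enclosed cavities in a thresholded voxel volume, can be sketched as follows (illustrative Python using scipy.ndimage, not the vendor's code):

        import numpy as np
        from scipy import ndimage

        def find_pores(volume, material_threshold):
            """Label air-filled cavities inside the material of a CT voxel volume.
            `volume` is a 3D array of gray values; voxels above the threshold
            are treated as material."""
            material = volume > material_threshold
            # Anything connected to the volume border is outside air, not a pore,
            # so fill holes first and subtract the material itself.
            filled = ndimage.binary_fill_holes(material)
            pores = filled & ~material
            labels, n = ndimage.label(pores)
            sizes = ndimage.sum(pores, labels, index=range(1, n + 1))
            return labels, sizes  # per-pore voxel counts -> porosity statistics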

  14. Development and implementation of software tools for NPP component safety and life cycle monitoring

    International Nuclear Information System (INIS)

    Two information systems affecting the technical safety and durability of components, viz. the Surveillance Program and the OPTIMUD application, are described as a basis for discussion of the broader context induced by any software tool implementation in the nuclear power area. (orig.)

  15. Wiki as a Corporate Learning Tool: Case Study for Software Development Company

    Science.gov (United States)

    Milovanovic, Milos; Minovic, Miroslav; Stavljanin, Velimir; Savkovic, Marko; Starcevic, Dusan

    2012-01-01

    In our study, we attempted to further investigate how Web 2.0 technologies influence workplace learning. Our particular interest was on using Wiki as a tool for corporate exchange of knowledge with the focus on informal learning. In this study, we collaborated with a multinational software development company that uses Wiki as a corporate tool…

  16. Claire, a tool used for the simulation of events in software tests

    International Nuclear Information System (INIS)

    CLAIRE provides a purely software-based system that makes it possible to validate on-line applications against their specifications or their code. The tool offers easy graphical design of the application and of its environment. It efficiently simulates any loaded model and controls the evolution either dynamically or with prerecorded timing. (TEC)

  17. Programming heterogeneous MPSoCs tool flows to close the software productivity gap

    CERN Document Server

    Castrillón Mazo, Jerónimo

    2014-01-01

    This book provides embedded software developers with techniques for programming heterogeneous Multi-Processor Systems-on-Chip (MPSoCs), capable of executing multiple applications simultaneously. It describes a set of algorithms and methodologies to narrow the software productivity gap, as well as an in-depth description of the underlying problems and challenges of today’s programming practices. The authors present four different tool flows: a parallelism extraction flow for applications written using the C programming language, a mapping and scheduling flow for parallel applications, a special mapping flow for baseband applications in the context of Software Defined Radio (SDR), and a final flow for analyzing multiple applications at design time. The tool flows are evaluated on Virtual Platforms (VPs), which mimic different characteristics of state-of-the-art heterogeneous MPSoCs.   • Provides a novel set of algorithms and methodologies for programming heterogeneous Multi-Processor Systems-on-Chip (MPSoCs)...

  18. Astrophysics datamining in the classroom: Exploring real data with new software tools and robotic telescopes

    CERN Document Server

    Doran, Rosa; Boudier, Thomas; Delva, Pacôme; Ferlet, Roger; Almeida, Maria L T; Barbosa, Domingos; Gomez, Edward; Pennypacker, Carl; Roche, Paul; Roberts, Sarah

    2012-01-01

    Within the efforts to bring frontline interactive astrophysics and astronomy to the classroom, Hands on Universe (HOU) has developed a set of exercises and a platform using real data obtained by some of the most advanced ground and space observatories. The backbone of this endeavour is a new free software Web tool - Such a Lovely Software for Astronomy based on Image J (Salsa J). It is student-friendly, developed specifically for the HOU project, and targets middle and high schools. It allows students to display, analyze, and explore professionally obtained astronomical images while learning concepts of gravitational dynamics, kinematics, nuclear fusion, and electromagnetism. The continuously evolving set of exercises and tutorials is being completed with real (professionally obtained) data to download and detailed tutorials. The flexibility of the Salsa J platform enables students and teachers to extend the exercises with their own observations. The software developed for the HOU program has been designed to...

  19. SPLASh: A software tool for stereotactic planning of recording chamber placement and electrode trajectories

    Directory of Open Access Journals (Sweden)

    Jochen Ditterich

    2011-03-01

    While computer-aided planning of human neurosurgeries is becoming more and more common, animal researchers still largely rely on paper atlases for planning their approach before implanting recording chambers to perform invasive recordings of neural activity, which makes this planning process tedious and error-prone. Here we present SPLASh (Stereotactic PLAnning Software), an interactive software tool for the stereotactic planning of recording chamber placement and electrode trajectories. SPLASh has been developed for monkey cortical recordings and relies on a combination of structural MRIs and electronic brain atlases. Since SPLASh is based on the neuroanatomy software Caret, it should also be possible to use it for other parts of the brain or other species for which Caret atlases are available. The tool allows the user to interactively evaluate different possible placements of recording chambers and to simulate electrode trajectories.

  20. SDMdata: A Web-Based Software Tool for Collecting Species Occurrence Records.

    Directory of Open Access Journals (Sweden)

    Xiaoquan Kong

    It is important to easily and efficiently obtain high quality species distribution data for predicting the potential distribution of species using species distribution models (SDMs). There is a need for a powerful software tool to automatically or semi-automatically assist in identifying and correcting errors. Here, we use Python to develop a web-based software tool (SDMdata) to easily collect occurrence data from the Global Biodiversity Information Facility (GBIF) and check species names and the accuracy of coordinates (latitude and longitude). It is open source software (GNU Affero General Public License/AGPL licensed), allowing anyone to access and manipulate the source code. SDMdata is available online free of charge from .
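    SDMdata's own validation rules are not listed in the abstract; a minimal sketch of the kind of coordinate and name checks it describes (illustrative only, not SDMdata's source) could be:

        def check_record(name, lat, lon):
            """Return a list of problems found with one occurrence record."""
            problems = []
            if not name or not name.strip():
                problems.append("missing species name")
            if lat is None or lon is None:
                problems.append("missing coordinates")
            else:
                if not -90.0 <= lat <= 90.0:
                    problems.append(f"latitude out of range: {lat}")
                if not -180.0 <= lon <= 180.0:
                    problems.append(f"longitude out of range: {lon}")
                if lat == 0.0 and lon == 0.0:
                    problems.append("suspicious (0, 0) coordinates")
            return problems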

  1. Splash: A Software Tool for Stereotactic Planning of Recording Chamber Placement and Electrode Trajectories

    Science.gov (United States)

    Sperka, Daniel J.; Ditterich, Jochen

    2011-01-01

    While computer-aided planning of human neurosurgeries is becoming more and more common, animal researchers still largely rely on paper atlases for planning their approach before implanting recording chambers to perform invasive recordings of neural activity, which makes this planning process tedious and error-prone. Here we present SPLASh (Stereotactic PLAnning Software), an interactive software tool for the stereotactic planning of recording chamber placement and electrode trajectories. SPLASh has been developed for monkey cortical recordings and relies on a combination of structural MRIs and electronic brain atlases. Since SPLASh is based on the neuroanatomy software Caret, it should also be possible to use it for other parts of the brain or other species for which Caret atlases are available. The tool allows the user to interactively evaluate different possible placements of recording chambers and to simulate electrode trajectories. PMID:21472085

  2. Software Construction and Composition Tools for Petascale Computing SCW0837 Progress Report

    Energy Technology Data Exchange (ETDEWEB)

    Epperly, T W; Hochstein, L

    2011-09-12

    The majority of scientific software is distributed as source code. As the number of library dependencies and supported platforms increases, so does the complexity of describing the rules for configuring and building software. In this project, we have performed an empirical study of the magnitude of the build problem by examining the development history of two DOE-funded scientific software projects. We have developed MixDown, a meta-build tool, to simplify the task of building applications that depend on multiple third-party libraries. The results of this research indicate that the effort scientific programmers spend on configuring and building software is a significant fraction of the total development effort, and that the use of MixDown can significantly simplify the task of building software with multiple dependencies.
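    MixDown's implementation is not described here; the core task of any meta-build tool, configuring and building packages only after their dependencies, reduces to a topological sort, sketched below with a hypothetical dependency graph:

        from graphlib import TopologicalSorter  # Python 3.9+

        # Hypothetical dependency graph: package -> set of packages it needs.
        deps = {
            "app":    {"solver", "io"},
            "solver": {"blas"},
            "io":     {"hdf5"},
            "blas":   set(),
            "hdf5":   set(),
        }

        def build_all(deps, build_one):
            """Configure and build every package after its dependencies."""
            for pkg in TopologicalSorter(deps).static_order():
                build_one(pkg)

        build_all(deps, lambda pkg: print(f"building {pkg}"))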

  3. An infrastructure for the creation of high end scientific and engineering software tools and applications

    Energy Technology Data Exchange (ETDEWEB)

    Drummond, L.A.; Marques, O.A.; Wilson, G.V.

    2003-04-01

    This document has been prepared as a response to the High End Computing Revitalization Task Force (HECRTF) call for white papers. Our main goal is to identify the mechanisms necessary for the design and implementation of an infrastructure to support development of high-end scientific and engineering software tools and applications. This infrastructure will provide a plethora of software services to facilitate the efficient deployment of future HEC technology as well as collaborations among researchers and engineers across disciplines and institutions. In particular, we address the following points: key software technologies that must be advanced to strengthen the foundation for developing new generations of HEC systems, and a software infrastructure for minimizing "time to solution" by users of HEC systems.

  4. A Probabilistic Model and Software Tool for Evaluating the Long-Term Performance of Landfill Covers

    International Nuclear Information System (INIS)

    A probabilistic model and software tool have been developed to assist in the selection, design, and monitoring of long-term covers for contaminated subsurface sites. The software platform contains multiple modules that can be used to simulate relevant features, events, and processes, including water flux through the cover, source-term release, vadose-zone transport, saturated-zone transport, gas transport, and exposure pathways. The component models are integrated into a probabilistic total-system performance-assessment model within a drag-and-drop software platform. Uncertainty and sensitivity analyses can be conducted that yield the following primary benefits: (1) quantification of uncertainty in the simulated performance metrics; (2) identification of the parameters most important to performance; and (3) comparison of alternative designs to optimize cost and performance. A case study has been performed using the Monticello Mill Tailings Site in Utah to illustrate the important features and benefits of the modeling approach and software.

  5. Assess/Mitigate Risk through the Use of Computer-Aided Software Engineering (CASE) Tools

    Science.gov (United States)

    Aguilar, Michael L.

    2013-01-01

    The NASA Engineering and Safety Center (NESC) was requested to perform an independent assessment of the mitigation of the Constellation Program (CxP) Risk 4421 through the use of computer-aided software engineering (CASE) tools. With the cancellation of the CxP, the assessment goals were modified to capture lessons learned and best practices in the use of CASE tools. The assessment goal was to prepare the next program for the use of these CASE tools. The outcome of the assessment is contained in this document.

  6. Discovering patterns of correlation and similarities in software project data with the Circos visualization tool

    CERN Document Server

    Kosti, Makrina Viola; Bourazani, Nikoleta; Angelis, Lefteris

    2011-01-01

    Software cost estimation based on multivariate data from completed projects requires the building of efficient models. These models essentially describe relations in the data, either on the basis of correlations between variables or of similarities between the projects. The continuous growth of the amount of data gathered, and the need to perform preliminary analysis in order to discover patterns able to drive the building of reasonable models, lead researchers towards intelligent and time-saving tools that can effectively describe data and their relationships. The goal of this paper is to suggest an innovative visualization tool, widely used in bioinformatics, which represents relations in data in an aesthetic and intelligent way. In order to illustrate the capabilities of the tool, we use a well-known dataset of software engineering projects.

  7. IPAT: a freely accessible software tool for analyzing multiple patent documents with inbuilt landscape visualizer.

    Science.gov (United States)

    Ajay, Dara; Gangwal, Rahul P; Sangamwar, Abhay T

    2015-01-01

    Intelligent Patent Analysis Tool (IPAT) is an online data-retrieval tool that operates on a text-mining algorithm to extract specific patent information in a predetermined pattern into an Excel sheet. The software is designed and developed to retrieve and analyze technology information from multiple patent documents and generate various patent landscape graphs and charts. The software is coded in C# in Visual Studio 2010; it extracts publicly available patent information from web pages such as Google Patent and simultaneously studies various technology trends based on user-defined parameters. In other words, IPAT combined with manual categorization acts as an excellent technology assessment tool in competitive intelligence and due diligence for predicting the future R&D forecast. PMID:26452016
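    IPAT's extraction patterns are not published in the abstract; a toy version of the general idea, pulling labeled front-page fields out of patent text with regular expressions and writing spreadsheet rows (the field labels here are assumptions, not IPAT's), might look like:

        import csv
        import re

        # Illustrative field patterns; real patent pages vary widely.
        FIELDS = {
            "number":   re.compile(r"Patent\s+No\.?:?\s*(\S+)"),
            "assignee": re.compile(r"Assignee:?\s*(.+)"),
            "filed":    re.compile(r"Filed:?\s*([\d/.-]+)"),
        }

        def extract(text):
            """Return one row of extracted fields from a patent document's text."""
            return {k: (m.group(1).strip() if (m := p.search(text)) else "")
                    for k, p in FIELDS.items()}

        def to_csv(docs, path):
            # Write one row per document, suitable for opening in Excel.
            with open(path, "w", newline="") as f:
                w = csv.DictWriter(f, fieldnames=list(FIELDS))
                w.writeheader()
                w.writerows(extract(d) for d in docs)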

  8. SU-E-T-27: A Tool for Routine Quality Assurance of Radiotherapy Dose Calculation Software

    International Nuclear Information System (INIS)

    Purpose: Dose calculation software is thoroughly evaluated when it is commissioned; however, evaluation of periodic software updates is typically limited in scope due to staffing constraints and the need to quickly return the treatment planning system to clinical service. We developed a tool for quickly and comprehensively testing and documenting dose calculation software against measured data. Methods: A tool was developed using MatLab (The MathWorks, Natick, MA) for evaluation of dose calculation algorithms against measured data. Inputs to the tool are measured data, reference DICOM RT PLAN files describing the measurements, and dose calculations in DICOM format. The tool consists of a collection of extensible modules that can perform analysis of point dose, depth dose curves, and profiles using dose difference, distance-to-agreement, and the gamma-index. Each module generates a report subsection that is incorporated into a master template, which is converted to final form in portable document format (PDF). Results: After each change to the treatment planning system, a report can be generated in approximately 90 minutes. The tool has been in use for more than 5 years, spanning 5 versions of the eMC and 4 versions of the AAA. We have detected changes to the algorithms that affected clinical practice once during this period. Conclusion: Our tool provides an efficient method for quality assurance of dose calculation software, providing a complete set of tests for an update. Future work includes the addition of plan level tests, allowing incorporation of, for example, the TG-119 test suite for IMRT, and integration with the treatment planning system via an application programming interface. Integration with the planning system will permit fully-automated testing and reporting at scheduled intervals
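    For reference, the gamma-index named above combines the dose-difference and distance-to-agreement criteria into a single pass/fail metric; a brute-force one-dimensional sketch (illustrative Python, not the authors' MatLab implementation):

        import numpy as np

        def gamma_1d(x_ref, d_ref, x_eval, d_eval, dd=0.03, dta=3.0):
            """1D gamma index: for each reference point, the minimum over all
            evaluated points of sqrt((dx/dta)^2 + (dD/dDmax)^2).
            dd is the dose criterion (fraction of max dose), dta is in mm."""
            x_ref, d_ref = np.asarray(x_ref), np.asarray(d_ref)
            x_eval, d_eval = np.asarray(x_eval), np.asarray(d_eval)
            d_norm = dd * d_ref.max()
            gammas = np.empty(len(d_ref))
            for i, (x, d) in enumerate(zip(x_ref, d_ref)):
                dist = (x_eval - x) / dta      # distance-to-agreement term
                diff = (d_eval - d) / d_norm   # dose-difference term
                gammas[i] = np.sqrt(dist**2 + diff**2).min()
            return gammas                      # gamma <= 1 passes at that point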

  9. SU-E-T-27: A Tool for Routine Quality Assurance of Radiotherapy Dose Calculation Software

    Energy Technology Data Exchange (ETDEWEB)

    Popple, R; Cardan, R; Duan, J; Wu, X; Shen, S; Brezovich, I [The University of Alabama at Birmingham, Birmingham, AL (United States)

    2014-06-01

    Purpose: Dose calculation software is thoroughly evaluated when it is commissioned; however, evaluation of periodic software updates is typically limited in scope due to staffing constraints and the need to quickly return the treatment planning system to clinical service. We developed a tool for quickly and comprehensively testing and documenting dose calculation software against measured data. Methods: A tool was developed using MatLab (The MathWorks, Natick, MA) for evaluation of dose calculation algorithms against measured data. Inputs to the tool are measured data, reference DICOM RT PLAN files describing the measurements, and dose calculations in DICOM format. The tool consists of a collection of extensible modules that can perform analysis of point dose, depth dose curves, and profiles using dose difference, distance-to-agreement, and the gamma-index. Each module generates a report subsection that is incorporated into a master template, which is converted to final form in portable document format (PDF). Results: After each change to the treatment planning system, a report can be generated in approximately 90 minutes. The tool has been in use for more than 5 years, spanning 5 versions of the eMC and 4 versions of the AAA. We have detected changes to the algorithms that affected clinical practice once during this period. Conclusion: Our tool provides an efficient method for quality assurance of dose calculation software, providing a complete set of tests for an update. Future work includes the addition of plan level tests, allowing incorporation of, for example, the TG-119 test suite for IMRT, and integration with the treatment planning system via an application programming interface. Integration with the planning system will permit fully-automated testing and reporting at scheduled intervals.

  10. Life Cycle Assessment Studies of Chemical and Biochemical Processes through the new LCSoft Software-tool

    DEFF Research Database (Denmark)

    Supawanich, Perapong; Malakul, Pomthong; Gani, Rafiqul

    2015-01-01

    Life Cycle Assessment or LCA is an effective tool for quantifying the potential environmental impacts of products, processes, or services in order to support the selection of desired products and/or processes from different alternatives. For more sustainable process designs, technical requirements have to be evaluated together with environmental and economic aspects. The LCSoft software-tool has been developed to perform LCA as a stand-alone tool as well as integrated with other process design tools such as process simulation, economic analysis (ECON), and sustainable process design. A fourth task has been added to validate and improve LCSoft by testing it against several case studies and comparing the assessment results with other available tools.

  11. Reliability and Portability Assessment Tool Based on Hazard Rates for an Embedded Open Source Software

    Directory of Open Access Journals (Sweden)

    Yoshinobu Tamura

    2014-10-01

    An embedded OSS (Open Source Software), one type of OSS, has been gaining a lot of attention in the embedded system area, e.g., Android, BusyBox, TRON, etc. However, poor handling of quality problems and customer support has hindered the progress of embedded OSS. Many companies have therefore been hesitant to adopt embedded OSS, because OSS comes in several software versions. Also, it is difficult for developers to assess the reliability and portability of the porting phase when installing embedded OSS on a single-board computer. In this paper, we develop a software reliability/portability assessment tool based on a hazard rate model for embedded OSS. In particular, we analyze actual software failure-occurrence time-interval data to show numerical examples of software reliability/portability assessment for embedded OSS. Moreover, we show that our model and tool can assist quality improvement for embedded OSS systems development.
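    The paper's hazard rate model is not reproduced in this abstract; as a minimal illustration of the general approach, a constant-hazard (exponential) fit to failure time-interval data gives (hypothetical numbers):

        import numpy as np

        def fit_constant_hazard(intervals):
            """Maximum-likelihood hazard rate for an exponential model,
            given the times between successive software failures."""
            return 1.0 / np.mean(intervals)     # failures per unit time

        def reliability(t, lam):
            """Probability of surviving time t without failure: R(t) = exp(-lam*t)."""
            return np.exp(-lam * t)

        intervals = [12.0, 7.5, 20.1, 15.3, 9.8]   # hypothetical data (hours)
        lam = fit_constant_hazard(intervals)
        print(f"hazard rate: {lam:.3f}/h, R(24h) = {reliability(24, lam):.2f}")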

  12. Ignominy: a tool for software dependency and metric analysis with examples from large HEP packages

    International Nuclear Information System (INIS)

    Ignominy is a tool developed in the CMS IGUANA project to analyse the structure of software systems. Its primary component is a dependency scanner that distills information into human-usable forms. It also includes several tools to visualise the collected data in the form of graphical views and numerical metrics. Ignominy was designed to adapt to almost any reasonable structure, and it has been used to analyse several large projects. The original purpose of Ignominy was to help us better ensure the quality of our own software, and in particular warn us about possible structural problems early on. As a part of this activity it is now used as a standard part of our release procedure. The authors also use it to evaluate and study the quality of external packages they plan to make use of. The authors describe what Ignominy can find out, and how it can be used to visualise and assess a software structure. The authors also discuss the inherent problems of the analysis as well as the different approaches to modularity the tool makes quite evident. The focus is the illustration of these issues through the analysis results for several sizable HEP software projects

  13. Development of homemade software for radiation management using versatile commercial tools

    International Nuclear Information System (INIS)

    Computers are now used in almost all radiation facilities. However, commercial software for radiation control is expensive and often not well suited to an individual facility. In particular, I face the problem that the controlled area and the radiation control office are housed in two separate buildings. Therefore, I developed homemade software for radiation control using commercial tools. The result is a system characterized by real-time acquisition of user input data from the controlled area and automatic preparation of several legally required reports for radiation control. (author)

  14. A semi-automatic software tool for batch processing of yeast colony images

    Czech Academy of Sciences Publication Activity Database

    Schier, Jan; Kovář, Bohumil

    Innsbruck: The International Association of Science and Technology for Development (IASTED), 2011 - (Zhang, J.), s. 206-212 ISBN 978-0-88986-865-6. [Eighth IASTED International Conference on Signal Processing, Pattern Recognition, and Applications (SPPRA). Innsbruck (AT), 16.02.2011-18.02.2011] R&D Projects: GA MŠk(CZ) 1M0567 Institutional research plan: CEZ:AV0Z10750506 Keywords : yeast colonies * Petri dish * Image segmentation * Fast radial transform Subject RIV: JC - Computer Hardware ; Software http://library.utia.cas.cz/separaty/2011/ZS/schier-a semi-automatic software tool for batch processing of yeast colony images.pdf

  15. Lessons learned from former radiation accidents on development of software tools for effective decision making support

    Czech Academy of Sciences Publication Activity Database

    Pecha, Petr; Hofman, Radek; Kuča, P.

    Praha : T-SOFT a.s, 2009, 15-1-15-8. ISBN 978-80-254-5913-3. [11th International Conference on Present and Future of Crisis Management 2009. Praha (CZ), 23.11.2009-24.11.2009] R&D Projects: GA ČR(CZ) GA102/07/1596 Institutional research plan: CEZ:AV0Z10750506 Keywords : Nuclear Accident * Lessons Learned * Software Support Subject RIV: AQ - Safety, Health Protection, Human - Machine http://library.utia.cas.cz/separaty/2009/AS/pecha- lessons learned from former radiation accidents on development of software tools for effective decision making support.pdf

  16. Cerec Smile Design--a software tool for the enhancement of restorations in the esthetic zone.

    Science.gov (United States)

    Kurbad, Andreas; Kurbad, Susanne

    2013-01-01

    Restorations in the esthetic zone can now be enhanced using software tools. In addition to the design of the restoration itself, a part or all of the patient's face can be displayed on the monitor to increase the predictability of treatment results. Using the Smile Design components of the Cerec and inLab software, a digital photograph of the patient can be projected onto a three-dimensional dummy head. In addition to its use for the enhancement of the CAD process, this technology can also be utilized for marketing purposes. PMID:24364196

  17. A review of diffusion tensor magnetic resonance imaging computational methods and software tools.

    Science.gov (United States)

    Hasan, Khader M; Walimuni, Indika S; Abid, Humaira; Hahn, Klaus R

    2011-12-01

    In this work we provide an up-to-date short review of computational magnetic resonance imaging (MRI) and software tools that are widely used to process and analyze diffusion-weighted MRI data. A review of different methods used to acquire, model and analyze diffusion-weighted imaging data (DWI) is first provided with focus on diffusion tensor imaging (DTI). The major preprocessing, processing and post-processing procedures applied to DTI data are discussed. A list of freely available software packages to analyze diffusion MRI data is also provided. PMID:21087766
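    As a reminder of the core computation these packages share, the diffusion tensor can be fit voxel-wise by linear least squares from the Stejskal-Tanner relation ln(S/S0) = -b g^T D g; a compact sketch (illustrative, not any reviewed package's code):

        import numpy as np

        def fit_tensor(signals, s0, bvals, bvecs):
            """Least-squares fit of the six unique diffusion tensor elements for
            one voxel. signals: (N,) DWI signals; bvecs: (N, 3) unit gradients."""
            g = np.asarray(bvecs)
            # Design matrix for [Dxx, Dyy, Dzz, Dxy, Dxz, Dyz].
            B = -np.asarray(bvals)[:, None] * np.column_stack([
                g[:, 0]**2, g[:, 1]**2, g[:, 2]**2,
                2 * g[:, 0] * g[:, 1], 2 * g[:, 0] * g[:, 2], 2 * g[:, 1] * g[:, 2],
            ])
            y = np.log(np.asarray(signals) / s0)
            d, *_ = np.linalg.lstsq(B, y, rcond=None)
            D = np.array([[d[0], d[3], d[4]],
                          [d[3], d[1], d[5]],
                          [d[4], d[5], d[2]]])
            return D  # eigen-decompose for FA and principal diffusion direction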

  18. Software tools for simultaneous data visualization and T cell epitopes and disorder prediction in proteins.

    Science.gov (United States)

    Jandrlić, Davorka R; Lazić, Goran M; Mitić, Nenad S; Pavlović, Mirjana D

    2016-04-01

    We have developed EpDis and MassPred, extendable open source software tools that support bioinformatic research and enable parallel use of different methods for the prediction of T cell epitopes, disorder and disordered binding regions and hydropathy calculation. These tools offer a semi-automated installation of chosen sets of external predictors and an interface allowing for easy application of the prediction methods, which can be applied either to individual proteins or to datasets of a large number of proteins. In addition to access to prediction methods, the tools also provide visualization of the obtained results, calculation of consensus from results of different methods, as well as import of experimental data and their comparison with results obtained with different predictors. The tools also offer a graphical user interface and the possibility to store data and the results obtained using all of the integrated methods in the relational database or flat file for further analysis. The MassPred part enables a massive parallel application of all integrated predictors to the set of proteins. Both tools can be downloaded from http://bioinfo.matf.bg.ac.rs/home/downloads.wafl?cat=Software. Appendix A includes the technical description of the created tools and a list of supported predictors. PMID:26851400

  19. A Runtime Environment for Supporting Research in Resilient HPC System Software & Tools

    Energy Technology Data Exchange (ETDEWEB)

    Vallee, Geoffroy R [ORNL]; Naughton, III, Thomas J [ORNL]; Boehm, Swen [ORNL]; Engelmann, Christian [ORNL]

    2013-01-01

    The high-performance computing (HPC) community continues to increase the size and complexity of hardware platforms that support advanced scientific workloads. The runtime environment (RTE) is a crucial layer in the software stack for these large-scale systems. The RTE manages the interface between the operating system and the application running in parallel on the machine. The deployment of applications and tools on large-scale HPC computing systems requires the RTE to manage process creation in a scalable manner, support sparse connectivity, and provide fault tolerance. We have developed a new RTE that provides a basis for building distributed execution environments and developing tools for HPC to aid research in system software and resilience. This paper describes the software architecture of the Scalable runTime Component Infrastructure (STCI), which is intended to provide a complete infrastructure for scalable start-up and management of many processes in large-scale HPC systems. We highlight features of the current implementation, which is provided as a system library that allows developers to easily use and integrate STCI in their tools and/or applications. The motivation for this work has been to support ongoing research activities in fault-tolerance for large-scale systems. We discuss the advantages of the modular framework employed and describe two use cases that demonstrate its capabilities: (i) an alternate runtime for a Message Passing Interface (MPI) stack, and (ii) a distributed control and communication substrate for a fault-injection tool.

  20. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Directory of Open Access Journals (Sweden)

    Pakarinen Jyri

    2010-01-01

    Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
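    As an example of the kind of measurement such a tool automates (a sketch under simple assumptions, not the toolkit's own code), total harmonic distortion of a system's response to a sine input can be estimated from an FFT:

        import numpy as np

        def thd(signal, fs, f0, n_harmonics=5):
            """Estimate total harmonic distortion of `signal` (sampled at fs)
            driven by a sine at fundamental frequency f0."""
            spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
            freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
            def peak(f):  # magnitude of the bin closest to frequency f
                return spectrum[np.argmin(np.abs(freqs - f))]
            fundamental = peak(f0)
            harmonics = [peak(k * f0) for k in range(2, n_harmonics + 2)]
            return np.sqrt(np.sum(np.square(harmonics))) / fundamental

        # Example: a softly clipped sine shows measurable THD.
        fs, f0 = 48000, 1000
        t = np.arange(fs) / fs
        x = np.tanh(2.0 * np.sin(2 * np.pi * f0 * t))
        print(f"THD = {100 * thd(x, fs, f0):.1f}%")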

  1. Exploiting Patterns and Tool Support for Reusable and Automated Change Support for Software Architectures

    Directory of Open Access Journals (Sweden)

    Aakash Ahmad

    2016-01-01

    Lehman's law of continuing change implies that software must continually evolve to accommodate frequently changing requirements in existing systems. Also, maintainability as an attribute of system quality requires that changes be systematically implemented in existing software throughout its lifecycle. To support continuous software evolution, the primary challenges include (i) enhancing reuse of recurring changes; and (ii) decreasing the efforts for change implementation. We propose change patterns and demonstrate their applicability as reusable solutions to recurring problems of architectural change implementation. Tool support can empower the role of a designer/architect by helping them avoid laborious tasks and execute complex and large numbers of changes in an automated way. Recently, change patterns as well as tool support have been exploited for architecture evolution; however, there is no research to unify pattern-driven (reusable) and tool-supported (automated) evolution, which is the contribution of this paper. By exploiting patterns with tool support we demonstrate the evolution of a peer-to-peer system towards client-server architecture. Evaluation results suggest that: (i) patterns promote reuse but lack fine-granular change implementation, and (ii) the tool supports automation but user intervention is required to customise architecture change management.

  2. TINA manual landmarking tool: software for the precise digitization of 3D landmarks

    Directory of Open Access Journals (Sweden)

    Schunke Anja C

    2012-04-01

    Abstract Background Interest in the placing of landmarks and subsequent morphometric analyses of shape for 3D data has increased with the increasing accessibility of computed tomography (CT) scanners. However, current computer programs for this task suffer from various practical drawbacks. We present here a free software tool that overcomes many of these problems. Results The TINA Manual Landmarking Tool was developed for the digitization of 3D data sets. It enables the generation of a modifiable 3D volume rendering display plus matching orthogonal 2D cross-sections from DICOM files. The object can be rotated and axes defined and fixed. Predefined lists of landmarks can be loaded and the landmarks identified within any of the representations. Output files are stored in various established formats, depending on the preferred evaluation software. Conclusions The software tool presented here provides several options facilitating the placing of landmarks on 3D objects, including volume rendering from DICOM files, definition and fixation of meaningful axes, easy import, placement, control, and export of landmarks, and handling of large datasets. The TINA Manual Landmark Tool runs under Linux and can be obtained for free from http://www.tina-vision.net/tarballs/.

  3. Creating a strategic plan for configuration management using computer aided software engineering (CASE) tools

    International Nuclear Information System (INIS)

    This paper provides guidance in the definition, documentation, measurement, enhancement of processes, and validation of a strategic plan for configuration management (CM). The approach and methodology used in establishing a strategic plan is the same for any enterprise, including the Department of Energy (DOE), commercial nuclear plants, the Department of Defense (DOD), or large industrial complexes. The principles and techniques presented are used worldwide by some of the largest corporations. The authors used industry knowledge and the areas of their current employment to illustrate and provide examples. Developing a strategic configuration and information management plan for DOE Idaho Field Office (DOE-ID) facilities is discussed in this paper. A good knowledge of CM principles is the key to successful strategic planning. This paper describes and defines CM elements, and discusses how CM integrates the facility's physical configuration, design basis, and documentation. The strategic plan does not need the support of a computer-aided software engineering (CASE) tool. However, the use of a CASE tool provides a methodology for consistency in approach, graphics, and database capability, combined to form an encyclopedia and a method of presentation that is easily understood and aids the process of reengineering. CASE tools have much more capability than that stated above. Some examples are supporting a joint application development (JAD) group to prepare a software functional specification document and, if necessary, providing the capability to automatically generate software application code. This paper briefly discusses the characteristics and capabilities of two CASE tools that use different methodologies to generate similar deliverables.

  4. Development of Safety-Critical Software for Nuclear Power Plant using a CASE Tool

    International Nuclear Information System (INIS)

    The Integrated SOftware Development Environment (ISODE) is developed to provide the major S/W life cycle processes, composed of a development process, a V/V process, a requirements traceability process, an automated document generation process, and a target importing process to the Programmable Logic Controller (PLC) platform. This provides critical safety software developers with a certified, domain-optimized, model-based development environment, and the associated services to reduce the time and effort to develop software, such as debugging, simulation, code generation and document generation. This also provides critical safety software verifiers with integrated V/V features for each phase of the software life cycle, using appropriate tools such as model test coverage, formal verification, and automated report generation. In addition to development and verification, the ISODE gives a complete traceability solution from the SW design phase to the testing phase. Using this information, coverage and impact analysis can be done easily whenever software modification is necessary. The final source codes of ISODE are imported into the newly developed PLC environment as modules, after being automatically converted into the format required by the PLC. Additional tests at module and unit level are performed on the target platform

  5. Development of Safety-Critical Software for Nuclear Power Plant using a CASE Tool

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chang Ho; Oh, Do Young; Kim, Koh Eun; Choi, Woong Seock; Sohn, Se Do; Kim, Jae Hack; Kim, Hang Bae [KEPCO E and C, Daejeon (Korea, Republic of)

    2011-08-15

    The Integrated SOftware Development Environment (ISODE) is developed to provide the major S/W life cycle processes, composed of a development process, a V/V process, a requirements traceability process, an automated document generation process, and a target importing process to the Programmable Logic Controller (PLC) platform. This provides critical safety software developers with a certified, domain-optimized, model-based development environment, and the associated services to reduce the time and effort to develop software, such as debugging, simulation, code generation and document generation. This also provides critical safety software verifiers with integrated V/V features for each phase of the software life cycle, using appropriate tools such as model test coverage, formal verification, and automated report generation. In addition to development and verification, the ISODE gives a complete traceability solution from the SW design phase to the testing phase. Using this information, coverage and impact analysis can be done easily whenever software modification is necessary. The final source codes of ISODE are imported into the newly developed PLC environment as modules, after being automatically converted into the format required by the PLC. Additional tests at module and unit level are performed on the target platform.

  6. The D2G2 project: a new software tool for nuclear engineering design in Canada

    Energy Technology Data Exchange (ETDEWEB)

    Rheaume, P.; Lefebvre, J.F.; Roy, R.; Koclas, J. [Ecole Polytechnique de Montreal, Nuclear Engineering Inst., Montreal, Quebec (Canada)]. E-mail: pascal.rheaume@polymtl.ca; jean-francois.lefebvre@polymtl.ca

    2004-07-01

    Nowadays, high quality neutronic simulation codes are readily available. The open source software suite DRAGON/DONJON is a good example: it is free, it has proven quality and correctness over the years, and it is still developed and maintained at Ecole Polytechnique de Montreal. However, most simulation codes have the following weaknesses: limited usability, poor maintainability, no internal data standardization, and poor portability. The D2G2 project is a software development initiative that aims to create an upper-layer software tool that eliminates these weaknesses of classic simulation codes. This paper presents D2G2Client's and D2G2Server's principal capabilities, how they interact, and the libraries they use. (author)

  7. PyElph - a software tool for gel images analysis and phylogenetics

    Directory of Open Access Journals (Sweden)

    Pavel Ana Brânduşa

    2012-01-01

    Abstract Background This paper presents PyElph, a software tool which automatically extracts data from gel images, computes the molecular weights of the analyzed molecules or fragments, compares DNA patterns which result from experiments with molecular genetic markers and, also, generates phylogenetic trees computed by five clustering methods, using the information extracted from the analyzed gel image. The software can be successfully used for population genetics, phylogenetics, taxonomic studies and other applications which require gel image analysis. Researchers and students working in molecular biology and genetics would benefit greatly from the proposed software because it is free, open source, easy to use, has a friendly Graphical User Interface and does not depend on specific image acquisition devices like other commercial programs with similar functionalities do. Results PyElph software tool is entirely implemented in Python which is a very popular programming language among the bioinformatics community. It provides a very friendly Graphical User Interface which was designed in six steps that gradually lead to the results. The user is guided through the following steps: image loading and preparation, lane detection, band detection, molecular weights computation based on a molecular weight marker, band matching and finally, the computation and visualization of phylogenetic trees. A strong point of the software is the visualization component for the processed data. The Graphical User Interface provides operations for image manipulation and highlights lanes, bands and band matching in the analyzed gel image. All the data and images generated in each step can be saved. The software has been tested on several DNA patterns obtained from experiments with different genetic markers. Examples of genetic markers which can be analyzed using PyElph are RFLP (Restriction Fragment Length Polymorphism), AFLP (Amplified Fragment Length Polymorphism), RAPD
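    The molecular-weight computation PyElph describes is commonly done by interpolating band positions against the marker lane on a log scale; a hedged sketch of that step (not PyElph's source):

        import numpy as np

        def weights_from_marker(marker_positions, marker_weights, band_positions):
            """Interpolate molecular weights for sample bands from a ladder.
            Migration distance is roughly linear in log10(molecular weight)."""
            logw = np.log10(marker_weights)
            # np.interp needs increasing x; migration distances increase
            # down the gel as molecular weights decrease.
            est = np.interp(band_positions, marker_positions, logw)
            return 10.0 ** est

        ladder_pos = [20, 35, 55, 80, 110]          # px down the gel (hypothetical)
        ladder_bp  = [10000, 5000, 2000, 1000, 500] # base pairs
        print(weights_from_marker(ladder_pos, ladder_bp, [45, 90]))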

  8. PC Software graphics tool for conceptual design of space/planetary electrical power systems

    Science.gov (United States)

    Truong, Long V.

    1995-01-01

    This paper describes the Decision Support System (DSS), a personal computer software graphics tool for designing conceptual space and/or planetary electrical power systems. By using the DSS, users can obtain desirable system design and operating parameters, such as system weight, electrical distribution efficiency, and bus power. With this tool, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. The DSS is a user-friendly, menu-driven tool with online help and a custom graphical user interface. An example design and results are illustrated for a typical space power system with multiple types of power sources, frequencies, energy storage systems, and loads.

  9. The Use of Software Project Management Tools in Saudi Arabia: An Exploratory Survey

    Directory of Open Access Journals (Sweden)

    Nouf AlMobarak

    2013-08-01

    This paper reports the results of an online survey study, which was conducted to investigate the use of software project management tools in Saudi Arabia. The survey provides insights into project management in the local context of Saudi Arabia from ten different companies which participated in this study. The aim is to explore and specify the project management tools used by software project management teams and their managers, and to understand the supported features that might influence their selection. Moreover, the existence of an Arabic interface, the Hijri calendar and Arabic documentation has been specially considered, due to the nature of the local context in dealing with the Hijri calendar and the prolific use of Arabic as the formal language in communication with clients in the public sector.

  10. RAVEN as a tool for dynamic probabilistic risk assessment: Software overview

    International Nuclear Information System (INIS)

    RAVEN is a software tool under development at the Idaho National Laboratory (INL) that acts as the control logic driver and post-processing tool for the newly developed Thermal-Hydraulic code RELAP-7. The scope of this paper is to show the software structure of RAVEN and its utilization in connection with RELAP-7. A short overview of the mathematical framework behind the code is presented along with its main capabilities such as on-line controlling/ monitoring and Monte-Carlo sampling. A demo of a Station Black Out PRA analysis of a simplified Pressurized Water Reactor (PWR) model is shown in order to demonstrate the Monte-Carlo and clustering capabilities. (authors)
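    RAVEN's sampling API is not shown in this abstract; the Monte-Carlo pattern it refers to can be sketched generically as follows (hypothetical scenario variables and distributions, not RAVEN's interface):

        import random

        def sample_scenario():
            """Draw one station-blackout-like scenario (hypothetical distributions)."""
            return {
                "battery_life_h":  random.uniform(4.0, 8.0),
                "recovery_time_h": random.expovariate(1.0 / 6.0),
            }

        def run(scenario):
            # Stand-in for a system-code run: failure if power returns too late.
            return scenario["recovery_time_h"] > scenario["battery_life_h"]

        n = 10_000
        failures = sum(run(sample_scenario()) for _ in range(n))
        print(f"estimated failure probability: {failures / n:.3f}")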

  11. RAVEN AS A TOOL FOR DYNAMIC PROBABILISTIC RISK ASSESSMENT: SOFTWARE OVERVIEW

    Energy Technology Data Exchange (ETDEWEB)

    Alfonsi Andrea; Mandelli Diego; Rabiti Cristian; Joshua Cogliati; Robert Kinoshita

    2013-05-01

    RAVEN is a software tool under development at the Idaho National Laboratory (INL) that acts as the control logic driver and post-processing tool for the newly developed Thermal-Hydraulic code RELAP-7. The scope of this paper is to show the software structure of RAVEN and its utilization in connection with RELAP-7. A short overview of the mathematical framework behind the code is presented along with its main capabilities such as on-line controlling/monitoring and Monte-Carlo sampling. A demo of a Station Black Out PRA analysis of a simplified Pressurized Water Reactor (PWR) model is shown in order to demonstrate the Monte-Carlo and clustering capabilities.

  12. RAVEN as a tool for dynamic probabilistic risk assessment: Software overview

    Energy Technology Data Exchange (ETDEWEB)

    Alfonsi, A.; Rabiti, C.; Mandelli, D.; Cogliati, J. J.; Kinoshita, R. A. [Idaho National Laboratory, 2525 Fremont Avenue, Idaho Falls, ID 83415 (United States)

    2013-07-01

    RAVEN is a software tool under development at the Idaho National Laboratory (INL) that acts as the control logic driver and post-processing tool for the newly developed Thermal-Hydraulic code RELAP-7. The scope of this paper is to show the software structure of RAVEN and its utilization in connection with RELAP-7. A short overview of the mathematical framework behind the code is presented along with its main capabilities such as on-line controlling/ monitoring and Monte-Carlo sampling. A demo of a Station Black Out PRA analysis of a simplified Pressurized Water Reactor (PWR) model is shown in order to demonstrate the Monte-Carlo and clustering capabilities. (authors)

  13. Development of computer-aided software engineering tool for sequential control of JT-60U

    Energy Technology Data Exchange (ETDEWEB)

    Shimono, M.; Akasaka, H.; Kurihara, K.; Kimura, T. [Japan Atomic Energy Research Inst., Naka, Ibaraki (Japan). Naka Fusion Research Establishment]

    1995-12-31

    Discharge sequential control (DSC) is an essential control function for the intermittent and pulsed discharge operation of a tokamak device, ensuring that the many subsystems work with each other in the correct order and/or synchronously. In developing a DSC program, block diagrams of the logical operations for sequential control are first drawn as part of its design. Then the logical operators and I/Os involved in the block diagrams are compiled and converted to a particular form. Since the block diagrams for sequential control amount to about 50 sheets in the case of the JT-60 upgrade tokamak (JT-60U) high power discharge, and the above development steps have so far been performed manually, a great effort has been required for program development. In order to remove inefficiency from these development processes, a computer-aided software engineering (CASE) tool has been developed on a UNIX workstation. This paper reports how the authors designed it for the development of the sequential control programs. The tool is composed of the following three tools: (1) Automatic drawing tool, (2) Editing tool, and (3) Trace tool. This CASE tool, an object-oriented programming tool having graphical formalism, can powerfully accelerate the development cycle for the sequential control function commonly associated with pulsed discharge in a tokamak fusion device.

  14. Software Tools for Emittance Measurement and Matching for 12 GeV CEBAF

    Energy Technology Data Exchange (ETDEWEB)

    Turner, Dennis L. [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States)

    2016-05-01

    This paper discusses model-driven setup of the Continuous Electron Beam Accelerator Facility (CEBAF) for the 12 GeV era, focusing on qsUtility. qsUtility is a set of software tools created to perform emittance measurements, analyze those measurements, and compute optics corrections based upon the measurements. qsUtility was developed as a toolset to reduce machine configuration time, improve reproducibility by way of an accurate accelerator model, and provide Operations staff with tools to measure and correct machine optics with little or no assistance from optics experts.

  15. Nuclear power plant aging and lifetime management with the COMSY software tool

    International Nuclear Information System (INIS)

    The main challenge of lifetime management in nuclear power plants is to maintain the integrity of systems, components and structures over long operational periods. In order to address this problem it is important to know the specific and relevant damage mechanisms of each system, component or structure. The software tool COMSY supports knowledge-based lifetime management strategies. The program includes advanced calculation tools and models for damage prediction, contains an extensive material properties library, and manages inspection data. Based on COMSY, a condition-oriented lifetime assessment for components and piping systems is available. Results from component inspections are systematically used to refine the lifetime assessment.

  16. A Case Study of Black-Box Testing for Embedded Software using Test Automation Tool

    OpenAIRE

    Changhyun Baek; Joongsoon Jang; Gihyun Jung; Kyunghee Choi; Seungkyu Park

    2007-01-01

    This research shows a case study of Black-Box testing for a Temperature Controller (TC), which is a typical embedded system. A test automation tool, TEST, was developed and several kinds of TCs were tested using the tool. We presented a statistical analysis of the test results of the automated testing and defined the properties of the software bugs for the embedded system. The main results of the study were the following: (a) a test case prioritization technique was needed because the rev...

  17. A SOFTWARE TOOL TO COMPARE MEASURED AND SIMULATED BUILDING ENERGY PERFORMANCE DATA

    Energy Technology Data Exchange (ETDEWEB)

    Maile, Tobias; Bazjanac, Vladimir; O'Donnell, James; Garr, Matthew

    2011-11-01

    Building energy performance is often inadequate when compared to design goals. To link design goals to actual operation one can compare measured with simulated energy performance data. Our previously developed comparison approach is the Energy Performance Comparison Methodology (EPCM), which enables the identification of performance problems based on a comparison of measured and simulated performance data. In the context of this method, we developed a software tool that provides graphing and data processing capabilities for the two performance data sets. The software tool, called SEE IT (Stanford Energy Efficiency Information Tool), eliminates the need for manual generation of data plots and data reformatting. SEE IT makes the generation of time series, scatter and carpet plots independent of the source of data (measured or simulated) and provides a valuable tool for comparing measurements with simulation results. SEE IT also allows assigning data points to a predefined building object hierarchy and supports different versions of simulated performance data. This paper briefly introduces the EPCM, describes the SEE IT tool and illustrates its use in the context of a building case study.
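
    A minimal sketch of the underlying comparison step, assuming synthetic hourly load data (Python with numpy and matplotlib; SEE IT's own data model, plot types and file formats are considerably richer than this):

        import numpy as np
        import matplotlib.pyplot as plt

        hours = np.arange(24)
        # Invented data standing in for measured and simulated building loads.
        measured = 50 + 20 * np.sin((hours - 6) / 24 * 2 * np.pi) + np.random.normal(0, 2, 24)
        simulated = 48 + 22 * np.sin((hours - 7) / 24 * 2 * np.pi)

        rmse = np.sqrt(np.mean((measured - simulated) ** 2))
        plt.plot(hours, measured, label="measured")
        plt.plot(hours, simulated, label="simulated")
        plt.xlabel("hour of day"); plt.ylabel("electric load (kW)")
        plt.title(f"Measured vs. simulated load (RMSE = {rmse:.1f} kW)")
        plt.legend(); plt.show()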

  18. Texas Renewable Energy Evaluation Software (TREES): A screening tool for economic assessment of renewable energy options

    International Nuclear Information System (INIS)

    Screening software has been developed to assess the economic feasibility of renewable energy alternatives for applications in Texas. The renewable options include solar water heating, and PV and wind for electric power generation. A range of hot water applications, from swimming pool heating to process heat, are permitted, and PV and wind energy may be considered for either stand-alone or grid-connected use. The user inputs data such as application type, location, size of load, system cost, competing fuel cost, and discount rate. For the solar water heating options the software outputs the optimum collector area, the life cycle savings compared to a conventional system, and a cost effectiveness index to assess economic feasibility. For the PV and wind options the software outputs the life cycle savings, the average unit cost of delivered electricity, and a cost effectiveness index to assess economic feasibility. The software is designed as a screening tool, rather than a design tool, and is developed for use with a spreadsheet program.
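
    The economics behind such a screening calculation can be sketched as follows. The formulas and numbers are assumptions for illustration; the actual TREES definitions of life cycle savings and the cost effectiveness index may differ:

        def present_worth_factor(discount_rate, years):
            """Present worth of one currency unit received annually for `years` years."""
            return (1 - (1 + discount_rate) ** -years) / discount_rate

        annual_kwh = 4000.0   # energy delivered by the renewable system per year (assumed)
        fuel_cost = 0.12      # competing fuel cost, $/kWh (assumed)
        system_cost = 6000.0  # installed system cost, $ (assumed)

        pw_savings = annual_kwh * fuel_cost * present_worth_factor(0.06, 20)
        life_cycle_savings = pw_savings - system_cost
        cost_effectiveness = pw_savings / system_cost  # > 1 suggests feasibility (assumed definition)
        print(f"LCS = ${life_cycle_savings:,.0f}, index = {cost_effectiveness:.2f}")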

  19. Formal testing of object-oriented software: from the method to the tool

    OpenAIRE

    Péraire, Cécile

    1998-01-01

    This thesis presents a method and a tool for test set selection, dedicated to object-oriented applications and based on formal specifications. Testing is one method to increase the quality of today's extraordinarily complex software. The aim is to find program errors with respect to given criteria of correctness. In the case of formal testing, the criterion of correctness is the formal specification of the tested application: program behaviors are compared to those required by the specification...

  20. NgsRelate: a software tool for estimating pairwise relatedness from next-generation sequencing data

    OpenAIRE

    Korneliussen, Thorfinn Sand; Moltke, Ida

    2015-01-01

    MOTIVATION: Pairwise relatedness estimation is important in many contexts such as disease mapping and population genetics. However, all existing estimation methods are based on called genotypes, which is not ideal for next-generation sequencing (NGS) data of low depth from which genotypes cannot be called with high certainty. RESULTS: We present a software tool, NgsRelate, for estimating pairwise relatedness from NGS data. It provides maximum likelihood estimates that are based on genotype lik...

  1. PlanetPack software tool for exoplanets detection: coming new features

    OpenAIRE

    Baluev, Roman V.

    2014-01-01

    We briefly overview the new features of PlanetPack2, the forthcoming update of PlanetPack, which is a software tool for exoplanets detection and characterization from Doppler radial velocity data. Among other things, this major update brings parallelized computing, new advanced models of the Doppler noise, handling of the so-called Keplerian periodogram, and routines for transits fitting and transit timing variation analysis.

  2. PlanetPack software tool for exoplanets detection: coming new features

    Science.gov (United States)

    Baluev, Roman V.

    2014-07-01

    We briefly overview the new features of PlanetPack2, the forthcoming update of PlanetPack, which is a software tool for exoplanets detection and characterization from Doppler radial velocity data. Among other things, this major update brings parallelized computing, new advanced models of the Doppler noise, handling of the so-called Keplerian periodogram, and routines for transits fitting and transit timing variation analysis.

  3. GMFilter and SXTestPlate: software tools for improving the SNPlex™ genotyping system

    OpenAIRE

    Schreiber Stefan; Wenz Michael H; Teuber Markus; Franke Andre

    2009-01-01

    Abstract Background Genotyping of single-nucleotide polymorphisms (SNPs) is a fundamental technology in modern genetics. The SNPlex™ mid-throughput genotyping system (Applied Biosystems, Foster City, CA, USA) enables the multiplexed genotyping of up to 48 SNPs simultaneously in a single DNA sample. The high level of automation and the large amount of data produced in a high-throughput laboratory require advanced software tools for quality control and workflow management. Results We have devel...

  4. 10111 Abstracts Collection -- Practical Software Testing : Tool Automation and Human Factors

    OpenAIRE

    Harman, Mark; Muccini, Henry; Schulte, Wolfram; Xie, Tao

    2010-01-01

    From March 14, 2010 to March 19, 2010, the Dagstuhl Seminar 10111 ``Practical Software Testing: Tool Automation and Human Factors'' was held in Schloss Dagstuhl - Leibniz Center for Informatics. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The first section des...

  5. Formal testing of object-oriented software: from the method to the tool

    OpenAIRE

    Péraire, Cécile; Strohmeier, Alfred

    2005-01-01

    This thesis presents a method and a tool for test set selection, dedicated to object-oriented applications and based on formal specifications. Testing is one method to increase the quality of today’s extraordinarily complex software. The aim is to find program errors with respect to given criteria of correctness. In the case of formal testing, the criterion of correctness is the formal specification of the tested application: program behaviors are compared to those required by the specification...

  6. Open Source Platforms, Applications and Tools for Software-Defined Networking and 5G Research

    OpenAIRE

    Suomalainen, Lauri; Nikkhouy, Emad; Ding, Aaron Yi; Tarkoma, Sasu

    2014-01-01

    Software-Defined Networking (SDN) is a novel solution to network configuration and management. Its openness and programmability features have greatly motivated the open source communities, where numerous applications and tools are developed for various R&D purposes. Owing to the strengths of SDN, the upcoming 5th Generation mobile networks (5G) can also benefit from its modular and open design to innovate the network architecture and services. In this report, we present a survey of existing open ...

  7. A design software tool for conceptual design of wind turbine gearboxes

    OpenAIRE

    Firth, A.; Long, H.

    2010-01-01

    The paper reports the development of a design software tool for wind turbine gearboxes. It facilitates the conceptual design of wind turbine gearboxes, supporting designs with different combinations of epicyclic and parallel gear stages. Analyses of gear bending strength and pitting resistance are in accordance with the AGMA 2001 standard. The calculations of the AGMA geometry factors are verified in accordance with the AGMA 908 information sheets. A case study of a 2MW and three phase asyn...

  8. A quantitative analysis software tool for mass spectrometry–based proteomics

    OpenAIRE

    Park, Sung Kyu; Venable, John D.; Xu, Tao; Yates, John R.

    2008-01-01

    We describe Census, a quantitative software tool compatible with many labeling strategies as well as with label-free analyses, single-stage mass spectrometry (MS1) and tandem mass spectrometry (MS/MS) scans, and high- and low-resolution mass spectrometry data. Census uses robust algorithms to address poor-quality measurements and improve quantitative efficiency, and it can support several input file formats. We tested Census with stable-isotope labeling analyses as well as label-free analyses.

  9. A web-based software tool to estimate unregulated daily streamflow at ungauged rivers

    Directory of Open Access Journals (Sweden)

    S. A. Archfield

    2012-08-01

    Full Text Available Streamflow information is critical for solving any number of hydrologic problems. Oftentimes, streamflow information is needed at locations which are ungauged and, therefore, have no observations on which to base water management decisions. Furthermore, there has been increasing need for daily streamflow time series to manage rivers for both human and ecological functions. To facilitate negotiation between human and ecological demands for water, this paper presents the first publicly available, map-based, regional software tool to interactively estimate daily streamflow time series at any user-selected ungauged river location. The map interface allows users to locate and click on a river location, which then returns estimates of daily streamflow for the location selected. For the demonstration region in the northeast United States, daily streamflow was shown to be reliably estimated by the software tool, with efficiency values computed from observed and estimated streamflows ranging from 0.69 to 0.92. The software tool provides a general framework that can be applied to other regions for which daily streamflow estimates are needed.
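
    The efficiency values quoted can be reproduced with a short routine. The abstract does not name the metric, but the Nash-Sutcliffe efficiency is the usual choice for daily streamflow; the flow values below are invented for illustration:

        import numpy as np

        def nash_sutcliffe(observed, estimated):
            """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means no better
            than predicting the mean observed flow."""
            observed, estimated = np.asarray(observed), np.asarray(estimated)
            return 1 - np.sum((observed - estimated) ** 2) / np.sum((observed - observed.mean()) ** 2)

        obs = [12.1, 9.8, 15.3, 30.2, 22.7, 18.4]   # observed daily flows, m^3/s (invented)
        est = [11.5, 10.4, 14.0, 27.9, 24.1, 17.6]  # estimated daily flows (invented)
        print(f"NSE = {nash_sutcliffe(obs, est):.2f}")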

  10. User Driven Development of Software Tools for Open Data Discovery and Exploration

    Science.gov (United States)

    Schlobinski, Sascha; Keppel, Frank; Dihe, Pascal; Boot, Gerben; Falkenroth, Esa

    2016-04-01

    The use of open data in research faces challenges that are not restricted to inherent properties such as the quality and resolution of the open data sets. Often, open data is catalogued insufficiently or fragmented. Software tools that support effective discovery, including the assessment of the data's appropriateness for research, have shortcomings such as the lack of essential functionalities like support for data provenance. We believe that one of the reasons is the neglect of real end-user requirements in the development process of such software tools. In the context of the FP7 Switch-On project we have proactively engaged the relevant user community to collaboratively develop a means to publish, find and bind open data relevant for hydrologic research. Implementing key concepts of data discovery and exploration, we have used state-of-the-art web technologies to provide an interactive software tool that is easy to use yet powerful enough to satisfy the data discovery and access requirements of the hydrological research community.

  11. Ignominy:a Tool for Software Dependency and Metric Analysis with Examples from Large HEP Packages

    Institute of Scientific and Technical Information of China (English)

    Lassi A. Tuura; Lucas Taylor

    2001-01-01

    Ignominy is a tool developed in the CMS IGUANA project to analyse the structure of software systems. Its primary component is a dependency scanner that distills information into human-usable forms. It also includes several tools to visualise the collected data in the form of graphical views and numerical metrics. Ignominy was designed to adapt to almost any reasonable structure, and it has been used to analyse several large projects. The original purpose of Ignominy was to help us better ensure the quality of our own software, and in particular to warn us about possible structural problems early on. As a part of this activity it is now used as a standard part of our release procedure. We also use it to evaluate and study the quality of external packages we plan to make use of. We describe what Ignominy can find out, and how it can be used to visualise and assess a software structure. We also discuss the inherent problems of the analysis as well as the different approaches to modularity the tool makes quite evident. The focus is the illustration of these issues through the analysis results for several sizable HEP software projects.
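
    The core of such a dependency scanner can be sketched in a few lines of Python. Ignominy itself handles many languages and build setups, so the file pattern, the include regex and the "src" directory below are simplifying assumptions:

        import re
        from collections import defaultdict
        from pathlib import Path

        INCLUDE = re.compile(r'#include\s+"([^"]+)"')

        def scan_dependencies(root):
            """Map each C++ source file to the local headers it includes (fan-out)."""
            deps = defaultdict(set)
            for path in Path(root).rglob("*.cc"):
                for line in path.read_text(errors="ignore").splitlines():
                    match = INCLUDE.search(line)
                    if match:
                        deps[path.name].add(match.group(1))
            return deps

        for source, headers in sorted(scan_dependencies("src").items()):
            print(f"{source}: fan-out = {len(headers)}")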

  12. COMSY- A Software Tool For Aging And Plant Life Management With An Integrated Documentation Tool

    International Nuclear Information System (INIS)

    For aging and plant life management, the integrity of the mechanical components and structures is one of the key objectives. In order to ensure this integrity it is essential to implement a comprehensive aging management programme. This should be applied to all safety relevant mechanical systems or components, civil structures, electrical systems as well as instrumentation and control (I and C). The following aspects should be covered: - identification and assessment of relevant degradation mechanisms; - verification and evaluation of the quality status of all safety relevant systems, structures and components (SSCs); - verification and modernization of I and C and electrical systems; - reliable and up-to-date documentation. To support this work AREVA NP GmbH has developed the computer program COMSY, which draws on more than 30 years of experience resulting from research activities and operational experience. The program provides the option to perform a plant-wide screening for identifying system areas which are sensitive to specific degradation mechanisms. Another objective is the administration and evaluation of NDE measurements from different techniques. An integrated documentation tool makes document management and maintenance fast, reliable and independent of individual staff members. (authors)

  13. Methods and tools used at the IPSN for the safety assessment of critical software

    International Nuclear Information System (INIS)

    A significant feature of EDF's latest 1400 MWe 'N4' generation of pressurized water reactor (PWR) is the extensive use of computerized instrumentation and control, including a fully digital system for the reactor protection function. For the safety assessment of the software driving the operation of this digital reactor protection system, called SPIN, IPSN has developed and implemented a set of methods and tools. Using the lessons learned from this experience, IPSN has worked on improving those methods and tools, mainly trying to make them more automatic to use, and has participated in an international assessment exercise to test some other methods and tools, either new products on the market or self-developed products. As a result of this work, this paper presents an up-to-date overview of the IPSN methods and tools used for the assessment of safety critical software. This assessment, which consists of an analysis of all the documentation associated with the technical specifications and of a representative set of functions, is usually carried out in five steps: (1) critical examination of the documents, (2) evaluation of the quality of the code, (3) determination of the critical software components, (4) development of test cases and choice of testing strategy, (5) dynamic analysis (consistency and robustness). This paper also presents methods and tools developed or implemented by IPSN in order to: evaluate the completeness and consistency of specification and design documents written in natural language; build a model and simulate specification or design items; evaluate the quality of the source code; carry out FMEA analysis; run the binary code and perform tests (CLAIRE); and perform random or mutational tests. (author)

  14. New software tool for dynamic radiological characterisation and monitoring in nuclear sites

    International Nuclear Information System (INIS)

    The Halden Reactor Project (HRP) is a jointly sponsored international cooperation under the aegis of the Organisation for Economic Co-operation and Development - Nuclear Energy Agency. Extensive and valuable guidance and tools, connected to the safe and reliable operation of nuclear facilities, have been elaborated throughout the years within the frame of this programme. The HRP has achieved particularly strong results in virtual-reality based tools for real-time areal and personal monitoring. The techniques, developed earlier, are now being supplemented to enhance the planning and monitoring capabilities, and to support general radiological characterisation connected to nuclear sites and facilities. Due to the complexity and abundance of the input information required, software tools dedicated to the radiological characterization of contaminated materials, buildings, land and groundwater are applied to review, evaluate and visualize the data. Characterisation of the radiation situation in a realistic environment can be very complex, and efficient visualisation of the data to the user is not straightforward. The monitoring and planning tools elaborated in the frame of the HRP feature very sophisticated three-dimensional (3D) high definition visualisation and user interfaces to promote easy interpretation of the input data. The visualisation tools permit dynamic visualisation of radiation fields in virtual or augmented reality by various techniques and real-time personal monitoring of humanoid models. In addition, new techniques are being elaborated to visualise the 3D distribution of activities in structures and materials. The dosimetric algorithms, feeding information to the visualisation and user interface of these planning tools, include deterministic radiation transport techniques suitable for fast photon dose estimates, in case the physical, radiometric and spectrometric characteristics of the gamma sources are known. The basic deterministic model, implemented in earlier

  15. Development of a User Interface for a Regression Analysis Software Tool

    Science.gov (United States)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  16. BYMUR software: a free and open source tool for quantifying and visualizing multi-risk analyses

    Science.gov (United States)

    Tonini, Roberto; Selva, Jacopo

    2013-04-01

    The BYMUR software aims to provide an easy-to-use open source tool for both computing multi-risk and managing/visualizing/comparing all the inputs (e.g. hazard, fragilities and exposure) as well as the corresponding results (e.g. risk curves, risk indexes). For all inputs, a complete management of inter-model epistemic uncertainty is considered. The BYMUR software will be one of the final products provided by the homonymous ByMuR project (http://bymur.bo.ingv.it/) funded by the Italian Ministry of Education, Universities and Research (MIUR), focused on (i) providing a quantitative and objective general method for a comprehensive long-term multi-risk analysis in a given area, accounting for inter-model epistemic uncertainty through Bayesian methodologies, and (ii) applying the methodology to seismic, volcanic and tsunami risks in Naples (Italy). More specifically, the BYMUR software will be able to separately account for the probabilistic hazard assessment of different kinds of hazardous phenomena, the relative (time-dependent/independent) vulnerabilities and exposure data, and their possible (predefined) interactions: the software will analyze these inputs and will use them to estimate both the single- and multi-risk associated with a specific target area. In addition, it will be possible to connect the software to further tools (e.g., a full hazard analysis), allowing a dynamic I/O of results. The use of the Python programming language guarantees that the final software will be open source and platform independent. Moreover, thanks to the integration of some of the most popular and rich-featured Python scientific modules (Numpy, Matplotlib, Scipy) with the wxPython graphical user toolkit, the final tool will be equipped with a comprehensive Graphical User Interface (GUI) able to control and visualize (in the form of tables, maps and/or plots) any stage of the multi-risk analysis. The additional features of importing/exporting data in MySQL databases and/or standard XML formats (for
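
    In its simplest form, the single-risk aggregation at the heart of such a tool reduces to combining hazard, fragility and exposure. The sketch below is deliberately crude and all numbers are invented; BYMUR's Bayesian treatment of epistemic uncertainty and hazard interactions is far richer:

        import numpy as np

        # Invented inputs: per-year occurrence rates of three hazard intensity
        # levels, the damage probability at each level, and the exposed value.
        occurrence = np.array([1e-2, 1e-3, 1e-4])
        fragility = np.array([0.05, 0.35, 0.80])
        exposure_eur = 200e6

        expected_annual_loss = np.sum(occurrence * fragility) * exposure_eur
        print(f"expected annual loss: {expected_annual_loss:,.0f} EUR")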

  17. Robust optimal design of experiments for model discrimination using an interactive software tool.

    Science.gov (United States)

    Stegmaier, Johannes; Skanda, Dominik; Lebiedz, Dirk

    2013-01-01

    Mathematical modeling of biochemical processes significantly contributes to a better understanding of biological functionality and underlying dynamic mechanisms. To support time-consuming and costly lab experiments, kinetic reaction equations can be formulated as a set of ordinary differential equations, which in turn allows one to simulate and compare hypothetical models in silico. To identify new experimental designs that are able to discriminate between investigated models, the approach used in this work solves a semi-infinite constrained nonlinear optimization problem using derivative-based numerical algorithms. The method takes parameter variabilities into account such that new experimental designs are robust against parameter changes while maintaining the optimal potential to discriminate between hypothetical models. In this contribution we present a newly developed software tool that offers a convenient graphical user interface for model discrimination. We demonstrate the beneficial operation of the discrimination approach and the usefulness of the software tool by analyzing a realistic benchmark experiment from the literature. New robust optimal designs that allow discrimination between the investigated model hypotheses of the benchmark experiment are successfully calculated and yield promising results. The involved robustification approach provides maximally discriminating experiments for the worst parameter configurations, which can be used to estimate the meaningfulness of upcoming experiments. A major benefit of the graphical user interface is the ability to interactively investigate the model behavior and the clear arrangement of numerous variables. In addition to a brief theoretical overview of the discrimination method and the functionality of the software tool, the importance of robustness of experimental designs against parameter variability is demonstrated on a biochemical benchmark problem. The software is licensed under the GNU General Public License.

  18. Robust optimal design of experiments for model discrimination using an interactive software tool.

    Directory of Open Access Journals (Sweden)

    Johannes Stegmaier

    Full Text Available Mathematical modeling of biochemical processes significantly contributes to a better understanding of biological functionality and underlying dynamic mechanisms. To support time-consuming and costly lab experiments, kinetic reaction equations can be formulated as a set of ordinary differential equations, which in turn allows one to simulate and compare hypothetical models in silico. To identify new experimental designs that are able to discriminate between investigated models, the approach used in this work solves a semi-infinite constrained nonlinear optimization problem using derivative-based numerical algorithms. The method takes parameter variabilities into account such that new experimental designs are robust against parameter changes while maintaining the optimal potential to discriminate between hypothetical models. In this contribution we present a newly developed software tool that offers a convenient graphical user interface for model discrimination. We demonstrate the beneficial operation of the discrimination approach and the usefulness of the software tool by analyzing a realistic benchmark experiment from the literature. New robust optimal designs that allow discrimination between the investigated model hypotheses of the benchmark experiment are successfully calculated and yield promising results. The involved robustification approach provides maximally discriminating experiments for the worst parameter configurations, which can be used to estimate the meaningfulness of upcoming experiments. A major benefit of the graphical user interface is the ability to interactively investigate the model behavior and the clear arrangement of numerous variables. In addition to a brief theoretical overview of the discrimination method and the functionality of the software tool, the importance of robustness of experimental designs against parameter variability is demonstrated on a biochemical benchmark problem. The software is licensed under the GNU

  19. TESPI (Tool for Environmental Sound Product Innovation): a simplified software tool to support environmentally conscious design in SMEs

    Science.gov (United States)

    Misceo, Monica; Buonamici, Roberto; Buttol, Patrizia; Naldesi, Luciano; Grimaldi, Filomena; Rinaldi, Caterina

    2004-12-01

    TESPI (Tool for Environmental Sound Product Innovation) is the prototype of a software tool developed within the framework of the "eLCA" project. The project (www.elca.enea.it), financed by the European Commission, is realising "On line green tools and services for Small and Medium sized Enterprises (SMEs)". The implementation by SMEs of environmental product innovation (as fostered by the European Integrated Product Policy, IPP) needs specific adaptation to their economic model, their knowledge of production and management processes, and their relationships with innovation and the environment. In particular, quality and costs are the main driving forces of innovation in European SMEs, and well known barriers exist to the adoption of an environmental approach in product design. Starting from these considerations, the TESPI tool has been developed to support the first steps of product design taking into account both quality and the environment. Two main issues have been considered: (i) classic Quality Function Deployment (QFD) can hardly be proposed to SMEs; (ii) the environmental aspects of the product life cycle need to be integrated with the quality approach. TESPI is a user friendly web-based tool, has a training approach and applies to modular products. Users are guided through the investigation of the quality aspects of their product (customer's needs and requirements fulfilment) and the identification of the key environmental aspects in the product's life cycle. A simplified check list allows analyzing the environmental performance of the product. Help is available for a better understanding of the analysis criteria. As a result, the significant aspects for the redesign of the product are identified.

  20. Utilization of Software Tools for Uncertainty Calculation in Measurement Science Education

    International Nuclear Information System (INIS)

    Despite its importance, uncertainty is often neglected by practitioners in the design of systems, even in safety critical applications. Thus, problems arising from uncertainty may only be identified late in the design process and thus lead to additional costs. Although there exist numerous tools to support uncertainty calculation, reasons for their limited usage in early design phases may be low awareness of the existence of the tools and insufficient training in their practical application. We present a teaching philosophy that addresses uncertainty from the very beginning of teaching measurement science, in particular with respect to the utilization of software tools. The developed teaching material is based on the GUM method and makes use of uncertainty toolboxes in the simulation environment. Based on examples in measurement science education we discuss advantages and disadvantages of the proposed teaching philosophy and include feedback from students.
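
    A compact example of the kind of exercise such teaching material builds on is the GUM law of propagation of uncertainty for uncorrelated inputs, u_c^2 = sum_i (df/dx_i)^2 u_i^2. The measurement model and uncertainty values below are invented for illustration:

        import numpy as np

        def combined_uncertainty(f, x, u, h=1e-6):
            """GUM combined standard uncertainty for uncorrelated inputs,
            with sensitivity coefficients taken by central differences."""
            x = np.asarray(x, dtype=float)
            uc2 = 0.0
            for i, ui in enumerate(u):
                dx = np.zeros_like(x); dx[i] = h
                dfdx = (f(x + dx) - f(x - dx)) / (2 * h)
                uc2 += (dfdx * ui) ** 2
            return np.sqrt(uc2)

        # Example model: power P = V^2 / R, with u(V) = 0.1 V and u(R) = 0.5 ohm.
        P = lambda x: x[0] ** 2 / x[1]
        print(f"u_c(P) = {combined_uncertainty(P, [230.0, 100.0], [0.1, 0.5]):.3f} W")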

  1. Different methods and software tools for short-term prediction of wind energy production

    Energy Technology Data Exchange (ETDEWEB)

    Ibrahim, Hussein [Wind Energy TechnoCentre (Canada)

    2011-07-01

    This paper discusses the different methods and software tools used for short-term prediction of wind energy production. Forecasts of the production of wind farms are important for a variety of reasons, especially to guarantee security of supply of the power system. There are two types of short-term prediction, physical and statistical. Physical methods use physical considerations before using model output statistics to reduce error, while statistical methods, using recursive techniques, try to find relationships between measured results. The different types of physical and statistical models and their performance factors are discussed in detail. There are models that use both physical and statistical considerations, and these are called combined models. The different wind power forecasting tools available in the market are also mentioned. It can be concluded that there are several operational tools available to meet the diversity of end-user requirements. Further investments in wind installations should provide security of supply, lower financial risks and higher acceptability.
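
    The statistical family of methods can be illustrated with two toy one-step forecasters, a persistence model and a least-squares AR(1) fit. The power history is invented; operational tools combine numerical weather predictions with far more elaborate statistics:

        import numpy as np

        def persistence_forecast(power):
            """Simplest statistical reference model: P(t + 1) = P(t)."""
            return power[-1]

        def ar1_forecast(power):
            """One-step AR(1) forecast fitted by least squares on past output."""
            x, y = power[:-1], power[1:]
            a, b = np.polyfit(x, y, 1)
            return a * power[-1] + b

        history = np.array([12.0, 14.5, 13.8, 15.2, 16.0, 15.5])  # MW, hourly (invented)
        print(f"persistence: {persistence_forecast(history):.1f} MW, "
              f"AR(1): {ar1_forecast(history):.1f} MW")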

  2. Software tool for analysing the family shopping basket without candidate generation

    Directory of Open Access Journals (Sweden)

    Roberto Carlos Naranjo Cuervo

    2010-05-01

    Full Text Available Tools that help obtain useful knowledge for supporting marketing decisions are currently needed in the e-commerce environment. A process is needed for this which uses a series of techniques for data-processing; data-mining is one such technique enabling automatic information discovery. This work presents association rules as a suitable technique for discovering how customers buy from a company offering business to consumer (B2C) e-business, aimed at supporting decision-making in supplying its customers or capturing new ones. Many algorithms such as Apriori, DHP, Partition, FP-Growth and Eclat are available for implementing association rules; the following criteria were defined for selecting the appropriate algorithm: database insert, computational cost, performance and execution time. The development of a software tool is also presented which involved the CRISP-DM approach; this software tool was formed by the following four sub-modules: data pre-processing, data-mining, results analysis and results application. The application design used three-layer architecture: presentation logic, business logic and service logic. Data warehouse design and algorithm design were included in developing this data-mining software tool. It was tested by using a FoodMart company database; the tests included performance, functionality and results' validity, thereby allowing association rules to be found. The results led to concluding that using association rules as a data-mining technique facilitates analysing volumes of information for B2C e-business services, which represents a competitive advantage for those companies using the Internet as their sales medium.
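
    The support/confidence mechanics common to all the algorithms listed can be shown in miniature. This brute-force counting of single items and pairs (the first two Apriori passes) is for illustration only; a real implementation such as FP-Growth avoids it, and the transactions and thresholds are invented:

        from itertools import combinations
        from collections import Counter

        transactions = [
            {"bread", "milk"}, {"bread", "beer", "eggs"},
            {"milk", "beer", "cola"}, {"bread", "milk", "beer"},
            {"bread", "milk", "cola"},
        ]

        n = len(transactions)
        support = Counter()
        for t in transactions:
            for item in t:
                support[frozenset([item])] += 1
            for pair in combinations(sorted(t), 2):
                support[frozenset(pair)] += 1

        # Emit rules A -> B whose support and confidence clear minimum thresholds.
        for pair, count in support.items():
            if len(pair) == 2 and count / n >= 0.4:
                a, b = sorted(pair)
                confidence = count / support[frozenset([a])]
                if confidence >= 0.6:
                    print(f"{a} -> {b}: support={count / n:.2f}, confidence={confidence:.2f}")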

  3. A software tool for modification of human voxel models used for application in radiation protection

    International Nuclear Information System (INIS)

    This note describes a new software tool called 'VolumeChange' that was developed to modify the masses and location of organs of virtual human voxel models. A voxel model is a three-dimensional representation of the human body in the form of an array of identification numbers that are arranged in slices, rows and columns. Each entry in this array represents a voxel; organs are represented by those voxels having the same identification number. With this tool, two human voxel models were adjusted to fit the reference organ masses of a male and a female adult, as defined by the International Commission on Radiological Protection (ICRP). The alteration of an already existing voxel model is a complicated process, leading to many problems that have to be solved. To solve those intricacies in an easy way, a new software tool was developed and is presented here. If the organs are modified, no bit of tissue, i.e. voxel, may vanish nor should an extra one appear. That means that organs cannot be modified without considering the neighbouring tissue. Thus, the principle of organ modification is based on the reassignment of voxels from one organ/tissue to another; actually deleting and adding voxels is only possible at the external surface, i.e. skin. In the software tool described here, the modifications are done by semi-automatic routines but including human control. Because of the complexity of the matter, a skilled person has to validate that the applied changes to organs are anatomically reasonable. A graphical user interface was designed to fulfil the purpose of a comfortable working process, and an adequate graphical display of the modified voxel model was developed. Single organs, organ complexes and even whole limbs can be edited with respect to volume, shape and location. (note)
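
    The reassignment principle described above can be sketched with a toy phantom (Python with numpy and scipy). The real tool works on anatomical models under interactive expert control; the dilation-based growth rule here is an assumption made for illustration:

        import numpy as np
        from scipy import ndimage

        # Toy voxel phantom: 0 = background, 1 = organ A, 2 = neighbouring organ B.
        model = np.zeros((20, 20, 20), dtype=np.uint8)
        model[5:12, 5:15, 5:15] = 1
        model[12:16, 5:15, 5:15] = 2

        def grow_organ(model, organ_id, neighbour_id, n_voxels):
            """Reassign up to n_voxels boundary voxels from the neighbour to the
            organ, so total tissue volume is conserved (no voxel vanishes or appears)."""
            boundary = ndimage.binary_dilation(model == organ_id) & (model == neighbour_id)
            idx = np.argwhere(boundary)[:n_voxels]
            model[tuple(idx.T)] = organ_id
            return model

        before = (model == 1).sum()
        model = grow_organ(model, 1, 2, 50)
        print(f"organ 1 volume: {before} -> {(model == 1).sum()} voxels")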

  4. DVS-SOFTWARE: An Effective Tool for Applying Highly Parallelized Hardware To Computational Geophysics

    Science.gov (United States)

    Herrera, I.; Herrera, G. S.

    2015-12-01

    Most geophysical systems are macroscopic physical systems. The behavior prediction of such systems is carried out by means of computational models whose basic models are partial differential equations (PDEs) [1]. Due to the enormous size of the discretized version of such PDEs it is necessary to apply highly parallelized super-computers. For them, at present, the most efficient software is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of the present state-of-the-art techniques is due to the kind of discretizations used in them. Recently, I. Herrera and co-workers, using 'non-overlapping discretizations', have produced the DVS-Software which overcomes this limitation [2]. The DVS-Software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (90% or so [3]). It is therefore very suitable for effectively applying the most advanced parallel supercomputers available at present. In a parallel talk at this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MODFLOW. Key Words: Parallel Software for Geophysics, High Performance Computing, HPC, Parallel Computing, Domain Decomposition Methods (DDM). REFERENCES [1] Herrera, Ismael and George F. Pinder, "Mathematical Modelling in Science and Engineering: An axiomatic approach", John Wiley, 243p., 2012. [2] Herrera, I., de la Cruz L.M. and Rosas-Medina A., "Non-Overlapping Discretization Methods for Partial Differential Equations", NUMER METH PART D E, 30: 1427-1454, 2014, DOI 10.1002/num.21852. (Open source) [3] Herrera, I., & Contreras Iván, "An Innovative Tool for Effectively Applying Highly Parallelized Software To Problems of Elasticity", Geofísica Internacional, 2015 (In press)

  5. Emerging role of bioinformatics tools and software in evolution of clinical research

    Science.gov (United States)

    Gill, Supreet Kaur; Christopher, Ajay Francis; Gupta, Vikas; Bansal, Parveen

    2016-01-01

    Clinical research is making strenuous efforts to promote the wellbeing and health status of people. There is a rapid increase in the number and severity of diseases like cancer, hepatitis, HIV etc., resulting in high morbidity and mortality. Clinical research involves drug discovery and development, whereas clinical trials are performed to establish the safety and efficacy of drugs. Drug discovery is a long process starting with target identification, validation and lead optimization. This is followed by the preclinical trials, intensive clinical trials and eventually post-marketing vigilance for drug safety. Software and bioinformatics tools play a great role not only in drug discovery but also in drug development. This involves the use of informatics in the development of new knowledge pertaining to health and disease, data management during clinical trials, and the use of clinical data for secondary research. In addition, new technologies like molecular docking, molecular dynamics simulation, proteomics and quantitative structure-activity relationships result in a faster and easier drug discovery process. During the preclinical trials, software is used for randomization to remove bias and to plan the study design. In clinical trials, software like electronic data capture, remote data capture and electronic case report forms (eCRF) is used to store the data. eClinical and Oracle Clinical are software used for clinical data management and for statistical analysis of the data. After the drug is marketed, the safety of a drug can be monitored by drug safety software like Oracle Argus or ARISg. Therefore, software is used from the very early stages of drug designing, to drug development, clinical trials and during pharmacovigilance. This review describes different aspects related to the application of computers and bioinformatics in drug designing, discovery and development, formulation designing and clinical research.

  6. Emerging role of bioinformatics tools and software in evolution of clinical research

    Directory of Open Access Journals (Sweden)

    Supreet Kaur Gill

    2016-01-01

    Full Text Available Clinical research is making strenuous efforts to promote the wellbeing and health status of people. There is a rapid increase in the number and severity of diseases like cancer, hepatitis, HIV etc., resulting in high morbidity and mortality. Clinical research involves drug discovery and development, whereas clinical trials are performed to establish the safety and efficacy of drugs. Drug discovery is a long process starting with target identification, validation and lead optimization. This is followed by the preclinical trials, intensive clinical trials and eventually post-marketing vigilance for drug safety. Software and bioinformatics tools play a great role not only in drug discovery but also in drug development. This involves the use of informatics in the development of new knowledge pertaining to health and disease, data management during clinical trials, and the use of clinical data for secondary research. In addition, new technologies like molecular docking, molecular dynamics simulation, proteomics and quantitative structure-activity relationships result in a faster and easier drug discovery process. During the preclinical trials, software is used for randomization to remove bias and to plan the study design. In clinical trials, software like electronic data capture, remote data capture and electronic case report forms (eCRF) is used to store the data. eClinical and Oracle Clinical are software used for clinical data management and for statistical analysis of the data. After the drug is marketed, the safety of a drug can be monitored by drug safety software like Oracle Argus or ARISg. Therefore, software is used from the very early stages of drug designing, to drug development, clinical trials and during pharmacovigilance. This review describes different aspects related to the application of computers and bioinformatics in drug designing, discovery and development, formulation designing and clinical research.

  7. Emerging role of bioinformatics tools and software in evolution of clinical research.

    Science.gov (United States)

    Gill, Supreet Kaur; Christopher, Ajay Francis; Gupta, Vikas; Bansal, Parveen

    2016-01-01

    Clinical research is making strenuous efforts to promote the wellbeing and health status of people. There is a rapid increase in the number and severity of diseases like cancer, hepatitis, HIV etc., resulting in high morbidity and mortality. Clinical research involves drug discovery and development, whereas clinical trials are performed to establish the safety and efficacy of drugs. Drug discovery is a long process starting with target identification, validation and lead optimization. This is followed by the preclinical trials, intensive clinical trials and eventually post-marketing vigilance for drug safety. Software and bioinformatics tools play a great role not only in drug discovery but also in drug development. This involves the use of informatics in the development of new knowledge pertaining to health and disease, data management during clinical trials, and the use of clinical data for secondary research. In addition, new technologies like molecular docking, molecular dynamics simulation, proteomics and quantitative structure-activity relationships result in a faster and easier drug discovery process. During the preclinical trials, software is used for randomization to remove bias and to plan the study design. In clinical trials, software like electronic data capture, remote data capture and electronic case report forms (eCRF) is used to store the data. eClinical and Oracle Clinical are software used for clinical data management and for statistical analysis of the data. After the drug is marketed, the safety of a drug can be monitored by drug safety software like Oracle Argus or ARISg. Therefore, software is used from the very early stages of drug designing, to drug development, clinical trials and during pharmacovigilance. This review describes different aspects related to the application of computers and bioinformatics in drug designing, discovery and development, formulation designing and clinical research. PMID:27453827

  8. A software tool of digital tomosynthesis application for patient positioning in radiotherapy.

    Science.gov (United States)

    Yan, Hui; Dai, Jian-Rong

    2016-01-01

    Digital Tomosynthesis (DTS) is an imaging modality that reconstructs tomographic images from two-dimensional kV projections covering a narrow scan angle. Compared with conventional cone-beam CT (CBCT), it requires less time and radiation dose in data acquisition. It is feasible to apply this technique to patient positioning in radiotherapy. To facilitate its clinical application, a software tool was developed and the reconstruction processes were accelerated by a graphics processing unit (GPU). Two reconstruction and two registration processes are required for DTS application, which differs from conventional CBCT application, which requires one image reconstruction process and one image registration process. The reconstruction stage consists of the production of two types of DTS. One type of DTS is reconstructed from cone-beam (CB) projections covering a narrow scan angle and is named onboard DTS (ODTS), which represents the real patient position in the treatment room. The other type of DTS is reconstructed from digitally reconstructed radiographs (DRRs) and is named reference DTS (RDTS), which represents the ideal patient position in the treatment room. Prior to the reconstruction of RDTS, the DRRs are reconstructed from the planning CT using the same acquisition settings as the CB projections. The registration stage consists of two matching processes between ODTS and RDTS. The target shifts in the lateral and longitudinal axes are obtained from the matching between ODTS and RDTS in the coronal view, while the target shifts in the longitudinal and vertical axes are obtained from the matching between ODTS and RDTS in the sagittal view. In this software, both the DRR and DTS reconstruction algorithms were implemented in GPU environments for acceleration purposes. A comprehensive evaluation of this software tool was performed, including geometric accuracy, image quality, registration accuracy, and reconstruction efficiency. The average correlation coefficient between DRR/DTS generated by GPU-based algorithm
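
    The 2D matching step can be illustrated with FFT-based cross-correlation on a synthetic slice pair. This is an assumption made for illustration; the paper does not state which registration algorithm the tool actually uses:

        import numpy as np

        def register_shift(reference, onboard):
            """Estimate the integer (row, col) displacement of `onboard` relative
            to `reference`; the couch correction is the negative of this shift."""
            spectrum = np.fft.fft2(onboard) * np.conj(np.fft.fft2(reference))
            corr = np.fft.ifft2(spectrum).real
            peak = np.unravel_index(np.argmax(corr), corr.shape)
            return [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]

        ref = np.zeros((64, 64)); ref[20:30, 25:35] = 1.0     # synthetic RDTS slice
        onb = np.roll(np.roll(ref, 3, axis=0), -5, axis=1)    # "patient" shifted 3 down, 5 left
        print(register_shift(ref, onb))                       # -> [3, -5]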

  9. Practical requirements for software tools to assist in the validation and verification of hybrid expert systems

    International Nuclear Information System (INIS)

    Any practical software development effort must remain focused on verification and validation of user requirements. Knowledge-based system development is no different in this regard. In industry today, most expert systems being produced are, in reality, hybrid software systems which, in addition to those components that provide the knowledge base and expert reasoning over the problem domain using various rule-based and object-oriented paradigms, incorporate significant bodies of code based on more traditional software techniques such as database management, graphical user interfaces, hypermedia, and spreadsheets, as well as specially developed sequential code. Validation and verification of such hybrid systems must perforce integrate suitable methodologies from all such fields. This paper attempts to provide a broad overview of the practical requirements for methodologies and the concomitant groupware tools which would assist in such an enterprise. These methodologies and groupware tools would facilitate the teamwork efforts necessary to validate and verify all components of such hybrid systems by emphasizing cooperative recording of requirements and negotiated resolution of any conflicts, grounded in a solid understanding of the semantics of such a system.

  10. GMFilter and SXTestPlate: software tools for improving the SNPlex™ genotyping system

    Science.gov (United States)

    Teuber, Markus; Wenz, Michael H; Schreiber, Stefan; Franke, Andre

    2009-01-01

    Background Genotyping of single-nucleotide polymorphisms (SNPs) is a fundamental technology in modern genetics. The SNPlex™ mid-throughput genotyping system (Applied Biosystems, Foster City, CA, USA) enables the multiplexed genotyping of up to 48 SNPs simultaneously in a single DNA sample. The high level of automation and the large amount of data produced in a high-throughput laboratory require advanced software tools for quality control and workflow management. Results We have developed two programs, which address two main aspects of quality control in a SNPlex™ genotyping environment: GMFilter improves the analysis of SNPlex™ plates by removing wells with a low overall signal intensity. It enables scientists to automatically process the raw data in a standardized way before analyzing a plate with the proprietary GeneMapper software from Applied Biosystems. SXTestPlate examines the genotype concordance of a SNPlex™ test plate, which was typed with a control SNP set. This program allows for regular quality control checks of a SNPlex™ genotyping platform. It is compatible with other genotyping methods as well. Conclusion GMFilter and SXTestPlate provide a valuable tool set for laboratories engaged in genotyping based on the SNPlex™ system. The programs enhance the analysis of SNPlex™ plates with the GeneMapper software and enable scientists to evaluate the performance of their genotyping platform. PMID:19267942
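
    The well-filtering idea can be sketched as a simple intensity cut. The mean-minus-two-SD threshold and the simulated plate below are assumed stand-ins for GMFilter's actual criterion and data format:

        import numpy as np

        def filter_wells(intensities, n_sd=2.0):
            """Flag wells whose overall signal intensity falls below
            mean - n_sd * SD across the plate (assumed criterion)."""
            intensities = np.asarray(intensities, dtype=float)
            threshold = intensities.mean() - n_sd * intensities.std()
            return intensities >= threshold, threshold

        plate = np.random.lognormal(mean=7, sigma=0.4, size=96)  # simulated 96-well plate
        plate[[3, 47]] = 50.0                                    # two failed wells
        keep, thr = filter_wells(plate)
        print(f"threshold = {thr:.0f}; removed wells: {np.flatnonzero(~keep).tolist()}")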

  11. GMFilter and SXTestPlate: software tools for improving the SNPlex™ genotyping system

    Directory of Open Access Journals (Sweden)

    Schreiber Stefan

    2009-03-01

    Full Text Available Abstract Background Genotyping of single-nucleotide polymorphisms (SNPs) is a fundamental technology in modern genetics. The SNPlex™ mid-throughput genotyping system (Applied Biosystems, Foster City, CA, USA) enables the multiplexed genotyping of up to 48 SNPs simultaneously in a single DNA sample. The high level of automation and the large amount of data produced in a high-throughput laboratory require advanced software tools for quality control and workflow management. Results We have developed two programs, which address two main aspects of quality control in a SNPlex™ genotyping environment: GMFilter improves the analysis of SNPlex™ plates by removing wells with a low overall signal intensity. It enables scientists to automatically process the raw data in a standardized way before analyzing a plate with the proprietary GeneMapper software from Applied Biosystems. SXTestPlate examines the genotype concordance of a SNPlex™ test plate, which was typed with a control SNP set. This program allows for regular quality control checks of a SNPlex™ genotyping platform. It is compatible with other genotyping methods as well. Conclusion GMFilter and SXTestPlate provide a valuable tool set for laboratories engaged in genotyping based on the SNPlex™ system. The programs enhance the analysis of SNPlex™ plates with the GeneMapper software and enable scientists to evaluate the performance of their genotyping platform.

  12. Experimental Evaluation of Verification and Validation Tools on Martian Rover Software

    Science.gov (United States)

    Brat, Guillaume; Giannakopoulou, Dimitra; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareanu, Corina; Venet, Arnaud; Visser, Willem

    2003-01-01

    To achieve its science objectives in deep space exploration, NASA has a need for science platform vehicles to autonomously make control decisions in a time frame that excludes intervention from Earth-based controllers. Round-trip light-time is one significant factor motivating autonomy capability; another factor is the need to reduce ground support operations cost. An unsolved problem potentially impeding the adoption of autonomy capability is the verification and validation of such software systems, which exhibit far more behaviors (and hence distinct execution paths in the software) than is typical in current deep-space platforms. Hence the need for a study to benchmark advanced Verification and Validation (V&V) tools on representative autonomy software. The objective of the study was to assess the maturity of different technologies, to provide data indicative of potential synergies between them, and to identify gaps in the technologies with respect to the challenge of autonomy V&V. The study consisted of two parts: first, a set of relatively independent case studies of different tools on the same autonomy code; second, a carefully controlled experiment with human participants on a subset of these technologies. This paper describes the second part of the study. Overall, nearly four hundred hours of data on human use of three different advanced V&V tools were accumulated, with a control group that used conventional testing methods. The experiment simulated four independent V&V teams debugging three successive versions of an executive controller for a Martian Rover. Defects were carefully seeded into the three versions based on a profile of defects from CVS logs that occurred in the actual development of the executive controller. The rest of the document is structured as follows. In sections 2 and 3, we respectively describe the tools used in the study and the rover software that was analyzed. In section 4 the methodology for the experiment is described; this

  13. Conception and validation software tools for the level 0 muon trigger of LHCb

    International Nuclear Information System (INIS)

    The Level-0 muon trigger processor of the LHCb experiment looks for straight particles crossing the muon detector and measures their transverse momentum. It processes 40×10⁶ proton-proton collisions per second. The tracking uses a road algorithm relying on the projectivity of the muon detector (the logical layout in the 5 muon stations is projective in y to the interaction point, and it is also projective in x when the bending in the horizontal direction introduced by the magnetic field is ignored). The architecture of the Level-0 muon trigger is complex, with a dense network of data interconnections. The design and validation of such an intricate system has only been possible with intense use of software tools for the detector simulation, the modelling of the hardware components' behaviour and the validation. A database describing the data-flow is the cornerstone between the software and hardware components. (authors)

  14. Performance evaluation of spectral deconvolution analysis tool (SDAT) software used for nuclear explosion radionuclide measurements

    International Nuclear Information System (INIS)

    The Spectral Deconvolution Analysis Tool (SDAT) software was developed to improve counting statistics and detection limits for nuclear explosion radionuclide measurements. SDAT utilizes spectral deconvolution spectroscopy techniques and can analyze both β-γ coincidence spectra for radioxenon isotopes and high-resolution HPGe spectra from aerosol monitors. Spectral deconvolution spectroscopy is an analysis method that utilizes the entire signal deposited in a gamma-ray detector rather than the small portion of the signal that is present in one gamma-ray peak. This method shows promise to improve detection limits over classical gamma-ray spectroscopy analytical techniques; however, this hypothesis has not been tested. To address this issue, we performed three tests to compare the detection ability and variance of SDAT results to those of commercial off-the-shelf (COTS) software, which utilizes a standard peak search algorithm. (author)
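
    Whole-spectrum deconvolution of this kind can be illustrated as a non-negative least-squares fit of library templates to a measured spectrum (Python with scipy). The templates, the Compton-like continuum and the noise model are invented; SDAT's actual response functions and fitting procedure are not described at this level in the abstract:

        import numpy as np
        from scipy.optimize import nnls

        channels = np.arange(128)

        def template(centroid, continuum):
            """Full-spectrum response of one nuclide: a Gaussian peak plus an
            invented Compton-like tail, so every channel carries signal."""
            peak = np.exp(-0.5 * ((channels - centroid) / 3.0) ** 2)
            return peak + continuum * np.exp(-channels / 80.0)

        A = np.column_stack([template(40, 0.2), template(90, 0.1)])
        true_activity = np.array([120.0, 35.0])
        spectrum = np.random.poisson(A @ true_activity)

        # Deconvolve over the entire spectrum, keeping activities non-negative.
        activity, residual = nnls(A, spectrum.astype(float))
        print(f"estimated activities: {activity.round(1)}")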

  15. Techniques and tools for measuring energy efficiency of scientific software applications

    CERN Document Server

    Abdurachmanov, David; Eulisse, Giulio; Knight, Robert; Niemi, Tapio; Nurminen, Jukka K.; Nyback, Filip; Pestana, Goncalo; Ou, Zhonghong; Khan, Kashif

    2014-01-01

    The scale of scientific High Performance Computing (HPC) and High Throughput Computing (HTC) has increased significantly in recent years, and is becoming sensitive to total energy use and cost. Energy efficiency has thus become an important concern in scientific fields such as High Energy Physics (HEP). There has been a growing interest in utilizing alternative architectures, such as low-power ARM processors, to replace traditional Intel x86 architectures. Nevertheless, even though such solutions have been successfully used in mobile applications with low I/O and memory demands, it is unclear whether they are suitable and more energy-efficient in the scientific computing environment. Furthermore, there is a lack of tools and experience to derive and compare power consumption between the architectures for various workloads, and eventually to support software optimizations for energy efficiency. To that end, we have performed several physical and software-based measurements of workloads from HEP applications running o...

  16. Web-based software tool for constraint-based design specification of synthetic biological systems.

    Science.gov (United States)

    Oberortner, Ernst; Densmore, Douglas

    2015-06-19

    miniEugene provides computational support for solving combinatorial design problems, enabling users to specify and enumerate designs for novel biological systems based on sets of biological constraints. This technical note presents a brief tutorial for biologists and software engineers in the field of synthetic biology on how to use miniEugene. After reading this technical note, users should know which biological constraints are available in miniEugene, understand the syntax and semantics of these constraints, and be able to follow a step-by-step guide to specify the design of a classical synthetic biological system, the genetic toggle switch [1]. We also provide links and references to more information on the miniEugene web application and the integration of the miniEugene software library into sophisticated Computer-Aided Design (CAD) tools for synthetic biology (www.eugenecad.org). PMID:25426642
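
    The constraint-based enumeration at the heart of this workflow can be sketched in a few lines. The hypothetical Python fragment below mimics the flavour of such rules but is not miniEugene code or its syntax: it enumerates orderings of genetic parts and keeps only those satisfying user-declared constraints.

        # Hypothetical sketch of constraint-based design enumeration
        # (illustrative only; not the miniEugene engine or its syntax).
        from itertools import permutations

        parts = ["pTet", "RBS1", "lacI", "pLac", "RBS2", "tetR"]

        def starts_with(device, part):   # e.g. a promoter must come first
            return device[0] == part

        def before(device, a, b):        # part a must precede part b
            return device.index(a) < device.index(b)

        constraints = [
            lambda d: starts_with(d, "pTet"),
            lambda d: before(d, "RBS1", "lacI"),   # an RBS precedes its CDS
            lambda d: before(d, "pLac", "RBS2"),
            lambda d: before(d, "RBS2", "tetR"),
        ]

        designs = [d for d in permutations(parts)
                   if all(rule(d) for rule in constraints)]
        print(len(designs), "valid designs; first:", designs[0])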

  17. A software tool for teaching and training how to build and use a TOWS matrix

    Directory of Open Access Journals (Sweden)

    Amparo Mariño Ibáñez

    2010-05-01

    Full Text Available Strategic planning is currently used by most companies; it analyses current and expected future situations, determines company orientation and develops means or strategies for achieving stated missions. This article reviews general considerations in strategic planning and presents a computational tool designed for building a TOWS matrix, which matches a company's opportunities and threats with its weaknesses and, especially, its strengths. The software development life cycle (SDLC) involved analysis, design, implementation and use. The literature about strategic planning and SWOT analysis was reviewed for the analysis. The software only automates one aspect of the whole strategic planning process and can be used to improve student and staff training in SWOT analysis. This type of work seeks to motivate interdisciplinary research.

  18. Software tools for manipulating FE mesh, virtual surgery and post-processing

    Directory of Open Access Journals (Sweden)

    Milašinović Danko Z.

    2009-01-01

    Full Text Available This paper describes a set of software tools which we developed for the calculation of fluid flow through cardiovascular organs. Our tools work with medical data from a CT scanner, but could be used with any other 3D input data. For meshing we used the TetGen tetrahedral mesh generator, as well as a mesh re-generator that we developed for converting tetrahedral elements into bricks. After adequate meshing we used our PAKF solver for the calculation of fluid flow. For human-friendly presentation of results we developed a set of post-processing software tools. By modifying the 2D mesh (the boundary of the cardiovascular organ) it is possible to perform virtual surgery; in the case of an aorta with an aneurysm, received from the University Clinical Center in Heidelberg from a multi-slice 64-CT scanner, we removed the aneurysm and afterwards ran calculations on both geometrical models. The main idea of this methodology is to create a system that could be used in clinics.

  19. New tools for digital medical image processing implemented in DIP software

    International Nuclear Information System (INIS)

    The anthropomorphic models used in computational dosimetry, also called phantoms, are mostly built from stacks of CT (Computed Tomography) or MRI (Magnetic Resonance Imaging) images obtained from scans of patients or volunteers. The construction of voxel phantoms requires computational processing for transforming image formats, stacking two-dimensional (2D) images into three-dimensional (3D) arrays, quantization, resampling, enhancement, restoration and image segmentation, among other tasks. The computational dosimetry researcher rarely finds all these capabilities in a single software package, which often slows the research or forces inadequate use of alternative tools. The need to integrate the various tasks of digital image processing to obtain an image that can be used in a computational model of exposure led to the development of the DIP (Digital Image Processing) software. This software reads, writes and edits binary files containing the 3D matrix corresponding to a stack of cross-sectional images of a given geometry, which can be a human body or another volume of interest. It can also read any common computer image format and perform conversions. When a task involves only one output image, it is saved in the standard Windows JPEG format. When it involves a stack of images, the binary output file is called SGI (Interactive Graphic Simulations, a symbol already used in other publications of the Research Group in Numerical Dosimetry). This paper presents the third version of the DIP software and emphasizes its newly implemented tools. Currently it has the menus Basics, Views, Spatial Domain, Frequency Domain, Segmentations and Study. Each menu contains items and subitems with features that generally take an image as input and produce an image or an attribute as output. (author)

  20. New tools for digital medical image processing implemented in DIP software

    Energy Technology Data Exchange (ETDEWEB)

    Araujo, Erica A.C.; Santana, Ivan E. [Instituto Federal de Educacao, Ciencia e Tecnologia de Pernambuco, Recife, PE (Brazil); Lima, Fernando R.A., E-mail: falima@cnen.gov.b [Centro Regional de Ciencias Nucleares, (CRCN/NE-CNEN-PE), Recife, PE (Brazil); Viera, Jose W. [Escola Politecnica de Pernambuco, Recife, PE (Brazil)

    2011-07-01

    The anthropomorphic models used in computational dosimetry, also called phantoms, are mostly built from stacks of CT (Computed Tomography) or MRI (Magnetic Resonance Imaging) images obtained from scans of patients or volunteers. The construction of voxel phantoms requires computational processing for transforming image formats, stacking two-dimensional (2D) images into three-dimensional (3D) arrays, quantization, resampling, enhancement, restoration and image segmentation, among other tasks. The computational dosimetry researcher rarely finds all these capabilities in a single software package, which often slows the research or forces inadequate use of alternative tools. The need to integrate the various tasks of digital image processing to obtain an image that can be used in a computational model of exposure led to the development of the DIP (Digital Image Processing) software. This software reads, writes and edits binary files containing the 3D matrix corresponding to a stack of cross-sectional images of a given geometry, which can be a human body or another volume of interest. It can also read any common computer image format and perform conversions. When a task involves only one output image, it is saved in the standard Windows JPEG format. When it involves a stack of images, the binary output file is called SGI (Interactive Graphic Simulations, a symbol already used in other publications of the Research Group in Numerical Dosimetry). This paper presents the third version of the DIP software and emphasizes its newly implemented tools. Currently it has the menus Basics, Views, Spatial Domain, Frequency Domain, Segmentations and Study. Each menu contains items and subitems with features that generally take an image as input and produce an image or an attribute as output. (author)
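
    The central data structure, a 3D voxel matrix assembled from a stack of 2D slices, is easy to sketch. The fragment below is a minimal illustration under an assumed file layout (raw 8-bit voxels preceded by a small dimensions header); it is not DIP's actual SGI format, whose layout the abstract does not specify.

        # Minimal sketch: stack 2D slices into a 3D voxel array and write/read it
        # as a raw binary file with a tiny (nz, ny, nx) header. The layout is an
        # assumption for illustration; it is not the DIP/SGI file format.
        import numpy as np

        def write_stack(path, slices):
            volume = np.stack(slices, axis=0).astype(np.uint8)      # (nz, ny, nx)
            with open(path, "wb") as f:
                np.asarray(volume.shape, dtype=np.int32).tofile(f)  # header
                volume.tofile(f)                                    # voxel data

        def read_stack(path):
            with open(path, "rb") as f:
                shape = tuple(np.fromfile(f, dtype=np.int32, count=3))
                return np.fromfile(f, dtype=np.uint8).reshape(shape)

        slices = [np.full((4, 4), i, dtype=np.uint8) for i in range(3)]  # fake slices
        write_stack("phantom.bin", slices)
        volume = read_stack("phantom.bin")
        print(volume.shape)  # (3, 4, 4): z, y, x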

  1. DSC: software tool for simulation-based design of control strategies applied to wastewater treatment plants.

    Science.gov (United States)

    Ruano, M V; Ribes, J; Seco, A; Ferrer, J

    2011-01-01

    This paper presents a computer tool called DSC (Simulation based Controllers Design) that enables easy design of control systems and strategies applied to wastewater treatment plants. Although the control systems are developed and evaluated by simulation, this tool aims to facilitate direct implementation of the designed control system on the PC of the full-scale WWTP (wastewater treatment plant). The designed control system can be programmed in a dedicated control application and can be connected to either the simulation software or the SCADA of the plant. To this end, the developed DSC incorporates an OPC (OLE for Process Control) server, which provides an open-standard communication protocol for different industrial process applications. The potential capabilities of the DSC tool are illustrated through a full-scale application example. An aeration control system applied to a nutrient-removing WWTP was designed, tuned and evaluated with the DSC tool before its implementation in the full-scale plant. The control parameters obtained by simulation were suitable for the full-scale plant with only a few modifications to improve the control performance. With the DSC tool, control system performance can be easily evaluated by simulation. Once developed and tuned by simulation, the control systems can be directly applied to the full-scale WWTP. PMID:21330730

  2. Computation of Internal Fluid Flows in Channels Using the CFD Software Tool FlowVision

    CERN Document Server

    Kochevsky, A N

    2004-01-01

    The article describes the CFD software tool FlowVision (OOO "Tesis", Moscow). The model equations used for this research are the set of Reynolds and continuity equations and the equations of the standard k-ε turbulence model. The aim of the paper was to test FlowVision by comparing the computational results for a number of simple internal channel fluid flows with known experimental data. The test cases are non-swirling and swirling flows in pipes and diffusers, and flows in stationary and rotating bends. Satisfactory agreement was obtained both for flow patterns and for the respective quantitative values.

  3. Development of a software tool for the management of quality control in a helical tomotherapy unit

    International Nuclear Information System (INIS)

    The large amount of data and information managed in the quality control tests of an external radiotherapy unit makes it necessary to use tools that facilitate, on the one hand, the real-time management of measurements and results and, on the other, the tasks of managing, filing, querying and reporting stored data. This paper presents an in-house software application used for the integral management of the helical TomoTherapy unit in the aspects related to the roles and responsibilities of hospital radiophysics. (Author)

  4. Perspective Methods and Tools for the Design of Distributed Software Systems Based on Services

    Directory of Open Access Journals (Sweden)

    Marek Paralič

    2007-03-01

    Full Text Available In this paper the current research activities of the Distributed team at the Department of Computer and Informatics, FEI, TU in Košice are described. Our focus is to propose and verify new methods and tools, which could contribute to the design and implementation of integrated, pervasive and collaborative services. Pervasive computing is an important research area whose challenges require a thorough rethinking and revision of conventional software design ideas. In reality, there is little consensus and very little basic understanding of the underlying issues and their interactions to produce useful solutions. Our research activities aim to perform the research necessary to contribute to this understanding.

  5. Software Tool for Automated Failure Modes and Effects Analysis (FMEA) of Hydraulic Systems

    DEFF Research Database (Denmark)

    Stecki, J. S.; Conrad, Finn; Oh, B.

    2002-01-01

    Offshore, marine, aircraft and other complex engineering systems operate in harsh environmental and operational conditions and must meet stringent requirements of reliability, safety and maintainability. To reduce the high costs of developing new systems in these fields, improved design and management techniques and a vast array of computer-aided techniques are applied during the design and testing stages. The paper presents and discusses the research and development of a software tool for automated failure modes and effects analysis (FMEA) of hydraulic systems. The paper explains the underlying...
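
    A common building block of automated FMEA is the risk priority number (RPN), the product of severity, occurrence and detection ratings for each failure mode. The sketch below is a generic illustration of that ranking step; the paper's tool is not described at this level of detail, so the data model here is assumed.

        # Generic FMEA ranking sketch (assumed data model, not the paper's tool):
        # RPN = severity * occurrence * detection, each rated 1..10.
        from dataclasses import dataclass

        @dataclass
        class FailureMode:
            component: str
            mode: str
            severity: int     # 1 (negligible) .. 10 (catastrophic)
            occurrence: int   # 1 (rare) .. 10 (frequent)
            detection: int    # 1 (always detected) .. 10 (undetectable)

            @property
            def rpn(self) -> int:
                return self.severity * self.occurrence * self.detection

        modes = [
            FailureMode("pump", "cavitation", 7, 4, 5),
            FailureMode("relief valve", "stuck closed", 9, 2, 6),
            FailureMode("hose", "external leak", 4, 6, 2),
        ]

        for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
            print(f"{fm.component:12s} {fm.mode:14s} RPN={fm.rpn}")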

  6. NEuronMOrphological analysis tool: open-source software for quantitative morphometrics

    Directory of Open Access Journals (Sweden)

    Lucia eBilleci

    2013-02-01

    Full Text Available Morphometric analysis of neurons and brain tissue is relevant to the study of neuron circuitry development during the first phases of brain growth or for probing the link between microstructural morphology and degenerative diseases. As neural imaging techniques become ever more sophisticated, so do the amount and complexity of the data generated. The NEuronMOrphological analysis tool NEMO was purposely developed to handle and process large numbers of optical microscopy image files of neurons in culture or slices, in order to automatically run batch routines, store data and apply multivariate classification and feature extraction using 3-way principal component analysis. Here we describe the software's main features, underlining the differences between NEMO and other commercial and non-commercial image processing tools, and show an example of how NEMO can be used to classify neurons from wild-type mice and from animal models of autism.

  7. Virtual Power Electronics: Novel Software Tools for Design, Modeling and Education

    Science.gov (United States)

    Hamar, Janos; Nagy, István; Funato, Hirohito; Ogasawara, Satoshi; Dranga, Octavian; Nishida, Yasuyuki

    The current paper presents browser-based, multimedia-rich software tools and e-learning curricula to support the design and modeling of power electronics circuits and to explain sometimes rather sophisticated phenomena. Two projects are discussed. The so-called Inetele project is financed by the Leonardo da Vinci program of the European Union (EU). It is a collaborative project between numerous EU universities and institutes to develop a state-of-the-art curriculum in Electrical Engineering. Another cooperative project, with the participation of Japanese, European and Australian institutes, focuses especially on developing e-learning curricula and interactive design and modeling tools, and furthermore on the development of a virtual laboratory. Snapshots from these two projects are presented.

  8. The impact of software and CAE tools on SEU in field programmable gate arrays

    International Nuclear Information System (INIS)

    Field programmable gate array (FPGA) devices, heavily used in spacecraft electronics, have grown substantially in size over the past few years, causing designers to work at a higher conceptual level, with computer-aided engineering (CAE) tools synthesizing and optimizing the logic from a description. It is shown that the use of commercial off-the-shelf (COTS) CAE tools can produce unreliable circuit designs when the device is used in a radiation environment and a flip-flop is upset. At a lower level, software can be used to improve the single-event upset (SEU) performance of a flip-flop, exploiting the configurable nature of FPGA technology and on-chip delay, parasitic resistive, and capacitive circuit elements

  9. Software Tool Support to Specify and Verify Scientific Sensor Data Properties to Improve Anomaly Detection

    Science.gov (United States)

    Gallegos, I.; Gates, A. Q.; Tweedie, C.; Cybershare

    2010-12-01

    Advancements in scientific sensor data acquisition technologies, such as wireless sensor networks and robotic trams equipped with sensors, are increasing the amount of data being collected at field sites. This elevates the challenges of verifying the quality of streamed data and monitoring the correct operation of the instrumentation. Without the ability to evaluate the data collection process in near real time, scientists can lose valuable time and data. In addition, scientists have to rely on their knowledge and experience in the field to evaluate data quality. Such knowledge is rarely shared or reused by other scientists, mostly because of the lack of a well-defined methodology and tool support. Numerous scientific projects address anomaly detection, mostly as part of the verification system's source code; however, anomaly-detection properties, which are often embedded or hard-coded in the source code, are difficult to refine. In addition, a software developer is required to modify the source code every time a new anomaly-detection property or a modification to an existing property is needed. This poster describes the tool support that has been developed, based on software engineering techniques, to address these challenges. The overall tool support allows scientists to specify and reuse anomaly-detection properties generated using the specification tool and to use the specified properties to conduct automated anomaly detection in near real time. The anomaly-detection mechanism is independent of the system used to collect the sensor data. With guidance provided by a classification and categorization of anomaly-detection properties, the user specifies properties on scientific sensor data. The properties, which can be associated with particular field sites or instrumentation, document knowledge about data anomalies that otherwise would have limited availability to the scientific community.
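
    The specify-then-check pattern described here can be illustrated compactly. In the hypothetical sketch below, a property is a declarative record (sensor name, valid range, maximum jump between readings) and a checker applies it to a stream of readings; the property vocabulary is invented for illustration and is not taken from the authors' tool.

        # Hypothetical sketch of declarative anomaly-detection properties applied
        # to a sensor stream (invented vocabulary; not the authors' tool).
        from dataclasses import dataclass

        @dataclass
        class RangeProperty:
            sensor: str
            low: float
            high: float
            max_jump: float  # largest plausible change between consecutive readings

            def check(self, readings):
                previous = None
                for i, value in enumerate(readings):
                    if not (self.low <= value <= self.high):
                        yield (self.sensor, i, f"value {value} outside [{self.low}, {self.high}]")
                    if previous is not None and abs(value - previous) > self.max_jump:
                        yield (self.sensor, i, f"jump {abs(value - previous):.1f} exceeds {self.max_jump}")
                    previous = value

        prop = RangeProperty(sensor="air_temp_C", low=-40.0, high=55.0, max_jump=5.0)
        stream = [21.3, 21.8, 22.0, 61.0, 22.4]  # one spurious spike
        for anomaly in prop.check(stream):
            print(anomaly)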

  10. A software tool for creating simulated outbreaks to benchmark surveillance systems

    Directory of Open Access Journals (Sweden)

    Olson Karen L

    2005-07-01

    Full Text Available Abstract Background Evaluating surveillance systems for the early detection of bioterrorism is particularly challenging when systems are designed to detect events for which there are few or no historical examples. One approach to benchmarking outbreak detection performance is to create semi-synthetic datasets containing authentic baseline patient data (noise) and injected artificial patient clusters (signal). Methods We describe a software tool, the AEGIS Cluster Creation Tool (AEGIS-CCT), that enables users to create simulated clusters with controlled feature sets, varying the desired cluster radius, density, distance, relative location from a reference point, and temporal epidemiological growth pattern. AEGIS-CCT does not require an external geographical information system program for cluster creation. The cluster creation tool is an open-source program, implemented in Java, and is freely available under the GNU Lesser General Public License at its SourceForge website. Cluster data are written to files or can be appended to existing files so that the resulting file includes both existing baseline and artificially added cases. Multiple cluster file creation is an automated process in which multiple cluster files are created by varying a single parameter within a user-specified range. To evaluate the output of this software tool, sets of test clusters were created and graphically rendered. Results Based on user-specified parameters describing the location, properties, and temporal pattern of simulated clusters, AEGIS-CCT created clusters accurately and uniformly. Conclusion AEGIS-CCT enables the ready creation of datasets for benchmarking outbreak detection systems. It may be useful for automating the testing and validation of spatial and temporal cluster detection algorithms.
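
    The kind of controlled cluster injection described here is easy to picture with a small sketch. The fragment below, an illustration with assumed parameters rather than AEGIS-CCT code, scatters case points uniformly inside a disc of chosen radius around a centre and spreads them over days following a linear growth pattern.

        # Illustrative synthetic-cluster generator (assumed parameters; not AEGIS-CCT).
        # Cases are placed uniformly in a disc and assigned onset days with linearly
        # increasing daily counts to mimic an epidemiological growth pattern.
        import math
        import random

        def make_cluster(center_x, center_y, radius, n_days, cases_per_day_growth):
            random.seed(42)
            cases = []
            for day in range(n_days):
                n_cases = (day + 1) * cases_per_day_growth   # linear temporal growth
                for _ in range(n_cases):
                    r = radius * math.sqrt(random.random())  # sqrt => uniform in disc
                    theta = random.uniform(0.0, 2.0 * math.pi)
                    cases.append((day, center_x + r * math.cos(theta),
                                       center_y + r * math.sin(theta)))
            return cases

        cluster = make_cluster(center_x=10.0, center_y=20.0, radius=1.5,
                               n_days=4, cases_per_day_growth=2)
        print(len(cluster), "cases; first:", cluster[0])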

  11. Easyverifier 1.0: a software tool for revising scientific articles’ bibliographical citations

    Directory of Open Access Journals (Sweden)

    Freddy Alberto Correa Riveros

    2010-05-01

    Full Text Available The first academic revolution, which occurred in developed countries during the late 19th century, made research a university function in addition to the traditional task of teaching. A second academic revolution has tried to transform the university into a teaching, research and socio-economic development enterprise. The scientific article has become an excellent practical means for moving new knowledge between the university and the socio-economic environment. This work had two purposes. One was to present some general considerations regarding research and the scientific article. The second was to describe a computational tool which supports the revision of scientific articles' citations, a step that is usually done manually and requires some experience. The software reads two text files, one containing the scientific article's content and the other the bibliography. A report is then generated that identifies the authors mentioned in the text but not indexed in the bibliography, and the authors listed in the bibliography but not mentioned in the text of the article. The software allows researchers and journal coordinators to detect reference errors between the citations in the text and the bibliographical references. The steps to develop the software were: analysis, design, implementation and use. The analysis drew on a review of the literature about the preparation of citations in scientific documents.
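
    The cross-checking step itself reduces to two set comparisons, as the toy sketch below shows; it assumes simple author-year citations like "(Pérez, 2008)", and real articles need far more robust parsing.

        # Toy citation cross-check (assumes simple "(Author, year)" citations; real
        # articles need more robust parsing than this illustration provides).
        import re

        article = "As shown earlier (Pérez, 2008), results vary (Smith, 2010)."
        bibliography = "Pérez, J. (2008). ...\nGarcía, M. (2005). ..."

        cited = set(re.findall(r"\(([A-ZÁ-Ú]\w+), (\d{4})\)", article))
        listed = set(re.findall(r"^([A-ZÁ-Ú]\w+), \w+\. \((\d{4})\)", bibliography,
                                flags=re.MULTILINE))

        print("cited but missing from bibliography:", cited - listed)
        print("listed but never cited in the text:", listed - cited)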

  12. The Comprehensive Evaluation of Electronic Learning Tools and Educational Software (CEELTES)

    Directory of Open Access Journals (Sweden)

    Štefan KAROLČÍK

    2015-10-01

    Full Text Available Despite the fact that digital technologies are used more and more in the learning and education process, there is still a lack of professional evaluation tools capable of assessing the quality of digital teaching aids in a comprehensive and objective manner. Construction of the Comprehensive Evaluation of Electronic Learning Tools and Educational Software (CEELTES) tool was preceded by several surveys and by knowledge obtained in the course of creating digital learning and teaching aids and implementing them in the teaching process. The evaluation tool as such consists of sets (catalogues) of criteria divided into four separately assessed areas: the area of technical, technological and user attributes; the area of criteria evaluating content, operation, information structuring and processing; the area of criteria evaluating information processing in terms of learning, recognition and education needs; and, finally, the area of criteria evaluating the psychological and pedagogical aspects of a digital product. The specified areas are assessed independently and separately by a specialist in the given science discipline. The final evaluation objectifies (quantifies) the overall appropriateness of including a particular digital teaching aid in the teaching process.

  13. Acts -- A collection of high performing software tools for scientific computing

    Energy Technology Data Exchange (ETDEWEB)

    Drummond, L.A.; Marques, O.A.

    2002-11-01

    During the past decades there has been continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Further, many new discoveries depend on high-performance computer simulations to satisfy their demands for large computational resources and short response time. The Advanced CompuTational Software (ACTS) Collection brings together a number of general-purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools make it easier for scientific code developers to write high-performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Collection promotes code portability, reusability, reduction of duplicate efforts, and tool maturity. This paper presents a brief introduction to the functionality available in ACTS. It also highlights the tools that are in demand by climate and weather modelers.

  14. FIND: A new software tool and development platform for enhanced multicolor flow analysis

    Science.gov (United States)

    2011-01-01

    Background Flow cytometry is a process by which cells, and other microscopic particles, can be identified, counted, and sorted mechanically through the use of hydrodynamic pressure and laser-activated fluorescence labeling. As immunostained cells pass individually through the flow chamber of the instrument, laser pulses cause fluorescence emissions that are recorded digitally for later analysis as multidimensional vectors. Current, widely adopted analysis software limits users to manual separation of events based on viewing two or three simultaneous dimensions. While this may be adequate for experiments using four or fewer colors, advances have led to laser flow cytometers capable of recording 20 different colors simultaneously. In addition, mass-spectrometry-based machines capable of recording at least 100 separate channels are being developed. Analysis of such high-dimensional data by visual exploration alone can be error-prone and susceptible to unnecessary bias. Fortunately, the field of data mining provides many tools for automated group classification of multi-dimensional data, and many algorithms have been adapted or created for flow cytometry. However, the majority of this research has not been made available to users through analysis software packages and, as such, is not in wide use. Results We have developed a new software application for analysis of multi-color flow cytometry data. The main goals of this effort were to provide a user-friendly tool for automated gating (classification) of multi-color data as well as a platform for development and dissemination of new analysis tools. With this software, users can easily load single or multiple data sets, perform automated event classification, and graphically compare results within and between experiments. We also make available a simple plugin system that enables researchers to implement and share their data analysis and classification/population discovery algorithms. Conclusions The FIND (Flow...
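
    Automated gating of the kind FIND performs is essentially unsupervised classification of event vectors. The sketch below is a generic stand-in (k-means via scikit-learn on two synthetic fluorescence channels), not the population-discovery algorithms FIND actually ships with.

        # Generic automated-gating stand-in (k-means on synthetic 2-channel data);
        # FIND's own population-discovery algorithms are not reproduced here.
        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(1)
        # Two synthetic cell populations in (FL1, FL2) fluorescence space.
        pop_a = rng.normal(loc=[200.0, 50.0], scale=15.0, size=(500, 2))
        pop_b = rng.normal(loc=[60.0, 180.0], scale=20.0, size=(300, 2))
        events = np.vstack([pop_a, pop_b])

        gates = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(events)
        for label in (0, 1):
            members = events[gates == label]
            print(f"gate {label}: {len(members)} events, "
                  f"centroid {members.mean(axis=0).round(1)}")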

  15. Data Analysis Software Tools for Enhanced Collaboration at the DIII-D National Fusion Facility

    International Nuclear Information System (INIS)

    Data analysis at the DIII-D National Fusion Facility is simplified by the use of two software packages in analysis codes. The first is GAPlotObj, an IDL-based object-oriented library used in visualization tools for dynamic plotting. GAPlotObj gives users the ability to manipulate graphs directly through mouse- and keyboard-driven commands. The second software package is MDSplus, which is used at DIII-D as a central repository for analyzed data. GAPlotObj and MDSplus reduce the effort required for a collaborator to become familiar with the DIII-D analysis environment by providing uniform interfaces for data display and retrieval. Two visualization tools at DIII-D that benefit from them are ReviewPlus and EFITviewer. ReviewPlus is capable of displaying interactive 2D and 3D graphs of raw, analyzed, and simulation code data. EFITviewer is used to display results from the EFIT analysis code together with kinetic profiles and machine geometry. Both bring new possibilities for data exploration to the user, and both are able to plot data from any fusion research site with an MDSplus data server

  16. Practical experience with software tools to assess and improve the quality of existing nuclear analysis and safety codes

    International Nuclear Information System (INIS)

    Within the constraints of schedule and budget, software tools and techniques were applied to existing FORTRAN codes, determining software quality metrics and improving the code quality. Specifically discussed are INEL experiences in applying pretty-printers, cross-reference analyzers, and computer-aided software engineering (CASE) tools and techniques. These have provided management with measures of the risk potential of individual program modules so that rational decisions can be made on resource allocation. Selected program modules have been modified to reduce complexity, achieve higher functional independence, and improve code vectorization. (orig.)

  17. Usefulness of the automatic quantitative estimation tool for cerebral blood flow. Clinical assessment of the application software tool AQCEL

    International Nuclear Information System (INIS)

    The correlation coefficients between AQCEL and conventional methods were 0.973 and 0.986 for the normal and affected sides at rest, respectively, and 0.977 and 0.984 for the normal and affected sides after acetazolamide (ACZ) loading, respectively. The quality of images reconstructed using the application software AQCEL was superior to that obtained using the conventional method after ACZ loading, and quantitative values correlated highly at rest and after ACZ loading. This software can be applied in clinical practice and is a useful tool for improving reproducibility and throughput. (author)

  18. Proofreading Using an Assistive Software Homophone Tool: Compensatory and Remedial Effects on the Literacy Skills of Students with Reading Difficulties

    Science.gov (United States)

    Lange, Alissa A.; Mulhern, Gerry; Wylie, Judith

    2009-01-01

    The present study investigated the effects of using an assistive software homophone tool on the assisted proofreading performance and unassisted basic skills of secondary-level students with reading difficulties. Students aged 13 to 15 years proofread passages for homophonic errors under three conditions: with the homophone tool, with homophones…

  19. Establishing a Web-based DICOM teaching file authoring tool using open-source public software.

    Science.gov (United States)

    Lee, Wen-Jeng; Yang, Chung-Yi; Liu, Kao-Lang; Liu, Hon-Man; Ching, Yu-Tai; Chen, Shyh-Jye

    2005-09-01

    Online teaching files are an important source of educational and reference materials in the radiology community. The commonly used Digital Imaging and Communications in Medicine (DICOM) file format of the radiology community is not natively supported by common Web browsers. The ability of the Web server to convert and parse DICOM is important when DICOM-conversion tools are not available. In this paper, we describe our approach to developing a Web-based teaching-file authoring tool. Our server is built using the Apache Web server running on the FreeBSD operating system. The dynamic page content is produced by Hypertext Preprocessor (PHP). DICOM images are converted by ImageMagick into Joint Photographic Experts Group (JPEG) format. DICOM attributes are parsed by dicom3tools and stored in a PostgreSQL database. Using free software available from the Internet, we built a Web service that allows radiologists to create their own online teaching-file cases with a common Web browser. PMID:15924271
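
    The paper's pipeline does this conversion server-side with ImageMagick and dicom3tools. As a compact illustration of the same step in an alternative stack (assuming the pydicom and Pillow packages are available), one can read a DICOM file, rescale the pixel data and save a browser-friendly JPEG:

        # Illustrative DICOM-to-JPEG conversion using pydicom + Pillow; the paper's
        # own pipeline uses ImageMagick and dicom3tools on the server instead.
        import numpy as np
        import pydicom
        from PIL import Image

        def dicom_to_jpeg(dicom_path, jpeg_path):
            ds = pydicom.dcmread(dicom_path)
            pixels = ds.pixel_array.astype(np.float32)
            # Linearly rescale to 0..255 for display (a simplistic window).
            lo, hi = pixels.min(), pixels.max()
            scaled = ((pixels - lo) / max(hi - lo, 1.0) * 255.0).astype(np.uint8)
            Image.fromarray(scaled).save(jpeg_path, format="JPEG")
            # A few attributes a teaching-file database might index:
            return {tag: ds.get(tag, "") for tag in ("Modality", "StudyDate")}

        meta = dicom_to_jpeg("case001.dcm", "case001.jpg")
        print(meta)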

  20. Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model.

    Directory of Open Access Journals (Sweden)

    Nadja Damij

    Full Text Available The omnipresent need for optimisation requires constant improvements of companies' business processes (BPs). Minimising the risk of an inappropriate BP being implemented is usually done by simulating the newly developed BP under various initial conditions and "what-if" scenarios. An effective business process simulation software (BPSS) tool is a prerequisite for accurate analysis of a BP. Characterisation of a BPSS tool is a challenging task due to the complex selection criteria, which include quality of visual aspects, simulation capabilities, statistical facilities, quality reporting, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for ranking BPSS tools based on their technical characteristics, employing DEX and qualitative-to-quantitative (QQ) methodology. Consequently, the decision expert feeds in the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible for adding new criteria to the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with currently available results.

  1. Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model.

    Science.gov (United States)

    Damij, Nadja; Boškoski, Pavle; Bohanec, Marko; Mileva Boshkoska, Biljana

    2016-01-01

    The omnipresent need for optimisation requires constant improvements of companies' business processes (BPs). Minimising the risk of an inappropriate BP being implemented is usually done by simulating the newly developed BP under various initial conditions and "what-if" scenarios. An effective business process simulation software (BPSS) tool is a prerequisite for accurate analysis of a BP. Characterisation of a BPSS tool is a challenging task due to the complex selection criteria, which include quality of visual aspects, simulation capabilities, statistical facilities, quality reporting, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for ranking BPSS tools based on their technical characteristics, employing DEX and qualitative-to-quantitative (QQ) methodology. Consequently, the decision expert feeds in the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible for adding new criteria to the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with currently available results. PMID:26871694
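
    The DEX idea, aggregating qualitative grades bottom-up through rule tables, can be shown in miniature. The sketch below uses invented attributes and rules, not the authors' model: two leaf attributes combine into one parent grade via a lookup table.

        # Miniature DEX-style qualitative aggregation (invented attributes and
        # rules; not the model from the paper). Leaf grades combine via a table.
        GRADES = ("poor", "acceptable", "good")

        # Rule table: (visual_quality, statistics) -> overall suitability.
        RULES = {
            ("poor", "poor"): "poor",
            ("poor", "acceptable"): "poor",
            ("poor", "good"): "acceptable",
            ("acceptable", "poor"): "poor",
            ("acceptable", "acceptable"): "acceptable",
            ("acceptable", "good"): "good",
            ("good", "poor"): "acceptable",
            ("good", "acceptable"): "good",
            ("good", "good"): "good",
        }

        def evaluate(tool):
            return RULES[(tool["visual_quality"], tool["statistics"])]

        tools = [
            {"name": "BPSS-A", "visual_quality": "good", "statistics": "acceptable"},
            {"name": "BPSS-B", "visual_quality": "acceptable", "statistics": "poor"},
        ]
        for t in sorted(tools, key=lambda t: GRADES.index(evaluate(t)), reverse=True):
            print(t["name"], "->", evaluate(t))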

  2. Integration of life cycle assessment software with tools for economic and sustainability analyses and process simulation for sustainable process design

    DEFF Research Database (Denmark)

    Kalakul, Sawitree; Malakul, Pomthong; Siemanond, Kitipat;

    2014-01-01

    The sustainable future of the world challenges engineers to develop chemical process designs that are not only technically and economically feasible but also environmentally friendly. Life cycle assessment (LCA) is a tool for identifying and quantifying the environmental impacts of a chemical product. Although there are several commercial LCA tools, there is still a need for a simple LCA software that can be integrated with process design tools. In this paper, a new LCA software, LCSoft, is developed for the evaluation of chemical, petrochemical, and biochemical processes, with options for integration with other tools. To test the software, a bioethanol production process using cassava rhizome is employed as a case study. Results from LCSoft highlight the estimated environmental performance in terms of various aspects such as carbon footprint, resource and energy consumption, and various...

  3. Quality-driven multi-objective optimization of software architecture design : method, tool, and application

    NARCIS (Netherlands)

    Etemadi Idgahi (Etemaadi), Ramin

    2014-01-01

    Software architecting is a non-trivial and demanding task for software engineers to perform. The architecture is a key enabler for software systems. Besides being crucial for user functionality, the software architecture has a deep impact on software qualities such as performance, safety, and cost.

  4. Software tools for automatic generation of finite element mesh and application of biomechanical calculation in medicine

    Directory of Open Access Journals (Sweden)

    Milašinović Danko Z.

    2008-01-01

    Full Text Available Cardiovascular diseases are common, and a special difficulty in treating them is diagnostics. Modern medical instruments can provide data that are much more adequate for computer modeling. The motivation for this work is raw data that our Center recently received from the University Clinical Center in Heidelberg from a multislice CT scanner. In this work the raw CT data were used to create a 3D model of the aorta. In this process we used Gmsh and TetGen (Hang Si), as well as our own software tools; the result was the 8-node (brick) mesh on which the calculation was run. The results obtained were very satisfactory so...

  5. CALDoseX: a software tool for absorbed dose calculations in diagnostic radiology

    International Nuclear Information System (INIS)

    Conversion coefficients (CCs) between absorbed dose to organs and tissues at risk and measurable quantities commonly used in X-ray diagnosis have been calculated for the last 30 years, mostly with mathematical MIRD5-type phantoms, in which organs are represented by simple geometrical bodies such as ellipsoids, tori and truncated cylinders. In contrast, voxel-based phantoms are true-to-nature representations of human bodies. The purpose of this study is therefore to calculate CCs for common X-ray diagnostic examinations with the recently developed MAX06 (Male Adult voXel) and FAX06 (Female Adult voXel) phantoms, for various projections and different X-ray spectra, and to make these CCs available to the public through a software tool called CALDoseX (CALculation of Dose for X-ray diagnosis). (author)

  6. A software tool for automatic classification and segmentation of 2D/3D medical images

    International Nuclear Information System (INIS)

    Modern medical diagnosis utilizes techniques for visualization of human internal organs (CT, MRI) or of their metabolism (PET). However, evaluation of the acquired images by human experts is usually subjective and qualitative only. Quantitative analysis of MR data, including tissue classification and segmentation, is necessary to perform e.g. attenuation compensation, motion detection, and correction of the partial volume effect in PET images acquired with PET/MR scanners. This article briefly presents the MaZda software package, which supports 2D and 3D medical image analysis aimed at quantification of image texture. MaZda implements procedures for evaluation, selection and extraction of highly discriminative texture attributes, combined with various classification, visualization and segmentation tools. Examples of MaZda application in medical studies are also provided

  7. A Software Tool for Processing the Displacement Time Series Extracted from Raw Radar Data

    International Nuclear Information System (INIS)

    The application of high-resolution radar waveforms and interferometric principles recently led to the development of a microwave interferometer suitable for simultaneously measuring the (static or dynamic) deflection of several points on a large structure. From the technical standpoint, the sensor is a Stepped-Frequency Continuous Wave (SF-CW) coherent radar operating in the Ku frequency band. In the paper, the main procedures adopted to extract the deflection time series from raw radar data and to assess the quality of the data are addressed, and the MATLAB toolbox developed is described. Subsequently, other functions implemented in the software tool (e.g. evaluation of the spectral matrix of the deflection time histories, identification of natural frequencies, and evaluation of operational mode shapes) are described, and the application to data recorded on full-scale bridges is exemplified.
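
    In a coherent SF-CW interferometer, the line-of-sight displacement of a target follows from the interferometric phase between successive acquisitions, d = (λ/4π)·Δφ. The fragment below illustrates that step on synthetic phase data; it is an illustration of the principle only, and the toolbox's actual processing chain is more involved.

        # Displacement from interferometric phase, d = (lambda / (4*pi)) * delta_phi.
        # Synthetic illustration of the principle, not the MATLAB toolbox itself.
        import numpy as np

        wavelength = 0.0175  # metres, roughly Ku band (~17 GHz); assumed value
        true_displacement = 0.005 * np.sin(2 * np.pi * 1.5 * np.linspace(0, 2, 400))

        phase = 4 * np.pi * true_displacement / wavelength
        wrapped = np.angle(np.exp(1j * phase))   # simulate phase wrapping
        unwrapped = np.unwrap(wrapped)           # restore continuity

        displacement = wavelength / (4 * np.pi) * unwrapped  # back to metres
        print(f"max error: {np.abs(displacement - true_displacement).max():.2e} m")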

  8. Automated tools and techniques for distributed Grid Software Development of the testbed infrastructure

    CERN Document Server

    Aguado Sanchez, C

    2007-01-01

    Grid technology is becoming more and more important as the new paradigm for sharing computational resources across different organizations in a secure way. The great power of this solution requires the definition of a generic stack of services and protocols, and this is the scope of the different Grid initiatives. As a result of international collaborations on its development, the Open Grid Forum created the Open Grid Services Architecture (OGSA), which aims to define the common set of services that will enable interoperability across the different implementations. This master thesis has been developed in this framework, as part of the two European-funded projects ETICS and OMII-Europe. The main objective is to contribute to the design and maintenance of large distributed development projects with an automated tool that enables Software Engineering techniques oriented towards achieving an acceptable level of quality in the release process. Specifically, this thesis develops the testbed concept a...

  9. A TAXONOMY FOR TOOLS, PROCESSES AND LANGUAGES IN AUTOMOTIVE SOFTWARE ENGINEERING

    Directory of Open Access Journals (Sweden)

    Florian Bock

    2016-01-01

    Full Text Available Within the growing domain of software engineering in the automotive sector, the number of tools, processes, methods and languages used has increased distinctly in recent years. To be able to choose proper methods for particular development use cases, factors like the intended use, key features and possible limitations have to be evaluated. This requires a taxonomy that aids decision making. An analysis of the main existing taxonomies revealed two major deficiencies: the lack of automotive focus and the limitation to particular types of engineering methods. To address this, a graphical taxonomy is proposed, based on two well-established engineering approaches and enriched with additional classification information. It provides a self-evident and self-explanatory overview and comparison technique for engineering methods in the automotive domain. The taxonomy is applied to common automotive engineering methods. The resulting diagram classifies each method and enables the reader to select appropriate solutions for given project requirements.

  10. GeneMarker® Genotyping Software: Tools to Increase the Statistical Power of DNA Fragment Analysis

    Science.gov (United States)

    Hulce, D.; Li, X.; Snyder-Leiby, T.; Johathan Liu, C.S.

    2011-01-01

    The discriminatory power of post-genotyping analyses, such as kinship or clustering analysis, depends on the amount of genetic information obtained from the DNA fragment/genotyping analysis. The number of microsatellite loci amplified in one multiplex is limited by the number of dyes and overlapping loci boundaries, requiring researchers to amplify replicate samples with two or more multiplexes in order to obtain a genotype for 12–15 loci. AFLP is another method that is limited by the number of dyes, often requiring multiple amplifications of replicate samples to obtain more complete results. Traditionally, researchers export the genotyping results into a spreadsheet, manually combine the results for each individual and then import them into a third software package for post-genotyping analysis. GeneMarker is highly accurate, user-friendly genotyping software that allows all of these steps to be done in one software package, avoiding potential errors from data transfer between different programs and decreasing the amount of time needed to process the results. The Merge Project tool automatically combines the results from replicate samples processed with different primer sets. Replicate animal (diploid) DNA samples were amplified with three different multiplexes, each multiplex providing information on 4–6 loci. The kinship analysis using the merged results provided a 10¹⁷ increase in statistical power, ranging from 10⁸ when 5 loci were used to 10²⁵ when 15 loci were used to determine potential relationship levels with identity-by-descent calculations. These same sample sets were used in clustering analysis to draw dendrograms. The dendrogram based on a single multiplex resulted in three branches at a given Euclidean distance. In comparison, the dendrogram constructed using the merged results had eight branches at the same Euclidean distance.

  11. Quality-driven multi-objective optimization of software architecture design: method, tool, and application

    OpenAIRE

    Etemadi Idgahi (Etemaadi), Ramin

    2014-01-01

    Software architecting is a non-trivial and demanding task for software engineers to perform. The architecture is a key enabler for software systems. Besides being crucial for user functionality, the software architecture has deep impact on software qualities such as performance, safety, and cost. In this dissertation, an automated approach for software architecture design is proposed that supports analysis and optimization of multiple quality attributes: First of all, we demonstrate an optimi...

  12. Data Mining for Secure Software Engineering – Source Code Management Tool Case Study

    OpenAIRE

    A.V. Krishna Prasad; Dr. S. Rama Krishna

    2010-01-01

    As data mining for secure software engineering improves software productivity and quality, software engineers are increasingly applying data mining algorithms to various software engineering tasks. However, mining software engineering data poses several challenges, requiring various algorithms to effectively mine sequences, graphs and text from such data. Software engineering data includes code bases, execution traces, historical code changes, mailing lists and bug databases. They contain a w...

  13. Development of software tools for supporting building clearance and site release at UKAEA

    International Nuclear Information System (INIS)

    UKAEA sites generally have complex histories and have been subject to a diverse range of nuclear operations. Most of the nuclear reactors, laboratories, workshops and other support facilities are now redundant, and a programme of decommissioning works in accordance with IAEA guidance is in progress. Decommissioning is being carried out in phases, with post-operative activities, care and maintenance, and care and surveillance periods between stages to allow relatively short-lived radioactivity to decay. This reduces dose levels to personnel and minimises radioactive waste production. Following these stages is an end-point phase, which corresponds to the point at which the risks to human health and the environment are sufficiently low that the buildings/land can be released for future use. Unconditional release corresponds to meeting the requirement for 'de-licensing'. Although reaching a de-licensable end point is the desired aim for UKAEA sites, it is recognised that this may take hundreds of years for parts of some UKAEA sites, or may never be attainable at a reasonable cost to the UK taxpayer. Thus on these sites, long-term risk management systems are in place to minimise the impact on health, safety and the environment. In order to manage these short-, medium- and long-term liabilities, UKAEA has developed a number of software tools based on good practice guidance. One of these tools in particular is being developed to address building clearance and site release. This tool, IMAGES (Information Management and Geographical Information System), integrates systematic data capture with database management and spatial assessment (through a Geographical Information System). Details of IMAGES and its applications are discussed in the paper, which outlines the approach being adopted by UKAEA for building and site release and the integrated software system, IMAGES, being used to capture, collate, interpret and report results. The key to UKAEA's strategy for

  14. A software tool for increased efficiency in observer performance studies in radiology

    International Nuclear Information System (INIS)

    Observer performance studies are time-consuming tasks, both for the participating observers and for the scientists collecting and analysing the data. A possible way to optimise such studies is to perform them in a completely digital environment. A software tool - ViewDEX (Viewer for Digital Evaluation of X-ray images) - has been developed in Java, enabling it to function on almost any computer. ViewDEX is designed to handle several types of studies, such as visual grading analysis (VGA), image criteria scoring (ICS) and receiver operating characteristics (ROC). The results from each observer are saved in a log file, which can be exported for further analysis in, for example, special software for analysing ROC results. By using ViewDEX for an ROC experiment, an evaluation rate of ∼200 images per hour can be achieved, compared to ∼25 images per hour using hard-copy evaluation. The results are obtained within minutes of completion of the viewing. The risk of human error in the process of data collection and analysis is also minimised. The viewer has been used in a major trial containing ∼2700 images. (authors)

  15. RadNotes: a novel software development tool for radiology education.

    Science.gov (United States)

    Baxter, A B; Klein, J S; Oesterle, E V

    1997-01-01

    RadNotes is a novel software development tool that enables physicians to develop teaching materials incorporating text and images in an intelligent, highly usable format. Projects undertaken in the RadNotes environment require neither programming expertise nor the assistance of a software engineer. The first of these projects, Thoracic Imaging, integrates image teaching files, concise disease and topic summaries, references, and flash card quizzes into a single program designed to provide an overview of chest radiology. RadNotes is intended to support the academic goals of teaching radiologists by enabling authors to create, edit, and electronically distribute image-oriented presentations. RadNotes also supports the educational goals of physicians who wish to quickly review selected imaging topics, as well as to develop a visual vocabulary of corresponding radiologic anatomy and pathologic conditions. Although Thoracic Imaging was developed with the aim of introducing chest radiology to residents, RadNotes can be used to develop tutorials and image-based tests for all levels; create corresponding World Wide Web sites; and organize notes, images, and references for individual use. PMID:9153710

  16. A Software Tool for Optimal Sizing of PV Systems in Malaysia

    Directory of Open Access Journals (Sweden)

    Tamer Khatib

    2012-01-01

    Full Text Available This paper presents a MATLAB-based, user-friendly software tool called PV.MY for optimal sizing of photovoltaic (PV) systems. The software has the capability of predicting meteorological variables such as solar energy, ambient temperature and wind speed using an artificial neural network (ANN); it optimizes the PV module/array tilt angle, optimizes the inverter size, and calculates optimal capacities of the PV array, battery, wind turbine and diesel generator in hybrid PV systems. The ANN-based model for meteorological prediction uses four variables, namely sunshine ratio, day number and location coordinates. As for PV system sizing, iterative methods are used for determining the optimal sizing of three types of PV systems: standalone PV systems, hybrid PV/wind systems and hybrid PV/diesel generator systems. The loss of load probability (LLP) technique is used for optimization, in which the energy source capacities are the variables to be optimized, targeting a very low LLP. As for determining the optimal PV panel tilt angle and inverter size, the Liu and Jordan model for solar energy incident on a tilted surface is used in optimizing the monthly tilt angle, while a model of the inverter efficiency curve is used in the optimization of inverter size.
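
    The iterative LLP-based sizing loop is simple to sketch: grow the array until the simulated fraction of unmet load drops below a target. The fragment below is a deliberately crude illustration with made-up daily profiles and parameters; it is not the PV.MY sizing routine.

        # Crude LLP-based sizing loop (made-up profiles; not the PV.MY routine).
        # LLP = energy not served / energy demanded; enlarge array until LLP <= target.
        daily_load_kwh = [10.0] * 30                  # constant demand, 30 days
        sun_hours = [5.5, 4.0, 6.0, 2.0, 5.0] * 6     # peak-sun-hours per day

        def simulate_llp(array_kw, battery_kwh=30.0, efficiency=0.8):
            stored, unmet, demanded = battery_kwh, 0.0, 0.0
            for load, sun in zip(daily_load_kwh, sun_hours):
                produced = array_kw * sun * efficiency
                stored = min(battery_kwh, stored + produced)
                served = min(load, stored)
                unmet += load - served
                stored -= served
                demanded += load
            return unmet / demanded

        array_kw, target_llp = 0.5, 0.01
        while simulate_llp(array_kw) > target_llp:
            array_kw += 0.1                           # iterative enlargement
        print(f"required array: {array_kw:.1f} kWp, "
              f"LLP = {simulate_llp(array_kw):.3f}")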

  17. OligoSpawn: a software tool for the design of overgo probes from large unigene datasets

    Directory of Open Access Journals (Sweden)

    Jiang Tao

    2006-01-01

    Full Text Available Abstract Background Expressed sequence tag (EST) datasets represent perhaps the largest collection of genetic information. ESTs can be exploited in a variety of biological experiments and analyses. Here we are interested in the design of overlapping oligonucleotide (overgo) probes from large unigene (EST-contig) datasets. Results OLIGOSPAWN is a suite of software tools that offers two complementary services, namely (1) the selection of "unique" oligos, each of which appears in one unigene but does not occur (exactly or approximately) in any other, and (2) the selection of "popular" oligos, each of which occurs (exactly or approximately) in as many unigenes as possible. In this paper, we describe the functionalities of OLIGOSPAWN and the computational methods it employs, and we report on experimental results for the overgo probes designed with it. Conclusion The algorithms we designed are highly efficient and capable of processing unigene datasets of sizes on the order of several tens of Mb in a few hours on a regular PC. The software has been used to design overgo probes employed to screen a barley (Hordeum vulgare) BAC library. OLIGOSPAWN is freely available at http://oligospawn.ucr.edu/.
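
    The "unique" versus "popular" distinction reduces to counting, across unigenes, how many sequences contain each candidate oligo. The sketch below illustrates exact-match selection on toy data; real overgo design also handles approximate occurrences, GC content and other constraints that this illustration omits.

        # Toy exact-match version of "unique" vs "popular" oligo selection.
        # Real overgo design also considers approximate matches, GC content, etc.
        from collections import defaultdict

        unigenes = {
            "u1": "ACGTACGGTTAGC",
            "u2": "TTAGCACGTAACG",
            "u3": "GGGTTAGCCACGT",
        }
        K = 5  # candidate oligo length (real overgos are longer, e.g. 36-mers)

        containing = defaultdict(set)   # oligo -> unigenes containing it
        for name, seq in unigenes.items():
            for i in range(len(seq) - K + 1):
                containing[seq[i:i + K]].add(name)

        unique = {o for o, s in containing.items() if len(s) == 1}
        popular = max(containing, key=lambda o: len(containing[o]))
        print(len(unique), "unique oligos; most popular:", popular,
              "in", sorted(containing[popular]))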

  18. Open Source Software Openfoam as a New Aerodynamical Simulation Tool for Rocket-Borne Measurements

    Science.gov (United States)

    Staszak, T.; Brede, M.; Strelnikov, B.

    2015-09-01

    The only way to perform in-situ measurements, which are very important experimental studies for atmospheric science, in the mesosphere/lower thermosphere (MLT) is to use sounding rockets. The drawback of using rockets is the shock wave that appears because of the very high speed of the rocket (typically about 1000 m/s). This shock wave disturbs the density, temperature and velocity fields in the vicinity of the rocket compared to the undisturbed values of the atmosphere. This effect, however, can be quantified, and the measured data have to be corrected, not just to make them more precise but simply usable. The commonly accepted and widely used tool for these calculations is the Direct Simulation Monte Carlo (DSMC) technique developed by G. A. Bird, which is available as a stand-alone program limited to a single processor. Apart from the complications of simulating flows around bodies in the different flow regimes of the MLT altitude range, which arise because the density changes exponentially by several orders of magnitude, a particular hardware configuration introduces significant difficulty for aerodynamical calculations: the choice of grid sizes depends mainly on the demands of adequate DSMC and on good resolution of geometries whose scales differ by several orders of magnitude. This makes the calculation time unreasonably long or even prevents the calculation algorithm from converging. In this paper we apply the free open-source software OpenFOAM (licensed under the GNU GPL) to a three-dimensional CFD simulation of the flow around a sounding rocket instrument. An advantage of this software package, among other things, is that it can run on high-performance clusters, which are easily scalable. We present the first results and discuss the potential of the new tool in applications for sounding rockets.

  19. ELER software – a new tool for urban earthquake loss assessment

    Directory of Open Access Journals (Sweden)

    U. Hancilar

    2010-12-01

    .0 (Molina et al., 2008) and ATC-55 (Yang, 2005). An urban loss assessment exercise for a scenario earthquake for the city of Istanbul is conducted, and physical and social losses are presented. Damage to the urban environment is compared to the results obtained from similar software, i.e. KOERILoss (KOERI, 2002) and DBELA (Crowley et al., 2004). The European rapid loss estimation tool is expected to help enable effective emergency response, at both the local and global levels, as well as public information.

  20. Techniques and software tools for estimating ultrasonic signal-to-noise ratios

    Science.gov (United States)

    Chiou, Chien-Ping; Margetan, Frank J.; McKillip, Matthew; Engle, Brady J.; Roberts, Ronald A.

    2016-02-01

    At Iowa State University's Center for Nondestructive Evaluation (ISU CNDE), the use of models to simulate ultrasonic inspections has played a key role in R&D efforts for over 30 years. To this end a series of wave propagation models, flaw response models, and microstructural backscatter models have been developed to address inspection problems of interest. One use of the combined models is the estimation of signal-to-noise ratios (S/N) in circumstances where backscatter from the microstructure (grain noise) acts to mask sonic echoes from internal defects. Such S/N models have been used in the past to address questions of inspection optimization and reliability. Under the sponsorship of the National Science Foundation's Industry/University Cooperative Research Center at ISU, an effort was recently initiated to improve existing research-grade software by adding a graphical user interface (GUI), turning it into a user-friendly tool for the rapid estimation of S/N for ultrasonic inspections of metals. The software combines: (1) a Python-based GUI for specifying an inspection scenario and displaying results; and (2) a Fortran-based engine for computing defect signal and backscattered grain noise characteristics. The latter makes use of several models including: the Multi-Gaussian Beam Model for computing sonic fields radiated by commercial transducers; the Thompson-Gray Model for the response from an internal defect; the Independent Scatterer Model for backscattered grain noise; and the Stanke-Kino Unified Model for attenuation. The initial emphasis was on reformulating the research-grade code into a suitable modular form, adding the graphical user interface and performing computations rapidly and robustly. Thus the initial inspection problem being addressed is relatively simple. A normal-incidence pulse/echo immersion inspection is simulated for a curved metal component having a non-uniform microstructure, specifically an equiaxed, untextured microstructure in which the average
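
    As a rough illustration of the quantity such models estimate, one common working definition of S/N compares the peak-to-peak defect echo with the RMS level of the gated grain noise. The Python sketch below is a generic stand-in, not the ISU CNDE code; the function name and array shapes are assumptions.

```python
import numpy as np

def ultrasonic_snr(defect_waveform, noise_waveforms):
    """S/N in one common form: peak-to-peak defect signal divided by
    the RMS of an ensemble of gated grain-noise waveforms.
    defect_waveform: 1-D array; noise_waveforms: 2-D array
    (one gated noise record per row)."""
    signal = defect_waveform.max() - defect_waveform.min()
    noise_rms = np.sqrt(np.mean(np.square(noise_waveforms)))
    return signal / noise_rms
```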

  1. SHAPA: An interactive software tool for protocol analysis applied to aircrew communications and workload

    Science.gov (United States)

    James, Jeffrey M.; Sanderson, Penelope M.; Seidler, Karen S.

    1990-01-01

    As modern transport environments become increasingly complex, issues such as crew communication, interaction with automation, and workload management have become crucial. Much research is being focused on holistic aspects of social and cognitive behavior, such as the strategies used to handle workload, the flow of information, the scheduling of tasks, and the verbal and non-verbal interactions between crew members. Traditional laboratory performance measures no longer sufficiently meet the needs of researchers addressing these issues. Observational techniques, however, are better equipped to capture the type of data needed and to build models of the requisite level of sophistication. Presented here is SHAPA, an interactive software tool for performing both verbal and non-verbal protocol analysis. It has been developed with the idea of affording researchers the closest possible degree of engagement with protocol data. The researcher can configure SHAPA to encode protocols using any desired theoretical framework or encoding vocabulary. SHAPA allows protocol analysis to be performed at any level of analysis, and it supplies a wide variety of tools for data aggregation and manipulation. The output generated by SHAPA can be used alone or in combination with other performance variables to build a rich picture of the influences on sequences of verbal or non-verbal behavior.

  2. Effectiveness of Crown Preparation Assessment Software As an Educational Tool in Simulation Clinic: A Pilot Study.

    Science.gov (United States)

    Tiu, Janine; Cheng, Enxin; Hung, Tzu-Chiao; Yu, Chuan-Chia; Lin, Tony; Schwass, Don; Al-Amleh, Basil

    2016-08-01

    The aim of this pilot study was to evaluate the feasibility of a new tooth preparation assessment software program, Preppr, as an educational tool for dental students in achieving optimal parameters for a crown preparation. In February 2015, 30 dental students in their fourth year in a five-year undergraduate dental curriculum in New Zealand were randomly selected from a pool of volunteers (N=40) out of the total class of 85. The participants were placed into one of three groups of ten students each: Group A, the control group, received only written and pictorial instructions; Group B received tutor evaluation and feedback; and Group C performed self-directed learning with the aid of Preppr. Each student was asked to prepare an all-ceramic crown on the lower first molar typodont within three hours and to repeat the exercise three times over the next four weeks. The exercise stipulated a 1 mm finish line dimension and total convergence angles (TOC) between 10 and 20 degrees. Fulfillment of these parameters was taken as an acceptable preparation. The results showed that Group C had the highest percentage of students who achieved minimum finish line dimensions and acceptable TOC angles. Those students also achieved the stipulated requirements earlier than the other groups. This study's findings provide promising data on the feasibility of using Preppr as a self-directed educational tool for students training to prepare dental crowns. PMID:27480712

  3. ROCKETSHIP: a flexible and modular software tool for the planning, processing and analysis of dynamic MRI studies

    OpenAIRE

    Barnes, Samuel R.; Ng, Thomas S. C.; Santa-Maria, Naomi; Montagne, Axel; Zlokovic, Berislav V.; Jacobs, Russell E.

    2015-01-01

    Background Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is a promising technique to characterize pathology and evaluate treatment response. However, analysis of DCE-MRI data is complex and benefits from concurrent analysis of multiple kinetic models and parameters. Few software tools are currently available that specifically focus on DCE-MRI analysis with multiple kinetic models. Here, we developed ROCKETSHIP, an open-source, flexible and modular software for DCE-MRI analy...
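
    One kinetic model commonly fitted in DCE-MRI analysis is the standard Tofts model, in which the tissue concentration is the plasma input convolved with an exponential residue function. The Python sketch below is a generic illustration under the assumption of uniform temporal sampling, not ROCKETSHIP's implementation.

```python
import numpy as np

def tofts_model(t, cp, ktrans, ve):
    """Standard Tofts model:
    Ct(t) = Ktrans * integral_0^t cp(tau) exp(-(Ktrans/ve)(t - tau)) dtau,
    approximated by a discrete convolution. t and cp are 1-D arrays
    with uniform time spacing."""
    dt = t[1] - t[0]
    kernel = np.exp(-(ktrans / ve) * t)
    return ktrans * np.convolve(cp, kernel)[:len(t)] * dt
```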

  4. GraphCrunch 2: Software tool for network modeling, alignment and clustering

    Directory of Open Access Journals (Sweden)

    Hayes Wayne

    2011-01-01

    Full Text Available Abstract Background Recent advancements in experimental biotechnology have produced large amounts of protein-protein interaction (PPI) data. The topology of PPI networks is believed to have a strong link to their function. Hence, the abundance of PPI data for many organisms stimulates the development of computational techniques for the modeling, comparison, alignment, and clustering of networks. In addition, finding representative models for PPI networks will improve our understanding of the cell just as a model of gravity has helped us understand planetary motion. To decide if a model is representative, we need quantitative comparisons of model networks to real ones. However, exact network comparison is computationally intractable and therefore several heuristics have been used instead. Some of these heuristics are easily computable "network properties," such as the degree distribution, or the clustering coefficient. An important special case of network comparison is the network alignment problem. Analogous to sequence alignment, this problem asks to find the "best" mapping between regions in two networks. It is expected that network alignment might have as strong an impact on our understanding of biology as sequence alignment has had. Topology-based clustering of nodes in PPI networks is another example of an important network analysis problem that can uncover relationships between interaction patterns and phenotype. Results We introduce the GraphCrunch 2 software tool, which addresses these problems. It is a significant extension of GraphCrunch which implements the most popular random network models and compares them with the data networks with respect to many network properties. Also, GraphCrunch 2 implements the GRAph ALigner algorithm ("GRAAL") for purely topological network alignment. GRAAL can align any pair of networks and exposes large, dense, contiguous regions of topological and functional similarities far larger than any other
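
    The model-versus-data comparison described above can be illustrated in a few lines of Python. The sketch below uses networkx and an Erdős–Rényi model as one example of the "easily computable network properties" idea; it is not GraphCrunch 2 code, and the function name is invented.

```python
import networkx as nx

def compare_to_random_model(data_graph, seed=0):
    """Compare two cheap topological properties of a data network
    against a random graph of the same size and density (one of the
    model families such tools implement)."""
    n, m = data_graph.number_of_nodes(), data_graph.number_of_edges()
    model = nx.gnm_random_graph(n, m, seed=seed)
    return {
        "clustering_data":  nx.average_clustering(data_graph),
        "clustering_model": nx.average_clustering(model),
        "degrees_data":  sorted(d for _, d in data_graph.degree()),
        "degrees_model": sorted(d for _, d in model.degree()),
    }
```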

  5. Evaluation and Usage of Browser Compatibility Tools during the Software Development Process

    OpenAIRE

    Boyaci, Burak

    2016-01-01

    The software testing process is one of the most important phases of software development, checking that the developed software product meets its specifications/requirements. This is especially true for software products used in the health industry. Supported browsers can also be documented in the requirements. Thus browser compatibility testing needs to be considered, especially while performing testing on web-based software products. Browser compatibility testing is perfor...

  6. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    Science.gov (United States)

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

    A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data, which helps in optimization, and shows the software performance visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model developed for a systematically integrated forward osmosis-nanofiltration process for removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values have been observed to corroborate well with the extensive experimental investigations and were found to be consistent under varying operating conditions such as operating pressure, operating flow rate, and draw solute concentration. Low values of the relative error (RE = 0.09) and high values of the Willmott d-index (d_will = 0.981) reflected a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater. PMID:26856870
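
    The two goodness-of-fit statistics quoted above are easy to reproduce. A minimal Python sketch follows; the function names are mine, and the exact relative-error definition used by the authors may differ in detail.

```python
import numpy as np

def willmott_d(observed, predicted):
    """Willmott index of agreement, d in [0, 1]; values near 1 mean
    close agreement between predictions and observations:
    d = 1 - sum((P-O)^2) / sum((|P - Obar| + |O - Obar|)^2)."""
    o, p = np.asarray(observed, float), np.asarray(predicted, float)
    num = np.sum((p - o) ** 2)
    den = np.sum((np.abs(p - o.mean()) + np.abs(o - o.mean())) ** 2)
    return 1.0 - num / den

def relative_error(observed, predicted):
    """Mean absolute error relative to the observed values."""
    o, p = np.asarray(observed, float), np.asarray(predicted, float)
    return np.mean(np.abs(p - o) / np.abs(o))
```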

  7. Diva software, a tool for European regional seas and Ocean climatologies production

    Science.gov (United States)

    Ouberdous, M.; Troupin, C.; Barth, A.; Alvera-Azcàrate, A.; Beckers, J.-M.

    2012-04-01

    Diva (Data-Interpolating Variational Analysis) is software based on a method designed to perform data-gridding (or analysis) tasks, with the asset of taking into account the intrinsic nature of oceanographic data, i.e., the uncertainty of the in situ measurements and the anisotropy due to advection and to irregular coastlines and topography. The Variational Inverse Method (VIM, Brasseur et al., 1996) implemented in Diva consists in minimizing a variational principle which accounts for the differences between the observations and the reconstructed field, and for the gradients and variability of the reconstructed field. The resolution of the numerical problem is based on a finite-element method, which allows great numerical efficiency and the handling of complicated contours. Along with the analysis, Diva also provides error fields (Brankart and Brasseur, 1998; Rixen et al., 2000) based on the data coverage and noise. Diva is used for the production of climatologies in the pan-European network SeaDataNet. SeaDataNet connects the existing marine data centres of more than 30 countries and has set up a data management infrastructure consisting of a standardized distributed system. The consortium has elaborated integrated products, using common procedures and methods, and it uses the Diva software as the reference tool for computing climatologies for various European regional seas, the Atlantic and the global ocean. During the first phase of the SeaDataNet project, a number of additional tools were developed to make climatology production easier for users. Among these tools are: an advection constraint during the field reconstruction, through the specification of a velocity field on a regular grid, forcing the analysis to align with the velocity vectors; Generalized Cross Validation for the determination of analysis parameters (signal-to-noise ratio); the creation of contours at selected depths; the detection of possible outliers; the

  8. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    International Nuclear Information System (INIS)

    In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet

  9. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    Directory of Open Access Journals (Sweden)

    Kühne Titus

    2010-07-01

    Full Text Available Abstract Background In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. Results After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. Conclusions MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet.
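
    For the multi-echo T2/T2* case, the pixel-wise mapping that MRmap automates reduces to a mono-exponential fit per pixel. The Python sketch below illustrates the principle only (MRmap itself is written in a high-level graphics language); the function name and array layout are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def fit_t2_map(echo_images, echo_times):
    """Pixel-wise fit of S(TE) = S0 * exp(-TE / T2) for a stack of
    multi-echo images shaped (n_echoes, ny, nx); echo_times in ms."""
    model = lambda te, s0, t2: s0 * np.exp(-te / t2)
    _, ny, nx = echo_images.shape
    t2_map = np.zeros((ny, nx))
    for y in range(ny):
        for x in range(nx):
            signal = echo_images[:, y, x]
            try:
                (_, t2), _ = curve_fit(model, echo_times, signal,
                                       p0=(signal[0], 50.0))
                t2_map[y, x] = t2
            except RuntimeError:   # fit failed to converge
                t2_map[y, x] = 0.0
    return t2_map
```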

  10. DAST PVPS, a new PC software tool for the training, design and simulation of photo-electric pump systems. DAST-PVPS, ein neues PC Software Tool fuer die Schulung, Auslegung und Simulation von photovoltaischen Pumpensystemen

    Energy Technology Data Exchange (ETDEWEB)

    Mayer, O. (Univ. der Bundeswehr Muenchen, Abt. Regenerative Energien, Neubiberg (Germany)); Baumeister, A. (Univ. der Bundeswehr Muenchen, Abt. Regenerative Energien, Neubiberg (Germany)); Festl, T. (Univ. der Bundeswehr Muenchen, Abt. Regenerative Energien, Neubiberg (Germany))

    1993-01-01

    DAST-PVPS makes a new software tool for the design and simulation of photovoltaic pump systems available. The programme offers the possibility of testing different variants of a PVPS quickly and simply, and of integrating one's own irradiation and component data into the programme. The programme is available in its first version, 1.0. (orig.)

  11. COMSY - A software tool for PLIM + PLEX with integrated risk-informed approaches

    International Nuclear Information System (INIS)

    The majority of mechanical components and structures in a thermal power plant are designed to experience a service life which is far above the intended design life. In most cases, only a small percentage of mechanical components are subject to significant degradation which may affect the integrity or the function of the component. If plant life extension (PLEX) is considered an option, a plant-specific PLIM strategy needs to be developed. One of the most important tasks of such a PLIM strategy is to identify those components which (i) are relevant for the safety and/or availability of the plant and (ii) experience elevated degradation due to their operating and design conditions. For these components, special life management strategies need to be established to reliably monitor their condition. FRAMATOME ANP GmbH has developed the software tool COMSY, which is designed to efficiently support a plant-wide lifetime management strategy for static mechanical components, providing the basis for plant life extension (PLEX) activities. The objective is the economical and safe operation of power plants over their design lifetime - and beyond. The tool provides the capability to establish program-guided technical documentation of the plant by utilizing a virtual plant data model. The software integrates engineering analysis functions and comprehensive material libraries to perform a lifetime analysis for various degradation mechanisms typically experienced in power plants (e.g. flow-accelerated corrosion, intergranular stress corrosion cracking, strain-induced cracking, material fatigue, cavitation erosion, droplet impingement erosion, pitting, etc.). A risk-based prioritization serves to focus inspection activities on safety- or availability-relevant locations where a degradation potential exists. Trending functions support the comparison of the as-measured condition with the predicted progress of degradation while making allowance for measurement tolerances. The

  12. 基于软件体系结构的测试及其工具研究%Research on Testing and its Tools Based on Software Architecture

    Institute of Scientific and Technical Information of China (English)

    叶俊民; 王振宇; 陈利; 赵恒

    2003-01-01

    In this paper, we discuss the objective and the significance of software testing on the basis of software architecture; we propose the content of software architecture test planning and testing criteria; at the architecture level, through comparison with traditional testing tools, we present testing tools for an integration environment and analyze the roles of these tools. On this basis, we further suggest the structure of an integration testing environment.

  13. Evaluating the core damage frequency of a TRIGA research reactor using risk assessment tool software

    Energy Technology Data Exchange (ETDEWEB)

    Kamyab, Shahabeddin [School of Engineering, Shiraz University, 71348-51154 Shiraz (Iran, Islamic Republic of); Nematollahi, Mohammadreza, E-mail: mrnema@yahoo.com [School of Engineering, Shiraz University, 71348-51154 Shiraz (Iran, Islamic Republic of); Safety Research Center of Shiraz University, 71348-51154 Shiraz (Iran, Islamic Republic of)

    2011-08-15

    Highlights: • In this study, a level-I PSA is performed to reveal and modify the weak points threatening the safe operation of a typical TRIGA reactor. • After identification of the initiating events and development of the appropriate event trees and fault trees in the risk assessment tool interface, the core damage frequency has been estimated to be 8.368E-6 per year of reactor operation, which meets the IAEA standards. • The results also indicate the significant effects of common cause failures. - Abstract: After all the preventive and mitigative measures considered in the design of a nuclear reactor, the installation still represents a residual risk to the outside world. Probabilistic safety assessment (PSA) is a powerful method to survey the safety of nuclear reactors. In this study the occurrence frequency of the different types of core damage states (CDS) which may potentially arise in the Tehran Research Reactor (TRR) is evaluated using the recently developed risk assessment tool (RAT) software, designed at the Safety Research Center of Shiraz University. RAT uses event trees and fault trees to evaluate the total final core damage frequency (CDF) by studying the frequencies of the initiating events and following the consequences that result in each type of CDS. According to IAEA standards for research reactors, the criterion is a CDF smaller than 1E-04. Results show that the total final CDF for TRR is of the order of 10⁻⁶, which meets the criterion for nuclear research reactors.
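
    The arithmetic behind such an estimate is simple once the event trees are quantified: each initiating-event frequency is multiplied by the conditional probability of the mitigating-system failures along the sequence, and the contributions are summed. A Python sketch with purely hypothetical numbers (not TRR data) follows.

```python
# Hypothetical figures for illustration only; real values come from the
# plant-specific event trees and fault trees built in the PSA tool.
initiating_events = {
    # name: (frequency per year, conditional probability that the
    #        mitigating systems fail and the sequence ends in core damage)
    "loss_of_flow":    (1.0e-2, 4.0e-4),
    "loss_of_coolant": (5.0e-4, 2.0e-3),
    "reactivity":      (2.0e-3, 1.0e-3),
}

cdf = sum(freq * p_fail for freq, p_fail in initiating_events.values())
print(f"total core damage frequency: {cdf:.3e} per year")  # 7.000e-06
```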

  14. Data-Driven Decision Making as a Tool to Improve Software Development Productivity

    Science.gov (United States)

    Brown, Mary Erin

    2013-01-01

    The worldwide software project failure rate, based on a survey of information technology software managers' views of user satisfaction, product quality, and staff productivity, is estimated to be between 24% and 36%, and software project success has not kept pace with the advances in hardware. The problem addressed by this study was the limited…

  15. Plagiarism Detection: A Comparison of Teaching Assistants and a Software Tool in Identifying Cheating in a Psychology Course

    Science.gov (United States)

    Seifried, Eva; Lenhard, Wolfgang; Spinath, Birgit

    2015-01-01

    Essays that are assigned as homework in large classes are prone to cheating via unauthorized collaboration. In this study, we compared the ability of a software tool based on Latent Semantic Analysis (LSA) and student teaching assistants to detect plagiarism in a large group of students. To do so, we took two approaches: the first approach was…
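
    LSA-based similarity screening of the kind described can be prototyped in a few lines. The sketch below, using scikit-learn, is a generic illustration and makes no claim about the specific tool the study evaluated; the threshold and dimensionality are arbitrary assumptions.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

def suspicious_pairs(essays, threshold=0.9, dims=100):
    """Flag essay pairs whose cosine similarity in LSA space exceeds
    the threshold; high similarity suggests unauthorized collaboration
    and warrants manual review."""
    tfidf = TfidfVectorizer(stop_words="english").fit_transform(essays)
    svd = TruncatedSVD(n_components=min(dims, tfidf.shape[1] - 1),
                       random_state=0)
    lsa = svd.fit_transform(tfidf)
    sims = cosine_similarity(lsa)
    return [(i, j, sims[i, j])
            for i in range(len(essays))
            for j in range(i + 1, len(essays))
            if sims[i, j] > threshold]
```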

  16. Students' Learning Experiences When Using a Dynamic Geometry Software Tool in a Geometry Lesson at Secondary School in Ethiopia

    Science.gov (United States)

    Denbel, Dejene Girma

    2015-01-01

    Students' learning experiences were investigated in geometry lessons when using a Dynamic Geometry Software (DGS) tool for geometry learning with 25 Ethiopian secondary school students. The research data were drawn from the worksheets used, classroom observations, results of pre- and post-tests, a questionnaire and interview responses. I used GeoGebra as a DGS…

  17. The Design and Development of a Computerized Tool Support for Conducting Senior Projects in Software Engineering Education

    Science.gov (United States)

    Chen, Chung-Yang; Teng, Kao-Chiuan

    2011-01-01

    This paper presents a computerized support tool, the Meetings-Flow Project Collaboration System (MFS), for designing, directing and sustaining the collaborative teamwork required in senior projects in software engineering (SE) education. Among many schools' SE curricula, senior projects serve as a capstone course that provides comprehensive…

  18. A note on the development of a new software package, the FAO-ICLARM stock assessment tools (FiSAT)

    OpenAIRE

    Pauly, D.; Sparre, P.

    1991-01-01

    A brief narrative is given of the background of new FAO-ICLARM software for (mainly length-based) fish stock assessment, the FAO-ICLARM Stock Assessment Tools or "FiSAT" package, which integrates ICLARM's Compleat ELEFAN, FAO's LFSA and various other routines, and which is to be released in mid-1992.

  19. MAPIT: A new software tool to assist in the transition from conceptual model to numerical simulation models

    International Nuclear Information System (INIS)

    MapIt is a new software tool developed at Lawrence Livermore National Laboratory to assist ground water remediation professionals in generating numerical simulation models from a variety of physical and chemical data sources and the corresponding 1-, 2-, and 3-dimensional conceptual models that emerge from the analysis of such data.

  20. CubeSat mission design software tool for risk estimating relationships

    Science.gov (United States)

    Gamble, Katharine Brumbaugh; Lightsey, E. Glenn

    2014-09-01

    In an effort to make the CubeSat risk estimation and management process more scientific, a software tool has been created that enables mission designers to estimate mission risks. CubeSat mission designers are able to input mission characteristics, such as form factor, mass, development cycle, and launch information, in order to determine the mission risk root causes which historically present the highest risk for their mission. Historical data was collected from the CubeSat community and analyzed to provide a statistical background to characterize these Risk Estimating Relationships (RERs). This paper develops and validates the mathematical model based on the same cost estimating relationship methodology used by the Unmanned Spacecraft Cost Model (USCM) and the Small Satellite Cost Model (SSCM). The RER development uses general error regression models to determine the best fit relationship between root cause consequence and likelihood values and the input factors of interest. These root causes are combined into seven overall CubeSat mission risks which are then graphed on the industry-standard 5×5 Likelihood-Consequence (L-C) chart to help mission designers quickly identify areas of concern within their mission. This paper is the first to document not only the creation of a historical database of CubeSat mission risks, but, more importantly, the scientific representation of Risk Estimating Relationships.
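
    As a sketch of the fitting task behind such relationships, ordinary least squares can stand in for the general error regression models the paper uses. The Python snippet below is illustrative only; the variable names and the linear functional form are assumptions.

```python
import numpy as np

def fit_rer(design_factors, observed_risk):
    """Least-squares fit of a linear Risk Estimating Relationship:
    risk = X @ beta, where design_factors is an (n_missions, n_factors)
    array of mission characteristics (mass, form factor, development
    time, ...) and an intercept column is prepended. The published RERs
    use general error regression; OLS is the simplest stand-in."""
    X = np.column_stack([np.ones(len(design_factors)), design_factors])
    beta, *_ = np.linalg.lstsq(X, observed_risk, rcond=None)
    return beta

# Predict for a new mission: np.concatenate(([1.0], new_factors)) @ beta
```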

  1. Software Tool for Analysis of Breathing-Related Errors in Transthoracic Electrical Bioimpedance Spectroscopy Measurements

    Science.gov (United States)

    Abtahi, F.; Gyllensten, I. C.; Lindecrantz, K.; Seoane, F.

    2012-12-01

    During the last decades, Electrical Bioimpedance Spectroscopy (EBIS) has been applied in a range of different applications, mainly using the frequency-sweep technique. Traditionally, the tissue under study is considered to be time-invariant, and dynamic changes of tissue activity are ignored and instead treated as a noise source. This assumption has not been adequately tested and could have a negative impact and limit the accuracy of impedance monitoring systems. In order to successfully use frequency-sweeping EBIS for monitoring time-variant systems, it is paramount to study the effect of the frequency-sweep delay on Cole-model-based analysis. In this work, we present a software tool that can be used to simulate the influence of respiration activity in frequency-sweep EBIS measurements of the human thorax and to analyse the effects of the different error sources. Preliminary results indicate that the deviation of the EBIS measurement might be significant at any frequency, and especially in the impedance plane. Therefore the impact on Cole-model analysis might differ depending on the method applied for Cole parameter estimation.

  2. Software Tool for Analysis of Breathing-Related Errors in Transthoracic Electrical Bioimpedance Spectroscopy Measurements

    International Nuclear Information System (INIS)

    During the last decades, Electrical Bioimpedance Spectroscopy (EBIS) has been applied in a range of different applications, mainly using the frequency-sweep technique. Traditionally, the tissue under study is considered to be time-invariant, and dynamic changes of tissue activity are ignored and instead treated as a noise source. This assumption has not been adequately tested and could have a negative impact and limit the accuracy of impedance monitoring systems. In order to successfully use frequency-sweeping EBIS for monitoring time-variant systems, it is paramount to study the effect of the frequency-sweep delay on Cole-model-based analysis. In this work, we present a software tool that can be used to simulate the influence of respiration activity in frequency-sweep EBIS measurements of the human thorax and to analyse the effects of the different error sources. Preliminary results indicate that the deviation of the EBIS measurement might be significant at any frequency, and especially in the impedance plane. Therefore the impact on Cole-model analysis might differ depending on the method applied for Cole parameter estimation.
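
    The Cole model referred to above has a compact closed form, which makes the sweep-delay problem easy to picture: each frequency sample is taken at a different instant of the breathing cycle, so each point of the spectrum sees a slightly different body. A minimal Python sketch of the model follows; the parameter values are arbitrary illustrations.

```python
import numpy as np

def cole_impedance(freq_hz, r0, rinf, tau, alpha):
    """Cole model of tissue impedance:
    Z(w) = Rinf + (R0 - Rinf) / (1 + (j*w*tau)**alpha)."""
    w = 2 * np.pi * np.asarray(freq_hz, float)
    return rinf + (r0 - rinf) / (1 + (1j * w * tau) ** alpha)

# Example sweep from 1 kHz to 1 MHz (arbitrary tissue parameters).
spectrum = cole_impedance(np.logspace(3, 6, 50),
                          r0=500.0, rinf=200.0, tau=1e-6, alpha=0.8)
```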

  3. A Software Tool to Visualize Verbal Protocols to Enhance Strategic and Metacognitive Abilities in Basic Programming

    Directory of Open Access Journals (Sweden)

    Carlos A. Arévalo

    2011-07-01

    Full Text Available Learning to program is difficult for many first-year undergraduate students. Instructional strategies in traditional programming courses tend to focus on syntactic issues, assigning practice exercises using the presentation-examples-practice formula and showing the teacher's verbal and visual explanation during the "step by step" process of writing a computer program. The cognitive literature on the mental processes involved in programming suggests that the explicit teaching of certain aspects, such as mental models, strategic knowledge and metacognitive abilities, is critical to learning how to write and assemble the pieces of a computer program. Verbal protocols are often used in software engineering as a technique to record the short-term cognitive processes of a user or expert in evaluation or problem-solving scenarios. We argue that verbal protocols can be used as a mechanism to explicitly show the strategic and metacognitive process of an instructor when writing a program. In this paper we present an information system prototype developed to store and visualize worked examples derived from transcribed verbal protocols during the process of writing introductory-level programs. Empirical data comparing the grades obtained by two groups of novice programming students, analyzed using ANOVA, indicate a statistically significant positive difference in performance for the group using the tool, even though these results cannot yet be extrapolated to the general population, given the reported limitations of this study.

  4. PlanetPack3: a software tool for exoplanets characterization from radial velocity and transit data

    Science.gov (United States)

    Baluev, Roman V.

    2015-08-01

    We describe the forthcoming third major release of the PlanetPack software tool for exoplanet detection and characterization from Doppler and/or transit data. Among other things, this major update brings routines for the joint fitting of radial velocities and transits, optionally taking into account various subtle effects: the Rossiter-McLaughlin effect, the light arrival time delay between the radial velocity and transit curves, and new experimental models of the Doppler or photometry noise, including non-stationary models with variable noise magnitude (due to e.g. stellar activity variations). This work was supported by the Russian Foundation for Basic Research (project No. 14-02-92615 KO_a), the UK Royal Society International Exchange grant IE140055, by the President of Russia grant for young scientists (No. MK-733.2014.2), by the programme of the Presidium of the Russian Academy of Sciences P21, and by the Saint Petersburg State University research grant 6.37.341.2015.
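
    The radial-velocity side of such joint fits rests on the standard Keplerian curve. The Python sketch below shows the textbook computation (a fixed-point Kepler solver plus the velocity formula); it is a generic illustration, not PlanetPack code, and it omits the systemic velocity offset.

```python
import numpy as np

def radial_velocity(t, period, k, e, omega, t_peri):
    """Keplerian radial-velocity curve:
    v_r = K [cos(omega + nu) + e cos(omega)], with the true anomaly nu
    obtained by solving Kepler's equation E = M + e sin(E) via
    fixed-point iteration (converges for moderate eccentricity)."""
    t = np.asarray(t, dtype=float)
    mean_anom = 2 * np.pi * (t - t_peri) / period
    ecc_anom = mean_anom.copy()
    for _ in range(50):
        ecc_anom = mean_anom + e * np.sin(ecc_anom)
    nu = 2 * np.arctan2(np.sqrt(1 + e) * np.sin(ecc_anom / 2),
                        np.sqrt(1 - e) * np.cos(ecc_anom / 2))
    return k * (np.cos(omega + nu) + e * np.cos(omega))
```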

  5. TRANSIT--A Software Tool for Himar1 TnSeq Analysis.

    Directory of Open Access Journals (Sweden)

    Michael A DeJesus

    2015-10-01

    Full Text Available TnSeq has become a popular technique for determining the essentiality of genomic regions in bacterial organisms. Several methods have been developed to analyze the wealth of data that has been obtained through TnSeq experiments. We developed a tool for analyzing Himar1 TnSeq data called TRANSIT. TRANSIT provides a graphical interface to three different statistical methods for analyzing TnSeq data. These methods cover a variety of approaches capable of identifying essential genes in individual datasets as well as comparative analysis between conditions. We demonstrate the utility of this software by analyzing TnSeq datasets of M. tuberculosis grown on glycerol and cholesterol. We show that TRANSIT can be used to discover genes which have been previously implicated for growth on these carbon sources. TRANSIT is written in Python, and thus can be run on Windows, OSX and Linux platforms. The source code is distributed under the GNU GPL v3 license and can be obtained from the following GitHub repository: https://github.com/mad-lab/transit.

  6. TRANSIT--A Software Tool for Himar1 TnSeq Analysis.

    Science.gov (United States)

    DeJesus, Michael A; Ambadipudi, Chaitra; Baker, Richard; Sassetti, Christopher; Ioerger, Thomas R

    2015-10-01

    TnSeq has become a popular technique for determining the essentiality of genomic regions in bacterial organisms. Several methods have been developed to analyze the wealth of data that has been obtained through TnSeq experiments. We developed a tool for analyzing Himar1 TnSeq data called TRANSIT. TRANSIT provides a graphical interface to three different statistical methods for analyzing TnSeq data. These methods cover a variety of approaches capable of identifying essential genes in individual datasets as well as comparative analysis between conditions. We demonstrate the utility of this software by analyzing TnSeq datasets of M. tuberculosis grown on glycerol and cholesterol. We show that TRANSIT can be used to discover genes which have been previously implicated for growth on these carbon sources. TRANSIT is written in Python, and thus can be run on Windows, OSX and Linux platforms. The source code is distributed under the GNU GPL v3 license and can be obtained from the following GitHub repository: https://github.com/mad-lab/transit. PMID:26447887

  7. CancellationTools: All-in-one software for administration and analysis of cancellation tasks.

    Science.gov (United States)

    Dalmaijer, Edwin S; Van der Stigchel, Stefan; Nijboer, Tanja C W; Cornelissen, Tim H W; Husain, Masud

    2015-12-01

    In a cancellation task, a participant is required to search for and cross out ("cancel") targets, which are usually embedded among distractor stimuli. The number of cancelled targets and their location can be used to diagnose the neglect syndrome after stroke. In addition, the organization of search provides a potentially useful way to measure executive control over multitarget search. Although many useful cancellation measures have been introduced, most fail to make their way into research studies and clinical practice due to the practical difficulty of acquiring such parameters from traditional pen-and-paper measures. Here we present new, open-source software that is freely available to all. It allows researchers and clinicians to flexibly administer computerized cancellation tasks using stimuli of their choice, and to directly analyze the data in a convenient manner. The automated analysis suite provides output that includes almost all of the currently existing measures, as well as several new ones introduced here. All tasks can be performed using either a computer mouse or a touchscreen as an input device, and an online version of the task runtime is available for tablet devices. A summary of the results is produced in a single A4-sized PDF document, including high quality data visualizations. For research purposes, batch analysis of large datasets is possible. In sum, CancellationTools allows users to employ a flexible, computerized cancellation task, which provides extensive benefits and ease of use. PMID:25381020

  8. A software tool for quality assurance of computed/digital radiography (CR/DR) systems

    Science.gov (United States)

    Desai, Nikunj; Valentino, Daniel J.

    2011-03-01

    The recommended methods to test the performance of computed radiography (CR) systems have been established by the American Association of Physicists in Medicine in Report No. 93, "Acceptance Testing and Quality Control of Photostimulable Storage Phosphor Imaging Systems". The quality assurance tests are categorized by how frequently they need to be performed. Quality assurance of CR systems is the responsibility of the facility that performs the exam and is governed by the state in which the facility is located. For example, the New York State Department of Health has established a guide which lists the tests that a CR facility must perform for quality assurance. This study aims at educating the reader about the new quality assurance requirements defined by the state. It further demonstrates an easy-to-use software tool, henceforth referred to as the Digital Physicist, developed to aid a radiologic facility in conforming to state guidelines and monitoring quality assurance of CR/DR imaging systems. The Digital Physicist provides a vendor-independent procedure for quality assurance of CR/DR systems. Further, it generates a PDF report with a brief description of these tests and the obtained results.

  9. UNBizPlanner: a software tool for preparing a business plan

    Directory of Open Access Journals (Sweden)

    Oscar Ávila Cifuentes

    2010-04-01

    Full Text Available Universities are currently expected to play a new role in society (in addition to research and teaching) by engaging in a third mission concerning socio-economic development. Universities also play an important role in encouraging entrepreneurs by training them in business planning. A business plan is a document summarising how an entrepreneur will create an organisation to exploit a business opportunity. Preparing a business plan draws on a wide range of knowledge from many business disciplines (e.g. finance, human resource management, intellectual property management, supply chain management, operations management and marketing). This article presents a computational tool for drawing up a business plan from a Colombian viewpoint, identifying the most relevant stages borne in mind by the national entities with the most experience in creating and consolidating companies. Special emphasis was placed on analysing, designing and implementing a systems development life cycle for developing the software. Reviewing the literature concerning business plans formed an important part of the analysis stage (bearing a Colombian viewpoint in mind).

  10. Development of a Kinect Software Tool to Classify Movements during Active Video Gaming.

    Science.gov (United States)

    Rosenberg, Michael; Thornton, Ashleigh L; Lay, Brendan S; Ward, Brodie; Nathan, David; Hunt, Daniel; Braham, Rebecca

    2016-01-01

    While it has been established that using full body motion to play active video games results in increased levels of energy expenditure, there is little information on the classification of human movement during active video game play in relationship to fundamental movement skills. The aim of this study was to validate software utilising Kinect sensor motion capture technology to recognise fundamental movement skills (FMS) during active video game play. Two human assessors rated jumping and side-stepping, and these assessments were compared to the Kinect Action Recognition Tool (KART) to establish a level of agreement and determine the number of movements completed during five minutes of active video game play, for 43 children (mean age 12 years 7 months ± 1 year 6 months). During five minutes of active video game play, inter-rater reliability, when examining the two human raters, was found to be higher for the jump (r = 0.94, p < .01) than the sidestep (r = 0.87, p < .01), although both were excellent. Excellent reliability was also found between human raters and the KART system for the jump (r = 0.84, p < .01) and moderate reliability for the sidestep (r = 0.6983, p < .01) during game play, demonstrating that both humans and KART had higher agreement for jumps than sidesteps in the game play condition. The results of the study provide confidence that the Kinect sensor can be used to count the number of jumps and sidesteps during five minutes of active video game play with a similar level of accuracy as human raters. However, in contrast to humans, the KART system required a fraction of the time to analyse and tabulate the results.
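
    The agreement statistics quoted are plain Pearson correlations between the two scoring sources. A minimal Python check with hypothetical per-child counts (not the study's data):

```python
from scipy.stats import pearsonr

# Hypothetical jump counts per child from the two scoring sources.
human_rater = [12, 9, 15, 7, 11, 14, 8, 10]
kart_system = [12, 8, 15, 7, 10, 14, 8, 11]

r, p = pearsonr(human_rater, kart_system)
print(f"inter-rater agreement: r = {r:.2f} (p = {p:.3f})")
```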

  11. Computer-generated holograms (CGH) realization: the integration of dedicated software tool with digital slides printer

    Science.gov (United States)

    Guarnieri, Vittorio; Francini, Franco

    1997-12-01

    The latest generation of digital printers is usually characterized by a spatial resolution high enough to allow the designer to realize a binary CGH directly on transparent film, avoiding photographic reduction techniques. These devices are able to produce slides or offset prints. Furthermore, services supplied by commercial printing companies provide an inexpensive method to rapidly verify the validity of a design by means of a test-and-trial process. Notably, this low-cost approach appears to be suitable for a didactical environment. On the basis of these considerations, a set of software tools able to design CGHs has been developed. The guidelines inspiring the work were the following: (1) a ray-tracing approach, considering the object to be reproduced as a source of spherical waves; (2) optimization and speed-up of the algorithms used, in order to produce portable code runnable on several hardware platforms. In this paper, calculation methods to obtain some fundamental geometric functions (points, lines, curves) are described. Furthermore, by the juxtaposition of these primitive functions it is possible to produce the holograms of more complex objects. Many examples of generated CGHs are presented.

  12. GTest: a software tool for graphical assessment of empirical distributions' Gaussianity.

    Science.gov (United States)

    Barca, E; Bruno, E; Bruno, D E; Passarella, G

    2016-03-01

    their request for an effective tool for addressing such difficulties motivated us to adopt the inference-by-eye paradigm and implement an easy-to-use, quick and reliable statistical tool. GTest visualizes its outcomes as a modified version of the Q-Q plot. The application has been developed in Visual Basic for Applications (VBA) within MS Excel 2010, which demonstrated all the robustness and reliability needed. GTest provides true graphical normality tests which are as reliable as any quantitative statistical approach but much easier to understand. The Q-Q plots have been integrated with the outlining of an acceptance region around the representation of the theoretical distribution, defined in accordance with the alpha level of significance and the data sample size. The test decision rule is the following: if the empirical scatterplot falls completely within the acceptance region, then it can be concluded that the empirical distribution fits the theoretical one at the given alpha level. A comprehensive case study has been carried out with simulated and real-world data in order to check the robustness and reliability of the software. PMID:26846288
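
    One conventional way to build such an acceptance region is a Kolmogorov-Smirnov band around the theoretical quantile line; whether GTest uses exactly this construction is not stated in the abstract, so the Python sketch below is offered only as one plausible realization of the decision rule.

```python
import numpy as np
from scipy import stats

def qq_with_band(sample, alpha=0.05):
    """Normal Q-Q data plus a KS-style acceptance band: if every point
    of the empirical quantile plot stays inside [lower, upper], then
    normality is not rejected at level alpha."""
    x = np.sort((sample - np.mean(sample)) / np.std(sample, ddof=1))
    n = len(x)
    probs = (np.arange(1, n + 1) - 0.5) / n      # plotting positions
    theo = stats.norm.ppf(probs)                 # theoretical quantiles
    eps = stats.ksone.ppf(1 - alpha / 2, n)      # KS critical distance
    lower = stats.norm.ppf(np.clip(probs - eps, 1e-9, 1 - 1e-9))
    upper = stats.norm.ppf(np.clip(probs + eps, 1e-9, 1 - 1e-9))
    return theo, x, lower, upper                 # plot x against theo
```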

  13. Mars, accessing the third dimension: a software tool to exploit Mars ground penetrating radars data.

    Science.gov (United States)

    Cantini, Federico; Ivanov, Anton B.

    2016-04-01

    The Mars Advanced Radar for Subsurface and Ionosphere Sounding (MARSIS), on board ESA's Mars Express, and the SHAllow RADar (SHARAD), on board NASA's Mars Reconnaissance Orbiter, are two ground penetrating radars (GPRs) intended to probe the crust of Mars and explore the subsurface structure of the planet. They have now been collecting data for about 10 years, covering a large fraction of the Martian surface. As on Earth, GPRs collect data by sending electromagnetic (EM) pulses toward the surface and listening to the return echoes occurring at the dielectric discontinuities on the planet's surface and subsurface. The wavelengths used allow MARSIS EM pulses to penetrate the crust for several kilometers. The data products (radargrams) are matrices where the x-axis spans different sampling points on the planet's surface and the y-axis is the power of the echoes over time in the listening window. No standard way to manage this kind of data is established in the planetary science community, and data analysis and interpretation very often require some knowledge of radar signal processing. Our software tool is aimed at easing access to these data, in particular for scientists without a specific background in signal processing. MARSIS and SHARAD geometrical data, such as probing point latitude and longitude and spacecraft altitude, are stored, together with relevant acquisition metadata, in a geo-enabled relational database implemented using PostgreSQL and PostGIS. Data are extracted from officially released ESA and NASA products using self-developed Python classes and scripts and inserted into the database using OGR utilities. This software is also intended to be the core of a collection of classes and scripts implementing more complex GPR data analyses. Geometrical data and metadata are exposed as WFS layers using a QGIS server, which can be further integrated with other data, such as imaging, spectroscopy and topography. Radar geometry data will be available as a part of the iMars Web

  14. Development and application of a new software tool for the basic design of flue gas cleaning processes

    Energy Technology Data Exchange (ETDEWEB)

    Schausberger, P.; Friedl, A. [Vienna Univ. of Technology, Inst. of Chemical Engineering, Group of Thermal Process Engineering and Simulation, Vienna (Austria); Wieland, A.; Reissner, H. [AE and E Austrian Energy and Environment AG, Flue Gas Cleaning Div., Raaba/Graz (Austria)

    2004-07-01

    The development of a new software tool designed to improve the basic engineering of flue-gas cleaning processes, and its specific application, is presented. The tool is based on the commercially available simulation tool IPSEpro, originating from the field of power engineering. Here, a modelling environment enables enhancement of the existing content: the substances, streams and unit operations to be included are structured in an object-oriented manner, and the corresponding steady-state mass and heat balances are set up to yield a system of equations to be solved simultaneously. (orig.)

  15. SAGES: A Suite of Freely-Available Software Tools for Electronic Disease Surveillance in Resource-Limited Settings

    Science.gov (United States)

    Lewis, Sheri L.; Feighner, Brian H.; Loschen, Wayne A.; Wojcik, Richard A.; Skora, Joseph F.; Coberly, Jacqueline S.; Blazes, David L.

    2011-01-01

    Public health surveillance is undergoing a revolution driven by advances in the field of information technology. Many countries have experienced vast improvements in the collection, ingestion, analysis, visualization, and dissemination of public health data. Resource-limited countries have lagged behind due to challenges in information technology infrastructure, public health resources, and the costs of proprietary software. The Suite for Automated Global Electronic bioSurveillance (SAGES) is a collection of modular, flexible, freely-available software tools for electronic disease surveillance in resource-limited settings. One or more SAGES tools may be used in concert with existing surveillance applications or the SAGES tools may be used en masse for an end-to-end biosurveillance capability. This flexibility allows for the development of an inexpensive, customized, and sustainable disease surveillance system. The ability to rapidly assess anomalous disease activity may lead to more efficient use of limited resources and better compliance with World Health Organization International Health Regulations. PMID:21572957

  16. Open Software Tools Applied to Jordan's National Multi-Agent Water Management Model

    Science.gov (United States)

    Knox, Stephen; Meier, Philipp; Harou, Julien; Yoon, Jim; Selby, Philip; Lachaut, Thibaut; Klassert, Christian; Avisse, Nicolas; Khadem, Majed; Tilmant, Amaury; Gorelick, Steven

    2016-04-01

    Jordan is the fourth most water scarce country in the world, where demand exceeds supply in a politically and demographically unstable context. The Jordan Water Project (JWP) aims to perform policy evaluation by modelling the hydrology, economics, and governance of Jordan's water resource system. The multidisciplinary nature of the project requires a modelling software system capable of integrating submodels from multiple disciplines into a single decision making process and communicating results to stakeholders. This requires a tool for building an integrated model and a system where diverse data sets can be managed and visualised. The integrated Jordan model is built using Pynsim, an open-source multi-agent simulation framework implemented in Python. Pynsim operates on network structures of nodes and links and supports institutional hierarchies, where an institution represents a grouping of nodes, links or other institutions. At each time step, code within each node, link and institution can be executed independently, allowing for their fully autonomous behaviour. Additionally, engines (sub-models) perform actions over the entire network or on a subset of the network, such as taking a decision on a set of nodes. Pynsim is modular in design, allowing distinct modules to be modified easily without affecting others. Data management and visualisation are performed using Hydra (www.hydraplatform.org), an open software platform allowing users to manage network structure and data. The Hydra data manager connects to Pynsim, providing necessary input parameters for the integrated model. By providing a high-level portal to the model, Hydra removes a barrier between the users of the model (researchers, stakeholders, planners etc.) and the model itself, allowing them to manage data, run the model and visualise results all through a single user interface. Pynsim's ability to represent institutional hierarchies, inter-network communication and the separation of node, link and

  17. Improvement of a free software tool for the assessment of sediment connectivity

    Science.gov (United States)

    Crema, Stefano; Lanni, Cristiano; Goldin, Beatrice; Marchi, Lorenzo; Cavalli, Marco

    2015-04-01

    Sediment connectivity expresses the degree of linkage that controls sediment fluxes throughout a landscape, in particular between sediment sources and downstream areas. The assessment of sediment connectivity becomes a key issue when dealing with risk mitigation and priorities of intervention in the territory. In this work, the authors report the improvements made to an open source, stand-alone application (SedInConnect, http://www.sedalp.eu/download/tools.shtml), along with extensive applications to alpine catchments. SedInConnect calculates a sediment connectivity index as expressed in Cavalli et al. (2013); the software improvements consisted primarily in the introduction of the sink feature, i.e. areas that act as traps for sediment produced upstream (e.g., lakes, sediment traps). Based on user-defined sinks, the software decouples those parts of the catchment that do not deliver sediment to a selected target of interest (e.g., fan apex, main drainage network). In this way the assessment of sediment connectivity is achieved by taking into consideration effective sediment contributing areas. Sediment connectivity analysis has been carried out on several catchments in the South Tyrol alpine area (Northern Italy) with the goal of achieving a fast and objective characterization of the topographic control on sediment transfer. In addition to depicting the variability of sediment connectivity inside each basin, the index of connectivity has proved to be a valuable indicator of the dominant process characterizing the basin sediment dynamics (debris flow, bedload, mixed behavior). The characterization of the dominant process is of great importance for hazard and risk assessment in mountain areas, and for the choice and design of structural and non-structural intervention measures. The recognition of the dominant sediment transport process by the index of connectivity is in agreement with evidence arising from post-event field surveys and with the application of
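
    The index of connectivity computed by SedInConnect follows Cavalli et al. (2013). A per-cell Python sketch of that definition is given below; the raster bookkeeping of the real tool (flow routing, moving-window roughness weights, sink handling) is omitted, and the argument names are mine.

```python
import numpy as np

def connectivity_index(w_up, s_up, area_up, path_d, path_w, path_s):
    """Index of connectivity for one cell, IC = log10(Dup / Ddn):
    upslope component  Dup = Wbar * Sbar * sqrt(A), with mean weight,
    mean slope and contributing area of the upslope area;
    downslope component Ddn = sum(d_i / (w_i * s_i)) accumulated along
    the flow path from the cell to the target (e.g. the fan apex)."""
    d_up = w_up * s_up * np.sqrt(area_up)
    d_dn = np.sum(np.asarray(path_d) /
                  (np.asarray(path_w) * np.asarray(path_s)))
    return np.log10(d_up / d_dn)
```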

  18. Exploiting Patterns and Tool Support for Reusable and Automated Change Support for Software Architectures

    OpenAIRE

    Aakash Ahmad; Claus Pahl; Fawad Khaliq; Onaiza Maqbool; Pooyan Jamshidi

    2016-01-01

    Lehman's law of continuing change implies that software must continually evolve to accommodate frequently changing requirements in existing systems. Also, maintainability as an attribute of system quality requires that changes be systematically implemented in existing software throughout its lifecycle. To support continuous software evolution, the primary challenges include (i) enhancing the reuse of recurring changes; and (ii) decreasing the effort for change implementation. We propose...

  19. Genoviz Software Development Kit: Java tool kit for building genomics visualization applications

    OpenAIRE

    Chervitz Stephen A; Blanchard Steven G; Erwin Ed; Blossom Eric; Nicol John W; Helt Gregg A; Harmon Cyrus; Loraine Ann E

    2009-01-01

    Abstract Background Visualization software can expose previously undiscovered patterns in genomic data and advance biological science. Results The Genoviz Software Development Kit (SDK) is an open source, Java-based framework designed for rapid assembly of visualization software applications for genomics. The Genoviz SDK framework provides a mechanism for incorporating adaptive, dynamic zooming into applications, a desirable feature of genome viewers. Visualization capabilities of the Genoviz...

  20. Programming heterogeneous MPSoCs : tool flows to close the software productivity gap

    OpenAIRE

    Castrillón, Jerónimo

    2013-01-01

    Embedded electronic systems have seen an enormous boost in the last few years, evolving from single-processor platforms running single applications to complex heterogeneous Multi-Processor Systems-on-Chip (MPSoCs) capable of executing multiple applications simultaneously. This hardware complexity together with today’s stringent application requirements makes programming a daunting task for embedded software developers. As a consequence, software schedules are rarely met so that software solut...

  1. Development of an open source software tool for propeller design in the MAAT project

    OpenAIRE

    Morgado, João Paulo Salgueiro

    2016-01-01

    This thesis presents the development of a new propeller design and analysis software capable of adequately predicting the low Reynolds number performance. JBLADE software was developed from QBLADE and XFLR5 and it uses an improved version of Blade Element Momentum (BEM) theory that embeds a new model for the three-dimensional flow equilibrium. The software allows the introduction of the blade geometry as an arbitrary number of sections characterized by their radial position, chord, twist, len...

  2. Biomedical Mutation Analysis (BMA): A software tool for analyzing mutations associated with antiviral resistance

    Science.gov (United States)

    Salvatierra, Karina; Florez, Hector

    2016-01-01

    Introduction: Hepatitis C virus (HCV) is considered a major public health problem, with 200 million people infected worldwide. The treatment for HCV chronic infection with pegylated interferon alpha plus ribavirin inhibitors is unspecific; consequently, the treatment is effective in only 50% of patients infected. This has prompted the development of direct-acting antivirals (DAA) that target virus proteins. These DAA have demonstrated a potent effect in vitro and in vivo; however, virus mutations associated with the development of resistance have been described. Objective: To design and develop an online information system for detecting mutations in amino acids known to be implicated in resistance to DAA. Materials and methods: We have used computer applications, technological tools, standard languages, infrastructure systems and algorithms, to analyze positions associated with resistance to DAA for the NS3, NS5A, and NS5B genes of HCV. Results: We have designed and developed an online information system named Biomedical Mutation Analysis (BMA), which allows users to calculate changes in nucleotide and amino acid sequences for each selected sequence from conventional Sanger and cloning sequencing using a graphical interface. Conclusion: BMA quickly, easily and effectively analyzes mutations, including complete documentation and examples. Furthermore, the development of different visualization techniques allows proper interpretation and understanding of the results. The data obtained using BMA will be useful for the assessment and surveillance of HCV resistance to new antivirals, and for the treatment regimens by selecting those DAA to which the virus is not resistant, avoiding unnecessary treatment failures. The software is available at: http://bma.itiud.org.
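
    As a rough illustration of the kind of check a tool like BMA automates, the Python sketch below scans a translated query sequence for residues reported to confer DAA resistance at fixed positions. The gene names are real HCV targets, but the positions, residues and function names are hypothetical placeholders, not BMA's actual curated data or API.

        # Hypothetical sketch; positions/residues are illustrative placeholders.
        RESISTANCE_POSITIONS = {
            # gene -> {1-based amino-acid position: residues linked to resistance}
            "NS5B": {282: {"T"}},
            "NS3": {155: {"K"}, 168: {"V"}},
        }

        def find_resistance_mutations(gene, reference, query):
            """List (position, reference residue, query residue) resistance hits."""
            hits = []
            for pos, resistant in RESISTANCE_POSITIONS.get(gene, {}).items():
                if pos > min(len(reference), len(query)):
                    continue  # sequence too short to cover this position
                ref_aa, qry_aa = reference[pos - 1], query[pos - 1]
                if qry_aa != ref_aa and qry_aa in resistant:
                    hits.append((pos, ref_aa, qry_aa))
            return hits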

  3. Development of a Kinect Software Tool to Classify Movements during Active Video Gaming

    Science.gov (United States)

    Rosenberg, Michael; Lay, Brendan S.; Ward, Brodie; Nathan, David; Hunt, Daniel; Braham, Rebecca

    2016-01-01

    While it has been established that using full body motion to play active video games results in increased levels of energy expenditure, there is little information on the classification of human movement during active video game play in relationship to fundamental movement skills. The aim of this study was to validate software utilising Kinect sensor motion capture technology to recognise fundamental movement skills (FMS) during active video game play. Two human assessors rated jumping and side-stepping and these assessments were compared to the Kinect Action Recognition Tool (KART), to establish a level of agreement and determine the number of movements completed during five minutes of active video game play, for 43 children (m = 12 years 7 months ± 1 year 6 months). During five minutes of active video game play, inter-rater reliability, when examining the two human raters, was found to be higher for the jump (r = 0.94, p < .01) than the sidestep (r = 0.87, p < .01), although both were excellent. Excellent reliability was also found between human raters and the KART system for the jump (r = 0.84, p < .01) and moderate reliability for the sidestep (r = 0.6983, p < .01) during game play, demonstrating that both humans and KART had higher agreement for jumps than sidesteps in the game play condition. The results of the study provide confidence that the Kinect sensor can be used to count the number of jumps and sidesteps during five minutes of active video game play with a similar level of accuracy as human raters. However, in contrast to humans, the KART system required a fraction of the time to analyse and tabulate the results. PMID:27442437
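
    The agreement statistic reported here is Pearson's r between paired movement counts; a minimal sketch with fabricated counts (not the study's data) is:

        from scipy.stats import pearsonr

        rater_a = [12, 8, 15, 10, 9, 14]   # jumps counted by a human rater (fabricated)
        rater_b = [11, 8, 16, 10, 10, 13]  # jumps counted by KART (fabricated)

        r, p = pearsonr(rater_a, rater_b)
        print(f"r = {r:.2f}, p = {p:.3f}")  # r near 1 indicates strong agreement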

  4. Development of a Kinect Software Tool to Classify Movements during Active Video Gaming.

    Directory of Open Access Journals (Sweden)

    Michael Rosenberg

    Full Text Available While it has been established that using full body motion to play active video games results in increased levels of energy expenditure, there is little information on the classification of human movement during active video game play in relationship to fundamental movement skills. The aim of this study was to validate software utilising Kinect sensor motion capture technology to recognise fundamental movement skills (FMS) during active video game play. Two human assessors rated jumping and side-stepping and these assessments were compared to the Kinect Action Recognition Tool (KART), to establish a level of agreement and determine the number of movements completed during five minutes of active video game play, for 43 children (m = 12 years 7 months ± 1 year 6 months). During five minutes of active video game play, inter-rater reliability, when examining the two human raters, was found to be higher for the jump (r = 0.94, p < .01) than the sidestep (r = 0.87, p < .01), although both were excellent. Excellent reliability was also found between human raters and the KART system for the jump (r = 0.84, p < .01) and moderate reliability for the sidestep (r = 0.6983, p < .01) during game play, demonstrating that both humans and KART had higher agreement for jumps than sidesteps in the game play condition. The results of the study provide confidence that the Kinect sensor can be used to count the number of jumps and sidesteps during five minutes of active video game play with a similar level of accuracy as human raters. However, in contrast to humans, the KART system required a fraction of the time to analyse and tabulate the results.

  5. A flexible, interactive software tool for fitting the parameters of neuronal models

    Directory of Open Access Journals (Sweden)

    Péter eFriedrich

    2014-07-01

    Full Text Available The construction of biologically relevant neuronal models as well as model-based analysis of experimental data often requires the simultaneous fitting of multiple model parameters, so that the behavior of the model in a certain paradigm matches (as closely as possible) the corresponding output of a real neuron according to some predefined criterion. Although the task of model optimization is often computationally hard, and the quality of the results depends heavily on technical issues such as the appropriate choice (and implementation) of cost functions and optimization algorithms, no existing program provides access to the best available methods while also guiding the user through the process effectively. Our software, called Optimizer, implements a modular and extensible framework for the optimization of neuronal models, and also features a graphical interface which makes it easy for even non-expert users to handle many commonly occurring scenarios. Meanwhile, educated users can extend the capabilities of the program and customize it according to their needs with relatively little effort. Optimizer has been developed in Python, takes advantage of open-source Python modules for nonlinear optimization, and interfaces directly with the NEURON simulator to run the models. Other simulators are supported through an external interface. We have tested the program on several different types of problem of varying complexity, using different model classes. As targets, we used simulated traces from the same or a more complex model class, as well as experimental data. We successfully used Optimizer to determine passive parameters and conductance densities in compartmental models, and to fit simple (adaptive exponential) integrate-and-fire neuronal models to complex biological data. Our detailed comparisons show that Optimizer can handle a wider range of problems, and delivers equally good or better performance than any other existing neuronal model fitting...
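
    The generic loop that a tool like Optimizer automates can be sketched as follows; the exponential "membrane trace" and the parameter names are stand-ins for a real NEURON simulation, not Optimizer's actual API.

        import numpy as np
        from scipy.optimize import minimize

        t = np.linspace(0.0, 100.0, 500)            # time base, ms
        target = -65.0 + 10.0 * np.exp(-t / 20.0)   # synthetic "recorded" trace

        def model(params):
            v_rest, amp, tau = params               # toy stand-in for a simulation
            return v_rest + amp * np.exp(-t / tau)

        def cost(params):
            return np.mean((model(params) - target) ** 2)  # mean squared error

        fit = minimize(cost, x0=[-70.0, 5.0, 10.0], method="Nelder-Mead")
        print(fit.x)                                # approaches (-65, 10, 20)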

  6. The NEPLAN software package a universal tool for electric power systems analysis

    CERN Document Server

    Kahle, K

    2002-01-01

    The NEPLAN software package has been used by CERN's Electric Power Systems Group since 1997. The software is designed for the calculation of short-circuit currents, load flow, motor start, dynamic stability, harmonic analysis and harmonic filter design. This paper describes the main features of the software package and their application to CERN's electric power systems. The implemented models of CERN's power systems are described in detail. Particular focus is given to fault calculations, harmonic analysis and filter design. Based on this software package and the CERN power network model, several recommendations are given.

  7. Development of a 3d tool for visualization of different software artifacts and their relationships

    OpenAIRE

    Montaño Ramírez, David

    2011-01-01

    This work focuses on the development of a software visualization tool that makes it possible to analyze different software artifacts, such as source code and relational databases. By its nature, the created tool proposes a metaphor based on previous developments in the field of software visualization. The first part of the thesis document presents a state of the art in the area of software visualization, including the way in which this area contributes to the process of...

  8. Tools to Support the Reuse of Software Assets for the NASA Earth Science Decadal Survey Missions

    Science.gov (United States)

    Mattmann, Chris A.; Downs, Robert R.; Marshall, James J.; Most, Neal F.; Samadi, Shahin

    2011-01-01

    The NASA Earth Science Data Systems (ESDS) Software Reuse Working Group (SRWG) is chartered with the investigation, production, and dissemination of information related to the reuse of NASA Earth science software assets. One major current objective is to engage the NASA decadal missions in areas relevant to software reuse. In this paper we report on the current status of these activities. First, we provide some background on the SRWG in general and then discuss the group's flagship recommendation, the NASA Reuse Readiness Levels (RRLs). We continue by describing areas in which mission software may be reused in the context of NASA decadal missions. We conclude the paper with pointers to future directions.

  9. Development of the software tool for generation and visualization of the finite element head model with bone conduction sounds

    Science.gov (United States)

    Nikolić, Dalibor; Milošević, Žarko; Saveljić, Igor; Filipović, Nenad

    2015-12-01

    Vibration of the skull causes a hearing sensation that we call Bone Conduction (BC) sound. There have been several investigations of the transmission properties of bone-conducted sound. The aim of this study was to develop a software tool for easy generation of a finite element (FE) model of the human head with different materials based on human head anatomy, and to calculate sound conduction through the head. The developed software tool generates a model in a few steps. The first step is to segment CT medical images (DICOM) and to generate surface mesh files (STL). Each STL file represents a different layer of the human head with different material properties (brain, CSF, different layers of the skull bone, skin, etc.). The next steps are to create a tetrahedral mesh from the obtained STL files, to define the FE model boundary conditions and to solve the FE equations. The tool uses the PAK solver, the open source software implemented in the SIFEM FP7 project, for the calculation of head vibration. The purpose of this tool is to show the impact of bone-conducted sound on the hearing system and to estimate how well the obtained results match experimental measurements.

  10. Software sensors as a tool for optimization of animal-cell cultures.

    NARCIS (Netherlands)

    Dorresteijn, P.C.

    1997-01-01

    In this thesis, software sensors are introduced that predict the biomass activity and the concentrations of glucose, glutamine, lactic acid, and ammonium on-line. The software sensors for biomass activity, glucose and lactic acid can be applied to any type of animal cell that is grown in a bioreactor.

  11. Using McIDAS-V data analysis and visualization software as an educational tool for understanding the atmosphere

    Science.gov (United States)

    Achtor, T. H.; Rink, T.

    2010-12-01

    The University of Wisconsin’s Space Science and Engineering Center (SSEC) has been at the forefront in developing data analysis and visualization tools for environmental satellites and other geophysical data. The fifth generation of the Man-computer Interactive Data Access System (McIDAS-V) is Java-based, open-source, freely available software that operates on Linux, Macintosh and Windows systems. The software tools provide powerful new data manipulation and visualization capabilities that work with geophysical data in research, operational and educational environments. McIDAS-V provides unique capabilities to support innovative techniques for evaluating research results, teaching and training. McIDAS-V is based on three powerful software elements. VisAD is a Java library for building interactive, collaborative, 4 dimensional visualization and analysis tools. The Integrated Data Viewer (IDV) is a reference application based on the VisAD system and developed by the Unidata program that demonstrates the flexibility that is needed in this evolving environment, using a modern, object-oriented software design approach. The third tool, HYDRA, allows users to build, display and interrogate multi and hyperspectral environmental satellite data in powerful ways. The McIDAS-V software is being used for training and education in several settings. The McIDAS User Group provides training workshops at its annual meeting. Numerous online tutorials with training data sets have been developed to aid users in learning simple and more complex operations in McIDAS-V, all are available online. In a University of Wisconsin-Madison undergraduate course in Radar and Satellite Meteorology, McIDAS-V is used to create and deliver laboratory exercises using case study and real time data. At the high school level, McIDAS-V is used in several exercises in our annual Summer Workshop in Earth and Atmospheric Sciences to provide young scientists the opportunity to examine data with friendly and

  12. The UP3-UP2 800 reprocessing plants control systems. Use of tools for the diagnosis, the track of control softwares and the management of technical data

    International Nuclear Information System (INIS)

    After a rapid presentation of the control systems architectures of the La Hague COGEMA reprocessing plants, details are given about the tools used to master the control and instrumentation software and technical data. The paper focuses more particularly on the CML (Software Maintenance Center) tool, which manages the software versions installed on the driving system; on the SYDDEX tool, devoted to the management of the control and instrumentation associated data and documents; and on the SAD tool, used for diagnosis assistance. (J.S.). 5 figs

  13. ROCKETSHIP: a flexible and modular software tool for the planning, processing and analysis of dynamic MRI studies

    International Nuclear Information System (INIS)

    Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is a promising technique to characterize pathology and evaluate treatment response. However, analysis of DCE-MRI data is complex and benefits from concurrent analysis of multiple kinetic models and parameters. Few software tools are currently available that specifically focuses on DCE-MRI analysis with multiple kinetic models. Here, we developed ROCKETSHIP, an open-source, flexible and modular software for DCE-MRI analysis. ROCKETSHIP incorporates analyses with multiple kinetic models, including data-driven nested model analysis. ROCKETSHIP was implemented using the MATLAB programming language. Robustness of the software to provide reliable fits using multiple kinetic models is demonstrated using simulated data. Simulations also demonstrate the utility of the data-driven nested model analysis. Applicability of ROCKETSHIP for both preclinical and clinical studies is shown using DCE-MRI studies of the human brain and a murine tumor model. A DCE-MRI software suite was implemented and tested using simulations. Its applicability to both preclinical and clinical datasets is shown. ROCKETSHIP was designed to be easily accessible for the beginner, but flexible enough for changes or additions to be made by the advanced user as well. The availability of a flexible analysis tool will aid future studies using DCE-MRI. A public release of ROCKETSHIP is available at (https://github.com/petmri/ROCKETSHIP)
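
    ROCKETSHIP itself is implemented in MATLAB; purely as a conceptual sketch of the data-driven nested model analysis it describes, the extra parameter of a richer kinetic model can be retained only when an F-test on the residual sums of squares justifies it. The numbers below are invented for illustration.

        from scipy.stats import f as f_dist

        def nested_f_test(rss_simple, p_simple, rss_full, p_full, n):
            """F statistic and p-value comparing nested least-squares fits."""
            F = ((rss_simple - rss_full) / (p_full - p_simple)) / (rss_full / (n - p_full))
            p_value = f_dist.sf(F, p_full - p_simple, n - p_full)
            return F, p_value

        # e.g. a 2-parameter vs. a 3-parameter kinetic model on a 60-point curve
        print(nested_f_test(rss_simple=4.2, p_simple=2, rss_full=3.1, p_full=3, n=60))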

  14. Numerical arc segmentation algorithm for a radio conference - A software tool for communication satellite systems planning

    Science.gov (United States)

    Whyte, W. A.; Heyward, A. O.; Ponchak, D. S.; Spence, R. L.; Zuzek, J. E.

    1988-01-01

    A detailed description of a Numerical Arc Segmentation Algorithm for a Radio Conference (NASARC) software package for communication satellite systems planning is presented. This software provides a method of generating predetermined arc segments for use in the development of an allotment planning procedure to be carried out at the 1988 World Administrative Radio Conference (WARC - 88) on the use of the GEO and the planning of space services utilizing GEO. The features of the NASARC software package are described, and detailed information is given about the function of each of the four NASARC program modules. The results of a sample world scenario are presented and discussed.

  15. AssociationViewer: a scalable and integrated software tool for visualization of large-scale variation data in genomic context.

    OpenAIRE

    Martin O.; Valsesia A.; Telenti A.; Xenarios I.; Stevenson B.J.

    2009-01-01

    SUMMARY: We present a tool designed for visualization of large-scale genetic and genomic data exemplified by results from genome-wide association studies. This software provides an integrated framework to facilitate the interpretation of SNP association studies in genomic context. Gene annotations can be retrieved from Ensembl, linkage disequilibrium data downloaded from HapMap and custom data imported in BED or WIG format. AssociationViewer integrates functionalities that enable the aggregat...

  16. Towards a Software Tool Supporting Urban Decision Makers in Locating and Sizing the Household Garbage Accumulation Points Within Cities

    OpenAIRE

    Di Felice, Paolino

    2015-01-01

    Locating and sizing garbage bins for the separate accumulation of household solid waste within urban areas is of primary interest for the local administrations that so far lack adequate IT support. The paper highlights the versatility of a method for solving such a problem, which involves both standard and geographic data. Implementation of the proposal, centered around a spatial database, goes in the direction of developing a supporting software tool to the officials responsible for the mana...

  17. IsobarPTM: A software tool for the quantitative analysis of post-translationally modified proteins

    OpenAIRE

    Breitwieser, Florian P; Colinge, Jacques

    2013-01-01

    The establishment of extremely powerful proteomics platforms able to map thousands of modification sites, e.g. phosphorylations or acetylations, over entire proteomes calls for equally powerful software tools to effectively extract useful and reliable information from such complex datasets. We present a new quantitative PTM analysis platform aimed at processing iTRAQ or Tandem Mass Tags (TMT) labeled peptides. It covers a broad range of needs associated with proper PTM ratio analysis such as ...

  18. The Seismic Tool-Kit (STK): an open source software for seismology and signal processing.

    Science.gov (United States)

    Reymond, Dominique

    2016-04-01

    We present an open source software project (GNU public license), named STK: Seismic ToolKit, that is dedicated mainly to seismology and signal processing. The STK project, which started in 2007, is hosted by SourceForge.net and counts more than 19,500 downloads at the date of writing. The STK project is composed of two main branches. First, a graphical interface dedicated to signal processing (in the SAC format, SAC_ASCII and SAC_BIN), where the signal can be plotted, zoomed, filtered, integrated, differentiated, etc. (a large variety of IIR and FIR filters is proposed). The estimation of the spectral density of the signal is performed via the Fourier transform, with visualization of the Power Spectral Density (PSD) in linear or log scale, and also the evolutive time-frequency representation (or sonagram). The 3-component signals can also be processed to estimate their polarization properties, either for a given window or for evolutive windows along the time axis. This polarization analysis is useful for extracting polarized noise, and for differentiating P waves, Rayleigh waves, Love waves, etc. Secondly, a panel of utility programs is proposed for working in terminal mode, with basic programs for computing azimuth and distance in spherical geometry, inter/auto-correlation, spectral density, time-frequency representations for an entire directory of signals, focal planes and main component axes, radiation patterns of P waves, polarization analysis of different waves (including noise), under/over-sampling of signals, cubic-spline smoothing, and linear/non-linear regression analysis of data sets. A MINimum library of Linear AlGebra (MIN-LINAG) is also provided for computing the main matrix processes, like QR/QL decomposition, Cholesky solution of linear systems, finding eigenvalues/eigenvectors, QR-solve/Eigen-solve of linear equation systems, etc. STK is developed in C/C++, mainly under Linux OS, and has also been partially implemented under MS-Windows. Useful links: http
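
    For readers unfamiliar with the operations listed above, the two most common ones, band-pass filtering and PSD estimation, can be sketched in a few lines of Python with SciPy; the sampling rate and corner frequencies below are arbitrary examples, and this is not STK's C/C++ code.

        import numpy as np
        from scipy.signal import butter, filtfilt, welch

        fs = 100.0                                   # sampling rate, Hz (assumed)
        t = np.arange(0.0, 60.0, 1.0 / fs)
        trace = np.sin(2 * np.pi * 1.5 * t) + 0.5 * np.random.randn(t.size)

        b, a = butter(4, [0.5, 5.0], btype="band", fs=fs)  # 0.5-5 Hz band-pass
        filtered = filtfilt(b, a, trace)                   # zero-phase filtering

        freqs, psd = welch(filtered, fs=fs, nperseg=1024)  # PSD estimate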

  19. GUM2DFT—a software tool for uncertainty evaluation of transient signals in the frequency domain

    Science.gov (United States)

    Eichstädt, S.; Wilkens, V.

    2016-05-01

    The Fourier transform and its counterpart for discrete time signals, the discrete Fourier transform (DFT), are common tools in measurement science and application. Although almost every scientific software package offers ready-to-use implementations of the DFT, the propagation of uncertainties in line with the guide to the expression of uncertainty in measurement (GUM) is typically neglected. This is of particular importance in dynamic metrology, when input estimation is carried out by deconvolution in the frequency domain. To this end, we present the new open-source software tool GUM2DFT, which utilizes closed formulas for the efficient propagation of uncertainties for the application of the DFT, inverse DFT and input estimation in the frequency domain. It handles different frequency domain representations, accounts for autocorrelation and takes advantage of the symmetry inherent in the DFT result for real-valued time domain signals. All tools are presented in terms of examples which form part of the software package. GUM2DFT will foster GUM-compliant evaluation of uncertainty in a DFT-based analysis and enable metrologists to include uncertainty evaluations in their routine work.
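
    GUM2DFT itself uses closed formulas for the uncertainty propagation; as an independent conceptual cross-check (not the tool's method), the same propagation can be approximated by Monte Carlo sampling of the time-domain signal:

        import numpy as np

        rng = np.random.default_rng(0)
        n, u = 256, 0.05                          # samples; std. uncertainty per sample
        t = np.arange(n) / n
        x = np.sin(2 * np.pi * 5 * t)             # nominal time-domain signal

        draws = rng.normal(x, u, size=(2000, n))  # perturbed realisations of the signal
        X = np.fft.rfft(draws, axis=1)            # DFT of every realisation

        amp = np.abs(X).mean(axis=0)              # amplitude spectrum estimate
        amp_unc = np.abs(X).std(axis=0)           # its Monte Carlo uncertainty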

  20. Towards a publicly available, map-based regional software tool to estimate unregulated daily streamflow at ungauged rivers

    Directory of Open Access Journals (Sweden)

    S. A. Archfield

    2013-01-01

    Full Text Available Streamflow information is critical for addressing any number of hydrologic problems. Often, streamflow information is needed at locations that are ungauged and, therefore, have no observations on which to base water management decisions. Furthermore, there has been increasing need for daily streamflow time series to manage rivers for both human and ecological functions. To facilitate negotiation between human and ecological demands for water, this paper presents the first publicly available, map-based, regional software tool to estimate historical, unregulated, daily streamflow time series (streamflow not affected by human alteration such as dams or water withdrawals) at any user-selected ungauged river location. The map interface allows users to locate and click on a river location, which then links to a spreadsheet-based program that computes estimates of daily streamflow for the river location selected. For a demonstration region in the northeast United States, daily streamflow was, in general, shown to be reliably estimated by the software tool. Estimating the highest and lowest streamflows that occurred in the demonstration region over the period from 1960 through 2004 also was accomplished but with more difficulty and limitations. The software tool provides a general framework that can be applied to other regions for which daily streamflow estimates are needed.

  1. GUM2DFT—a software tool for uncertainty evaluation of transient signals in the frequency domain

    International Nuclear Information System (INIS)

    The Fourier transform and its counterpart for discrete time signals, the discrete Fourier transform (DFT), are common tools in measurement science and application. Although almost every scientific software package offers ready-to-use implementations of the DFT, the propagation of uncertainties in line with the guide to the expression of uncertainty in measurement (GUM) is typically neglected. This is of particular importance in dynamic metrology, when input estimation is carried out by deconvolution in the frequency domain. To this end, we present the new open-source software tool GUM2DFT, which utilizes closed formulas for the efficient propagation of uncertainties for the application of the DFT, inverse DFT and input estimation in the frequency domain. It handles different frequency domain representations, accounts for autocorrelation and takes advantage of the symmetry inherent in the DFT result for real-valued time domain signals. All tools are presented in terms of examples which form part of the software package. GUM2DFT will foster GUM-compliant evaluation of uncertainty in a DFT-based analysis and enable metrologists to include uncertainty evaluations in their routine work. (paper)

  2. Methods and tools for dynamic requirements catalog management in agile software development

    OpenAIRE

    Tkachuk, M. V.; Gamzaev, R. A.; Martinkus, I. O.; Ianushkevych, S. D.

    2015-01-01

    A method for managing a dynamic requirements catalog in agile software development, illustrated with the Scrum methodology, is proposed. Popular approaches to solving this problem are reviewed. The proposed approach is based on the combined usage of latent semantic analysis and the analytic hierarchy process; it allows the given textual software specifications to be evaluated with respect to possible redundancy and possible logical conflicts. Besides that, this approach supports the decisi...

  3. TU-C-17A-03: An Integrated Contour Evaluation Software Tool Using Supervised Pattern Recognition for Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Chen, H; Tan, J; Kavanaugh, J; Dolly, S; Gay, H; Thorstad, W; Anastasio, M; Altman, M; Mutic, S; Li, H [Washington University School of Medicine, Saint Louis, MO (United States)

    2014-06-15

    Purpose: Radiotherapy (RT) contours delineated either manually or semiautomatically require verification before clinical usage. Manual evaluation is very time-consuming. A new integrated software tool using supervised pattern contour recognition was thus developed to facilitate this process. Methods: The contouring tool was developed using an object-oriented programming language C# and application programming interfaces, e.g., the Visualization Toolkit (VTK). The C# language served as the tool design basis. The Accord.Net scientific computing libraries were utilized for the required statistical data processing and pattern recognition, while the VTK was used to build and render 3-D mesh models from critical RT structures in real-time and 360° visualization. Principal component analysis (PCA) was used for system self-updating geometry variations of normal structures based on physician-approved RT contours as a training dataset. The in-house supervised PCA-based contour recognition method was used for automatically evaluating contour normality/abnormality. The function for reporting the contour evaluation results was implemented by using C# and Windows Form Designer. Results: The software input was RT simulation images and RT structures from commercial clinical treatment planning systems. Several abilities were demonstrated: automatic assessment of RT contours, file loading/saving of various modality medical images and RT contours, and generation/visualization of 3-D images and anatomical models. Moreover, it supported the 360° rendering of the RT structures in a multi-slice view, which allows physicians to visually check and edit abnormally contoured structures. Conclusion: This new software integrates the supervised learning framework with image processing and graphical visualization modules for RT contour verification. This tool has great potential for facilitating treatment planning with the assistance of an automatic contour evaluation module in avoiding...

  4. TU-C-17A-03: An Integrated Contour Evaluation Software Tool Using Supervised Pattern Recognition for Radiotherapy

    International Nuclear Information System (INIS)

    Purpose: Radiotherapy (RT) contours delineated either manually or semiautomatically require verification before clinical usage. Manual evaluation is very time-consuming. A new integrated software tool using supervised pattern contour recognition was thus developed to facilitate this process. Methods: The contouring tool was developed using an object-oriented programming language C# and application programming interfaces, e.g., the Visualization Toolkit (VTK). The C# language served as the tool design basis. The Accord.Net scientific computing libraries were utilized for the required statistical data processing and pattern recognition, while the VTK was used to build and render 3-D mesh models from critical RT structures in real-time and 360° visualization. Principal component analysis (PCA) was used for system self-updating geometry variations of normal structures based on physician-approved RT contours as a training dataset. The in-house supervised PCA-based contour recognition method was used for automatically evaluating contour normality/abnormality. The function for reporting the contour evaluation results was implemented by using C# and Windows Form Designer. Results: The software input was RT simulation images and RT structures from commercial clinical treatment planning systems. Several abilities were demonstrated: automatic assessment of RT contours, file loading/saving of various modality medical images and RT contours, and generation/visualization of 3-D images and anatomical models. Moreover, it supported the 360° rendering of the RT structures in a multi-slice view, which allows physicians to visually check and edit abnormally contoured structures. Conclusion: This new software integrates the supervised learning framework with image processing and graphical visualization modules for RT contour verification. This tool has great potential for facilitating treatment planning with the assistance of an automatic contour evaluation module in avoiding...

  5. Evaluation of Software Static Analysis Tools

    Institute of Scientific and Technical Information of China (English)

    王凯; 孔祥营

    2011-01-01

    To find more software defects during the coding phase of the software lifecycle, and thereby decrease cost and development time, it is necessary to carry out static analysis of the source code under test; the most effective means of doing so is to use static analysis tools. Focusing on common defects in C programs, and through a comparison of the functionality of several popular static analysis tools, we discuss the advantages and shortcomings of such tools as well as the many factors that influence the choice of a static analysis tool. These factors provide a reference for software testers selecting among static analysis tools.

  6. Tools for quantitative form description; an evaluation of different software packages for semi-landmark analysis.

    Science.gov (United States)

    Botton-Divet, Léo; Houssaye, Alexandra; Herrel, Anthony; Fabre, Anne-Claire; Cornette, Raphael

    2015-01-01

    The challenging complexity of biological structures has led to the development of several methods for quantitative analyses of form. Bones are shaped by the interaction of historical (phylogenetic), structural, and functional constraints. Consequently, bone shape has been investigated intensively in an evolutionary context. Geometric morphometric approaches allow the description of the shape of an object in all of its biological complexity. However, when biological objects present only few anatomical landmarks, sliding semi-landmarks may provide good descriptors of shape. The sliding procedure, mandatory for sliding semi-landmarks, requires several steps that may be time-consuming. We here compare the time required by two different software packages ('Edgewarp' and 'Morpho') for the same sliding task, and investigate potential differences in the results and biological interpretation. 'Morpho' is much faster than 'Edgewarp,' notably as a result of the greater computational power of the 'Morpho' software routines and the complexity of the 'Edgewarp' workflow. Morphospaces obtained using both software packages are similar and provide a consistent description of the biological variability. The principal differences between the two software packages are observed in areas characterized by abrupt changes in the bone topography. In summary, both software packages perform equally well in terms of the description of biological structures, yet differ in the simplicity of the workflow and time needed to perform the analyses. PMID:26618086
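
    Both packages operate on sliding semi-landmarks; the common first step of any such workflow, a Procrustes superimposition, can be illustrated with SciPy on two toy landmark configurations (this is neither 'Edgewarp' nor 'Morpho' code):

        import numpy as np
        from scipy.spatial import procrustes

        shape_a = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
        shape_b = shape_a * 2.0 + 0.5             # scaled and translated copy

        mtx1, mtx2, disparity = procrustes(shape_a, shape_b)
        print(disparity)  # ~0: identical shapes once size/position/rotation removed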

  7. Development of a software tool and criteria evaluation for efficient design of small interfering RNA.

    Science.gov (United States)

    Chaudhary, Aparna; Srivastava, Sonam; Garg, Sanjeev

    2011-01-01

    RNA interference can be used as a tool for gene silencing mediated by small interfering RNAs (siRNA). The critical step in effective and specific RNAi processing is the selection of suitable constructs. Major design criteria, i.e., Reynolds's design rules, thermodynamic stability, internal repeats, immunostimulatory motifs were emphasized and implemented in the siRNA design tool. The tool provides thermodynamic stability score, GC content and a total score based on other design criteria in the output. The viability of the tool was established with different datasets. In general, the siRNA constructs produced by the tool had better thermodynamic score and positional properties. Comparable thermodynamic scores and better total scores were observed with the existing tools. Moreover, the results generated had comparable off-target silencing effect. Criteria evaluations with additional criteria were achieved in WEKA. PMID:21145307
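
    Two of the reported criteria, GC content and a Reynolds-style positional rule, are easy to illustrate; the scoring weights below are invented for the example and are not those of the published tool.

        def gc_content(seq):
            """GC content of a DNA sequence, in percent."""
            seq = seq.upper()
            return 100.0 * (seq.count("G") + seq.count("C")) / len(seq)

        def simple_score(sense_19mer):
            """Toy score: +1 for a Reynolds-style GC window, +1 for 3' A/T (U in RNA)."""
            score = 0
            if 30.0 <= gc_content(sense_19mer) <= 52.0:
                score += 1
            if sense_19mer[-1] in "AT":
                score += 1
            return score

        candidate = "GCAAGCTGACCCTGAAGTT"  # arbitrary example sequence
        print(gc_content(candidate), simple_score(candidate))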

  8. Development of a case tool to support decision based software development

    Science.gov (United States)

    Wild, Christian J.

    1993-01-01

    A summary of the accomplishments of the research over the past year is presented. Achievements include: demonstrations of DHC, a prototype supporting the decision based software development (DBSD) methodology, for Paramax personnel at ODU; meetings with Paramax personnel to discuss DBSD issues, the process of integrating DBSD and Refinery, and the porting process model; completion and submission of a paper describing the DBSD paradigm to IFIP '92; completion and presentation of a paper describing the approach for software reuse at the Software Reuse Workshop in April 1993; continued extension of DHC with a project agenda, a facility necessary for better project management; completion of a primary draft of the re-engineering process model for porting; creation of a logging form to trace all the activities involved in the process of solving the re-engineering problem; and development of a primary chart of the problems involved in the re-engineering process.

  9. A Systemic Approach to the Preservation of Audio Documents: Methodology and Software Tools

    Directory of Open Access Journals (Sweden)

    Federica Bressan

    2013-01-01

    protocol reflects the methodological principles adopted by the authors, and its effectiveness is based on the results obtained in recent research projects involving some of the finest audio archives in Europe. Some recommendations are given for the re-recording process, aimed at minimizing the information loss and at quantifying the unintentional alterations introduced by the technical equipment. Finally, the paper introduces an original software system that guides and supports the preservation staff along the process, reducing processing time, automating tasks, minimizing errors, and using information-hiding strategies to ease the cognitive load. Currently the software system is in use in several international archives.

  10. Software module for geometric product modeling and NC tool path generation

    International Nuclear Information System (INIS)

    An intelligent CAD/CAM system named VIRTUAL MANUFACTURE has been created. It consists of four intelligent software modules: the module for virtual NC machine creation, the module for geometric product modeling and automatic NC path generation, the module for virtual NC machining, and the module for virtual product evaluation. In this paper the second intelligent software module is presented. This module enables feature-based product modeling carried out via automatic saving of the designed product's geometric features as knowledge data. The knowledge data are afterwards applied for automatic generation of the NC program for machining of the designed product. (Author)

  11. APASVO: A free software tool for automatic P-phase picking and event detection in seismic traces

    Science.gov (United States)

    Romero, José Emilio; Titos, Manuel; Bueno, Ángel; Álvarez, Isaac; García, Luz; Torre, Ángel de la; Benítez, M.a. Carmen

    2016-05-01

    The accurate estimation of the arrival time of seismic waves, or picking, is a problem of major interest in seismic research given its relevance in many seismological applications, such as earthquake source location and active seismic tomography. In recent decades, several automatic picking methods have been proposed with the ultimate goal of implementing picking algorithms whose results are comparable to those obtained by manual picking. In order to facilitate the use of these automated methods in the analysis of seismic traces, this paper presents a new free, open-source graphical software tool, named APASVO, which allows picking tasks to be performed in an easy and user-friendly way. The tool also provides event detection functionality, where a relatively imprecise estimation of the onset time is sufficient. The application implements the STA-LTA detection algorithm and the AMPA picking algorithm. An autoregressive AIC-based picking method can also be applied. In addition, the graphical tool is complemented by two additional command-line tools, an event picking tool and a synthetic earthquake generator. APASVO is a multiplatform tool that works on Windows, Linux and OS X. The application can process data in a large variety of file formats. It is implemented in Python and relies on well-known scientific computing packages such as ObsPy, NumPy, SciPy and Matplotlib.
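
    The STA-LTA detector mentioned above has a well-known textbook form, the ratio of a short-term to a long-term average of signal energy; a plain NumPy sketch (not APASVO's implementation) is:

        import numpy as np

        def sta_lta(trace, fs, sta_win=1.0, lta_win=30.0):
            """STA/LTA ratio of a 1-D trace sampled at fs Hz."""
            energy = np.asarray(trace, dtype=float) ** 2
            ns, nl = int(sta_win * fs), int(lta_win * fs)
            sta = np.convolve(energy, np.ones(ns) / ns, mode="same")  # short average
            lta = np.convolve(energy, np.ones(nl) / nl, mode="same")  # long average
            lta[lta == 0] = np.finfo(float).tiny                      # avoid /0
            return sta / lta

        # a detection is declared where the ratio exceeds a threshold, e.g.:
        # onsets = np.where(sta_lta(trace, fs=100.0) > 3.0)[0]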

  12. Apache Open Climate Workbench: Building Open Source Climate Science Tools and Community at the Apache Software Foundation

    Science.gov (United States)

    Joyce, M.; Ramirez, P.; Boustani, M.; Mattmann, C. A.; Khudikyan, S.; McGibbney, L. J.; Whitehall, K. D.

    2014-12-01

    Apache Open Climate Workbench (OCW; https://climate.apache.org/) is a Top-Level Project at the Apache Software Foundation that aims to provide a suite of tools for performing climate science evaluations using model outputs from a multitude of different sources (ESGF, CORDEX, U.S. NCA, NARCCAP) with remote sensing data from NASA, NOAA, and other agencies. Apache OCW is the second NASA project to become a Top-Level Project at the Apache Software Foundation. It grew out of the Jet Propulsion Laboratory's (JPL) Regional Climate Model Evaluation System (RCMES) project, a collaboration between JPL and the University of California, Los Angeles' Joint Institute for Regional Earth System Science and Engineering (JIFRESSE). Apache OCW provides scientists and developers with tools for data manipulation, metrics for dataset comparisons, and a visualization suite. In addition to a powerful low-level API, Apache OCW also supports a web application for quick, browser-controlled evaluations, a command line application for local evaluations, and a virtual machine for isolated experimentation with minimal setup. This talk will look at the difficulties and successes of moving a closed community research project out into the wild world of open source. We'll explore the growing pains Apache OCW went through to become a Top-Level Project at the Apache Software Foundation as well as the benefits gained by opening up development to the broader climate and computer science communities.

  13. Development of a software tool and criteria evaluation for efficient design of small interfering RNA

    International Nuclear Information System (INIS)

    Research highlights: → The developed tool predicted siRNA constructs with better thermodynamic stability and total score based on positional and other criteria. → Off-target silencing below score 30 were observed for the best siRNA constructs for different genes. → Immunostimulation and cytotoxicity motifs considered and penalized in the developed tool. → Both positional and compositional criteria were observed to be important. -- Abstract: RNA interference can be used as a tool for gene silencing mediated by small interfering RNAs (siRNA). The critical step in effective and specific RNAi processing is the selection of suitable constructs. Major design criteria, i.e., Reynolds's design rules, thermodynamic stability, internal repeats, immunostimulatory motifs were emphasized and implemented in the siRNA design tool. The tool provides thermodynamic stability score, GC content and a total score based on other design criteria in the output. The viability of the tool was established with different datasets. In general, the siRNA constructs produced by the tool had better thermodynamic score and positional properties. Comparable thermodynamic scores and better total scores were observed with the existing tools. Moreover, the results generated had comparable off-target silencing effect. Criteria evaluations with additional criteria were achieved in WEKA.

  14. The Comprehensive Evaluation of Electronic Learning Tools and Educational Software (CEELTES)

    Science.gov (United States)

    Karolcík, Štefan; Cipková, Elena; Hrušecký, Roman; Veselský, Milan

    2015-01-01

    Despite the fact that digital technologies are more and more used in the learning and education process, there is still lack of professional evaluation tools capable of assessing the quality of used digital teaching aids in a comprehensive and objective manner. Construction of the Comprehensive Evaluation of Electronic Learning Tools and…

  15. Featureous: A Tool for Feature-Centric Analysis of Java Software

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    mappings between features and source code. To support programmers in overcoming these difficulties, we present a feature-centric analysis tool, Featureous. Our tool extends the NetBeans IDE with mechanisms for efficient location of feature implementations in legacy source code, and an extensive analysis of...

  16. Development of a software tool and criteria evaluation for efficient design of small interfering RNA

    Energy Technology Data Exchange (ETDEWEB)

    Chaudhary, Aparna; Srivastava, Sonam [Department of Chemical Engineering, IIT Kanpur, UP 208 016 (India); Garg, Sanjeev, E-mail: sgarg@iitk.ac.in [Department of Chemical Engineering, IIT Kanpur, UP 208 016 (India)

    2011-01-07

    Research highlights: → The developed tool predicted siRNA constructs with better thermodynamic stability and total score based on positional and other criteria. → Off-target silencing below score 30 were observed for the best siRNA constructs for different genes. → Immunostimulation and cytotoxicity motifs considered and penalized in the developed tool. → Both positional and compositional criteria were observed to be important. -- Abstract: RNA interference can be used as a tool for gene silencing mediated by small interfering RNAs (siRNA). The critical step in effective and specific RNAi processing is the selection of suitable constructs. Major design criteria, i.e., Reynolds's design rules, thermodynamic stability, internal repeats, immunostimulatory motifs were emphasized and implemented in the siRNA design tool. The tool provides thermodynamic stability score, GC content and a total score based on other design criteria in the output. The viability of the tool was established with different datasets. In general, the siRNA constructs produced by the tool had better thermodynamic score and positional properties. Comparable thermodynamic scores and better total scores were observed with the existing tools. Moreover, the results generated had comparable off-target silencing effect. Criteria evaluations with additional criteria were achieved in WEKA.

  17. LevRad software as a tool to learn how to proceed with a shielding adequacy analysis

    International Nuclear Information System (INIS)

    Since the discovery of X-rays by Roentgen in 1895, several recommendations about the hazards of this radiation source have been published. About 14% of the total annual worldwide collective effective dose originates from diagnostic X-ray examinations. In the UK, the collective effective dose from diagnostic X-ray examinations represents about 90% of the dose from all artificial sources. Diverse strategies have been pursued in an attempt to reduce the worldwide collective effective dose. We developed the LevRad software with the aim of teaching how to carry out an analysis of shielding barriers against diagnostic X-rays, of minimizing the contact of the professional or the student with X-rays, and, finally, of preventing wear of the X-ray equipment. Tests of the software were made, and preliminary results indicate that LevRad is efficient as a complementary tool for training professionals in diagnostic radiology. In the case of students, the advantage is perceived when the software is used before the first contact with the X-ray equipment. The software introduces solid knowledge about shielding adequacy analysis, prevents the wear of the X-ray tube that recurrent teaching of shielding adequacy analysis on real equipment would cause, and reduces the collective effective dose by avoiding possible unnecessary exposures. (author)

  18. Evaluating Difficulty Levels of Dynamic Geometry Software Tools to Enhance Teachers' Professional Development

    Science.gov (United States)

    Hohenwarter, Judith; Hohenwarter, Markus; Lavicza, Zsolt

    2010-01-01

    This paper describes a study aimed to identify commonly emerging impediments related to the introduction of dynamic mathematics software. We report on the analysis of data collected during a three-week professional development programme organised for middle and high school teachers in Florida. The study identified challenges that participants face…

  19. Programming Languages or Generic Software Tools, for Beginners' Courses in Computer Literacy?

    Science.gov (United States)

    Neuwirth, Erich

    1987-01-01

    Discussion of methods that can be used to teach beginner courses in computer literacy focuses on students aged 10-12. The value of using a programing language versus using a generic software package is highlighted; Logo and Prolog are reviewed; and the use of databases is discussed. (LRW)

  20. Basis of Estimate Software Tool (BEST) - a practical solution to part of the cost and schedule integration puzzle

    International Nuclear Information System (INIS)

    The Basis of Estimate Software Tool (BEST) was developed at the Rocky Flats Environmental Technology Site (Rocky Flats) to bridge the gap that exists in conventional project control systems between scheduled activities, their allocated or assigned resources, and the set of assumptions (basis of estimate) that correlate resources and activities. Having a documented and auditable basis of estimate (BOE) is necessary for budget validation, work scope analysis, change control, and a number of related management control functions. The uniqueness of BEST is demonstrated by the manner in which it responds to the diverse needs of the heavily regulated environmental workplace - containing many features not found in conventional off-the-shelf software products. However, even companies dealing in relatively unregulated workplaces will find many attractive features in BEST. This product will be of particular interest to current Government contractors and contractors preparing proposals that may require subsequent validation. 2 figs

  1. The Application of Intentional Subjective Properties and Mediated Communication Tools to Software Agents in Online Disputes Resolution Environments

    Directory of Open Access Journals (Sweden)

    Renzo Gobbin

    2004-11-01

    Full Text Available This paper examines the use of subjective properties in modeling an architecture for cooperative agents using Agent Communication Language (ACL that is used as a mediating tool for cooperative communication activities between and within software agents. The role that subjective and objective properties have in explaining and modeling agent internalization and externalization of ACL messages is investigated and related to Vygotsky’s developmental learning theories such as Mediated Activity Theory. A novel agent architecture ALMA (Agent Language Mediated Activity based on the integration of agents’ subjective and objective properties within an agent communication activity framework will be presented. The relevance of software agents subjective properties in modeling applications such as e-Law Online Dispute Resolution for e-business contractual arrangements using natural language subject/object relation in their communication patterns will be discussed.

  2. AcquiTools: A new Software Toolkit for the Efficient Preparation of DMC-Ready Waveform Data

    Science.gov (United States)

    Golden, S.

    2009-12-01

    Many seismic projects make use of the excellent infrastructure provided by the IRIS Data Management Center (DMC) for archival and distribution of waveform data. This usually requires the data to be submitted to the DMC in SEED (Standard for the Exchange of Earthquake Data) format. Therefore some current data loggers are already recording data in a waveform-only subset of the SEED format called miniSEED. Nevertheless, recordings from other data loggers, such as the Reftek RT130, first need to be converted. One standard procedure to do this for RT130 data involves a software package distributed by the IRIS PASSCAL Instrument Center. Its use requires the sequential application of a minimum of two conversion programs. This number of conversion programs will increase if more than a minimum of manipulations is needed. A new tool, named “ckreftekt”, was developed to combine several of these processing steps into one. Thereby, it simplifies the low-level data processing to a point where a relatively simple shell script is sufficient to process the data set of an entire experiment. This makes the low-level data processing automatically reproducible. As side effects, computation time and disk usage are significantly reduced. So far the tool has only been used in-house. Thereby, more than 2 TB of waveform data have been processed and submitted to the DMC, mostly from the High Lava Plains (HLP) and Structural Change projects. The program “ckreftekt” is part of a larger new software toolkit named AcquiTools, which attempts to simplify a series of similar low-level data handling processes. This contribution is an attempt to introduce AcquiTools as an open source tool to the community, and to gather feedback on where its development should be headed in the future.
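
    AcquiTools' own interface is not shown here; for orientation, the same RT130-to-miniSEED conversion can be sketched with the independent ObsPy library, which reads Reftek 130 packet files directly (the file names below are placeholders):

        from obspy import read

        st = read("2009123.000000.9F58.1", format="REFTEK130")  # placeholder path
        st.merge(method=1)                    # join contiguous segments per channel
        st.write("station_day.mseed", format="MSEED")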

  3. CUSTOMER RESPONSE TO BESTPRACTICES TRAINING AND SOFTWARE TOOLS PROVIDED BY DOE'S INDUSTRIAL TECHNOLOGIES PROGRAM

    Energy Technology Data Exchange (ETDEWEB)

    Schweitzer, Martin [ORNL; Martin, Michaela A [ORNL; Schmoyer, Richard L [ORNL

    2008-03-01

    The BestPractices program area, which has evolved into the Save Energy Now (SEN) Initiative, is a component of the U.S. Department of Energy's (DOE's) Industrial Technologies Program (ITP) that provides technical assistance and disseminates information on energy-efficient technologies and practices to U.S. industrial firms. The BestPractices approach to information dissemination includes conducting training sessions which address energy-intensive systems (compressed air, steam, process heat, pumps, motors, and fans) and distributing DOE software tools on those same topics. The current report documents a recent Oak Ridge National Laboratory (ORNL) study undertaken to determine the implementation rate, attribution rate, and reduction factor for industrial end-users who received BestPractices training and registered software in FY 2006. The implementation rate is the proportion of service recipients taking energy-saving actions as a result of the service received. The attribution rate applies to those individuals taking energy-saving actions as a result of the services received and represents the portion of the savings achieved through those actions that is due to the service. The reduction factor is the saving that is realized from program-induced measures as a proportion of the potential savings that could be achieved if all service recipients took action. In addition to examining those factors, the ORNL study collected information on selected characteristics of service recipients, the perceived value of the services provided, and the potential energy savings that can be achieved through implementation of measures identified from the training or software. Because the provision of training is distinctly different from the provision of software tools, the two efforts were examined independently and the findings for each are reported separately.

  4. A System Level Tool for Translating Software to Reconfigurable Hardware Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In this research we will develop a system level tool to translate binary code of a general-purpose processor into Register Transfer Level VHDL code to be mapped...

  5. Design and Implementation of Software Testing Process Management Tools

    Institute of Scientific and Technical Information of China (English)

    王象刚

    2014-01-01

    With the development of technology and the popularity of computers, software applications are used ever more widely. As the guarantee of software quality, software testing is particularly important. This paper first describes software testing management, then analyzes software testing process management tools, and finally explains in detail the design and implementation of a software test management tool.

  6. Establishing a Web-Based DICOM Teaching File Authoring Tool Using Open-Source Public Software

    OpenAIRE

    Lee, Wen-Jeng; Yang, Chung-Yi; Liu, Kao-Lang; Liu, Hon-Man; Ching, Yu-Tai; Chen, Shyh-Jye

    2005-01-01

    Online teaching files are an important source of educational and referential materials in the radiology community. The commonly used Digital Imaging and Communications in Medicine (DICOM) file format of the radiology community is not natively supported by common Web browsers. The ability of the Web server to convert and parse DICOM is important when the DICOM-converting tools are not available. In this paper, we describe our approach to develop a Web-based teaching file authoring tool. Our se...

  7. PTaaS: Platform for Providing Software Developing Applications and Tools as a Service

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali

    2014-01-01

    Cloud computing has become an established paradigm for enabling organizations to build scalable software systems and to meet the challenges of rapidly growing demand for computing and storage resources. There has been significant success in building cloud-enabled applications for many disciplines, ranging from...... web based and mobile applications to intensive video and data processing systems. This initial success of cloud has opened new horizons for more complex domains. Global Software Development (GSD) is one such domain. GSD differs from traditional application domains because of the involvement...... of aligning their processes and establishing the technology support needed to facilitate working according to new processes. Although the benefits of using cloud computing to solve GSD issues have been discussed in the literature, there has not been a significant attempt to provide fully functional...

  8. Providing a Connection between a Bayesian Inverse Modeling Tool and a Coupled Hydrogeological Processes Modeling Software

    Science.gov (United States)

    Frystacky, H.; Osorio-Murillo, C. A.; Over, M. W.; Kalbacher, T.; Gunnell, D.; Kolditz, O.; Ames, D.; Rubin, Y.

    2013-12-01

    The Method of Anchored Distributions (MAD) is a Bayesian technique for characterizing the uncertainty in geostatistical model parameters. Open-source software has been developed in a modular framework such that this technique can be applied to any forward model software via a driver. This presentation is about the driver that has been developed for OpenGeoSys (OGS), open-source software that can simulate many hydrogeological processes, including coupled processes. MAD allows the use of multiple data types for conditioning the spatially random fields and assessing model parameter likelihood. For example, if simulating flow and mass transport, the inversion target variable could be hydraulic conductivity and the inversion data types could be head, concentration, or both. The driver detects from the OGS files which processes and variables are being used in a given project and allows MAD to prompt the user to choose those that are to be modeled or to be treated deterministically. In this way, any combination of processes allowed by OGS can have MAD applied. As for the software, there are two versions, each with its own OGS driver. A Windows desktop version is available as a graphical user interface and is ideal for the learning and teaching environment. High-throughput computing can even be achieved with this version via HTCondor when large projects are pursued in a computer lab. In addition to this desktop application, a Linux version is available equipped with MPI such that it can be run in parallel on a computer cluster. All releases can be downloaded from the MAD Codeplex site given below.

  9. A software tool for analysis and quantification of regional pulmonary ventilation using dynamic hyperpolarised-3He-MRI

    International Nuclear Information System (INIS)

    Purpose: 3He-MRI is able to visualize the regional distribution of lung ventilation with a temporal and spatial resolution so far unmatched by any other technique. The main aim of the study was the development of a new software tool for quantification of dynamic ventilation parameters in absolute physical units. Materials and Methods: During continuous breathing, a bolus of hyperpolarized 3He (300 ml) was applied at inspiration and a series of 168 coronal projection images was simultaneously acquired using a 2D FLASH sequence. Postprocessing software was developed to analyze the 3He distribution in the lung. After correction for lung motion, several ventilation parameters (rise time, delay time, 3He amount and 3He peak flow) were calculated. Due to normalization of signal intensities, these parameters are presented in absolute physical units. The data sets were analyzed on a ROI basis as well as on a pixel-by-pixel basis. Results: Using the developed software, the measurements were analyzed in 6 lung-healthy volunteers, in one patient after lung transplantation, and in one patient with lung emphysema. The volunteers' parameter maps from the pixel-based analysis showed an almost homogeneous distribution of the ventilation parameters within the lung. In the parameter maps of both patients, regions with poor ventilation were observed. Conclusion: The developed software permits an objective and quantitative analysis of regional lung ventilation in absolute physical units. The clinical significance of the parameters, however, has to be determined in larger clinical studies. The software may become valuable in grading and following pulmonary function as well as in monitoring therapy. (orig.)
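
    A minimal sketch of how such per-pixel ventilation parameters can be extracted from a time-intensity curve, using synthetic data. The 10%/90% threshold definitions are common conventions assumed here for illustration; the original software's exact definitions may differ.

```python
# Extract ventilation parameters from one pixel's time-intensity curve.
# Synthetic wash-in signal; thresholds and definitions are illustrative.
import numpy as np

t = np.linspace(0.0, 8.0, 168)                   # frame times [s]
signal = 1.0 / (1.0 + np.exp(-(t - 3.0) / 0.4))  # synthetic bolus wash-in

s_min, s_max = signal.min(), signal.max()
lo = s_min + 0.1 * (s_max - s_min)
hi = s_min + 0.9 * (s_max - s_min)

delay_time = t[np.argmax(signal >= lo)]               # first time above 10 %
rise_time = t[np.argmax(signal >= hi)] - delay_time   # 10 %..90 % interval
he_amount = signal[-1] - s_min                        # plateau ~ 3He amount
peak_flow = np.max(np.gradient(signal, t))            # steepest increase

print(delay_time, rise_time, he_amount, peak_flow)
```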

  10. Genoviz Software Development Kit: Java tool kit for building genomics visualization applications

    Directory of Open Access Journals (Sweden)

    Chervitz Stephen A

    2009-08-01

    Full Text Available Abstract Background Visualization software can expose previously undiscovered patterns in genomic data and advance biological science. Results The Genoviz Software Development Kit (SDK) is an open source, Java-based framework designed for rapid assembly of visualization software applications for genomics. The Genoviz SDK framework provides a mechanism for incorporating adaptive, dynamic zooming into applications, a desirable feature of genome viewers. Visualization capabilities of the Genoviz SDK include automated layout of features along genetic or genomic axes; support for user interactions with graphical elements (Glyphs) in a map; a variety of Glyph sub-classes that promote experimentation with new ways of representing data in graphical formats; and support for adaptive, semantic zooming, whereby objects change their appearance depending on zoom level and zooming rate adapts to the current scale. Freely available demonstration and production quality applications, including the Integrated Genome Browser, illustrate Genoviz SDK capabilities. Conclusion Separation between graphics components and genomic data models makes it easy for developers to add visualization capability to pre-existing applications or build new applications using third-party data models. Source code, documentation, sample applications, and tutorials are available at http://genoviz.sourceforge.net/.

  11. STEMCELL: A software tool for electron microscopy. Part 2 analysis of crystalline materials

    International Nuclear Information System (INIS)

    A new graphical software package (STEMCELL) for the analysis of HRTEM and STEM-HAADF images is introduced here in detail. The advantage of the software, beyond its graphical interface, is that it brings together different analysis algorithms and simulation (described in an associated article) to produce novel analysis methodologies. Various implementations of, and improvements to, state-of-the-art approaches are reported for image analysis, filtering, normalization and background subtraction. In particular, two important methodological results are highlighted: (i) the definition of a procedure for atomic-scale quantitative analysis of HAADF images, and (ii) the extension of geometric phase analysis to large regions, up to potentially 1 μm, through the use of undersampled images with aliasing effects. - Highlights: STEMCELL is a new software package for microscopy analysis; improved algorithms for image analysis are demonstrated; a new GPA method permits the study of strain on a large scale; new protocols for quantitative analysis of HAADF images; Abel-transform 3D analysis of composition in quantum dots.

  12. Immunogenetic Management Software: a new tool for visualization and analysis of complex immunogenetic datasets.

    Science.gov (United States)

    Johnson, Z P; Eady, R D; Ahmad, S F; Agravat, S; Morris, T; Else, J; Lank, S M; Wiseman, R W; O'Connor, D H; Penedo, M C T; Larsen, C P; Kean, L S

    2012-04-01

    Here we describe the Immunogenetic Management Software (IMS) system, a novel web-based application that permits multiplexed analysis of complex immunogenetic traits that are necessary for the accurate planning and execution of experiments involving large animal models, including nonhuman primates. IMS is capable of housing complex pedigree relationships, microsatellite-based MHC typing data, as well as MHC pyrosequencing expression analysis of class I alleles. It includes a novel, automated MHC haplotype naming algorithm and implements an innovative visualization protocol that allows users to view multiple familial and MHC haplotype relationships through a single, interactive graphical interface. Detailed DNA and RNA-based data can also be queried and analyzed in a highly accessible fashion, and flexible search capabilities allow experimental choices to be made based on multiple, individualized and expandable immunogenetic factors. This web application is implemented in Java, MySQL, Tomcat, and Apache, with supported browsers including Internet Explorer and Firefox on Windows and Safari on Mac OS. The software is freely available for distribution to noncommercial users by contacting Leslie.kean@emory.edu. A demonstration site for the software is available at http://typing.emory.edu/typing_demo , user name: imsdemo7@gmail.com and password: imsdemo. PMID:22080300

  13. Corganiser: a web-based software tool for planning time-sensitive sampling of whole rounds during scientific drilling

    DEFF Research Database (Denmark)

    Marshall, Ian

    2014-01-01

    Corganiser is a software tool developed to simplify the process of preparing whole-round sampling plans for time-sensitive microbiology and geochemistry sampling during scientific drilling. It was developed during the Integrated Ocean Drilling Program (IODP) Expedition 347, but is designed to work...... with a wide range of core and section configurations and can thus be used in future drilling projects. Corganiser is written in the Python programming language and is implemented both as a graphical web interface and command-line interface. It can be accessed online at http://130.226.247.137/....

  14. What parameters to consider and which software tools to use for target selection and molecular design of small interfering RNAs.

    Science.gov (United States)

    Matveeva, Olga

    2013-01-01

    The design of small gene silencing RNAs with a high probability of being efficient still has some elements of an art, especially when the lowest concentration of small molecules needs to be utilized. The design of highly target-specific small interfering RNAs or short hairpin RNAs is an even greater challenge. Some logical schemes and software tools that can be used to simplify both tasks are presented here. In addition, sequence motifs and sequence composition biases of small interfering RNAs that have to be avoided because of specificity concerns are also detailed. PMID:23027043
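
    A minimal sketch of rule-based siRNA candidate filtering, assuming a few commonly cited design heuristics (moderate GC content, no long mononucleotide runs). The thresholds are illustrative placeholders, not the article's specific rules.

```python
# Filter siRNA candidate sequences with simple, commonly cited heuristics.
# Thresholds (30-60 % GC, no runs of 4+ identical bases) are illustrative.
def gc_content(seq: str) -> float:
    return (seq.count("G") + seq.count("C")) / len(seq)

def has_long_run(seq: str, n: int = 4) -> bool:
    return any(base * n in seq for base in "ACGU")

def passes_basic_filters(seq: str) -> bool:
    return 0.30 <= gc_content(seq) <= 0.60 and not has_long_run(seq)

candidates = ["GCAUGGACUUACGUUAGAC", "GGGGCUACGGGGAUCCGGG"]
print([s for s in candidates if passes_basic_filters(s)])
```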

  15. The evaluation of Computed Tomography hard- and software tools for micropaleontologic studies on foraminifera

    Science.gov (United States)

    van Loo, D.; Speijer, R.; Masschaele, B.; Dierick, M.; Cnudde, V.; Boone, M.; de Witte, Y.; Dewanckele, J.; van Hoorebeke, L.; Jacobs, P.

    2009-04-01

    Foraminifera (forams) are single-celled amoeba-like organisms in the sea, which build a tiny calcareous multi-chambered shell for protection. Their enormous abundance, great variation of shape through time and their presence in all marine deposits have made these tiny microfossils the oil companies' best friend by facilitating the detection of new oil wells. Besides the success of forams in the oil and gas industry, they are also a most powerful tool for reconstructing climate change in the past. The shell of a foraminifer is a tiny gold mine of information, both geometrical and chemical. Until recently, however, the best information on this architecture was only obtained by imaging the outside of a shell with Scanning Electron Microscopy (SEM), giving no clues towards internal structures other than single snapshots obtained by breaking a specimen apart. With X-ray computed tomography (CT) it is possible to overcome this problem and uncover a huge amount of geometrical information without destroying the samples. Using the latest generation of micro-CT scanners, called nano-CT because of their sub-micron resolution, it is now possible to perform adequate imaging even on these tiny samples without needing huge facilities. In this research, a comparison is made between different X-ray sources and X-ray detectors and the resulting image resolution. Sharpness, noise and contrast are all very important parameters that have strong effects on the accuracy of the results and on the speed of data processing. Combining this tomography technique with specific image processing software for segmentation, it is possible to obtain a 3D virtual representation of the entire foram shell. This 3D virtual object can then be used for many purposes, of which automatic measurement of chamber size is one of the most important. The segmentation process is a combination of several algorithms that are often used in CT evaluation; in this work an evaluation of those algorithms is
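
    A minimal sketch of the kind of threshold-based segmentation step described above, using scikit-image's Otsu algorithm on a synthetic 2D slice. This is illustrative only; the study compares several segmentation algorithms, not just Otsu.

```python
# Segment a bright "shell" from a synthetic CT slice with Otsu thresholding.
import numpy as np
from skimage.filters import threshold_otsu

rng = np.random.default_rng(0)
slice_ = rng.normal(0.2, 0.05, (128, 128))  # background (air/matrix)
slice_[40:90, 40:90] += 0.5                 # brighter calcite "shell" region

mask = slice_ > threshold_otsu(slice_)      # binary shell segmentation
print(f"segmented voxels: {mask.sum()} of {mask.size}")
```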

  16. Reliability of wind farm design tools in complex terrain : A comparative study of commercial software

    OpenAIRE

    Timander, Tobias; WESTERLUND, JIMMY

    2012-01-01

    A comparative study of two different approaches in wind energy simulations has been made where the aim was to investigate the performance of two commercially available tools. The study includes the linear model by WAsP and the computational fluid dynamic model of WindSim (also featuring an additional forest module). The case studied is a small wind farm located in the inland of Sweden featuring a fairly complex and forested terrain. The results showed similar estimations from both tools and i...

  17. Plots, Calculations and Graphics Tools (PCG2). Software Transfer Request Presentation

    Science.gov (United States)

    Richardson, Marilou R.

    2010-01-01

    This slide presentation reviews the development of the Plots, Calculations and Graphics Tools (PCG2) system. PCG2 is an easy to use tool that provides a single user interface to view data in a pictorial, tabular or graphical format. It allows the user to view the same display and data in the Control Room, engineering office area, or remote sites. PCG2 supports extensive and regular engineering needs that are both planned and unplanned and it supports the ability to compare, contrast and perform ad hoc data mining over the entire domain of a program's test data.

  18. A software tool for simulation of surfaces generated by ball nose end milling

    DEFF Research Database (Denmark)

    Bissacco, Giuliano

    2004-01-01

    The number of models available for prediction of surface topography is very limited. The main reason is that these models cannot be based on engineering principles like those for elastic deformations. Most knowledge about surface roughness and integrity is empirical and up to now very few mathema...... readable by a surface processor software (SPIP [2]), for calculation of a number of surface roughness parameters. In the next paragraph a description of the basic features of ball nose end milled surfaces is given, while in paragraph 3 the model is described....

  19. Software Tools for the Analysis of the Photocathode Response of Photomultiplier Vacuum Tubes

    OpenAIRE

    Fabbri, R.

    2013-01-01

    The central institute of electronics (ZEA-2) at the Forschungszentrum Juelich (FZJ) has developed a system to scan the response of the photocathode of photomultiplier tubes (PMT). The PMT sits firmly on a supporting structure, while a blue light-emitting diode is moved along its surface by two stepper motors, spanning both the x and y coordinates. The whole system is located in a light-tight wooden box. Graphical software was developed in-situ to perform the scan operations under differe...

  20. A software tool to estimate the dynamic behaviour of the IP2C samples as sensors for didactic purposes

    International Nuclear Information System (INIS)

    Ionic Polymer Polymer Composites (IP2Cs) are emerging materials used to realize motion actuators and sensors. In the former case a voltage input is able to cause the membrane to bend, while in the latter case a voltage output is obtained by bending an IP2C membrane. In this paper the authors introduce a software tool able to estimate the dynamic behaviour of sensors based on IP2Cs working in air. In the proposed tool, geometrical quantities that rule the sensing properties of IP2C-based transducers are taken into account together with their dynamic characteristics. A graphical user interface (GUI) has been developed in order to provide a useful tool that allows the user to understand the behaviour and the role of the parameters involved in the transduction phenomena. The tool is based on the idea that a graphical user interface will allow persons not skilled in IP2C materials to observe their behaviour and to analyze their characteristics. This could greatly increase the interest of researchers towards this new class of transducers; moreover, it can support the educational activity of students involved in advanced academic courses.
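
    A sketch of simulating the dynamic response of a bending sensor as a linear transfer function, in the spirit of the didactic tool described above. The second-order model and its coefficients are hypothetical placeholders, not the IP2C model used by the authors.

```python
# Simulate a sensor's step response with a generic second-order model.
# Model structure and coefficients are assumptions for illustration only.
import numpy as np
from scipy import signal

wn, zeta, gain = 12.0, 0.3, 1e-3  # natural freq [rad/s], damping, static gain
sys = signal.TransferFunction([gain * wn**2], [1.0, 2 * zeta * wn, wn**2])

t = np.linspace(0.0, 3.0, 600)
t_out, volt = signal.step(sys, T=t)  # voltage output for a step bending input
print(volt[-1])                      # settles near the static gain
```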

  1. Experiences and perspectives with SRI's tools for software design and validation

    Science.gov (United States)

    Goguen, J.; Levitt, K. N.

    1982-01-01

    The development of tools is reported, including the STP theorem prover and its associated Design Verification System; PHIL, a meta-programmable context-sensitive structured editor; Pegasus, a system for the support of graphical programming; and OBJ, an ultra-high-level programming language based on rewrite rules and abstract data types.

  2. Generating statements at whole-body imaging with a workflow-optimized software tool - first experiences with multireader analysis

    International Nuclear Information System (INIS)

    Introduction: Due to technical innovations in cross-sectional imaging methods, whole-body imaging has increased in importance for clinical radiology, particularly for the diagnosis of systemic tumor disease. Large numbers of images have to be evaluated in increasingly shorter time periods. The aim was to create and evaluate a new software tool to assist and automate the process of diagnosing whole-body datasets. Material and Methods: Thirteen whole-body datasets were evaluated by 3 readers using the conventional system and the new software tool. The times for loading the datasets, examining 5 different regions (head, neck, thorax, abdomen and pelvis/skeletal system) and retrieving a relevant finding for demonstration were acquired. Additionally, a Student's t-test was performed. For qualitative analysis the 3 readers used a scale from 0 - 4 (0 = bad, 4 = very good) to assess dataset loading convenience, lesion location assistance, and ease of use. Additionally, a kappa value was calculated. Results: The average loading time was 39.7 s (± 5.5) with the conventional system and 6.5 s (± 1.4) with the new tool (p 0.9). The qualitative analysis showed a significant advantage with respect to convenience (p 0.9). (orig.)
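
    A minimal sketch of the reader-time comparison described above: paired loading times under the conventional system versus the new tool, tested with a paired Student's t-test. The times are invented for illustration, not study data.

```python
# Paired t-test on per-dataset loading times (invented numbers).
import numpy as np
from scipy import stats

conventional = np.array([38.1, 44.0, 35.2, 41.7, 39.5])  # seconds
new_tool = np.array([6.1, 7.9, 5.2, 6.8, 6.4])           # seconds

t_stat, p_value = stats.ttest_rel(conventional, new_tool)
print(f"mean {conventional.mean():.1f}s -> {new_tool.mean():.1f}s, "
      f"p={p_value:.4f}")
```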

  3. IDENT 1D - a novel software tool for an easy identification of material constitutive parameters

    International Nuclear Information System (INIS)

    Non-linear finite element computations make use of very sophisticated constitutive equations for the description of material behaviour. The first difficulty encountered by potential users is the gap between raw material characterisation on uniaxial specimens and knowledge of the required equation parameters. There are very few software tools for this particular task. IDENT 1D is a special software package, developed under the Matlab language in our laboratory, which is able to provide a complete optimised parameter set for the implemented models. The originality of IDENT 1D is that no initial estimate of the material parameters is requested of the user. Two main examples are described in this article: the Lemaitre and Chaboche (1990) creep law coupled with damage, and a non-unified cyclic law proposed by Contesti and Cailletaud (1989), with a separation of plastic and viscous strain terms, which is called the DDI model. For both laws, the identification method is completely described. Each method is then applied to a set of experimental data. In both cases, the results of the parameter identification show very good agreement with experimental data. (orig.)
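
    A sketch of the general idea behind constitutive parameter identification: fit a simple Norton creep law, eps_dot = A * sigma^n, to uniaxial test data by least squares. The law and the generic-start strategy here are simplified stand-ins for the models and algorithms actually implemented in IDENT 1D.

```python
# Identify Norton creep parameters from synthetic uniaxial data.
# Fitting in log-space makes the problem linear, so a crude generic
# starting point converges without a user-supplied initial estimate.
import numpy as np
from scipy.optimize import least_squares

stress = np.array([100.0, 150.0, 200.0, 250.0])  # MPa
strain_rate = 1e-12 * stress**4.2                # synthetic "measurements"

def residuals(p):
    log_A, n = p
    return np.log(strain_rate) - (log_A + n * np.log(stress))

fit = least_squares(residuals, x0=[np.log(1e-10), 1.0])
log_A, n = fit.x
print(f"A = {np.exp(log_A):.3e}, n = {n:.2f}")
```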

  4. Ident 1D - a novel software tool for an easy identification of material constitutive parameters

    International Nuclear Information System (INIS)

    Non-linear finite element computations make use of very sophisticated constitutive equations for the description of material behaviour. The first difficulty encountered by potential users is the gap between raw material characterisation on uniaxial specimens and knowledge of the required equation parameters. There are very few software tools for this particular task. IDENT 1D is a special software package, developed under the Matlab language in our laboratory, which is able to provide a complete optimised parameter set for the implemented models. The originality of IDENT 1D is that no initial estimate of the material parameters is requested of the user. Two main examples are described in this article: the Lemaitre and Chaboche creep law coupled with damage, and a non-unified cyclic law proposed by Contesti and Cailletaud, with a separation of plastic and viscous strain terms, which is called the DDI model. For both laws, the identification method is completely described. Each method is then applied to a set of experimental data. In both cases, the results of the parameter identification show very good agreement with experimental data. (authors)

  5. Software Tools for the Analysis of the Photocathode Response of Photomultiplier Vacuum Tubes

    CERN Document Server

    Fabbri, R

    2013-01-01

    The central institute of electronics (ZEA-2) at the Forschungszentrum Juelich (FZJ) has developed a system to scan the response of the photocathode of photomultiplier tubes (PMT). The PMT sits firmly on a supporting structure, while a blue light-emitting diode is moved along its surface by two stepper motors, spanning both the x and y coordinates. The whole system is located in a light-tight wooden box. Graphical software was developed in-situ to perform the scan operations under different configurations (e.g., the step size of the scan and the number of measurements per point). During each point measurement the current output generated in the vacuum photomultiplier is processed in sequence by a pre-amplifier (mainly to convert the current signal into a voltage signal), an amplifier, and an ADC module (typically a CAEN N957). The information from the measurement is saved in files at the end of the scan. Recently, software based on the CERN ROOT and Qt libraries was developed to help the user anal...

  6. MoRFchibi SYSTEM: software tools for the identification of MoRFs in protein sequences.

    Science.gov (United States)

    Malhis, Nawar; Jacobson, Matthew; Gsponer, Jörg

    2016-07-01

    Molecular recognition features, MoRFs, are short segments within longer disordered protein regions that bind to globular protein domains in a process known as disorder-to-order transition. MoRFs have been found to play a significant role in signaling and regulatory processes in cells. High-confidence computational identification of MoRFs remains an important challenge. In this work, we introduce MoRFchibi SYSTEM, which contains three MoRF predictors: MoRFCHiBi, a basic predictor best suited as a component in other applications; MoRFCHiBi_Light, ideal for high-throughput predictions; and MoRFCHiBi_Web, slower than the other two but best for high-accuracy predictions. Results show that MoRFchibi SYSTEM provides more than double the precision of other predictors. MoRFchibi SYSTEM is available in three different forms: as an HTML web server, a RESTful web server and downloadable software at: http://www.chibi.ubc.ca/faculty/joerg-gsponer/gsponer-lab/software/morf_chibi/. PMID:27174932

  7. Numerical arc segmentation algorithm for a radio conference: A software tool for communication satellite systems planning

    Science.gov (United States)

    Whyte, W. A.; Heyward, A. O.; Ponchak, D. S.; Spence, R. L.; Zuzek, J. E.

    1988-01-01

    The Numerical Arc Segmentation Algorithm for a Radio Conference (NASARC) provides a method of generating predetermined arc segments for use in the development of an allotment planning procedure to be carried out at the 1988 World Administrative Radio Conference (WARC) on the Use of the Geostationary Satellite Orbit and the Planning of Space Services Utilizing It. Through careful selection of the predetermined arc (PDA) for each administration, flexibility can be increased in terms of choice of system technical characteristics and specific orbit location while reducing the need for coordination among administrations. The NASARC software determines pairwise compatibility between all possible service areas at discrete arc locations. NASARC then exhaustively enumerates groups of administrations whose satellites can be closely located in orbit, and finds the arc segment over which each such compatible group exists. From the set of all possible compatible groupings, groups and their associated arc segments are selected using a heuristic procedure such that a PDA is identified for each administration. Various aspects of the NASARC concept and how the software accomplishes specific features of allotment planning are discussed.
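
    An illustrative sketch of the grouping idea described above: given pairwise compatibility between administrations at some arc location, greedily grow groups whose members are all mutually compatible. This toy greedy pass is a stand-in for NASARC's exhaustive enumeration plus heuristic selection.

```python
# Greedily group administrations into mutually compatible sets.
# Compatibility pairs and names are invented for illustration.
compatible = {
    ("A", "B"), ("B", "A"), ("A", "C"), ("C", "A"),
    ("B", "C"), ("C", "B"), ("D", "E"), ("E", "D"),
}
admins = ["A", "B", "C", "D", "E"]

groups = []
for admin in admins:
    for group in groups:
        if all((admin, member) in compatible for member in group):
            group.append(admin)   # joins an existing compatible group
            break
    else:
        groups.append([admin])    # otherwise starts a new group

print(groups)  # [['A', 'B', 'C'], ['D', 'E']]
```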

  8. Internet-Based Software Tools for Analysis and Processing of LIDAR Point Cloud Data via the OpenTopography Portal

    Science.gov (United States)

    Nandigam, V.; Crosby, C. J.; Baru, C.; Arrowsmith, R.

    2009-12-01

    LIDAR is an excellent example of the new generation of powerful remote sensing data now available to Earth science researchers. Capable of producing digital elevation models (DEMs) more than an order of magnitude higher resolution than those currently available, LIDAR data allows earth scientists to study the processes that contribute to landscape evolution at resolutions not previously possible, yet essential for their appropriate representation. Along with these high-resolution datasets comes an increase in the volume and complexity of data that the user must efficiently manage and process in order for it to be scientifically useful. Although there are expensive commercial LIDAR software applications available, processing and analysis of these datasets are typically computationally inefficient on the conventional hardware and software that is currently available to most of the Earth science community. We have designed and implemented an Internet-based system, the OpenTopography Portal, that provides integrated access to high-resolution LIDAR data as well as web-based tools for processing of these datasets. By using remote data storage and high performance compute resources, the OpenTopography Portal attempts to simplify data access and standard LIDAR processing tasks for the Earth Science community. The OpenTopography Portal allows users to access massive amounts of raw point cloud LIDAR data as well as a suite of DEM generation tools to enable users to generate custom digital elevation models to best fit their science applications. The Cyberinfrastructure software tools for processing the data are freely available via the portal and conveniently integrated with the data selection in a single user-friendly interface. The ability to run these tools on powerful Cyberinfrastructure resources instead of their own labs provides a huge advantage in terms of performance and compute power. The system also encourages users to explore data processing methods and the
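
    A minimal sketch of local gridding of LIDAR returns into a DEM: bin the points onto a regular grid and take the mean elevation per cell. This is a simplification of the kinds of gridding options a portal like this can offer, on synthetic points.

```python
# Bin scattered LIDAR points into a mean-elevation DEM grid.
import numpy as np

rng = np.random.default_rng(1)
x, y = rng.uniform(0, 100, 5000), rng.uniform(0, 100, 5000)
z = 0.5 * x + 10 * np.sin(y / 15.0)   # synthetic terrain heights

cell = 5.0                            # grid resolution [m]
ix, iy = (x // cell).astype(int), (y // cell).astype(int)
n = int(100 // cell)

dem_sum, dem_cnt = np.zeros((n, n)), np.zeros((n, n))
np.add.at(dem_sum, (iy, ix), z)       # accumulate heights per cell
np.add.at(dem_cnt, (iy, ix), 1)       # count points per cell
dem = np.where(dem_cnt > 0, dem_sum / np.maximum(dem_cnt, 1), np.nan)
print(dem.shape, np.nanmin(dem), np.nanmax(dem))
```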

  9. The pyPHaz software, an interactive tool to analyze and visualize results from probabilistic hazard assessments

    Science.gov (United States)

    Tonini, Roberto; Selva, Jacopo; Costa, Antonio; Sandri, Laura

    2014-05-01

    Probabilistic Hazard Assessment (PHA) is becoming an essential tool for risk mitigation policies, since it allows the hazard due to hazardous phenomena to be quantified and, differently from the deterministic approach, it accounts for both aleatory and epistemic uncertainties. On the other hand, one of the main disadvantages of PHA methods is that their results are not easy to understand and interpret by people who are not specialists in probabilistic tools. For scientists, this raises the issue of providing tools that can be easily used and understood by decision makers (i.e., risk managers or local authorities). The work presented here addresses the problem of simplifying the transfer between scientific knowledge and land protection policies, by providing an interface between scientists, who produce PHA results, and decision makers, who use PHA results for risk analyses. In this framework we present pyPHaz, an open tool developed and designed to visualize and analyze PHA results due to one or more phenomena affecting a specific area of interest. The software has been fully developed with the free and open-source Python programming language and some featured Python-based libraries and modules. The pyPHaz tool allows the Hazard Curves (HC) calculated in a selected target area to be visualized together with different levels of uncertainty (mean and percentiles) on maps that can be interactively created and modified by the user, thanks to a dedicated Graphical User Interface (GUI). Moreover, the tool can be used to compare the results of different PHA models and to merge them by creating ensemble models. The pyPHaz software has been designed to store and access all data through a MySQL database and to read as input the XML-based standard file formats defined in the frame of GEM (Global Earthquake Model). This format is easy to extend to any other kind of hazard, as will be shown in the applications
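
    A sketch of combining hazard curves from several PHA models into an ensemble: mean and percentile curves across models at each intensity level, mirroring the mean/percentile visualization described above. All numbers are invented.

```python
# Ensemble statistics over hazard curves from several hypothetical models.
import numpy as np

intensity = np.logspace(-2, 0, 20)       # e.g. hazard intensity levels
rng = np.random.default_rng(7)
# exceedance-probability curves from 5 hypothetical models
models = np.array([np.exp(-intensity / s) for s in rng.uniform(0.1, 0.4, 5)])

mean_curve = models.mean(axis=0)
p16, p84 = np.percentile(models, [16, 84], axis=0)
print(mean_curve[0], p16[0], p84[0])     # ensemble stats at lowest intensity
```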

  10. SU-E-J-199: A Software Tool for Quality Assurance of Online Replanning with MR-Linac

    International Nuclear Information System (INIS)

    Purpose: To develop a quality assurance software tool, ArtQA, capable of automatically checking radiation treatment plan parameters, verifying plan data transfer from treatment planning system (TPS) to record and verify (R&V) system, performing a secondary MU calculation considering the effect of magnetic field from MR-Linac, and verifying the delivery and plan consistency, for online replanning. Methods: ArtQA was developed by creating interfaces to TPS (e.g., Monaco, Elekta), R&V system (Mosaiq, Elekta), and secondary MU calculation system. The tool obtains plan parameters from the TPS via direct file reading, and retrieves plan data both transferred from TPS and recorded during the actual delivery in the R&V system database via open database connectivity and structured query language. By comparing beam/plan datasets in different systems, ArtQA detects and outputs discrepancies between TPS, R&V system and secondary MU calculation system, and delivery. To consider the effect of 1.5T transverse magnetic field from MR-Linac in the secondary MU calculation, a method based on modified Clarkson integration algorithm was developed and tested for a series of clinical situations. Results: ArtQA is capable of automatically checking plan integrity and logic consistency, detecting plan data transfer errors, performing secondary MU calculations with or without a transverse magnetic field, and verifying treatment delivery. The tool is efficient and effective for pre- and post-treatment QA checks of all available treatment parameters that may be impractical with the commonly-used visual inspection. Conclusion: The software tool ArtQA can be used for quick and automatic pre- and post-treatment QA check, eliminating human error associated with visual inspection. While this tool is developed for online replanning to be used on MR-Linac, where the QA needs to be performed rapidly as the patient is lying on the table waiting for the treatment, ArtQA can be used as a general QA tool
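
    An illustrative sketch of the core of a plan-consistency check like ArtQA's: compare beam parameters fetched from the TPS with those recorded in the R&V database and report discrepancies beyond tolerance. The field names and tolerances here are hypothetical.

```python
# Compare TPS and R&V beam parameters; flag out-of-tolerance differences.
# Parameter names, values, and tolerances are invented for illustration.
tps_beam = {"MU": 187.4, "gantry_deg": 181.0, "collimator_deg": 10.0}
rv_beam = {"MU": 187.9, "gantry_deg": 181.0, "collimator_deg": 15.0}

TOLERANCE = {"MU": 1.0, "gantry_deg": 0.5, "collimator_deg": 0.5}

discrepancies = {
    key: (tps_beam[key], rv_beam[key])
    for key in tps_beam
    if abs(tps_beam[key] - rv_beam[key]) > TOLERANCE[key]
}
print(discrepancies)  # {'collimator_deg': (10.0, 15.0)}
```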

  11. The GlycanBuilder: a fast, intuitive and flexible software tool for building and displaying glycan structures

    Directory of Open Access Journals (Sweden)

    Dell Anne

    2007-08-01

    Full Text Available Abstract Background Carbohydrates play a critical role in human diseases and their potential utility as biomarkers for pathological conditions is a major driver for characterization of the glycome. However, the additional complexity of glycans compared to proteins and nucleic acids has slowed the advancement of glycomics in comparison to genomics and proteomics. The branched nature of carbohydrates, the great diversity of their constituents and the numerous alternative symbolic notations, make the input and display of glycans not as straightforward as for example the amino-acid sequence of a protein. Every glycoinformatic tool providing a user interface would benefit from a fast, intuitive, appealing mechanism for input and output of glycan structures in a computer readable format. Results A software tool for building and displaying glycan structures using a chosen symbolic notation is described here. The "GlycanBuilder" uses an automatic rendering algorithm to draw the saccharide symbols and to place them on the drawing board. The information about the symbolic notation is derived from a configurable graphical model as a set of rules governing the aspect and placement of residues and linkages. The algorithm is able to represent a structure using only few traversals of the tree and is inherently fast. The tool uses an XML format for import and export of encoded structures. Conclusion The rendering algorithm described here is able to produce high-quality representations of glycan structures in a chosen symbolic notation. The automated rendering process enables the "GlycanBuilder" to be used both as a user-independent component for displaying glycans and as an easy-to-use drawing tool. The "GlycanBuilder" can be integrated in web pages as a Java applet for the visual editing of glycans. The same component is available as a web service to render an encoded structure into a graphical format. Finally, the "GlycanBuilder" can be integrated into other

  12. Data Assimilation Tools for CO2 Reservoir Model Development – A Review of Key Data Types, Analyses, and Selected Software

    Energy Technology Data Exchange (ETDEWEB)

    Rockhold, Mark L.; Sullivan, E. C.; Murray, Christopher J.; Last, George V.; Black, Gary D.

    2009-09-30

    Pacific Northwest National Laboratory (PNNL) has embarked on an initiative to develop world-class capabilities for performing experimental and computational analyses associated with geologic sequestration of carbon dioxide. The ultimate goal of this initiative is to provide science-based solutions for helping to mitigate the adverse effects of greenhouse gas emissions. This Laboratory-Directed Research and Development (LDRD) initiative currently has two primary focus areas—advanced experimental methods and computational analysis. The experimental methods focus area involves the development of new experimental capabilities, supported in part by the U.S. Department of Energy’s (DOE) Environmental Molecular Science Laboratory (EMSL) housed at PNNL, for quantifying mineral reaction kinetics with CO2 under high temperature and pressure (supercritical) conditions. The computational analysis focus area involves numerical simulation of coupled, multi-scale processes associated with CO2 sequestration in geologic media, and the development of software to facilitate building and parameterizing conceptual and numerical models of subsurface reservoirs that represent geologic repositories for injected CO2. This report describes work in support of the computational analysis focus area. The computational analysis focus area currently consists of several collaborative research projects. These are all geared towards the development and application of conceptual and numerical models for geologic sequestration of CO2. The software being developed for this focus area is referred to as the Geologic Sequestration Software Suite or GS3. A wiki-based software framework is being developed to support GS3. This report summarizes work performed in FY09 on one of the LDRD projects in the computational analysis focus area. The title of this project is Data Assimilation Tools for CO2 Reservoir Model Development. Some key objectives of this project in FY09 were to assess the current state

  13. ATAQS: A computational software tool for high throughput transition optimization and validation for selected reaction monitoring mass spectrometry

    Directory of Open Access Journals (Sweden)

    Ramos Hector

    2011-03-01

    proteomics via SRM is a powerful new technique that enables the reproducible and accurate identification and quantification of sets of proteins of interest. ATAQS is the first open-source software that supports all steps of the targeted proteomics workflow. ATAQS also provides software API (Application Program Interface documentation that enables the addition of new algorithms to each of the workflow steps. The software, installation guide and sample dataset can be found in http://tools.proteomecenter.org/ATAQS/ATAQS.html

  14. ATAQS: A computational software tool for high throughput transition optimization and validation for selected reaction monitoring mass spectrometry

    Science.gov (United States)

    2011-01-01

    new technique that enables the reproducible and accurate identification and quantification of sets of proteins of interest. ATAQS is the first open-source software that supports all steps of the targeted proteomics workflow. ATAQS also provides software API (Application Program Interface) documentation that enables the addition of new algorithms to each of the workflow steps. The software, installation guide and sample dataset can be found in http://tools.proteomecenter.org/ATAQS/ATAQS.html PMID:21414234

  15. Software tool for analysing the family shopping basket without candidate generation

    OpenAIRE

    Roberto Carlos Naranjo Cuervo; Luz Marina Sierra Martínez

    2010-01-01

    Tools leading to useful knowledge being obtained for supporting marketing decisions are currently needed in the e-commerce environment. A process is needed for this which uses a series of techniques for data-processing; data-mining is one such technique enabling automatic information discovery. This work presents association rules as a suitable technique for discovering how customers buy from a company offering business-to-consumer (B2C) e-business, aimed at supporting decisi...
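
    A sketch of frequent-itemset mining "without candidate generation" via the FP-Growth algorithm, which the title alludes to, here using the mlxtend library on a toy one-hot basket matrix rather than the authors' own tool.

```python
# Mine frequent itemsets from a toy basket matrix with FP-Growth.
import pandas as pd
from mlxtend.frequent_patterns import fpgrowth

baskets = pd.DataFrame(
    [[1, 1, 0], [1, 1, 1], [0, 1, 1], [1, 1, 0]],
    columns=["bread", "milk", "coffee"],
).astype(bool)

frequent = fpgrowth(baskets, min_support=0.5, use_colnames=True)
print(frequent.sort_values("support", ascending=False))
```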

  16. Gel2DE - A software tool for correlation analysis of 2D gel electrophoresis data

    OpenAIRE

    Øye, Ola Kristoffer; Jørgensen, Katarina Mariann; Hjelle, Sigrun Margrethe; Sulen, André; Ulvang, Dag Magne; Gjertsen, Bjørn Tore

    2013-01-01

    Background: Two-dimensional gel electrophoresis (2DE) is a powerful technique for studying protein isoforms and their modifications. Existing commercial 2D image analysis tools rely on spot detection that limits analysis of complex protein profiles, e.g. spot appearance/disappearance or overlapping spots. Pixel-by-pixel correlation analysis, an analysis technique for identifying relations between protein patterns in gel images and external variables, can overcome such limitations ...
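
    A minimal sketch of pixel-by-pixel correlation analysis over a stack of 2DE gel images: correlate each pixel's intensity across samples with an external variable (e.g., a clinical score). The data are synthetic, with one correlated region planted deliberately.

```python
# Per-pixel Pearson correlation between gel intensities and a covariate.
import numpy as np

rng = np.random.default_rng(3)
gels = rng.normal(size=(12, 64, 64))    # 12 samples, 64x64 gel images
response = rng.normal(size=12)          # external variable per sample
gels[:, 30:34, 30:34] += response[:, None, None]  # plant correlated region

g = gels - gels.mean(axis=0)
r = response - response.mean()
corr = (g * r[:, None, None]).sum(0) / (
    np.sqrt((g**2).sum(0)) * np.sqrt((r**2).sum()) + 1e-12
)
print(np.unravel_index(np.abs(corr).argmax(), corr.shape))  # inside region
```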

  17. A Bluetooth low-energy capture and analysis tool using software-defined radio

    OpenAIRE

    Kilgour, Christopher David

    2013-01-01

    Wireless protocol analysis is a useful tool for researchers, engineers, and network security professionals. Exhaustive BTLE sniffing – the full capture and analysis of Bluetooth Low-Energy radio transmissions – has been out of reach for individuals to apply to research, engineering, and security analysis tasks. Discovering and following an arbitrary Bluetooth frequency-hopping pattern with a cheap narrow-band receiver is a complex undertaking with little chance of success. Further, the high-e...
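
    A sketch of why following a BLE connection is hard for a narrow-band sniffer: data-channel hopping per Bluetooth LE Channel Selection Algorithm #1. A sniffer must recover the hop increment and channel map before it can predict the next channel. This is a textbook rendering of the hop rule, not code from the thesis.

```python
# Bluetooth LE Channel Selection Algorithm #1 (data channel hopping).
def csa1_next(last_unmapped: int, hop: int,
              used_channels: list[int]) -> tuple[int, int]:
    unmapped = (last_unmapped + hop) % 37
    if unmapped in used_channels:
        return unmapped, unmapped
    # remap onto the sorted list of used channels
    return unmapped, used_channels[unmapped % len(used_channels)]

used = [c for c in range(37) if c not in (0, 1, 2)]  # example channel map
unmapped, channel = 0, 0
for _ in range(5):
    unmapped, channel = csa1_next(unmapped, hop=7, used_channels=used)
    print(channel)
```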

  18. TIP-EXE: A software tool for studying the use and understanding of procedural documents

    OpenAIRE

    Ganier, Franck; Querrec, Ronan

    2012-01-01

    Research problem: When dealing with procedural documents, individuals sometimes encounter comprehension problems due to poor information design. Researchers studying the use and understanding of procedural documents, as well as technical writers charged with the design of these documents, or usability specialists evaluating their quality, would all benefit from tools allowing them to collect real-time data concerning user behavior in user-centered studies. With this ...

  19. A Tool for Optimizing the Build Performance of Large Software Code Bases

    OpenAIRE

    Telea, Alexandru; Voinea, Lucian; Kontogiannis, K; Tjortjis, C.; Winter, A.

    2008-01-01

    We present Build Analyzer, a tool that helps developers optimize the build performance of huge systems written in C. Due to complex C header dependencies, even small code changes can cause extremely long rebuilds, which are problematic when code is shared and modified by teams of hundreds of individuals. Build Analyzer supports several use cases. For developers, it provides an estimate of the build impact and distribution caused by a given change. For architects, it shows why a build is costly...
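
    A toy sketch of estimating build impact from header dependencies: count how many files must rebuild when a header changes, by following the inverted include graph transitively. A stand-in for the kind of analysis Build Analyzer performs on real C code bases; the file names are invented.

```python
# Estimate rebuild impact of a header change from an include graph.
from collections import defaultdict

includes = {  # file -> headers it includes directly (invented example)
    "a.c": {"util.h"}, "b.c": {"core.h"}, "c.c": {"core.h", "util.h"},
    "core.h": {"util.h"}, "util.h": set(),
}

# invert to "header -> files that include it"
included_by = defaultdict(set)
for f, hdrs in includes.items():
    for h in hdrs:
        included_by[h].add(f)

def impacted(header: str) -> set[str]:
    """All files that must rebuild when `header` changes."""
    result, stack = set(), [header]
    while stack:
        for f in included_by[stack.pop()]:
            if f not in result:
                result.add(f)
                stack.append(f)
    return result

print(impacted("util.h"))  # everything depends on util.h transitively
```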

  20. The evaluation of Computed Tomography hard- and software tools for micropaleontologic studies on foraminifera

    OpenAIRE

    Van Loo, Denis; Speijer, Robert; Masschaele, Bert; Dierick, Manuel; Cnudde, Veerle; Boone, Matthieu; De Witte, Yoni; Dewanckele, Jan; Van Hoorebeke, Luc; Jacobs, Patric

    2009-01-01

    Foraminifera (Forams) are single-celled amoeba-like organisms in the sea, which build a tiny calcareous multi-chambered shell for protection. Their enormous abundance, great variation of shape through time and their presence in all marine deposits made these tiny microfossils the oil companies’ best friend by facilitating the detection of new oil wells. Besides the success of forams in the oil and gas industry, they are also a most powerful tool for reconstructing climate change in the past. ...

  1. Automatic Tools for Software Quality Analysis in a Project-Based-Learning Course

    OpenAIRE

    Montero Martínez, Juan Manuel; San Segundo Hernández, Rubén; Córdoba Herralde, Ricardo de; Marin de la Barcena, Amparo; Zlotnik, Alexander

    2009-01-01

    Over the last decade, the “Anytime, anywhere” paradigm has gained pace in Higher Education teaching, leading many universities to innovate in pedagogical strategies based on Internet and Web access technologies. Development of remote access technologies has enabled teachers to achieve higher levels of efficiency while students can access tools and resources no longer constrained by time or location. Additionally, students can submit their assignments, be evaluated and be provided feedbac...

  2. Using Teamcenter engineering software for a successive punching tool lifecycle management

    Science.gov (United States)

    Blaga, F.; Pele, A.-V.; Stǎnǎşel, I.; Buidoş, T.; Hule, V.

    2015-11-01

    The paper presents the results of studies and research on the implementation of Teamcenter (TC) integrated product lifecycle management in a virtual enterprise. The results can also be applied in a real enterprise. The product considered was a successive punching and cutting tool, designed to produce a sheet metal part. The paper defines the technical documentation flow (flow of information) in the process of computer-aided constructive design of the tool. After the design phase is completed, a list of parts is generated containing standard or manufactured components (BOM, Bill of Materials). The BOM may be exported to MS Excel (.xls) format and can be transferred to other departments of the company in order to supply the materials and resources necessary to achieve the final product. This paper describes the procedure to modify or change certain dimensions of the sheet metal part obtained by punching. After 3D and 2D design, the digital prototype of the punching tool moves to the following lifecycle phase, the manufacturing process. For each operation of the technological process the corresponding phases are described in detail. Teamcenter makes it possible to describe the manufacturing company structure, including the workstations that carry out the various operations of the manufacturing process. The paper reveals that implementing Teamcenter PDM in a company improves the efficiency of managing product information, eliminating time spent searching, verifying and correcting documentation, while ensuring the uniqueness and completeness of the product data.

  3. Use of slide presentation software as a tool to measure hip arthroplasty wear.

    Science.gov (United States)

    Yun, Ho Hyun; Jajodia, Nirmal K; Myung, Jae Sung; Oh, Jong Keon; Park, Sang Won; Shon, Won Yong

    2009-12-01

    The authors propose a manual measurement method for wear in total hip arthroplasty (PowerPoint method) based on the well-known Microsoft PowerPoint software (Microsoft Corporation, Redmond, Wash). In addition, the accuracy and reproducibility of the devised method were quantified and compared with two methods previously described by Livermore and Dorr, and accuracies were determined at different degrees of wear. The 57 hips recruited were allocated to: class 1 (retrieval series), class 2 (clinical series), and class 3 (a repeat film analysis series). The PowerPoint method was found to have good reproducibility and to better detect wear differences between classes. The devised method can be easily used for recording wear at follow-up visits and could be used as a supplementary method when computerized methods cannot be employed. PMID:19896061

  4. On a Formal Tool for Reasoning About Flight Software Cost Analysis

    Science.gov (United States)

    Spagnuolo, John N., Jr.; Stukes, Sherry A.

    2013-01-01

    A report focuses on the development of flight software (FSW) cost estimates for 16 Discovery-class missions at JPL. The techniques and procedures developed enabled streamlining of the FSW analysis process, and provided instantaneous confirmation that the data and processes used for these estimates were consistent across all missions. The research provides direction as to how to build a prototype rule-based system for FSW cost estimation that would provide (1) FSW cost estimates, (2) explanation of how the estimates were arrived at, (3) mapping of costs, (4) mathematical trend charts with explanations of why the trends are what they are, (5) tables with ancillary FSW data of interest to analysts, (6) a facility for expert modification/enhancement of the rules, and (7) a basis for conceptually convenient expansion into more complex, useful, and general rule-based systems.

  5. YANA – a software tool for analyzing flux modes, gene-expression and enzyme activities

    Directory of Open Access Journals (Sweden)

    Engels Bernd

    2005-06-01

    Full Text Available Abstract Background A number of algorithms for steady state analysis of metabolic networks have been developed over the years. Of these, Elementary Mode Analysis (EMA) has proven especially useful. Despite its low user-friendliness, METATOOL as a reliable high-performance implementation of the algorithm has been the instrument of choice up to now. As reported here, the analysis of metabolic networks has been improved by an editor and analyzer of metabolic flux modes. Analysis routines for expression levels and the most central, well connected metabolites and their metabolic connections are of particular interest. Results YANA features a platform-independent, dedicated toolbox for metabolic networks with a graphical user interface to calculate (integrating METATOOL), edit (including support for the SBML format), visualize, centralize, and compare elementary flux modes. Further, YANA calculates expected flux distributions for a given Elementary Mode (EM) activity pattern and vice versa. Moreover, a dissection algorithm, a centralization algorithm, and an average diameter routine can be used to simplify and analyze complex networks. Proteomics or gene expression data give a rough indication of some individual enzyme activities, whereas the complete flux distribution in the network is often not known. As such data are noisy, YANA features a fast evolutionary algorithm (EA) for the prediction of EM activities with minimum error, including alerts for inconsistent experimental data. We offer the possibility to include further known constraints (e.g. growth constraints) in the EA calculation process. The redox metabolism around glutathione reductase serves as an illustration example. All software and documentation are available for download at http://yana.bioapps.biozentrum.uni-wuerzburg.de. Conclusion A graphical toolbox and an editor for METATOOL as well as a series of additional routines for metabolic network analyses constitute a new user
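
    A sketch of the flux-composition idea behind YANA: the network flux distribution is a non-negative combination of elementary modes, v = E · a, and mode activities can be estimated back from measured fluxes. Toy numbers only; YANA uses an evolutionary algorithm rather than this least-squares shortcut.

```python
# Compose fluxes from elementary-mode activities and invert the relation.
import numpy as np

E = np.array([[1.0, 0.0],   # columns: elementary modes, rows: reactions
              [1.0, 1.0],
              [0.0, 1.0]])
activities = np.array([2.0, 0.5])

v = E @ activities                           # expected flux distribution
est, *_ = np.linalg.lstsq(E, v, rcond=None)  # recover activities from fluxes
print(v, est)
```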

  6. Software Tools For Building Decision-support Models For Flood Emergency Situations

    Science.gov (United States)

    Garrote, L.; Molina, M.; Ruiz, J. M.; Mosquera, J. C.

    The SAIDA decision-support system was developed by the Spanish Ministry of the Environment to provide assistance to decision-makers during flood situations. SAIDA has been tentatively implemented in two test basins, Jucar and Guadalhorce, and the Ministry is currently planning to have it implemented in all major Spanish basins in a few years' time. During the development cycle of SAIDA, the need for providing assistance to end-users in model definition and calibration was clearly identified. System developers usually emphasise abstraction and generality with the goal of providing a versatile software environment. End users, on the other hand, require concretion and specificity to adapt the general model to their local basins. As decision-support models become more complex, the gap between model developers and users gets wider: who takes care of model definition, calibration and validation? Initially, model developers perform these tasks, but the scope is usually limited to a few small test basins. Before the model enters the operational stage, end users must get involved in model construction and calibration, in order to gain confidence in the model recommendations. However, getting the users involved in these activities is a difficult task. The goal of this research is to develop representation techniques for simulation and management models in order to define, develop and validate a mechanism, supported by a software environment, oriented to provide assistance to the end-user in building decision models for the prediction and management of river floods in real time. The system is based on three main building blocks: a library of simulators of the physical system, an editor to assist the user in building simulation models, and a machine learning method to calibrate decision models based on the simulation models provided by the user.

  7. Models, methods and software tools for building complex adaptive traffic systems

    International Nuclear Information System (INIS)

    The paper studies the modern methods and tools to simulate the behavior of complex adaptive systems (CAS), the existing systems of traffic modeling in simulators and their characteristics; proposes requirements for assessing the suitability of the system to simulate the CAS behavior in simulators. The author has developed a model of adaptive agent representation and its functioning environment to meet certain requirements set above, and has presented methods of agents' interactions and methods of conflict resolution in simulated traffic situations. A simulation system realizing computer modeling for simulating the behavior of CAS in traffic situations has been created

  8. Models, methods and software tools to evaluate the quality of informational and educational resources

    International Nuclear Information System (INIS)

    The paper studies modern methods and tools to evaluate the quality of data systems, which allows the specificity of informational and educational resources (IER) to be determined. The author has developed a model of IER quality management at all stages of the life cycle and an integrated multi-level hierarchical system of IER quality assessment, taking into account both information properties and targeted resource assignment. The author presents a mathematical and algorithmic justification for solving the problem of IER quality management, and offers a data system to assess IER quality

  9. Software tools for 3d modeling as a part of design and technology in primary school

    OpenAIRE

    Mihovec, Nastja

    2013-01-01

    There are numerous programs that enable 3D modeling. We can choose from various free programs or ones that must be paid for. Many designers and engineers use commercial programs such as AutoCad, Maya, ProEngineer, Cinema 3D, SolidWorks, etc. In their opinion these programs give their users more than the free ones, mainly because of their better modeling quality, tools, functions, ease of use, support, maintenance, etc. Free program developers try very hard to convince these users to reconsider,...

  10. Using Numerical Models in the Development of Software Tools for Risk Management of Accidents with Oil and Inert Spills

    Science.gov (United States)

    Fernandes, R.; Leitão, P. C.; Braunschweig, F.; Lourenço, F.; Galvão, P.; Neves, R.

    2012-04-01

    substances, helping in the management of the crisis, in the distribution of response resources, or in prioritizing specific areas. They can also be used for the detection of pollution sources. However, the resources involved, and the scientific and technological levels needed in the manipulation of numerical models, have both limited the interoperability between operational models, monitoring tools and decision-support software tools. The increasing predictive capacity of metocean conditions and of the fate and behaviour of pollutants spilt at sea or in coastal zones, and the presence of monitoring tools like vessel traffic control systems, can both provide safer support for decision-making in emergency or planning issues associated with pollution risk management, especially if used in an integrated way. Following this approach, and taking advantage of an integrated framework developed in the ARCOPOL (www.arcopol.eu) and EASYCO (www.project-easy.info) projects, three innovative model-supported software tools were developed and applied in the Atlantic Area and/or the Portuguese Coast. Two of these tools are used for spill model simulations - a web-based interface (EASYCO web bidirectional tool) and an advanced desktop application (MOHID Desktop Spill Simulator) - both of them allowing the end user to have control over the model simulations. Parameters such as date and time of the event, location and oil spill volume are provided by the users; these interactive tools also integrate the best available metocean forecasts (waves, meteorological, hydrodynamics) from different institutions in the Atlantic Area. Metocean data are continuously gathered from remote THREDDS data servers (using OPENDAP) or ftp sites, and then automatically interpolated and pre-processed to be made available to the simulators. These simulation tools can also import initial data and export results from/to remote servers, using OGC WFS services. Simulations are provided to the end user in a matter of seconds, and thus can be very

  11. Data and software tools for gamma radiation spectral threat detection and nuclide identification algorithm development and evaluation

    International Nuclear Information System (INIS)

    The detection of radiological and nuclear threats is extremely important to national security. The federal government is spending significant resources developing new detection systems and attempting to increase the performance of existing ones. The detection of illicit radionuclides that may pose a radiological or nuclear threat is a challenging problem complicated by benign radiation sources (e.g., cat litter and medical treatments), shielding, and large variations in background radiation. Although there is a growing acceptance within the community that concentrating efforts on algorithm development (independent of the specifics of fully assembled systems) has the potential for significant overall system performance gains, there are two major hindrances to advancements in gamma spectral analysis algorithms under the current paradigm: access to data and common performance metrics along with baseline performance measures. Because many of the signatures collected during performance measurement campaigns are classified, dissemination to algorithm developers is extremely limited. This leaves developers no choice but to collect their own data if they are lucky enough to have access to material and sensors. This is often combined with their own definition of metrics for measuring performance. These two conditions make it all but impossible for developers and external reviewers to make meaningful comparisons between algorithms. Without meaningful comparisons, performance advancements become very hard to achieve and (more importantly) recognize. The objective of this work is to overcome these obstacles by developing and freely distributing real and synthetically generated gamma-spectra data sets as well as software tools for performance evaluation with associated performance baselines to national labs, academic institutions, government agencies, and industry. At present, datasets for two tracks, or application domains, have been developed: one that includes temporal

  12. Data and software tools for gamma radiation spectral threat detection and nuclide identification algorithm development and evaluation

    Science.gov (United States)

    Portnoy, David; Fisher, Brian; Phifer, Daniel

    2015-06-01

    The detection of radiological and nuclear threats is extremely important to national security. The federal government is spending significant resources developing new detection systems and attempting to increase the performance of existing ones. The detection of illicit radionuclides that may pose a radiological or nuclear threat is a challenging problem complicated by benign radiation sources (e.g., cat litter and medical treatments), shielding, and large variations in background radiation. Although there is a growing acceptance within the community that concentrating efforts on algorithm development (independent of the specifics of fully assembled systems) has the potential for significant overall system performance gains, there are two major hindrances to advancements in gamma spectral analysis algorithms under the current paradigm: access to data and common performance metrics along with baseline performance measures. Because many of the signatures collected during performance measurement campaigns are classified, dissemination to algorithm developers is extremely limited. This leaves developers no choice but to collect their own data if they are lucky enough to have access to material and sensors. This is often combined with their own definition of metrics for measuring performance. These two conditions make it all but impossible for developers and external reviewers to make meaningful comparisons between algorithms. Without meaningful comparisons, performance advancements become very hard to achieve and (more importantly) recognize. The objective of this work is to overcome these obstacles by developing and freely distributing real and synthetically generated gamma-spectra data sets as well as software tools for performance evaluation with associated performance baselines to national labs, academic institutions, government agencies, and industry. At present, datasets for two tracks, or application domains, have been developed: one that includes temporal
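
    A toy Python sketch of the shared-evaluation idea argued for in both records above: synthetic spectra (Poisson background plus an optional Gaussian photopeak), a naive peak-window detection statistic, and one common metric computed over many trials. All rates, energies, and the statistic itself are simplified stand-ins for the project's real datasets and baselines.

      import numpy as np

      rng = np.random.default_rng(0)
      channels = np.arange(1024)
      background = 50.0 * np.exp(-channels / 400.0)                   # smooth continuum
      peak = 8.0 * np.exp(-((channels - 662) ** 2) / (2 * 8.0 ** 2))  # Cs-137-like line

      def spectrum(threat):
          """Draw one Poisson-noised spectrum, with or without the threat line."""
          return rng.poisson(background + (peak if threat else 0.0))

      def score(spec):
          """Naive statistic: background-subtracted counts in the peak window."""
          expected = background[640:684].sum()
          return (spec[640:684].sum() - expected) / np.sqrt(expected)

      labels = np.arange(200) % 2 == 1                 # half threat, half benign
      scores = np.array([score(spectrum(t)) for t in labels])

      # One shared metric: detection probability at a fixed ~5% false-alarm rate.
      threshold = np.quantile(scores[~labels], 0.95)
      print(f"P(detect) at ~5% FAR: {(scores[labels] > threshold).mean():.2f}")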

  13. Regional Economic Accounting (REAcct). A software tool for rapidly approximating economic impacts

    Energy Technology Data Exchange (ETDEWEB)

    Ehlen, Mark Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vargas, Vanessa N. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Loose, Verne William [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Starks, Shirley J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ellebracht, Lory A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2011-07-01

    This paper describes the Regional Economic Accounting (REAcct) analysis tool, which has been in use for the last five years to rapidly estimate approximate economic impacts of disruptions due to natural or manmade events. It is based on and derived from the well-known and extensively documented input-output modeling technique initially presented by Leontief and more recently further developed by numerous contributors. REAcct provides county-level economic impact estimates in terms of gross domestic product (GDP) and employment for any area in the United States. The process for using REAcct incorporates geospatial computational tools and site-specific economic data, permitting the identification of geographic impact zones so that differential magnitude and duration estimates can be specified for the regions affected by a simulated or actual event. Using these data as input to REAcct, the number of affected employees in each of 39 directly affected economic sectors (37 industry production sectors and 2 government sectors) is calculated and aggregated to provide direct impact estimates. Indirect estimates are then calculated using Regional Input-Output Modeling System (RIMS II) multipliers. The interdependent relationships between critical infrastructures, industries, and markets are captured by the relationships embedded in the input-output modeling structure.
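
    A schematic Python version of the calculation chain just described: direct impacts from affected employment per sector, scaled to GDP over the disruption period, then indirect impacts via sector multipliers. The sector list, GDP-per-employee figures, and multiplier values are illustrative placeholders, not RIMS II data.

      sectors = {
          # sector: (affected employees, GDP per employee $/yr, output multiplier)
          "manufacturing": (1200, 160_000, 1.9),
          "retail":        ( 800,  90_000, 1.5),
          "government":    ( 300, 110_000, 1.4),
      }

      duration_fraction = 14 / 365          # a 14-day disruption

      total_direct = total_with_indirect = 0.0
      for name, (employees, gdp_per_emp, multiplier) in sectors.items():
          direct = employees * gdp_per_emp * duration_fraction
          total_direct += direct
          total_with_indirect += direct * multiplier   # direct + indirect

      print(f"direct GDP impact:   ${total_direct:,.0f}")
      print(f"total (w/ indirect): ${total_with_indirect:,.0f}")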

  14. Theoretical Tools and Software for Modeling, Simulation and Control Design of Rocket Test Facilities

    Science.gov (United States)

    Richter, Hanz

    2004-01-01

    A rocket test stand and its associated subsystems are complex devices whose operation requires that certain preparatory calculations be carried out before a test. In addition, real-time control calculations must be performed during the test, and further calculations are carried out after a test is completed. The latter may be required in order to evaluate whether a particular test conformed to specifications. These calculations are used to set valve positions, pressure setpoints, control gains and other operating parameters so that a desired system behavior is obtained and the test can be successfully carried out. Currently, calculations are made in an ad-hoc fashion and involve trial-and-error procedures that may require activating the system with the sole purpose of finding the correct parameter settings. The goals of this project are to develop mathematical models, control methodologies and associated simulation environments to provide a systematic and comprehensive prediction and real-time control capability. The models and controller designs are expected to be useful in two respects: 1) as a design tool, a model is the only way to determine the effects of design choices without building a prototype, which is, in the context of rocket test stands, impracticable; 2) as a prediction and tuning tool, a good model allows system parameters to be set off-line, so that the expected system response conforms to specifications. This includes the setting of physical parameters, such as valve positions, and the configuration and tuning of any feedback controllers in the loop.
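
    The "prediction and tuning" use can be illustrated with a minimal Python sketch: a first-order lumped pressure model driven by a valve command, with candidate PI gains evaluated entirely off-line before any hardware is activated. The dynamics and gains are invented for illustration and bear no relation to an actual test-stand model.

      import numpy as np

      tau, gain = 4.0, 2.5          # plant: dP/dt = (gain * u - P) / tau
      kp, ki = 1.2, 0.4             # candidate PI gains to be evaluated off-line
      setpoint, dt = 10.0, 0.01     # pressure setpoint and integration step

      P, integral, history = 0.0, 0.0, []
      for _ in range(int(30.0 / dt)):
          error = setpoint - P
          integral += error * dt
          u = np.clip(kp * error + ki * integral, 0.0, 10.0)   # bounded valve command
          P += (gain * u - P) / tau * dt                       # explicit Euler step
          history.append(P)

      # Off-line acceptance check before committing gains to the real stand.
      overshoot = (max(history) - setpoint) / setpoint
      print(f"final pressure: {history[-1]:.2f}, overshoot: {overshoot:.1%}")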

  15. Safety assessment driving radioactive waste management solutions (SADRWMS Methodology) implemented in a software tool (SAFRAN)

    Energy Technology Data Exchange (ETDEWEB)

    Kinker, M., E-mail: M.Kinker@iaea.org [International Atomic Energy Agency (IAEA), Vienna (Austria); Avila, R.; Hofman, D., E-mail: rodolfo@facilia.se [FACILIA AB, Stockholm (Sweden)]; Jova Sed, L., E-mail: jovaluis@gmail.com [Centro Nacional de Seguridad Nuclear (CNSN), La Habana (Cuba)]; Ledroit, F., E-mail: frederic.ledroit@irsn.fr [IRSN PSN-EXP/SSRD/BTE (France)]

    2013-07-01

    In 2004, the International Atomic Energy Agency (IAEA) organized the International Project on Safety Assessment Driving Radioactive Waste Management Solutions (SADRWMS) to examine international approaches to safety assessment for predisposal management of radioactive waste. The initial outcome of the SADRWMS Project was achieved through the development of flowcharts which could be used to improve the mechanisms for applying safety assessment methodologies to predisposal management of radioactive waste. These flowcharts have since been incorporated into DS284 (General Safety Guide on the Safety Case and Safety Assessment for Predisposal Management of Radioactive Waste), and were also considered during the early development stages of the Safety Assessment Framework (SAFRAN) Tool. In 2009 the IAEA presented DS284 to the IAEA Waste Safety Standards Committee, during which it was proposed that the graded approach to safety case and safety assessment be illustrated through the development of Safety Reports for representative predisposal radioactive waste management facilities and activities. To oversee the development of these reports, it was agreed to establish the International Project on Complementary Safety Reports: Development and Application to Waste Management Facilities (CRAFT). The goal of the CRAFT project is to develop complementary reports by 2014, which the IAEA could then publish as IAEA Safety Reports. The present work describes how the DS284 methodology and the SAFRAN Tool can be applied in the development and review of the safety case and safety assessment for a range of predisposal waste management facilities and activities within the Region. (author)

  16. Development of a new software tool, based on ANN technology, in neutron spectrometry and dosimetry research

    International Nuclear Information System (INIS)

    Artificial Intelligence is a branch of study which enhances the capability of computers by giving them human-like intelligence. The brain architecture has been extensively studied, and attempts have been made to emulate it, as in Artificial Neural Network (ANN) technology. A large variety of neural network architectures have been developed, and they have gained widespread popularity over the last few decades. Their application is considered a substitute for many classical techniques that have been used for many years, as in the neutron spectrometry and dosimetry research areas. In previous works, a new approach called Robust Design of Artificial Neural Networks was applied to build an ANN topology capable of solving the neutron spectrometry and dosimetry problems within the MATLAB programming environment. In this work, the knowledge stored in the MATLAB ANN's synaptic weights was extracted in order to develop, for the first time, a customized software application based on ANN technology, which is proposed for use in the neutron spectrometry and simultaneous dosimetry fields. (Author)
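
    The weight-extraction idea can be sketched as follows: once the trained network's weight matrices and biases are exported from MATLAB, the forward pass is a few lines of standalone Python, freeing the application from the training environment. The layer sizes below (7 inputs, one tanh hidden layer, 31 output bins) and the random parameter values are placeholders; the record does not give the published network's actual topology.

      import numpy as np

      rng = np.random.default_rng(1)

      # Hypothetical exported parameters: 7 inputs -> 10 tanh hidden -> 31 linear outputs.
      W1, b1 = rng.normal(size=(10, 7)), np.zeros(10)
      W2, b2 = rng.normal(size=(31, 10)), np.zeros(31)

      def forward(rates):
          """Map 7 detector count rates to a 31-bin neutron spectrum estimate."""
          hidden = np.tanh(W1 @ rates + b1)        # hidden layer (tanh activation)
          return W2 @ hidden + b2                  # linear output layer

      print(forward(np.ones(7)).shape)             # (31,)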

  17. CAGO: a software tool for dynamic visual comparison and correlation measurement of genome organization.

    Directory of Open Access Journals (Sweden)

    Yi-Feng Chang

    Full Text Available CAGO (Comparative Analysis of Genome Organization) is developed to address two critical shortcomings of conventional genome atlas plotters: lack of dynamic exploratory functions and absence of signal analysis for genomic properties. With dynamic exploratory functions, users can directly manipulate chromosome tracks of a genome atlas and intuitively identify distinct genomic signals by visual comparison. Signal analysis of genomic properties can further detect inconspicuous patterns from noisy genomic properties and calculate correlations between genomic properties across various genomes. To implement dynamic exploratory functions, CAGO presents each genome atlas in Scalable Vector Graphics (SVG) format and allows users to interact with it using an SVG viewer through JavaScript. Signal analysis functions are implemented using the R statistical software and the discrete wavelet transformation package waveslim. CAGO is not only a plotter for generating complex genome atlases, but also a platform for exploring genome atlases with dynamic exploratory functions for visual comparison and with signal analysis for comparing genomic properties across multiple organisms. The web-based application of CAGO, its source code, user guides, video demos, and live examples are publicly available and can be accessed at http://cbs.ym.edu.tw/cago.

  18. Perturbation experiments to investigate the impact of ocean acidification: approaches and software tools

    Directory of Open Access Journals (Sweden)

    J.-P. Gattuso

    2009-04-01

    Full Text Available Although future changes in seawater carbonate chemistry are well constrained, their impact on marine organisms and ecosystems remains poorly known. The biological response to ocean acidification is a recent field of research, as most purposeful experiments have only been carried out since the late 1990s. The potentially dire consequences of ocean acidification attract scientists and students with limited knowledge of the carbonate chemistry and its experimental manipulation. Hence, some guidelines on carbonate chemistry manipulations may be helpful for the growing ocean acidification community to maintain comparability. Perturbation experiments are one of the key approaches used to investigate the biological response to elevated pCO2. They are based on measurements of physiological or metabolic processes in organisms and communities exposed to seawater with normal or altered carbonate chemistry. Seawater chemistry can be manipulated in different ways depending on the facilities available and on the question being addressed. The goal of this paper is (1) to examine the benefits and drawbacks of various manipulation techniques and (2) to describe a new version of the R software package seacarb, which includes new functions aimed at assisting the design of ocean acidification perturbation experiments. Three approaches closely mimic the on-going and future changes in seawater carbonate chemistry: gas bubbling, addition of high-CO2 seawater, and combined additions of acid and bicarbonate and/or carbonate.
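
    A toy Python version of the carbonate-system arithmetic that packages such as seacarb automate: given dissolved inorganic carbon and carbonate alkalinity, solve for [H+] by bisection and report pH and pCO2. The equilibrium constants are rough 25 degC, S = 35 values, and the borate and water contributions to alkalinity are ignored, so this illustrates the method rather than substituting for seacarb.

      import math

      K1, K2 = 1.4e-6, 1.1e-9   # approximate carbonic acid dissociation constants
      K0 = 2.8e-2               # approximate CO2 solubility, mol/(kg*atm)

      def carbonate_alkalinity(h, dic):
          """[HCO3-] + 2[CO3--] for a given [H+] and dissolved inorganic carbon."""
          d = h * h + K1 * h + K1 * K2
          return dic * (K1 * h + 2.0 * K1 * K2) / d

      def solve_h(dic, alk, lo=1e-10, hi=1e-6):
          """Bisection (on a log scale) for the [H+] that matches the alkalinity."""
          for _ in range(100):
              mid = (lo * hi) ** 0.5
              if carbonate_alkalinity(mid, dic) > alk:
                  lo = mid          # too alkaline: the solution needs more H+
              else:
                  hi = mid
          return (lo * hi) ** 0.5

      dic, alk = 2.05e-3, 2.30e-3                       # mol/kg, typical surface values
      h = solve_h(dic, alk)
      co2 = dic * h * h / (h * h + K1 * h + K1 * K2)    # dissolved CO2, mol/kg
      print(f"pH ~ {-math.log10(h):.2f}, pCO2 ~ {co2 / K0 * 1e6:.0f} uatm")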

  19. Determination of Flux linkage Characteristics and Inductance of a Submersible Switched Reluctance Motor using Software Tools

    Directory of Open Access Journals (Sweden)

    Sundaram Maruthachalam

    2011-01-01

    Full Text Available Problem statement: The Switched Reluctance Motor (SRM) is an old member of the electric machines family. Its simple structure, ruggedness and inexpensive manufacturability make it attractive for industrial applications. Engineers are now exploring applications of switched reluctance motors in various industrial fields; however, switched reluctance motors have not so far been used in submersible underwater motors for agricultural purposes. The torque developed by an SRM depends on the change of flux linkage with rotor position. The flux linkage characteristic of the motor is required to design the control circuit. Since the SRM is non-linear in nature, estimating and calculating the flux linkage characteristics is difficult. Approach: Using the flux tube method concept, a simple algorithm was developed in MATLAB. ANSYS software was used to determine the flux distribution at various rotor positions. Results: The aligned and unaligned flux linkage values from the theoretical calculation at a current of 7 A are 72.7 mWb and 13.79 mWb, respectively. The corresponding values obtained from FEA simulation are 92.73 mWb and 19.175 mWb. Conclusion: A simplified method for determining the flux linkage characteristics of a submersible SRM using MATLAB has been presented, and the results have been validated against the ANSYS FEM method. The calculated unaligned and aligned inductance values of a 4-phase, 3 hp, 220 V submersible SRM obtained with the simplified MATLAB method closely match those of the ANSYS FEM method.
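
    As a worked check of the figures quoted above, inductance can be approximated from flux linkage as L = lambda / i; real SRMs saturate, so the aligned value in particular is only indicative. A short Python sketch using the record's FEA values at 7 A:

      i = 7.0                          # phase current, A
      lam_aligned = 92.73e-3           # aligned flux linkage, Wb (ANSYS FEA value)
      lam_unaligned = 19.175e-3        # unaligned flux linkage, Wb (ANSYS FEA value)

      L_aligned = lam_aligned / i      # ~13.2 mH
      L_unaligned = lam_unaligned / i  # ~2.7 mH
      print(f"L_aligned ~ {L_aligned * 1e3:.1f} mH, "
            f"L_unaligned ~ {L_unaligned * 1e3:.1f} mH")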

  20. CAGO: A Software Tool for Dynamic Visual Comparison and Correlation Measurement of Genome Organization

    Science.gov (United States)

    Chang, Yi-Feng; Chang, Chuan-Hsiung

    2011-01-01

    CAGO (Comparative Analysis of Genome Organization) is developed to address two critical shortcomings of conventional genome atlas plotters: lack of dynamic exploratory functions and absence of signal analysis for genomic properties. With dynamic exploratory functions, users can directly manipulate chromosome tracks of a genome atlas and intuitively identify distinct genomic signals by visual comparison. Signal analysis of genomic properties can further detect inconspicuous patterns from noisy genomic properties and calculate correlations between genomic properties across various genomes. To implement dynamic exploratory functions, CAGO presents each genome atlas in Scalable Vector Graphics (SVG) format and allows users to interact with it using an SVG viewer through JavaScript. Signal analysis functions are implemented using the R statistical software and the discrete wavelet transformation package waveslim. CAGO is not only a plotter for generating complex genome atlases, but also a platform for exploring genome atlases with dynamic exploratory functions for visual comparison and with signal analysis for comparing genomic properties across multiple organisms. The web-based application of CAGO, its source code, user guides, video demos, and live examples are publicly available and can be accessed at http://cbs.ym.edu.tw/cago. PMID:22114666
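
    A rough Python analogue of CAGO's signal-analysis step (the tool itself uses R and waveslim): windowed GC content as the genomic-property signal, a discrete wavelet transform to suppress fine-scale detail, and a correlation between the smoothed signals of two genomes. The random sequences, window size, wavelet, and decomposition level are arbitrary choices for illustration.

      import numpy as np
      import pywt

      rng = np.random.default_rng(42)

      def gc_signal(genome, window=1000):
          """Fraction of G/C per non-overlapping window along the genome."""
          bases = np.frombuffer(genome.encode(), dtype=np.uint8)
          is_gc = (bases == ord("G")) | (bases == ord("C"))
          n = len(genome) // window * window
          return is_gc[:n].reshape(-1, window).mean(axis=1)

      def smooth(signal, level=3):
          """Keep only the coarse DWT approximation (denoised signal)."""
          coeffs = pywt.wavedec(signal, "db4", level=level)
          coeffs[1:] = [np.zeros_like(c) for c in coeffs[1:]]  # zero detail bands
          return pywt.waverec(coeffs, "db4")[: len(signal)]

      g1 = "".join(rng.choice(list("ACGT"), size=512_000))
      g2 = "".join(rng.choice(list("ACGT"), size=512_000))

      s1, s2 = smooth(gc_signal(g1)), smooth(gc_signal(g2))
      print(f"correlation between genomic-property signals: "
            f"{np.corrcoef(s1, s2)[0, 1]:+.3f}")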