WorldWideScience

Sample records for bartab software tools

  1. Software engineering tools.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1994-01-01

    We have looked at general descriptions and illustrations of several software development tools, such as tools for prototyping, developing DFDs, testing, and maintenance. Many others are available, and new ones are being developed. However, you have at least seen some examples of powerful CASE tools for systems development.

  2. Machine Tool Software

    Science.gov (United States)

    1988-01-01

A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using Automatically Programmed Tool (APT) software since 1969 in his Computer Aided Design and Manufacturing (CAD/CAM) curriculum. Professor Hack teaches the use of the APT programming language for control of metal-cutting machines. Machine tool instructions are geometry definitions written in the APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  3. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

Over the years many engineering disciplines have developed, including chemical, electronic, and others. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed, and the track record has not been good. Software development projects often miss schedules, run over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems require larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring engineering discipline to software development is productivity. It has been estimated that the productivity of producing software has increased only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for a general evaluation of computer-aided software engineering (CASE) tools, based on actual installation of and experimentation with some specific tools.

  4. Modern Tools for Modern Software

    Energy Technology Data Exchange (ETDEWEB)

    Kumfert, G; Epperly, T

    2001-10-31

This is a proposal for a new software configure/build tool for building, maintaining, deploying, and installing software. At its completion, this new tool will replace current standard tool suites such as autoconf, automake, and libtool, and the de facto standard build tool, make. This ambitious project is born out of the realization that as scientific software has grown in size and complexity over the years, the difficulty of configuring and building software has increased as well. For high-performance scientific software, additional complexity often arises from the need for portability to multiple platforms (including many one-of-a-kind platforms), multilanguage implementations, use of third-party libraries, and a need to adapt algorithms to the specific features of the hardware. Development of scientific software is being hampered by the quality of the configuration and build tools commonly available. Inordinate amounts of time and expertise are required to develop and maintain the configure and build system for a moderately complex project. Better build and configure tools will increase developer productivity. This proposal is a first step in a process of shoring up the foundation upon which DOE software is created and used.

  5. Toxicity Estimation Software Tool (TEST)

    Science.gov (United States)

    The Toxicity Estimation Software Tool (TEST) was developed to allow users to easily estimate the toxicity of chemicals using Quantitative Structure Activity Relationships (QSARs) methodologies. QSARs are mathematical models used to predict measures of toxicity from the physical c...
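The core QSAR idea, a fitted mathematical model mapping physical descriptors of a chemical to a toxicity estimate, can be sketched in a few lines. The descriptor and endpoint values below are invented for illustration; TEST's actual models and training data are far more sophisticated:

```python
# Illustrative only: a one-descriptor linear QSAR, not TEST's actual models.

def fit_linear_qsar(x, y):
    """Ordinary least-squares fit y ~ a*x + b for a single descriptor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    b = my - a * mx
    return a, b

logp = [1.0, 2.0, 3.0, 4.0]   # hypothetical descriptor values (e.g. logP)
tox  = [2.1, 3.0, 3.9, 5.0]   # hypothetical measured toxicity endpoints

a, b = fit_linear_qsar(logp, tox)
predict = lambda x: a * x + b  # toxicity estimate for a new chemical
```

A real QSAR workflow would add many descriptors, validate against held-out chemicals, and report an applicability domain for the model.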

  6. Software Release Procedure and Tools

    OpenAIRE

    Giammatteo, Gabriele; Frosini, Luca; Laskaris, Nikolas

    2015-01-01

    Deliverable D4.1 - "Software Release Procedures and Tools" aims to provide a detailed description of the procedures applied and tools used to manage releases of the gCube System within Work Package 4. gCube System is the software at the basis of all VREs applications, data management services and portals. Given the large size of the gCube system, its high degree of modularity and the number of developers involved in the implementation, a set of procedures that formalize and simplify the integ...

  7. Design of parametric software tools

    DEFF Research Database (Denmark)

    Sabra, Jakob Borrits; Mullins, Michael

    2011-01-01

The studies investigate the field of evidence-based design used in architectural design practice and propose a method using 2D/3D CAD applications to: 1) enhance integration of evidence-based design knowledge in architectural design phases, with a focus on lighting and interior design, and 2) assess fulfilment of evidence-based design criteria regarding light distribution and location in relation to patient safety in architectural health care design proposals. The study uses the 2D/3D CAD modelling software Rhinoceros 3D with the plug-in Grasshopper to create parametric tool prototypes to exemplify ...

  8. Tools for Embedded Computing Systems Software

    Science.gov (United States)

    1978-01-01

A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis and the key figures of each workshop presentation, together with the chairmen's summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  9. Software Tools Streamline Project Management

    Science.gov (United States)

    2009-01-01

Three innovative software inventions from Ames Research Center (NETMARK, Program Management Tool, and Query-Based Document Management) are finding their way into NASA missions as well as industry applications. The first, NETMARK, is a program that enables integrated searching of data stored in a variety of databases and documents, meaning that users no longer have to look in several places for related information. NETMARK allows users to search and query information across all of these sources in one step. This cross-cutting capability in information analysis has reduced the amount of time needed to mine data from days or weeks to mere seconds. NETMARK has been used widely throughout NASA, enabling this automatic integration of information across many documents and databases. NASA projects that use NETMARK include the internal reporting system and project performance dashboard Erasmus, NASA's enterprise management tool, which enhances organizational collaboration and information sharing through document routing and review; the Integrated Financial Management Program; International Space Station Knowledge Management; the Mishap and Anomaly Information Reporting System; and management of the Mars Exploration Rovers. Approximately $1 billion worth of NASA's projects are currently managed using the Program Management Tool (PMT), which is based on NETMARK. PMT is a comprehensive, Web-enabled application tool used to assist program and project managers within NASA enterprises in monitoring, disseminating, and tracking the progress of program and project milestones and other relevant resources. PMT consists of an integrated knowledge repository built upon advanced enterprise-wide database integration techniques and the latest Web-enabled technologies. The current system is in a pilot operational mode, allowing users to automatically manage, track, define, update, and view customizable milestone objectives and goals. The third software invention, Query-Based Document Management ...

  10. Software tools for optical interferometry

    Science.gov (United States)

    Thureau, Nathalie D.; Ireland, Michael; Monnier, John D.; Pedretti, Ettore

    2006-06-01

We describe a set of general purpose utilities for visualizing and manipulating optical interferometry data stored in the FITS-based OIFITS data format. This class of routines contains code like the OiPlot navigation/visualization tool, which allows the user to extract visibility, closure phase and UV-coverage information from OIFITS files and to display the information in various ways. OiPlot also has basic data model fitting capabilities which can be used for a rapid first analysis of the scientific data. More advanced image reconstruction techniques are part of a dedicated utility. In addition, these routines allow data from multiple interferometers to be combined and used together. Part of our work also aims at developing software specific to the Michigan InfraRed Combiner (MIRC). Our experience designing a flexible and robust graphical user interface based on sockets using Python libraries has wide applicability, and this paper discusses the practicalities.
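The UV-coverage information such a tool extracts reduces, per observation, to simple arithmetic: a projected baseline divided by the observing wavelength gives a spatial-frequency sample. A minimal sketch, with made-up baseline and wavelength values (not MIRC data):

```python
import math

# A projected baseline (Bx, By) in metres observed at wavelength lam (metres)
# samples the spatial frequency (u, v) = (Bx/lam, By/lam), in cycles/radian.
# The numbers below are invented for illustration.

def uv_point(bx, by, lam):
    return bx / lam, by / lam

def radial_frequency(u, v):
    """Distance from the UV origin, i.e. the baseline length in wavelengths."""
    return math.hypot(u, v)

u, v = uv_point(60.0, 80.0, 1.6e-6)  # a 100 m baseline at an H-band wavelength
r = radial_frequency(u, v)           # about 6.25e7 cycles/radian
```

Longer baselines or shorter wavelengths push `r` outward, which is why combining data from multiple interferometers fills the UV plane.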

  11. A Software Tool for Legal Drafting

    Directory of Open Access Journals (Sweden)

    Daniel Gorín

    2011-09-01

Although many attempts at automated aids for legal drafting have been made, they were based on the construction of a new tool, completely from scratch. This is at least curious, considering that a strong parallelism can be established between a normative document and a software specification: both describe what an entity should or should not do, can or cannot do. In this article we compare normative documents and software specifications to find out their similarities and differences. The comparison shows that there are distinctive particularities, but they are restricted to a very specific subclass of normative propositions. The rest, we postulate, can be dealt with by software tools. For such an enterprise the FormaLex tool set was devised: an LTL-based language and companion tools that utilize model checking to find normative incoherences in regulations, contracts and other legal documents. A feature-rich case study is analyzed with the presented tools.
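As a toy illustration of what "finding normative incoherences" means, the check below flags an action that one norm obliges and another forbids. It is deliberately much simpler than FormaLex's LTL-based model checking, and the norm set is hypothetical:

```python
# Toy incoherence check: flag actions that one norm obliges ("O") and
# another forbids ("F"). Norms are (modality, action) pairs; the actions
# are made up for illustration.

def incoherences(norms):
    obliged = {a for m, a in norms if m == "O"}
    forbidden = {a for m, a in norms if m == "F"}
    return sorted(obliged & forbidden)

norms = [("O", "file_report"), ("F", "share_data"), ("F", "file_report")]
conflicts = incoherences(norms)  # 'file_report' is both obliged and forbidden
```

Real normative conflicts are usually conditional and temporal ("must file within 30 days unless..."), which is why the actual tool set needs an LTL language and a model checker rather than set intersection.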

  12. A Software Tool for Legal Drafting

    CERN Document Server

Gorín, Daniel; Schapachnik, Fernando; DOI: 10.4204/EPTCS.68.7

    2011-01-01

Although many attempts at automated aids for legal drafting have been made, they were based on the construction of a new tool, completely from scratch. This is at least curious, considering that a strong parallelism can be established between a normative document and a software specification: both describe what an entity should or should not do, can or cannot do. In this article we compare normative documents and software specifications to find out their similarities and differences. The comparison shows that there are distinctive particularities, but they are restricted to a very specific subclass of normative propositions. The rest, we postulate, can be dealt with by software tools. For such an enterprise the FormaLex tool set was devised: an LTL-based language and companion tools that utilize model checking to find normative incoherences in regulations, contracts and other legal documents. A feature-rich case study is analyzed with the presented tools.

  13. Some software tools for scientific programming

    International Nuclear Information System (INIS)

A number of advanced software tools are described which have been used by Logica or its clients for scientific or technical software development. These are: RAPPORT, a Relational Database Management System; a Fortran Program Analyser, which is designed to answer those questions about large Fortran programs which are not easily answered by examining listings; a Test Coverage Monitor, which measures how well the code and branches in a Fortran program have been exercised by a set of test runs; and the UNIX operating system and the tools available with it. These tools are described with examples of their use in practice. (orig.)
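The idea behind the test coverage monitor can be illustrated with a small sketch in Python rather than Fortran: record which lines of a function actually execute during a run, so untested branches stand out. This is only an illustration of the concept, not the Logica tool:

```python
import sys

# Miniature line-coverage monitor: trace a single function and collect the
# relative line numbers (offsets from its 'def' line) that actually execute.

def trace_lines(func, *args):
    executed = set()
    def tracer(frame, event, arg):
        if event == "line" and frame.f_code is func.__code__:
            executed.add(frame.f_lineno - func.__code__.co_firstlineno)
        return tracer
    old = sys.gettrace()          # preserve any tracer already installed
    sys.settrace(tracer)
    try:
        func(*args)
    finally:
        sys.settrace(old)
    return executed

def branchy(x):
    if x > 0:
        return "positive"
    return "non-positive"

covered = trace_lines(branchy, 5)  # the 'non-positive' branch is never run
```

Comparing `covered` across a whole test suite reveals which branches no test exercises, which is exactly what the Fortran coverage monitor reports.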

  14. Tool Use Within NASA Software Quality Assurance

    Science.gov (United States)

    Shigeta, Denise; Port, Dan; Nikora, Allen P.; Wilf, Joel

    2013-01-01

As space mission software systems become larger and more complex, it is increasingly important for the software assurance effort to have the ability to effectively assess both the artifacts produced during software system development and the development process itself. Conceptually, assurance is a straightforward idea: it is the result of activities carried out by an organization independent of the software developers to better inform project management of potential technical and programmatic risks, and thus increase management's confidence in the decisions they ultimately make. In practice, effective assurance for large, complex systems often entails assessing large, complex software artifacts (e.g., requirements specifications, architectural descriptions) as well as substantial amounts of unstructured information (e.g., anomaly reports resulting from testing activities during development). In such an environment, assurance engineers can benefit greatly from appropriate tool support. In order to do so, an assurance organization needs accurate and timely information on the tool support available for various types of assurance activities. In this paper, we investigate the current use of tool support for assurance organizations within NASA, and describe ongoing work at JPL on providing assurance organizations with the information about tools they need to use them effectively.

  15. Software Tools Used for Continuous Assessment

    Directory of Open Access Journals (Sweden)

    Corina SBUGHEA

    2016-04-01

The present paper addresses the subject of continuous evaluation and of the IT tools that support it. The approach starts from the main concepts and methods used in the teaching process, according to the assessment methodology, and then focuses on their implementation in the Wondershare QuizCreator software.

  16. ATLAS software configuration and build tool optimisation

    CERN Document Server

    Rybkin, G; The ATLAS collaboration

    2014-01-01

    ATLAS software code base is over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories in 6 continents. To meet the challenge of configuration and building of this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance that was optimised within several approaches: reduction of the number of re...

  17. ATLAS software configuration and build tool optimisation

    CERN Document Server

    Rybkin, G; The ATLAS collaboration

    2013-01-01

    ATLAS software code base is over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories in 6 continents. To meet the challenge of configuration and building of this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance that was optimised within several approaches: reduction of the number of re...

  18. Software Engineering Tools for Scientific Models

    Science.gov (United States)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere- ocean model called ModelE, written in fixed format Fortran.

  19. ATLAS software configuration and build tool optimisation

    Science.gov (United States)

    Rybkin, Grigory; Atlas Collaboration

    2014-06-01

The ATLAS software code base is over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories on 6 continents. To meet the challenge of configuration and building of this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, and dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance, which was optimised with several approaches: reduction of the number of reads of requirements files, which are now read once per package by a CMT build command that generates cached requirements files for subsequent CMT build commands; introduction of more fine-grained build parallelism at package task level, i.e., dependent applications and libraries are compiled in parallel; code optimisation of the CMT commands used for the build; and introduction of package-level build parallelism, i.e., parallelising the build of independent packages. By default, CMT launches NUMBER-OF-PROCESSORS build commands in parallel. The other focus was on CMT command optimisation in general, which made the commands approximately 2 times faster. CMT can generate a cached requirements file for the environment setup command, which is especially useful for deployment on distributed file systems like AFS or CERN VMFS. The use of parallelism, caching and code optimisation significantly (by several times) reduced software build time and environment setup time, and increased the efficiency of ...
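The package-level build parallelism described above amounts to building, in parallel, every package whose dependencies are already built. A minimal sketch (package names and the build action are invented; CMT's real scheduling is considerably richer):

```python
from concurrent.futures import ThreadPoolExecutor

# Build packages in parallel "waves": a package joins the current wave once
# all of its dependencies have been built. Names and the build action are
# placeholders for illustration.

deps = {"core": [], "util": [], "reco": ["core", "util"], "analysis": ["reco"]}
built, order = set(), []

def build(pkg):
    order.append(pkg)  # stand-in for invoking the real build command

while len(built) < len(deps):
    wave = [p for p in deps
            if p not in built and all(d in built for d in deps[p])]
    with ThreadPoolExecutor() as pool:   # independent packages in parallel
        list(pool.map(build, wave))
    built.update(wave)
```

Here `core` and `util` build concurrently in the first wave, then `reco`, then `analysis`; the finer task-level parallelism mentioned in the abstract would apply the same idea inside each package.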

  20. Towards an interoperability ontology for software development tools

    OpenAIRE

    Hasni, Neji.

    2003-01-01

Approved for public release; distribution is unlimited. The automation of software development has long been a goal of software engineering, to increase the efficiency of the development effort and improve the software product. This efficiency (high productivity with fewer software faults) results from best practices in building, managing and testing software projects via the use of these automated tools and processes. However, each software development tool has its own characteristics, semantic...

  1. Software Tools for Stochastic Simulations of Turbulence

    Science.gov (United States)

    Kaufman, Ryan

We present two software tools useful for the analysis of mesh-based physics application data, and specifically for turbulent mixing simulations. Each has a broader, but separate, scope, as we describe. Both tools play a key role as we push computational science to its limits, and thus the present work contributes to the frontier of research. The first tool is Wstar, a weak* comparison tool, which addresses the stochastic nature of turbulent flow. The goal is to compare underresolved turbulent data in convergence, parameter dependence, or validation studies. This is achieved by separating space-time data from state data (e.g. density, pressure, momentum, etc.) through coarsening and sampling. The collection of fine-grained data in a single coarse cell is treated as a random sample in state space, whose cumulative distribution function defines a measure within that cell. This set of measures, with the spatial dependence defined by the coarse grid, defines a Young measure solution to the PDE. The second tool is a front tracking application programming interface (API) called FTI. It has the capability to generate geometric surfaces (e.g. the location of interspecies boundaries) of high complexity, and track them dynamically. FTI also includes the ghost fluid method, which enables mesh-based fluid codes to maintain sharpness at interspecies boundaries by modifying solution stencils that cross such a boundary. FTI outlines and standardizes the methods involved in this model. FronTier, as developed here, is a software package which implements this standard. The client must implement the physics and grid interpolation routines outlined in the client interface to FTI. Specific client programs using this interface include the weather forecasting code WRF; the high energy physics code FLASH; and two locally constructed fluid codes, cFluid and iFluid, for compressible and incompressible flow respectively.
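The coarsening-and-sampling step behind the weak* comparison can be sketched concretely: fine-grid values falling in one coarse cell form a sample, each sample is summarized by its empirical CDF, and two solutions are compared cell by cell. The 1-D data below are invented; Wstar itself handles full space-time data and richer comparison metrics:

```python
# Sketch of the weak* comparison idea on a 1-D field. All values invented.

def coarsen(values, cells_per_coarse):
    """Group a 1-D fine-grid field into coarse-cell samples."""
    return [values[i:i + cells_per_coarse]
            for i in range(0, len(values), cells_per_coarse)]

def ecdf(sample, x):
    """Empirical CDF of a coarse-cell sample, evaluated at x."""
    return sum(1 for s in sample if s <= x) / len(sample)

def cell_distance(a, b, grid):
    """Max CDF discrepancy between two samples (Kolmogorov-Smirnov style)."""
    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in grid)

fine1 = [0.1, 0.2, 0.9, 1.0, 0.4, 0.5, 0.6, 0.7]
fine2 = [0.1, 0.3, 0.8, 1.0, 0.4, 0.5, 0.6, 0.7]
cells1, cells2 = coarsen(fine1, 4), coarsen(fine2, 4)
grid = [i / 10 for i in range(11)]
dists = [cell_distance(a, b, grid) for a, b in zip(cells1, cells2)]
```

The first coarse cell differs (distance 0.25) while the second is identical (distance 0), even though a pointwise comparison of the fine grids would flag the first cell's differences as large local errors.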

  2. Selecting and effectively using a computer aided software engineering tool

    Energy Technology Data Exchange (ETDEWEB)

    Kuhn, D.L.

    1989-01-01

Software engineering is a science by which user requirements are translated into a quality software product. Computer Aided Software Engineering (CASE) is the scientific application of a set of tools and methods to software development, resulting in high-quality, defect-free, and maintainable software products. The Computer Systems Engineering (CSE) group of Separations Technology at the Savannah River Site has successfully used CASE tools to produce high-quality, reliable, and maintainable software products. This paper details the selection process CSE used to acquire a commonly available CASE product and how the CSE group effectively used this CASE tool to consistently produce quality software. 9 refs.

  3. Tool Support for Software Lookup Table Optimization

    Directory of Open Access Journals (Sweden)

    Chris Wilcox

    2011-01-01

A number of scientific applications are performance-limited by expressions that repeatedly call costly elementary functions. Lookup table (LUT) optimization accelerates the evaluation of such functions by reusing previously computed results. LUT methods can speed up applications that tolerate an approximation of function results, thereby achieving a high level of fuzzy reuse. One problem with LUT optimization is the difficulty of controlling the tradeoff between performance and accuracy. The current practice of manual LUT optimization adds programming effort by requiring extensive experimentation to make this tradeoff, and such hand tuning can obfuscate algorithms. In this paper we describe a methodology and tool implementation to improve the application of software LUT optimization. Our Mesa tool implements source-to-source transformations for C or C++ code to automate the tedious and error-prone aspects of LUT generation such as domain profiling, error analysis, and code generation. We evaluate Mesa with five scientific applications. Our results show a performance improvement of 3.0× and 6.9× for two molecular biology algorithms, 1.4× for a molecular dynamics program, 2.1× to 2.8× for a neural network application, and 4.6× for a hydrology calculation. We find that Mesa enables LUT optimization with more control over accuracy and less effort than manual approaches.
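A hand-written miniature of the transformation Mesa automates may make the performance/accuracy tradeoff concrete: precompute an elementary function over a bounded domain, then answer later calls by nearest-entry lookup. Domain, resolution, and the sampled function are chosen only for illustration:

```python
import math

# Hand-built software lookup table for sin on [0, pi]; illustration only.
LO, HI, N = 0.0, math.pi, 1024
STEP = (HI - LO) / N
TABLE = [math.sin(LO + i * STEP) for i in range(N + 1)]

def lut_sin(x):
    """Approximate sin(x) on [LO, HI] by nearest-entry lookup."""
    i = int(round((x - LO) / STEP))
    return TABLE[i]

# Fuzzy reuse trades accuracy for speed; the worst-case error shrinks as
# the table grows (roughly max|f'| * STEP / 2 for nearest-entry lookup).
err = max(abs(lut_sin(x / 100) - math.sin(x / 100)) for x in range(315))
```

Choosing `N` is exactly the tradeoff the abstract describes: the domain profiling and error analysis that Mesa automates are what make that choice principled instead of trial-and-error.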

  4. Sandia software guidelines: Volume 5, Tools, techniques, and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    1989-07-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. This volume describes software tools and methodologies available to Sandia personnel for the development of software, and outlines techniques that have proven useful within the Laboratories and elsewhere. References and evaluations by Sandia personnel are included. 6 figs.

  5. Tools and Behavioral Abstraction: A Direction for Software Engineering

    Science.gov (United States)

    Leino, K. Rustan M.

    As in other engineering professions, software engineers rely on tools. Such tools can analyze program texts and design specifications more automatically and in more detail than ever before. While many tools today are applied to find new defects in old code, I predict that more software-engineering tools of the future will be available to software authors at the time of authoring. If such analysis tools can be made to be fast enough and easy enough to use, they can help software engineers better produce and evolve programs.

  6. Herramientas libres para modelar software Free tools to model software

    Directory of Open Access Journals (Sweden)

Mauro Callejas Cuervo; Óscar Yovany Baquero Moreno

    2010-11-01

An observation on free software and its implications for software development processes with 4G tools, carried out by entities or individuals without astronomical capital and without the monopolistic mentality of dominating the market with expensive products that make their vendors multimillionaires while offering no real guarantee, nor even the possibility of knowing the software one has paid for, much less of modifying it if it does not meet our expectations.

  7. Herramientas libres para modelar software Free tools to model software

    OpenAIRE

Mauro Callejas Cuervo; Óscar Yovany Baquero Moreno

    2010-01-01

An observation on free software and its implications for software development processes with 4G tools, carried out by entities or individuals without astronomical capital and without the monopolistic mentality of dominating the market with expensive products that make their vendors multimillionaires while offering no real guarantee, nor even the possibility of knowing the software one has paid for, much less of modifying it if it does not meet our expectations.

  8. Towards E-CASE Tools for Software Engineering

    Directory of Open Access Journals (Sweden)

    Nabil Arman

    2013-02-01

CASE tools play an important role in all phases of software systems development and engineering. This is evident in the huge benefits obtained from using these tools, including their cost-effectiveness, rapid software application development, and improving the possibility of software reuse, to name just a few. In this paper, the idea of moving towards E-CASE tools, rather than traditional CASE tools, is advocated, since E-CASE tools have all the benefits and advantages of traditional CASE tools and add to them all the benefits of web technology. This is presented by focusing on the role of E-CASE tools in facilitating the trend of telecommuting and virtual workplaces among software engineering and information technology professionals. In addition, E-CASE tools integrate smoothly with the trend of E-learning in conducting software engineering courses. Finally, two surveys were conducted among a group of software engineering professionals and students of software engineering courses. The surveys show that E-CASE tools are of great value to both communities.

  9. TAUS: A File-Based Software Understanding Tool

    Institute of Scientific and Technical Information of China (English)

费翔林; 汪承藻; et al.

    1990-01-01

A program called TAUS, a Tool for Analyzing and Understanding Software, was developed. It is designed to help the programmer analyze and understand software interactively. Its aim is to reduce the dependence on human intelligence in software understanding and to improve the programmer's understanding productivity. The design and implementation of TAUS and its applications are described.

  10. Estimation of toxicity using a Java based software tool

    Science.gov (United States)

A software tool has been developed that will allow a user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be accessed using a web browser (or alternatively downloaded and run as a stand-alone applic...

  11. Caesy: A software tool for computer-aided engineering

    Science.gov (United States)

    Wette, Matt

    1993-01-01

    A new software tool, Caesy, is described. This tool provides a strongly typed programming environment for research in the development of algorithms and software for computer-aided control system design. A description of the user language and its implementation as they currently stand are presented along with a description of work in progress and areas of future work.

  12. EISA 432 Energy Audits Best Practices: Software Tools

    Energy Technology Data Exchange (ETDEWEB)

    Maryl Fisher

    2014-11-01

    Five whole building analysis software tools that can aid an energy manager with fulfilling energy audit and commissioning/retro-commissioning requirements were selected for review in this best practices study. A description of each software tool is provided as well as a discussion of the user interface and level of expertise required for each tool, a review of how to use the tool for analyzing energy conservation opportunities, the format and content of reports generated by the tool, and a discussion on the applicability of the tool for commissioning.

  13. Software Tools for Fault Management Technologies Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Fault Management (FM) is a key requirement for safety, efficient onboard and ground operations, maintenance, and repair. QSI's TEAMS Software suite is a leading...

  14. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Science.gov (United States)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.

  15. Technology Transfer Challenges for High-Assurance Software Engineering Tools

    Science.gov (United States)

    Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.

    2003-01-01

    In this paper, we describe our experience with the challenges that we are currently facing in our effort to develop advanced software verification and validation tools. We categorize these challenges into several areas: cost benefits modeling, tool usability, customer application domain, and organizational issues. We provide examples of challenges in each area and identify open research issues in areas which limit our ability to transfer high-assurance software engineering tools into practice.

  16. Innovative Software Tools Measure Behavioral Alertness

    Science.gov (United States)

    2014-01-01

    To monitor astronaut behavioral alertness in space, Johnson Space Center awarded Philadelphia-based Pulsar Informatics Inc. SBIR funding to develop software to be used onboard the International Space Station. Now used by the government and private companies, the technology has increased revenues for the firm by an average of 75 percent every year.

  17. Criteria and tools for scientific software quality measurements

    International Nuclear Information System (INIS)

    Not all software used in the nuclear industry needs the rigorous formal verification, reliability testing and quality assessment that are being applied to safety critical software. Recently, however, there is increasing recognition that systematic and objective quality assessment of the scientific software used in design and safety analyses of nuclear facilities is necessary to support safety and licensing decisions. Because of the complexity and large size of these programs and the resource constraints faced by the AECB reviewer, it is desirable that appropriate automated tools are used wherever practical. To objectively assess the quality of software, a set of attributes of a software product by which its quality is described and evaluated must be established. These attributes must be relevant to the application domain of software under evaluation. To effectively assess the quality of software, metrics defining quantitative scale and method appropriate to determine the value of attributes need to be applied. To cost-effectively perform the evaluation, use of suitable automated tools is desirable. In this project, criteria for evaluating the quality of scientific software are presented; metrics for which those criteria can be evaluated are identified; a survey of automated tools to measure those metrics was conducted and the most appropriate tool (QA Fortran) was acquired; and the tool usage was demonstrated on three sample programs. (author) 5 refs
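
As a tiny illustration of the kind of automatable quality metric the record above refers to, the sketch below estimates McCabe cyclomatic complexity by counting branch keywords in a source string; the keyword list and snippet are assumptions, and real tools (such as the QA Fortran package mentioned) use proper parsing rather than keyword counts:

```python
import re

def cyclomatic_estimate(source):
    """Crude illustration of an automatable quality metric: estimate
    McCabe cyclomatic complexity as 1 + the number of branch keywords
    in a (Python) source string. A keyword count is only a heuristic;
    real metric tools parse the code."""
    branch_keywords = r"\b(elif|if|for|while|and|or|except)\b"
    return 1 + len(re.findall(branch_keywords, source))

snippet = """
def classify(x):
    if x < 0 and x != -1:
        return "neg"
    elif x == 0:
        return "zero"
    return "pos"
"""
complexity = cyclomatic_estimate(snippet)
# complexity -> 4  (base 1, plus `if`, `and`, `elif`)
```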

  18. Computer- Aided Design in Power Engineering Application of Software Tools

    CERN Document Server

    Stojkovic, Zlatan

    2012-01-01

    This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation, guiding through the complex problems on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power systems equipment design. Several design example calculations are carried out using engineering tools such as MATLAB, EMTP/ATP, Excel & Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools in the project management in power systems ...

  19. ISWHM: Tools and Techniques for Software and System Health Management

    Science.gov (United States)

    Schumann, Johann; Mengshoel, Ole J.; Darwiche, Adnan

    2010-01-01

    This presentation presents status and results of research on Software Health Management done within the NRA "ISWHM: Tools and Techniques for Software and System Health Management." Topics include: Ingredients of a Guidance, Navigation, and Control System (GN and C); Selected GN and C Testbed example; Health Management of major ingredients; ISWHM testbed architecture; and Conclusions and next Steps.

  20. Developing a Decision Support System: The Software and Hardware Tools.

    Science.gov (United States)

    Clark, Phillip M.

    1989-01-01

    Describes some of the available software and hardware tools that can be used to develop a decision support system implemented on microcomputers. Activities that should be supported by software are discussed, including data entry, data coding, finding and combining data, and data compatibility. Hardware considerations include speed, storage…

  1. Software tool for xenon gamma-ray spectrometer control

    Science.gov (United States)

    Chernysheva, I. V.; Novikov, A. S.; Shustov, A. E.; Dmitrenko, V. V.; Pyae Nyein, Sone; Petrenko, D.; Ulin, S. E.; Uteshev, Z. M.; Vlasik, K. F.

    2016-02-01

    A software tool, "Acquisition and processing of gamma-ray spectra", was developed for the control of xenon gamma-ray spectrometers. It supports a multi-window interface and provides acquisition of gamma-ray spectra from a xenon gamma-ray detector via USB or RS-485 interfaces, directly or over the TCP/IP protocol, energy calibration of gamma-ray spectra, and saving of gamma-ray spectra to disk.

  2. Flow sheeting software as a tool when teaching Chemical Engineering

    OpenAIRE

    Abbas, Asad

    2011-01-01

    The aim of this thesis is to design different chemical processes using flow-sheeting software and to show the usefulness of flow-sheeting software as an educational tool. The industries studied are hydrogen, sulfur, nitric acid and ethylene glycol production, and a model of a drying technique is also included. First, there is an introduction to CHEMCAD as a tool for teaching chemical processes, followed by an explanation of each industry selected for design. Various production methods for each...

  3. iPhone examination with modern forensic software tools

    Science.gov (United States)

    Höne, Thomas; Kröger, Knut; Luttenberger, Silas; Creutzburg, Reiner

    2012-06-01

    The aim of the paper is to show the usefulness of modern forensic software tools for iPhone examination. In particular, we focus on the new version of Elcomsoft iOS Forensic Toolkit and compare it with Oxygen Forensics Suite 2012 regarding functionality, usability and capabilities. It is shown how these software tools work and how capable they are in examining non-jailbroken and jailbroken iPhones.

  4. Some Interactive Aspects of a Software Design Schema Acquisition Tool

    Science.gov (United States)

    Lee, Hing-Yan; Harandi, Mehdi T.

    1991-01-01

    This paper describes a design schema acquisition tool which forms an important component of a hybrid software design system for reuse. The hybrid system incorporates both schema-based approaches in supporting software design reuse activities and is realized by extensions to the IDeA system. The paper also examines some of the interactive aspects that the tool requires with the domain analyst to accomplish its acquisition task.

  5. Generating DEM from LIDAR data - comparison of available software tools

    Science.gov (United States)

    Korzeniowska, K.; Lacka, M.

    2011-12-01

    In recent years many software tools and applications have appeared that offer procedures, scripts and algorithms to process and visualize ALS data. This variety of software tools and of "point cloud" processing methods led to the aim of this study: to assess the algorithms available in various software tools used to classify LIDAR "point cloud" data, through a careful examination of Digital Elevation Models (DEMs) generated from LIDAR data on the basis of these algorithms. The work focused on the most important available software tools, both commercial and open source. Two sites in a mountain area were selected for the study; the area of each site is 0.645 sq km. DEMs generated with the analysed software tools were compared with a reference dataset, generated using manual methods to eliminate non-ground points. Surfaces were analysed using raster analysis. Minimum, maximum and mean differences between the reference DEM and the DEMs generated with the analysed software tools were calculated, together with the Root Mean Square Error. Differences between DEMs were also examined visually using transects along the grid axes in the test sites.
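
The raster comparison statistics described above (minimum, maximum and mean per-cell difference plus RMSE against a reference DEM) can be sketched as follows; the elevation grids are hypothetical:

```python
import math

def compare_dems(reference, generated):
    """Compare two DEMs given as equal-sized 2-D lists of elevations.

    Returns the minimum, maximum and mean per-cell difference plus the
    root mean square error, mirroring the statistics used in the study.
    """
    diffs = [g - r
             for ref_row, gen_row in zip(reference, generated)
             for r, g in zip(ref_row, gen_row)]
    n = len(diffs)
    mean = sum(diffs) / n
    rmse = math.sqrt(sum(d * d for d in diffs) / n)
    return min(diffs), max(diffs), mean, rmse

# Hypothetical 2x3 elevation grids (metres)
ref = [[100.0, 101.0, 102.0], [103.0, 104.0, 105.0]]
gen = [[100.2, 100.8, 102.1], [103.0, 104.5, 104.7]]
lo, hi, mean, rmse = compare_dems(ref, gen)
```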

  6. The evolution of CACSD tools-a software engineering perspective

    DEFF Research Database (Denmark)

    Ravn, Ole; Szymkat, Maciej

    1992-01-01

    The earlier evolution of computer-aided control system design (CACSD) tools is discussed from a software engineering perspective. A model of the design process is presented as the basis for principles and requirements of future CACSD tools. Combinability, interfacing in memory, and an open...

  7. Tool for Measuring Coupling in Object- Oriented Java Software

    Directory of Open Access Journals (Sweden)

    Mr. V. S. Bidve

    2016-04-01

    The importance of object-oriented software metrics is increasing day by day for evaluating and predicting the quality of software. Coupling is one such metric: the degree to which one program module depends on other modules. Coupling measures play a significant role in the quality of object-oriented software, from design up to maintenance. To correctly predict the quality factors of object-oriented software, coupling should be accurately measured. The related literature offers many techniques to measure coupling, but no author has explained the implementation of his technique(s) in detail or made a tool available showing how exactly coupling is measured. In this paper, we propose a tool for measuring coupling among classes of Java software. Java source code is taken as input for the tool; the code is parsed and tokens are extracted. These tokens, along with the code, are used to measure different types of coupling in Java software. Coupling of different sample Java programs is measured with the tool to observe the values of each coupling type.
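
As a rough illustration of token-based coupling measurement, the sketch below counts the distinct type names a Java class refers to; the regex heuristic and sample class are assumptions, and a real tool such as the one described would parse the source properly:

```python
import re

def efferent_coupling(java_source, own_class):
    """Crude token-based sketch of coupling measurement: collect the
    distinct capitalized type names a class refers to, excluding the
    class itself and a few java.lang built-ins. A real tool uses a
    full Java parser; this single regex pass is only an illustration."""
    builtins = {"String", "Object", "System", "Integer", "Math"}
    tokens = re.findall(r"\b[A-Z][A-Za-z0-9_]*\b", java_source)
    return {t for t in tokens if t != own_class and t not in builtins}

# Hypothetical Java class used as input
src = """
class OrderService {
    private PaymentGateway gateway;
    private AuditLog log;
    void pay(Order o) { gateway.charge(o); log.record(o); }
}
"""
coupled = efferent_coupling(src, "OrderService")
# coupled -> {'PaymentGateway', 'AuditLog', 'Order'}
```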

  8. Selecting Appropriate Requirements Management Tool for Developing Secure Enterprises Software

    Directory of Open Access Journals (Sweden)

    Daniyal M Alghazzawi

    2014-03-01

    This paper discusses the significance of selecting the right requirements management tools. It is no secret that poorly understood user requirements and uncontrolled scope creep lead to many software project failures. Many application development professionals buy the wrong tools for the wrong reasons. To avoid purchasing an overly complex and expensive tool, the organization needs to be realistic about the particular problem it wants to solve. Software development organizations are improving the methods they use to gather, analyze, trace, document, prioritize and manage their requirements. This paper considers four leading requirements management tools - Analyst Pro, CORE, Cradle and Caliber RM - with a focus on selecting the appropriate tool according to its capabilities and customers' needs.

  9. Software tool for data mining and its applications

    Science.gov (United States)

    Yang, Jie; Ye, Chenzhou; Chen, Nianyi

    2002-03-01

    A software tool for data mining is introduced which integrates pattern recognition (PCA, Fisher, clustering, hyperenvelop, regression), artificial intelligence (knowledge representation, decision trees), statistical learning (rough sets, support vector machines) and computational intelligence (neural networks, genetic algorithms, fuzzy systems). It consists of nine function models: pattern recognition, decision trees, association rules, fuzzy rules, neural networks, genetic algorithms, hyperenvelop, support vector machines and visualization. The principles and knowledge representation of some function models of data mining are described. The software tool was implemented in Visual C++ under Windows 2000. Nonmonotony in data mining is dealt with by concept hierarchy and layered mining. The tool has been satisfactorily applied in the prediction of regularities of the formation of ternary intermetallic compounds in alloy systems, and in the diagnosis of brain glioma.

  10. Software engineering and data management for automated payload experiment tool

    Science.gov (United States)

    Maddux, Gary A.; Provancha, Anna; Chattam, David

    1994-01-01

    The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by UAH and provide versions of the software in a Macintosh and Windows compatible format.

  11. Tools and Techniques for testing of Flight Critical Software

    Directory of Open Access Journals (Sweden)

    G. Venkat Reddy

    1999-10-01

    Flight control system software is a critical component of the digital flight control computer of a light combat aircraft. The problems associated with the testing of flight-critical software, and the test tools and techniques used to achieve maintainability and structural and functional coverage of test cases, are presented. Also enumerated is the experience gained throughout the testing cycle: design and implementation, reviews and revisions, test execution and software error detection, modification of test cases based on requirements and design changes, and regression testing. An object-oriented approach towards testing is presented to make testing less tedious, more creative, reviewable and easily maintainable.

  12. PAnalyzer: A software tool for protein inference in shotgun proteomics

    Directory of Open Access Journals (Sweden)

    Prieto Gorka

    2012-11-01

    Abstract Background Protein inference from peptide identifications in shotgun proteomics must deal with ambiguities that arise due to the presence of peptides shared between different proteins, which is common in higher eukaryotes. Recently, data-independent acquisition (DIA) approaches have emerged as an alternative to the traditional data-dependent acquisition (DDA) in shotgun proteomics experiments. MSE is the term used to name one of the DIA approaches used in QTOF instruments. MSE data require specialized software to process acquired spectra and to perform peptide and protein identifications. However, the software available at the moment does not group the identified proteins in a transparent way by taking into account peptide evidence categories. Furthermore, the inspection, comparison and reporting of the obtained results require tedious manual intervention. Here we report a software tool to address these limitations for MSE data. Results In this paper we present PAnalyzer, a software tool focused on the protein inference process of shotgun proteomics. Our approach considers all the identified proteins and groups them when necessary, indicating their confidence using different evidence categories. PAnalyzer can read protein identification files in the XML output format of the ProteinLynx Global Server (PLGS) software provided by Waters Corporation for their MSE data, and also in the mzIdentML format recently standardized by HUPO-PSI. Multiple files can also be read simultaneously and are considered as technical replicates. Results are saved to CSV, HTML and mzIdentML (in the case of a single mzIdentML input file) files. An MSE analysis of a real sample is presented to compare the results of PAnalyzer and ProteinLynx Global Server. Conclusions We present a software tool to deal with the ambiguities that arise in the protein inference process. Key contributions are support for MSE data analysis by ProteinLynx Global Server and technical replicates
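
The evidence-based grouping described in the record above can be sketched as follows; the category names and peptide data are illustrative, not PAnalyzer's actual output:

```python
def categorize_proteins(evidence):
    """Toy sketch of peptide-evidence grouping in protein inference.
    `evidence` maps protein -> set of identified peptides. Category
    names here are illustrative, not PAnalyzer's exact ones."""
    categories = {}
    for prot, peps in evidence.items():
        other_peps = set()
        for q, p in evidence.items():
            if q != prot:
                other_peps |= p
        has_twin = any(p == peps for q, p in evidence.items() if q != prot)
        if peps - other_peps:
            categories[prot] = "conclusive"        # carries a unique peptide
        elif has_twin:
            categories[prot] = "indistinguishable"  # identical peptide set
        else:
            categories[prot] = "ambiguous"          # only shared peptides
    return categories

# Hypothetical peptide evidence
ev = {
    "P1": {"pepA", "pepB"},  # pepA seen only in P1
    "P2": {"pepB", "pepC"},  # same evidence as P3
    "P3": {"pepB", "pepC"},
}
cats = categorize_proteins(ev)
```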

  13. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens

    2004-01-01

    This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended used as part of an industrial systems design cycle. Structural analysis is a graph-based technique where principal relations between variables express the system...

  14. Automotive Software Engineering. Fundamentals, processes, methods, tools; Automotive Software Engineering. Grundlagen, Prozesse, Methoden und Werkzeuge

    Energy Technology Data Exchange (ETDEWEB)

    Schaeuffele, J.; Zurawka, T.

    2003-07-01

    The book presents fundamentals and practical examples of processes, methods and tools to ensure safe operation of electronic systems and software in motor vehicles. The focus is on the electronic systems of the powertrain, suspension and car body. Contents: The overall system of car, driver and environment; Fundamentals; Processes for development of electronic systems and software; Methods and tools for the development, production and servicing of electronic systems. The book addresses staff members of motor car producers and suppliers of electronic systems and software, as well as students of computer science, electrical and mechanical engineering specializing in car engineering, control engineering, mechatronics and software engineering.

  15. Validation of a software dependability tool via fault injection experiments

    OpenAIRE

    Tagliaferri, Luca; Benso, Alfredo; Di Carlo, Stefano; Di Natale, Giorgio; Prinetto, Paolo Ernesto

    2001-01-01

    Presents the validation of the strategies employed in the RECCO tool to analyze C/C++ software; the RECCO compiler scans C/C++ source code to extract information about the significance of the variables that populate the program and about the code structure itself. Experimental results gathered on an open-source router are used to compare and correlate two sets of critical variables, one obtained by fault injection experiments and the other by applying the RECCO tool. Then the two sets...

  16. The Systems Biology Research Tool: evolvable open-source software

    Directory of Open Access Journals (Sweden)

    Wright Jeremiah

    2008-06-01

    Abstract Background Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system level. Results We introduce a free, easy-to-use, open-source, integrated software platform called the Systems Biology Research Tool (SBRT) to facilitate the computational aspects of systems biology. The SBRT currently performs 35 methods for analyzing stoichiometric networks and 16 methods from fields such as graph theory, geometry, algebra, and combinatorics. New computational techniques can be added to the SBRT via process plug-ins, providing a high degree of evolvability and a unifying framework for software development in systems biology. Conclusion The Systems Biology Research Tool represents a technological advance for systems biology. This software can be used to make sophisticated computational techniques accessible to everyone (including those with no programming ability), to facilitate cooperation among researchers, and to expedite progress in the field of systems biology.

  17. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify the relief breakpoints (knickpoints) along drainage profiles. The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. To assess its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA); this is an area of constant intraplate seismicity and non-orogenic active tectonics, and it exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of identifying deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area with DEM coverage.
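
The profile-based knickpoint idea can be sketched as a slope-break search along a distance/elevation stream profile; the threshold and data below are hypothetical, and the actual tool operates on a DEM drainage network within ArcGIS:

```python
def find_knickpoints(profile, threshold=0.05):
    """Toy sketch: flag profile points where the downstream slope
    changes abruptly along a (distance, elevation) stream profile.
    The threshold and the profile data are hypothetical; the real
    Knickpoint Finder works on a DEM drainage network."""
    # Slope of each segment between consecutive profile points
    slopes = [(z1 - z0) / (d1 - d0)
              for (d0, z0), (d1, z1) in zip(profile, profile[1:])]
    # A knickpoint sits where adjacent segment slopes differ sharply
    return [i + 1 for i, (s0, s1) in enumerate(zip(slopes, slopes[1:]))
            if abs(s1 - s0) > threshold]

# Hypothetical profile: (distance m, elevation m); slope break at 300 m
profile = [(0, 500), (100, 498), (200, 496), (300, 494),
           (400, 470), (500, 446)]
kps = find_knickpoints(profile)
# kps -> [3]  (the point at 300 m, 494 m)
```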

  18. A Brief Review of Software Tools for Pangenomics

    Institute of Scientific and Technical Information of China (English)

    Jingfa Xiao; Zhewen Zhang; Jiayan Wu; Jun Yu

    2015-01-01

    Since the proposal for pangenomic study, there have been a dozen software tools actively in use for pangenomic analysis. By the end of 2014, Panseq and the pan-genomes analysis pipeline (PGAP) ranked as the top two most popular packages according to cumulative citations of peer-reviewed scientific publications. The functions of the software packages and tools, albeit variable among them, include categorizing orthologous genes, calculating pangenomic profiles, integrating gene annotations, and constructing phylogenies. As epigenomic elements are being gradually revealed in prokaryotes, it is expected that pangenomic databases and toolkits have to be extended to handle information of detailed functional annotations for genes and non-protein-coding sequences including non-coding RNAs, insertion elements, and conserved structural elements. To develop better bioinformatic tools, user feedback and integration of novel features are both of essence.

  19. A software communication tool for the tele-ICU.

    Science.gov (United States)

    Pimintel, Denise M; Wei, Shang Heng; Odor, Alberto

    2013-01-01

    The Tele Intensive Care Unit (tele-ICU) supports a high-volume, high-acuity population of patients. There is a high volume of incoming and outgoing calls through the tele-ICU hubs, especially during the evening and night hours. The tele-ICU clinicians must be able to communicate effectively with team members in order to support the care of complex and critically ill patients while maintaining a standard to improve time to intervention. This study describes a software communication tool that will improve the time to intervention over the paper-driven communication format presently used in the tele-ICU. The software provides a multi-relational database of message instances to mine information for evaluation and quality improvement for all entities that touch the tele-ICU. The software design incorporates years of critical care and software design experience combined with new skills acquired in an applied health informatics program. This software tool functions in the tele-ICU environment and performs as a front-end application that gathers, routes, and displays internal communication messages for intervention by priority and provider. PMID:24551398
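
The gather/route/display-by-priority behaviour described above can be sketched with a priority queue; the message fields and priority scale are assumptions rather than the tool's actual schema:

```python
import heapq
import itertools

class MessageRouter:
    """Minimal sketch of priority-ordered message routing, loosely
    modeled on the tele-ICU tool described above. The (priority,
    provider, text) fields are assumptions, not the real schema."""

    def __init__(self):
        self._queue = []
        self._counter = itertools.count()  # FIFO tie-break per priority

    def submit(self, priority, provider, text):
        # Lower number = more urgent; counter keeps same-priority order
        heapq.heappush(self._queue, (priority, next(self._counter),
                                     provider, text))

    def next_message(self):
        priority, _, provider, text = heapq.heappop(self._queue)
        return provider, text

router = MessageRouter()
router.submit(2, "RN Smith", "Routine med reconciliation")
router.submit(1, "Dr. Jones", "Possible sepsis, review now")
first = router.next_message()
# first -> ("Dr. Jones", "Possible sepsis, review now")
```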

  20. Software tools at the Rome CMS/ECAL Regional Center

    CERN Document Server

    Organtini, G

    2001-01-01

    The construction of the CMS electromagnetic calorimeter is under way in Rome and at CERN. To this purpose, two Regional Centers were set up in both sites. In Rome, the project was entirely carried out using new software technologies such as object oriented programming, object databases, CORBA programming and Web tools. It can be regarded as a use case for the evaluation of the benefits of new software technologies in high energy physics. Our experience is positive and encouraging for the future. (10 refs).

  1. Development of the software generation method using model driven software engineering tool

    Energy Technology Data Exchange (ETDEWEB)

    Jang, H. S.; Jeong, J. C.; Kim, J. H.; Han, H. W.; Kim, D. Y.; Jang, Y. W. [KOPEC, Taejon (Korea, Republic of); Moon, W. S. [NEXTech Inc., Seoul (Korea, Republic of)

    2003-10-01

    Methodologies to generate automated software design specifications and source code for nuclear I and C systems software using a model-driven language are developed in this work. For qualitative analysis of the algorithm, the activity diagram is modeled and generated using the Unified Modeling Language (UML), and then the sequence diagram is designed for automated source code generation. For validation of the generated code, code audits and module tests are performed using a test and QA tool. The code coverage and complexity of the example code are examined in this stage. The low-pressure pressurizer reactor trip module of the Plant Protection System was programmed as the subject for this task. The results show that errors in the generated source code were easily detected using the test tool, and the accuracy of input/output processing by the execution modules was clearly identified.

  2. The RUMBA software: tools for neuroimaging data analysis.

    Science.gov (United States)

    Bly, Benjamin Martin; Rebbechi, Donovan; Hanson, Stephen Jose; Grasso, Giorgio

    2004-01-01

    The enormous scale and complexity of data sets in functional neuroimaging makes it crucial to have well-designed and flexible software for image processing, modeling, and statistical analysis. At present, researchers must choose between general purpose scientific computing environments (e.g., Splus and Matlab), and specialized human brain mapping packages that implement particular analysis strategies (e.g., AFNI, SPM, VoxBo, FSL or FIASCO). For the vast majority of users in Human Brain Mapping and Cognitive Neuroscience, general purpose computing environments provide an insufficient framework for a complex data-analysis regime. On the other hand, the operational particulars of more specialized neuroimaging analysis packages are difficult or impossible to modify and provide little transparency or flexibility to the user for approaches other than massively multiple comparisons based on inferential statistics derived from linear models. In order to address these problems, we have developed open-source software that allows a wide array of data analysis procedures. The RUMBA software includes programming tools that simplify the development of novel methods, and accommodates data in several standard image formats. A scripting interface, along with programming libraries, defines a number of useful analytic procedures, and provides an interface to data analysis procedures. The software also supports a graphical functional programming environment for implementing data analysis streams based on modular functional components. With these features, the RUMBA software provides researchers programmability, reusability, modular analysis tools, novel data analysis streams, and an analysis environment in which multiple approaches can be contrasted and compared. The RUMBA software retains the flexibility of general scientific computing environments while adding a framework in which both experts and novices can develop and adapt neuroimaging-specific analyses.

  3. The Open2-Innova8ion Tool - A Software Tool for Rating Organisational Innovation Performance

    OpenAIRE

    Caird, Sally; Hallett, Stephen; Potter, Stephen

    2013-01-01

    The Open2-Innova8ion Tool is an interactive, multi-media, web-based software tool for rating organisational innovation performance. This tool was designed for organisations to use as an adaptation of the European Commission’s work on developing empirical measures of national innovation performance with the Summary Innovation Index (SII). It is designed for users with experience of employment in an organisation, from senior managers to all types of employees, with an interest in rating the inn...

  4. Nucleonica: Web-based Software Tools for Simulations and Analysis

    OpenAIRE

    Magill, Joseph; DREHER Raymond; SOTI Zsolt; LASCHE George

    2012-01-01

    The authors present a description of a new web-based software portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data res...

  5. Software Information Base (SIB) and Its Integration with Data Flow Diagram (DFD) Tool

    Institute of Scientific and Technical Information of China (English)

    董士海

    1989-01-01

    A software information base is the main technique for the integration of a software engineering environment. A data flow diagram tool is an important software tool supporting the software requirement analysis phase. This article first introduces the functions and structures of a Software Information Base (SIB) and a Data Flow Diagram tool. The E-R data model of the SIB and its integration with the Data Flow Diagram tool are then described in detail.

  6. Building a High-Level Process Model for Soliciting Requirements on Software Tools to Support Software Development: Experience Report

    OpenAIRE

    Bider, Ilia; Karapantelakis, Athanasios; Khadka, Nirjal

    2013-01-01

    Use of software tools to support business processes is both a possibility and a necessity for large and small enterprises today. Given the variety of tools on the market, the question arises of how to choose the right tools for the process in question, or how to analyze the suitability of the tools already employed. The paper presents an experience report on using a high-level business process model for analyzing software tool suitability at a large ICT organization that recently transitioned ...

  7. Tool Support for Parametric Analysis of Large Software Simulation Systems

    Science.gov (United States)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
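The n-factor combinatorial variation mentioned above can be illustrated with the simplest case, n = 2 (pairwise coverage): instead of the full Cartesian product of parameter values, pick a subset of cases that still exercises every pair of values across parameters. The greedy construction below is a generic sketch of that technique, not the tool's actual algorithm.

```python
# Illustrative greedy construction of a 2-factor (pairwise) covering set,
# in the spirit of n-factor combinatorial parameter variation.
from itertools import combinations, product

def pairwise_cases(params):
    """params: list of per-parameter value lists; returns cases covering all value pairs."""
    n = len(params)
    uncovered = {((i, a), (j, b))
                 for i, j in combinations(range(n), 2)
                 for a in params[i] for b in params[j]}
    cases = []
    for case in product(*params):
        # Pairs of (parameter index, value) exercised by this candidate case.
        pairs = {((i, case[i]), (j, case[j])) for i, j in combinations(range(n), 2)}
        if pairs & uncovered:          # keep the case only if it covers something new
            cases.append(case)
            uncovered -= pairs
        if not uncovered:
            break
    return cases

full = 3 * 3 * 3
cases = pairwise_cases([[0, 1, 2], [0, 1, 2], [0, 1, 2]])
print(len(cases), "cases instead of", full)
```

Even for this tiny 3-parameter example the covering set is noticeably smaller than the full product, and the gap widens rapidly as the parameter space grows, which is what makes envelope-wide exploration tractable.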

  8. STARS software tool for analysis of reliability and safety

    International Nuclear Information System (INIS)

    This paper reports on the STARS (Software Tool for the Analysis of Reliability and Safety) project, which aims at developing an integrated set of computer-aided reliability analysis tools for the various tasks involved in systems safety and reliability analysis, including hazard identification, qualitative analysis, and logic model construction and evaluation. Expert system technology offers the most promising perspective for developing such a tool. Combined with graphics and analysis capabilities, it can provide a natural, engineering-oriented environment for computer-assisted reliability and safety modelling and analysis. For hazard identification and fault tree construction, a frame/rule-based expert system is used, in which the deductive (goal-driven) reasoning and the heuristics applied during manual fault tree construction are modelled. Expert systems can explain their reasoning, so that the analyst becomes aware of why and how results are being obtained. Hence, the learning aspect involved in manual reliability and safety analysis can be maintained and improved

  9. Classroom Live: a software-assisted gamification tool

    Science.gov (United States)

    de Freitas, Adrian A.; de Freitas, Michelle M.

    2013-06-01

    Teachers have come to rely on a variety of approaches to elicit and sustain student interest in the classroom. One particular approach, known as gamification, seeks to improve student engagement by transforming the traditional classroom experience into a competitive multiplayer game. Initial attempts at classroom gamification relied on the teacher manually tracking student progress. At the US Air Force Academy, we wanted to experiment with a software gamification tool. Our client/server suite, dubbed Classroom Live, streamlines the gamification process for the teacher by simplifying common tasks. Simultaneously, the tool provides students with an aesthetically pleasing user interface that offers in-game rewards in exchange for their participation. Classroom Live is still in development, but our initial experience using the tool has been extremely positive and confirms our belief that students respond positively to gamification, even at the undergraduate level.
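At its core, the bookkeeping such a tool automates is awarding points for participation and maintaining a ranking. The following is a deliberately minimal stand-alone sketch of that idea; the class and method names are invented for illustration and say nothing about Classroom Live's actual client/server design.

```python
# Minimal sketch of gamification bookkeeping: award points, rank students.
class Scoreboard:
    def __init__(self):
        self.points = {}

    def award(self, student, pts):
        """Add pts to a student's running total."""
        self.points[student] = self.points.get(student, 0) + pts

    def ranking(self):
        """Students ordered by total points, highest first."""
        return sorted(self.points, key=self.points.get, reverse=True)

board = Scoreboard()
board.award("alice", 10)
board.award("bob", 5)
board.award("bob", 8)
print(board.ranking())  # → ['bob', 'alice']
```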

  10. A NEO population generation and observation simulation software tool

    Science.gov (United States)

    Müller, Sven; Gelhaus, Johannes; Hahn, Gerhard; Franco, Raffaella

    One of the main targets of ESA's Space Situational Awareness (SSA) program is to build a wide knowledge base about objects that can potentially harm Earth (Near-Earth Objects, NEOs). An important part of this effort is to create the Small Bodies Data Centre (SBDC) which is going to aggregate measurement data from a fully-integrated NEO observation sensor network. Until this network is developed, artificial NEO measurement data is needed in order to validate SBDC algorithms. Moreover, to establish a functioning NEO observation sensor network, it has to be determined where to place sensors, what technical requirements have to be met in order to be able to detect NEOs and which observation strategies work the best. Because of this, a sensor simulation software was needed. This paper presents a software tool which allows users to create and analyse NEO populations and to simulate and analyse population observations. It is a console program written in Fortran and comes with a Graphical User Interface (GUI) written in Java and C. The tool comprises two components: ``Population Generator'' and ``Observation Simulator''. The Population Generator component is responsible for generating and analysing a NEO population. Users can choose between creating fictitious (random) and synthetic populations. The latter are based on one of two models describing the orbital and size distribution of observed NEOs: the existing so-called ``Bottke Model'' (Bottke et al. 2000, 2002) and the new ``Granvik Model'' (Granvik et al. 2014, in preparation) which has been developed in parallel to the tool. Generated populations can be analysed by defining 2D, 3D and scatter plots using various NEO attributes. As a result, the tool creates the appropriate files for the plotting tool ``gnuplot''. The tool's Observation Simulator component yields the Observation Simulation and Observation Analysis functions. 
Users can define sensor systems using ground- or space-based locations as well as

  11. Tools and Behavior Abstraction: A Future for Software Engineering

    Directory of Open Access Journals (Sweden)

    Wilson Solís

    2012-06-01

    Full Text Available Software engineers rely on tools to automatically analyse code and design specifications in detail. Although many such tools are still used mainly to find new defects in old code, they are expected to play a larger role in software engineering in the future and to be available to developers while they edit their products. If tools could be built quickly enough and made easy to use, software engineers would apply them to improve design and product development. To solve any problem, traditional engineering uses programming languages; however, the level of abstraction of the most popular ones is not much greater than that of C programs of several decades ago. Moreover, this level is the same throughout the code and leaves no room for behaviour abstraction, in which the design is divided into phases that gradually introduce more detail. This article presents a study of the need for a larger set of analysis tools for creating languages and development environments that provide good support for this abstraction.

  12. Software tool for horizontal-axis wind turbine simulation

    Energy Technology Data Exchange (ETDEWEB)

    Vitale, A.J. [Instituto Argentino de Oceanografia, Camino La Carrindanga Km. 7, 5 CC 804, B8000FWB Bahia Blanca (Argentina); Universidad Tecnologica Nacional Facultad Regional Bahia Blanca, GESE, 11 de Abril 461, B8000LMI Bahia Blanca (Argentina); Dpto. de Ing. Electrica y de Computadoras, Universidad Nacional del Sur, Av. Alem 1253, 8000 Bahia Blanca (Argentina); Rossi, A.P. [Universidad Tecnologica Nacional Facultad Regional Bahia Blanca, GESE, 11 de Abril 461, B8000LMI Bahia Blanca (Argentina); Dpto. de Ing. Electrica y de Computadoras, Universidad Nacional del Sur, Av. Alem 1253, 8000 Bahia Blanca (Argentina)

    2008-07-15

    The main problem of a wind turbine generator design project is the design of the right blades capable of satisfying the specific energy requirement of an electric system with optimum performance. Once the blade has been designed for optimum operation at a particular rotor angular speed, it is necessary to determine the overall performance of the rotor under the range of wind speed that it will encounter. A software tool that simulates low-power, horizontal-axis wind turbines was developed for this purpose. With this program, the user can calculate the rotor power output for any combination of wind and rotor speeds, with definite blade shape and airfoil characteristics. The software also provides information about distribution of forces along the blade span, for different operational conditions. (author)
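The rotor power calculation described above is commonly built on the relation P = ½ρA·Cp·v³, where the power coefficient Cp depends on the tip-speed ratio λ = ωR/v determined by the blade design. The sketch below illustrates that textbook relation only; the parabolic Cp curve is a made-up placeholder, not the paper's blade model.

```python
# Back-of-envelope rotor power model: P = 0.5 * rho * A * Cp * v^3,
# with Cp depending on tip-speed ratio lambda = omega * R / v.
import math

def rotor_power(v, omega, radius, rho=1.225):
    """Power output (W) for wind speed v (m/s) and rotor speed omega (rad/s)."""
    tsr = omega * radius / v                       # tip-speed ratio
    cp = max(0.0, 0.4 - 0.02 * (tsr - 7.0) ** 2)   # placeholder Cp curve
    area = math.pi * radius ** 2                   # swept rotor area
    return 0.5 * rho * area * cp * v ** 3

# Sweep wind speed at fixed rotor speed, as the tool does for a designed blade.
for v in (4.0, 6.0, 8.0):
    print(f"v={v:.0f} m/s  P={rotor_power(v, omega=15.0, radius=1.5):.0f} W")
```

Sweeping (v, ω) pairs through such a function is exactly the "any combination of wind and rotor speeds" evaluation the abstract describes, just with the blade-specific aerodynamics reduced to a single assumed Cp curve.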

  13. Northwestern University Schizophrenia Data and Software Tool (NUSDAST).

    Science.gov (United States)

    Wang, Lei; Kogan, Alex; Cobia, Derin; Alpert, Kathryn; Kolasny, Anthony; Miller, Michael I; Marcus, Daniel

    2013-01-01

    The schizophrenia research community has invested substantial resources in collecting, managing and sharing large neuroimaging datasets. As part of this effort, our group has collected high resolution magnetic resonance (MR) datasets from individuals with schizophrenia, their non-psychotic siblings, healthy controls and their siblings. This effort has resulted in a growing resource, the Northwestern University Schizophrenia Data and Software Tool (NUSDAST), an NIH-funded data sharing project to stimulate new research. This resource resides on XNAT Central, and it contains neuroimaging (MR scans, landmarks and surface maps for deep subcortical structures, and FreeSurfer cortical parcellation and measurement data), cognitive (cognitive domain scores for crystallized intelligence, working memory, episodic memory, and executive function), clinical (demographic, sibling relationship, SAPS and SANS psychopathology), and genetic (20 polymorphisms) data, collected from more than 450 subjects, most with 2-year longitudinal follow-up. A neuroimaging mapping, analysis and visualization software tool, CAWorks, is also part of this resource. Moreover, in making our existing neuroimaging data along with the associated meta-data and computational tools publicly accessible, we have established a web-based information retrieval portal that allows the user to efficiently search the collection. This research-ready dataset meaningfully combines neuroimaging data with other relevant information, and it can be used to help facilitate advancing neuroimaging research. It is our hope that this effort will help to overcome some of the commonly recognized technical barriers in advancing neuroimaging research such as lack of local organization and standard descriptions. PMID:24223551

  14. Northwestern University Schizophrenia Data and Software Tool (NUSDAST

    Directory of Open Access Journals (Sweden)

    Lei eWang

    2013-11-01

    Full Text Available The schizophrenia research community has invested substantial resources in collecting, managing and sharing large neuroimaging datasets. As part of this effort, our group has collected high resolution magnetic resonance (MR) datasets from individuals with schizophrenia, their non-psychotic siblings, healthy controls and their siblings. This effort has resulted in a growing resource, the Northwestern University Schizophrenia Data and Software Tool (NUSDAST), an NIH-funded data sharing project to stimulate new research. This resource resides on XNAT Central, and it contains neuroimaging (MR scans, landmarks and surface maps for deep subcortical structures, and FreeSurfer cortical parcellation and measurement data), cognitive (cognitive domain scores for crystallized intelligence, working memory, episodic memory, and executive function), clinical (demographic, sibling relationship, SAPS and SANS psychopathology), and genetic (20 polymorphisms) data, collected from more than 450 subjects, most with 2-year longitudinal follow-up. A neuroimaging mapping, analysis and visualization software tool, CAWorks, is also part of this resource. Moreover, in making our existing neuroimaging data along with the associated meta-data and computational tools publicly accessible, we have established a web-based information retrieval portal that allows the user to efficiently search the collection. This research-ready dataset meaningfully combines neuroimaging data with other relevant information, and it can be used to help facilitate advancing neuroimaging research. It is our hope that this effort will help to overcome some of the commonly recognized technical barriers in advancing neuroimaging research such as lack of local organization and standard descriptions.

  15. Advanced software tools for digital loose part monitoring systems

    International Nuclear Information System (INIS)

    The paper describes two software modules as analysis tools for digital loose part monitoring systems. The first, the acoustic module, utilizes the multi-media features of modern personal computers to replay the digitally stored short-time bursts with sufficient length and in good quality. This is possible due to the so-called puzzle technique developed at ISTec. The second, the classification module, calculates advanced burst parameters and classifies the acoustic events into pre-defined classes with the help of an artificial multi-layer perceptron neural network trained with the backpropagation algorithm. (author). 7 refs, 7 figs
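The classification step above amounts to feeding burst parameters through a small multi-layer perceptron. The sketch below shows only that forward pass; the feature names and weights are arbitrary stand-ins (a real system learns the weights via backpropagation, which is not shown).

```python
# Schematic MLP forward pass: burst parameters (e.g. rise time, peak
# amplitude) -> hidden layer -> one "impact vs. noise" score in (0, 1).
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def mlp_classify(features, w_hidden, w_out):
    """One hidden layer with sigmoid activations, single sigmoid output."""
    hidden = [sigmoid(sum(w * f for w, f in zip(row, features))) for row in w_hidden]
    return sigmoid(sum(w * h for w, h in zip(w_out, hidden)))

# Two burst features -> two hidden units -> one class score (weights invented).
w_hidden = [[2.0, -1.0], [-1.5, 2.5]]
w_out = [1.8, -2.2]
score = mlp_classify([0.9, 0.2], w_hidden, w_out)
print("impact likelihood:", round(score, 3))
```

Thresholding such a score against class boundaries is how the module would sort events into its pre-defined classes.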

  16. Object-Oriented Software Tools for the Construction of Preconditioners

    Directory of Open Access Journals (Sweden)

    Eva Mossberg

    1997-01-01

    Full Text Available In recent years, there has been considerable progress concerning preconditioned iterative methods for large and sparse systems of equations arising from the discretization of differential equations. Such methods are particularly attractive in the context of high-performance (parallel) computers. However, the implementation of a preconditioner is a nontrivial task. The focus of the present contribution is on a set of object-oriented software tools that support the construction of a family of preconditioners based on fast transforms. By combining objects of different classes, it is possible to conveniently construct any preconditioner within this family.
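The object-oriented idea, combining objects of different classes to build members of a preconditioner family, can be sketched with a common interface plus a composite class. The numerics below are deliberately trivialized (diagonal Jacobi pieces rather than fast transforms), and all class names are illustrative, not the paper's library.

```python
# Sketch of composable preconditioner objects sharing one interface.
class Preconditioner:
    def apply(self, r):
        """Return an approximation to A^{-1} r."""
        raise NotImplementedError

class Jacobi(Preconditioner):
    """Diagonal scaling: the simplest preconditioner building block."""
    def __init__(self, diagonal):
        self.d = diagonal

    def apply(self, r):
        return [ri / di for ri, di in zip(r, self.d)]

class Composite(Preconditioner):
    """Apply two preconditioners in sequence, M = M2 * M1."""
    def __init__(self, first, second):
        self.first, self.second = first, second

    def apply(self, r):
        return self.second.apply(self.first.apply(r))

m = Composite(Jacobi([2.0, 4.0]), Jacobi([1.0, 2.0]))
print(m.apply([4.0, 16.0]))  # → [2.0, 2.0]
```

In a real library the building blocks would be fast-transform solves instead of diagonal scalings, but the composition mechanism, any `Preconditioner` slots anywhere one is expected, is the same.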

  17. QUALITY SERVICES EVALUATION MODEL BASED ON DEDICATED SOFTWARE TOOL

    Directory of Open Access Journals (Sweden)

    ANDREEA CRISTINA IONICĂ

    2012-10-01

    Full Text Available In this paper we introduce a new model, called Service Quality (SQ), which combines the QFD and SERVQUAL methods. This model takes from the SERVQUAL method the five dimensions of requirements and three of characteristics, and from the QFD method the application methodology. The originality of the SQ model consists in computing a global index that reflects the level to which the quality characteristics fulfil the customers’ requirements. In order to prove the viability of the SQ model, a software tool was developed and applied to the evaluation of a health care services provider.
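A global index of this kind is typically a weighted aggregation of per-requirement scores. The sketch below assumes a plain weighted mean for illustration; the paper's exact formula, dimensions, and weights may differ.

```python
# Minimal sketch of a global quality index: requirement importance weights
# against accomplishment scores in [0, 1]. Weighted-mean form is assumed.
def global_index(weights, scores):
    """weights, scores: dicts keyed by requirement; returns a value in [0, 1]."""
    total = sum(weights.values())
    return sum(weights[k] * scores[k] for k in weights) / total

# Invented example using SERVQUAL-style dimension names.
weights = {"tangibles": 2, "reliability": 5, "responsiveness": 3}
scores = {"tangibles": 0.8, "reliability": 0.6, "responsiveness": 0.9}
print(round(global_index(weights, scores), 3))  # → 0.73
```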

  18. Software Tools for In-Situ Documentation of Built Heritage

    Science.gov (United States)

    Smars, P.

    2013-07-01

    The paper presents open-source software tools developed by the author to facilitate in-situ documentation of architectural and archæological heritage. The design choices are exposed and related to a general issue in conservation and documentation: taking decisions about a valuable object under threat. The question of the level of objectivity is central to the three steps of this process. It is our belief that in-situ documentation is to be favoured in this demanding context, full of potential discoveries. The very powerful surveying techniques in rapid development nowadays enhance our vision but often tend to bring a critical part of the documentation process back to the office. The software presented facilitates direct treatment of the data on site. Emphasis is given to flexibility, interoperability and simplicity. Key features of the software are listed and illustrated with examples (3D model of Gothic vaults, analysis of the shape of a column, deformation of a wall, direct interaction with AutoCAD).

  19. A software tool for graphically assembling damage identification algorithms

    Science.gov (United States)

    Allen, David W.; Clough, Joshua A.; Sohn, Hoon; Farrar, Charles R.

    2003-08-01

    At Los Alamos National Laboratory (LANL), various algorithms for structural health monitoring problems have been explored in the last 5 to 6 years. The original DIAMOND (Damage Identification And MOdal aNalysis of Data) software was developed as a package of modal analysis tools with some frequency domain damage identification algorithms included. Since the conception of DIAMOND, the Structural Health Monitoring (SHM) paradigm at LANL has been cast in the framework of statistical pattern recognition, promoting data driven damage detection approaches. To reflect this shift and to allow user-friendly analyses of data, a new piece of software, DIAMOND II is under development. The Graphical User Interface (GUI) of the DIAMOND II software is based on the idea of GLASS (Graphical Linking and Assembly of Syntax Structure) technology, which is currently being implemented at LANL. GLASS is a Java based GUI that allows drag and drop construction of algorithms from various categories of existing functions. In the platform of the underlying GLASS technology, DIAMOND II is simply a module specifically targeting damage identification applications. Users can assemble various routines, building their own algorithms or benchmark testing different damage identification approaches without writing a single line of code.

  20. The Software Improvement Process - Tools And Rules To Encourage Quality

    CERN Document Server

    Sigerud, K

    2011-01-01

    The Applications section of the CERN accelerator Controls group has decided to apply a systematic approach to quality assurance (QA), the “Software Improvement Process”, SIP. This process focuses on three areas: the development process itself, suitable QA tools, and how to practically encourage developers to do QA. For each stage of the development process we have agreed on the recommended activities and deliverables, and identified tools to automate and support the task. For example we do more code reviews. As peer reviews are resource-intensive, we only do them for complex parts of a product. As a complement, we are using static code checking tools, like FindBugs and Checkstyle. We also encourage unit testing and have agreed on a minimum level of test coverage recommended for all products, measured using Clover. Each of these tools is well integrated with our IDE (Eclipse) and give instant feedback to the developer about the quality of their code. The major challenges of SIP have been to 1) agree on com...

  1. Software tools to aid Pascal and Ada program design

    Energy Technology Data Exchange (ETDEWEB)

    Jankowitz, H.T.

    1987-01-01

    This thesis describes a software tool which analyses the style and structure of Pascal and Ada programs by ensuring that some minimum design requirements are fulfilled. The tool is used in much the same way as a compiler is used to teach students the syntax of a language, only in this case issues related to the design and structure of the program are of paramount importance. The tool operates by analyzing the design and structure of a syntactically correct program, automatically generating a report detailing changes that need to be made in order to ensure that the program is structurally sound. The author discusses how the model gradually evolved from a plagiarism detection system which extracted several measurable characteristics in a program to a model that analyzed the style of Pascal programs. In order to incorporate more sophisticated concepts such as data abstraction, information hiding and data protection, this model was then extended to analyze the composition of Ada programs. The Ada model takes full advantage of facilities offered in the language, and by using this tool the standard and quality of written programs is raised whilst the fundamental principles of program design are grasped through a process of self-tuition.

  2. Evaluating and selecting relevant software tools in technology monitoring

    Directory of Open Access Journals (Sweden)

    Óscar Fernando Castellanos Domínguez

    2010-07-01

    Full Text Available The current setting for industrial and entrepreneurial development has posed the need for incorporating differentiating elements into the production apparatus, leading to anticipating technological change. Technology monitoring (TM) emerges as a methodology focused on analysing these changes to identify challenges and opportunities, being mainly supported by information technology (IT) through the search for, capture and analysis of data and information. This article proposes criteria for choosing and efficiently using software tools having different characteristics, requirements, capacity and cost which could be used in monitoring. An approach is made to different TM models, emphasising the identification and analysis of different information sources for covering and supporting information access and monitoring. Some evaluation, selection and analysis criteria are given for using these types of tools according to each production system’s individual profile and needs. Some of the existing software packages available on the market for carrying out monitoring projects are described, relating them to their complexity, process characteristics and cost.

  3. Learning Photogrammetry with Interactive Software Tool PhoX

    Directory of Open Access Journals (Sweden)

    T. Luhmann

    2016-06-01

    Full Text Available Photogrammetry is a complex topic in high-level university teaching, especially in the fields of geodesy, geoinformatics and metrology where high quality results are demanded. In addition, more and more black-box solutions for 3D image processing and point cloud generation are available that generate nice results easily, e.g. by structure-from-motion approaches. Within this context, the classical approach of teaching photogrammetry (e.g. focusing on aerial stereophotogrammetry) has to be reformed in order to educate students and professionals with new topics and provide them with more information behind the scene. For around 20 years, photogrammetry courses at the Jade University of Applied Sciences in Oldenburg, Germany, have included the use of digital photogrammetry software that provides individual exercises, deep analysis of calculation results and a wide range of visualization tools for almost all standard tasks in photogrammetry. In recent years the software package PhoX has been developed as part of a new didactic concept in photogrammetry and related subjects. It also serves as an analysis tool in recent research projects. PhoX consists of a project-oriented data structure for images, image data, measured points and features and 3D objects. It allows for almost all basic photogrammetric measurement tools, image processing, calculation methods, graphical analysis functions, simulations and much more. Students use the program in order to conduct predefined exercises where they have the opportunity to analyse results at a high level of detail. This includes the analysis of statistical quality parameters but also the meaning of transformation parameters, rotation matrices, calibration and orientation data. As one specific advantage, PhoX allows for the interactive modification of single parameters and the direct view of the resulting effect in image or object space.

  4. Learning Photogrammetry with Interactive Software Tool PhoX

    Science.gov (United States)

    Luhmann, T.

    2016-06-01

    Photogrammetry is a complex topic in high-level university teaching, especially in the fields of geodesy, geoinformatics and metrology where high quality results are demanded. In addition, more and more black-box solutions for 3D image processing and point cloud generation are available that generate nice results easily, e.g. by structure-from-motion approaches. Within this context, the classical approach of teaching photogrammetry (e.g. focusing on aerial stereophotogrammetry) has to be reformed in order to educate students and professionals with new topics and provide them with more information behind the scene. For around 20 years, photogrammetry courses at the Jade University of Applied Sciences in Oldenburg, Germany, have included the use of digital photogrammetry software that provides individual exercises, deep analysis of calculation results and a wide range of visualization tools for almost all standard tasks in photogrammetry. In recent years the software package PhoX has been developed as part of a new didactic concept in photogrammetry and related subjects. It also serves as an analysis tool in recent research projects. PhoX consists of a project-oriented data structure for images, image data, measured points and features and 3D objects. It allows for almost all basic photogrammetric measurement tools, image processing, calculation methods, graphical analysis functions, simulations and much more. Students use the program in order to conduct predefined exercises where they have the opportunity to analyse results at a high level of detail. This includes the analysis of statistical quality parameters but also the meaning of transformation parameters, rotation matrices, calibration and orientation data. As one specific advantage, PhoX allows for the interactive modification of single parameters and the direct view of the resulting effect in image or object space.

  5. PTaaS: Platform for Providing Software Developing Applications and Tools as a Service

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali

    2014-01-01

    technological support for it that is not limited to one specific tool and a particular phase of the software development life cycle. In this thesis, we have explored the possibility of offering software development applications and tools as services that can be acquired on demand according to the software...... with process. Information gained from the review of literature on GSD tools and processes is used to extract functional requirements for the middleware platform for provisioning of software development applications and tools as services. Findings from the review of literature on architecture solutions for cloud...... client application that acts as a bridge between software development tools and middleware platform.

  6. Desired characteristics of a generic 'no frills' software engineering tools package

    Energy Technology Data Exchange (ETDEWEB)

    Rhodes, J.J.

    1986-07-29

    Increasing numbers of vendors are developing software engineering tools to meet the demands of increasingly complex software systems, higher reliability goals for software products, higher programming labor costs, and management's desire to more closely associate software lifecycle costs with the estimated development schedule. Some vendors have chosen a dedicated workstation approach to achieve high user interactivity through windowing and mousing. Other vendors are using multi-user mainframes with low-cost terminals to economize on the costs of the hardware and the tools software. For all potential customers of software tools, the question remains: what are the minimum functional requirements that a software engineering tools package must have in order to be considered useful throughout the entire software lifecycle? This paper describes the desired characteristics of a non-existent but realistic 'no frills' software engineering tools package. 3 refs., 5 figs.

  7. The ultimate CASE (Computer-Aided Software Engineering) tool

    Energy Technology Data Exchange (ETDEWEB)

    Sharp, J.K.

    1990-01-01

    The theory and practice of information engineering is being actively developed at Sandia National Laboratories. The main output of Sandia is information. Information is created, analyzed and distributed. It is the lifeblood of our design laboratory. The proper management of information will have a large, positive impact on staff productivity. In order to achieve the potential benefits of shared information, a commonly understood approach is needed, and the approach must be implemented in a CASE (Computer-Aided Software Engineering) tool that spans the entire life cycle of information. The commonly understood approach used at Sandia is natural language; more specifically, a structured subset of English. Users and system developers communicate requirements and commitments that they both understand. The approach is based upon NIAM (Nijssen's Information Analysis Methodology). In the last three years four NIAM training classes have been given at Sandia. The classes were all at the introductory level, with the latest class last October including an additional seminar highlighting successful projects. The continued growth in applications using NIAM requires an advanced class. That class will develop an information model for the ``Ultimate CASE Tool.'' This paper presents the requirements that have been established for the ``Ultimate CASE Tool'' and presents initial models. 4 refs., 1 tab.

  8. Hardware replacements and software tools for digital control computers

    International Nuclear Information System (INIS)

    computers which use 'Varian' technology. A new software program, Desk Top Tools, permits the designer greater flexibility in digital control computer software design and testing. This software development allows the user to emulate control of the CANDU reactor system by system. All discussions will highlight the ability of the replacements and the new developments to enhance the operation of the existing and 'repeat' plant digital control computers and will explore future applications of these developments. Examples of current use of all replacement components and software are provided. (author)

  9. User Guide for the STAYSL PNNL Suite of Software Tools

    Energy Technology Data Exchange (ETDEWEB)

    Greenwood, Lawrence R.; Johnson, Christian D.

    2013-02-27

    The STAYSL PNNL software suite provides a set of tools for working with neutron activation rates measured in a nuclear fission reactor, an accelerator-based neutron source, or any neutron field to determine the neutron flux spectrum through a generalized least-squares approach. This process is referred to as neutron spectral adjustment since the preferred approach is to use measured data to adjust neutron spectra provided by neutron physics calculations. The input data consist of the reaction rates based on measured activities, an initial estimate of the neutron flux spectrum, neutron activation cross sections and their associated uncertainties (covariances), and relevant correction factors. The output consists of the adjusted neutron flux spectrum and associated covariance matrix, which is useful for neutron dosimetry and radiation damage calculations.
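
    The generalized least-squares adjustment described above can be sketched in a few lines. The following is an illustrative toy problem with made-up numbers, not STAYSL PNNL's actual data, group structure, or code:

```python
import numpy as np

# Hypothetical 3-group flux and 2 dosimetry reactions (illustrative numbers only).
phi0 = np.array([1.0, 2.0, 0.5])        # prior flux spectrum estimate
V_phi = np.diag([0.04, 0.16, 0.01])     # prior flux covariance (uncorrelated here)
A = np.array([[0.9, 0.1, 0.0],          # group-wise cross sections per reaction
              [0.0, 0.3, 0.7]])
r_meas = np.array([1.15, 0.95])         # measured reaction rates
V_r = np.diag([0.01, 0.01])             # measurement covariance

# Generalized least-squares update: phi = phi0 + K (r_meas - A phi0)
S = A @ V_phi @ A.T + V_r               # innovation covariance
K = V_phi @ A.T @ np.linalg.inv(S)      # gain matrix
phi_adj = phi0 + K @ (r_meas - A @ phi0)
V_adj = V_phi - K @ A @ V_phi           # adjusted (reduced) covariance

print(phi_adj)
```

    The adjusted spectrum moves toward consistency with the measured rates, and the posterior covariance is never larger than the prior, which is the qualitative behavior the suite relies on for dosimetry and damage calculations.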

  10. Evaluation of The Virtual Cells Software: a Teaching Tool

    Directory of Open Access Journals (Sweden)

    C.C.P. da Silva

    2005-07-01

    handling, having an accessible language, supporting the software as an educational tool capable of facilitating the learning of fundamental concepts about the theme. Other workshops are planned with participants from different educational institutions in the city of Sao Carlos, with the goal of broadening our sample.

  11. ELER software - a new tool for urban earthquake loss assessment

    Science.gov (United States)

    Hancilar, U.; Tuzun, C.; Yenidogan, C.; Erdik, M.

    2010-12-01

    ATC-55 (Yang, 2005). An urban loss assessment exercise for a scenario earthquake for the city of Istanbul is conducted, and physical and social losses are presented. Damage to the urban environment is compared to the results obtained from similar software, i.e. KOERILoss (KOERI, 2002) and DBELA (Crowley et al., 2004). The European rapid loss estimation tool is expected to help enable effective emergency response at both the local and global levels, as well as public information.

  12. A software tool for soil clean-up technology selection

    International Nuclear Information System (INIS)

    Soil remediation is a difficult, time-consuming and expensive operation. A variety of mature and emerging soil remediation technologies is available, and future trends in remediation will include continued competition among environmental service companies and technology developers, which will result in a further increase in clean-up options. Consequently, demand has grown for decision support tools that can help decision makers select the most appropriate technology for a specific contaminated site before costly remedial actions are taken. Therefore, a software tool for soil clean-up technology selection is currently being developed with the aim of working closely with human decision makers (site owners, local community representatives, environmentalists, regulators, etc.) to assess the available technologies and preliminarily select the preferred remedial options. The analysis for the identification of the best remedial options is based on technical, financial, environmental, and social criteria. These criteria are ranked by all involved parties to determine their relative importance for a particular project. (author)
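
    The ranked-criteria selection scheme described in this abstract amounts to a weighted-sum multi-criteria evaluation. A minimal sketch, with hypothetical technologies, scores and weights (none of them taken from the actual tool):

```python
# Stakeholder-agreed criterion weights (illustrative; must sum to 1).
weights = {"technical": 0.35, "financial": 0.25, "environmental": 0.25, "social": 0.15}

# Candidate technologies scored 0-10 on each criterion (hypothetical values).
technologies = {
    "soil washing":       {"technical": 7, "financial": 5, "environmental": 6, "social": 6},
    "bioremediation":     {"technical": 6, "financial": 8, "environmental": 9, "social": 8},
    "thermal desorption": {"technical": 9, "financial": 3, "environmental": 4, "social": 5},
}

def weighted_score(scores, weights):
    """Weighted-sum aggregation of criterion scores."""
    return sum(weights[c] * s for c, s in scores.items())

# Rank remedial options from best to worst aggregate score.
ranking = sorted(technologies,
                 key=lambda t: weighted_score(technologies[t], weights),
                 reverse=True)
print(ranking)
```

    Changing the weights re-ranks the options, which is exactly how the different stakeholder groups express the relative importance of each criterion.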

  13. Effective Implementation of Agile Practices - Object Oriented Metrics Tool to Improve Software Quality

    Directory of Open Access Journals (Sweden)

    K. Nageswara Rao

    2012-08-01

    Full Text Available Maintaining the quality of the software is the major challenge in the process of software development. Software inspections, which use methods like structured walkthroughs and formal code reviews, involve careful examination of each and every aspect/stage of software development. In Agile software development, refactoring helps to improve software quality. Refactoring is a technique to improve software internal structure without changing its behaviour. After much study regarding the ways to improve software quality, our research proposes an object oriented software metric tool called "MetricAnalyzer". This tool is tested on different codebases and is shown to be very useful.
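
    The abstract does not say which metrics MetricAnalyzer computes, but a typical object-oriented metric is the number of methods per class (a crude stand-in for Weighted Methods per Class). A minimal sketch using Python's ast module on a toy code fragment:

```python
import ast

# Toy source fragment to analyze (hypothetical classes).
source = """
class Order:
    def add_item(self, item): ...
    def total(self): ...
    def checkout(self): ...

class Invoice:
    def render(self): ...
"""

def methods_per_class(code):
    """Count methods per class -- a crude proxy for the WMC metric."""
    tree = ast.parse(code)
    return {
        node.name: sum(isinstance(n, ast.FunctionDef) for n in node.body)
        for node in ast.walk(tree)
        if isinstance(node, ast.ClassDef)
    }

print(methods_per_class(source))  # → {'Order': 3, 'Invoice': 1}
```

    A real metric tool would add coupling, cohesion and depth-of-inheritance measures on top of the same AST traversal.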

  14. Technical Data Exchange Software Tools Adapted to Distributed Microsatellite Design

    Science.gov (United States)

    Pache, Charly

    2002-01-01

    One critical issue concerning distributed design of satellites is the collaborative work it requires. In particular, the exchange of data between the groups responsible for each subsystem can be complex and very time-consuming. The goal of this paper is to present a collaborative design tool, the SSETI Design Model (SDM), specifically developed for enabling distributed satellite design. SDM is actually used in the ongoing Student Space Exploration & Technology Initiative (SSETI) (www.sseti.net). SSETI is led by the European Space Agency (ESA) outreach office (http://www.estec.esa.nl/outreach), involving student groups from all over Europe for the design, construction and launch of a microsatellite. The first part of this paper presents the current version of the SDM tool, a collection of linked Microsoft Excel worksheets, one for each subsystem. An overview of the project framework/structure is given, explaining the different actors, the flows between them, as well as the different types of data and the links - formulas - between data sets. Unified Modeling Language (UML) diagrams give an overview of the different parts. Then the SDM's functionalities, developed in VBA (Visual Basic for Applications) scripts, are introduced, as well as the interactive features, user interfaces and administration tools. The second part discusses the capabilities and limitations of SDM's current version. Taking into account these capabilities and limitations, the third part outlines the next version of SDM, a web-oriented, database-driven evolution of the current version. This new approach will enable real-time data exchange and processing between the different actors of the mission. Comprehensive UML diagrams will guide the audience through the entire modeling process of such a system. Tradeoff simulation capabilities, security, reliability, hardware and software issues will also be thoroughly discussed.

  15. A software tool for rapid flood inundation mapping

    Science.gov (United States)

    Verdin, James; Verdin, Kristine; Mathis, Melissa; Magadzire, Tamuka; Kabuchanga, Eric; Woodbury, Mark; Gadain, Hussein

    2016-01-01

    The GIS Flood Tool (GFT) was developed by the U.S. Geological Survey with support from the U.S. Agency for International Development’s Office of U.S. Foreign Disaster Assistance to provide a means for production of reconnaissance-level flood inundation mapping for data-sparse and resource-limited areas of the world. The GFT has also attracted interest as a tool for rapid assessment flood inundation mapping for the Flood Inundation Mapping Program of the U.S. Geological Survey. The GFT can fill an important gap for communities that lack flood inundation mapping by providing a first-estimate of inundation zones, pending availability of resources to complete an engineering study. The tool can also help identify priority areas for application of scarce flood inundation mapping resources. The technical basis of the GFT is an application of the Manning equation for steady flow in an open channel, operating on specially processed digital elevation data. The GFT is implemented as a software extension in ArcGIS. Output maps from the GFT were validated at 11 sites with inundation maps produced previously by the Flood Inundation Mapping Program using standard one-dimensional hydraulic modeling techniques. In 80 percent of the cases, the GFT inundation patterns matched 75 percent or more of the one-dimensional hydraulic model inundation patterns. Lower rates of pattern agreement were seen at sites with low relief and subtle surface water divides. Although the GFT is simple to use, it should be applied with the oversight or review of a qualified hydraulic engineer who understands the simplifying assumptions of the approach.
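
    The tool's technical basis, the Manning equation for steady flow in an open channel, is easy to sketch. The following toy example (a rectangular channel with assumed width, roughness and slope, not the GFT's actual terrain processing) computes discharge and inverts the equation for depth by bisection:

```python
# Manning's equation for a rectangular channel (SI units):
#   Q = (1/n) * A * R**(2/3) * sqrt(S)
# where A = b*y is the flow area and R = A / (b + 2*y) the hydraulic radius.

def manning_discharge(y, b=10.0, n=0.035, S=0.001):
    """Discharge (m^3/s) for flow depth y in a rectangular channel."""
    A = b * y
    R = A / (b + 2 * y)
    return (1.0 / n) * A * R ** (2.0 / 3.0) * S ** 0.5

def depth_for_discharge(Q, lo=1e-6, hi=50.0, tol=1e-8):
    """Invert Manning's equation for depth by bisection (Q is monotone in y)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if manning_discharge(mid) < Q:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

    Mapping a flood for a given flow then reduces to solving for depth at each channel cross section and intersecting that water surface with the digital elevation data.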

  17. Can agile software tools bring the benefits of a task board to globally distributed teams?

    NARCIS (Netherlands)

    Katsma, Christiaan; Amrit, Chintan; Hillegersberg, van Jos; Sikkel, Klaas; Oshri, Ilan; Kotlarsky, Julia; Willcocks, Leslie P.

    2013-01-01

    Software-based tooling has become an essential part of globally distributed software development. In this study we focus on the usage of such tools, and task boards in particular. We investigate the deployment of these tools through field research in 4 different companies that feature agile and glo

  18. The Systems Biology Research Tool: evolvable open-source software

    OpenAIRE

    Wright Jeremiah; Wagner Andreas

    2008-01-01

    Abstract Background Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system-level. Results We introduce a free, easy-to-use, open-source, integrated software platform calle...

  19. Software Reuse in Agile Development Organizations - A Conceptual Management Tool

    NARCIS (Netherlands)

    Spoelstra, Wouter; Iacob, Maria; Sinderen, van Marten

    2011-01-01

    The reuse of knowledge is considered a major factor for increasing productivity and quality. In the software industry knowledge is embodied in software assets such as code components, functional designs and test cases. This kind of knowledge reuse is also referred to as software reuse. Although the

  20. A Decision Support Tool for Assessing the Maturity of Software Product Line Process

    OpenAIRE

    Ahmed, Faheem; Capretz, Luiz Fernando

    2015-01-01

    The software product line aims at the effective utilization of software assets, reducing the time required to deliver a product, improving the quality, and decreasing the cost of software products. Organizations trying to incorporate this concept require an approach to assess the current maturity level of the software product line process in order to make management decisions. A decision support tool for assessing the maturity of the software product line process is developed to implement the...

  1. Work Breakdown Structure: A Tool for Software Project Scope Verification

    OpenAIRE

    Robert T. Hans

    2013-01-01

    Software project scope verification is a very important process in project scope management and it needs to be performed properly and thoroughly so as to avoid project rework and scope creep. Moreover, software scope verification is crucial in the process of delivering exactly what the customer requested and minimizing project scope changes. Well defined software scope eases the process of scope verification and contributes to project success. Furthermore, a deliverable-oriented WBS provides ...

  2. Systems and software variability management concepts, tools and experiences

    CERN Document Server

    Capilla, Rafael; Kang, Kyo-Chul

    2013-01-01

    The success of product line engineering techniques in the last 15 years has popularized the use of software variability as a key modeling approach for describing the commonality and variability of systems at all stages of the software lifecycle. Software product lines enable a family of products to share a common core platform, while allowing product-specific functionality to be built on top of the platform. Many companies have exploited the concept of software product lines to increase the resources that focus on highly differentiating functionality and thus improve their competitiveness.

  3. Measuring the development process: A tool for software design evaluation

    Science.gov (United States)

    Moy, S. S.

    1980-01-01

    The design metrics evaluator (DME), a component of an automated software design analysis system, is described. The DME quantitatively evaluates software design attributes. Its use directs attention to areas of a procedure, module, or complete program having a high potential for error.

  4. Work Breakdown Structure: A Tool for Software Project Scope Verification

    Directory of Open Access Journals (Sweden)

    Robert T. Hans

    2013-07-01

    Full Text Available Software project scope verification is a very important process in project scope management and it needs to be performed properly and thoroughly so as to avoid project rework and scope creep. Moreover, software scope verification is crucial in the process of delivering exactly what the customer requested and minimizing project scope changes. Well defined software scope eases the process of scope verification and contributes to project success. Furthermore, a deliverable-oriented WBS provides a road map to a well defined software scope of work. It is on the basis of this that this paper extends the use of deliverable-oriented WBS to that of the scope verification process. This paper argues that a deliverable-oriented WBS is a tool for software scope verification.

  5. Nik Software Captured The Complete Guide to Using Nik Software's Photographic Tools

    CERN Document Server

    Corbell, Tony L

    2011-01-01

    Learn all the features and functionality of the complete Nik family of products. Styled in such a way as to resemble the way photographers think, Nik Software Captured aims to help you learn to apply all the features and functionality of the Nik software products. With Nik Software Captured, authors and Nik Software, Inc. insiders Tony Corbell and Josh Haftel help you use after-capture software products more easily and more creatively. Their sole aim is to ensure that you can apply the techniques discussed in the book while gaining a thorough understanding of the capabilities of programs such as Dfi

  6. Development of Software for Analyzing Breakage Cutting Tools Based on Image Processing

    Institute of Scientific and Technical Information of China (English)

    赵彦玲; 刘献礼; 王鹏; 王波; 王红运

    2004-01-01

    As present-day digital microsystems do not provide specialized microscopes that can detect cutting-tool breakage, analysis software has been developed using VC++. A module for edge detection and image segmentation is designed specifically for cutting tools. Known calibration relations and given postulates are used in scale measurements. Practical operations show that the software can perform accurate detection.

  7. A Software Tool for Removing Patient Identifying Information from Clinical Documents

    OpenAIRE

    Friedlin, F. Jeff; McDonald, Clement J.

    2008-01-01

    We created a software tool that accurately removes all patient identifying information from various kinds of clinical data documents, including laboratory and narrative reports. We created the Medical De-identification System (MeDS), a software tool that de-identifies clinical documents, and performed 2 evaluations. Our first evaluation used 2,400 Health Level Seven (HL7) messages from 10 different HL7 message producers. After modifying the software based on the results of this first evaluati...

  8. Possibilities for using software tools in the process of security design

    Directory of Open Access Journals (Sweden)

    Ladislav Mariš

    2013-07-01

    Full Text Available The authors deal with the use of software to support the process of security design. The article proposes a theoretical basis for the implementation of software tools in design activities. It builds on selected design standards for electrical safety systems to apply design solutions, especially in drawing documentation. The article should serve the needs of project team members in using selected software tools and subsequently increasing the degree of automation of design activities.

  9. Software Engineering Practices and Tool Support: an exploratory study in New Zealand

    Directory of Open Access Journals (Sweden)

    Chris Phillips

    2003-11-01

    Full Text Available This study was designed as a preliminary investigation of the practices of software engineers within New Zealand, including their use of development tools. The project involved a review of relevant literature on software engineering and CASE tools, the development and testing of an interview protocol, and structured interviews with five software engineers. This paper describes the project, presents the findings, examines the results in the context of the literature and outlines on-going funded work involving a larger survey.

  10. Possibilities for using software tools in the process of security design

    OpenAIRE

    Ladislav Mariš; Andrej Veľas

    2013-01-01

    The authors deal with the use of software to support the process of security design. The article proposes a theoretical basis for the implementation of software tools in design activities. It builds on selected design standards for electrical safety systems to apply design solutions, especially in drawing documentation. The article should serve the needs of project team members in using selected software tools and subsequently increasing the degree of automation of design activities.

  11. An Approach to Building a Traceability Tool for Software Development

    Science.gov (United States)

    Delgado, Nelly; Watson, Tom

    1997-01-01

    It is difficult in a large, complex computer program to ensure that it meets the specified requirements. As the program evolves over time, all program constraints originally elicited during the requirements phase must be maintained. In addition, during the life cycle of the program, requirements typically change and the program must consistently reflect those changes. Imagine the following scenario. Company X wants to develop a system to automate its assembly line. With such a large system, there are many different stakeholders, e.g., managers, experts such as industrial and mechanical engineers, and end-users. Requirements would be elicited from all of the stakeholders involved in the system, with each stakeholder contributing their point of view to the requirements. For example, some of the requirements provided by an industrial engineer may concern the movement of parts through the assembly line. A point of view provided by the electrical engineer may be reflected in constraints concerning maximum power usage. End-users may be concerned with comfort and safety issues, whereas managers are concerned with the efficiency of the operation. With so many points of view affecting the requirements, it is difficult to manage them and communicate information to relevant stakeholders, and it is likely that conflicts in the requirements will arise. In the coding process, the implementors will make additional assumptions and interpretations on the design and the requirements of the system. During any stage of development, stakeholders may request that a requirement be added or changed. In such a dynamic environment, it is difficult to guarantee that the system will preserve the current set of requirements. Tracing, the mapping between objects in the artifacts of the system being developed, addresses this issue. Artifacts encompass documents such as the system definition, interview transcripts, memoranda, the software requirements specification, user's manuals, the functional

  12. Concurrent Software Testing : A Systematic Review and an Evaluation of Static Analysis Tools

    OpenAIRE

    Mamun, Md. Abdullah al; Khanam, Aklima

    2009-01-01

    Verification and validation is one of the most important concerns in the area of software engineering towards more reliable software development. Hence it is important to overcome the challenges of testing concurrent programs. The extensive use of concurrent systems warrants more attention to the concurrent software testing. For testing concurrent software, automatic tools development is getting increased focus. The first part of this study presents a systematic review that aims to explore th...

  13. Module Testing Techniques for Nuclear Safety Critical Software Using LDRA Testing Tool

    International Nuclear Information System (INIS)

    The safety critical software in the I and C systems of nuclear power plants requires high functional integrity and reliability. To achieve those requirement goals, the safety critical software should be verified and tested according to related codes and standards through verification and validation (V and V) activities. The safety critical software testing is performed at various stages during the development of the software, and is generally classified as three major activities: module testing, system integration testing, and system validation testing. Module testing involves the evaluation of module level functions of hardware and software. System integration testing investigates the characteristics of a collection of modules and aims at establishing their correct interactions. System validation testing demonstrates that the complete system satisfies its functional requirements. In order to generate reliable software and reduce high maintenance cost, it is important that software testing is carried out at module level. Module testing for the nuclear safety critical software has rarely been performed by formal and proven testing tools because of its various constraints. LDRA testing tool is a widely used and proven tool set that provides powerful source code testing and analysis facilities for the V and V of general purpose software and safety critical software. Use of the tool set is indispensable where software is required to be reliable and as error-free as possible, and its use brings in substantial time and cost savings, and efficiency

  14. Tuning COCOMO-II for Software Process Improvement: A Tool Based Approach

    Directory of Open Access Journals (Sweden)

    SYEDA UMEMA HANI

    2016-10-01

    Full Text Available In order to compete in the international software development market, software organizations have to adopt internationally accepted software practices, i.e. standards like ISO (International Organization for Standardization) or CMMI (Capability Maturity Model Integration), in spite of having scarce resources and tools. The aim of this study is to develop a tool which can be used to present an actual picture of the benefits of Software Process Improvement to software development companies. Although a few tools are available to assist in making predictions, they are too expensive and do not cover datasets that reflect the cultural behavior of organizations for software development in developing countries. In extension to our previously reported research on Pakistani software development organizations, which quantified the benefits of SDPI (Software Development Process Improvement), this research used sixty-two datasets from three different software development organizations against the set of metrics used in COCOMO-II (Constructive Cost Model 2000). It derived a verifiable equation for calculating ISF (Ideal Scale Factor) and tuned the COCOMO-II model to provide prediction capability for SDPI benefit measurement classes such as ESCP (Effort, Schedule, Cost, and Productivity). This research contributes to the software industry by giving a reliable and low-cost mechanism for generating prediction models with high prediction accuracy. Hopefully, this study will help software organizations use this tool not only to predict ESCP but also to predict the exact impact of SDPI.
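
    For reference, the post-architecture COCOMO-II effort equation that such tuning targets has the form PM = A * Size^E * prod(EM), with E = B + 0.01 * sum(SF). A minimal sketch using the published COCOMO II.2000 constants A = 2.94 and B = 0.91; the scale-factor and effort-multiplier values below are illustrative, not the calibrated values from this study:

```python
from math import prod

def cocomo2_effort(ksloc, scale_factors, effort_multipliers, A=2.94, B=0.91):
    """Post-architecture COCOMO II effort estimate in person-months.

    E = B + 0.01 * sum(scale_factors); PM = A * ksloc**E * prod(effort_multipliers).
    """
    E = B + 0.01 * sum(scale_factors)
    return A * ksloc ** E * prod(effort_multipliers)

# Illustrative inputs: five scale-factor ratings and two cost-driver multipliers.
sf = [3.72, 3.04, 4.24, 3.29, 4.68]
em = [1.10, 0.95]
print(round(cocomo2_effort(10.0, sf, em), 1))
```

    Tuning for process improvement then amounts to re-estimating the scale factors (and hence the exponent E) from an organization's own project data.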

  15. A Web-based Tool for Automatizing the Software Process Improvement Initiatives in Small Software Enterprises

    NARCIS (Netherlands)

    Garcia, I.; Pacheco, C.

    2010-01-01

    Top-down process improvement approaches provide a high-level model of what the process of a software development organization should be. Such models are based on the consensus of a designated working group on how software should be developed or maintained. They are very useful in that they provide g

  16. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    Directory of Open Access Journals (Sweden)

    Marilyn Wilhelmina Leonora Monster

    2015-12-01

    Full Text Available The multispecimen protocol (MSP) is a method to estimate the Earth's magnetic field's past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.

  17. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    Science.gov (United States)

    Monster, Marilyn; de Groot, Lennart; Dekkers, Mark

    2015-12-01

    The multispecimen protocol (MSP) is a method to estimate the Earth's magnetic field's past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.
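
    The bootstrap analysis mentioned above can be sketched generically. This is a plain percentile bootstrap on hypothetical per-specimen paleointensity estimates, not MSP-Tool's actual VBA implementation:

```python
import random

def bootstrap_ci(data, stat=lambda xs: sum(xs) / len(xs),
                 n_boot=2000, alpha=0.05, seed=42):
    """Percentile-bootstrap confidence interval for a statistic (default: mean)."""
    rng = random.Random(seed)
    reps = sorted(
        stat([rng.choice(data) for _ in data]) for _ in range(n_boot)
    )
    lo = reps[int(alpha / 2 * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Hypothetical per-specimen intensity estimates (µT).
ratios = [41.2, 39.8, 43.1, 40.5, 42.0, 38.9, 41.7]
lo, hi = bootstrap_ci(ratios)
print(lo, hi)
```

    Resampling the specimens with replacement propagates specimen-to-specimen scatter into an interval on the paleointensity estimate without assuming a particular error distribution.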

  18. A Case Study of Black-Box Testing for Embedded Software using Test Automation Tool

    Directory of Open Access Journals (Sweden)

    Changhyun Baek

    2007-01-01

    Full Text Available This research presents a case study of black-box testing for a Temperature Controller (TC), which is a typical embedded system. A test automation tool, TEST, was developed and several kinds of TCs were tested using the tool. We present a statistical analysis of the results of the automated testing and define the properties of software bugs in the embedded system. The main results of the study were the following: (a) a test case prioritization technique is needed because the review phase of the test process takes a long time; (b) there are three types of software defects in the embedded software; (c) the complexity of the system configuration has an effect on the software defects; (d) the software defects were distributed at vulnerable points of the software; and (e) testing activities reduce the number of software defects. The results can be useful in the asymptotic study of test case prioritization.
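
    Finding (a) motivates test case prioritization. One common greedy heuristic, shown here purely as an illustration and not as the scheme used in this study, orders test cases by historical failures detected per second of execution time:

```python
# Each test case: (name, historical failures, total runs, execution time in s).
# All values are hypothetical.
history = [
    ("test_boot_sequence", 4, 50, 12.0),
    ("test_temp_setpoint", 9, 50, 3.5),
    ("test_display_menu",  1, 50, 1.0),
    ("test_sensor_fault",  6, 50, 8.0),
]

def priority(tc):
    """Failure rate per second of execution -- a greedy cost/benefit score."""
    name, fails, runs, secs = tc
    return (fails / runs) / secs

# Run the most defect-dense, cheapest tests first.
ordered = sorted(history, key=priority, reverse=True)
print([t[0] for t in ordered])
```

    Running the highest-scoring tests first tends to surface defects earlier, shortening the long review phase the study identifies.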

  19. C++ Software Quality in the ATLAS Experiment: Tools and Experience

    CERN Document Server

    Kluth, Stefan; The ATLAS collaboration; Obreshkov, Emil; Roe, Shaun; Seuster, Rolf; Snyder, Scott; Stewart, Graeme

    2016-01-01

    The ATLAS experiment at CERN uses about six million lines of code and currently has about 420 developers whose background is largely from physics. In this paper we explain how the C++ code quality is managed using a range of tools from compile-time through to run time testing and reflect on the great progress made in the last year largely through the use of static analysis tools such as Coverity®, an industry-standard tool which enables quality comparison with general open source C++ code. Other tools including cppcheck, Include-What-You-Use and run-time 'sanitizers' are also discussed.

  20. Software engineering capability for Ada (GRASP/Ada Tool)

    Science.gov (United States)

    Cross, James H., II

    1995-01-01

The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) has successfully created and prototyped a new algorithmic-level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis has been on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional pretty-printed Ada source code. A new Motif-compliant graphical user interface has been developed for the GRASP/Ada prototype.

  1. Vulnerability management tools for COTS software - A comparison

    NARCIS (Netherlands)

    Welberg, S.M.

    2008-01-01

In this paper, we compare vulnerability management tools in two stages. In the first stage, we perform a global comparison involving thirty tools available on the market. A framework composed of several criteria based on scope and analysis is used for this comparison. From this global view of the tools…

  2. A SOFTWARE TOOL FOR EXPERIMENTAL STUDY LEAP MOTION

    Directory of Open Access Journals (Sweden)

    Georgi Krastev

    2015-12-01

The paper presents a computer application that illustrates the Leap Motion controller's capabilities. The Leap Motion is a peripheral device and accompanying software for the PC that enables control through a natural user interface based on gestures. The paper also describes how the controller works and its main advantages and disadvantages. Some applications using the Leap Motion controller are discussed.

  3. Calico: An Early-Phase Software Design Tool

    Science.gov (United States)

    Mangano, Nicolas Francisco

    2013-01-01

    When developers are faced with a design challenge, they often turn to the whiteboard. This is typical during the conceptual stages of software design, when no code is in existence yet. It may also happen when a significant code base has already been developed, for instance, to plan new functionality or discuss optimizing a key component. While…

  4. Use of software tools for calculating flow accelerated corrosion of nuclear power plant equipment and pipelines

    Science.gov (United States)

    Naftal', M. M.; Baranenko, V. I.; Gulina, O. M.

    2014-06-01

The results obtained from calculations of flow-accelerated corrosion of equipment and pipelines operating at nuclear power plants constructed on the basis of PWR, VVER, and RBMK reactors, carried out using the EKI-02 and EKI-03 software tools, are presented. It is shown that the calculation error does not exceed the value indicated in the qualification certificates for these software tools. It is pointed out that calculations aimed at predicting the service life of pipelines and efficient surveillance of flow-accelerated corrosion wear are hardly possible without using the above-mentioned software tools.
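The service-life prediction mentioned above rests on simple remaining-life arithmetic once a wear rate is known. The sketch below shows that arithmetic only; the names and values are illustrative assumptions, not the EKI-02/EKI-03 models, which additionally account for water chemistry, geometry and operating conditions.

```python
# Illustrative remaining-life estimate for a pipe wall thinning under
# flow-accelerated corrosion (FAC). All numbers are made up for the example.

def remaining_life_years(current_thickness_mm, min_allowed_mm, wear_rate_mm_per_year):
    """Years until the wall thins to the minimum allowable thickness."""
    if wear_rate_mm_per_year <= 0:
        raise ValueError("wear rate must be positive")
    return (current_thickness_mm - min_allowed_mm) / wear_rate_mm_per_year

life = remaining_life_years(current_thickness_mm=8.0,
                            min_allowed_mm=5.6,
                            wear_rate_mm_per_year=0.12)  # -> 20.0 years
```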

  5. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    CERN Document Server

    Habib, Salman; LeCompte, Tom; Marshall, Zach; Borgland, Anders; Viren, Brett; Nugent, Peter; Asai, Makoto; Bauerdick, Lothar; Finkel, Hal; Gottlieb, Steve; Hoeche, Stefan; Sheldon, Paul; Vay, Jean-Luc; Elmer, Peter; Kirby, Michael; Patton, Simon; Potekhin, Maxim; Yanny, Brian; Calafiura, Paolo; Dart, Eli; Gutsche, Oliver; Izubuchi, Taku; Lyon, Adam; Petravick, Don

    2015-01-01

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  6. Training, Quality Assurance Factors, and Tools Investigation: a Work Report and Suggestions on Software Quality Assurance

    Science.gov (United States)

    Lee, Pen-Nan

    1991-01-01

Previously, several research tasks were conducted, some observations obtained, and several suggestions contemplated involving software quality assurance engineering at NASA Johnson. These research tasks are briefly described. A brief discussion is also given of the role of software quality assurance in software engineering, along with observations and suggestions. A training program for software quality assurance engineers is outlined. Lists of assurance factors and quality factors are also included. Finally, a process model that can be used for searching and collecting software quality assurance tools is presented.

  7. Effectiveness of AutoCAD 3D Software as a Learning Support Tool

    Directory of Open Access Journals (Sweden)

    Fatariah Zakaria

    2012-06-01

The aim of this study is to test the effectiveness of AutoCAD 3D software in the learning of Engineering Drawing, with the goal of enhancing students' understanding. Data were collected from a sample of students from a secondary school in Sungai Petani, Kedah. A quasi-experimental design was used to assess the effectiveness of the software in improving students' achievement. The results show a substantial increase in student achievement after using the software, indicating that it can strengthen students' visualization capability. This study suggests that teachers, school administrators and the government consider this software as a learning tool in Malaysian schools.

  8. SIMPLE: a prototype software fault-injection tool

    OpenAIRE

    Acantilado, Christopher P.; Acantilado, Neil John P.

    2002-01-01

    Approved for public release; distribution is unlimited. Fault-injection techniques can be used to methodically assess the degree of fault tolerance afforded by a system. In this thesis, we introduce a Java-based, semi-automatic fault-injection test harness, called Software Fault Injection Mechanized Prototype Lightweight Engine (SIMPLE). SIMPLE employs a state-based fault injection approach designed to validate test suites. It also can assist developers to assess the properti...

  9. Contingency Contractor Optimization Phase 3 Sustainment Third-Party Software List - Contingency Contractor Optimization Tool - Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa

    2016-05-01

    The Contingency Contractor Optimization Tool - Prototype (CCOT-P) requires several third-party software packages. These are documented below for each of the CCOT-P elements: client, web server, database server, solver, web application and polling application.

  10. Toolkit of Available EPA Green Infrastructure Modeling Software: Watershed Management Optimization Support Tool (WMOST)

    Science.gov (United States)

Watershed Management Optimization Support Tool (WMOST) is a software application designed to facilitate integrated water resources management across wet and dry climate regions. It allows water resources managers and planners to screen a wide range of practices across their watersh...

  11. Accuracy Test of Software Architecture Compliance Checking Tools – Test Instruction

    NARCIS (Netherlands)

    Pruijt, Leo; van der Werf, J.M.E.M.; Brinkkemper., Sjaak

    2015-01-01

Software Architecture Compliance Checking (SACC) is an approach to verify conformance of implemented program code to high-level models of architectural design. Static SACC focuses on the modular software architecture and on the existence of rule-violating dependencies between modules. Accurate tool…

  12. Designing and Using Software Tools for Educational Purposes: FLAT, a Case Study

    Science.gov (United States)

    Castro-Schez, J. J.; del Castillo, E.; Hortolano, J.; Rodriguez, A.

    2009-01-01

    Educational software tools are considered to enrich teaching strategies, providing a more compelling means of exploration and feedback than traditional blackboard methods. Moreover, software simulators provide a more motivating link between theory and practice than pencil-paper methods, encouraging active and discovery learning in the students.…

  13. 76 FR 5832 - International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools QA...

    Science.gov (United States)

    2011-02-02

    ... Employment and Training Administration International Business Machines (IBM), Software Group Business Unit... November 17, 2010 (75 FR 70296). The negative determination of the TAA petition filed on behalf of workers at International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools...

  14. Benchmarking of Optimization Modules for Two Wind Farm Design Software Tools

    OpenAIRE

    Yilmaz, Eftun

    2012-01-01

Optimization of a wind farm layout is an expensive and complex task involving several engineering challenges. The layout of any wind farm directly impacts its profitability and return on investment. Several software optimization modules within industry wind farm design tools currently attempt to place the turbines in locations with good wind resources while adhering to the constraints of a defined objective function. Assessment of these software tools needs to be performed clearly fo...

  15. Possibilities of Simulation of Fluid Flows Using the Modern CFD Software Tools

    CERN Document Server

    Kochevsky, A N

    2004-01-01

    The article reviews fluid flow models implemented in the leading CFD software tools and designed for simulation of multi-component and multi-phase flows, compressible flows, flows with heat transfer, cavitation and other phenomena. The article shows that these software tools (CFX, Fluent, STAR-CD, etc.) allow for adequate simulation of complex physical effects of different nature, even for problems where performing of physical experiment is extremely difficult.

  16. Software tools for the analysis of video meteors emission spectra

    Science.gov (United States)

    Madiedo, J. M.; Toscano, F. M.; Trigo-Rodriguez, J. M.

    2011-10-01

One of the goals of the SPanish Meteor Network (SPMN) is the study of the chemical composition of meteoroids by analyzing the emission spectra produced by the ablation of these particles of interplanetary matter in the atmosphere. With this aim, some of the CCD video devices we employ to observe the night sky are equipped with holographic diffraction gratings, and continuous monitoring of meteor activity is performed. We have recently developed new software to analyze these spectra. A description of this computer program is given, and some of the results obtained so far are presented here.

  17. Nucleonica. Web-based software tools for simulation and analysis

    International Nuclear Information System (INIS)

    The authors present a description of the Nucleonica web-based portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data resources, and some applications is given.

  18. Software tool for the prosthetic foot modeling and stiffness optimization.

    Science.gov (United States)

    Strbac, Matija; Popović, Dejan B

    2012-01-01

We present the procedure for the optimization of the stiffness of the prosthetic foot. The procedure allows the selection of the elements of the foot and the materials used for the design. The procedure is based on the optimization where the cost function is the minimization of the difference between the knee joint torques of healthy walking and the walking with the transfemoral prosthesis. We present a simulation environment that allows the user to interactively vary the foot geometry and track the changes in the knee torque that arise from these adjustments. The software allows the estimation of the optimal prosthetic foot elasticity and geometry. We show that altering model attributes such as the length of the elastic foot segment or its elasticity leads to significant changes in the estimated knee torque required for a given trajectory.
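The cost function described above, the squared difference between healthy knee torques and those produced with the prosthesis, can be sketched with a brute-force stiffness sweep. The torque model below is a made-up linear stand-in, not the authors' biomechanical model.

```python
# Minimal sketch: pick the stiffness minimizing the squared torque error
# against a healthy-gait reference. All names and numbers are illustrative.

def cost(stiffness, healthy_torques, torque_model):
    """Sum of squared differences over the gait samples."""
    return sum((torque_model(stiffness, t) - m) ** 2
               for t, m in enumerate(healthy_torques))

def toy_model(stiffness, t):
    # hypothetical: modeled knee torque scales with stiffness at sample t
    return stiffness * (t + 1)

healthy = [2.0, 4.0, 6.0]  # healthy-gait torque samples (arbitrary units)

# brute-force sweep over candidate stiffness values 0.1 .. 4.0
candidates = [k / 10 for k in range(1, 41)]
best = min(candidates, key=lambda k: cost(k, healthy, toy_model))
```

A real implementation would use a gradient-based or derivative-free optimizer over the full biomechanical simulation, but the structure of the search is the same.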

  19. Improving Fund Risk Management by Using New Software Tools Technology

    Directory of Open Access Journals (Sweden)

    Stephanos Papadamou

    2004-01-01

This paper introduces a new MATLAB-based toolbox for computer-aided mutual fund risk evaluation. In the age of computerized trading, financial services companies and independent investors must be able to quickly investigate fund investment style and market risk. The Fund Risk toolbox is financial software that includes a set of functions based on value at risk (VaR) and expected tail loss (ETL) methodology for graphical presentation of risk forecasts, evaluation of different risk models and identification of fund investment style. The sample of historical data can be divided into an estimation rolling window and a back-testing period. MATLAB's vast built-in mathematical and financial functionality, along with the fact that it is both an interpreted and compiled programming language, makes this toolbox easily extendable: new, more complicated risk models can be added with minimum programming effort.
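The historical-simulation flavor of the VaR and ETL methodology the toolbox builds on can be sketched in a few lines. The example below is written in Python purely for illustration (the toolbox itself is MATLAB-based), and the return series is invented.

```python
# Historical-simulation sketch: VaR is the empirical loss quantile,
# ETL is the average of losses at or beyond that threshold.

def historical_var(returns, alpha=0.95):
    """Loss threshold exceeded with probability 1 - alpha."""
    losses = sorted(-r for r in returns)   # convert returns to losses, ascending
    idx = min(int(alpha * len(losses)), len(losses) - 1)
    return losses[idx]

def expected_tail_loss(returns, alpha=0.95):
    """Average of the losses in the tail beyond the VaR threshold."""
    var = historical_var(returns, alpha)
    tail = [-r for r in returns if -r >= var]
    return sum(tail) / len(tail)

returns = [0.01, -0.02, 0.005, -0.04, 0.015, -0.01, 0.02, -0.03, 0.0, 0.01]
var90 = historical_var(returns, alpha=0.9)
etl90 = expected_tail_loss(returns, alpha=0.9)
```

In practice the estimation would run over a rolling window with the remainder reserved for back-testing, as the abstract describes.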

  20. Software Tool for the Prosthetic Foot Modeling and Stiffness Optimization

    Directory of Open Access Journals (Sweden)

    Matija Štrbac

    2012-01-01

We present the procedure for the optimization of the stiffness of the prosthetic foot. The procedure allows the selection of the elements of the foot and the materials used for the design. The procedure is based on the optimization where the cost function is the minimization of the difference between the knee joint torques of healthy walking and the walking with the transfemoral prosthesis. We present a simulation environment that allows the user to interactively vary the foot geometry and track the changes in the knee torque that arise from these adjustments. The software allows the estimation of the optimal prosthetic foot elasticity and geometry. We show that altering model attributes such as the length of the elastic foot segment or its elasticity leads to significant changes in the estimated knee torque required for a given trajectory.

  1. Collaborative Software Development in Support of Fast Adaptive AeroSpace Tools (FAAST)

    Science.gov (United States)

    Kleb, William L.; Nielsen, Eric J.; Gnoffo, Peter A.; Park, Michael A.; Wood, William A.

    2003-01-01

    A collaborative software development approach is described. The software product is an adaptation of proven computational capabilities combined with new capabilities to form the Agency's next generation aerothermodynamic and aerodynamic analysis and design tools. To efficiently produce a cohesive, robust, and extensible software suite, the approach uses agile software development techniques; specifically, project retrospectives, the Scrum status meeting format, and a subset of Extreme Programming's coding practices are employed. Examples are provided which demonstrate the substantial benefits derived from employing these practices. Also included is a discussion of issues encountered when porting legacy Fortran 77 code to Fortran 95 and a Fortran 95 coding standard.

  2. Computer-aided design in power engineering. Application of software tools

    International Nuclear Information System (INIS)

Demonstrates the use of software tools in the practice of design in the field of power systems. Presents many applications of design in the field of power systems. Useful for educational purposes and practical work. This textbook demonstrates the application of software tools in solving a series of problems in the design of power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation, guiding the reader through complex problems on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design; several design example calculations are carried out using engineering-standard software such as MATLAB, EMTP/ATP, Excel and Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation, using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of the graphical documentation of a power system. In the fourth chapter, the application of software tools to project management in power systems is discussed, with emphasis on the standard software MS Excel and MS Project.

  3. Computer-aided design in power engineering. Application of software tools

    Energy Technology Data Exchange (ETDEWEB)

    Stojkovic, Zlatan

    2012-07-01

Demonstrates the use of software tools in the practice of design in the field of power systems. Presents many applications of design in the field of power systems. Useful for educational purposes and practical work. This textbook demonstrates the application of software tools in solving a series of problems in the design of power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation, guiding the reader through complex problems on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design; several design example calculations are carried out using engineering-standard software such as MATLAB, EMTP/ATP, Excel and Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation, using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of the graphical documentation of a power system. In the fourth chapter, the application of software tools to project management in power systems is discussed, with emphasis on the standard software MS Excel and MS Project.

  4. A Reference Architecture for Providing Tools as a Service to Support Global Software Development

    DEFF Research Database (Denmark)

    Chauhan, Aufeef

    2014-01-01

    -computing paradigm for addressing above-mentioned issues by providing a framework to select appropriate tools as well as associated services and reference architecture of the cloud-enabled middleware platform that allows on demand provisioning of software engineering Tools as a Service (TaaS) with focus...

  5. Development of a software tool for an internal dosimetry using MIRD method

    Science.gov (United States)

    Chaichana, A.; Tocharoenchai, C.

    2016-03-01

Currently, many software packages for internal radiation dosimetry have been developed. Many of them do not provide sufficient tools to perform all of the necessary steps, from nuclear medicine image analysis to dose calculation. For this reason, we developed CALRADDOSE, software that can perform internal dosimetry using the MIRD method within a single environment. MATLAB version 2015a was used as the development tool. The calculation process proceeds from collecting time-activity data from image data, followed by residence time calculation and absorbed dose calculation using the MIRD method. To evaluate the accuracy of this software, we calculated residence times and absorbed doses for 5 Ga-67 studies and 5 I-131 MIBG studies and then compared the results with those obtained from the OLINDA/EXM software. The results showed no statistically significant differences between the two software packages. CALRADDOSE is a user-friendly, graphical user interface-based software package for internal dosimetry. It provides fast and accurate results, which may be useful for routine work.
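The MIRD pipeline described above, from time-activity data to residence time to absorbed dose, can be sketched numerically: integrate the time-activity curve to get cumulated activity, normalize by the administered activity for the residence time, and multiply by an S-value. All numbers and names below are illustrative assumptions, not outputs of CALRADDOSE or OLINDA/EXM.

```python
# Sketch of the MIRD arithmetic: cumulated activity via trapezoidal
# integration of a sampled time-activity curve, then dose = A~ x S.

def trapezoid(xs, ys):
    """Trapezoidal integral of ys over xs."""
    return sum((xs[i + 1] - xs[i]) * (ys[i] + ys[i + 1]) / 2
               for i in range(len(xs) - 1))

times_h = [0.0, 1.0, 4.0, 24.0]           # hours post-injection (illustrative)
activity_mbq = [100.0, 80.0, 50.0, 10.0]  # organ activity samples, MBq

administered_mbq = 100.0
cumulated = trapezoid(times_h, activity_mbq)      # cumulated activity, MBq*h
residence_time_h = cumulated / administered_mbq   # residence time, h

s_value = 1.0e-4   # hypothetical S-value, mGy per MBq*h
dose_mgy = cumulated * s_value
```

A real implementation would fit an exponential model to extrapolate the curve beyond the last measurement rather than truncating the integral, which this sketch omits.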

  6. Development of tools for safety analysis of control software in advanced reactors

    Energy Technology Data Exchange (ETDEWEB)

    Guarro, S.; Yau, M.; Motamed, M. [Advanced Systems Concepts Associates, El Segundo, CA (United States)

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described.

  7. Nmag micromagnetic simulation tool - software engineering lessons learned

    CERN Document Server

    Fangohr, Hans; Franchin, Matteo

    2016-01-01

We review design decisions and their impact for the open source code Nmag from a software engineering in computational science point of view. Key lessons include that encapsulating the simulation functionality in a library of a general-purpose language, here Python, eliminates the need for configuration files, provides the greatest flexibility in using the simulation, allows mixing of multiple simulations with pre- and post-processing in the same (Python) file, and allows users to benefit from the rich Python ecosystem of scientific packages. The choice of programming language (OCaml) for the computational core did not resonate with the users of the package (who are not computer scientists) and was suboptimal. The choice of Python for the top-level user interface was very well received by users from the science and engineering community. The from-source installation, in which key requirements were compiled from a tarball, was remarkably robust. In places, the code is a lot more ambitious than necessa...
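The library-instead-of-configuration-file lesson can be made concrete with a toy example. The `Simulation` class below is a made-up stand-in, not the Nmag API: the point is only that setup, execution and post-processing become ordinary code in one script.

```python
# Illustration of "simulation as a library": no configuration file,
# the Python script itself is the configuration. Class and methods are
# hypothetical stand-ins for a real simulation package.

class Simulation:
    def __init__(self, name, cell_size_nm):
        self.name = name
        self.cell_size_nm = cell_size_nm
        self.steps = 0

    def relax(self, steps):
        # stand-in for an actual energy-minimization loop
        self.steps += steps
        return self.steps

# setup, execution and post-processing all live in one script:
sim = Simulation("bar_magnet", cell_size_nm=3.0)
total = sim.relax(steps=100)
```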

  8. Lessons learned applying CASE methods/tools to Ada software development projects

    Science.gov (United States)

    Blumberg, Maurice H.; Randall, Richard L.

    1993-01-01

    This paper describes the lessons learned from introducing CASE methods/tools into organizations and applying them to actual Ada software development projects. This paper will be useful to any organization planning to introduce a software engineering environment (SEE) or evolving an existing one. It contains management level lessons learned, as well as lessons learned in using specific SEE tools/methods. The experiences presented are from Alpha Test projects established under the STARS (Software Technology for Adaptable and Reliable Systems) project. They reflect the front end efforts by those projects to understand the tools/methods, initial experiences in their introduction and use, and later experiences in the use of specific tools/methods and the introduction of new ones.

  9. Software tools and frameworks in High Energy Physics

    Science.gov (United States)

    Brun, R.

    2011-01-01

In many fields of science and industry the computing environment has grown at an exponential pace over the past 30 years. From ad hoc solutions for each problem, the field has gradually evolved to use or reuse systems developed over the years for the same environment, or coming from other fields with the same requirements. Several frameworks have emerged to solve common problems. In High Energy Physics (HEP) and Nuclear Physics, we have witnessed the emergence of common tools, packages and libraries that have gradually become the cornerstone of computing in these fields. The emergence of these systems has been complex because the computing field is evolving rapidly, the problems to be solved are more and more complex, and the experiments now involve several thousand physicists from all over the world. This paper describes the emergence of these frameworks and their evolution from libraries of independent subroutines to task-oriented packages and to general experiment frameworks.

  10. Software Tools for Robust Analysis of High-Dimensional Data

    Directory of Open Access Journals (Sweden)

    Valentin Todorov

    2014-06-01

The present work discusses robust multivariate methods specifically designed for high dimensions. Their implementation in R is presented and their application is illustrated on examples. The first group are algorithms for outlier detection, already introduced elsewhere and implemented in other packages. The value added of the new package is that all methods follow the same design pattern and thus can use the same graphical and diagnostic tools. The next topic covered is sparse principal components, including an object-oriented interface to the standard method proposed by Zou, Hastie, and Tibshirani (2006) and the robust one proposed by Croux, Filzmoser, and Fritz (2013). Robust partial least squares (see Hubert and Vanden Branden 2003) as well as partial least squares for discriminant analysis conclude the scope of the new package.
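The core idea behind robust outlier detection, estimating location and scale in a way the outliers themselves cannot distort, is easy to show in one dimension. The sketch below uses the median and the median absolute deviation (MAD); it is a univariate Python illustration of the principle, not the multivariate R implementation the paper describes.

```python
# Robust univariate outlier flagging: distances are measured from the
# median in MAD units, so a gross outlier cannot inflate the scale
# estimate the way it would inflate the standard deviation.

import statistics

def robust_outliers(data, cutoff=3.5):
    med = statistics.median(data)
    mad = statistics.median(abs(x - med) for x in data)
    # 0.6745 makes the MAD consistent with the standard deviation
    # under a normal distribution
    return [x for x in data if mad > 0 and abs(0.6745 * (x - med) / mad) > cutoff]

sample = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 42.0]
flagged = robust_outliers(sample)
```

The multivariate analogues replace median/MAD with robust location and covariance estimates (e.g. MCD) and flag points by robust Mahalanobis distance.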

  11. Towards a Reference Architecture to Provision Tools as a Service for Global Software Development

    DEFF Research Database (Denmark)

    Chauhan, Aufeef; Babar, Muhammad Ali

    2014-01-01

Organizations involved in Global Software Development (GSD) face challenges in terms of having access to an appropriate set of tools for performing distributed engineering and development activities, integration between heterogeneous desktop and web-based tools, management of artifacts developed and ...... distributed environment. In this paper, we argue the need for a cloud-enabled platform for supporting GSD and propose a reference architecture of a cloud-based Platform for provisioning an ecosystem of Tools as a Service (PTaaS).

  12. Teaching structure: student use of software tools for understanding macromolecular structure in an undergraduate biochemistry course.

    Science.gov (United States)

    Jaswal, Sheila S; O'Hara, Patricia B; Williamson, Patrick L; Springer, Amy L

    2013-01-01

    Because understanding the structure of biological macromolecules is critical to understanding their function, students of biochemistry should become familiar not only with viewing, but also with generating and manipulating structural representations. We report a strategy from a one-semester undergraduate biochemistry course to integrate use of structural representation tools into both laboratory and homework activities. First, early in the course we introduce the use of readily available open-source software for visualizing protein structure, coincident with modules on amino acid and peptide bond properties. Second, we use these same software tools in lectures and incorporate images and other structure representations in homework tasks. Third, we require a capstone project in which teams of students examine a protein-nucleic acid complex and then use the software tools to illustrate for their classmates the salient features of the structure, relating how the structure helps explain biological function. To ensure engagement with a range of software and database features, we generated a detailed template file that can be used to explore any structure, and that guides students through specific applications of many of the software tools. In presentations, students demonstrate that they are successfully interpreting structural information, and using representations to illustrate particular points relevant to function. Thus, over the semester students integrate information about structural features of biological macromolecules into the larger discussion of the chemical basis of function. Together these assignments provide an accessible introduction to structural representation tools, allowing students to add these methods to their biochemical toolboxes early in their scientific development.

  13. SaTool - a Software Tool for Structural Analysis of Complex Automation Systems

    DEFF Research Database (Denmark)

    Blanke, Mogens; Lorentzen, Torsten

    2006-01-01

The paper introduces SaTool, a tool for structural analysis; the use of the Matlab®-based implementation is presented and special features are introduced, which were motivated by industrial users. Salient features of the tool are presented, including the ability to specify the behavior of a comple...

  14. PublicationHarvester: An Open-Source Software Tool for Science Policy Research

    OpenAIRE

    Pierre Azoulay; Andrew Stellman; Joshua Graff Zivin

    2006-01-01

We present PublicationHarvester, an open-source software tool for gathering publication information on individual life scientists. The software interfaces with MEDLINE, and allows the end-user to specify up to four MEDLINE-formatted names for each researcher. Using these names along with a user-specified search query, PublicationHarvester generates yearly publication counts, optionally weighted by Journal Impact Factors. These counts are further broken down by order on the authorship list (fi...
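The aggregation step the abstract describes, yearly counts optionally weighted by Journal Impact Factor, reduces to a grouped sum. The sketch below illustrates that step only; the records and impact-factor values are invented, and this is not PublicationHarvester's code.

```python
# Yearly publication counts, optionally JIF-weighted. Records and the
# impact-factor table are illustrative.

from collections import defaultdict

def yearly_counts(publications, impact_factors=None):
    """Count publications per year; weight by JIF when a table is given."""
    counts = defaultdict(float)
    for pub in publications:
        weight = 1.0
        if impact_factors is not None:
            weight = impact_factors.get(pub["journal"], 0.0)
        counts[pub["year"]] += weight
    return dict(counts)

pubs = [
    {"year": 2004, "journal": "Cell"},
    {"year": 2004, "journal": "J Obscure Res"},
    {"year": 2005, "journal": "Cell"},
]
jif = {"Cell": 30.0, "J Obscure Res": 1.5}

raw = yearly_counts(pubs)            # unweighted counts per year
weighted = yearly_counts(pubs, jif)  # JIF-weighted counts per year
```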

  15. A software tool for design of process monitoring and analysis systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

A well-designed process monitoring and analysis system is necessary to consistently achieve any predefined end product quality. Systematic computer-aided methods and tools provide the means to design the necessary process monitoring and analysis systems and/or to validate any existing monitoring and analysis system. Software to achieve this has been developed. Two supporting tools for the design, a knowledge base (consisting of the process knowledge as well as the knowledge on measurement methods and tools) and a model library (consisting of the process operational models), have been extended rigorously and integrated with the user interface, which makes the software more generic and applicable to a wide range of problems. The software for the design of a process monitoring and analysis system is presented and illustrated with a tablet manufacturing process example.

  16. A software tool for simulation of surfaces generated by ball nose end milling

    DEFF Research Database (Denmark)

    Bissacco, Giuliano

    2004-01-01

    Reproducibility of experimental results concerning surface roughness requires tight control of all influencing factors, which is difficult to achieve in actual machining workshops. This introduces further complications in surface topography modelling. In the light of these considerations, a simple software tool for prediction of surface topography of ball nose end milled surfaces was developed. The tool is based on a simplified model of the ideal tool motion and neglects the effects due to run-out, static and dynamic deflections and error motions, but has the merit of generating as output a file in a format readable by a surface processor software (SPIP [2]) for calculation of a number of surface roughness parameters. In the next paragraph a description of the basic features of ball nose end milled surfaces is given, while in paragraph 3 the model is described.
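    With run-out and deflections neglected, an ideal-motion model of this kind reduces to simple geometry: the machined profile is the lower envelope of the spherical cutter passes. A minimal sketch of that idea (an illustration under assumed cutter parameters, not the tool described in the record):

```python
import math

def ideal_surface(radius, stepover, width, n=200):
    """Height profile across parallel ball-nose passes (ideal motion only).

    Ignores run-out, deflection and error motions, like the simplified
    model described above. Returns (x, z) samples of the machined profile.
    """
    xs = [i * width / (n - 1) for i in range(n)]
    zs = []
    for x in xs:
        # distance to the nearest pass centre line
        d = min(abs(x - k * stepover) for k in range(int(width / stepover) + 2))
        # a spherical cutter tip leaves z = R - sqrt(R^2 - d^2)
        zs.append(radius - math.sqrt(max(radius ** 2 - d ** 2, 0.0)))
    return xs, zs

# theoretical scallop height between passes: R - sqrt(R^2 - (s/2)^2)
R, s = 3.0, 0.5
print(R - math.sqrt(R ** 2 - (s / 2) ** 2))
```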

  17. Data Mining for Secure Software Engineering – Source Code Management Tool Case Study

    Directory of Open Access Journals (Sweden)

    A.V.Krishna Prasad,

    2010-07-01

    Full Text Available As Data Mining for Secure Software Engineering improves software productivity and quality, software engineers are increasingly applying data mining algorithms to various software engineering tasks. However, mining software engineering data poses several challenges, requiring various algorithms to effectively mine sequences, graphs and text from such data. Software engineering data includes code bases, execution traces, historical code changes, mailing lists and bug databases. These contain a wealth of information about a project's status, progress and evolution. Using well established data mining techniques, practitioners and researchers can explore the potential of this valuable data in order to better manage their projects and produce higher-quality software systems that are delivered on time and within budget. Data mining can be used in gathering and extracting latent security requirements, extracting algorithms and business rules from code, mining legacy applications for requirements and business rules for new projects, etc. Mining algorithms for software engineering fall into four main categories: frequent pattern mining (finding commonly occurring patterns); pattern matching (finding data instances for given patterns); clustering (grouping data into clusters); and classification (predicting labels of data based on already labeled data). In this paper, we discuss an overview of strategies for data mining for secure software engineering, with the implementation of a case study of text mining for a source code management tool.
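    The first of the four categories, frequent pattern mining, is easy to illustrate on version-history data: files that are repeatedly changed together form a frequent pattern. A minimal sketch (file names and transactions are invented, not from the paper's case study):

```python
from collections import Counter
from itertools import combinations

def frequent_pairs(transactions, min_support):
    """Frequent pattern mining in its simplest form: count item pairs
    that co-occur in at least `min_support` transactions (e.g. files
    that are repeatedly changed together in a version history)."""
    counts = Counter()
    for items in transactions:
        for pair in combinations(sorted(set(items)), 2):
            counts[pair] += 1
    return {p: c for p, c in counts.items() if c >= min_support}

changes = [
    {"parser.c", "lexer.c"},
    {"parser.c", "lexer.c", "ast.c"},
    {"ast.c", "main.c"},
]
print(frequent_pairs(changes, 2))  # {('lexer.c', 'parser.c'): 2}
```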

  18. Towards early software reliability prediction for computer forensic tools (case study).

    Science.gov (United States)

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component based system. It is used, for instance, to analyze the reliability of the state machines of real time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study. PMID:27386276
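    The per-component DTMC idea, linking each component to a discrete time Markov chain and propagating reliabilities along the transition structure, follows the general architecture-based pattern (Cheung-style models). A minimal sketch under assumed transition probabilities and component reliabilities, not the paper's COSMIC-FFP formulation:

```python
def tool_reliability(trans, rel, start=0, end=-1):
    """Architecture-based reliability sketch: component i has reliability
    rel[i]; trans[i][j] is the probability control moves from i to j.
    The tool's reliability is the probability of traversing the chain
    from `start` to `end` without any component failing, computed as the
    fixed point of x = e_start + x * (diag(rel) * P)."""
    n = len(rel)
    end = end % n
    # scaled transition matrix M = diag(rel) * P
    M = [[rel[i] * trans[i][j] for j in range(n)] for i in range(n)]
    x = [0.0] * n
    x[start] = 1.0
    for _ in range(200):  # fixed-point iteration for expected arrivals
        nx = [0.0] * n
        nx[start] = 1.0
        for i in range(n):
            for j in range(n):
                nx[j] += x[i] * M[i][j]
        x = nx
    return x[end] * rel[end]

# three components executed in series: control flows 0 -> 1 -> 2
P = [[0.0, 1.0, 0.0], [0.0, 0.0, 1.0], [0.0, 0.0, 0.0]]
rels = [0.99, 0.98, 0.95]
print(round(tool_reliability(P, rels), 4))  # 0.9217, i.e. 0.99*0.98*0.95
```

For a series chain the result reduces to the product of the component reliabilities, which is a quick sanity check on the fixed-point computation.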

  19. Managing the Testing Process Practical Tools and Techniques for Managing Hardware and Software Testing

    CERN Document Server

    Black, Rex

    2011-01-01

    New edition of one of the most influential books on managing software and hardware testing In this new edition of his top-selling book, Rex Black walks you through the steps necessary to manage rigorous testing programs of hardware and software. The preeminent expert in his field, Mr. Black draws upon years of experience as president of both the International and American Software Testing Qualifications boards to offer this extensive resource of all the standards, methods, and tools you'll need. The book covers core testing concepts and thoroughly examines the best test management practices

  20. Identify new Software Quality Assurance needs for the UK e-Science community and reintroduction for the right tools to improve evolved software engineering processes

    OpenAIRE

    Chang, Victor

    2008-01-01

    Software Quality Assurance (QA) is defined as the methodology and good practices for ensuring the quality of software in development. It involves handling bug reports, bug tracking, error investigation, verification of fixed bugs, test management, test case planning and design, as well as test case execution and records. Standards such as ISO 9001 are commonly followed for software QA, which recommend using a wide range of tools to improve the existing software engineering processes (SEP) for...

  1. "Blogs" and "wikis" are valuable software tools for communication within research groups.

    Science.gov (United States)

    Sauer, Igor M; Bialek, Dominik; Efimova, Ekaterina; Schwartlander, Ruth; Pless, Gesine; Neuhaus, Peter

    2005-01-01

    Appropriate software tools may improve communication and ease access to knowledge for research groups. A weblog is a website which contains periodic, chronologically ordered posts on a common webpage, whereas a wiki is hypertext-based collaborative software that enables documents to be authored collectively using a web browser. Although not primarily intended for use as an intranet-based collaborative knowledge warehouse, both blogs and wikis have the potential to offer all the features of complex and expensive IT solutions. These tools enable team members to share knowledge simply and quickly; the collective knowledge base of the group can be efficiently managed and navigated.

  2. A Vision on the Status and Evolution of HEP Physics Software Tools

    CERN Document Server

    Canal, P; Hatcher, R; Jun, S Y; Mrenna, S

    2013-01-01

    This paper represents the vision of the members of the Fermilab Scientific Computing Division's Computational Physics Department (SCD-CPD) on the status and the evolution of various HEP software tools such as the Geant4 detector simulation toolkit, the Pythia and GENIE physics generators, and the ROOT data analysis framework. The goal of this paper is to contribute ideas to the Snowmass 2013 process toward the composition of a unified document on the current status and potential evolution of the physics software tools which are essential to HEP.

  3. Integrated structural analysis tool using the linear matching method part 1 – Software development

    International Nuclear Information System (INIS)

    A number of direct methods based upon the Linear Matching Method (LMM) framework have been developed to address structural integrity issues for components subjected to cyclic thermal and mechanical load conditions. This paper presents a new integrated structural analysis tool using the LMM framework for the assessment of load carrying capacity, shakedown limit, ratchet limit and steady state cyclic response of structures. First, the development of the LMM for the evaluation of design limits in plasticity is introduced. Second, preliminary considerations for the development of the LMM into a tool which can be used on a regular basis by engineers are discussed. After the re-structuring of the LMM subroutines for multiple central processing unit (CPU) solution, the LMM software tool for the assessment of design limits in plasticity is implemented by developing an Abaqus CAE plug-in with graphical user interfaces. Further demonstration of this new LMM analysis tool including practical application and verification is presented in an accompanying paper. - Highlights: • A new structural analysis tool using the Linear Matching Method (LMM) is developed. • The software tool is able to evaluate the design limits in plasticity. • Able to assess limit load, shakedown, ratchet limit and steady state cyclic response. • Re-structuring of the LMM subroutines for multiple CPU solution is conducted. • The software tool is implemented by developing an Abaqus CAE plug-in with GUI

  4. Proceedings of the Workshop on software tools for distributed intelligent control systems

    Energy Technology Data Exchange (ETDEWEB)

    Herget, C.J. (ed.)

    1990-09-01

    The Workshop on Software Tools for Distributed Intelligent Control Systems was organized by Lawrence Livermore National Laboratory for the United States Army Headquarters Training and Doctrine Command and the Defense Advanced Research Projects Agency. The goals of the workshop were to identify the current state of the art in tools which support control systems engineering design and implementation; to identify research issues associated with writing software tools which would provide a design environment to assist engineers in multidisciplinary control design and implementation; to formulate a potential investment strategy to resolve the research issues and develop public domain code which can form the core of more powerful engineering design tools; and to recommend test cases to focus the software development process and test associated performance metrics. Recognizing that the development of software tools for distributed intelligent control systems will require a multidisciplinary effort, experts in systems engineering, control systems engineering, and computer science were invited to participate in the workshop. In particular, experts who could address the following topics were selected: operating systems, engineering data representation and manipulation, emerging standards for manufacturing data, mathematical foundations, coupling of symbolic and numerical computation, user interface, system identification, system representation at different levels of abstraction, system specification, system design, verification and validation, automatic code generation, and integration of modular, reusable code.

  5. Review of free software tools for image analysis of fluorescence cell micrographs.

    Science.gov (United States)

    Wiesmann, V; Franz, D; Held, C; Münzenmayer, C; Palmisano, R; Wittenberg, T

    2015-01-01

    An increasing number of free software tools have been made available for the evaluation of fluorescence cell micrographs. The main users are biologists and related life scientists with no or little knowledge of image processing. In this review, we give an overview of available tools and guidelines about which tools the users should use to segment fluorescence micrographs. We selected 15 free tools and divided them into stand-alone, Matlab-based, ImageJ-based, free demo versions of commercial tools and data sharing tools. The review consists of two parts: First, we developed a criteria catalogue and rated the tools regarding structural requirements, functionality (flexibility, segmentation and image processing filters) and usability (documentation, data management, usability and visualization). Second, we performed an image processing case study with four representative fluorescence micrograph segmentation tasks with figure-ground and cell separation. The tools display a wide range of functionality and usability. In the image processing case study, we were able to perform figure-ground separation in all micrographs using mainly thresholding. Cell separation was not possible with most of the tools, because cell separation methods are provided only by a subset of the tools and are difficult to parametrize and to use. Most important is that the usability matches the functionality of a tool. To be usable, specialized tools with less functionality need to fulfill less usability criteria, whereas multipurpose tools need a well-structured menu and intuitive graphical user interface.
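    Figure-ground separation by thresholding, the approach that worked across all micrographs in the case study, can be sketched with Otsu's classic method (an illustration of the technique, not code from any of the reviewed tools):

```python
def otsu_threshold(hist):
    """Otsu's method: pick the grey level that maximizes the
    between-class variance of the background/foreground split
    of an intensity histogram."""
    total = sum(hist)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w_b = sum_b = 0.0
    for t, h in enumerate(hist):
        w_b += h                                    # background weight
        if w_b == 0 or w_b == total:
            continue
        sum_b += t * h
        m_b = sum_b / w_b                           # background mean
        m_f = (total_sum - sum_b) / (total - w_b)   # foreground mean
        var = w_b * (total - w_b) * (m_b - m_f) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

hist = [0] * 256
hist[10], hist[200] = 100, 100   # a perfectly bimodal image
t = otsu_threshold(hist)
print(t)  # pixels brighter than t are classed as foreground
```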

  6. Contingency Contractor Optimization Phase 3 Sustainment Software Design Document - Contingency Contractor Optimization Tool - Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa; Jones, Katherine A

    2016-05-01

    This document describes the final software design of the Contingency Contractor Optimization Tool - Prototype. Its purpose is to provide the overall architecture of the software and the logic behind this architecture. Documentation for the individual classes is provided in the application Javadoc. The Contingency Contractor Optimization project is intended to address Department of Defense mandates by delivering a centralized strategic planning tool that allows senior decision makers to quickly and accurately assess the impacts, risks, and mitigation strategies associated with utilizing contract support. The Contingency Contractor Optimization Tool - Prototype was developed in Phase 3 of the OSD ATL Contingency Contractor Optimization project to support strategic planning for contingency contractors. The planning tool uses a model to optimize the Total Force mix by minimizing the combined total costs for selected mission scenarios. The model optimizes the match of personnel types (military, DoD civilian, and contractors) and capabilities to meet mission requirements as effectively as possible, based on risk, cost, and other requirements.

  7. An Open-Source Tool Set Enabling Analog-Digital-Software Co-Design

    Directory of Open Access Journals (Sweden)

    Michelle Collins

    2016-02-01

    Full Text Available This paper presents an analog-digital hardware-software co-design environment for simulating and programming reconfigurable systems. The tool simulates and designs, as well as enables experimental measurements after compiling to configurable systems, in the same integrated design tool framework. High-level software in Scilab/Xcos (open-source programs similar to MATLAB/Simulink) converts the high-level block description by the user to blif format (sci2blif), which acts as an input to the modified VPR tool, including the code vpr2swcs, encoding the specific platform through specific architecture files and resulting in a targetable switch list on the resulting configurable analog-digital system. The resulting tool uses an analog and mixed-signal library of components, enabling users and future researchers access to the basic analog operations/computations that are possible.

  8. Evaluation of computer-aided software engineering tools for data base development

    Energy Technology Data Exchange (ETDEWEB)

    Woyna, M.A.; Carlson, C.R.

    1989-02-01

    More than 80 computer-aided software engineering (CASE) tools were evaluated to determine their usefulness in data base development projects. The goal was to review the current state of the CASE industry and recommend one or more tools for inclusion in the uniform development environment (UDE), a programming environment being designed by Argonne National Laboratory for the US Department of Defense Organization of the Joint Chiefs of Staff, J-8 Directorate. This environment gives a computer programmer a consistent user interface and access to a full suite of tools and utilities for software development. In an effort to identify tools that would be useful in the planning, analysis, design, implementation, and maintenance of Argonne's data base development projects for the J-8 Directorate, we evaluated 83 commercially available CASE products. This report outlines the method used and presents the results of the evaluation.

  9. Development of software tools for 4-D visualization and quantitative analysis of PHITS simulation results

    International Nuclear Information System (INIS)

    A suite of software tools has been developed to facilitate the development of apparatus using a radiation transport simulation code PHITS by enabling 4D visualization (3D space and time) and quantitative analysis of so-called dieaway plots. To deliver useable tools as soon as possible, the existing software was utilized as much as possible; ParaView will be used for the 4D visualization of the results, whereas the analyses of dieaway plots will be done with ROOT toolkit with a tool named “diana”. To enable 4D visualization using ParaView, a group of tools (angel2vtk, DispDCAS1, CamPos) has been developed for the conversion of the data format to the one which can be read from ParaView and to ease the visualization. (author)
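    The conversion step can be illustrated with a minimal writer for the legacy VTK format that ParaView reads. This is a sketch of the file format only; the actual angel2vtk tool parses PHITS/ANGEL output, and the grid and field names below are invented.

```python
import os
import tempfile

def write_vtk_structured_points(path, nx, ny, nz, values):
    """Write a minimal legacy-VTK STRUCTURED_POINTS file readable by
    ParaView (format sketch; unit spacing and origin are assumed)."""
    assert len(values) == nx * ny * nz
    with open(path, "w") as f:
        f.write("# vtk DataFile Version 3.0\n")
        f.write("PHITS dose map\nASCII\n")
        f.write("DATASET STRUCTURED_POINTS\n")
        f.write(f"DIMENSIONS {nx} {ny} {nz}\n")
        f.write("ORIGIN 0 0 0\nSPACING 1 1 1\n")
        f.write(f"POINT_DATA {nx * ny * nz}\n")
        f.write("SCALARS dose float 1\nLOOKUP_TABLE default\n")
        f.write("\n".join(str(v) for v in values) + "\n")

path = os.path.join(tempfile.gettempdir(), "phits_demo.vtk")
write_vtk_structured_points(path, 2, 2, 1, [0.0, 1.5, 2.5, 3.0])
text = open(path).read()
print("STRUCTURED_POINTS" in text)  # True
```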

  10. Activity Theory applied to Global Software Engineering: Theoretical Foundations and Implications for Tool Builders

    DEFF Research Database (Denmark)

    Tell, Paolo; Ali Babar, Muhammad

    2012-01-01

    Although a plethora of tools are available for Global Software Engineering (GSE) teams, it is being realized increasingly that the most prevalent desktop metaphor underpinning the majority of tools has several inherent limitations. We have proposed that Activity-Based Computing (ABC) can be a promising alternative to build tools for GSE. However, significant effort is required to introduce a new paradigm; there is a need of a sound theoretical foundation based on activity theory to address challenges faced by tools in GSE. This paper reports our effort aimed at building theoretical foundations for applying activity theory to GSE. We analyze and explain the fundamental concepts of activity theory, and how they can be applied by using examples of software architecture design and evaluation processes. We describe the kind of data model and architectural support required for applying activity theory...

  11. Management of an affiliated Physics Residency Program using a commercial software tool.

    Science.gov (United States)

    Zacarias, Albert S; Mills, Michael D

    2010-01-01

    A review of commercially available allied health educational management software tools was performed to evaluate their capacity to manage program data associated with a CAMPEP-accredited Therapy Physics Residency Program. Features of these software tools include: a) didactic course reporting and organization, b) competency reporting by topic, category and didactic course, c) student time management and accounting, and d) student patient case reporting by topic, category and course. The software package includes features for recording school administrative information; setting up lists of courses, faculty, clinical sites, categories, competencies, and time logs; and the inclusion of standardized external documents. There are provisions for developing evaluation and survey instruments. The mentors and program may be evaluated by residents, and residents may be evaluated by faculty members using this feature. Competency documentation includes the time spent on the problem or with the patient, time spent with the mentor, date of the competency, and approval by the mentor and program director. Course documentation includes course and lecture title, lecturer, topic information, date of lecture and approval by the Program Director. These software tools have the facility to include multiple clinical sites, with local subadministrators having the ability to approve competencies and attendance at clinical conferences. In total, these software tools have the capability of managing all components of a CAMPEP-accredited residency program. The application database lends the software to the support of multiple affiliated clinical sites within a single residency program. Such tools are a critical and necessary component if the medical physics profession is to meet the projected needs for qualified medical physicists in future years. PMID:20717075

  12. An upgrade of a computerized tool for managing agile software projects

    OpenAIRE

    Bačnar, Andrej

    2015-01-01

    The thesis describes the development of an upgrade for an agile project management software tool. In the first part, the thesis presents the basic characteristics of agile methodologies, with the emphasis on the Scrum and Kanban methodologies. The next chapter consists of a brief description of the existing tool and the upgrade requirements specification, which includes: the workflow visualization by using the board, elaboration of additional functionality to monitor the development teams and creation ...

  13. TINA manual landmarking tool: software for the precise digitization of 3D landmarks

    OpenAIRE

    Schunke Anja C; Bromiley Paul A; Tautz Diethard; Thacker Neil A

    2012-01-01

    Background: Interest in the placing of landmarks and subsequent morphometric analyses of shape for 3D data has increased with the increasing accessibility of computed tomography (CT) scanners. However, current computer programs for this task suffer from various practical drawbacks. We present here a free software tool that overcomes many of these problems. Results: The TINA Manual Landmarking Tool was developed for the digitization of 3D data sets. It enables the generation of a modifi...

  14. Using a Self-Administered Visual Basic Software Tool To Teach Psychological Concepts.

    Science.gov (United States)

    Strang, Harold R.; Sullivan, Amie K.; Schoeny, Zahrl G.

    2002-01-01

    Introduces LearningLinks, a Visual Basic software tool that allows teachers to create individualized learning modules that use constructivist and behavioral learning principles. Describes field testing of undergraduates at the University of Virginia that tested a module designed to improve understanding of the psychological concepts of…

  15. Wiki as a Corporate Learning Tool: Case Study for Software Development Company

    Science.gov (United States)

    Milovanovic, Milos; Minovic, Miroslav; Stavljanin, Velimir; Savkovic, Marko; Starcevic, Dusan

    2012-01-01

    In our study, we attempted to further investigate how Web 2.0 technologies influence workplace learning. Our particular interest was on using Wiki as a tool for corporate exchange of knowledge with the focus on informal learning. In this study, we collaborated with a multinational software development company that uses Wiki as a corporate tool…

  16. IDA: A new software tool for INTEGRAL field spectroscopy Data Analysis

    CERN Document Server

    Lorenzo, B Garcia; Megias, E

    2016-01-01

    We present a software package, IDA, which can easily handle two-dimensional spectroscopy data. IDA has been written in IDL and offers a window-based interface. The available tools can visualize a recovered image from spectra at any desired wavelength interval, obtain velocity fields, velocity dispersion distributions, etc.

  17. Claire, a tool used for the simulation of events in software tests

    International Nuclear Information System (INIS)

    CLAIRE provides a purely software-based system which makes it possible to validate on-line applications with respect to the specifications or the code. This tool offers easy graphic design of the application and of its environment. It carries out quite efficiently the simulation of any modelled system and runs the control of the evolution either dynamically or with prerecorded time. (TEC)

  18. Development and implementation of software tools for NPP component safety and life cycle monitoring

    International Nuclear Information System (INIS)

    Two information systems affecting the technical safety and durability of components, viz. the Surveillance Program and the OPTIMUD application, are described as a basis for discussion of the broader context induced by any software tool implementation in the nuclear power area. (orig.)

  19. SDMdata: A Web-Based Software Tool for Collecting Species Occurrence Records.

    Directory of Open Access Journals (Sweden)

    Xiaoquan Kong

    Full Text Available It is important to easily and efficiently obtain high quality species distribution data for predicting the potential distribution of species using species distribution models (SDMs). There is a need for a powerful software tool to automatically or semi-automatically assist in identifying and correcting errors. Here, we use Python to develop a web-based software tool (SDMdata) to easily collect occurrence data from the Global Biodiversity Information Facility (GBIF) and check species names and the accuracy of coordinates (latitude and longitude). It is open-source software (licensed under the GNU Affero General Public License/AGPL), allowing anyone to access and manipulate the source code. SDMdata is available online free of charge from .
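    The coordinate-accuracy check described above can be sketched as a simple range filter over occurrence records. This is a simplified illustration with an invented record layout; the real tool also validates species names against GBIF.

```python
def check_occurrences(records):
    """Flag occurrence records with invalid coordinates.

    Each record is a (species, latitude, longitude) tuple; records with
    missing or out-of-range coordinates, or the suspicious (0, 0) point,
    are flagged for manual review."""
    valid, flagged = [], []
    for rec in records:
        name, lat, lon = rec
        ok = (lat is not None and lon is not None
              and -90.0 <= lat <= 90.0 and -180.0 <= lon <= 180.0
              and not (lat == 0.0 and lon == 0.0))  # common data-entry error
        (valid if ok else flagged).append(rec)
    return valid, flagged

recs = [("Panthera leo", -1.3, 36.8),
        ("Panthera leo", 0.0, 0.0),     # suspicious "null island" point
        ("Panthera leo", 95.0, 36.8)]   # latitude out of range
valid, flagged = check_occurrences(recs)
print(len(valid), len(flagged))  # 1 2
```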

  20. Programming heterogeneous MPSoCs tool flows to close the software productivity gap

    CERN Document Server

    Castrillón Mazo, Jerónimo

    2014-01-01

    This book provides embedded software developers with techniques for programming heterogeneous Multi-Processor Systems-on-Chip (MPSoCs), capable of executing multiple applications simultaneously. It describes a set of algorithms and methodologies to narrow the software productivity gap, as well as an in-depth description of the underlying problems and challenges of today’s programming practices. The authors present four different tool flows: A parallelism extraction flow for applications written using the C programming language, a mapping and scheduling flow for parallel applications, a special mapping flow for baseband applications in the context of Software Defined Radio (SDR) and a final flow for analyzing multiple applications at design time. The tool flows are evaluated on Virtual Platforms (VPs), which mimic different characteristics of state-of-the-art heterogeneous MPSoCs.   • Provides a novel set of algorithms and methodologies for programming heterogeneous Multi-Processor Systems-on-Chip (MPSoCs)...

  1. Astrophysics datamining in the classroom: Exploring real data with new software tools and robotic telescopes

    CERN Document Server

    Doran, Rosa; Boudier, Thomas; Delva, Pacôme; Ferlet, Roger; Almeida, Maria L T; Barbosa, Domingos; Gomez, Edward; Pennypacker, Carl; Roche, Paul; Roberts, Sarah

    2012-01-01

    Within the efforts to bring frontline interactive astrophysics and astronomy to the classroom, Hands on Universe (HOU) developed a set of exercises and a platform using real data obtained by some of the most advanced ground and space observatories. The backbone of this endeavour is a new free software Web tool, Such a Lovely Software for Astronomy based on Image J (Salsa J). It is student-friendly, was developed specifically for the HOU project, and targets middle and high schools. It allows students to display, analyze, and explore professionally obtained astronomical images, while learning concepts in gravitational dynamics, kinematics, nuclear fusion and electromagnetism. The continuously evolving set of exercises and tutorials is being completed with real (professionally obtained) data to download and detailed tutorials. The flexibility of the Salsa J platform enables students and teachers to extend the exercises with their own observations. The software developed for the HOU program has been designed to...

  2. Software Construction and Composition Tools for Petascale Computing SCW0837 Progress Report

    Energy Technology Data Exchange (ETDEWEB)

    Epperly, T W; Hochstein, L

    2011-09-12

    The majority of scientific software is distributed as source code. As the number of library dependencies and supported platforms increases, so does the complexity of describing the rules for configuring and building software. In this project, we have performed an empirical study of the magnitude of the build problem by examining the development history of two DOE-funded scientific software projects. We have developed MixDown, a meta-build tool, to simplify the task of building applications that depend on multiple third-party libraries. The results of this research indicate that the effort scientific programmers spend on the build process represents a significant fraction of the total development effort, and that the use of MixDown can significantly simplify the task of building software with multiple dependencies.
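    The core scheduling problem a meta-build tool like MixDown addresses, building every third-party dependency before the packages that need it, reduces to a topological sort of the dependency graph. An illustrative sketch, not MixDown's implementation (package names are invented):

```python
def build_order(deps):
    """Order packages so every dependency is built before its dependents.

    `deps` maps each package to the list of packages it depends on.
    Raises ValueError on a dependency cycle, which would make any
    build order impossible."""
    order, seen = [], set()

    def visit(pkg, stack=()):
        if pkg in stack:
            raise ValueError(f"dependency cycle through {pkg}")
        if pkg in seen:
            return
        for dep in deps.get(pkg, []):
            visit(dep, stack + (pkg,))
        seen.add(pkg)
        order.append(pkg)

    for pkg in deps:
        visit(pkg)
    return order

deps = {"app": ["hdf5", "mpi"], "hdf5": ["zlib"], "mpi": [], "zlib": []}
print(build_order(deps))  # zlib before hdf5, both before app
```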

  3. Assess/Mitigate Risk through the Use of Computer-Aided Software Engineering (CASE) Tools

    Science.gov (United States)

    Aguilar, Michael L.

    2013-01-01

    The NASA Engineering and Safety Center (NESC) was requested to perform an independent assessment of the mitigation of the Constellation Program (CxP) Risk 4421 through the use of computer-aided software engineering (CASE) tools. With the cancellation of the CxP, the assessment goals were modified to capture lessons learned and best practices in the use of CASE tools. The assessment goal was to prepare the next program for the use of these CASE tools. The outcome of the assessment is contained in this document.

  4. New software tools for enhanced precision in robot-assisted laser phonomicrosurgery.

    Science.gov (United States)

    Dagnino, Giulio; Mattos, Leonardo S; Caldwell, Darwin G

    2012-01-01

    This paper describes a new software package created to enhance precision during robot-assisted laser phonomicrosurgery procedures. The new software is composed of three tools for camera calibration, automatic tumor segmentation, and laser tracking. These were designed and developed to improve the outcome of this demanding microsurgical technique, and were tested herein to produce quantitative performance data. The experimental setup was based on the motorized laser micromanipulator created by Istituto Italiano di Tecnologia and the experimental protocols followed are fully described in this paper. The results show the new tools are robust and effective: The camera calibration tool reduced residual errors (RMSE) to 0.009 ± 0.002 mm under 40× microscope magnification; the automatic tumor segmentation tool resulted in deep lesion segmentations comparable to manual segmentations (RMSE= 0.160 ± 0.028 mm under 40× magnification); and the laser tracker tool proved to be reliable even during cutting procedures (RMSE= 0.073 ± 0.023 mm under 40× magnification). These results demonstrate the new software package can provide excellent improvements to the previous microsurgical system, leading to important enhancements in surgical outcome.
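    The RMSE figure of merit quoted for all three tools can be reproduced for any measured-vs-reference point set with a few lines (a generic sketch of the metric, not the paper's software; the sample points are invented):

```python
import math

def rmse(points_a, points_b):
    """Root-mean-square error between measured and reference 2D points,
    the figure of merit (reported in mm in the paper) for calibration,
    segmentation and tracking accuracy."""
    assert len(points_a) == len(points_b)
    sq = [(ax - bx) ** 2 + (ay - by) ** 2
          for (ax, ay), (bx, by) in zip(points_a, points_b)]
    return math.sqrt(sum(sq) / len(sq))

measured = [(0.01, 0.0), (1.0, 1.01), (2.0, 2.0)]
reference = [(0.0, 0.0), (1.0, 1.0), (2.0, 2.0)]
print(round(rmse(measured, reference), 4))  # 0.0082
```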

  5. Discovering patterns of correlation and similarities in software project data with the Circos visualization tool

    CERN Document Server

    Kosti, Makrina Viola; Bourazani, Nikoleta; Angelis, Lefteris

    2011-01-01

    Software cost estimation based on multivariate data from completed projects requires the building of efficient models. These models essentially describe relations in the data, either on the basis of correlations between variables or of similarities between the projects. The continuous growth of the amount of data gathered, and the need to perform preliminary analysis in order to discover patterns able to drive the building of reasonable models, lead researchers towards intelligent and time-saving tools which can effectively describe data and their relationships. The goal of this paper is to suggest an innovative visualization tool, widely used in bioinformatics, which represents relations in data in an aesthetic and intelligent way. In order to illustrate the capabilities of the tool, we use a well known dataset of software engineering projects.
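    The pairwise relations such a plot draws, correlations between project variables, can be computed with plain Pearson coefficients. An illustrative sketch only: Circos itself is a Perl visualization tool, and the project data below are invented.

```python
import math

def corr_matrix(data):
    """Pearson correlation between every pair of variables -- the kind
    of pairwise relation a Circos-style plot draws as ribbons between
    segments. `data` maps variable names to equal-length value lists."""
    def pearson(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)
    names = list(data)
    return {(a, b): pearson(data[a], data[b]) for a in names for b in names}

projects = {"size_kloc": [10, 25, 3, 40], "effort_pm": [12, 30, 4, 45]}
m = corr_matrix(projects)
print(round(m[("size_kloc", "effort_pm")], 3))  # strong positive correlation
```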

  6. IPAT: a freely accessible software tool for analyzing multiple patent documents with inbuilt landscape visualizer.

    Science.gov (United States)

    Ajay, Dara; Gangwal, Rahul P; Sangamwar, Abhay T

    2015-01-01

    Intelligent Patent Analysis Tool (IPAT) is an online data retrieval tool based on a text-mining algorithm that extracts specific patent information in a predetermined pattern into an Excel sheet. The software is designed and developed to retrieve and analyze technology information from multiple patent documents and to generate various patent landscape graphs and charts. The software is coded in C# in Visual Studio 2010; it extracts publicly available patent information from web pages like Google Patent and simultaneously studies various technology trends based on user-defined parameters. In other words, IPAT combined with manual categorization acts as an excellent technology assessment tool in competitive intelligence and due diligence for predicting the future R&D forecast. PMID:26452016

  7. SNL2Z: Tool for Translating an Informal Structured Software Specification into Formal Specification

    Directory of Open Access Journals (Sweden)

    Mohamed A. Sullabi

    2008-01-01

    Full Text Available In the area of software engineering there have been very few efforts to automate the translation from specifications written in natural language to formal specification languages. Specifications written in natural language are context-dependent and commonly vague; these are the major reasons for the challenge. This paper discusses the design of a tool for translating a software specification written in natural language into a formal specification. We apply a controlled natural language that limits the syntax and semantics of the natural language statements by proposing a structured natural language (SNL) to avoid the ambiguity problem. As input, the tool uses basic information about the operation schemas and statements describing the specification of the system, written collaboratively by a group of users. The output of the tool is a translation of the specification statements into equivalent statements in LaTeX form, which are compiled to produce equivalent statements in Z.

  8. Pathway Tools version 19.0 update: software for pathway/genome informatics and systems biology.

    Science.gov (United States)

    Karp, Peter D; Latendresse, Mario; Paley, Suzanne M; Krummenacker, Markus; Ong, Quang D; Billington, Richard; Kothari, Anamika; Weaver, Daniel; Lee, Thomas; Subhraveti, Pallavi; Spaulding, Aaron; Fulcher, Carol; Keseler, Ingrid M; Caspi, Ron

    2016-09-01

    Pathway Tools is a bioinformatics software environment with a broad set of capabilities. The software provides genome-informatics tools such as a genome browser, sequence alignments, a genome-variant analyzer and comparative-genomics operations. It offers metabolic-informatics tools, such as metabolic reconstruction, quantitative metabolic modeling, prediction of reaction atom mappings and metabolic route search. Pathway Tools also provides regulatory-informatics tools, such as the ability to represent and visualize a wide range of regulatory interactions. This article outlines the advances in Pathway Tools in the past 5 years. Major additions include components for metabolic modeling, metabolic route search, computation of atom mappings and estimation of compound Gibbs free energies of formation; addition of editors for signaling pathways, for genome sequences and for cellular architecture; storage of gene essentiality data and phenotype data; display of multiple alignments, and of signaling and electron-transport pathways; and development of Python and web-services application programming interfaces. Scientists around the world have created more than 9800 Pathway/Genome Databases by using Pathway Tools, many of which are curated databases for important model organisms.

  9. SU-E-T-27: A Tool for Routine Quality Assurance of Radiotherapy Dose Calculation Software

    International Nuclear Information System (INIS)

    Purpose: Dose calculation software is thoroughly evaluated when it is commissioned; however, evaluation of periodic software updates is typically limited in scope due to staffing constraints and the need to quickly return the treatment planning system to clinical service. We developed a tool for quickly and comprehensively testing and documenting dose calculation software against measured data. Methods: A tool was developed using MatLab (The MathWorks, Natick, MA) for evaluation of dose calculation algorithms against measured data. Inputs to the tool are measured data, reference DICOM RT PLAN files describing the measurements, and dose calculations in DICOM format. The tool consists of a collection of extensible modules that can perform analysis of point dose, depth dose curves, and profiles using dose difference, distance-to-agreement, and the gamma-index. Each module generates a report subsection that is incorporated into a master template, which is converted to final form in portable document format (PDF). Results: After each change to the treatment planning system, a report can be generated in approximately 90 minutes. The tool has been in use for more than 5 years, spanning 5 versions of the eMC and 4 versions of the AAA. We have detected changes to the algorithms that affected clinical practice once during this period. Conclusion: Our tool provides an efficient method for quality assurance of dose calculation software, providing a complete set of tests for an update. Future work includes the addition of plan-level tests, allowing incorporation of, for example, the TG-119 test suite for IMRT, and integration with the treatment planning system via an application programming interface. Integration with the planning system will permit fully automated testing and reporting at scheduled intervals.
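The gamma-index named among the analysis criteria combines dose difference and distance-to-agreement into a single metric. A minimal 1-D sketch of the standard global gamma computation (an illustration, not the tool's MatLab implementation):

```python
import numpy as np

def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose, dta=3.0, dd=0.03):
    """Global 1-D gamma index.

    For each reference point, minimise the combined dose-difference /
    distance-to-agreement metric over all evaluated points. `dta` is in mm,
    `dd` is a fraction of the maximum reference dose. Gamma <= 1 passes."""
    ref_pos, ref_dose = np.asarray(ref_pos, float), np.asarray(ref_dose, float)
    eval_pos, eval_dose = np.asarray(eval_pos, float), np.asarray(eval_dose, float)
    norm = dd * ref_dose.max()
    gammas = []
    for rp, rd in zip(ref_pos, ref_dose):
        dist_term = ((eval_pos - rp) / dta) ** 2
        dose_term = ((eval_dose - rd) / norm) ** 2
        gammas.append(np.sqrt((dist_term + dose_term).min()))
    return np.array(gammas)
```

Identical measured and calculated curves yield gamma 0 everywhere; a point whose dose deviates by exactly the `dd` tolerance, with no spatial shift, yields gamma 1.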

  10. Life Cycle Assessment Studies of Chemical and Biochemical Processes through the new LCSoft Software-tool

    DEFF Research Database (Denmark)

    Supawanich, Perapong; Malakul, Pomthong; Gani, Rafiqul

    2015-01-01

    Life Cycle Assessment (LCA) is an effective tool for quantifying the potential environmental impacts of products, processes, or services in order to support the selection of desired products and/or processes from different alternatives. For more sustainable process designs, technical requirements have to be evaluated together with environmental and economic aspects. The LCSoft software-tool has been developed to perform LCA as a stand-alone tool as well as integrated with other process design tools such as process simulation, economic analysis (ECON), and sustainable process design ... LCI assessment results. The fourth task has been added to validate and improve LCSoft by testing it against several case studies and comparing the assessment results with other available tools.

  11. Cerec Smile Design--a software tool for the enhancement of restorations in the esthetic zone.

    Science.gov (United States)

    Kurbad, Andreas; Kurbad, Susanne

    2013-01-01

    Restorations in the esthetic zone can now be enhanced using software tools. In addition to the design of the restoration itself, a part or all of the patient's face can be displayed on the monitor to increase the predictability of treatment results. Using the Smile Design components of the Cerec and inLab software, a digital photograph of the patient can be projected onto a three-dimensional dummy head. In addition to its use for the enhancement of the CAD process, this technology can also be utilized for marketing purposes.

  12. Development of Safety-Critical Software for Nuclear Power Plant using a CASE Tool

    International Nuclear Information System (INIS)

    The Integrated SOftware Development Environment (ISODE) is developed to provide the major S/W life cycle processes: a development process, a V/V process, a requirements traceability process, an automated document generation process, and a target importing process to the Programmable Logic Controller (PLC) platform. It provides critical safety software developers with a certified, domain-optimized, model-based development environment, and the associated services to reduce the time and effort needed to develop software, such as debugging, simulation, code generation and document generation. It also provides critical safety software verifiers with integrated V/V features for each phase of the software life cycle, using appropriate tools such as model test coverage, formal verification, and automated report generation. In addition to development and verification, ISODE gives a complete traceability solution from the SW design phase to the testing phase. Using this information, coverage and impact analysis can be done easily whenever software modification is necessary. The final source codes of ISODE are imported into the newly developed PLC environment on a module basis, after being automatically converted into the format required by the PLC. Additional tests at the module and unit level are performed on the target platform.

  13. Creating a strategic plan for configuration management using computer aided software engineering (CASE) tools

    International Nuclear Information System (INIS)

    This paper provides guidance in the definition, documentation, measurement, enhancement of processes, and validation of a strategic plan for configuration management (CM). The approach and methodology used in establishing a strategic plan is the same for any enterprise, including the Department of Energy (DOE), commercial nuclear plants, the Department of Defense (DOD), or large industrial complexes. The principles and techniques presented are used worldwide by some of the largest corporations. The authors used industry knowledge and the areas of their current employment to illustrate and provide examples. Developing a strategic configuration and information management plan for DOE Idaho Field Office (DOE-ID) facilities is discussed in this paper. A good knowledge of CM principles is the key to successful strategic planning. This paper will describe and define CM elements, and discuss how CM integrates the facility's physical configuration, design basis, and documentation. The strategic plan does not need the support of a computer aided software engineering (CASE) tool. However, the use of a CASE tool provides a methodology for consistency in approach, graphics, and database capability, combined to form an encyclopedia and a method of presentation that is easily understood and aids the process of reengineering. CASE tools have much more capability than that stated above. Some examples are supporting a joint application development group (JAD) to prepare a software functional specification document and, if necessary, providing the capability to automatically generate software application code. This paper briefly discusses characteristics and capabilities of two CASE tools that use different methodologies to generate similar deliverables.

  14. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Directory of Open Access Journals (Sweden)

    Pakarinen Jyri

    2010-01-01

    Full Text Available Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
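A standard quantity such a distortion-analysis tool can report is total harmonic distortion (THD): the energy of the harmonics a nonlinear device adds relative to the fundamental. The following is an illustrative sketch, not part of the Distortion Analysis Toolkit itself:

```python
import numpy as np

def thd(signal, fs, f0, n_harmonics=5):
    """Total harmonic distortion: ratio of the RMS harmonic content
    (2*f0 .. n*f0) to the fundamental, from a windowed FFT spectrum."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)

    def mag_at(f):
        # Magnitude of the FFT bin closest to frequency f.
        return spectrum[np.argmin(np.abs(freqs - f))]

    fundamental = mag_at(f0)
    harmonics = np.sqrt(sum(mag_at(k * f0) ** 2
                            for k in range(2, n_harmonics + 1)))
    return harmonics / fundamental
```

A clean sine wave gives a THD near zero, while a clipped sine (a crude model of an overdriven tube stage) shows clearly elevated THD from the odd harmonics that clipping introduces.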

  15. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Science.gov (United States)

    Pakarinen, Jyri

    2010-12-01

    Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.

  16. A Tool for Testing of Inheritance Related Bugs in Object Oriented Software

    Directory of Open Access Journals (Sweden)

    B. G. Geetha

    2008-01-01

    Full Text Available Object-oriented software development differs from traditional software development. In object-oriented software, polymorphism, inheritance, and dynamic binding are the important features, and inheritance is the main one. Compilers usually detect only syntax-oriented errors, so some property errors may remain in the product. Data-flow testing is an appropriate method for testing program features: it analyzes the structure of the software and gives the flow of a property. This study is designed to detect hidden errors related to the inheritance property. Inputs of the tool are a set of classes and packages. Outputs of the tool are the hierarchies of classes, methods, and attributes, and a set of inheritance-related bugs, such as naked access and spaghetti inheritance, which are automatically detected by the tool. The tool is developed as three major modules: code analysis, knowledge base preparation, and bug analysis. The code analysis module is designed to parse and extract details from the code. The knowledge base preparation module is designed to prepare the knowledge base of the program details. The bug analysis module is designed to extract bug-related information from the database. The testing is static. This study focuses on Java programs.

  17. A Runtime Environment for Supporting Research in Resilient HPC System Software & Tools

    Energy Technology Data Exchange (ETDEWEB)

    Vallee, Geoffroy R [ORNL; Naughton, III, Thomas J [ORNL; Boehm, Swen [ORNL; Engelmann, Christian [ORNL

    2013-01-01

    The high-performance computing (HPC) community continues to increase the size and complexity of hardware platforms that support advanced scientific workloads. The runtime environment (RTE) is a crucial layer in the software stack for these large-scale systems. The RTE manages the interface between the operating system and the application running in parallel on the machine. The deployment of applications and tools on large-scale HPC computing systems requires the RTE to manage process creation in a scalable manner, support sparse connectivity, and provide fault tolerance. We have developed a new RTE that provides a basis for building distributed execution environments and developing tools for HPC to aid research in system software and resilience. This paper describes the software architecture of the Scalable runTime Component Infrastructure (STCI), which is intended to provide a complete infrastructure for scalable start-up and management of many processes in large-scale HPC systems. We highlight features of the current implementation, which is provided as a system library that allows developers to easily use and integrate STCI in their tools and/or applications. The motivation for this work has been to support ongoing research activities in fault-tolerance for large-scale systems. We discuss the advantages of the modular framework employed and describe two use cases that demonstrate its capabilities: (i) an alternate runtime for a Message Passing Interface (MPI) stack, and (ii) a distributed control and communication substrate for a fault-injection tool.

  18. TINA manual landmarking tool: software for the precise digitization of 3D landmarks

    Directory of Open Access Journals (Sweden)

    Schunke Anja C

    2012-04-01

    Full Text Available Abstract Background Interest in the placing of landmarks and subsequent morphometric analyses of shape for 3D data has increased with the increasing accessibility of computed tomography (CT) scanners. However, current computer programs for this task suffer from various practical drawbacks. We present here a free software tool that overcomes many of these problems. Results The TINA Manual Landmarking Tool was developed for the digitization of 3D data sets. It enables the generation of a modifiable 3D volume rendering display plus matching orthogonal 2D cross-sections from DICOM files. The object can be rotated and axes defined and fixed. Predefined lists of landmarks can be loaded and the landmarks identified within any of the representations. Output files are stored in various established formats, depending on the preferred evaluation software. Conclusions The software tool presented here provides several options facilitating the placing of landmarks on 3D objects, including volume rendering from DICOM files, definition and fixation of meaningful axes, easy import, placement, control, and export of landmarks, and handling of large datasets. The TINA Manual Landmarking Tool runs under Linux and can be obtained for free from http://www.tina-vision.net/tarballs/.

  19. PyElph - a software tool for gel images analysis and phylogenetics

    Directory of Open Access Journals (Sweden)

    Pavel Ana Brânduşa

    2012-01-01

    Full Text Available Abstract Background This paper presents PyElph, a software tool which automatically extracts data from gel images, computes the molecular weights of the analyzed molecules or fragments, compares DNA patterns which result from experiments with molecular genetic markers and, also, generates phylogenetic trees computed by five clustering methods, using the information extracted from the analyzed gel image. The software can be successfully used for population genetics, phylogenetics, taxonomic studies and other applications which require gel image analysis. Researchers and students working in molecular biology and genetics would benefit greatly from the proposed software because it is free, open source, easy to use, has a friendly Graphical User Interface and does not depend on specific image acquisition devices like other commercial programs with similar functionalities do. Results PyElph software tool is entirely implemented in Python which is a very popular programming language among the bioinformatics community. It provides a very friendly Graphical User Interface which was designed in six steps that gradually lead to the results. The user is guided through the following steps: image loading and preparation, lane detection, band detection, molecular weights computation based on a molecular weight marker, band matching and finally, the computation and visualization of phylogenetic trees. A strong point of the software is the visualization component for the processed data. The Graphical User Interface provides operations for image manipulation and highlights lanes, bands and band matching in the analyzed gel image. All the data and images generated in each step can be saved. The software has been tested on several DNA patterns obtained from experiments with different genetic markers. 
Examples of genetic markers which can be analyzed using PyElph are RFLP (Restriction Fragment Length Polymorphism), AFLP (Amplified Fragment Length Polymorphism) and RAPD (Random Amplified Polymorphic DNA).
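The molecular-weight computation step described above typically exploits the near-linear relation between a band's migration distance and the logarithm of its fragment size, calibrated on the marker lane. A small sketch with hypothetical marker values (not PyElph's actual code):

```python
import numpy as np

# Hypothetical marker ladder: band migration distance in pixels
# versus known fragment size in base pairs.
marker_dist = np.array([100.0, 150.0, 210.0, 280.0, 360.0])
marker_bp = np.array([2000.0, 1500.0, 1000.0, 600.0, 300.0])

# log10(size) is roughly linear in migration distance, so fit a line
# in log-space on the marker bands and invert it for unknown bands.
a, b = np.polyfit(marker_dist, np.log10(marker_bp), 1)

def band_size(distance):
    """Estimated fragment size (bp) for a band at the given distance."""
    return 10 ** (a * distance + b)
```

Bands that migrated farther are assigned smaller sizes, and sizes between marker rungs are interpolated on the fitted line.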

  20. PC Software graphics tool for conceptual design of space/planetary electrical power systems

    Science.gov (United States)

    Truong, Long V.

    1995-01-01

    This paper describes the Decision Support System (DSS), a personal computer software graphics tool for designing conceptual space and/or planetary electrical power systems. By using the DSS, users can obtain desirable system design and operating parameters, such as system weight, electrical distribution efficiency, and bus power. With this tool, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. The DSS is a user-friendly, menu-driven tool with online help and a custom graphical user interface. An example design and results are illustrated for a typical space power system with multiple types of power sources, frequencies, energy storage systems, and loads.

  1. SOFTWARE TOOL FOR LASER CUTTING PROCESS CONTROL – SOLVING REAL INDUSTRIAL CASE STUDIES

    Directory of Open Access Journals (Sweden)

    Miloš Madić

    2016-08-01

    Full Text Available Laser cutting is one of the leading non-conventional machining technologies, with a wide spectrum of applications in modern industry. In order to exploit the advantages that this technology offers for contour cutting of materials, it is necessary to carefully select laser cutting conditions for each given workpiece material, thickness and desired cut quality. In other words, there is a need for process control of laser cutting. After a comprehensive analysis of the main laser cutting parameters and process performance characteristics, the application of the developed software tool “BRUTOMIZER” for off-line control of the CO2 laser cutting process for three different workpiece materials (mild steel, stainless steel and aluminum) is illustrated. Advantages and abilities of the developed software tool are also illustrated.

  2. RAVEN as a tool for dynamic probabilistic risk assessment: Software overview

    Energy Technology Data Exchange (ETDEWEB)

    Alfonsi, A.; Rabiti, C.; Mandelli, D.; Cogliati, J. J.; Kinoshita, R. A. [Idaho National Laboratory, 2525 Fremont Avenue, Idaho Falls, ID 83415 (United States)

    2013-07-01

    RAVEN is a software tool under development at the Idaho National Laboratory (INL) that acts as the control logic driver and post-processing tool for the newly developed Thermal-Hydraulic code RELAP-7. The scope of this paper is to show the software structure of RAVEN and its utilization in connection with RELAP-7. A short overview of the mathematical framework behind the code is presented along with its main capabilities such as on-line controlling/ monitoring and Monte-Carlo sampling. A demo of a Station Black Out PRA analysis of a simplified Pressurized Water Reactor (PWR) model is shown in order to demonstrate the Monte-Carlo and clustering capabilities. (authors)
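The Monte-Carlo sampling used in such a Station Black Out analysis can be illustrated with a toy calculation; the distributions and parameters below are invented for illustration and do not reflect RAVEN's API or the actual PWR model:

```python
import random

random.seed(1)  # reproducible sampling for this sketch

# Toy station-blackout-style sampling: compare sampled battery depletion
# time against sampled AC power recovery time over many histories.
N = 100_000
failures = 0
for _ in range(N):
    battery_life = random.gauss(4.0, 0.5)       # hours until DC power is lost
    ac_recovery = random.expovariate(1 / 3.0)   # hours to restore AC power
    if ac_recovery > battery_life:              # power back too late -> damage state
        failures += 1

p_damage = failures / N
```

The estimated probability converges on the true value as the number of histories grows; clustering, as mentioned above, can then group the sampled histories into characteristic accident progressions.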

  3. RAVEN AS A TOOL FOR DYNAMIC PROBABILISTIC RISK ASSESSMENT: SOFTWARE OVERVIEW

    Energy Technology Data Exchange (ETDEWEB)

    Alfonsi Andrea; Mandelli Diego; Rabiti Cristian; Joshua Cogliati; Robert Kinoshita

    2013-05-01

    RAVEN is a software tool under development at the Idaho National Laboratory (INL) that acts as the control logic driver and post-processing tool for the newly developed Thermal-Hydraulic code RELAP-7. The scope of this paper is to show the software structure of RAVEN and its utilization in connection with RELAP-7. A short overview of the mathematical framework behind the code is presented along with its main capabilities such as on-line controlling/monitoring and Monte-Carlo sampling. A demo of a Station Black Out PRA analysis of a simplified Pressurized Water Reactor (PWR) model is shown in order to demonstrate the Monte-Carlo and clustering capabilities.

  4. Development of computer-aided software engineering tool for sequential control of JT-60U

    Energy Technology Data Exchange (ETDEWEB)

    Shimono, M.; Akasaka, H.; Kurihara, K.; Kimura, T. [Japan Atomic Energy Research Inst., Naka, Ibaraki (Japan). Naka Fusion Research Establishment

    1995-12-31

    Discharge sequential control (DSC) is an essential control function for the intermittent and pulsed discharge operation of a tokamak device, so that many subsystems may work with each other in correct order and/or synchronously. In the development of the DSC program, block diagrams of the logical operations for sequential control are first illustrated in the design. Then the logical operators and I/Os involved in the block diagrams are compiled and converted to a particular form. Since the block diagrams of the sequential control amount to about 50 sheets in the case of the JT-60 upgrade tokamak (JT-60U) high power discharge, and the above development steps have so far been performed manually, a great effort has been required for program development. In order to remove inefficiency from such development processes, a computer-aided software engineering (CASE) tool has been developed on a UNIX workstation. This paper reports how the authors designed it for the development of the sequential control programs. The tool is composed of the following three tools: (1) an automatic drawing tool, (2) an editing tool, and (3) a trace tool. This CASE tool, an object-oriented programming tool with graphical formalism, can powerfully accelerate the development cycle for the sequential control function commonly associated with pulse discharge in a tokamak fusion device.

  5. Software tools for manipulating fe mesh, virtual surgery and post-processing

    OpenAIRE

    Milašinović Danko Z.; Cvjetković Vladimir M.; Böckler Dittmar; von Tengg-Kobligk Hendrik; Filipović Nenad D.

    2009-01-01

    This paper describes a set of software tools which we developed for the calculation of fluid flow through cardiovascular organs. Our tools work with medical data from a CT scanner, but could be used with any other 3D input data. For meshing we used a Tetgen tetrahedral mesh generator, as well as a mesh re-generator that we have developed for conversion of tetrahedral elements into bricks. After adequate meshing we used our PAKF solver for calculation of fluid flow. For human-friendly presenta...

  6. Software Tools for Emittance Measurement and Matching for 12 GeV CEBAF

    Energy Technology Data Exchange (ETDEWEB)

    Turner, Dennis L. [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States)

    2016-05-01

    This paper discusses model-driven setup of the Continuous Electron Beam Accelerator Facility (CEBAF) for the 12 GeV era, focusing on qsUtility, a set of software tools created to perform emittance measurements, analyze those measurements, and compute optics corrections based upon them. qsUtility was developed as a toolset to reduce machine configuration time and improve reproducibility by way of an accurate accelerator model, and to provide Operations staff with tools to measure and correct machine optics with little or no assistance from optics experts.

  7. A Case Study of Black-Box Testing for Embedded Software using Test Automation Tool

    OpenAIRE

    Changhyun Baek; Joongsoon Jang; Gihyun Jung; Kyunghee Choi; Seungkyu Park

    2007-01-01

    This research shows a case study of Black-Box testing for a Temperature Controller (TC), which is a typical embedded system. A test automation tool, TEST, was developed and several kinds of TCs were tested using the tool. We present a statistical analysis of the results of the automated testing and define the properties of the software bugs found in the embedded system. The main results of the study were the following: (a) a test case prioritization technique was needed because the rev...

  8. A Novel Software Tool to Generate Customer Needs for Effective Design of Online Shopping Websites

    Directory of Open Access Journals (Sweden)

    Ashish K. Sharma

    2016-03-01

    Full Text Available Effective design of online shopping websites is the need of the hour, as design plays a crucial role in the success of online shopping businesses. Recently, the use of Quality Function Deployment (QFD) has been reported for the design of online shopping websites. QFD is a customer-driven process that encompasses voluminous data gathered from customers through several techniques like personal interviews, focus groups, surveys, etc. This massive, unsorted and unstructured data must be transformed into a limited amount of structured information representing the actual Customer Needs (CNs), which are then utilized in subsequent stages of the QFD process. This can be achieved through brainstorming using techniques like the Affinity Process. However, integrating the Affinity Process within QFD is tedious and time consuming and cannot be dealt with manually. This generates a pressing need for a software tool to serve the purpose. Moreover, the research carried out so far has focused on QFD application post the generation of CNs. Also, the available QFD software packages lack the option to generate CNs from collected data. Thus, the paper aims to develop a novel software tool that integrates the Affinity Process with QFD to generate customers' needs for effective design of online shopping websites. The software system is developed using Visual Basic .NET (VB.Net) and integrates an MS-Access database.
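The Affinity Process groups raw customer statements into themes that become structured Customer Needs. A deliberately simplified keyword-bucketing sketch of that grouping idea (the keywords, statements, and logic are invented for illustration; the paper's actual algorithm is not given in this abstract):

```python
from collections import defaultdict

# Hypothetical theme keywords mapping raw wording to a candidate CN group.
themes = {"pay": "Payment", "checkout": "Payment",
          "search": "Navigation", "menu": "Navigation",
          "photo": "Product info", "description": "Product info"}

# Hypothetical raw customer statements from interviews/surveys.
statements = [
    "I want more payment options at checkout",
    "the search box is hard to find",
    "product photos are too small",
]

# Bucket each statement under the first theme whose keyword it contains.
groups = defaultdict(list)
for s in statements:
    theme = next((t for k, t in themes.items() if k in s.lower()), "Unsorted")
    groups[theme].append(s)
```

In the real Affinity Process the grouping is done by human brainstorming; a supporting tool records the groups and carries each group's label forward as one structured CN.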

  9. Formal testing of object-oriented software: from the method to the tool

    OpenAIRE

    Péraire, Cécile; Strohmeier, Alfred

    2005-01-01

    This thesis presents a method and a tool for test set selection, dedicated to object-oriented applications and based on formal specifications. Testing is one method to increase the quality of today’s extraordinary complex software. The aim is to find program errors with respect to given criteria of correctness. In the case of formal testing, the criterion of correctness is the formal specification of the tested application: program behaviors are compared to those required by the specification...

  10. Users' manual for the Hydroecological Integrity Assessment Process software (including the New Jersey Assessment Tools)

    Science.gov (United States)

    Henriksen, James A.; Heasley, John; Kennen, Jonathan G.; Nieswand, Steven

    2006-01-01

    This manual is a user’s guide to four computer software tools that have been developed for the Hydroecological Integrity Assessment Process. The Hydroecological Integrity Assessment Process recognizes that streamflow is strongly related to many critical physiochemical components of rivers, such as dissolved oxygen, channel geomorphology, and water temperature, and can be considered a “master variable” that limits the disturbance, abundance, and diversity of many aquatic plant and animal species.

  11. Open Source Platforms, Applications and Tools for Software-Defined Networking and 5G Research

    OpenAIRE

    Suomalainen, Lauri; Nikkhouy, Emad; Ding, Aaron Yi; Tarkoma, Sasu

    2014-01-01

    Software-Defined Networking (SDN) is a novel solution to network configuration and management. Its openness and programmability features have greatly motivated the open source communities, where numerous applications and tools are developed for various R&D purposes. Owing to the strengths of SDN, the upcoming 5th Generation mobile networks (5G) can also benefit from its modular and open design to innovate the network architecture and services. In this report, we present a survey of existing open ...

  12. NgsRelate: a software tool for estimating pairwise relatedness from next-generation sequencing data

    OpenAIRE

    Korneliussen, Thorfinn Sand; Moltke, Ida

    2015-01-01

    MOTIVATION: Pairwise relatedness estimation is important in many contexts such as disease mapping and population genetics. However, all existing estimation methods are based on called genotypes, which is not ideal for next-generation sequencing (NGS) data of low depth, from which genotypes cannot be called with high certainty. RESULTS: We present a software tool, NgsRelate, for estimating pairwise relatedness from NGS data. It provides maximum likelihood estimates that are based on genotype lik...

  13. GMFilter and SXTestPlate: software tools for improving the SNPlex™ genotyping system

    OpenAIRE

    Schreiber, Stefan; Wenz, Michael H; Teuber, Markus; Franke, Andre

    2009-01-01

    Abstract Background Genotyping of single-nucleotide polymorphisms (SNPs) is a fundamental technology in modern genetics. The SNPlex™ mid-throughput genotyping system (Applied Biosystems, Foster City, CA, USA) enables the multiplexed genotyping of up to 48 SNPs simultaneously in a single DNA sample. The high level of automation and the large amount of data produced in a high-throughput laboratory require advanced software tools for quality control and workflow management. Results We have devel...

  14. Ignominy: a Tool for Software Dependency and Metric Analysis with Examples from Large HEP Packages

    Institute of Scientific and Technical Information of China (English)

    Lassi A. Tuura; Lucas Taylor

    2001-01-01

    Ignominy is a tool developed in the CMS IGUANA project to analyse the structure of software systems. Its primary component is a dependency scanner that distills information into human-usable forms. It also includes several tools to visualise the collected data in the form of graphical views and numerical metrics. Ignominy was designed to adapt to almost any reasonable structure, and it has been used to analyse several large projects. The original purpose of Ignominy was to help us better ensure the quality of our own software, and in particular to warn us about possible structural problems early on. As part of this activity it is now used as a standard part of our release procedure; we also use it to evaluate and study the quality of external packages we plan to make use of. We describe what Ignominy can find out, and how it can be used to visualise and assess a software structure. We also discuss the inherent problems of the analysis as well as the different approaches to modularity the tool makes quite evident. The focus is the illustration of these issues through the analysis results for several sizable HEP software projects.

  15. User Driven Development of Software Tools for Open Data Discovery and Exploration

    Science.gov (United States)

    Schlobinski, Sascha; Keppel, Frank; Dihe, Pascal; Boot, Gerben; Falkenroth, Esa

    2016-04-01

    The use of open data in research faces challenges not restricted to inherent properties such as the quality and resolution of open data sets. Often open data is catalogued insufficiently or in a fragmented way. Software tools that support effective discovery, including the assessment of the data's appropriateness for research, have shortcomings such as the lack of essential functionalities like support for data provenance. We believe that one of the reasons is the neglect of real end-user requirements in the development process of such software tools. In the context of the FP7 Switch-On project we have pro-actively engaged the relevant user community to collaboratively develop a means to publish, find and bind open data relevant for hydrologic research. Implementing key concepts of data discovery and exploration, we have used state-of-the-art web technologies to provide an interactive software tool that is easy to use yet powerful enough to satisfy the data discovery and access requirements of the hydrological research community.

  16. The anatomy of E-Learning tools: Does software usability influence learning outcomes?

    Science.gov (United States)

    Van Nuland, Sonya E; Rogers, Kem A

    2016-07-01

    Reductions in laboratory hours have increased the popularity of commercial anatomy e-learning tools. It is critical to understand how the functionality of such tools can influence the mental effort required during the learning process, also known as cognitive load. Using dual-task methodology, two anatomical e-learning tools were examined to determine the effect of their design on cognitive load during two joint learning exercises. A.D.A.M. Interactive Anatomy is a simplistic, two-dimensional tool that presents like a textbook, whereas Netter's 3D Interactive Anatomy has a more complex three-dimensional usability that allows structures to be rotated. It was hypothesized that longer reaction times on an observation task would be associated with the more complex anatomical software (Netter's 3D Interactive Anatomy), indicating a higher cognitive load imposed by the anatomy software, which would result in lower post-test scores. Undergraduate anatomy students from Western University, Canada (n = 70) were assessed using a baseline knowledge test, Stroop observation task response times (a measure of cognitive load), mental rotation test scores, and an anatomy post-test. Results showed that reaction times and post-test outcomes were similar for both tools, whereas mental rotation test scores were positively correlated with post-test values when students used Netter's 3D Interactive Anatomy (P = 0.007), but not when they used A.D.A.M. Interactive Anatomy. This suggests that a simple e-learning tool, such as A.D.A.M. Interactive Anatomy, is as effective as more complicated tools, such as Netter's 3D Interactive Anatomy, and does not academically disadvantage those with poor spatial ability. Anat Sci Educ 9: 378-390. © 2015 American Association of Anatomists.

  17. BYMUR software: a free and open source tool for quantifying and visualizing multi-risk analyses

    Science.gov (United States)

    Tonini, Roberto; Selva, Jacopo

    2013-04-01

    The BYMUR software aims to provide an easy-to-use open source tool for computing multi-risk and for managing, visualizing and comparing all the inputs (e.g. hazard, fragilities and exposure) as well as the corresponding results (e.g. risk curves, risk indexes). For all inputs, a complete management of inter-model epistemic uncertainty is considered. The BYMUR software will be one of the final products of the homonymous ByMuR project (http://bymur.bo.ingv.it/) funded by the Italian Ministry of Education, Universities and Research (MIUR), which is focused on (i) providing a quantitative and objective general method for a comprehensive long-term multi-risk analysis in a given area, accounting for inter-model epistemic uncertainty through Bayesian methodologies, and (ii) applying the methodology to seismic, volcanic and tsunami risks in Naples (Italy). More specifically, the BYMUR software will be able to separately account for the probabilistic hazard assessment of different kinds of hazardous phenomena, the relative (time-dependent/independent) vulnerabilities and exposure data, and their possible (predefined) interactions: the software will analyze these inputs and use them to estimate both the single- and multi-risk associated with a specific target area. In addition, it will be possible to connect the software to further tools (e.g., a full hazard analysis), allowing a dynamic I/O of results. The use of the Python programming language guarantees that the final software will be open source and platform independent. Moreover, thanks to the integration of some of the most popular and rich-featured Python scientific modules (NumPy, Matplotlib, SciPy) with the wxPython graphical user toolkit, the final tool will be equipped with a comprehensive Graphical User Interface (GUI) able to control and visualize (in the form of tables, maps and/or plots) any stage of the multi-risk analysis. The additional features of importing/exporting data in MySQL databases and/or standard XML formats (for

  18. Software tools of the Computis European project to process mass spectrometry images.

    Science.gov (United States)

    Robbe, Marie-France; Both, Jean-Pierre; Prideaux, Brendan; Klinkert, Ivo; Picaud, Vincent; Schramm, Thorsten; Hester, Alfons; Guevara, Victor; Stoeckli, Markus; Roempp, Andreas; Heeren, Ron M A; Spengler, Bernhard; Gala, Olivier; Haan, Serge

    2014-01-01

    Among the needs usually expressed by teams using mass spectrometry imaging, one that often arises is that for user-friendly software able to manage huge data volumes quickly and to provide efficient assistance for the interpretation of data. To answer this need, the Computis European project developed several complementary software tools to process mass spectrometry imaging data. Data Cube Explorer provides simple spatial and spectral exploration for matrix-assisted laser desorption/ionisation-time of flight (MALDI-ToF) and time of flight-secondary-ion mass spectrometry (ToF-SIMS) data. SpectViewer offers visualisation functions, assistance in the interpretation of data, classification functionalities, peak list extraction to interrogate biological databases, and image overlay, and it can process data from MALDI-ToF, ToF-SIMS and desorption electrospray ionisation (DESI) equipment. EasyReg2D is able to register two images, in American Standard Code for Information Interchange (ASCII) format, from different technologies. The collaboration between the teams was hampered by the multiplicity of equipment and data formats, so the project also developed a common data format (imzML) to facilitate the exchange of experimental data and their interpretation by the different software tools. The BioMap platform for visualisation and exploration of MALDI-ToF and DESI images was adapted to parse imzML files, enabling its access to all project partners and, more globally, to a larger community of users. Considering the huge advantages brought by the imzML standard format, a specific editor (vBrowser) for imzML files and converters from proprietary formats to imzML were developed to enable the use of the imzML format by a broad scientific community. This initiative paves the way toward the development of a large panel of software tools able to process mass spectrometry imaging datasets in the future.

  19. SOFI Simulation Tool: A Software Package for Simulating and Testing Super-Resolution Optical Fluctuation Imaging

    Science.gov (United States)

    Sharipov, Azat; Geissbuehler, Stefan; Leutenegger, Marcel; Vandenberg, Wim; Dedecker, Peter; Hofkens, Johan; Lasser, Theo

    2016-01-01

    Super-resolution optical fluctuation imaging (SOFI) allows one to perform sub-diffraction fluorescence microscopy of living cells. By analyzing the acquired image sequence with an advanced correlation method, i.e. a high-order cross-cumulant analysis, super-resolution in all three spatial dimensions can be achieved. Here we introduce a software tool for a simple qualitative comparison of SOFI images under simulated conditions considering parameters of the microscope setup and essential properties of the biological sample. This tool incorporates SOFI and STORM algorithms, displays and describes the SOFI image processing steps in a tutorial-like fashion. Fast testing of various parameters simplifies the parameter optimization prior to experimental work. The performance of the simulation tool is demonstrated by comparing simulated results with experimentally acquired data. PMID:27583365
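The cumulant analysis this abstract refers to can be illustrated with a toy calculation. The sketch below computes a second-order auto-cumulant (the variance of a pixel's intensity time trace), the simplest quantity SOFI builds on; the pixel traces and function name are illustrative, not taken from the described simulation tool.

```python
# Toy second-order SOFI cumulant: the variance of a pixel's intensity
# time trace. A fluctuating (blinking) emitter gives a large cumulant,
# while a constant background pixel with the same mean gives zero.
def second_order_cumulant(trace):
    mean = sum(trace) / len(trace)
    return sum((v - mean) ** 2 for v in trace) / len(trace)

blinking = [10, 0, 10, 0, 10, 0]  # hypothetical blinking emitter
steady = [5, 5, 5, 5, 5, 5]       # hypothetical constant background

contrast = second_order_cumulant(blinking)    # 25.0
background = second_order_cumulant(steady)    # 0.0
```

Both traces have the same mean intensity (5), yet only the fluctuating pixel survives the cumulant computation; this is the source of SOFI's contrast and background rejection.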

  20. Utilization of Software Tools for Uncertainty Calculation in Measurement Science Education

    International Nuclear Information System (INIS)

    Despite its importance, uncertainty is often neglected by practitioners in the design of systems, even in safety-critical applications. Thus, problems arising from uncertainty may only be identified late in the design process and lead to additional costs. Although numerous tools to support uncertainty calculation exist, reasons for their limited usage in early design phases may be low awareness of their existence and insufficient training in their practical application. We present a teaching philosophy that addresses uncertainty from the very beginning of teaching measurement science, in particular with respect to the utilization of software tools. The developed teaching material is based on the GUM method and makes use of uncertainty toolboxes in the simulation environment. Based on examples in measurement science education, we discuss advantages and disadvantages of the proposed teaching philosophy and include feedback from students
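As a minimal illustration of the GUM method referenced in this abstract, the sketch below propagates uncorrelated input uncertainties through a measurement model using numerically estimated sensitivity coefficients; the example model and numeric values are hypothetical and not drawn from the teaching material described.

```python
import math

def combined_uncertainty(f, x, u, h=1e-6):
    """GUM law of propagation for uncorrelated inputs:
    u_c(y)^2 = sum_i (df/dx_i * u(x_i))^2, with central-difference partials."""
    total = 0.0
    for i, (xi, ui) in enumerate(zip(x, u)):
        step = h * max(abs(xi), 1.0)
        up = list(x); up[i] = xi + step
        dn = list(x); dn[i] = xi - step
        ci = (f(*up) - f(*dn)) / (2 * step)  # sensitivity coefficient df/dx_i
        total += (ci * ui) ** 2
    return math.sqrt(total)

# Example model: dissipated power P = V^2 / R,
# with V = 10.0 +/- 0.1 V and R = 50.0 +/- 0.5 ohm (made-up values)
u_P = combined_uncertainty(lambda V, R: V**2 / R, [10.0, 50.0], [0.1, 0.5])
```

Analytically, u_c = sqrt((2V/R * u_V)^2 + (V^2/R^2 * u_R)^2) which is about 0.045 W for these values; the numeric sketch reproduces this.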

  1. TESPI (Tool for Environmental Sound Product Innovation): a simplified software tool to support environmentally conscious design in SMEs

    Science.gov (United States)

    Misceo, Monica; Buonamici, Roberto; Buttol, Patrizia; Naldesi, Luciano; Grimaldi, Filomena; Rinaldi, Caterina

    2004-12-01

    TESPI (Tool for Environmental Sound Product Innovation) is the prototype of a software tool developed within the framework of the "eLCA" project. The project (www.elca.enea.it), financed by the European Commission, is realising "On line green tools and services for Small and Medium sized Enterprises (SMEs)". The implementation by SMEs of environmental product innovation (as fostered by the European Integrated Product Policy, IPP) needs specific adaptation to their economic model, their knowledge of production and management processes, and their relationships with innovation and the environment. In particular, quality and costs are the main driving forces of innovation in European SMEs, and well-known barriers exist to the adoption of an environmental approach in product design. Starting from these considerations, the TESPI tool has been developed to support the first steps of product design taking into account both quality and the environment. Two main issues have been considered: (i) classic Quality Function Deployment (QFD) can hardly be proposed to SMEs; (ii) the environmental aspects of the product life cycle need to be integrated with the quality approach. TESPI is a user-friendly, web-based tool; it has a training approach and applies to modular products. Users are guided through the investigation of the quality aspects of their product (fulfilment of customer's needs and requirements) and the identification of the key environmental aspects in the product's life cycle. A simplified checklist allows analyzing the environmental performance of the product. Help is available for a better understanding of the analysis criteria. As a result, the significant aspects for the redesign of the product are identified.

  2. Emerging role of bioinformatics tools and software in evolution of clinical research.

    Science.gov (United States)

    Gill, Supreet Kaur; Christopher, Ajay Francis; Gupta, Vikas; Bansal, Parveen

    2016-01-01

    Clinical research makes tireless efforts for the promotion and wellbeing of the health status of the people. There is a rapid increase in the number and severity of diseases like cancer, hepatitis and HIV, resulting in high morbidity and mortality. Clinical research involves drug discovery and development, whereas clinical trials are performed to establish the safety and efficacy of drugs. Drug discovery is a long process starting with target identification, validation and lead optimization. This is followed by preclinical trials, intensive clinical trials and eventually post-marketing vigilance for drug safety. Software and bioinformatics tools play a great role not only in drug discovery but also in drug development. This involves the use of informatics in the development of new knowledge pertaining to health and disease, data management during clinical trials, and the use of clinical data for secondary research. In addition, new technologies like molecular docking, molecular dynamics simulation, proteomics and quantitative structure-activity relationships in clinical research result in a faster and easier drug discovery process. During preclinical trials, software is used for randomization to remove bias and to plan the study design. In clinical trials, software like electronic data capture, remote data capture and electronic case report forms (eCRF) is used to store the data. eClinical and Oracle Clinical are software used for clinical data management and for statistical analysis of the data. After the drug is marketed, its safety can be monitored by drug safety software like Oracle Argus or ARISg. Therefore, software is used from the very early stages of drug design, through drug development and clinical trials, and during pharmacovigilance. This review describes different aspects related to the application of computers and bioinformatics in drug designing, discovery and development, formulation designing and clinical research. PMID:27453827

  3. DVS-SOFTWARE: An Effective Tool for Applying Highly Parallelized Hardware To Computational Geophysics

    Science.gov (United States)

    Herrera, I.; Herrera, G. S.

    2015-12-01

    Most geophysical systems are macroscopic physical systems. The behavior prediction of such systems is carried out by means of computational models whose basic models are partial differential equations (PDEs) [1]. Due to the enormous size of the discretized version of such PDEs it is necessary to apply highly parallelized super-computers. For them, at present, the most efficient software is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of the present state-of-the-art techniques is due to the kind of discretizations used in them. Recently, I. Herrera and co-workers using 'non-overlapping discretizations' have produced the DVS-Software which overcomes this limitation [2]. The DVS-software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (90%, or so [3]). It is therefore very suitable for effectively applying the most advanced parallel supercomputers available at present. In a parallel talk, in this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MOD-FLOW. Key Words: Parallel Software for Geophysics, High Performance Computing, HPC, Parallel Computing, Domain Decomposition Methods (DDM)REFERENCES [1]. Herrera Ismael and George F. Pinder, Mathematical Modelling in Science and Engineering: An axiomatic approach", John Wiley, 243p., 2012. [2]. Herrera, I., de la Cruz L.M. and Rosas-Medina A. "Non Overlapping Discretization Methods for Partial, Differential Equations". NUMER METH PART D E, 30: 1427-1454, 2014, DOI 10.1002/num 21852. (Open source) [3]. Herrera, I., & Contreras Iván "An Innovative Tool for Effectively Applying Highly Parallelized Software To Problems of Elasticity". Geofísica Internacional, 2015 (In press)

  4. Emerging role of bioinformatics tools and software in evolution of clinical research

    Science.gov (United States)

    Gill, Supreet Kaur; Christopher, Ajay Francis; Gupta, Vikas; Bansal, Parveen

    2016-01-01

    Clinical research makes tireless efforts for the promotion and wellbeing of the health status of the people. There is a rapid increase in the number and severity of diseases like cancer, hepatitis and HIV, resulting in high morbidity and mortality. Clinical research involves drug discovery and development, whereas clinical trials are performed to establish the safety and efficacy of drugs. Drug discovery is a long process starting with target identification, validation and lead optimization. This is followed by preclinical trials, intensive clinical trials and eventually post-marketing vigilance for drug safety. Software and bioinformatics tools play a great role not only in drug discovery but also in drug development. This involves the use of informatics in the development of new knowledge pertaining to health and disease, data management during clinical trials, and the use of clinical data for secondary research. In addition, new technologies like molecular docking, molecular dynamics simulation, proteomics and quantitative structure-activity relationships in clinical research result in a faster and easier drug discovery process. During preclinical trials, software is used for randomization to remove bias and to plan the study design. In clinical trials, software like electronic data capture, remote data capture and electronic case report forms (eCRF) is used to store the data. eClinical and Oracle Clinical are software used for clinical data management and for statistical analysis of the data. After the drug is marketed, its safety can be monitored by drug safety software like Oracle Argus or ARISg. Therefore, software is used from the very early stages of drug design, through drug development and clinical trials, and during pharmacovigilance. This review describes different aspects related to the application of computers and bioinformatics in drug designing, discovery and development, formulation designing and clinical research.

  5. Emerging role of bioinformatics tools and software in evolution of clinical research

    Directory of Open Access Journals (Sweden)

    Supreet Kaur Gill

    2016-01-01

    Full Text Available Clinical research makes tireless efforts for the promotion and wellbeing of the health status of the people. There is a rapid increase in the number and severity of diseases like cancer, hepatitis and HIV, resulting in high morbidity and mortality. Clinical research involves drug discovery and development, whereas clinical trials are performed to establish the safety and efficacy of drugs. Drug discovery is a long process starting with target identification, validation and lead optimization. This is followed by preclinical trials, intensive clinical trials and eventually post-marketing vigilance for drug safety. Software and bioinformatics tools play a great role not only in drug discovery but also in drug development. This involves the use of informatics in the development of new knowledge pertaining to health and disease, data management during clinical trials, and the use of clinical data for secondary research. In addition, new technologies like molecular docking, molecular dynamics simulation, proteomics and quantitative structure-activity relationships in clinical research result in a faster and easier drug discovery process. During preclinical trials, software is used for randomization to remove bias and to plan the study design. In clinical trials, software like electronic data capture, remote data capture and electronic case report forms (eCRF) is used to store the data. eClinical and Oracle Clinical are software used for clinical data management and for statistical analysis of the data. After the drug is marketed, its safety can be monitored by drug safety software like Oracle Argus or ARISg. Therefore, software is used from the very early stages of drug design, through drug development and clinical trials, and during pharmacovigilance. This review describes different aspects related to the application of computers and bioinformatics in drug designing, discovery and development, formulation designing and clinical research.

  6. Software tool for analysing the family shopping basket without candidate generation

    Directory of Open Access Journals (Sweden)

    Roberto Carlos Naranjo Cuervo

    2010-05-01

    Full Text Available Tools for obtaining useful knowledge to support marketing decisions are currently needed in the e-commerce environment. This requires a process that uses a series of data-processing techniques; data mining is one such technique, enabling automatic information discovery. This work presents association rules as a suitable technique for discovering how customers buy from a company offering business-to-consumer (B2C) e-business, aimed at supporting decision-making in supplying its customers or capturing new ones. Many algorithms such as Apriori, DHP, Partition, FP-Growth and Eclat are available for implementing association rules; the following criteria were defined for selecting the appropriate algorithm: database insert, computational cost, performance and execution time. The development of a software tool is also presented, which followed the CRISP-DM approach; this software tool comprises the following four sub-modules: data pre-processing, data mining, results analysis and results application. The application design used three-layer architecture: presentation logic, business logic and service logic. Data warehouse design and algorithm design were included in developing this data-mining software tool. It was tested using a FoodMart company database; the tests included performance, functionality and results' validity, thereby allowing association rules to be found. The results led to the conclusion that using association rules as a data-mining technique facilitates analysing volumes of information for B2C e-business services, which represents a competitive advantage for those companies using the Internet as their sales medium.
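The support and confidence measures underlying algorithms such as Apriori and FP-Growth, mentioned in this abstract, can be sketched directly; the basket data below are made up for illustration and do not come from the FoodMart database the study used.

```python
# Support: fraction of transactions containing an itemset.
# Confidence: support(A and B) / support(A), for the rule A -> B.
def support(transactions, itemset):
    itemset = set(itemset)
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(transactions, antecedent, consequent):
    both = set(antecedent) | set(consequent)
    return support(transactions, both) / support(transactions, antecedent)

baskets = [{"bread", "milk"}, {"bread", "butter"},
           {"bread", "milk", "butter"}, {"milk"}]
s = support(baskets, {"bread", "milk"})       # 2 of 4 baskets -> 0.5
c = confidence(baskets, {"bread"}, {"milk"})  # 2 of 3 bread baskets -> 2/3
```

A rule such as "bread -> milk" is kept only if both measures clear user-chosen thresholds; the cited algorithms differ mainly in how efficiently they prune the search for such itemsets.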

  7. Creating a strategic plan for configuration management using computer aided software engineering (CASE) tools

    Energy Technology Data Exchange (ETDEWEB)

    Smith, P.R.; Sarfaty, R.

    1993-05-01

    This paper provides guidance in the definition, documentation, measurement, enhancement of processes, and validation of a strategic plan for configuration management (CM). The approach and methodology used in establishing a strategic plan are the same for any enterprise, including the Department of Energy (DOE), commercial nuclear plants, the Department of Defense (DOD), or large industrial complexes. The principles and techniques presented are used worldwide by some of the largest corporations. The authors used industry knowledge and the areas of their current employment to illustrate and provide examples. Developing a strategic configuration and information management plan for DOE Idaho Field Office (DOE-ID) facilities is discussed in this paper. A good knowledge of CM principles is the key to successful strategic planning. This paper describes and defines CM elements, and discusses how CM integrates a facility's physical configuration, design basis, and documentation. The strategic plan does not need the support of a computer aided software engineering (CASE) tool. However, the use of a CASE tool provides a methodology for consistency in approach, graphics, and database capability, combined to form an encyclopedia and a method of presentation that is easily understood and aids the process of reengineering. CASE tools have much more capability than that stated above. Some examples are supporting a joint application development (JAD) group to prepare a software functional specification document and, if necessary, providing the capability to automatically generate software application code. This paper briefly discusses the characteristics and capabilities of two CASE tools that use different methodologies to generate similar deliverables.

  8. A software tool of digital tomosynthesis application for patient positioning in radiotherapy.

    Science.gov (United States)

    Yan, Hui; Dai, Jian-Rong

    2016-01-01

    Digital tomosynthesis (DTS) is an imaging modality that reconstructs tomographic images from two-dimensional kV projections covering a narrow scan angle. Compared with conventional cone-beam CT (CBCT), it requires less time and radiation dose in data acquisition, so it is feasible to apply this technique to patient positioning in radiotherapy. To facilitate its clinical application, a software tool was developed, and the reconstruction processes were accelerated by a graphics processing unit (GPU). DTS application requires two reconstruction and two registration processes, unlike conventional CBCT application, which requires one image reconstruction process and one image registration process. The reconstruction stage consists of the production of two types of DTS. One type is reconstructed from cone-beam (CB) projections covering a narrow scan angle and is named onboard DTS (ODTS); it represents the real patient position in the treatment room. The other type is reconstructed from digitally reconstructed radiographs (DRRs) and is named reference DTS (RDTS); it represents the ideal patient position in the treatment room. Prior to the reconstruction of RDTS, the DRRs are reconstructed from the planning CT using the same acquisition setting as the CB projections. The registration stage consists of two matching processes between ODTS and RDTS. The target shifts in the lateral and longitudinal axes are obtained from the matching between ODTS and RDTS in the coronal view, while the target shifts in the longitudinal and vertical axes are obtained from the matching in the sagittal view. In this software, both the DRR and DTS reconstruction algorithms were implemented in GPU environments for acceleration. A comprehensive evaluation of this software tool was performed, including geometric accuracy, image quality, registration accuracy, and reconstruction efficiency. The average correlation coefficient between DRR/DTS generated by GPU-based algorithm
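A toy version of the matching step this abstract describes: find the integer shift that best aligns a reference profile with an onboard profile by maximizing their overlap correlation. The real tool registers two-dimensional DTS images in coronal and sagittal views; the 1-D profiles and function below are purely illustrative.

```python
# Brute-force 1-D registration: score every candidate integer shift by the
# correlation of overlapping samples and return the best-scoring shift.
def best_shift(ref, mov, max_shift=3):
    def score(s):
        return sum(ref[i] * mov[i - s]
                   for i in range(len(ref)) if 0 <= i - s < len(mov))
    return max(range(-max_shift, max_shift + 1), key=score)

reference = [0, 0, 1, 5, 1, 0, 0]  # ideal position (peak at index 3)
onboard = [0, 1, 5, 1, 0, 0, 0]    # actual position (peak at index 2)
shift = best_shift(reference, onboard)  # 1: target is off by one sample
```

In practice the same idea is applied per view to recover the lateral, longitudinal and vertical couch corrections, with sub-pixel refinement and a proper similarity metric.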

  9. Practical requirements for software tools to assist in the validation and verification of hybrid expert systems

    International Nuclear Information System (INIS)

    Any practical software development effort must remain focused on verification and validation of user requirements. Knowledge-based system development is no different in this regard. In industry today, most expert systems being produced are, in reality, hybrid software systems which, in addition to those components that provide the knowledge base and expert reasoning over the problem domain using various rule-based and object-oriented paradigms, incorporate significant bodies of code based on more traditional software techniques such as database management, graphical user interfaces, hypermedia, spreadsheets, as well as specially developed sequential code. Validation and verification of such hybrid systems must perforce integrate suitable methodologies from all such fields. This paper attempts to provide a broad overview of the practical requirements for methodologies and the concomitant groupware tools which would assist in such an enterprise. These methodologies and groupware tools would facilitate the teamwork efforts necessary to validate and verify all components of such hybrid systems by emphasizing cooperative recording of requirements and negotiated resolutions of any conflicts grounded in a solid understanding of the semantics of such a system

  10. A practical comparison of de novo genome assembly software tools for next-generation sequencing technologies.

    Directory of Open Access Journals (Sweden)

    Wenyu Zhang

    Full Text Available The advent of next-generation sequencing technologies has been accompanied by the development of many whole-genome sequence assembly methods and software, especially for de novo fragment assembly. Due to the poor knowledge about the applicability and performance of these software tools, choosing a befitting assembler becomes a tough task. Here, we provide information on the applicability of each program and, above all, compare the performance of eight distinct tools against eight groups of simulated datasets from the Solexa sequencing platform. Considering the computational time, maximum random access memory (RAM) occupancy, assembly accuracy and integrity, our study indicates that string-based assemblers and overlap-layout-consensus (OLC) assemblers are well-suited for very short reads and for longer reads of small genomes, respectively. For large datasets of more than a hundred million short reads, De Bruijn graph-based assemblers would be more appropriate. In terms of software implementation, string-based assemblers are superior to graph-based ones, of which SOAPdenovo requires a complex configuration file to be created. Our comparison study will assist researchers in selecting a well-suited assembler and offer essential information for the improvement of existing assemblers or the development of novel assemblers.
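The De Bruijn-graph approach this abstract compares can be shown in miniature: reads are split into k-mers, (k-1)-mer prefixes are linked to suffixes, and an unambiguous path through the graph spells out the contig. Real assemblers additionally handle sequencing errors, branches and reverse complements; the reads below are invented for the sketch.

```python
from collections import defaultdict

def de_bruijn(reads, k):
    # One edge prefix -> suffix per k-mer; a set removes duplicate edges.
    graph = defaultdict(set)
    for read in reads:
        for i in range(len(read) - k + 1):
            kmer = read[i:i + k]
            graph[kmer[:-1]].add(kmer[1:])
    return graph

def walk(graph, start):
    # Extend the contig while exactly one outgoing edge exists.
    contig, node = start, start
    while len(graph[node]) == 1:
        node = next(iter(graph[node]))
        contig += node[-1]
    return contig

reads = ["ATGGCG", "GGCGTA", "CGTACC"]
contig = walk(de_bruijn(reads, k=4), "ATG")  # "ATGGCGTACC"
```

The three overlapping reads collapse into a single contig because every interior node has exactly one successor; branching nodes, where real genomes get hard, would stop the walk.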

  11. GMFilter and SXTestPlate: software tools for improving the SNPlex™ genotyping system

    Science.gov (United States)

    Teuber, Markus; Wenz, Michael H; Schreiber, Stefan; Franke, Andre

    2009-01-01

    Background Genotyping of single-nucleotide polymorphisms (SNPs) is a fundamental technology in modern genetics. The SNPlex™ mid-throughput genotyping system (Applied Biosystems, Foster City, CA, USA) enables the multiplexed genotyping of up to 48 SNPs simultaneously in a single DNA sample. The high level of automation and the large amount of data produced in a high-throughput laboratory require advanced software tools for quality control and workflow management. Results We have developed two programs which address two main aspects of quality control in a SNPlex™ genotyping environment: GMFilter improves the analysis of SNPlex™ plates by removing wells with a low overall signal intensity. It enables scientists to automatically process the raw data in a standardized way before analyzing a plate with the proprietary GeneMapper software from Applied Biosystems. SXTestPlate examines the genotype concordance of a SNPlex™ test plate that was typed with a control SNP set. This program allows for regular quality control checks of a SNPlex™ genotyping platform and is also compatible with other genotyping methods. Conclusion GMFilter and SXTestPlate provide a valuable tool set for laboratories engaged in genotyping based on the SNPlex™ system. The programs enhance the analysis of SNPlex™ plates with the GeneMapper software and enable scientists to evaluate the performance of their genotyping platform. PMID:19267942
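    The well-removal step GMFilter performs can be sketched roughly as follows; the plate representation and the intensity threshold are illustrative assumptions, not the tool's actual data format:

    ```python
    def filter_wells(plate, min_intensity):
        """Partition plate wells into kept and removed sets based on their
        overall signal intensity, mimicking a pre-analysis filter step."""
        kept, removed = {}, {}
        for well, intensity in plate.items():
            (kept if intensity >= min_intensity else removed)[well] = intensity
        return kept, removed

    # Hypothetical plate: well id -> overall signal intensity.
    plate = {"A1": 1250.0, "A2": 90.0, "B1": 980.0, "B2": 45.0}
    kept, removed = filter_wells(plate, min_intensity=100.0)
    print(sorted(kept))     # ['A1', 'B1']
    print(sorted(removed))  # ['A2', 'B2']
    ```

    Standardizing this step before the plate reaches GeneMapper means every plate is screened by the same rule rather than by ad hoc visual inspection.
    
    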

  12. GMFilter and SXTestPlate: software tools for improving the SNPlex™ genotyping system

    Directory of Open Access Journals (Sweden)

    Schreiber Stefan

    2009-03-01

    Full Text Available Abstract Background Genotyping of single-nucleotide polymorphisms (SNPs) is a fundamental technology in modern genetics. The SNPlex™ mid-throughput genotyping system (Applied Biosystems, Foster City, CA, USA) enables the multiplexed genotyping of up to 48 SNPs simultaneously in a single DNA sample. The high level of automation and the large amount of data produced in a high-throughput laboratory require advanced software tools for quality control and workflow management. Results We have developed two programs which address two main aspects of quality control in a SNPlex™ genotyping environment: GMFilter improves the analysis of SNPlex™ plates by removing wells with a low overall signal intensity. It enables scientists to automatically process the raw data in a standardized way before analyzing a plate with the proprietary GeneMapper software from Applied Biosystems. SXTestPlate examines the genotype concordance of a SNPlex™ test plate that was typed with a control SNP set. This program allows for regular quality control checks of a SNPlex™ genotyping platform and is also compatible with other genotyping methods. Conclusion GMFilter and SXTestPlate provide a valuable tool set for laboratories engaged in genotyping based on the SNPlex™ system. The programs enhance the analysis of SNPlex™ plates with the GeneMapper software and enable scientists to evaluate the performance of their genotyping platform.

  13. Conception and validation software tools for the level 0 muon trigger of LHCb

    International Nuclear Information System (INIS)

    The Level-0 muon trigger processor of the LHCb experiment looks for straight particles crossing the muon detector and measures their transverse momentum. It processes 40×10⁶ proton-proton collisions per second. The tracking uses a road algorithm relying on the projectivity of the muon detector (the logical layout in the five muon stations is projective in y towards the interaction point, and is also projective in x when the bending in the horizontal direction introduced by the magnetic field is ignored). The architecture of the Level-0 muon trigger is complex, with a dense network of data interconnections. The design and validation of such an intricate system has only been possible with intense use of software tools for detector simulation, modelling of the hardware components' behaviour, and validation. A database describing the data flow is the cornerstone between the software and hardware components. (authors)
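    The projective road search described above can be illustrated with a toy sketch; the station geometry, hit representation and road width here are assumptions for illustration only, not the LHCb implementation:

    ```python
    def find_track(hits_per_station, seed_y, road_halfwidth):
        """Toy road algorithm: starting from a seed coordinate, keep in
        each successive station only hits whose projective y position
        falls inside a fixed road around the seed, and pick the closest."""
        track = []
        for station_hits in hits_per_station:
            in_road = [y for y in station_hits if abs(y - seed_y) <= road_halfwidth]
            if not in_road:
                return None  # road broken: no candidate in this station
            track.append(min(in_road, key=lambda y: abs(y - seed_y)))
        return track

    # Hypothetical projective hit positions in five muon stations.
    stations = [[0.9, 4.0], [1.1, -3.0], [1.0], [0.8, 7.5], [1.2]]
    print(find_track(stations, seed_y=1.0, road_halfwidth=0.5))
    # [0.9, 1.1, 1.0, 0.8, 1.2]
    ```

    A straight (high transverse momentum) muon leaves hits inside a narrow road across all stations, while random hits fail in at least one station and the candidate is rejected.
    
    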

  14. A software tool for teaching and training how to build and use a TOWS matrix

    Directory of Open Access Journals (Sweden)

    Amparo Mariño Ibáñez

    2010-05-01

    Full Text Available Strategic planning is currently used by most companies; it analyses current and expected future situations, determines company orientation and develops means or strategies for achieving stated missions. This article reviews general considerations in strategic planning and presents a computational tool designed for building a TOWS matrix, which matches a company's opportunities and threats with its weaknesses and, more especially, its strengths. The software development life cycle (SDLC) involved analysis, design, implementation and use. The literature on strategic planning and SWOT analysis was reviewed for the analysis phase. The software automates only one aspect of the whole strategic planning process and can be used to improve student and staff training in SWOT analysis. This type of work seeks to motivate interdisciplinary research.
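    The core of a TOWS matrix builder can be sketched in a few lines; the factor pairing below is a generic illustration of the matrix structure, not the article's actual implementation:

    ```python
    from itertools import product

    def tows_matrix(strengths, weaknesses, opportunities, threats):
        """Build the four TOWS quadrants by pairing internal factors
        (strengths/weaknesses) with external ones (opportunities/threats)."""
        return {
            "SO": list(product(strengths, opportunities)),  # maxi-maxi strategies
            "ST": list(product(strengths, threats)),        # maxi-mini strategies
            "WO": list(product(weaknesses, opportunities)), # mini-maxi strategies
            "WT": list(product(weaknesses, threats)),       # mini-mini strategies
        }

    # Hypothetical factors for a small worked example.
    m = tows_matrix(["strong brand"], ["high costs"],
                    ["new market"], ["new competitor"])
    print(m["SO"])  # [('strong brand', 'new market')]
    print(m["WT"])  # [('high costs', 'new competitor')]
    ```

    Each quadrant is the cross-product of one internal and one external factor list; the planner then turns each promising pair into a candidate strategy.
    
    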

  15. Techniques and tools for measuring energy efficiency of scientific software applications

    CERN Document Server

    Abdurachmanov, David; Eulisse, Giulio; Knight, Robert; Niemi, Tapio; Nurminen, Jukka K.; Nyback, Filip; Pestana, Goncalo; Ou, Zhonghong; Khan, Kashif

    2014-01-01

    The scale of scientific High Performance Computing (HPC) and High Throughput Computing (HTC) has increased significantly in recent years, and is becoming sensitive to total energy use and cost. Energy-efficiency has thus become an important concern in scientific fields such as High Energy Physics (HEP). There has been a growing interest in utilizing alternate architectures, such as low power ARM processors, to replace traditional Intel x86 architectures. Nevertheless, even though such solutions have been successfully used in mobile applications with low I/O and memory demands, it is unclear if they are suitable and more energy-efficient in the scientific computing environment. Furthermore, there is a lack of tools and experience to derive and compare power consumption between the architectures for various workloads, and eventually to support software optimizations for energy efficiency. To that end, we have performed several physical and software-based measurements of workloads from HEP applications running o...

  16. Software tools for manipulating fe mesh, virtual surgery and post-processing

    Directory of Open Access Journals (Sweden)

    Milašinović Danko Z.

    2009-01-01

    Full Text Available This paper describes a set of software tools which we developed for the calculation of fluid flow through cardiovascular organs. Our tools work with medical data from a CT scanner, but could be used with any other 3D input data. For meshing we used the TetGen tetrahedral mesh generator, as well as a mesh re-generator that we developed for converting tetrahedral elements into bricks. After adequate meshing we used our PAKF solver for the calculation of fluid flow. For human-friendly presentation of results we developed a set of post-processing software tools. By modifying the 2D mesh (the boundary of the cardiovascular organ) it is possible to perform virtual surgery; in the case of an aorta with an aneurysm, which we had received from the University Clinical Center in Heidelberg from a multi-slice 64-CT scanner, we removed the aneurysm and ran calculations on both geometrical models afterwards. The main idea of this methodology is to create a system that could be used in clinics.

  17. Analyst Tools and Quality Control Software for the ARM Data System

    Energy Technology Data Exchange (ETDEWEB)

    Moore, Sean; Hughes, Gary

    2008-07-31

    Mission Research develops analyst tools and automated quality control software in order to assist the Atmospheric Radiation Measurement (ARM) Data Quality Office with their data inspection tasks. We have developed web-based data analysis and visualization tools such as the interactive plotting program NCVweb, various diagnostic plot browsers, and a datastream processing status application. These tools allow even novice ARM researchers to be productive with ARM data with only minimal effort. We also contribute to the ARM Data Quality Office by analyzing ARM data streams, developing new quality control metrics, new diagnostic plots, and integrating this information into DQ HandS - the Data Quality Health and Status web-based explorer. We have developed several ways to detect outliers in ARM data streams and have written software to run in an automated fashion to flag these outliers. We have also embarked on a system to comprehensively generate long time-series plots, frequency distributions, and other relevant statistics for scientific and engineering data in most high-level, publicly available ARM data streams. Furthermore, frequency distributions categorized by month or by season are made available to help define valid data ranges specific to those time domains. These statistics can be used to set limits that when checked, will improve upon the reporting of suspicious data and the early detection of instrument malfunction. The statistics and proposed limits are stored in a database for easy reporting, refining, and for use by other processes. Web-based applications to view the results are also available.
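    One of the simpler outlier-detection ideas mentioned above, flagging samples that lie far from the statistics of a data-stream segment, can be sketched as follows (the deviation-based policy is an illustrative assumption, not the Data Quality Office's actual method):

    ```python
    import statistics

    def flag_outliers(values, n_sigma=3.0):
        """Return indices of samples lying more than n_sigma standard
        deviations from the mean of the stream segment."""
        mean = statistics.fmean(values)
        sd = statistics.pstdev(values)
        if sd == 0:
            return []
        return [i for i, v in enumerate(values) if abs(v - mean) > n_sigma * sd]

    # Hypothetical instrument stream with one suspicious spike.
    stream = [20.1, 19.8, 20.3, 20.0, 55.0, 19.9, 20.2]
    print(flag_outliers(stream, n_sigma=2.0))  # [4]
    ```

    In an automated pipeline such flags would feed a reporting database, and the valid-range limits themselves could come from the long-term monthly or seasonal frequency distributions the abstract describes.
    
    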

  18. New tools for digital medical image processing implemented in DIP software

    International Nuclear Information System (INIS)

    The anthropomorphic models used in computational dosimetry, also called phantoms, are mostly built from stacks of CT (computed tomography) or MRI (magnetic resonance imaging) images obtained from scans of patients or volunteers. The construction of voxel phantoms requires computational processing for transforming image formats, stacking two-dimensional (2D) images to form three-dimensional (3D) arrays, quantization, resampling, enhancement, restoration and image segmentation, among others. The computational dosimetry researcher rarely finds all these capabilities in a single software package, which often slows the research or forces inadequate use of alternative tools. The need to integrate the various tasks of digital image processing to obtain an image that can be used in a computational model of exposure led to the development of the DIP (Digital Image Processing) software. This software reads, writes and edits binary files containing the 3D matrix corresponding to a stack of cross-sectional images of a given geometry, which can be a human body or another volume of interest. It can also read any type of computer image and perform conversions. When a task produces a single output image, it is saved in the standard Windows JPEG format. When it involves a stack of images, the binary output file is called SGI (Interactive Graphic Simulations, a symbol already used in other publications of the Research Group in Numerical Dosimetry). This paper presents the third version of the DIP software and emphasizes its newly implemented tools. It currently has the menus Basics, Views, Spatial Domain, Frequency Domain, Segmentations and Study. Each menu contains items and subitems with features that generally take an image as input and produce an image or an attribute as output. (author)
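    The kind of binary 3D-matrix I/O that DIP performs can be sketched as follows; the header layout here is purely illustrative and is not the actual DIP or SGI file format:

    ```python
    import os
    import tempfile
    from array import array

    def write_volume(path, nx, ny, nz, voxels):
        """Write a 3D matrix as flat binary: a 3-integer dimension header
        followed by one unsigned byte per voxel (illustrative layout)."""
        with open(path, "wb") as f:
            array("i", [nx, ny, nz]).tofile(f)
            array("B", voxels).tofile(f)

    def read_volume(path):
        """Read the file back into its dimensions plus a flat voxel list."""
        with open(path, "rb") as f:
            header = array("i")
            header.fromfile(f, 3)
            nx, ny, nz = header
            voxels = array("B")
            voxels.fromfile(f, nx * ny * nz)
        return (nx, ny, nz), list(voxels)

    path = os.path.join(tempfile.gettempdir(), "vol.bin")
    write_volume(path, 2, 2, 2, [0, 1, 2, 3, 4, 5, 6, 7])
    dims, data = read_volume(path)
    print(dims, data)  # (2, 2, 2) [0, 1, 2, 3, 4, 5, 6, 7]
    ```

    With the stack in memory as a flat array plus dimensions, operations such as quantization or resampling become loops or slices over that array.
    
    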

  19. DSC: software tool for simulation-based design of control strategies applied to wastewater treatment plants.

    Science.gov (United States)

    Ruano, M V; Ribes, J; Seco, A; Ferrer, J

    2011-01-01

    This paper presents a computer tool called DSC (Simulation based Controllers Design) that enables the easy design of control systems and strategies applied to wastewater treatment plants (WWTPs). Although the control systems are developed and evaluated by simulation, the tool aims to facilitate direct implementation of the designed control system on the PC of the full-scale WWTP. The designed control system can be programmed in a dedicated control application and connected to either the simulation software or the SCADA of the plant. To this end, DSC incorporates an OPC (OLE for Process Control) server, which provides an open-standard communication protocol for different industrial process applications. The potential of the DSC tool is illustrated with a full-scale application: an aeration control system applied to a nutrient-removing WWTP was designed, tuned and evaluated with the DSC tool before its implementation in the full-scale plant. The control parameters obtained by simulation were suitable for the full-scale plant, with only a few modifications needed to improve the control performance. With the DSC tool, control system performance can easily be evaluated by simulation; once developed and tuned, the control systems can be applied directly to the full-scale WWTP.
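    A control loop of the kind tuned with DSC, for example a simple dissolved-oxygen aeration loop, can be sketched with a PI controller; the gains, plant model and variable names below are illustrative assumptions, not the paper's actual controller:

    ```python
    def pi_controller(setpoint, kp, ki, dt):
        """Return a stateful PI controller: maps a measured value to an
        actuator command (e.g. airflow for dissolved-oxygen control)."""
        integral = 0.0
        def step(measured):
            nonlocal integral
            error = setpoint - measured
            integral += error * dt
            return kp * error + ki * integral
        return step

    # Toy closed loop: the "plant" raises dissolved oxygen (DO) in
    # proportion to airflow and consumes some of it each step.
    control = pi_controller(setpoint=2.0, kp=1.5, ki=0.4, dt=1.0)
    do = 0.0
    for _ in range(30):
        airflow = max(0.0, control(do))
        do += 0.1 * airflow - 0.05 * do  # crude transfer + consumption
    print(round(do, 2))
    ```

    Tuning by simulation means iterating on kp and ki against a plant model like this before the controller, unchanged, talks to the real plant through the OPC link.
    
    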

  20. EVALUATION METRICS FOR WIRELESS SENSOR NETWORK SECURITY: ALGORITHMS REVIEW AND SOFTWARE TOOL

    Directory of Open Access Journals (Sweden)

    Qasem Abu Al-Haija

    2013-01-01

    Full Text Available Wireless Sensor Networks (WSNs) are currently receiving significant attention due to their potential impact on several real-life applications, such as military and home automation technology. The work in this study is a complementary part of previously discussed work. In this study, we propose a software tool to simulate and evaluate the six evaluation metrics presented for non-deterministic wireless sensor networks, which are: scalability, key connectivity, memory complexity, communication complexity, power consumption and confidentiality. The evaluation metrics were simulated and evaluated to help the network designer choose the best probabilistic security key management algorithm for a given randomly distributed sensor network.
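    The key-connectivity metric, for instance, has a closed form under random key predistribution; a sketch assuming an Eschenauer-Gligor style scheme (the paper's exact metric definitions may differ):

    ```python
    from math import comb

    def key_connectivity(pool_size, ring_size):
        """Probability that two sensor nodes share at least one key when
        each draws ring_size keys at random from a pool of pool_size."""
        return 1 - comb(pool_size - ring_size, ring_size) / comb(pool_size, ring_size)

    # Larger key rings raise connectivity but cost more memory per node,
    # which is exactly the kind of trade-off the metrics capture.
    print(round(key_connectivity(10000, 75), 3))
    print(round(key_connectivity(10000, 150), 3))
    ```

    The memory-complexity metric counts the stored keys per node, so a designer compares schemes by asking what connectivity each achieves for a given ring size.
    
    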

  1. Software Tool for Automated Failure Modes and Effects Analysis (FMEA) of Hydraulic Systems

    DEFF Research Database (Denmark)

    Stecki, J. S.; Conrad, Finn; Oh, B.

    2002-01-01

    Offshore, marine, aircraft and other complex engineering systems operate in harsh environmental and operational conditions and must meet stringent requirements of reliability, safety and maintainability. To reduce the high costs of development of new systems in these fields, improved design management techniques and a vast array of computer-aided techniques are applied during the design and testing stages. The paper presents and discusses the research and development of a software tool for automated failure mode and effects analysis (FMEA) of hydraulic systems. The paper explains the underlying ...

  2. Computation of Internal Fluid Flows in Channels Using the CFD Software Tool FlowVision

    CERN Document Server

    Kochevsky, A N

    2004-01-01

    The article describes the CFD software tool FlowVision (OOO "Tesis", Moscow). The model equations used for this research are the set of Reynolds and continuity equations and the equations of the standard k-ε turbulence model. The aim of the paper was to test FlowVision by comparing the computational results for a number of simple internal channel fluid flows with known experimental data. The test cases are non-swirling and swirling flows in pipes and diffusers, and flows in stationary and rotating bends. Satisfactory correspondence of results was obtained both for flow patterns and for the respective quantitative values.

  3. Virtual Power Electronics: Novel Software Tools for Design, Modeling and Education

    Science.gov (United States)

    Hamar, Janos; Nagy, István; Funato, Hirohito; Ogasawara, Satoshi; Dranga, Octavian; Nishida, Yasuyuki

    This paper presents browser-based, multimedia-rich software tools and an e-learning curriculum that support the design and modeling of power electronics circuits and explain sometimes rather sophisticated phenomena. Two projects are discussed. The Inetele project is financed by the Leonardo da Vinci programme of the European Union (EU); it is a collaborative project between numerous EU universities and institutes to develop a state-of-the-art curriculum in electrical engineering. Another cooperative project, with the participation of Japanese, European and Australian institutes, focuses especially on developing e-learning curriculum and interactive design and modeling tools, as well as a virtual laboratory. Snapshots from these two projects are presented.

  4. NEuronMOrphological analysis tool: open-source software for quantitative morphometrics

    Directory of Open Access Journals (Sweden)

    Lucia eBilleci

    2013-02-01

    Full Text Available Morphometric analysis of neurons and brain tissue is relevant to the study of neuron circuitry development during the first phases of brain growth or for probing the link between microstructural morphology and degenerative diseases. As neural imaging techniques become ever more sophisticated, so does the amount and complexity of data generated. The NEuronMOrphological analysis tool NEMO was purposely developed to handle and process large numbers of optical microscopy image files of neurons in culture or slices, in order to automatically run batch routines, store data and apply multivariate classification and feature extraction using 3-way principal component analysis. Here we describe the software's main features, underlining the differences between NEMO and other commercial and non-commercial image processing tools, and show an example of how NEMO can be used to classify neurons from wild-type mice and from animal models of autism.
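    Batch extraction of simple morphometric features can be sketched as follows; the segment-based data model and the chosen features are illustrative assumptions, not NEMO's actual pipeline:

    ```python
    from math import dist

    def morphometrics(segments):
        """Compute simple neuron morphometric features from a list of
        segments, each a pair of (x, y) endpoints (illustrative model)."""
        total_length = sum(dist(a, b) for a, b in segments)
        endpoints = [p for seg in segments for p in seg]
        # A point where three or more segments meet is a branch point.
        branch_points = {p for p in endpoints if endpoints.count(p) >= 3}
        return {"total_length": total_length,
                "n_branch_points": len(branch_points)}

    # Hypothetical traced neuron: three neurites leaving a point at (0, 0).
    tree = [((0, 0), (3, 4)), ((0, 0), (-3, 4)), ((0, 0), (0, -5))]
    print(morphometrics(tree))  # {'total_length': 15.0, 'n_branch_points': 1}
    ```

    Feature vectors like these, computed per image across a large batch, are the kind of input a multivariate classifier or principal component analysis would then operate on.
    
    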

  5. The impact of software and CAE tools on SEU in field programmable gate arrays

    International Nuclear Information System (INIS)

    Field programmable gate array (FPGA) devices, heavily used in spacecraft electronics, have grown substantially in size over the past few years, causing designers to work at a higher conceptual level, with computer aided engineering (CAE) tools synthesizing and optimizing the logic from a description. It is shown that the use of commercial-off-the-shelf (COTS) CAE tools can produce unreliable circuit designs when the device is used in a radiation environment and a flip-flop is upset. At a lower level, software can be used to improve the SEU performance of a flip-flop, exploiting the configurable nature of FPGA technology and on-chip delay, parasitic resistive, and capacitive circuit elements

  6. Software Tool Support to Specify and Verify Scientific Sensor Data Properties to Improve Anomaly Detection

    Science.gov (United States)

    Gallegos, I.; Gates, A. Q.; Tweedie, C.; Cybershare

    2010-12-01

    Advancements in scientific sensor data acquisition technologies, such as wireless sensor networks and robotic trams equipped with sensors, are increasing the amount of data being collected at field sites. This elevates the challenges of verifying the quality of streamed data and monitoring the correct operation of the instrumentation. Without the ability to evaluate the data collection process in near real-time, scientists can lose valuable time and data. In addition, scientists have to rely on their knowledge and experience in the field to evaluate data quality. Such knowledge is rarely shared or reused by other scientists, mostly because of the lack of a well-defined methodology and tool support. Numerous scientific projects address anomaly detection, mostly as part of the verification system's source code; however, anomaly detection properties that are embedded or hard-coded in the source code are difficult to refine. In addition, a software developer is required to modify the source code every time a new anomaly detection property, or a modification to an existing one, is needed. This poster describes tool support, based on software engineering techniques, that has been developed to address these challenges. The overall tool support allows scientists to specify and reuse anomaly detection properties generated using the specification tool and to use the specified properties to conduct automated anomaly detection in near real-time. The anomaly-detection mechanism is independent of the system used to collect the sensor data. Guided by a classification and categorization of anomaly-detection properties, the user specifies properties on scientific sensor data. The properties, which can be associated with particular field sites or instrumentation, document knowledge about data anomalies that would otherwise have limited availability to the scientific community.
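    Declarative, reusable anomaly-detection properties of the kind described can be sketched as data rather than hard-coded checks; the property names, bounds and step check below are illustrative assumptions:

    ```python
    def make_property(name, min_value=None, max_value=None, max_step=None):
        """Declarative anomaly-detection property for a sensor stream:
        range bounds plus an optional maximum jump between samples."""
        def check(samples):
            anomalies = []
            for i, v in enumerate(samples):
                if min_value is not None and v < min_value:
                    anomalies.append((i, name, "below minimum"))
                if max_value is not None and v > max_value:
                    anomalies.append((i, name, "above maximum"))
                if max_step is not None and i > 0 and abs(v - samples[i - 1]) > max_step:
                    anomalies.append((i, name, "step too large"))
            return anomalies
        return check

    # Hypothetical property for an air-temperature sensor at one site.
    air_temp = make_property("air_temperature", min_value=-40,
                             max_value=60, max_step=5)
    print(air_temp([21.0, 22.5, 80.0, 23.0]))
    ```

    Because the property is a value, not code woven into the acquisition system, it can be associated with a site or instrument, shared, and refined without touching the collector's source.
    
    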

  7. The Comprehensive Evaluation of Electronic Learning Tools and Educational Software (CEELTES)

    Directory of Open Access Journals (Sweden)

    Štefan KAROLČÍK

    2015-10-01

    Full Text Available Despite the fact that digital technologies are used more and more in the learning and education process, there is still a lack of professional evaluation tools capable of assessing the quality of digital teaching aids in a comprehensive and objective manner. Construction of the Comprehensive Evaluation of Electronic Learning Tools and Educational Software (CEELTES) tool was preceded by several surveys and by knowledge obtained in the course of creating digital learning and teaching aids and implementing them in the teaching process. The evaluation tool consists of sets (catalogues) of criteria divided into four separately assessed areas: the area of technical, technological and user attributes; the area of criteria evaluating content, operation, and information structuring and processing; the area of criteria evaluating information processing in terms of learning, recognition and education needs; and, finally, the area of criteria evaluating the psychological and pedagogical aspects of a digital product. Each area is assessed independently by a specialist in the relevant science discipline. The final evaluation of the assessed digital product quantifies the overall appropriateness of including a particular digital teaching aid in the teaching process.

  8. Acts -- A collection of high performing software tools for scientific computing

    Energy Technology Data Exchange (ETDEWEB)

    Drummond, L.A.; Marques, O.A.

    2002-11-01

    During the past decades there has been continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Further, many new discoveries depend on high performance computer simulations to satisfy their demands for large computational resources and short response time. The Advanced CompuTational Software (ACTS) Collection brings together a number of general-purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Collection promotes code portability, reusability, reduction of duplicated effort, and tool maturity. This paper presents a brief introduction to the functionality available in ACTS. It also highlights the tools that are in demand by climate and weather modelers.

  9. Software tools for quantification of X-ray microtomography at the UGCT

    Energy Technology Data Exchange (ETDEWEB)

    Vlassenbroeck, J. [Department of Subatomic and Radiation Physics, Ghent University, Proeftuinstraat 86, B-9000 Gent (Belgium)], E-mail: jelle.vlassenbroeck@ugent.be; Dierick, M.; Masschaele, B. [Department of Subatomic and Radiation Physics, Ghent University, Proeftuinstraat 86, B-9000 Gent (Belgium); Cnudde, V. [Department of Geology and Soil Science, Ghent University, Krijgslaan 281/S8, B-9000 Gent (Belgium); Van Hoorebeke, L. [Department of Subatomic and Radiation Physics, Ghent University, Proeftuinstraat 86, B-9000 Gent (Belgium); Jacobs, P. [Department of Geology and Soil Science, Ghent University, Krijgslaan 281/S8, B-9000 Gent (Belgium)

    2007-09-21

    The technique of X-ray microtomography using X-ray tube radiation offers an interesting tool for the non-destructive investigation of a wide range of materials. A major challenge lies in the analysis and quantification of the resulting data, allowing for a full characterization of the sample under investigation. In this paper, we discuss the software tools for reconstruction and analysis of tomographic data that are being developed at the UGCT. The tomographic reconstruction is performed using Octopus, a high-performance and user-friendly software package. The reconstruction process transforms the raw acquisition data into a stack of 2D cross-sections through the sample, resulting in a 3D data set. A number of artifact and noise reduction algorithms are integrated to reduce ring artifacts, beam hardening artifacts, COR misalignment, detector or stage tilt, pixel non-linearities, etc. These corrections are very important to facilitate the analysis of the 3D data. The analysis of the 3D data focuses primarily on the characterization of pore structures, but will be extended to other applications. A first package for the analysis of pore structures in three dimensions was developed under Matlab. A new package, called Morpho+, is being developed in a C++ environment, with optimizations and extensions of the previously used algorithms. The current status of this project will be discussed. Examples of pore analysis can be found in pharmaceuticals, material science, geology and numerous other fields.

  10. Data Analysis Software Tools for Enhanced Collaboration at the DIII-D National Fusion Facility

    International Nuclear Information System (INIS)

    Data analysis at the DIII-D National Fusion Facility is simplified by the use of two software packages in analysis codes. The first is GAPlotObj, an IDL-based object-oriented library used in visualization tools for dynamic plotting. GAPlotObj gives users the ability to manipulate graphs directly through mouse- and keyboard-driven commands. The second software package is MDSplus, which is used at DIII-D as a central repository for analyzed data. GAPlotObj and MDSplus reduce the effort required for a collaborator to become familiar with the DIII-D analysis environment by providing uniform interfaces for data display and retrieval. Two visualization tools at DIII-D that benefit from them are ReviewPlus and EFITviewer. ReviewPlus is capable of displaying interactive 2D and 3D graphs of raw, analyzed, and simulation code data. EFITviewer is used to display results from the EFIT analysis code together with kinetic profiles and machine geometry. Both bring new possibilities for data exploration to the user, and are able to plot data from any fusion research site with an MDSplus data server.

  11. ConfocalCheck--a software tool for the automated monitoring of confocal microscope performance.

    Directory of Open Access Journals (Sweden)

    Keng Imm Hng

    Full Text Available Laser scanning confocal microscopy has become an invaluable tool in biomedical research, but regular quality testing is vital to maintain the system's performance for diagnostic and research purposes. Although many methods have been devised over the years to characterise specific aspects of a confocal microscope, such as measuring the optical point spread function or the field illumination, only very few analysis tools are available. Our aim was to develop a comprehensive quality assurance framework ranging from image acquisition to automated analysis and documentation. We created standardised test data to assess the performance of the lasers, the objective lenses and other key components required for optimum confocal operation. The ConfocalCheck software presented here analyses the data fully automatically. It creates numerous visual outputs indicating potential issues requiring further investigation. By storing results in a web-browser-compatible file format, the software greatly simplifies record keeping, allowing the operator to quickly compare old and new data and to spot developing trends. We demonstrate that systematic monitoring of confocal performance is essential in a core facility environment, and show how the quantitative measurements obtained can be used for the detailed characterisation of system components as well as for comparisons across multiple instruments.

  12. Proofreading Using an Assistive Software Homophone Tool: Compensatory and Remedial Effects on the Literacy Skills of Students with Reading Difficulties

    Science.gov (United States)

    Lange, Alissa A.; Mulhern, Gerry; Wylie, Judith

    2009-01-01

    The present study investigated the effects of using an assistive software homophone tool on the assisted proofreading performance and unassisted basic skills of secondary-level students with reading difficulties. Students aged 13 to 15 years proofread passages for homophonic errors under three conditions: with the homophone tool, with homophones…

  13. Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model.

    Directory of Open Access Journals (Sweden)

    Nadja Damij

    Full Text Available The omnipresent need for optimisation requires constant improvements of companies' business processes (BPs). Minimising the risk of an inappropriate BP being implemented is usually done by simulating the newly developed BP under various initial conditions and "what-if" scenarios. An effectual business process simulation software (BPSS) tool is a prerequisite for accurate analysis of a BP. Characterisation of a BPSS tool is a challenging task due to the complex selection criteria, which include the quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for ranking BPSS tools based on their technical characteristics, employing DEX and qualitative-to-quantitative (QQ) methodology. Consequently, the decision expert feeds in the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible for adding new criteria to the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with currently available results.
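    The hierarchical aggregation at the heart of such a decision model can be sketched with a recursive weighted score; the hierarchy, weights and criteria below are illustrative, not the paper's actual DEX/QQ model:

    ```python
    def score(node, ratings):
        """Recursively score a criteria hierarchy: leaves are looked up
        in the ratings dict, inner nodes aggregate weighted child scores."""
        if isinstance(node, str):  # leaf criterion
            return ratings[node]
        return sum(weight * score(child, ratings) for weight, child in node)

    # Hypothetical BPSS criteria tree: (weight, subtree-or-leaf) pairs.
    hierarchy = [
        (0.5, [(0.6, "simulation capabilities"),
               (0.4, "statistical facilities")]),
        (0.3, "visual quality"),
        (0.2, "reporting"),
    ]
    # Expert ratings for one candidate tool on a 1-5 scale.
    tool_a = {"simulation capabilities": 4, "statistical facilities": 3,
              "visual quality": 5, "reporting": 2}
    print(round(score(hierarchy, tool_a), 2))  # 3.7
    ```

    Ranking then amounts to scoring each candidate tool against the same hierarchy and sorting; adding a new criterion is just another (weight, leaf) entry, which mirrors the extensibility claim above.
    
    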

  14. Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model.

    Science.gov (United States)

    Damij, Nadja; Boškoski, Pavle; Bohanec, Marko; Mileva Boshkoska, Biljana

    2016-01-01

    The omnipresent need for optimisation requires constant improvements of companies' business processes (BPs). Minimising the risk of an inappropriate BP being implemented is usually done by simulating the newly developed BP under various initial conditions and "what-if" scenarios. An effectual business process simulation software (BPSS) tool is a prerequisite for accurate analysis of a BP. Characterisation of a BPSS tool is a challenging task due to the complex selection criteria, which include the quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for ranking BPSS tools based on their technical characteristics, employing DEX and qualitative-to-quantitative (QQ) methodology. Consequently, the decision expert feeds in the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible for adding new criteria to the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with currently available results.

  16. Establishing a Web-based DICOM teaching file authoring tool using open-source public software.

    Science.gov (United States)

    Lee, Wen-Jeng; Yang, Chung-Yi; Liu, Kao-Lang; Liu, Hon-Man; Ching, Yu-Tai; Chen, Shyh-Jye

    2005-09-01

    Online teaching files are an important source of educational and referential materials in the radiology community. The Digital Imaging and Communications in Medicine (DICOM) file format commonly used in the radiology community is not natively supported by common Web browsers. The ability of the Web server to convert and parse DICOM is important when DICOM-converting tools are not available. In this paper, we describe our approach to developing a Web-based teaching file authoring tool. Our server is built using the Apache Web server running on the FreeBSD operating system. The dynamic page content is produced by Hypertext Preprocessor (PHP). Digital Imaging and Communications in Medicine images are converted by ImageMagick into Joint Photographic Experts Group (JPEG) format. Digital Imaging and Communications in Medicine attributes are parsed by dicom3tools and stored in a PostgreSQL database. Using free software available from the Internet, we have built a Web service that allows radiologists to create their own online teaching file cases with a common Web browser. PMID:15924271

  18. Quality-driven multi-objective optimization of software architecture design : method, tool, and application

    NARCIS (Netherlands)

    Etemadi Idgahi (Etemaadi), Ramin

    2014-01-01

    Software architecting is a non-trivial and demanding task for software engineers to perform. The architecture is a key enabler for software systems. Besides being crucial for user functionality, the software architecture has a deep impact on software qualities such as performance, safety, and cost.

  19. A software tool to evaluate crystal types and morphological developments of accessory zircon

    Science.gov (United States)

    Sturm, Robert

    2014-08-01

    Computer programs for an appropriate visualization of crystal types and morphological developments of accessory zircon have hitherto not been available. Usually, typological computations are conducted using simple calculation tools or spreadsheet programs. In practice, however, high numbers of data sets containing information on numerous zircon populations have to be processed and stored. This paper describes the software ZIRCTYP, a macro-driven program within the Microsoft Access database management system. It allows the computation of zircon morphologies occurring in specific rock samples and their presentation in typology diagrams. In addition, morphological developments within a given zircon population are presented (1) statistically and (2) graphically, as crystal sequences showing initial, intermediate, and final growth stages.

  20. CALDoseX: a software tool for absorbed dose calculations in diagnostic radiology

    International Nuclear Information System (INIS)

    Conversion coefficients (CCs) between absorbed dose to organs and tissues at risk and measurable quantities commonly used in X-ray diagnosis have been calculated for the last 30 years mostly with mathematical MIRD5-type phantoms, in which organs are represented by simple geometrical bodies, like ellipsoids, tori, truncated cylinders, etc. In contrast, voxel-based phantoms are true-to-nature representations of human bodies. The purpose of this study is therefore to calculate CCs for common examinations in X-ray diagnosis with the recently developed MAX06 (Male Adult voXel) and FAX06 (Female Adult voXel) phantoms for various projections and different X-ray spectra, and to make these CCs available to the public through a software tool called CALDoseX (CALculation of Dose for X-ray diagnosis). (author)

  1. A TAXONOMY FOR TOOLS, PROCESSES AND LANGUAGES IN AUTOMOTIVE SOFTWARE ENGINEERING

    Directory of Open Access Journals (Sweden)

    Florian Bock

    2016-01-01

    Full Text Available Within the growing domain of software engineering in the automotive sector, the number of tools, processes, methods and languages in use has increased distinctly in the past years. To be able to choose proper methods for particular development use cases, factors like the intended use, key features and possible limitations have to be evaluated. This requires a taxonomy that aids the decision making. An analysis of the main existing taxonomies revealed two major deficiencies: the lack of an automotive focus and the limitation to particular engineering method types. To address this, a graphical taxonomy is proposed, based on two well-established engineering approaches and enriched with additional classification information. It provides a self-evident and self-explanatory overview and comparison technique for engineering methods in the automotive domain. The taxonomy is applied to common automotive engineering methods. The resulting diagram classifies each method and enables the reader to select appropriate solutions for given project requirements.

  2. Development and validation of evolutionary algorithm software as an optimization tool for biological and environmental applications.

    Science.gov (United States)

    Sys, K; Boon, N; Verstraete, W

    2004-06-01

    A flexible, extendable tool for the optimization of (micro)biological processes and protocols using evolutionary algorithms was developed. It has been tested on three theoretical optimization problems: two two-dimensional problems (one with three maxima, one with five) and a river autopurification optimization problem with boundary conditions. For each problem, different evolutionary parameter settings were used for the optimization. For each combination of evolutionary parameters, 15 generations were run 20 times. In all cases, the evolutionary algorithm produced valuable results. Generally, the algorithms were able to detect the more stable sub-maximum even when less stable maxima existed; from a practical point of view, the more stable optimum is generally the more desirable one. The most important factors influencing the convergence process were the parameter-value randomization rate and distribution. The software described in this work is available for free.
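
The loop described above (random initialization, selection, repeated parameter-value randomization over a fixed number of generations) can be sketched in a few lines of Python. This is a generic illustration, not the authors' implementation; the test landscape, with one global and one weaker maximum, is invented for the example.

```python
import math
import random

def evolve(fitness, bounds, pop_size=30, generations=15,
           mutation_rate=0.3, mutation_scale=0.5, seed=1):
    """Minimal evolutionary algorithm maximizing `fitness` over the box `bounds`."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(pop, key=fitness, reverse=True)
        parents = ranked[:pop_size // 2]            # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            child = list(rng.choice(parents))
            for i, (lo, hi) in enumerate(bounds):
                if rng.random() < mutation_rate:    # the randomization rate
                    child[i] = min(max(child[i] + rng.gauss(0, mutation_scale), lo), hi)
            children.append(child)
        pop = parents + children                    # parents survive (elitism)
    return max(pop, key=fitness)

# Invented 2D landscape: global maximum 1.0 at (0, 0), weaker maximum 0.5 at (3, 3).
def landscape(p):
    x, y = p
    return math.exp(-(x * x + y * y)) + 0.5 * math.exp(-((x - 3) ** 2 + (y - 3) ** 2))

best = evolve(landscape, [(-5.0, 5.0), (-5.0, 5.0)])
```

Because parents survive unchanged, the best fitness never decreases across generations; `mutation_rate` and `mutation_scale` play the role of the randomization rate and distribution that the abstract identifies as the dominant influences on convergence.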

  3. Automated tools and techniques for distributed Grid Software Development of the testbed infrastructure

    CERN Document Server

    Aguado Sanchez, C

    2007-01-01

    Grid technology is becoming more and more important as the new paradigm for sharing computational resources across different organizations in a secure way. The great power of this solution requires the definition of a generic stack of services and protocols, and this is the scope of the different Grid initiatives. As a result of international collaborations on its development, the Open Grid Forum created the Open Grid Services Architecture (OGSA), which aims to define the common set of services that will enable interoperability across the different implementations. This master thesis has been developed in this framework, as part of the two European-funded projects ETICS and OMII-Europe. The main objective is to contribute to the design and maintenance of large distributed development projects with automated tools that enable the implementation of software engineering techniques oriented towards achieving an acceptable level of quality in the release process. Specifically, this thesis develops the testbed concept a...

  4. Quality-driven multi-objective optimization of software architecture design: method, tool, and application

    OpenAIRE

    Etemadi Idgahi (Etemaadi), Ramin

    2014-01-01

    Software architecting is a non-trivial and demanding task for software engineers to perform. The architecture is a key enabler for software systems. Besides being crucial for user functionality, the software architecture has a deep impact on software qualities such as performance, safety, and cost. In this dissertation, an automated approach for software architecture design is proposed that supports analysis and optimization of multiple quality attributes: First of all, we demonstrate an optimi...

  5. Development of software tools for supporting building clearance and site release at UKAEA

    International Nuclear Information System (INIS)

    UKAEA sites generally have complex histories and have been subject to a diverse range of nuclear operations. Most of the nuclear reactors, laboratories, workshops and other support facilities are now redundant, and a programme of decommissioning works in accordance with IAEA guidance is in progress. Decommissioning is being carried out in phases, with post-operative activities, care and maintenance, and care and surveillance periods between stages to allow relatively short-lived radioactivity to decay. This reduces dose levels to personnel and minimises radioactive waste production. Following these stages is an end-point phase, which corresponds to the point at which the risks to human health and the environment are sufficiently low that the buildings / land can be released for future use. Unconditional release corresponds to meeting the requirement for 'de-licensing'. Although reaching a de-licensable end point is the desired aim for UKAEA sites, it is recognised that this may take hundreds of years for parts of some UKAEA sites, or may never be attainable at a reasonable cost to the UK taxpayer. Thus, on these sites, long-term risk management systems are in place to minimise the impact on health, safety and the environment. In order to manage these short-, medium- and long-term liabilities, UKAEA has developed a number of software tools based on good-practice guidance. One of these tools in particular is being developed to address building clearance and site release. This tool, IMAGES (Information Management and Geographical Information System), integrates systematic data capture with database management and spatial assessment (through a Geographical Information System). Details of IMAGES and its applications are discussed in the paper. This paper outlines the approach being adopted by UKAEA for building and site release and the integrated software system, IMAGES, being used to capture, collate, interpret and report results.
The key to UKAEA's strategy for

  6. A Software Tool for Optimal Sizing of PV Systems in Malaysia

    Directory of Open Access Journals (Sweden)

    Tamer Khatib

    2012-01-01

    Full Text Available This paper presents a MATLAB-based, user-friendly software tool called PV.MY for optimal sizing of photovoltaic (PV) systems. The software can predict meteorological variables such as solar energy, ambient temperature and wind speed using artificial neural networks (ANN); it optimizes the PV module/array tilt angle, optimizes the inverter size, and calculates optimal capacities of the PV array, battery, wind turbine and diesel generator in hybrid PV systems. The ANN-based model for meteorological prediction uses four variables, namely sunshine ratio, day number and location coordinates. As for PV system sizing, iterative methods are used to determine the optimal sizing of three types of PV systems: standalone PV, hybrid PV/wind, and hybrid PV/diesel generator. The loss of load probability (LLP) technique is used for the optimization, in which the energy source capacities are the variables to be optimized subject to a very low LLP. As for determining the optimal PV panel tilt angle and inverter size, the Liu and Jordan model for solar energy incident on a tilted surface is used to optimize the monthly tilt angle, while a model of the inverter efficiency curve is used in the optimization of inverter size.
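
The iterative LLP-based sizing idea can be illustrated with a deliberately simplified daily energy balance: grow the array until the fraction of deficit days drops below a target. The balance model, the synthetic generation/load series and every parameter value below are invented for illustration and are unrelated to PV.MY's actual meteorological models.

```python
def loss_of_load_probability(pv_kw, battery_kwh, gen_kwh_per_kw, load_kwh):
    """Fraction of days on which the PV + battery system fails to cover the load."""
    soc = battery_kwh                      # state of charge, start full
    deficit_days = 0
    for gen, load in zip(gen_kwh_per_kw, load_kwh):
        soc = min(soc + pv_kw * gen - load, battery_kwh)
        if soc < 0:                        # load not met on this day
            deficit_days += 1
            soc = 0.0
    return deficit_days / len(load_kwh)

def size_pv_array(gen_kwh_per_kw, load_kwh, battery_kwh,
                  target_llp=0.01, step_kw=0.1):
    """Iteratively grow the array until the loss-of-load target is met."""
    capacity = step_kw
    while loss_of_load_probability(capacity, battery_kwh,
                                   gen_kwh_per_kw, load_kwh) > target_llp:
        capacity += step_kw
    return round(capacity, 3)

# Synthetic 20-day series: sunny/cloudy alternation, constant 10 kWh/day load.
generation = [4.0, 2.0] * 10               # kWh produced per installed kW per day
load = [10.0] * 20
capacity_kw = size_pv_array(generation, load, battery_kwh=5.0)
```

With these invented numbers the search settles on 3.4 kW: at 3.3 kW the battery drains slowly over the cloudy days and eventually misses a day of load, while 3.4 kW reaches a sustainable sunny/cloudy cycle.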

  7. RadNotes: a novel software development tool for radiology education.

    Science.gov (United States)

    Baxter, A B; Klein, J S; Oesterle, E V

    1997-01-01

    RadNotes is a novel software development tool that enables physicians to develop teaching materials incorporating text and images in an intelligent, highly usable format. Projects undertaken in the RadNotes environment require neither programming expertise nor the assistance of a software engineer. The first of these projects, Thoracic Imaging, integrates image teaching files, concise disease and topic summaries, references, and flash card quizzes into a single program designed to provide an overview of chest radiology. RadNotes is intended to support the academic goals of teaching radiologists by enabling authors to create, edit, and electronically distribute image-oriented presentations. RadNotes also supports the educational goals of physicians who wish to quickly review selected imaging topics, as well as to develop a visual vocabulary of corresponding radiologic anatomy and pathologic conditions. Although Thoracic Imaging was developed with the aim of introducing chest radiology to residents, RadNotes can be used to develop tutorials and image-based tests for all levels; create corresponding World Wide Web sites; and organize notes, images, and references for individual use.

  8. Open Source Software Openfoam as a New Aerodynamical Simulation Tool for Rocket-Borne Measurements

    Science.gov (United States)

    Staszak, T.; Brede, M.; Strelnikov, B.

    2015-09-01

    The only way to do in-situ measurements, which are essential experimental studies for atmospheric science, in the mesosphere/lower thermosphere (MLT) is to use sounding rockets. The drawback of using rockets is the shock wave that appears because of the very high speed of the rocket motion (typically about 1000 m/s). This shock wave disturbs the density, temperature and velocity fields in the vicinity of the rocket relative to the undisturbed values of the atmosphere. This effect, however, can be quantified, and the measured data have to be corrected not just to make them more precise but simply usable. The commonly accepted and widely used tool for these calculations is the Direct Simulation Monte Carlo (DSMC) technique developed by G. A. Bird, which is available as a stand-alone program limited to a single processor. Apart from complications with simulating flows around bodies in the different flow regimes of the MLT altitude range, which arise from an exponential density change of several orders of magnitude, a particular hardware configuration introduces significant difficulty for aerodynamical calculations: the choice of grid sizes must both meet the demands of an adequate DSMC and give good resolution of geometries with large scale differences. This makes the calculation time unreasonably long or even prevents the calculation algorithm from converging. In this paper we apply the free open-source software OpenFOAM (licensed under the GNU GPL) to a three-dimensional CFD simulation of the flow around a sounding rocket instrument. An advantage of this software package, among other things, is that it can run on high-performance clusters, which are easily scalable. We present the first results and discuss the potential of the new tool in applications for sounding rockets.

  9. Techniques and software tools for estimating ultrasonic signal-to-noise ratios

    Science.gov (United States)

    Chiou, Chien-Ping; Margetan, Frank J.; McKillip, Matthew; Engle, Brady J.; Roberts, Ronald A.

    2016-02-01

    At Iowa State University's Center for Nondestructive Evaluation (ISU CNDE), the use of models to simulate ultrasonic inspections has played a key role in R&D efforts for over 30 years. To this end a series of wave propagation models, flaw response models, and microstructural backscatter models have been developed to address inspection problems of interest. One use of the combined models is the estimation of signal-to-noise ratios (S/N) in circumstances where backscatter from the microstructure (grain noise) acts to mask sonic echoes from internal defects. Such S/N models have been used in the past to address questions of inspection optimization and reliability. Under the sponsorship of the National Science Foundation's Industry/University Cooperative Research Center at ISU, an effort was recently initiated to improve existing research-grade software by adding a graphical user interface (GUI), turning it into a user-friendly tool for the rapid estimation of S/N for ultrasonic inspections of metals. The software combines: (1) a Python-based GUI for specifying an inspection scenario and displaying results; and (2) a Fortran-based engine for computing defect signal and backscattered grain noise characteristics. The latter makes use of several models, including: the Multi-Gaussian Beam Model for computing sonic fields radiated by commercial transducers; the Thompson-Gray Model for the response from an internal defect; the Independent Scatterer Model for backscattered grain noise; and the Stanke-Kino Unified Model for attenuation. The initial emphasis was on reformulating the research-grade code into a suitable modular form, adding the graphical user interface, and performing computations rapidly and robustly. Thus the initial inspection problem being addressed is relatively simple.
A normal-incidence pulse/echo immersion inspection is simulated for a curved metal component having a non-uniform microstructure, specifically an equiaxed, untextured microstructure in which the average

  10. Effectiveness of Crown Preparation Assessment Software As an Educational Tool in Simulation Clinic: A Pilot Study.

    Science.gov (United States)

    Tiu, Janine; Cheng, Enxin; Hung, Tzu-Chiao; Yu, Chuan-Chia; Lin, Tony; Schwass, Don; Al-Amleh, Basil

    2016-08-01

    The aim of this pilot study was to evaluate the feasibility of new tooth preparation assessment software, Preppr, as an educational tool for dental students in achieving optimal parameters for a crown preparation. In February 2015, 30 dental students in their fourth year of a five-year undergraduate dental curriculum in New Zealand were randomly selected from a pool of volunteers (N=40) out of the total class of 85. The participants were placed into one of three groups of ten students each: Group A, the control group, received only written and pictorial instructions; Group B received tutor evaluation and feedback; and Group C performed self-directed learning with the aid of Preppr. Each student was asked to prepare an all-ceramic crown on the lower first molar typodont within three hours and to repeat the exercise three times over the next four weeks. The exercise stipulated a 1 mm finish line dimension and a total convergence angle (TOC) between 10 and 20 degrees. Fulfillment of these parameters was taken as an acceptable preparation. The results showed that Group C had the highest percentage of students who achieved minimum finish line dimensions and acceptable TOC angles. Those students also achieved the stipulated requirements earlier than the other groups. This study's findings provide promising data on the feasibility of using Preppr as a self-directed educational tool for students training to prepare dental crowns. PMID:27480712
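
The acceptance rule the study stipulates (a finish line of at least 1 mm and a total convergence angle between 10 and 20 degrees) amounts to a simple predicate. The function below merely restates those criteria; the names are invented and this is not Preppr's actual logic.

```python
def preparation_acceptable(finish_line_mm, toc_degrees):
    """Study criteria: finish line >= 1 mm and TOC within 10-20 degrees."""
    return finish_line_mm >= 1.0 and 10.0 <= toc_degrees <= 20.0

# A 1.2 mm finish line with 15 degrees of total convergence passes;
# an over-tapered 25-degree preparation does not.
ok = preparation_acceptable(1.2, 15.0)
too_tapered = preparation_acceptable(1.2, 25.0)
```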

  11. SHAPA: An interactive software tool for protocol analysis applied to aircrew communications and workload

    Science.gov (United States)

    James, Jeffrey M.; Sanderson, Penelope M.; Seidler, Karen S.

    1990-01-01

    As modern transport environments become increasingly complex, issues such as crew communication, interaction with automation, and workload management have become crucial. Much research is being focused on holistic aspects of social and cognitive behavior, such as the strategies used to handle workload, the flow of information, the scheduling of tasks, and the verbal and non-verbal interactions between crew members. Traditional laboratory performance measures no longer sufficiently meet the needs of researchers addressing these issues, whereas observational techniques are better equipped to capture the type of data needed and to build models of the requisite level of sophistication. Presented here is SHAPA, an interactive software tool for performing both verbal and non-verbal protocol analysis. It has been developed with the idea of affording researchers the closest possible degree of engagement with protocol data. The researcher can configure SHAPA to encode protocols using any theoretical framework or encoding vocabulary desired. SHAPA allows protocol analysis to be performed at any level of analysis, and it supplies a wide variety of tools for data aggregation and manipulation. The output generated by SHAPA can be used alone or in combination with other performance variables to build a rich picture of the influences on sequences of verbal or non-verbal behavior.

  12. GraphCrunch 2: Software tool for network modeling, alignment and clustering

    Directory of Open Access Journals (Sweden)

    Hayes Wayne

    2011-01-01

    Full Text Available Abstract Background Recent advancements in experimental biotechnology have produced large amounts of protein-protein interaction (PPI) data. The topology of PPI networks is believed to have a strong link to their function. Hence, the abundance of PPI data for many organisms stimulates the development of computational techniques for the modeling, comparison, alignment, and clustering of networks. In addition, finding representative models for PPI networks will improve our understanding of the cell, just as a model of gravity has helped us understand planetary motion. To decide if a model is representative, we need quantitative comparisons of model networks to real ones. However, exact network comparison is computationally intractable, and therefore several heuristics have been used instead. Some of these heuristics are easily computable "network properties," such as the degree distribution or the clustering coefficient. An important special case of network comparison is the network alignment problem. Analogous to sequence alignment, this problem asks to find the "best" mapping between regions in two networks. It is expected that network alignment might have as strong an impact on our understanding of biology as sequence alignment has had. Topology-based clustering of nodes in PPI networks is another example of an important network analysis problem that can uncover relationships between interaction patterns and phenotype. Results We introduce the GraphCrunch 2 software tool, which addresses these problems. It is a significant extension of GraphCrunch which implements the most popular random network models and compares them with the data networks with respect to many network properties. Also, GraphCrunch 2 implements the GRAph ALigner algorithm ("GRAAL") for purely topological network alignment. GRAAL can align any pair of networks and exposes large, dense, contiguous regions of topological and functional similarities far larger than any other
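
Two of the "easily computable network properties" the abstract mentions, the degree distribution and the local clustering coefficient, can be computed directly from an adjacency structure. The toy graph below is invented for illustration; this is not GraphCrunch 2 code.

```python
from collections import Counter

def degree_distribution(adj):
    """Histogram mapping degree -> number of nodes with that degree."""
    return Counter(len(nbrs) for nbrs in adj.values())

def clustering_coefficient(adj, node):
    """Fraction of the node's neighbour pairs that are themselves adjacent."""
    nbrs = sorted(adj[node])
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for i, u in enumerate(nbrs)
                  for v in nbrs[i + 1:] if v in adj[u])
    return 2.0 * links / (k * (k - 1))

# Toy graph: a triangle a-b-c with a pendant node d attached to c.
graph = {"a": {"b", "c"}, "b": {"a", "c"}, "c": {"a", "b", "d"}, "d": {"c"}}
```

Node `a` sits in a closed triangle, so its clustering coefficient is 1.0, while only one of `c`'s three neighbour pairs is connected, giving 1/3; model/data comparison then reduces to comparing such statistics across networks.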

  13. Should we have blind faith in bioinformatics software? Illustrations from the SNAP web-based tool.

    Directory of Open Access Journals (Sweden)

    Sébastien Robiou-du-Pont

    Full Text Available Bioinformatics tools have gained popularity in biology, but little is known about their validity. We aimed to assess the early contribution of 415 single nucleotide polymorphisms (SNPs) associated with eight cardio-metabolic traits at the genome-wide significance level in adults in the Family Atherosclerosis Monitoring In earLY Life (FAMILY) birth cohort. We used the popular web-based tool SNAP to assess the availability of the 415 SNPs in the Illumina Cardio-Metabochip genotyped in the FAMILY study participants. We then compared the SNAP output with the Cardio-Metabochip file provided by Illumina, using the chromosome and chromosomal positions of SNPs from the NCBI Human Genome Browser (Genome Reference Consortium Human Build 37). With the HapMap 3 release 2 reference, 201 out of 415 SNPs were reported as missing from the Cardio-Metabochip by the SNAP output. However, the Cardio-Metabochip file revealed that 152 of these 201 SNPs were in fact present in the Cardio-Metabochip array (a false negative rate of 36.6%). With the more recent 1000 Genomes Project release, we found a false negative rate of 17.6% by comparing the outputs of SNAP and the Illumina product file. We did not find any 'false positive' SNPs (SNPs specified as available in the Cardio-Metabochip by SNAP, but not by the Cardio-Metabochip Illumina file). The Cohen's kappa coefficient, which calculates the percentage of agreement between both methods, indicated that the validity of SNAP was fair to moderate depending on the reference used (HapMap 3 or 1000 Genomes). In conclusion, we demonstrate that the SNAP outputs for the Cardio-Metabochip are invalid. This study illustrates the importance of systematically assessing the validity of bioinformatics tools in an independent manner. We propose a series of guidelines to improve practices in the fast-moving field of bioinformatics software implementation.
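
The agreement figures can be reproduced from the abstract's own HapMap 3 counts: 415 SNPs, 201 reported missing by SNAP, of which 152 were actually on the chip, and no false positives. Reconstructing the two label vectors from those counts is an assumption made for illustration; Cohen's kappa is computed with its standard two-category formula.

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two binary label sequences (chance-corrected agreement)."""
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    pa, pb = sum(a) / n, sum(b) / n
    expected = pa * pb + (1 - pa) * (1 - pb)    # chance agreement on yes + on no
    return (observed - expected) / (1 - expected)

# Counts reconstructed from the abstract (an assumption):
# 214 SNPs called present by both, 152 false negatives, 49 agreed absences.
on_chip   = [1] * 214 + [1] * 152 + [0] * 49   # ground truth (Illumina file)
snap_says = [1] * 214 + [0] * 152 + [0] * 49   # SNAP output

false_negative_rate = 152 / 415                 # the 36.6% quoted in the text
kappa = cohens_kappa(on_chip, snap_says)
```

With these counts kappa comes out near 0.25, i.e. in the "fair" band, consistent with the abstract's "fair to moderate" characterization of SNAP's validity.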

  14. Evaluation and Usage of Browser Compatibility Tools during the Software Development Process

    OpenAIRE

    Boyaci, Burak

    2016-01-01

    The software testing process is one of the most important phases of software development, checking that the developed software product meets its specified requirements. This is especially true for software products used in the health industry. The supported browsers can also be documented in the requirements; thus browser compatibility testing needs to be considered, especially when testing web-based software products. Browser compatibility testing is perfor...

  15. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    Science.gov (United States)

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

    A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data, which helps in optimization, and shows the software performance visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model, developed for a systematically integrated forward osmosis-nanofiltration process for the removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values have been observed to agree well with extensive experimental investigations and were found to be consistent under varying operating conditions such as operating pressure, operating flow rate, and draw solute concentration. Low values of the relative error (RE = 0.09) and high values of the Willmott d-index (d_will = 0.981) reflect a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater.
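
The two goodness-of-fit measures quoted (relative error and the Willmott d-index) have standard definitions; assuming the paper uses these standard forms, they can be computed as below. The observed/predicted series is invented for illustration and is not the paper's data.

```python
def mean_relative_error(observed, predicted):
    """Mean of |prediction - observation| / |observation|."""
    return sum(abs(p - o) / abs(o)
               for o, p in zip(observed, predicted)) / len(observed)

def willmott_d(observed, predicted):
    """Willmott's index of agreement: 1 means perfect agreement, 0 means none."""
    obar = sum(observed) / len(observed)
    num = sum((p - o) ** 2 for o, p in zip(observed, predicted))
    den = sum((abs(p - obar) + abs(o - obar)) ** 2
              for o, p in zip(observed, predicted))
    return 1.0 - num / den

# Invented example: model predictions tracking four observations closely.
obs = [10.0, 12.0, 14.0, 16.0]
pred = [10.5, 11.8, 14.2, 15.9]
re = mean_relative_error(obs, pred)
d = willmott_d(obs, pred)
```

A low relative error together with a d-index near 1, as in the paper's RE = 0.09 and d_will = 0.981, indicates that predictions both stay close to observations and reproduce their spread about the mean.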

  17. Diva software, a tool for European regional seas and Ocean climatologies production

    Science.gov (United States)

    Ouberdous, M.; Troupin, C.; Barth, A.; Alvera-Azcàrate, A.; Beckers, J.-M.

    2012-04-01

    Diva (Data-Interpolating Variational Analysis) is a software tool based on a method designed to perform data-gridding (or analysis) tasks, with the asset of taking into account the intrinsic nature of oceanographic data, i.e., the uncertainty of the in situ measurements and the anisotropy due to advection and irregular coastlines and topography. The Variational Inverse Method (VIM, Brasseur et al., 1996) implemented in Diva consists in minimizing a variational principle which accounts for the differences between the observations and the reconstructed field, and for the influence of the gradients and variability of the reconstructed field. The resolution of the numerical problem is based on the finite-element method, which allows great numerical efficiency and the handling of complicated contours. Along with the analysis, Diva also provides error fields (Brankart and Brasseur, 1998; Rixen et al., 2000) based on the data coverage and noise. Diva is used for the production of climatologies in the pan-European network SeaDataNet. SeaDataNet connects the existing marine data centres of more than 30 countries and has set up a data management infrastructure consisting of a standardized distributed system. The consortium has elaborated integrated products, using common procedures and methods. Among these, it uses the Diva software as the reference tool for computing climatologies for various European regional seas, the Atlantic and the global ocean. During the first phase of the SeaDataNet project, a number of additional tools were developed to make climatology production easier for the users. Among these tools: the advection constraint during the field reconstruction through the specification of a velocity field on a regular grid, forcing the analysis to align with the velocity vectors; the Generalized Cross Validation for the determination of analysis parameters (signal-to-noise ratio); the creation of contours at selected depths; the detection of possible outliers; the
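    The variational idea behind Diva can be illustrated in one dimension: minimize the misfit to the observations plus a smoothness penalty over a grid. The sketch below is a deliberately simplified finite-difference toy, not Diva itself (which uses finite elements on coastline-aware meshes); all parameter values and function names are illustrative.

```python
import numpy as np

def vim_grid_1d(xobs, dobs, ngrid=50, snr=10.0, corrlen=0.2):
    """1-D toy of the Variational Inverse Method: solve for the gridded field
    phi minimizing  snr * sum (phi(x_i) - d_i)^2  +  smoothness + background."""
    xg = np.linspace(0.0, 1.0, ngrid)
    h = xg[1] - xg[0]
    # Observation operator: nearest grid node for each observation
    H = np.zeros((len(xobs), ngrid))
    for i, x in enumerate(xobs):
        H[i, int(round(x / h))] = 1.0
    # Second-difference operator for the smoothness penalty
    D2 = (np.diag(np.full(ngrid, -2.0)) + np.diag(np.ones(ngrid - 1), 1)
          + np.diag(np.ones(ngrid - 1), -1)) / h**2
    # Penalty matrix: curvature term scaled by a correlation length + background
    K = D2.T @ D2 * corrlen**4 + np.eye(ngrid)
    A = K + snr * H.T @ H
    phi = np.linalg.solve(A, snr * H.T @ np.asarray(dobs, dtype=float))
    return xg, phi

xg, phi = vim_grid_1d([0.1, 0.4, 0.8], [1.0, 2.0, 0.5])
print(float(phi.max()))
```

Raising `snr` makes the field follow the observations more closely; the correlation length controls how far each observation's influence spreads.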

  18. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    International Nuclear Information System (INIS)

    In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet

  19. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    Directory of Open Access Journals (Sweden)

    Kühne Titus

    2010-07-01

    Full Text Available Abstract Background In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. Results After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. Conclusions MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet.
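    The pixel-wise fitting such a mapping tool performs can be sketched for the simplest case, a mono-exponential T2 decay. The following Python example is not MRmap code; it assumes a log-linear least-squares fit and a synthetic, noise-free phantom.

```python
import numpy as np

def fit_t2_map(echo_times, images):
    """Pixel-wise T2 from a log-linear least-squares fit of
    S(TE) = S0 * exp(-TE/T2)  =>  log S = log S0 - TE/T2.
    images: array of shape (n_echoes, ny, nx), all intensities > 0."""
    te = np.asarray(echo_times, dtype=float)
    n, ny, nx = images.shape
    logs = np.log(images.reshape(n, -1))          # (n_echoes, n_pixels)
    te_c = te - te.mean()
    # Least-squares slope of log-signal against TE, all pixels at once
    slope = (te_c @ (logs - logs.mean(axis=0))) / (te_c @ te_c)
    with np.errstate(divide="ignore"):
        t2 = np.where(slope < 0, -1.0 / slope, 0.0)  # 0 marks failed pixels
    return t2.reshape(ny, nx)

# Synthetic phantom: uniform T2 = 80 ms, four echo times in ms
te = np.array([10.0, 20.0, 40.0, 80.0])
imgs = 1000.0 * np.exp(-te[:, None, None] / 80.0) * np.ones((4, 2, 2))
print(fit_t2_map(te, imgs).round(1))  # recovers the 80 ms ground truth
```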

  20. COMSY - A software tool for PLIM + PLEX with integrated risk-informed approaches

    International Nuclear Information System (INIS)

    The majority of mechanical components and structures in a thermal power plant are designed to experience a service life which is far above the intended design life. In most cases, only a small percentage of mechanical components are subject to significant degradation which may affect the integrity or the function of the component. If plant life extension (PLEX) is considered as an option, a plant specific PLIM strategy needs to be developed. One of the most important tasks of such a PLIM strategy is to identify those components which (i) are relevant for the safety and/or availability of the plant and (ii) experience elevated degradation due to their operating and design conditions. For these components special life management strategies need to be established to reliably monitor their condition. FRAMATOME ANP GmbH has developed the software tool COMSY, which is designed to efficiently support a plant-wide lifetime management strategy for static mechanical components, providing the basis for plant life extension (PLEX) activities. The objective is the economical and safe operation of power plants over their design lifetime - and beyond. The tool provides the capability to establish a program guided technical documentation of the plant by utilizing a virtual plant data model. The software integrates engineering analysis functions and comprehensive material libraries to perform a lifetime analysis for various degradation mechanisms typically experienced in power plants (e.g. flow-accelerated corrosion, intergranular stress corrosion cracking, strain-induced cracking, material fatigue, cavitation erosion, droplet impingement erosion, pitting, etc.). A risk-based prioritization serves to focus inspection activities on safety or availability relevant locations, where a degradation potential exists. Trending functions support the comparison of the as-measured condition with the predicted progress of degradation while making allowance for measurement tolerances. The

  1. Application of the PredictAD Software Tool to Predict Progression in Patients with Mild Cognitive Impairment

    DEFF Research Database (Denmark)

    Simonsen, Anja H; Mattila, Jussi; Hejl, Anne-Mette;

    2012-01-01

    Background: The PredictAD tool integrates heterogeneous data such as imaging, cerebrospinal fluid biomarkers and results from neuropsychological tests for compact visualization in an interactive user interface. This study investigated whether the software tool could assist physicians in the early...... diagnosis of Alzheimer's disease. Methods: Baseline data from 140 patients with mild cognitive impairment were selected from the Alzheimer's Disease Neuroimaging Study. Three clinical raters classified patients into 6 categories of confidence in the prediction of early Alzheimer's disease, in 4 phases...

  2. Data-Driven Decision Making as a Tool to Improve Software Development Productivity

    Science.gov (United States)

    Brown, Mary Erin

    2013-01-01

    The worldwide software project failure rate, based on a survey of information technology software manager's view of user satisfaction, product quality, and staff productivity, is estimated to be between 24% and 36% and software project success has not kept pace with the advances in hardware. The problem addressed by this study was the limited…

  3. Software Tool for Analysis of Breathing-Related Errors in Transthoracic Electrical Bioimpedance Spectroscopy Measurements

    Science.gov (United States)

    Abtahi, F.; Gyllensten, I. C.; Lindecrantz, K.; Seoane, F.

    2012-12-01

    During the last decades, Electrical Bioimpedance Spectroscopy (EBIS) has been applied in a range of different applications, mainly using the frequency-sweep technique. Traditionally, the tissue under study is considered to be time-invariant, and dynamic changes of tissue activity are ignored and instead treated as a noise source. This assumption has not been adequately tested and could have a negative impact and limit the accuracy of impedance monitoring systems. In order to successfully use frequency-sweep EBIS for monitoring time-variant systems, it is paramount to study the effect of frequency-sweep delay on Cole-model-based analysis. In this work, we present a software tool that can be used to simulate the influence of respiration activity in frequency-sweep EBIS measurements of the human thorax and analyse the effects of the different error sources. Preliminary results indicate that the deviation of the EBIS measurement might be significant at any frequency, and especially in the impedance plane. Therefore, the impact on Cole-model analysis might differ depending on the method applied for Cole parameter estimation.

  4. Software Tool for Analysis of Breathing-Related Errors in Transthoracic Electrical Bioimpedance Spectroscopy Measurements

    International Nuclear Information System (INIS)

    During the last decades, Electrical Bioimpedance Spectroscopy (EBIS) has been applied in a range of different applications, mainly using the frequency-sweep technique. Traditionally, the tissue under study is considered to be time-invariant, and dynamic changes of tissue activity are ignored and instead treated as a noise source. This assumption has not been adequately tested and could have a negative impact and limit the accuracy of impedance monitoring systems. In order to successfully use frequency-sweep EBIS for monitoring time-variant systems, it is paramount to study the effect of frequency-sweep delay on Cole-model-based analysis. In this work, we present a software tool that can be used to simulate the influence of respiration activity in frequency-sweep EBIS measurements of the human thorax and analyse the effects of the different error sources. Preliminary results indicate that the deviation of the EBIS measurement might be significant at any frequency, and especially in the impedance plane. Therefore, the impact on Cole-model analysis might differ depending on the method applied for Cole parameter estimation.
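    The Cole model referenced above has a compact closed form, Z(ω) = R∞ + (R0 − R∞)/(1 + (jωτ)^α), which makes the sweep-delay problem easy to illustrate. The sketch below is not the authors' tool, and the parameter values are illustrative: it compares an instantaneous spectrum with one acquired while the tissue drifts during the sweep.

```python
import numpy as np

def cole_impedance(f, r0, rinf, tau, alpha):
    """Cole model: Z(w) = Rinf + (R0 - Rinf) / (1 + (j*w*tau)**alpha)."""
    w = 2 * np.pi * np.asarray(f, dtype=float)
    return rinf + (r0 - rinf) / (1 + (1j * w * tau) ** alpha)

# Frequency sweep 5 kHz .. 1 MHz with illustrative tissue parameters
freqs = np.logspace(np.log10(5e3), 6, 50)
z_static = cole_impedance(freqs, r0=60.0, rinf=25.0, tau=1e-6, alpha=0.8)

# Crude respiration model: R0 drifts while the sweep progresses in time
drift = 1.0 + 0.05 * np.sin(2 * np.pi * np.linspace(0.0, 0.5, freqs.size))
z_swept = np.array([cole_impedance(f, 60.0 * d, 25.0, 1e-6, 0.8)
                    for f, d in zip(freqs, drift)])

# Deviation introduced by the frequency-sweep delay
print(float(np.max(np.abs(z_swept - z_static))))
```

Plotting `z_swept` against `z_static` in the complex (impedance) plane shows the distortion the abstract describes.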

  5. A Software Tool to Visualize Verbal Protocols to Enhance Strategic and Metacognitive Abilities in Basic Programming

    Directory of Open Access Journals (Sweden)

    Carlos A. Arévalo

    2011-07-01

    Full Text Available Learning to program is difficult for many first year undergraduate students. Instructional strategies of traditional programming courses tend to focus on syntactic issues and assigning practice exercises using the presentation-examples-practice formula and by showing the verbal and visual explanation of a teacher during the “step by step” process of writing a computer program. Cognitive literature regarding the mental processes involved in programming suggests that the explicit teaching of certain aspects such as mental models, strategic knowledge and metacognitive abilities, are critical issues of how to write and assemble the pieces of a computer program. Verbal protocols are often used in software engineering as a technique to record the short term cognitive process of a user or expert in evaluation or problem solving scenarios. We argue that verbal protocols can be used as a mechanism to explicitly show the strategic and metacognitive process of an instructor when writing a program. In this paper we present an Information System Prototype developed to store and visualize worked examples derived from transcribed verbal protocols during the process of writing introductory level programs. Empirical data comparing the grades obtained by two groups of novice programming students, using ANOVA, indicates a statistically positive difference in performance in the group using the tool, even though these results still cannot be extrapolated to general population, given the reported limitations of this study.

  6. CubeSat mission design software tool for risk estimating relationships

    Science.gov (United States)

    Gamble, Katharine Brumbaugh; Lightsey, E. Glenn

    2014-09-01

    In an effort to make the CubeSat risk estimation and management process more scientific, a software tool has been created that enables mission designers to estimate mission risks. CubeSat mission designers are able to input mission characteristics, such as form factor, mass, development cycle, and launch information, in order to determine the mission risk root causes which historically present the highest risk for their mission. Historical data was collected from the CubeSat community and analyzed to provide a statistical background to characterize these Risk Estimating Relationships (RERs). This paper develops and validates the mathematical model based on the same cost estimating relationship methodology used by the Unmanned Spacecraft Cost Model (USCM) and the Small Satellite Cost Model (SSCM). The RER development uses general error regression models to determine the best fit relationship between root cause consequence and likelihood values and the input factors of interest. These root causes are combined into seven overall CubeSat mission risks which are then graphed on the industry-standard 5×5 Likelihood-Consequence (L-C) chart to help mission designers quickly identify areas of concern within their mission. This paper is the first to document not only the creation of a historical database of CubeSat mission risks, but, more importantly, the scientific representation of Risk Estimating Relationships.
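    At its core, an RER is a regression from mission characteristics to a likelihood bin on the 5×5 chart. As a minimal sketch, the example below fits a straight line to hypothetical historical (mass, likelihood) records; the actual tool uses general error regression models over many more factors, and every number here is invented.

```python
import numpy as np

# Hypothetical historical CubeSat records: (mass in kg, likelihood score 1-5)
hist = np.array([[1.0, 2.0], [1.3, 2.0], [2.0, 3.0], [3.0, 3.0],
                 [4.0, 4.0], [5.0, 4.0], [6.0, 5.0]])

# Simple linear RER (stand-in for the paper's general error regression models)
a, b = np.polyfit(hist[:, 0], hist[:, 1], 1)

def predicted_likelihood(mass_kg):
    """Predict a 1-5 likelihood bin for a new mission from its mass."""
    return int(np.clip(round(a * mass_kg + b), 1, 5))

consequence = 3  # assumed consequence bin for the example
cell = (predicted_likelihood(2.5), consequence)  # position on the 5x5 L-C chart
print(cell)  # → (3, 3)
```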

  7. Development of a Kinect Software Tool to Classify Movements during Active Video Gaming.

    Science.gov (United States)

    Rosenberg, Michael; Thornton, Ashleigh L; Lay, Brendan S; Ward, Brodie; Nathan, David; Hunt, Daniel; Braham, Rebecca

    2016-01-01

    While it has been established that using full body motion to play active video games results in increased levels of energy expenditure, there is little information on the classification of human movement during active video game play in relationship to fundamental movement skills. The aim of this study was to validate software utilising Kinect sensor motion capture technology to recognise fundamental movement skills (FMS) during active video game play. Two human assessors rated jumping and side-stepping, and these assessments were compared to the Kinect Action Recognition Tool (KART) to establish a level of agreement and determine the number of movements completed during five minutes of active video game play, for 43 children (m = 12 years 7 months ± 1 year 6 months). During five minutes of active video game play, inter-rater reliability between the two human raters was found to be higher for the jump (r = 0.94) than for the side-step, and both humans and KART showed higher agreement for jumps than for side-steps in the game play condition. The results of the study provide confidence that the Kinect sensor can be used to count the number of jumps and side-steps during five minutes of active video game play with a similar level of accuracy as human raters. However, in contrast to humans, the KART system required a fraction of the time to analyse and tabulate the results.

  8. UNBizPlanner: a software tool for preparing a business plan

    Directory of Open Access Journals (Sweden)

    Oscar Ávila Cifuentes

    2010-04-01

    Full Text Available Universities are currently expected to play a new role in society (in addition to research and teaching) by engaging in a third mission concerning socio-economic development. Universities also play an important role in encouraging entrepreneurs by training them in business planning. A business plan is a document summarising how an entrepreneur will create an organisation to exploit a business opportunity. Preparing a business plan draws on a wide range of knowledge from many business disciplines (e.g. finance, human resource management, intellectual property management, supply chain management, operations management and marketing). This article presents a computational tool for drawing up a business plan from a Colombian viewpoint by identifying the most relevant stages, which are borne in mind by the national entities having most experience in creating and consolidating companies. Special emphasis was placed on analysing, designing and implementing a systems development life cycle for developing the software. Reviewing the literature concerning business plans formed an important part of the analysis stage (bearing a Colombian viewpoint in mind).

  9. CancellationTools: All-in-one software for administration and analysis of cancellation tasks.

    Science.gov (United States)

    Dalmaijer, Edwin S; Van der Stigchel, Stefan; Nijboer, Tanja C W; Cornelissen, Tim H W; Husain, Masud

    2015-12-01

    In a cancellation task, a participant is required to search for and cross out ("cancel") targets, which are usually embedded among distractor stimuli. The number of cancelled targets and their location can be used to diagnose the neglect syndrome after stroke. In addition, the organization of search provides a potentially useful way to measure executive control over multitarget search. Although many useful cancellation measures have been introduced, most fail to make their way into research studies and clinical practice due to the practical difficulty of acquiring such parameters from traditional pen-and-paper measures. Here we present new, open-source software that is freely available to all. It allows researchers and clinicians to flexibly administer computerized cancellation tasks using stimuli of their choice, and to directly analyze the data in a convenient manner. The automated analysis suite provides output that includes almost all of the currently existing measures, as well as several new ones introduced here. All tasks can be performed using either a computer mouse or a touchscreen as an input device, and an online version of the task runtime is available for tablet devices. A summary of the results is produced in a single A4-sized PDF document, including high quality data visualizations. For research purposes, batch analysis of large datasets is possible. In sum, CancellationTools allows users to employ a flexible, computerized cancellation task, which provides extensive benefits and ease of use. PMID:25381020

  10. TRANSIT--A Software Tool for Himar1 TnSeq Analysis.

    Directory of Open Access Journals (Sweden)

    Michael A DeJesus

    2015-10-01

    Full Text Available TnSeq has become a popular technique for determining the essentiality of genomic regions in bacterial organisms. Several methods have been developed to analyze the wealth of data that has been obtained through TnSeq experiments. We developed a tool for analyzing Himar1 TnSeq data called TRANSIT. TRANSIT provides a graphical interface to three different statistical methods for analyzing TnSeq data. These methods cover a variety of approaches capable of identifying essential genes in individual datasets as well as comparative analysis between conditions. We demonstrate the utility of this software by analyzing TnSeq datasets of M. tuberculosis grown on glycerol and cholesterol. We show that TRANSIT can be used to discover genes which have been previously implicated for growth on these carbon sources. TRANSIT is written in Python, and thus can be run on Windows, OSX and Linux platforms. The source code is distributed under the GNU GPL v3 license and can be obtained from the following GitHub repository: https://github.com/mad-lab/transit.
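    The underlying signal in Himar1 TnSeq data is simple to state: essential genomic regions tolerate almost no insertions at their TA sites. The sketch below flags such genes with a crude density threshold; it is a stand-in for TRANSIT's actual Gumbel and HMM statistics, and the gene names and cutoffs are illustrative.

```python
def candidate_essential(ta_hits, min_sites=10, max_density=0.05):
    """Flag genes as candidate-essential when almost none of their TA sites
    carry a Himar1 insertion.  ta_hits: {gene: (hit_sites, total_sites)}.
    A crude stand-in for TRANSIT's Gumbel/HMM statistics."""
    flagged = []
    for gene, (hits, total) in ta_hits.items():
        if total >= min_sites and hits / total <= max_density:
            flagged.append(gene)
    return flagged

counts = {"geneA": (0, 25),    # no insertions across 25 TA sites
          "geneB": (12, 30),   # insertions well tolerated
          "geneC": (1, 40)}    # nearly empty
print(candidate_essential(counts))  # → ['geneA', 'geneC']
```

A real analysis must also model site-level saturation and library noise, which is exactly what TRANSIT's statistical methods provide.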

  11. CancellationTools: All-in-one software for administration and analysis of cancellation tasks.

    Science.gov (United States)

    Dalmaijer, Edwin S; Van der Stigchel, Stefan; Nijboer, Tanja C W; Cornelissen, Tim H W; Husain, Masud

    2015-12-01

    In a cancellation task, a participant is required to search for and cross out ("cancel") targets, which are usually embedded among distractor stimuli. The number of cancelled targets and their location can be used to diagnose the neglect syndrome after stroke. In addition, the organization of search provides a potentially useful way to measure executive control over multitarget search. Although many useful cancellation measures have been introduced, most fail to make their way into research studies and clinical practice due to the practical difficulty of acquiring such parameters from traditional pen-and-paper measures. Here we present new, open-source software that is freely available to all. It allows researchers and clinicians to flexibly administer computerized cancellation tasks using stimuli of their choice, and to directly analyze the data in a convenient manner. The automated analysis suite provides output that includes almost all of the currently existing measures, as well as several new ones introduced here. All tasks can be performed using either a computer mouse or a touchscreen as an input device, and an online version of the task runtime is available for tablet devices. A summary of the results is produced in a single A4-sized PDF document, including high quality data visualizations. For research purposes, batch analysis of large datasets is possible. In sum, CancellationTools allows users to employ a flexible, computerized cancellation task, which provides extensive benefits and ease of use.
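    One of the search-organization measures such software computes can be sketched easily: correlate the order in which targets are cancelled with their spatial coordinates, so that a systematic sweep scores near 1 and an erratic search scores low. The implementation below is a simplified illustration, not CancellationTools code.

```python
import numpy as np

def best_r(cancel_order_xy):
    """Search organization: the larger of |r(rank, x)| and |r(rank, y)|,
    where rank is the order of cancellation (a simplified version of the
    'best r' cancellation measure)."""
    xy = np.asarray(cancel_order_xy, dtype=float)
    rank = np.arange(len(xy))
    rx = np.corrcoef(rank, xy[:, 0])[0, 1]
    ry = np.corrcoef(rank, xy[:, 1])[0, 1]
    return max(abs(rx), abs(ry))

# Systematic diagonal sweep vs. erratic search (toy target coordinates)
orderly = [(i, i) for i in range(10)]
erratic = [(7, 2), (1, 9), (5, 0), (9, 4), (0, 7), (3, 1), (8, 8), (2, 3)]
print(best_r(orderly))   # 1.0 for a perfectly systematic sweep
print(best_r(erratic))   # much lower for disorganized search
```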

  12. Computer-generated holograms (CGH) realization: the integration of dedicated software tool with digital slides printer

    Science.gov (United States)

    Guarnieri, Vittorio; Francini, Franco

    1997-12-01

    Last generation of digital printer is usually characterized by a spatial resolution enough high to allow the designer to realize a binary CGH directly on a transparent film avoiding photographic reduction techniques. These devices are able to produce slides or offset prints. Furthermore, services supplied by commercial printing company provide an inexpensive method to rapidly verify the validity of the design by means of a test-and-trial process. Notably, this low-cost approach appears to be suitable for a didactical environment. On the basis of these considerations, a set of software tools able to design CGH's has been developed. The guidelines inspiring the work have been the following ones: (1) ray-tracing approach, considering the object to be reproduced as source of spherical waves; (2) Optimization and speed-up of the algorithms used, in order to produce a portable code, runnable on several hardware platforms. In this paper calculation methods to obtain some fundamental geometric functions (points, lines, curves) are described. Furthermore, by the juxtaposition of these primitives functions it is possible to produce the holograms of more complex objects. Many examples of generated CGHs are presented.

  13. GTest: a software tool for graphical assessment of empirical distributions' Gaussianity.

    Science.gov (United States)

    Barca, E; Bruno, E; Bruno, D E; Passarella, G

    2016-03-01

    their request for an effective tool for addressing such difficulties motivated us in adopting the inference-by-eye paradigm and implementing an easy-to-use, quick and reliable statistical tool. GTest visualizes its outcomes as a modified version of the Q-Q plot. The application has been developed in Visual Basic for Applications (VBA) within MS Excel 2010, which demonstrated to have all the characteristics of robustness and reliability needed. GTest provides true graphical normality tests which are as reliable as any statistical quantitative approach but much easier to understand. The Q-Q plots have been integrated with the outlining of an acceptance region around the representation of the theoretical distribution, defined in accordance with the alpha level of significance and the data sample size. The test decision rule is the following: if the empirical scatterplot falls completely within the acceptance region, then it can be concluded that the empirical distribution fits the theoretical one at the given alpha level. A comprehensive case study has been carried out with simulated and real-world data in order to check the robustness and reliability of the software. PMID:26846288

  14. GTest: a software tool for graphical assessment of empirical distributions' Gaussianity.

    Science.gov (United States)

    Barca, E; Bruno, E; Bruno, D E; Passarella, G

    2016-03-01

    their request for an effective tool for addressing such difficulties motivated us in adopting the inference-by-eye paradigm and implementing an easy-to-use, quick and reliable statistical tool. GTest visualizes its outcomes as a modified version of the Q-Q plot. The application has been developed in Visual Basic for Applications (VBA) within MS Excel 2010, which demonstrated to have all the characteristics of robustness and reliability needed. GTest provides true graphical normality tests which are as reliable as any statistical quantitative approach but much easier to understand. The Q-Q plots have been integrated with the outlining of an acceptance region around the representation of the theoretical distribution, defined in accordance with the alpha level of significance and the data sample size. The test decision rule is the following: if the empirical scatterplot falls completely within the acceptance region, then it can be concluded that the empirical distribution fits the theoretical one at the given alpha level. A comprehensive case study has been carried out with simulated and real-world data in order to check the robustness and reliability of the software.
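    The acceptance-region idea can be prototyped outside VBA as well. The Python sketch below builds a simultaneous Monte Carlo envelope around the normal Q-Q line; this construction is assumed for illustration, not GTest's published algorithm, and all parameter values are arbitrary.

```python
import numpy as np

def qq_acceptance_band(sample, alpha=0.05, n_sim=2000, seed=0):
    """Return (theoretical quantiles, sorted standardized sample, lower band,
    upper band) for a normal Q-Q plot.  The band width is chosen so that a
    fraction (1 - alpha) of simulated Gaussian samples of the same size fall
    entirely inside it."""
    rng = np.random.default_rng(seed)
    x = np.asarray(sample, dtype=float)
    x = np.sort((x - x.mean()) / x.std())
    n = x.size
    # Order statistics of n_sim standard-normal samples of the same size
    sims = np.sort(rng.standard_normal((n_sim, n)), axis=1)
    theo = sims.mean(axis=0)
    dev = np.abs(sims - theo).max(axis=1)
    w = np.quantile(dev, 1 - alpha)
    return theo, x, theo - w, theo + w

rng = np.random.default_rng(1)
theo, x, lo, hi = qq_acceptance_band(rng.normal(10, 2, 100))
print(bool(np.all((x >= lo) & (x <= hi))))
```

The decision rule mirrors the one described above: if every point of the empirical scatterplot lies inside the band, normality is not rejected at the chosen alpha level.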

  15. Research on Testing and its Tools Based on Software Architecture

    Institute of Scientific and Technical Information of China (English)

    叶俊民; 王振宇; 陈利; 赵恒

    2003-01-01

    In this paper, we discuss the objective and significance of software testing on the basis of software architecture. We propose the content of test planning for software architecture, together with testing criteria. At the architecture level, through comparison with traditional testing tools, we present testing tools for an integration environment and analyze the roles of these tools. On this basis, we further suggest the structure of an integration testing environment.

  16. Integration of life cycle assessment software with tools for economic and sustainability analyses and process simulation for sustainable process design

    DEFF Research Database (Denmark)

    Kalakul, Sawitree; Malakul, Pomthong; Siemanond, Kitipat;

    2014-01-01

    with other tools. To test the software, a bioethanol production process using cassava rhizome is employed as a case study. Results from LCSoft highlight the estimated environmental performance in terms of various aspects such as carbon footprint, resource and energy consumptions, and various environmental...

  17. The Design and Development of a Computerized Tool Support for Conducting Senior Projects in Software Engineering Education

    Science.gov (United States)

    Chen, Chung-Yang; Teng, Kao-Chiuan

    2011-01-01

    This paper presents a computerized tool support, the Meetings-Flow Project Collaboration System (MFS), for designing, directing and sustaining the collaborative teamwork required in senior projects in software engineering (SE) education. Among many schools' SE curricula, senior projects serve as a capstone course that provides comprehensive…

  18. A fluoroscopy-based planning and guidance software tool for minimally invasive hip refixation by cement injection

    NARCIS (Netherlands)

    Malan, D.F.; Van der Walt, S.J.; Raidou, R.G.; Van den Berg, B.; Stoel, B.C.; Botha, C.P.; Nelissen, R.G.H.H.; Valstar, E.R.

    2015-01-01

    Purpose In orthopaedics, minimally invasive injection of bone cement is an established technique. We present HipRFX, a software tool for planning and guiding a cement injection procedure for stabilizing a loosening hip prosthesis. HipRFX works by analysing a pre-operative CT and intraoperative C-arm

  19. Plagiarism Detection: A Comparison of Teaching Assistants and a Software Tool in Identifying Cheating in a Psychology Course

    Science.gov (United States)

    Seifried, Eva; Lenhard, Wolfgang; Spinath, Birgit

    2015-01-01

    Essays that are assigned as homework in large classes are prone to cheating via unauthorized collaboration. In this study, we compared the ability of a software tool based on Latent Semantic Analysis (LSA) and student teaching assistants to detect plagiarism in a large group of students. To do so, we took two approaches: the first approach was…
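    The LSA approach referenced above reduces documents to a low-rank semantic space before comparing them. A minimal sketch, assuming a tiny TF-IDF pipeline with a truncated SVD (the tool used in the study is not reproduced here; all names and the toy essays are illustrative):

```python
import numpy as np

def tfidf(docs):
    """Tiny TF-IDF matrix over whitespace tokens."""
    vocab = sorted({w for d in docs for w in d.split()})
    tf = np.array([[d.split().count(w) for w in vocab] for d in docs], float)
    df = (tf > 0).sum(axis=0)
    return tf * np.log(len(docs) / df)

def lsa_similarity(docs, k=2):
    """Pairwise cosine similarities in a k-dimensional latent space."""
    u, s, vt = np.linalg.svd(tfidf(docs), full_matrices=False)
    red = u[:, :k] * s[:k]                       # documents in latent space
    red /= np.linalg.norm(red, axis=1, keepdims=True)
    return red @ red.T

essays = ["the cat sat on the mat",
          "the cat sat on the mat quietly",     # near-copy
          "solar panels convert light to power"]
sim = lsa_similarity(essays)
print(sim[0, 1] > sim[0, 2])  # near-copies score higher than unrelated texts
```

A plagiarism screen would flag essay pairs whose latent-space similarity exceeds a calibrated threshold.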

  20. EPA's science blog: "It All Starts with Science"; Article title: "EPA's Solvent Substitution Software Tool, PARIS III"

    Science.gov (United States)

    EPA's solvent substitution software tool, PARIS III, is provided by the EPA for free and can be used effectively and efficiently to help environmentally-conscious individuals find better and greener solvent mixtures for many different common industrial processes. People can downlo...

  1. A Software Tool for Atmospheric Correction and Surface Temperature Estimation of Landsat Infrared Thermal Data

    Directory of Open Access Journals (Sweden)

    Benjamin Tardy

    2016-08-01

    Full Text Available Land surface temperature (LST) is an important variable involved in the Earth’s surface energy and water budgets and a key component in many aspects of environmental research. The Landsat program, jointly carried out by NASA and the USGS, has been recording thermal infrared data for the past 40 years. Nevertheless, LST data products for Landsat remain unavailable. The atmospheric correction (AC) method commonly used for mono-window Landsat thermal data requires detailed information concerning the vertical structure (temperature, pressure) and the composition (water vapor, ozone) of the atmosphere. For a given coordinate, this information is generally obtained through either radio-sounding or atmospheric model simulations and is passed to the radiative transfer model (RTM) to estimate the local atmospheric correction parameters. Although this approach yields accurate LST data, results are relevant only near this given coordinate. To meet the scientific community’s demand for high-resolution LST maps, we developed a new software tool dedicated to processing Landsat thermal data. The proposed tool improves on the commonly-used AC algorithm by incorporating spatial variations occurring in the Earth’s atmosphere composition. The ERA-Interim dataset (ECMWF meteorological organization) was used to retrieve vertical atmospheric conditions, which are available at a global scale with a resolution of 0.125 degrees and a temporal resolution of 6 h. A temporal and spatial linear interpolation of meteorological variables was performed to match the acquisition dates and coordinates of the Landsat images. The atmospheric correction parameters were then estimated on the basis of this reconstructed atmospheric grid using the commercial RTM software MODTRAN. The needed surface emissivity was derived from the common vegetation index NDVI, obtained from the red and near-infrared (NIR) bands of the same Landsat image. This permitted an estimation of LST for the entire
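    The emissivity-related step of such a processing chain can be sketched compactly: derive emissivity from NDVI with the common threshold method, then apply an emissivity correction to the at-sensor brightness temperature. This Python sketch is a simplified stand-in for the paper's full MODTRAN-based atmospheric correction; the coefficients and the wavelength are assumed, illustrative values.

```python
import math

def ndvi_emissivity(ndvi, e_soil=0.966, e_veg=0.973):
    """Emissivity from NDVI via the common threshold method (illustrative
    soil/vegetation coefficients and NDVI thresholds)."""
    if ndvi < 0.2:
        return e_soil
    if ndvi > 0.5:
        return e_veg
    pv = ((ndvi - 0.2) / (0.5 - 0.2)) ** 2      # vegetation proportion
    return e_soil + (e_veg - e_soil) * pv

def lst_from_bt(bt_kelvin, emissivity, wavelength_um=10.9):
    """Simplified emissivity correction of at-sensor brightness temperature:
    LST = BT / (1 + (lambda * BT / rho) * ln(emissivity)),
    with rho = h*c/k_B. Ignores the atmospheric terms handled by the RTM."""
    rho = 1.438e-2                              # h*c/k_B in m*K
    lam = wavelength_um * 1e-6
    return bt_kelvin / (1 + (lam * bt_kelvin / rho) * math.log(emissivity))

e = ndvi_emissivity(0.35)
print(round(lst_from_bt(295.0, e), 2))  # slightly above the raw 295 K BT
```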

  2. Mars, accessing the third dimension: a software tool to exploit Mars ground penetrating radars data.

    Science.gov (United States)

    Cantini, Federico; Ivanov, Anton B.

    2016-04-01

    The Mars Advanced Radar for Subsurface and Ionosphere Sounding (MARSIS), on board ESA's Mars Express, and the SHAllow RADar (SHARAD), on board NASA's Mars Reconnaissance Orbiter, are two ground penetrating radars (GPRs) designed to probe the crust of Mars and explore the subsurface structure of the planet. They have now been collecting data for about 10 years, covering a large fraction of the Martian surface. On Earth, GPRs collect data by sending electromagnetic (EM) pulses toward the surface and listening for the return echoes produced at dielectric discontinuities at and below the surface. The wavelengths used allow MARSIS EM pulses to penetrate the crust for several kilometers. The data products (radargrams) are matrices in which the x-axis spans different sampling points on the planet's surface and the y-axis is the power of the echoes over time in the listening window. No standard way of managing this kind of data is established in the planetary science community, and data analysis and interpretation very often require some knowledge of radar signal processing. Our software tool aims to ease access to these data, in particular for scientists without a specific background in signal processing. MARSIS and SHARAD geometrical data, such as probing-point latitude and longitude and spacecraft altitude, are stored, together with relevant acquisition metadata, in a geo-enabled relational database implemented using PostgreSQL and PostGIS. Data are extracted from the officially released ESA and NASA data using self-developed Python classes and scripts and inserted into the database using OGR utilities. This software is also intended to be the core of a collection of classes and scripts implementing more complex GPR data analysis. Geometrical data and metadata are exposed as WFS layers using a QGIS server, which can be further integrated with other data, such as imaging, spectroscopy and topography. Radar geometry data will be available as a part of the iMars Web
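A radargram as described above is simply a matrix of echo power (rows = fast-time samples, columns = along-track sampling points). The sketch below, using made-up toy values, shows the kind of basic signal handling the tool is meant to spare end users: converting power to decibels and picking the strongest echo in each column.

```python
import numpy as np

def to_db(power, floor=1e-12):
    """Convert linear echo power to decibels, clipping at a noise floor."""
    return 10.0 * np.log10(np.maximum(power, floor))

def first_strong_echo(radargram, threshold_db):
    """For each along-track column, return the fast-time index of the
    strongest echo above the threshold, or -1 if none qualifies.
    Rows are time samples; columns are surface sampling points."""
    db = to_db(radargram)
    idx = np.argmax(db, axis=0)
    peak = db[idx, np.arange(db.shape[1])]
    return np.where(peak >= threshold_db, idx, -1)

# toy radargram: 5 time samples x 3 along-track columns of noise power
rg = np.full((5, 3), 1e-10)
rg[2, 0] = 1e-3   # strong surface return in column 0
rg[4, 1] = 1e-4   # deeper return in column 1
picks = first_strong_echo(rg, threshold_db=-60.0)
```

Real MARSIS/SHARAD processing is more involved, but the data structure is exactly this matrix form.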

  3. Development and application of a new software tool for the basic design of flue gas cleaning processes

    Energy Technology Data Exchange (ETDEWEB)

    Schausberger, P.; Friedl, A. [Vienna Univ. of Technology, Inst. of Chemical Engineering, Group of Thermal Process Engineering and Simulation, Vienna (Austria); Wieland, A.; Reissner, H. [AE and E Austrian Energy and Environment AG, Flue Gas Cleaning Div., Raaba/Graz (Austria)

    2004-07-01

    The development of a new software tool designed to improve the basic engineering of flue-gas cleaning processes, and its specific application, is presented. The tool is based on the commercially available simulation tool IPSEpro, originating from the field of power engineering. Its modelling environment enables the enhancement of the existing content: the substances, streams and unit operations to be included are structured in an object-oriented manner, and the corresponding steady-state mass and heat balances are set up to yield a system of equations that is solved simultaneously. (orig.)
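As a minimal illustration of the approach, the steady-state balances of a flowsheet can be assembled into one linear system and solved simultaneously. The toy scrubber below is an invented example, not an IPSEpro model.

```python
import numpy as np

# Steady-state balances assembled as a linear system A @ x = b.
# Toy flowsheet (illustrative only): flue gas carrying 100 kg/h SO2
# enters a scrubber that captures 95 % of it in the slurry.
# Unknowns: x[0] = SO2 leaving with the clean gas, x[1] = SO2 captured.
A = np.array([
    [1.0, 1.0],   # overall SO2 mass balance: out_gas + captured = feed
    [1.0, 0.0],   # capture specification: out_gas = (1 - 0.95) * feed
])
b = np.array([100.0, 5.0])
x = np.linalg.solve(A, b)   # solve both balances simultaneously
```

A real flue-gas cleaning flowsheet yields the same structure, only with many more components, streams and (partly nonlinear) equations.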

  4. Improvement of a free software tool for the assessment of sediment connectivity

    Science.gov (United States)

    Crema, Stefano; Lanni, Cristiano; Goldin, Beatrice; Marchi, Lorenzo; Cavalli, Marco

    2015-04-01

    Sediment connectivity expresses the degree of linkage that controls sediment fluxes throughout the landscape, in particular between sediment sources and downstream areas. The assessment of sediment connectivity becomes a key issue when dealing with risk mitigation and priorities of intervention in the territory. In this work, the authors report the improvements made to an open-source, stand-alone application (SedInConnect, http://www.sedalp.eu/download/tools.shtml), along with extensive applications to alpine catchments. SedInConnect calculates a sediment connectivity index as expressed in Cavalli et al. (2013); the software improvements consisted primarily of the introduction of the sink feature, i.e. areas that act as traps for sediment produced upstream (e.g., lakes, sediment traps). Based on user-defined sinks, the software decouples those parts of the catchment that do not deliver sediment to a selected target of interest (e.g., fan apex, main drainage network). In this way, the assessment of sediment connectivity takes into consideration the effective sediment contributing areas. Sediment connectivity analysis has been carried out on several catchments in the South Tyrol alpine area (Northern Italy) with the goal of achieving a fast and objective characterization of the topographic control on sediment transfer. In addition to depicting the variability of sediment connectivity inside each basin, the index of connectivity has proved to be a valuable indicator of the dominant process characterizing the basin's sediment dynamics (debris flow, bedload, mixed behavior). The characterization of the dominant process is of great importance for hazard and risk assessment in mountain areas, and for the choice and design of structural and non-structural intervention measures. The recognition of the dominant sediment transport process by the index of connectivity is in agreement with evidence arising from post-event field surveys and with the application of
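The index of connectivity of Cavalli et al. (2013) mentioned above contrasts an upslope component with a downstream-path component. A minimal sketch with purely illustrative numbers follows; SedInConnect itself computes these terms cell-by-cell from raster DTMs.

```python
import math

def connectivity_index(w_up_mean, s_up_mean, area_up, downstream_path):
    """Index of connectivity in the form used by Cavalli et al. (2013):
        IC = log10(Dup / Ddn), with
        Dup = W_bar * S_bar * sqrt(A)   (upslope component)
        Ddn = sum(d_i / (W_i * S_i))    (downstream-path component)
    downstream_path is a list of (length_i, weight_i, slope_i) tuples.
    All values here are purely illustrative."""
    d_up = w_up_mean * s_up_mean * math.sqrt(area_up)
    d_dn = sum(d / (w * s) for d, w, s in downstream_path)
    return math.log10(d_up / d_dn)

# a cell with a short, steep path to the target is better connected
# than one whose sediment must travel a long flow path
ic_near = connectivity_index(0.8, 0.3, 1.0e4, [(10.0, 0.8, 0.3)])
ic_far = connectivity_index(0.8, 0.3, 1.0e4, [(10.0, 0.8, 0.3)] * 50)
```

The sink feature described above amounts to truncating `downstream_path` (and the contributing area) wherever a user-defined trap intercepts the flux.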

  5. Open Software Tools Applied to Jordan's National Multi-Agent Water Management Model

    Science.gov (United States)

    Knox, Stephen; Meier, Philipp; Harou, Julien; Yoon, Jim; Selby, Philip; Lachaut, Thibaut; Klassert, Christian; Avisse, Nicolas; Khadem, Majed; Tilmant, Amaury; Gorelick, Steven

    2016-04-01

    Jordan is the fourth most water-scarce country in the world, where demand exceeds supply in a politically and demographically unstable context. The Jordan Water Project (JWP) aims to perform policy evaluation by modelling the hydrology, economics, and governance of Jordan's water resource system. The multidisciplinary nature of the project requires a modelling software system capable of integrating submodels from multiple disciplines into a single decision-making process and communicating results to stakeholders. This requires a tool for building an integrated model and a system in which diverse data sets can be managed and visualised. The integrated Jordan model is built using Pynsim, an open-source multi-agent simulation framework implemented in Python. Pynsim operates on network structures of nodes and links and supports institutional hierarchies, where an institution represents a grouping of nodes, links or other institutions. At each time step, code within each node, link and institution can be executed independently, allowing for their fully autonomous behaviour. Additionally, engines (sub-models) perform actions over the entire network or on a subset of the network, such as making a decision over a set of nodes. Pynsim is modular in design, allowing distinct modules to be modified easily without affecting others. Data management and visualisation are performed using Hydra (www.hydraplatform.org), an open software platform allowing users to manage network structure and data. The Hydra data manager connects to Pynsim, providing the necessary input parameters for the integrated model. By providing a high-level portal to the model, Hydra removes a barrier between the users of the model (researchers, stakeholders, planners, etc.) and the model itself, allowing them to manage data, run the model and visualise results all through a single user interface. Pynsim's ability to represent institutional hierarchies, inter-network communication and the separation of node, link and
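The node/link/institution design described above can be sketched in a few lines. The classes and method names below are a simplified stand-in for illustration, not Pynsim's actual API.

```python
class Node:
    """A network agent with its own per-timestep behaviour."""
    def __init__(self, name, storage=0.0):
        self.name, self.storage = name, storage
    def setup(self, timestep):
        self.storage += 1.0   # e.g. an inflow of 1 unit per step

class Institution:
    """An institution groups nodes and can act on all of them."""
    def __init__(self, name, nodes):
        self.name, self.nodes = name, nodes
    def setup(self, timestep):
        for n in self.nodes:             # e.g. a utility caps storage
            n.storage = min(n.storage, 2.0)

class Simulator:
    """Runs every agent's code once per timestep, independently."""
    def __init__(self, agents, timesteps):
        self.agents, self.timesteps = agents, timesteps
    def run(self):
        for t in range(self.timesteps):
            for agent in self.agents:    # nodes first, then institutions
                agent.setup(t)

a, b = Node("res_a"), Node("res_b")
utility = Institution("water_utility", [a, b])
Simulator([a, b, utility], timesteps=5).run()
```

An engine in Pynsim's sense would slot in alongside the institution: a sub-model that operates over the whole network, or over a subset of it, at each step.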

  6. Sustaining an Online, Shared Community Resource for Models, Robust Open source Software Tools and Data for Volcanology - the Vhub Experience

    Science.gov (United States)

    Patra, A. K.; Valentine, G. A.; Bursik, M. I.; Connor, C.; Connor, L.; Jones, M.; Simakov, N.; Aghakhani, H.; Jones-Ivey, R.; Kosar, T.; Zhang, B.

    2015-12-01

    Over the last 5 years we have created the community collaboratory Vhub.org [Palma et al., J. App. Volc. 3:2, doi:10.1186/2191-5040-3-2] as a place to find volcanology-related resources, a venue for users to disseminate tools, teaching resources and data, and an online platform to support collaborative efforts. As the community (current active users > 6000, from an estimated community of comparable size) has embedded the collaboratory's tools into educational and research workflows, it became imperative to: a) redesign tools into robust, open-source, reusable software for online and offline usage and enhancement; b) share large datasets with remote collaborators and other users seamlessly and securely; c) support complex workflows for uncertainty analysis, validation and verification, and data assimilation with large data. The focus on tool development and redevelopment has been twofold: first, to use best practices in software engineering and new hardware such as multi-core and graphics processing units; second, to enhance capabilities to support inverse modeling, uncertainty quantification using large ensembles and design of experiments, calibration, and validation. The software engineering practices we follow include open sourcing to facilitate community contributions, modularity and reusability. Our initial targets are four popular tools on Vhub - TITAN2D, TEPHRA2, PUFF and LAVA. Use of tools like these requires many observation-driven data sets, e.g. digital elevation models of topography, satellite imagery, field observations on deposits, etc. These data are often maintained in private repositories and shared by "sneaker-net". As a partial solution to this, we tested mechanisms using iRODS software for online sharing of private data with public metadata and access limits. Finally, we adapted workflow engines (e.g. Pegasus) to support the complex data and computing workflows needed for usage like uncertainty quantification for hazard analysis using physical

  7. Exploiting Patterns and Tool Support for Reusable and Automated Change Support for Software Architectures

    OpenAIRE

    Aakash Ahmad; Claus Pahl; Fawad Khaliq; Onaiza Maqbool; Pooyan Jamshidi

    2016-01-01

    Lehman's law of continuing change implies that software must continually evolve to accommodate frequently changing requirements in existing systems. Also, maintainability as an attribute of system quality requires that changes be systematically implemented in existing software throughout its lifecycle. To support continuous software evolution, the primary challenges include (i) enhancing the reuse of recurring changes; and (ii) decreasing the effort required for change implementation. We propose...

  8. Genoviz Software Development Kit: Java tool kit for building genomics visualization applications

    OpenAIRE

    Chervitz Stephen A; Blanchard Steven G; Erwin Ed; Blossom Eric; Nicol John W; Helt Gregg A; Harmon Cyrus; Loraine Ann E

    2009-01-01

    Background: Visualization software can expose previously undiscovered patterns in genomic data and advance biological science. Results: The Genoviz Software Development Kit (SDK) is an open source, Java-based framework designed for rapid assembly of visualization software applications for genomics. The Genoviz SDK framework provides a mechanism for incorporating adaptive, dynamic zooming into applications, a desirable feature of genome viewers. Visualization capabilities of the Genoviz...

  9. Development of an open source software tool for propeller design in the MAAT project

    OpenAIRE

    Morgado, João Paulo Salgueiro

    2016-01-01

    This thesis presents the development of a new propeller design and analysis software capable of adequately predicting the low Reynolds number performance. JBLADE software was developed from QBLADE and XFLR5 and it uses an improved version of Blade Element Momentum (BEM) theory that embeds a new model for the three-dimensional flow equilibrium. The software allows the introduction of the blade geometry as an arbitrary number of sections characterized by their radial position, chord, twist, len...

  10. Development of a Kinect Software Tool to Classify Movements during Active Video Gaming.

    Directory of Open Access Journals (Sweden)

    Michael Rosenberg

    Full Text Available While it has been established that using full body motion to play active video games results in increased levels of energy expenditure, there is little information on the classification of human movement during active video game play in relationship to fundamental movement skills. The aim of this study was to validate software utilising Kinect sensor motion capture technology to recognise fundamental movement skills (FMS), during active video game play. Two human assessors rated jumping and side-stepping and these assessments were compared to the Kinect Action Recognition Tool (KART), to establish a level of agreement and determine the number of movements completed during five minutes of active video game play, for 43 children (m = 12 years 7 months ± 1 year 6 months). During five minutes of active video game play, inter-rater reliability, when examining the two human raters, was found to be higher for the jump (r = 0.94, p < .01) than the sidestep (r = 0.87, p < .01), although both were excellent. Excellent reliability was also found between human raters and the KART system for the jump (r = 0.84, p < .01) and moderate reliability for the sidestep (r = 0.6983, p < .01) during game play, demonstrating that both humans and KART had higher agreement for jumps than sidesteps in the game play condition. The results of the study provide confidence that the Kinect sensor can be used to count the number of jumps and sidesteps during five minutes of active video game play with a similar level of accuracy as human raters. However, in contrast to humans, the KART system required a fraction of the time to analyse and tabulate the results.
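The inter-rater agreement reported above is a Pearson correlation between the movement counts of two raters. A self-contained sketch, using hypothetical jump counts rather than the study's data:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# hypothetical jump counts from two human raters for five children
rater_1 = [12, 15, 9, 20, 14]
rater_2 = [11, 16, 9, 19, 15]
r = pearson_r(rater_1, rater_2)
```

The same computation applied to human-vs-KART counts gives the human/machine agreement figures quoted in the abstract.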

  11. Biomedical Mutation Analysis (BMA): A software tool for analyzing mutations associated with antiviral resistance

    Science.gov (United States)

    Salvatierra, Karina; Florez, Hector

    2016-01-01

    Introduction: Hepatitis C virus (HCV) is considered a major public health problem, with 200 million people infected worldwide. The treatment for chronic HCV infection with pegylated interferon alpha plus ribavirin is non-specific; consequently, the treatment is effective in only 50% of infected patients. This has prompted the development of direct-acting antivirals (DAA) that target virus proteins. These DAA have demonstrated a potent effect in vitro and in vivo; however, virus mutations associated with the development of resistance have been described. Objective: To design and develop an online information system for detecting mutations in amino acids known to be implicated in resistance to DAA. Materials and methods: We have used computer applications, technological tools, standard languages, infrastructure systems and algorithms to analyze positions associated with resistance to DAA for the NS3, NS5A, and NS5B genes of HCV. Results: We have designed and developed an online information system named Biomedical Mutation Analysis (BMA), which allows users to calculate changes in nucleotide and amino acid sequences for each selected sequence from conventional Sanger and cloning sequencing using a graphical interface. Conclusion: BMA quickly, easily and effectively analyzes mutations, including complete documentation and examples. Furthermore, the development of different visualization techniques allows proper interpretation and understanding of the results. The data obtained using BMA will be useful for the assessment and surveillance of HCV resistance to new antivirals, and for planning treatment regimens by selecting those DAA to which the virus is not resistant, avoiding unnecessary treatment failures. The software is available at: http://bma.itiud.org.
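The core of such an analysis, comparing a sample sequence against a reference and flagging amino-acid substitutions at resistance-associated positions, can be sketched as below. The codon-table subset, sequences and "resistance position" are invented for illustration and are not BMA's actual data.

```python
CODON_TABLE = {  # minimal subset of the standard genetic code
    "GTT": "V", "GCT": "A", "GAT": "D", "ATG": "M", "CGT": "R",
}

def translate(seq):
    """Translate a nucleotide sequence (length divisible by 3) to amino acids."""
    return "".join(CODON_TABLE[seq[i:i + 3]] for i in range(0, len(seq), 3))

def find_mutations(reference_nt, sample_nt, resistance_positions):
    """Report amino-acid substitutions (1-based positions) in the sample,
    flagging those at known resistance-associated positions."""
    ref_aa, sam_aa = translate(reference_nt), translate(sample_nt)
    muts = []
    for pos, (r, s) in enumerate(zip(ref_aa, sam_aa), start=1):
        if r != s:
            muts.append((f"{r}{pos}{s}", pos in resistance_positions))
    return muts

# hypothetical gene fragment; position 2 is 'resistance-associated' here
ref = "ATG" + "GTT" + "GAT"   # translates to M V D
sam = "ATG" + "GCT" + "GAT"   # translates to M A D
muts = find_mutations(ref, sam, resistance_positions={2})
```

The output uses the conventional mutation notation (reference residue, position, sample residue), e.g. `V2A`.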

  12. A flexible, interactive software tool for fitting the parameters of neuronal models

    Directory of Open Access Journals (Sweden)

    Péter Friedrich

    2014-07-01

    Full Text Available The construction of biologically relevant neuronal models, as well as model-based analysis of experimental data, often requires the simultaneous fitting of multiple model parameters, so that the behavior of the model in a certain paradigm matches (as closely as possible) the corresponding output of a real neuron according to some predefined criterion. Although the task of model optimization is often computationally hard, and the quality of the results depends heavily on technical issues such as the appropriate choice (and implementation) of cost functions and optimization algorithms, no existing program provides access to the best available methods while also guiding the user through the process effectively. Our software, called Optimizer, implements a modular and extensible framework for the optimization of neuronal models, and also features a graphical interface which makes it easy for even non-expert users to handle many commonly occurring scenarios. Meanwhile, educated users can extend the capabilities of the program and customize it according to their needs with relatively little effort. Optimizer has been developed in Python, takes advantage of open-source Python modules for nonlinear optimization, and interfaces directly with the NEURON simulator to run the models. Other simulators are supported through an external interface. We have tested the program on several different types of problems of varying complexity, using different model classes. As targets, we used simulated traces from the same or a more complex model class, as well as experimental data. We successfully used Optimizer to determine passive parameters and conductance densities in compartmental models, and to fit simple (adaptive exponential) integrate-and-fire neuronal models to complex biological data. Our detailed comparisons show that Optimizer can handle a wider range of problems, and delivers equally good or better performance than any other existing neuronal model fitting
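The fitting task Optimizer automates, minimizing a cost function over model parameters so a simulated trace matches a target, can be reduced to a toy example. The exponential-decay "model" and brute-force parameter scan below are illustrative stand-ins for the real neuronal models and the nonlinear optimization algorithms the program wraps.

```python
import numpy as np

def model(t, tau):
    """Toy membrane-voltage-like trace: exponential decay with time constant tau."""
    return np.exp(-t / tau)

def fit_tau(t, target, candidates):
    """Minimise a sum-of-squared-error cost over candidate time constants;
    a crude stand-in for the optimization algorithms Optimizer provides."""
    costs = [np.sum((model(t, tau) - target) ** 2) for tau in candidates]
    return candidates[int(np.argmin(costs))]

t = np.linspace(0.0, 5.0, 50)
target = model(t, tau=1.5)    # the 'experimental' trace, noise-free here
best = fit_tau(t, target, candidates=np.arange(0.5, 3.01, 0.1))
```

In the real setting the cost function, its implementation and the choice of optimizer all matter, which is exactly the difficulty the abstract points out.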

  13. Reviewing the impact of problem structure on planning: a software tool for analyzing tower tasks.

    Science.gov (United States)

    Kaller, Christoph P; Rahm, Benjamin; Köstering, Lena; Unterrainer, Josef M

    2011-01-01

    Cognitive, clinical, and neuroimaging studies on planning abilities most frequently implement the Tower of London task or one of its variants. Yet, cumulating evidence from a series of experiments suggests that the commonly used approximation of problem difficulty in terms of the minimum number of moves for goal attainment is too coarse a measure for the underlying cognitive operations, and in some cases may even be misleading. Rather, problem difficulty can be more specifically characterized by a set of structural task parameters such as the number and nature of optimal and suboptimal solution paths, the required search depths, the patterns of intermediate and goal moves, goal hierarchies and the associated degree of ambiguity in the sequential ordering of goal moves. First applications in developmental and patient studies have proven fruitful in targeting fundamental alterations of planning abilities in healthy and clinical conditions. In addition, recent evidence from neuroimaging shows that manipulations of problem structure relate to separate cognitive and neural processes and are accompanied by dissociable brain activation patterns. Here, we briefly review these structural problem parameters and the concepts behind them. As controlling for task parameters and selecting a balanced problem set is a complex and error-prone endeavor, we further present TowerTool, a software solution that allows easy access to in-depth analysis of the problem structure of widely used planning tasks like the Tower of London, the Tower of Hanoi, and their variants. Thereby, we hope to encourage and facilitate the implementation of structurally balanced task sets in future studies on planning and to promote transfer between the cognitive, developmental, and clinical neurosciences. PMID:20723568
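One of the structural parameters discussed above, the minimum number of moves, can be computed by breadth-first search over the task's state space. The sketch below uses the classic Tower of London configuration (peg capacities 3, 2, 1); it is an illustration of the technique, not TowerTool's implementation.

```python
from collections import deque

CAPACITY = (3, 2, 1)  # Tower of London peg capacities

def moves(state):
    """All states reachable by moving one top ball to a peg with room.
    A state is a tuple of pegs; each peg is a tuple with the top ball last."""
    out = []
    for src in range(3):
        if not state[src]:
            continue
        ball = state[src][-1]
        for dst in range(3):
            if dst != src and len(state[dst]) < CAPACITY[dst]:
                nxt = [list(p) for p in state]
                nxt[src].pop()
                nxt[dst].append(ball)
                out.append(tuple(tuple(p) for p in nxt))
    return out

def min_moves(start, goal):
    """Breadth-first search for the minimum number of moves."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        state, depth = queue.popleft()
        if state == goal:
            return depth
        for nxt in moves(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, depth + 1))
    return -1

start = (("R", "G", "B"), (), ())
goal = (("R",), ("G", "B"), ())
n = min_moves(start, goal)
```

Extending the same search to enumerate *all* shortest paths gives the richer parameters the paper emphasises: the number of optimal and suboptimal solution paths and the required search depth.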

  14. Development of a Kinect Software Tool to Classify Movements during Active Video Gaming.

    Science.gov (United States)

    Rosenberg, Michael; Thornton, Ashleigh L; Lay, Brendan S; Ward, Brodie; Nathan, David; Hunt, Daniel; Braham, Rebecca

    2016-01-01

    While it has been established that using full body motion to play active video games results in increased levels of energy expenditure, there is little information on the classification of human movement during active video game play in relationship to fundamental movement skills. The aim of this study was to validate software utilising Kinect sensor motion capture technology to recognise fundamental movement skills (FMS), during active video game play. Two human assessors rated jumping and side-stepping and these assessments were compared to the Kinect Action Recognition Tool (KART), to establish a level of agreement and determine the number of movements completed during five minutes of active video game play, for 43 children (m = 12 years 7 months ± 1 year 6 months). During five minutes of active video game play, inter-rater reliability, when examining the two human raters, was found to be higher for the jump (r = 0.94, p < .01) than the sidestep (r = 0.87, p < .01), although both were excellent. Excellent reliability was also found between human raters and the KART system for the jump (r = 0.84, p < .01) and moderate reliability for the sidestep (r = 0.6983, p < .01) during game play, demonstrating that both humans and KART had higher agreement for jumps than sidesteps in the game play condition. The results of the study provide confidence that the Kinect sensor can be used to count the number of jumps and sidesteps during five minutes of active video game play with a similar level of accuracy as human raters. However, in contrast to humans, the KART system required a fraction of the time to analyse and tabulate the results. PMID:27442437

  15. Development of a Kinect Software Tool to Classify Movements during Active Video Gaming

    Science.gov (United States)

    Rosenberg, Michael; Lay, Brendan S.; Ward, Brodie; Nathan, David; Hunt, Daniel; Braham, Rebecca

    2016-01-01

    While it has been established that using full body motion to play active video games results in increased levels of energy expenditure, there is little information on the classification of human movement during active video game play in relationship to fundamental movement skills. The aim of this study was to validate software utilising Kinect sensor motion capture technology to recognise fundamental movement skills (FMS), during active video game play. Two human assessors rated jumping and side-stepping and these assessments were compared to the Kinect Action Recognition Tool (KART), to establish a level of agreement and determine the number of movements completed during five minutes of active video game play, for 43 children (m = 12 years 7 months ± 1 year 6 months). During five minutes of active video game play, inter-rater reliability, when examining the two human raters, was found to be higher for the jump (r = 0.94, p < .01) than the sidestep (r = 0.87, p < .01), although both were excellent. Excellent reliability was also found between human raters and the KART system for the jump (r = 0.84, p < .01) and moderate reliability for the sidestep (r = 0.6983, p < .01) during game play, demonstrating that both humans and KART had higher agreement for jumps than sidesteps in the game play condition. The results of the study provide confidence that the Kinect sensor can be used to count the number of jumps and sidesteps during five minutes of active video game play with a similar level of accuracy as human raters. However, in contrast to humans, the KART system required a fraction of the time to analyse and tabulate the results. PMID:27442437

  16. The NEPLAN software package a universal tool for electric power systems analysis

    CERN Document Server

    Kahle, K

    2002-01-01

    The NEPLAN software package has been used by CERN's Electric Power Systems Group since 1997. The software is designed for the calculation of short-circuit currents, load flow, motor start, dynamic stability, harmonic analysis and harmonic filter design. This paper describes the main features of the software package and their application to CERN's electric power systems. The implemented models of CERN's power systems are described in detail. Particular focus is given to fault calculations, harmonic analysis and filter design. Based on this software package and the CERN power network model, several recommendations are given.

  17. A new online software tool for pressure ulcer monitoring as an educational instrument for unified nursing assessment in clinical settings

    Directory of Open Access Journals (Sweden)

    Andrea Pokorná

    2016-07-01

    Full Text Available Data collection and the evaluation of those data are crucial for effective quality management, and naturally also for the prevention and treatment of pressure ulcers. Data collected in a uniform manner by nurses in clinical practice could be used for further analyses. Data about pressure ulcers are collected to differing degrees of quality, based on the local policy of the given health care facility and in relation to the nurse’s actual level of knowledge concerning pressure ulcer identification and the use of objective scales (i.e. categorization of pressure ulcers). Therefore, we have developed software suitable for data collection which includes educational tools to promote unified reporting of data by nurses. A description of this software and some of its educational and learning components is presented herein. The planned process of clinical application of the newly developed software is also briefly mentioned. The discussion is focused on the usability of the online reporting tool and possible further development of the tool.

  18. Development of the software tool for generation and visualization of the finite element head model with bone conduction sounds

    Science.gov (United States)

    Nikolić, Dalibor; Milošević, Žarko; Saveljić, Igor; Filipović, Nenad

    2015-12-01

    Vibration of the skull causes a hearing sensation; we call it Bone Conduction (BC) sound. There have been several investigations of the transmission properties of bone-conducted sound. The aim of this study was to develop a software tool for easy generation of a finite element (FE) model of the human head with different materials, based on human head anatomy, and to calculate sound conduction through the head. The developed software tool generates a model in a few steps. The first step is to perform segmentation of CT medical images (DICOM) and to generate surface mesh files (STL). Each STL file represents a different layer of the human head with different material properties (brain, CSF, different layers of the skull bone, skin, etc.). The next steps are to create a tetrahedral mesh from the obtained STL files, to define the FE model boundary conditions and to solve the FE equations. This tool uses the PAK solver, an open-source solver implemented in the SIFEM FP7 project, for calculations of the head vibration. The purpose of this tool is to show the impact of bone-conducted sound on the hearing system and to assess how well the obtained results match experimental measurements.

  19. Software sensors as a tool for optimization of animal-cell cultures.

    NARCIS (Netherlands)

    Dorresteijn, P.C.

    1997-01-01

    In this thesis, software sensors are introduced that predict the biomass activity and the concentrations of glucose, glutamine, lactic acid, and ammonium on line. The software sensors for biomass activity, glucose and lactic acid can be applied to any type of animal cell that is grown in a bioreactor

  20. Using McIDAS-V data analysis and visualization software as an educational tool for understanding the atmosphere

    Science.gov (United States)

    Achtor, T. H.; Rink, T.

    2010-12-01

    The University of Wisconsin’s Space Science and Engineering Center (SSEC) has been at the forefront of developing data analysis and visualization tools for environmental satellites and other geophysical data. The fifth generation of the Man-computer Interactive Data Access System (McIDAS-V) is Java-based, open-source, freely available software that operates on Linux, Macintosh and Windows systems. The software tools provide powerful new data manipulation and visualization capabilities that work with geophysical data in research, operational and educational environments. McIDAS-V provides unique capabilities to support innovative techniques for evaluating research results, teaching and training. McIDAS-V is based on three powerful software elements. VisAD is a Java library for building interactive, collaborative, 4-dimensional visualization and analysis tools. The Integrated Data Viewer (IDV) is a reference application based on the VisAD system and developed by the Unidata program that demonstrates the flexibility needed in this evolving environment, using a modern, object-oriented software design approach. The third tool, HYDRA, allows users to build, display and interrogate multi- and hyperspectral environmental satellite data in powerful ways. The McIDAS-V software is being used for training and education in several settings. The McIDAS User Group provides training workshops at its annual meeting. Numerous online tutorials with training data sets have been developed to aid users in learning simple and more complex operations in McIDAS-V; all are available online. In a University of Wisconsin-Madison undergraduate course in Radar and Satellite Meteorology, McIDAS-V is used to create and deliver laboratory exercises using case study and real-time data. At the high school level, McIDAS-V is used in several exercises in our annual Summer Workshop in Earth and Atmospheric Sciences to provide young scientists the opportunity to examine data with friendly and

  1. The Seismic Tool-Kit (STK): an open source software for seismology and signal processing.

    Science.gov (United States)

    Reymond, Dominique

    2016-04-01

    We present an open-source software project (GNU General Public License) named STK: Seismic ToolKit, dedicated mainly to seismology and signal processing. The STK project, started in 2007, is hosted by SourceForge.net and counts more than 19,500 downloads at the time of writing. It is composed of two main branches. The first is a graphical interface dedicated to signal processing in the SAC format (SAC_ASCII and SAC_BIN), where the signal can be plotted, zoomed, filtered, integrated, differentiated, etc. (a large variety of IIR and FIR filters is proposed). The spectral density of the signal is estimated via the Fourier transform, with visualization of the power spectral density (PSD) on a linear or logarithmic scale, as well as an evolutive time-frequency representation (sonagram). Three-component signals can also be processed to estimate their polarization properties, either for a given window or for evolutive windows along the time axis. This polarization analysis is useful for extracting polarized noise and for differentiating P waves, Rayleigh waves, Love waves, etc. The second branch is a panel of utility programs for working in terminal mode, with basic programs for computing azimuth and distance in spherical geometry, inter/auto-correlation, spectral density, time-frequency analysis for an entire directory of signals, focal planes and principal component axes, the radiation pattern of P waves, polarization analysis of different waves (including noise), under/over-sampling of signals, cubic-spline smoothing, and linear/non-linear regression analysis of data sets. A MINimum library of Linear AlGebra (MIN-LINAG) is also provided for the main matrix operations: QR/QL decomposition, Cholesky solution of linear systems, eigenvalue/eigenvector computation, and QR-solve/Eigen-solve of systems of linear equations. STK is developed in C/C++, mainly under Linux, and has also been partially ported to MS-Windows. Useful links: http
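The spectral-density estimation described in this record is a standard operation. The sketch below (not STK code, which is C/C++) shows the same kind of PSD estimate on a synthetic trace using SciPy's Welch method; the sampling rate and test signal are assumptions for illustration.

```python
import numpy as np
from scipy.signal import welch

fs = 100.0  # sampling rate in Hz (assumed)
t = np.arange(0, 60, 1 / fs)
# synthetic "seismic" trace: a 5 Hz tone buried in white noise
x = np.sin(2 * np.pi * 5.0 * t) + 0.5 * np.random.default_rng(0).normal(size=t.size)

# Welch periodogram: averaged FFT segments give the power spectral density
f, psd = welch(x, fs=fs, nperseg=1024)
peak = f[np.argmax(psd)]  # frequency of the dominant spectral line (~5 Hz here)
```

Plotting `psd` against `f` on a log scale reproduces the linear/log PSD view the STK interface offers.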

  2. ROCKETSHIP: a flexible and modular software tool for the planning, processing and analysis of dynamic MRI studies

    International Nuclear Information System (INIS)

    Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is a promising technique to characterize pathology and evaluate treatment response. However, analysis of DCE-MRI data is complex and benefits from concurrent analysis of multiple kinetic models and parameters. Few software tools are currently available that specifically focus on DCE-MRI analysis with multiple kinetic models. Here, we developed ROCKETSHIP, an open-source, flexible and modular software tool for DCE-MRI analysis. ROCKETSHIP incorporates analyses with multiple kinetic models, including data-driven nested model analysis. ROCKETSHIP was implemented using the MATLAB programming language. Robustness of the software in providing reliable fits with multiple kinetic models is demonstrated using simulated data. Simulations also demonstrate the utility of the data-driven nested model analysis. Applicability of ROCKETSHIP to both preclinical and clinical studies is shown using DCE-MRI studies of the human brain and a murine tumor model. A DCE-MRI software suite was implemented and tested using simulations, and its applicability to both preclinical and clinical datasets is shown. ROCKETSHIP was designed to be easily accessible for the beginner, yet flexible enough for changes or additions to be made by the advanced user. The availability of a flexible analysis tool will aid future studies using DCE-MRI. A public release of ROCKETSHIP is available at https://github.com/petmri/ROCKETSHIP.
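ROCKETSHIP itself is a MATLAB package; as a hedged illustration of the kind of kinetic-model fitting described, the sketch below fits the standard Tofts model (one of the models such tools typically include) to a synthetic tissue curve in Python. The arterial input function, time grid, and parameter values are invented for the example.

```python
import numpy as np
from scipy.optimize import curve_fit

t = np.linspace(0, 5, 200)          # time in minutes
cp = 5.0 * t * np.exp(-t / 0.8)     # assumed arterial input function (arbitrary shape)
dt = t[1] - t[0]

def tofts(t, ktrans, ve):
    # standard Tofts model: Ct(t) = Ktrans * Cp(t) convolved with exp(-Ktrans/ve * t)
    irf = ktrans * np.exp(-(ktrans / ve) * t)
    return np.convolve(cp, irf)[: t.size] * dt

# synthesize a tissue curve with known parameters, then recover them by fitting
ct = tofts(t, 0.25, 0.4) + 0.001 * np.random.default_rng(1).normal(size=t.size)
(ktrans_fit, ve_fit), _ = curve_fit(
    tofts, t, ct, p0=[0.1, 0.2], bounds=([1e-3, 1e-3], [2.0, 1.0])
)
```

With low noise the fit recovers Ktrans ≈ 0.25 and ve ≈ 0.4; a nested-model analysis would repeat such fits with models of increasing complexity and compare them statistically.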

  3. A Survey on Usage and Diffusion of Project Risk Management Techniques and Software Tools in the Construction Industry

    OpenAIRE

    Thaheem, Muhammad Jamaluddin; Marco, Alberto

    2013-01-01

    The area of Project Risk Management (PRM) has been extensively researched, and the utilization of various tools and techniques for managing risk in several industries has been sufficiently reported. Formal and systematic PRM practices have been made available for the construction industry. Based on this body of knowledge, this paper attempts to draw a global picture of PRM practices and approaches with the help of a survey looking into the usage of PRM techniques and diffusion of software ...

  4. Towards a Software Tool Supporting Urban Decision Makers in Locating and Sizing the Household Garbage Accumulation Points Within Cities

    OpenAIRE

    Di Felice, Paolino

    2015-01-01

    Locating and sizing garbage bins for the separate accumulation of household solid waste within urban areas is of primary interest for the local administrations that so far lack adequate IT support. The paper highlights the versatility of a method for solving such a problem, which involves both standard and geographic data. Implementation of the proposal, centered around a spatial database, goes in the direction of developing a supporting software tool to the officials responsible for the mana...

  5. Methods and tools for dynamic requirements catalog management in agile software development

    OpenAIRE

    Tkachuk, M. V.; Gamzaev, R. A.; Martinkus, I. O.; Ianushkevych, S. D.

    2015-01-01

    A method for managing a dynamic requirements catalog in agile software development, exemplified with the Scrum methodology, is proposed, and popular approaches to solving this problem are reviewed. The proposed approach is based on the combined use of latent semantic analysis and the analytic hierarchy process; it allows a given textual software specification to be evaluated with respect to possible redundancy and logical conflicts. Besides that, this approach supports the decisi...

  6. GUM2DFT—a software tool for uncertainty evaluation of transient signals in the frequency domain

    International Nuclear Information System (INIS)

    The Fourier transform and its counterpart for discrete time signals, the discrete Fourier transform (DFT), are common tools in measurement science and application. Although almost every scientific software package offers ready-to-use implementations of the DFT, the propagation of uncertainties in line with the guide to the expression of uncertainty in measurement (GUM) is typically neglected. This is of particular importance in dynamic metrology, when input estimation is carried out by deconvolution in the frequency domain. To this end, we present the new open-source software tool GUM2DFT, which utilizes closed formulas for the efficient propagation of uncertainties for the application of the DFT, inverse DFT and input estimation in the frequency domain. It handles different frequency domain representations, accounts for autocorrelation and takes advantage of the symmetry inherent in the DFT result for real-valued time domain signals. All tools are presented in terms of examples which form part of the software package. GUM2DFT will foster GUM-compliant evaluation of uncertainty in a DFT-based analysis and enable metrologists to include uncertainty evaluations in their routine work. (paper)

  7. GUM2DFT—a software tool for uncertainty evaluation of transient signals in the frequency domain

    Science.gov (United States)

    Eichstädt, S.; Wilkens, V.

    2016-05-01

    The Fourier transform and its counterpart for discrete time signals, the discrete Fourier transform (DFT), are common tools in measurement science and application. Although almost every scientific software package offers ready-to-use implementations of the DFT, the propagation of uncertainties in line with the guide to the expression of uncertainty in measurement (GUM) is typically neglected. This is of particular importance in dynamic metrology, when input estimation is carried out by deconvolution in the frequency domain. To this end, we present the new open-source software tool GUM2DFT, which utilizes closed formulas for the efficient propagation of uncertainties for the application of the DFT, inverse DFT and input estimation in the frequency domain. It handles different frequency domain representations, accounts for autocorrelation and takes advantage of the symmetry inherent in the DFT result for real-valued time domain signals. All tools are presented in terms of examples which form part of the software package. GUM2DFT will foster GUM-compliant evaluation of uncertainty in a DFT-based analysis and enable metrologists to include uncertainty evaluations in their routine work.
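GUM2DFT propagates uncertainties through the DFT with closed formulas; a common cross-check, in the spirit of GUM Supplement 1, is Monte Carlo propagation. The sketch below (not GUM2DFT code) draws noisy realizations of a signal with an assumed white-noise standard uncertainty, applies the DFT to each, and compares the empirical spread of the real part against the analytic value u·√(N/2), which holds for i.i.d. input noise at the interior frequency bins.

```python
import numpy as np

rng = np.random.default_rng(42)
n, runs = 64, 2000
t = np.arange(n) / n
x = np.sin(2 * np.pi * 3 * t)       # noise-free time-domain signal
ux = 0.05                           # assumed standard uncertainty per sample

# GUM Supplement 1 style Monte Carlo: draw signals, apply the DFT, and take the
# empirical standard deviation of the real part across draws
draws = x + ux * rng.normal(size=(runs, n))
spectra = np.fft.rfft(draws, axis=1)
u_re = spectra.real.std(axis=0, ddof=1)

# analytic check: for i.i.d. input noise, Var[Re X_k] = ux^2 * n/2 for 0 < k < n/2
expected = ux * np.sqrt(n / 2)
```

Closed formulas, as implemented in GUM2DFT, give the same result without the sampling cost, which matters for long signals and full covariance propagation.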

  8. Particle Loss Calculator – a new software tool for the assessment of the performance of aerosol inlet systems

    Directory of Open Access Journals (Sweden)

    S.-L. von der Weiden

    2009-09-01

    Full Text Available Most aerosol measurements require an inlet system to transport aerosols from a select sampling location to a suitable measurement device through some length of tubing. Such inlet systems must be optimized to minimize aerosol sampling artifacts and maximize sampling efficiency. In this study we introduce a new multifunctional software tool (Particle Loss Calculator, PLC) that can be used to quickly determine aerosol sampling efficiency and particle transport losses due to passage through arbitrary tubing systems. The software employs relevant empirical and theoretical relationships found in established literature and accounts for the most important sampling and transport effects that might be encountered during deployment of typical, ground-based ambient aerosol measurements through a constant-diameter sampling probe. The software treats non-isoaxial and non-isokinetic aerosol sampling, aerosol diffusion and sedimentation as well as turbulent inertial deposition and inertial deposition in bends and contractions of tubing. This software was validated through comparison with experimentally determined particle losses for several tubing systems bent to create various diffusion, sedimentation and inertial deposition properties. As long as the tube geometries are not "too extreme", agreement is satisfactory. We discuss the conclusions of these experiments, the limitations of the software and present three examples of the use of the Particle Loss Calculator in the field.

  9. Particle Loss Calculator – a new software tool for the assessment of the performance of aerosol inlet systems

    Directory of Open Access Journals (Sweden)

    S.-L. von der Weiden

    2009-04-01

    Full Text Available Most aerosol measurements require an inlet system to transport aerosols from a select sampling location to a suitable measurement device through some length of tubing. Such inlet systems must be optimized to minimize aerosol sampling artifacts and maximize sampling efficiency. In this study we introduce a new multifunctional software tool (Particle Loss Calculator, PLC) that can be used to quickly determine aerosol sampling efficiency and particle transport losses due to passage through arbitrary tubing systems. The software employs relevant empirical and theoretical relationships found in established literature and accounts for the most important sampling and transport effects that might be encountered during deployment of typical, ground-based ambient aerosol measurements. The software treats non-isoaxial and non-isokinetic aerosol sampling, aerosol diffusion and sedimentation as well as turbulent inertial deposition and inertial deposition in bends and contractions of tubing. This software was validated through comparison with experimentally determined particle losses for several tubing systems bent to create various diffusion, sedimentation and inertial deposition properties. As long as the tube geometries are not "too extreme", agreement is satisfactory. We discuss the conclusions of these experiments, the limitations of the software and present three examples of the use of the Particle Loss Calculator in the field.
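As a flavor of the physics such a calculator encapsulates, the sketch below combines the Stokes terminal settling velocity with a deliberately crude residence-time estimate of sedimentation loss in a horizontal tube. This is an order-of-magnitude illustration only, not the empirical correlations PLC uses; the tube geometry and flow values are assumptions, and slip correction is neglected.

```python
import numpy as np

def stokes_settling_velocity(d_p, rho_p=1000.0, mu=1.81e-5, g=9.81):
    """Terminal settling velocity (m/s) of a sphere in air, Stokes regime.
    d_p: particle diameter in m; slip correction neglected (coarse particles)."""
    return rho_p * d_p**2 * g / (18.0 * mu)

def crude_settling_loss(d_p, length=2.0, diam=0.01, flow_lpm=5.0):
    """Very rough fraction of particles lost to sedimentation in a horizontal
    tube: settling distance during the residence time divided by the tube
    diameter, capped at 1. An order-of-magnitude estimate only."""
    q = flow_lpm / 1000.0 / 60.0                 # volumetric flow, m^3/s
    u = q / (np.pi * (diam / 2.0) ** 2)          # mean flow velocity, m/s
    t_res = length / u                           # residence time in the tube, s
    return min(1.0, stokes_settling_velocity(d_p) * t_res / diam)
```

Even this crude estimate reproduces the qualitative behavior PLC quantifies properly: sedimentation losses are severe for 10 µm particles but negligible for 0.1 µm particles in the same line.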

  10. Evaluation of Software Static Analysis Tools

    Institute of Scientific and Technical Information of China (English)

    王凯; 孔祥营

    2011-01-01

    To find more software defects during the coding phase of the software lifecycle, and thereby reduce cost and development time, static analysis must be applied to the source code under test; the most effective means of carrying out static analysis is to use static analysis tools. Focusing on common software defects in C programs, we compare the functionality of several popular static analysis tools, discuss their advantages and shortcomings, and examine the many factors that influence the choice of a static analysis tool. These factors provide software testers with a reference for selecting a suitable static analysis tool.

  11. TU-C-17A-03: An Integrated Contour Evaluation Software Tool Using Supervised Pattern Recognition for Radiotherapy

    International Nuclear Information System (INIS)

    Purpose: Radiotherapy (RT) contours delineated either manually or semi-automatically require verification before clinical use. Manual evaluation is very time consuming. A new integrated software tool using supervised contour pattern recognition was thus developed to facilitate this process. Methods: The contouring tool was developed using the object-oriented programming language C# and application programming interfaces such as the Visualization Toolkit (VTK). The C# language served as the design basis of the tool. The Accord.NET scientific computing libraries were utilized for the required statistical data processing and pattern recognition, while VTK was used to build and render 3-D mesh models of critical RT structures in real time with 360° visualization. Principal component analysis (PCA) was used for system self-updating of geometric variations of normal structures, based on physician-approved RT contours as a training dataset. The in-house supervised PCA-based contour recognition method was used to automatically evaluate contour normality/abnormality. The contour evaluation reporting function was implemented using C# and the Windows Forms Designer. Results: The software input was RT simulation images and RT structures from commercial clinical treatment planning systems. Several abilities were demonstrated: automatic assessment of RT contours, file loading/saving of various modality medical images and RT contours, and generation/visualization of 3-D images and anatomical models. Moreover, it supported 360° rendering of the RT structures in a multi-slice view, which allows physicians to visually check and edit abnormally contoured structures. Conclusion: This new software integrates the supervised learning framework with image processing and graphical visualization modules for RT contour verification. This tool has great potential for facilitating treatment planning with the assistance of an automatic contour evaluation module in avoiding
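The supervised PCA idea in this record can be sketched generically: fit principal components on feature vectors derived from approved contours, then score a new contour by its reconstruction error. The code below is an illustration with synthetic vectors, not the authors' C#/Accord.NET implementation; all names, dimensions, and data are invented.

```python
import numpy as np

def fit_pca(X, k):
    """Fit a k-component PCA on rows of X (training contours as feature vectors)."""
    mean = X.mean(axis=0)
    # principal axes come from the SVD of the centered training matrix
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:k]

def reconstruction_error(x, mean, components):
    """Distance between a contour's feature vector and its PCA reconstruction;
    large values suggest the contour deviates from the approved training set."""
    coeffs = (x - mean) @ components.T
    recon = mean + coeffs @ components
    return float(np.linalg.norm(x - recon))

rng = np.random.default_rng(0)
# hypothetical training set: 50 approved contours varying in a 2-D subspace of
# a 20-dimensional feature space, plus small measurement noise
basis = rng.normal(size=(2, 20))
train = rng.normal(size=(50, 2)) @ basis + rng.normal(scale=0.01, size=(50, 20))
mean, comps = fit_pca(train, k=2)

normal = rng.normal(size=2) @ basis            # lies in the training subspace
abnormal = normal + 5.0 * rng.normal(size=20)  # off-subspace perturbation
```

A contour whose error exceeds a threshold learned from the training set would be flagged for the physician to review and edit.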

  12. The NetVISA automatic association tool. Next generation software testing and performance under realistic conditions.

    Science.gov (United States)

    Le Bras, Ronan; Arora, Nimar; Kushida, Noriyuki; Tomuta, Elena; Kebede, Fekadu; Feitio, Paulino

    2016-04-01

    The CTBTO's International Data Centre is in the process of developing the next generation software to perform the automatic association step. The NetVISA software uses a Bayesian approach with a forward physical model using probabilistic representations of the propagation, station capabilities, background seismicity, noise detection statistics, and coda phase statistics. The software has been in development for a few years and is now reaching the stage where it is being tested in a realistic operational context. An interactive module has been developed where the NetVISA automatic events that are in addition to the Global Association (GA) results are presented to the analysts. We report on a series of tests where the results are examined and evaluated by seasoned analysts. Consistent with the statistics previously reported (Arora et al., 2013), the first test shows that the software is able to enhance analysis work by providing additional event hypotheses for consideration by analysts. A test on a three-day data set was performed and showed that the system found 42 additional real events out of 116 examined, including 6 that pass the criterion for the Reviewed Event Bulletin of the IDC. The software was functional in a realistic, real-time mode during the occurrence of the fourth nuclear test claimed by the Democratic People's Republic of Korea on January 6th, 2016. Confirming a previous statistical observation, the software found more associated stations (51, including 35 primary stations) than GA (36, including 26 primary stations) for this event. Reference: Nimar S. Arora, Stuart Russell, Erik Sudderth, Bulletin of the Seismological Society of America (BSSA), April 2013, vol. 103, no. 2A, pp. 709-729.

  13. A Systemic Approach to the Preservation of Audio Documents: Methodology and Software Tools

    Directory of Open Access Journals (Sweden)

    Federica Bressan

    2013-01-01

    protocol reflects the methodological principles adopted by the authors, and its effectiveness is based on the results obtained in recent research projects involving some of the finest audio archives in Europe. Some recommendations are given for the rerecording process, aimed at minimizing information loss and at quantifying the unintentional alterations introduced by the technical equipment. Finally, the paper introduces an original software system that guides and supports the preservation staff along the process, reducing processing time, automating tasks, minimizing errors, and using information-hiding strategies to ease the cognitive load. Currently the software system is in use in several international archives.

  14. The Comprehensive Evaluation of Electronic Learning Tools and Educational Software (CEELTES)

    Science.gov (United States)

    Karolcík, Štefan; Cipková, Elena; Hrušecký, Roman; Veselský, Milan

    2015-01-01

    Despite the fact that digital technologies are used more and more in the learning and education process, there is still a lack of professional evaluation tools capable of assessing the quality of digital teaching aids in a comprehensive and objective manner. Construction of the Comprehensive Evaluation of Electronic Learning Tools and…

  15. APASVO: A free software tool for automatic P-phase picking and event detection in seismic traces

    Science.gov (United States)

    Romero, José Emilio; Titos, Manuel; Bueno, Ángel; Álvarez, Isaac; García, Luz; Torre, Ángel de la; Benítez, M.ª Carmen

    2016-05-01

    The accurate estimation of the arrival time of seismic waves, or picking, is a problem of major interest in seismic research given its relevance in many seismological applications, such as earthquake source location and active seismic tomography. In the last decades, several automatic picking methods have been proposed with the ultimate goal of implementing picking algorithms whose results are comparable to those obtained by manual picking. In order to facilitate the use of these automated methods in the analysis of seismic traces, this paper presents a new free, open-source graphical software tool, named APASVO, which supports picking tasks in an easy and user-friendly way. The tool also provides event detection functionality, where a relatively imprecise estimation of the onset time is sufficient. The application implements the STA-LTA detection algorithm and the AMPA picking algorithm. An autoregressive AIC-based picking method can also be applied. In addition, this graphical tool is complemented by two command line tools, an event picking tool and a synthetic earthquake generator. APASVO is a multiplatform tool that works on Windows, Linux and OS X. The application can process data in a large variety of file formats. It is implemented in Python and relies on well-known scientific computing packages such as ObsPy, NumPy, SciPy and Matplotlib.
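The STA-LTA detector that APASVO implements has a compact textbook form: the ratio of short-term to long-term average signal energy, with a trigger threshold. The sketch below is a generic NumPy version applied to a synthetic trace; the window lengths and the threshold of 4 are illustrative assumptions, not APASVO's defaults.

```python
import numpy as np

def sta_lta(x, n_sta, n_lta):
    """Classic STA/LTA characteristic function on the squared trace:
    short-term average energy over long-term average energy (trailing windows)."""
    e = x.astype(float) ** 2
    csum = np.concatenate(([0.0], np.cumsum(e)))
    sta = (csum[n_sta:] - csum[:-n_sta]) / n_sta
    lta = (csum[n_lta:] - csum[:-n_lta]) / n_lta
    m = min(sta.size, lta.size)
    # align both windows to end at the same trace sample, guard against /0
    return sta[-m:] / np.maximum(lta[-m:], 1e-12)

rng = np.random.default_rng(3)
trace = 0.1 * rng.normal(size=2000)
trace[1200:1400] += 2.0 * rng.normal(size=200)   # synthetic "event" onset
ratio = sta_lta(trace, n_sta=50, n_lta=500)
onset = int(np.argmax(ratio > 4.0))              # first index exceeding the trigger
```

Because both windows trail the current sample, `ratio[j]` corresponds to trace sample `j + n_lta - 1`, so the trigger fires shortly after the event begins; picking algorithms like AMPA then refine that coarse detection into an accurate onset time.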

  16. In-depth evaluation of software tools for data-independent acquisition based label-free quantification.

    Science.gov (United States)

    Kuharev, Jörg; Navarro, Pedro; Distler, Ute; Jahn, Olaf; Tenzer, Stefan

    2015-09-01

    Label-free quantification (LFQ) based on data-independent acquisition workflows is currently gaining popularity. Several software tools have been published recently or are commercially available. The present study focuses on the evaluation of three different software packages (Progenesis, synapter, and ISOQuant) supporting ion-mobility-enhanced data-independent acquisition data. In order to benchmark the LFQ performance of the different tools, we generated two hybrid proteome samples of defined quantitative composition containing tryptically digested proteomes of three different species (mouse, yeast, Escherichia coli). This model dataset simulates complex biological samples containing large numbers of both unregulated (background) proteins as well as up- and downregulated proteins with exactly known ratios between samples. We determined the number and dynamic range of quantifiable proteins and analyzed the influence of applied algorithms (retention time alignment, clustering, normalization, etc.) on quantification results. Analysis of technical reproducibility revealed median coefficients of variation of reported protein abundances below 5% for MS(E) data for Progenesis and ISOQuant. Regarding accuracy of LFQ, evaluation with synapter and ISOQuant yielded superior results compared to Progenesis. In addition, we discuss reporting formats and user friendliness of the software packages. The data generated in this study have been deposited to the ProteomeXchange Consortium with identifier PXD001240 (http://proteomecentral.proteomexchange.org/dataset/PXD001240).
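The reproducibility metric quoted in this record, the median coefficient of variation of reported protein abundances across technical replicates, is straightforward to compute. The sketch below applies it to a hypothetical abundance table with roughly 4% simulated technical noise; the data and noise level are invented for illustration.

```python
import numpy as np

def median_cv(abundance):
    """Median coefficient of variation across proteins.
    abundance: (n_proteins, n_replicates) array of protein intensities."""
    mean = abundance.mean(axis=1)
    sd = abundance.std(axis=1, ddof=1)
    return float(np.median(sd / mean))

rng = np.random.default_rng(7)
true = rng.lognormal(mean=10, sigma=1, size=(500, 1))    # hypothetical intensities
reps = true * (1 + 0.04 * rng.normal(size=(500, 3)))     # ~4% technical noise, 3 reps
cv = median_cv(reps)
```

A median CV below 0.05 across such a table corresponds to the "below 5%" reproducibility the study reports for Progenesis and ISOQuant.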

  17. Featureous: A Tool for Feature-Centric Analysis of Java Software

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    Feature-centric comprehension of source code is necessary for incorporating user-requested modifications during software evolution and maintenance. However, such comprehension is difficult to achieve in case of large object-oriented programs due to the size, complexity, and implicit character of ...

  18. Cyber-physical systems software development: way of working and tool suite

    NARCIS (Netherlands)

    Bezemer, Maarten Matthijs

    2013-01-01

    Designing embedded control software for modern cyber-physical systems becomes more and more difficult, because of the increasing amount and complexity of their requirements. The regular requirements are extended with modern requirements, for example, to get a general purpose cyber-physical system ca

  19. The Application of Intentional Subjective Properties and Mediated Communication Tools to Software Agents in Online Disputes Resolution Environments

    Directory of Open Access Journals (Sweden)

    Renzo Gobbin

    2004-11-01

    Full Text Available This paper examines the use of subjective properties in modeling an architecture for cooperative agents using Agent Communication Language (ACL), which is used as a mediating tool for cooperative communication activities between and within software agents. The role that subjective and objective properties have in explaining and modeling agent internalization and externalization of ACL messages is investigated and related to Vygotsky’s developmental learning theories such as Mediated Activity Theory. A novel agent architecture ALMA (Agent Language Mediated Activity), based on the integration of agents’ subjective and objective properties within an agent communication activity framework, will be presented. The relevance of software agents’ subjective properties in modeling applications such as e-Law Online Dispute Resolution for e-business contractual arrangements using natural language subject/object relations in their communication patterns will be discussed.

  20. Application of the PSP Model, Manual and Supported by a CASE Tool, in a Case Study at a Brazilian Software Factory

    Directory of Open Access Journals (Sweden)

    Denis Ávila Montini

    2006-06-01

    Full Text Available In a context of continuous quality improvement in software development projects, the experimental PSP process was applied to discipline some of the processes suggested by CMMI level 2, following two different strategies. The first consists of observing the behavior of a software factory when collecting the data required by the PSP model manually; in the second, the collection was assisted by a CASE tool. The results show the impacts on performance and on quality standards of the two strategies, along with their advantages and vulnerabilities. In both cases, deadlines were met from the moment the specification and the course of the activities were controlled by the two suggested PSP strategies. Key words: CMMI, PSP, CASE, software factory, process improvement.

  1. A new software tool is developed to evaluate the measured/simulated transmission characteristics of optical multiplexers/demultiplexers

    Science.gov (United States)

    Seyringer, D.; Schmid, P.

    2011-10-01

    A new software tool, called AWG-Analyzer, has been developed to evaluate the simulated/measured transmission characteristics of optical multiplexers/demultiplexers based on arrayed waveguide gratings (AWG). The output of the calculation is a set of transmission parameters, such as non-uniformity, adjacent channel crosstalk, non-adjacent channel crosstalk, background crosstalk, insertion loss and polarisation dependent loss (PDL), calculated first for each output channel and then for the whole AWG as the worst-case value of each parameter over all output channels. This set of parameters is then taken as the AWG specification. The parameters are calculated for a particular channel bandwidth (also known as the channel passband or ITU passband), which is itself an input parameter of the calculation. Additionally, the tool, which has a user-friendly interface, offers a help section in which all calculated transmission parameters are explained and exactly defined. The tool also includes a brief overview of AWG functionality with a small animation and information about various AWG types (CWDM and DWDM AWGs, colourless AWGs).
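Two of the listed parameters are easy to sketch from per-channel transmission spectra: per-channel insertion loss (the peak transmission inside the passband) and non-uniformity (the spread of insertion loss across channels, reported as a worst-case value). The code below uses invented Gaussian-like passbands; the definitions follow common usage and are not taken from AWG-Analyzer itself.

```python
import numpy as np

def awg_parameters(wavelength_nm, transmission_db, centers_nm, passband_nm=0.2):
    """Per-channel insertion loss (peak transmission within the passband, as
    positive dB of loss) and the AWG non-uniformity (spread across channels)."""
    il = []
    for tx, c in zip(transmission_db, centers_nm):
        in_band = np.abs(wavelength_nm - c) <= passband_nm / 2
        il.append(-tx[in_band].max())            # loss expressed as positive dB
    il = np.array(il)
    return il, il.max() - il.min()               # worst-case non-uniformity

# hypothetical 4-channel AWG: parabolic (in dB) passbands, slightly unequal peaks
wl = np.linspace(1549.0, 1552.5, 3500)
centers = np.array([1549.6, 1550.4, 1551.2, 1552.0])
peaks_db = np.array([-2.0, -2.3, -2.1, -2.8])
tx = [p - 3.0 * ((wl - c) / 0.25) ** 2 for p, c in zip(peaks_db, centers)]

il, nonuniformity = awg_parameters(wl, tx, centers)
```

Here the insertion losses come out near 2.0-2.8 dB per channel and the non-uniformity near 0.8 dB; the crosstalk figures would be computed analogously by evaluating each channel's spectrum inside the passbands of the other channels.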

  2. A System Level Tool for Translating Software to Reconfigurable Hardware Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In this research we will develop a system level tool to translate binary code of a general-purpose processor into Register Transfer Level VHDL code to be mapped...

  3. Establishing a Web-Based DICOM Teaching File Authoring Tool Using Open-Source Public Software

    OpenAIRE

    Lee, Wen-Jeng; Yang, Chung-Yi; Liu, Kao-Lang; Liu, Hon-Man; Ching, Yu-Tai; Chen, Shyh-Jye

    2005-01-01

    Online teaching files are an important source of educational and referential materials in the radiology community. The commonly used Digital Imaging and Communications in Medicine (DICOM) file format of the radiology community is not natively supported by common Web browsers. The ability of the Web server to convert and parse DICOM is important when the DICOM-converting tools are not available. In this paper, we describe our approach to develop a Web-based teaching file authoring tool. Our se...

  4. Performance Evaluation of a Software Engineering Tool for Automated Design of Cooling Systems in Injection Moulding

    DEFF Research Database (Denmark)

    Jauregui-Becker, Juan M.; Tosello, Guido; van Houten, Fred J.A.M.;

    2013-01-01

    different design methods (i.e. automatic and manual) were applied to the mould design of two thin-walled products, namely a rectangular flat box and a cylindrical container with a flat base. Injection moulding process simulations based on the finite element method were performed to assess the quality...... of the moulded parts. Results indicate the tool is capable of generating feasible cooling solutions. Recommendations are provided for improving the performance of the tool....

  5. Providing a Connection between a Bayesian Inverse Modeling Tool and a Coupled Hydrogeological Processes Modeling Software

    Science.gov (United States)

    Frystacky, H.; Osorio-Murillo, C. A.; Over, M. W.; Kalbacher, T.; Gunnell, D.; Kolditz, O.; Ames, D.; Rubin, Y.

    2013-12-01

    The Method of Anchored Distributions (MAD) is a Bayesian technique for characterizing the uncertainty in geostatistical model parameters. Open-source software has been developed in a modular framework such that this technique can be applied to any forward model software via a driver. This presentation is about the driver that has been developed for OpenGeoSys (OGS), open-source software that can simulate many hydrogeological processes, including coupled processes. MAD allows the use of multiple data types for conditioning the spatially random fields and assessing model parameter likelihood. For example, if simulating flow and mass transport, the inversion target variable could be hydraulic conductivity and the inversion data types could be head, concentration, or both. The driver detects from the OGS files which processes and variables are being used in a given project and allows MAD to prompt the user to choose those that are to be modeled or to be treated deterministically. In this way, any combination of processes allowed by OGS can have MAD applied. As for the software, there are two versions, each with its own OGS driver. A Windows desktop version is available as a graphical user interface and is ideal for the learning and teaching environment. High-throughput computing can even be achieved with this version via HTCondor if large projects are to be pursued in a computer lab. In addition to this desktop application, a Linux version is available equipped with MPI such that it can be run in parallel on a computer cluster. All releases can be downloaded from the MAD Codeplex site given below.

  6. Mid-water Software Tools and the Application to Processing and Analysis of the Latest Generation Multibeam Sonars

    Science.gov (United States)

    Gee, L.; Doucet, M.

    2010-12-01

    The latest generation of multibeam sonars now has the ability to map the water-column, along with the seafloor. Currently, the users of these sonars have a limited view of the mid-water data in real-time, and if they do store the data, they are restricted to replaying it only, with no ability for further analysis. The water-column data has the potential to address a number of research areas including detection of small targets (wrecks, etc.) above the seabed, mapping of fish and marine mammals and a wide range of physical oceanographic processes. However, researchers have been required to develop their own in-house software tools before they can even begin their study of the water column data. This paper describes the development of more general software tools for the full processing of raw sonar data (bathymetry, backscatter and water-column) to yield output products suitable for visualization in a 4D time-synchronized environment. The huge water-column data volumes generated by the new sonars, combined with the variety of data formats from the different sonar manufacturers, provides a significant challenge in the design and development of tools that can be applied to the wide variety of applications. The development of the mid-water tools on this project addressed this problem by using a unified way of storing the water column data in a generic water column format (GWC). The sonar data are converted into the GWC by re-integrating the water column packets with time-based navigation and attitude, such that downstream in the workflow, the tools will have access to all relevant data of any particular ping. Dependent on the application and the resolution requirements, the conversion process also allows simple sub-sampling. Additionally, each file is indexed to enable fast non-linear lookup and extraction of any packet type or packet type collection in the sonar file. These tools also fully exploit multi-core and hyper-threading technologies to maximize throughput.

  7. Genoviz Software Development Kit: Java tool kit for building genomics visualization applications

    Directory of Open Access Journals (Sweden)

    Chervitz Stephen A

    2009-08-01

    Full Text Available Abstract Background: Visualization software can expose previously undiscovered patterns in genomic data and advance biological science. Results: The Genoviz Software Development Kit (SDK) is an open source, Java-based framework designed for rapid assembly of visualization software applications for genomics. The Genoviz SDK framework provides a mechanism for incorporating adaptive, dynamic zooming into applications, a desirable feature of genome viewers. Visualization capabilities of the Genoviz SDK include automated layout of features along genetic or genomic axes; support for user interactions with graphical elements (Glyphs) in a map; a variety of Glyph sub-classes that promote experimentation with new ways of representing data in graphical formats; and support for adaptive, semantic zooming, whereby objects change their appearance depending on zoom level and zooming rate adapts to the current scale. Freely available demonstration and production quality applications, including the Integrated Genome Browser, illustrate Genoviz SDK capabilities. Conclusion: Separation between graphics components and genomic data models makes it easy for developers to add visualization capability to pre-existing applications or build new applications using third-party data models. Source code, documentation, sample applications, and tutorials are available at http://genoviz.sourceforge.net/.

  8. Immunogenetic Management Software: a new tool for visualization and analysis of complex immunogenetic datasets.

    Science.gov (United States)

    Johnson, Z P; Eady, R D; Ahmad, S F; Agravat, S; Morris, T; Else, J; Lank, S M; Wiseman, R W; O'Connor, D H; Penedo, M C T; Larsen, C P; Kean, L S

    2012-04-01

    Here we describe the Immunogenetic Management Software (IMS) system, a novel web-based application that permits multiplexed analysis of complex immunogenetic traits that are necessary for the accurate planning and execution of experiments involving large animal models, including nonhuman primates. IMS is capable of housing complex pedigree relationships, microsatellite-based MHC typing data, as well as MHC pyrosequencing expression analysis of class I alleles. It includes a novel, automated MHC haplotype naming algorithm and implements an innovative visualization protocol that allows users to view multiple familial and MHC haplotype relationships through a single, interactive graphical interface. Detailed DNA and RNA-based data can also be queried and analyzed in a highly accessible fashion, and flexible search capabilities allow experimental choices to be made based on multiple, individualized and expandable immunogenetic factors. This web application is implemented in Java, MySQL, Tomcat, and Apache, with supported browsers including Internet Explorer and Firefox on Windows and Safari on Mac OS. The software is freely available for distribution to noncommercial users by contacting Leslie.kean@emory.edu. A demonstration site for the software is available at http://typing.emory.edu/typing_demo , user name: imsdemo7@gmail.com and password: imsdemo. PMID:22080300

  9. A software tool for analysis and quantification of regional pulmonary ventilation using dynamic hyperpolarised-3He-MRI

    International Nuclear Information System (INIS)

    Purpose: 3He-MRI is able to visualize the regional distribution of lung ventilation with a temporal and spatial resolution so far unmatched by any other technique. The main aim of the study was the development of a new software tool for quantification of dynamic ventilation parameters in absolute physical units. Materials and Methods: During continuous breathing, a bolus of hyperpolarized 3He (300 ml) was applied at inspiration and a series of 168 coronal projection images simultaneously acquired using a 2D FLASH-sequence. Postprocessing software was developed to analyze the 3He distribution in the lung. After correction for lung motion, several ventilation parameters (rise time, delay time, 3He amount and 3He peak flow) were calculated. Due to normalization of signal intensities, these parameters are presented in absolute physical units. The data sets were analyzed on a ROI basis as well as on a pixel-by-pixel basis. Results: Using the developed software, the measurements were analyzed in 6 volunteers with healthy lungs, in one patient after lung transplantation, and in one patient with lung emphysema. The volunteers' parameter maps of the pixel-based analysis showed an almost homogeneous distribution of the ventilation parameters within the lung. In the parameter maps of both patients, regions with poor ventilation were observed. Conclusion: The developed software permits an objective and quantitative analysis of regional lung ventilation in absolute physical units. The clinical significance of the parameters, however, has to be determined in larger clinical studies. The software may become valuable in grading and following pulmonary function as well as in monitoring any therapy. (orig.)
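The ventilation parameters named above can be made concrete with a minimal sketch. This is our reconstruction, not the published postprocessing code: it assumes a motion-corrected, normalized signal-time curve for one ROI and derives delay time (time to reach 10% of the plateau), 10-90% rise time, 3He amount (plateau signal) and peak flow (steepest signal increase).

```python
import numpy as np

def ventilation_parameters(t, s):
    """Derive dynamic ventilation parameters from a 3He signal-time curve.

    t: acquisition times; s: signal intensity (absolute units after
    normalization). All parameter definitions here are our assumptions.
    """
    s = np.asarray(s, dtype=float)
    amount = s.max()                        # 3He amount: plateau signal level
    t10 = t[np.argmax(s >= 0.1 * amount)]   # first crossing of the 10% level
    t90 = t[np.argmax(s >= 0.9 * amount)]   # first crossing of the 90% level
    return {
        "delay": t10,                        # delay time until filling begins
        "rise": t90 - t10,                   # 10-90% rise time
        "amount": amount,
        "peak_flow": np.max(np.gradient(s, t)),  # maximum signal inflow rate
    }
```

Applied pixel-by-pixel, the same function would yield the parameter maps described in the Results.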

  10. Design and Implementation of Software Testing Process Management Tools

    Institute of Scientific and Technical Information of China (English)

    王象刚

    2014-01-01

    With the development of technology and the growing use of computers, software applications have become increasingly widespread. As the guarantee of software quality, software testing is particularly important. This paper first describes software test management, then analyzes software testing process management tools, and finally explains in detail the design and implementation of a software test management tool.

  11. Reliability of wind farm design tools in complex terrain : A comparative study of commercial software

    OpenAIRE

    Timander, Tobias; WESTERLUND, JIMMY

    2012-01-01

    A comparative study of two different approaches in wind energy simulations has been made where the aim was to investigate the performance of two commercially available tools. The study includes the linear model by WAsP and the computational fluid dynamic model of WindSim (also featuring an additional forest module). The case studied is a small wind farm located in the inland of Sweden featuring a fairly complex and forested terrain. The results showed similar estimations from both tools and i...

  12. Plots, Calculations and Graphics Tools (PCG2). Software Transfer Request Presentation

    Science.gov (United States)

    Richardson, Marilou R.

    2010-01-01

    This slide presentation reviews the development of the Plots, Calculations and Graphics Tools (PCG2) system. PCG2 is an easy to use tool that provides a single user interface to view data in a pictorial, tabular or graphical format. It allows the user to view the same display and data in the Control Room, engineering office area, or remote sites. PCG2 supports extensive and regular engineering needs that are both planned and unplanned and it supports the ability to compare, contrast and perform ad hoc data mining over the entire domain of a program's test data.

  13. A Systematic Mapping Study of Tools for Distributed Software Development Teams

    DEFF Research Database (Denmark)

    Tell, Paolo; Ali Babar, Muhammad

    (i.e., collaborative tools and innovative knowledge management systems) shows that many collaborative technologies have been reported, but knowledge management is addressed mainly by supporting awareness, which is considered as important as the three elements of the 3C model (i.e., communication...

  14. Experiences and perspectives with SRI's tools for software design and validation

    Science.gov (United States)

    Goguen, J.; Levitt, K. N.

    1982-01-01

    Development of tools that include the STP theorem prover and its associated Design Verification System; PHIL, a meta-programmable, context-sensitive structured editor; Pegasus, a system for support of graphical programming; and OBJ, an ultra-high-level programming language based on rewrite rules and abstract data types, is reported.

  15. MoRFchibi SYSTEM: software tools for the identification of MoRFs in protein sequences.

    Science.gov (United States)

    Malhis, Nawar; Jacobson, Matthew; Gsponer, Jörg

    2016-07-01

    Molecular recognition features, MoRFs, are short segments within longer disordered protein regions that bind to globular protein domains in a process known as disorder-to-order transition. MoRFs have been found to play a significant role in signaling and regulatory processes in cells. High-confidence computational identification of MoRFs remains an important challenge. In this work, we introduce MoRFchibi SYSTEM that contains three MoRF predictors: MoRFCHiBi, a basic predictor best suited as a component in other applications, MoRFCHiBi_Light, ideal for high-throughput predictions and MoRFCHiBi_Web, slower than the other two but best for high accuracy predictions. Results show that MoRFchibi SYSTEM provides more than double the precision of other predictors. MoRFchibi SYSTEM is available in three different forms: as HTML web server, RESTful web server and downloadable software at: http://www.chibi.ubc.ca/faculty/joerg-gsponer/gsponer-lab/software/morf_chibi/. PMID:27174932

  16. Ident 1D - a novel software tool for an easy identification of material constitutive parameters

    International Nuclear Information System (INIS)

    Non-linear finite element computations make use of very sophisticated constitutive equations for description of materials behaviour. The first difficulty encountered by potential users is the gap existing between raw material characterisation on uniaxial specimens and the knowledge of the required equation's parameters. Very few software tools exist for this particular task. IDENT 1D is a special software developed under Matlab language in our laboratory, which is able to provide a complete optimised parameter set for implemented models. The originality of IDENT 1D is that no initial estimate of the material parameters is requested from the user. Two main examples are described in this article: the Lemaitre and Chaboche creep law coupled with damage, and a non-unified cyclic law proposed by Contesti and Cailletaud with a separation of plastic and viscous strain terms, which is called the DDI model. For both laws, the identification method is completely described. Each method is then applied to a set of experimental data. In both cases, the results of the parameter identification show a very good agreement with experimental data. (authors)
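The key selling point above, identification with no initial guess from the user, can be illustrated on a much simpler constitutive law than those in the paper. This sketch (our stand-in, not IDENT 1D code) fits a Norton creep law, strain rate = A * sigma**n, whose parameters follow from a closed-form linear least-squares fit in log-log space, so no starting estimate is ever needed.

```python
import math

def fit_norton(stresses, strain_rates):
    """Closed-form identification of A and n in strain_rate = A * sigma**n.

    Taking logarithms linearizes the law: log(rate) = log(A) + n*log(sigma),
    so ordinary least squares gives both parameters directly.
    """
    xs = [math.log(s) for s in stresses]
    ys = [math.log(r) for r in strain_rates]
    m = len(xs)
    xbar, ybar = sum(xs) / m, sum(ys) / m
    n = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    A = math.exp(ybar - n * xbar)
    return A, n
```

The laws treated by IDENT 1D (coupled creep-damage, non-unified cyclic plasticity) are not linearizable like this and require numerical optimisation, but the principle of going straight from uniaxial test data to an optimised parameter set is the same.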

  17. Software Tools for the Analysis of the Photocathode Response of Photomultiplier Vacuum Tubes

    CERN Document Server

    Fabbri, R

    2013-01-01

    The central institute of electronics (ZEA-2) in the Forschungszentrum Juelich (FZJ) has developed a system to scan the response of the photocathode of photomultiplier tubes (PMT). The PMT sits tight on a supporting structure, while a blue light emitting diode is moved along its surface by two stepper motors, spanning both the x and y coordinates. The whole system is located in a light-tight box made of wood. Graphical software was developed in situ to perform the scan operations under different configurations (e.g., the step size of the scan and the number of measurements per point). During each point measurement the current output generated in the vacuum photomultiplier is processed in sequence by a pre-amplifier (mainly to convert the current signal into a voltage signal), an amplifier, and by an ADC module (typically a CAEN N957). The information of the measurement is saved in files at the end of the scan. Recently, software based on the CERN ROOT and on the Qt libraries was developed to help the user anal...

  18. A software tool to estimate the dynamic behaviour of the IP2C samples as sensors for didactic purposes

    International Nuclear Information System (INIS)

    Ionic Polymer Polymer Composites (IP2Cs) are emerging materials used to realize motion actuators and sensors. In the former case, a voltage input causes the membrane to bend, while in the latter case a voltage output is obtained by bending an IP2C membrane. In this paper the authors introduce a software tool able to estimate the dynamic behaviour of sensors based on IP2Cs working in air. In the proposed tool, the geometrical quantities that rule the sensing properties of IP2C-based transducers are taken into account together with their dynamic characteristics. A graphical user interface (GUI) has been developed in order to give a useful tool that allows the user to understand the behaviour and the role of the parameters involved in the transduction phenomena. The tool is based on the idea that a graphical user interface will allow persons not skilled in IP2C materials to observe their behaviour and to analyze their characteristics. This could greatly increase the interest of researchers towards this new class of transducers; moreover, it can support the educational activity of students involved in advanced academic courses.

  19. An assessment of a software simulation tool for lidar atmosphere and ocean measurements

    Science.gov (United States)

    Powell, K. A.; Vaughan, M.; Burton, S. P.; Hair, J. W.; Hostetler, C. A.; Kowch, R. S.

    2014-12-01

    A high-fidelity lidar simulation tool is used to generate synthetic lidar backscatter data that closely matches the expected performance of various lidars, including the noise characteristics inherent to analog detection and uncertainties related to the measurement environment. This tool supports performance trade studies and scientific investigations for both the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP), which flies aboard Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO), and the NASA Langley Research Center airborne High Spectral Resolution Lidar (HSRL). CALIOP measures profiles of attenuated backscatter coefficients (532 and 1064 nm) and volume depolarization ratios at 532 nm. HSRL measures the same profiles plus volume depolarization at 1064 nm and a molecular-only profile which allows for the direct retrieval of aerosol extinction and backscatter profiles at 532 nm. The simulation tool models both the fundamental physics of the lidar instruments and the signals generated from aerosols, clouds, and the ocean surface and subsurface. This work presents the results of a study conducted to verify the accuracy of the simulated data using data from both HSRL and CALIOP. The tool was tuned to CALIOP instrument settings and the model atmosphere was defined using profiles of attenuated backscatter and depolarization obtained by HSRL during underflights of CALIPSO. The validated HSRL data provide highly accurate measurements of the particulate intensive and extensive optical properties and thus were considered as the truth atmosphere. The resulting simulated data were processed through the CALIPSO data analysis system. Comparisons showed good agreement between the simulated and CALIOP data. This verifies the accuracy of the tool to support studies involving the characterization of instrument components and advanced data analysis techniques. The capability of the tool to simulate ocean surface scattering and subsurface
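The core quantity such a simulator must reproduce, the attenuated backscatter profile measured by CALIOP, can be sketched briefly. This is our formulation of the standard lidar signal model, not the tool's source: beta'(r) = beta(r) * exp(-2 * integral of extinction out to range r), with detector noise to be added on top in a full simulation.

```python
import numpy as np

def attenuated_backscatter(r, beta, sigma):
    """Attenuated backscatter from backscatter beta and extinction sigma.

    r, beta, sigma are range-gridded profiles; the two-way transmission
    uses trapezoidal integration of the optical depth along the path.
    """
    tau = np.concatenate(([0.0],
                          np.cumsum((sigma[1:] + sigma[:-1]) / 2 * np.diff(r))))
    return beta * np.exp(-2.0 * tau)
```

Tuning such a model to a "truth" atmosphere taken from validated HSRL profiles, as described above, then lets the synthetic signal be pushed through the CALIPSO analysis system for comparison.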

  20. The GlycanBuilder: a fast, intuitive and flexible software tool for building and displaying glycan structures

    Directory of Open Access Journals (Sweden)

    Dell Anne

    2007-08-01

    Full Text Available Abstract Background Carbohydrates play a critical role in human diseases and their potential utility as biomarkers for pathological conditions is a major driver for characterization of the glycome. However, the additional complexity of glycans compared to proteins and nucleic acids has slowed the advancement of glycomics in comparison to genomics and proteomics. The branched nature of carbohydrates, the great diversity of their constituents and the numerous alternative symbolic notations, make the input and display of glycans not as straightforward as for example the amino-acid sequence of a protein. Every glycoinformatic tool providing a user interface would benefit from a fast, intuitive, appealing mechanism for input and output of glycan structures in a computer readable format. Results A software tool for building and displaying glycan structures using a chosen symbolic notation is described here. The "GlycanBuilder" uses an automatic rendering algorithm to draw the saccharide symbols and to place them on the drawing board. The information about the symbolic notation is derived from a configurable graphical model as a set of rules governing the aspect and placement of residues and linkages. The algorithm is able to represent a structure using only few traversals of the tree and is inherently fast. The tool uses an XML format for import and export of encoded structures. Conclusion The rendering algorithm described here is able to produce high-quality representations of glycan structures in a chosen symbolic notation. The automated rendering process enables the "GlycanBuilder" to be used both as a user-independent component for displaying glycans and as an easy-to-use drawing tool. The "GlycanBuilder" can be integrated in web pages as a Java applet for the visual editing of glycans. The same component is available as a web service to render an encoded structure into a graphical format. 
Finally, the "GlycanBuilder" can be integrated into other

  1. SU-E-J-199: A Software Tool for Quality Assurance of Online Replanning with MR-Linac

    International Nuclear Information System (INIS)

    Purpose: To develop a quality assurance software tool, ArtQA, capable of automatically checking radiation treatment plan parameters, verifying plan data transfer from treatment planning system (TPS) to record and verify (R&V) system, performing a secondary MU calculation considering the effect of magnetic field from MR-Linac, and verifying the delivery and plan consistency, for online replanning. Methods: ArtQA was developed by creating interfaces to TPS (e.g., Monaco, Elekta), R&V system (Mosaiq, Elekta), and secondary MU calculation system. The tool obtains plan parameters from the TPS via direct file reading, and retrieves plan data both transferred from TPS and recorded during the actual delivery in the R&V system database via open database connectivity and structured query language. By comparing beam/plan datasets in different systems, ArtQA detects and outputs discrepancies between TPS, R&V system and secondary MU calculation system, and delivery. To consider the effect of 1.5T transverse magnetic field from MR-Linac in the secondary MU calculation, a method based on modified Clarkson integration algorithm was developed and tested for a series of clinical situations. Results: ArtQA is capable of automatically checking plan integrity and logic consistency, detecting plan data transfer errors, performing secondary MU calculations with or without a transverse magnetic field, and verifying treatment delivery. The tool is efficient and effective for pre- and post-treatment QA checks of all available treatment parameters that may be impractical with the commonly-used visual inspection. Conclusion: The software tool ArtQA can be used for quick and automatic pre- and post-treatment QA check, eliminating human error associated with visual inspection. 
While this tool is developed for online replanning to be used on MR-Linac, where the QA needs to be performed rapidly as the patient is lying on the table waiting for the treatment, ArtQA can be used as a general QA tool

  2. The pyPHaz software, an interactive tool to analyze and visualize results from probabilistic hazard assessments

    Science.gov (United States)

    Tonini, Roberto; Selva, Jacopo; Costa, Antonio; Sandri, Laura

    2014-05-01

    Probabilistic Hazard Assessment (PHA) is becoming an essential tool for risk mitigation policies, since it quantifies the hazard due to hazardous phenomena and, differently from the deterministic approach, accounts for both aleatory and epistemic uncertainties. On the other hand, one of the main disadvantages of PHA methods is that their results are not easy to understand and interpret by people who are not specialists in probabilistic tools. For scientists, this leads to the issue of providing tools that can be easily used and understood by decision makers (i.e., risk managers or local authorities). The work presented here fits into the problem of simplifying the transfer between scientific knowledge and land protection policies, by providing an interface between scientists, who produce PHA results, and decision makers, who use PHA results for risk analyses. In this framework we present pyPHaz, an open tool developed and designed to visualize and analyze PHA results due to one or more phenomena affecting a specific area of interest. The software implementation has been fully developed with the free and open-source Python programming language and some featured Python-based libraries and modules. The pyPHaz tool can visualize the Hazard Curves (HC) calculated in a selected target area together with different levels of uncertainty (mean and percentiles) on maps that can be interactively created and modified by the user, thanks to a dedicated Graphical User Interface (GUI). Moreover, the tool can be used to compare the results of different PHA models and to merge them, by creating ensemble models. The pyPHaz software has been designed with the features of storing and accessing all the data through a MySQL database and of being able to read as input the XML-based standard file formats defined in the frame of GEM (Global Earthquake Model). This format is easy to extend to any other kind of hazard, as it will be shown in the applications
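The ensemble summary pyPHaz displays, a mean hazard curve plus percentile curves expressing epistemic uncertainty, reduces to a few lines of array arithmetic. This is an illustrative sketch with names of our choosing, not the pyPHaz API: each row is one model's hazard curve (exceedance probability versus intensity level).

```python
import numpy as np

def ensemble_hazard(curves, percentiles=(16, 50, 84)):
    """Summarize hazard curves from several PHA models.

    curves: array-like of shape (n_models, n_intensity_levels).
    Returns the mean curve and the requested percentile curves,
    computed level-by-level across models.
    """
    curves = np.asarray(curves, dtype=float)
    summary = {"mean": curves.mean(axis=0)}
    for p in percentiles:
        summary[f"p{p}"] = np.percentile(curves, p, axis=0)
    return summary
```

Weighting the rows before averaging would turn this equal-weight ensemble into the kind of merged multi-model result described above.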

  3. Data Assimilation Tools for CO2 Reservoir Model Development – A Review of Key Data Types, Analyses, and Selected Software

    Energy Technology Data Exchange (ETDEWEB)

    Rockhold, Mark L.; Sullivan, E. C.; Murray, Christopher J.; Last, George V.; Black, Gary D.

    2009-09-30

    Pacific Northwest National Laboratory (PNNL) has embarked on an initiative to develop world-class capabilities for performing experimental and computational analyses associated with geologic sequestration of carbon dioxide. The ultimate goal of this initiative is to provide science-based solutions for helping to mitigate the adverse effects of greenhouse gas emissions. This Laboratory-Directed Research and Development (LDRD) initiative currently has two primary focus areas—advanced experimental methods and computational analysis. The experimental methods focus area involves the development of new experimental capabilities, supported in part by the U.S. Department of Energy’s (DOE) Environmental Molecular Science Laboratory (EMSL) housed at PNNL, for quantifying mineral reaction kinetics with CO2 under high temperature and pressure (supercritical) conditions. The computational analysis focus area involves numerical simulation of coupled, multi-scale processes associated with CO2 sequestration in geologic media, and the development of software to facilitate building and parameterizing conceptual and numerical models of subsurface reservoirs that represent geologic repositories for injected CO2. This report describes work in support of the computational analysis focus area. The computational analysis focus area currently consists of several collaborative research projects. These are all geared towards the development and application of conceptual and numerical models for geologic sequestration of CO2. The software being developed for this focus area is referred to as the Geologic Sequestration Software Suite or GS3. A wiki-based software framework is being developed to support GS3. This report summarizes work performed in FY09 on one of the LDRD projects in the computational analysis focus area. The title of this project is Data Assimilation Tools for CO2 Reservoir Model Development. Some key objectives of this project in FY09 were to assess the current state

  4. ATAQS: A computational software tool for high throughput transition optimization and validation for selected reaction monitoring mass spectrometry

    Directory of Open Access Journals (Sweden)

    Ramos Hector

    2011-03-01

    proteomics via SRM is a powerful new technique that enables the reproducible and accurate identification and quantification of sets of proteins of interest. ATAQS is the first open-source software that supports all steps of the targeted proteomics workflow. ATAQS also provides software API (Application Program Interface documentation that enables the addition of new algorithms to each of the workflow steps. The software, installation guide and sample dataset can be found in http://tools.proteomecenter.org/ATAQS/ATAQS.html

  5. Software tool for analysing the family shopping basket without candidate generation

    OpenAIRE

    Roberto Carlos Naranjo Cuervo; Luz Marina Sierra Martínez

    2010-01-01

    Tools that yield useful knowledge to support marketing decisions are currently needed in the e-commerce environment. A process is needed for this which uses a series of data-processing techniques; data mining is one such technique, enabling automatic information discovery. This work presents association rules as a suitable technique for discovering how customers buy from a company offering business to consumer (B2C) e-business, aimed at supporting decisi...
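The association-rule measures such a tool reports can be illustrated on a toy set of baskets. This sketch (our own, not the paper's code) computes only the standard support and confidence metrics; the "without candidate generation" aspect in the title refers to the mining algorithm itself (an FP-growth-style approach, by our reading), which is beyond a few lines.

```python
def support(baskets, items):
    """Fraction of baskets containing every item in `items`."""
    items = set(items)
    return sum(items <= b for b in baskets) / len(baskets)

def confidence(baskets, antecedent, consequent):
    """P(consequent | antecedent) estimated over the baskets."""
    joint = set(antecedent) | set(consequent)
    return support(baskets, joint) / support(baskets, antecedent)
```

A rule like "bread => milk" is kept when both its support and its confidence exceed user-chosen thresholds.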

  6. TIP-EXE: A software tool for studying the use and understanding of procedural documents

    OpenAIRE

    Ganier, Franck; Querrec, Ronan

    2012-01-01

    Research problem: When dealing with procedural documents, individuals sometimes encounter comprehension problems due to poor information design. Researchers studying the use and understanding of procedural documents, as well as technical writers charged with the design of these documents, or usability specialists evaluating their quality, would all benefit from tools allowing them to collect real-time data concerning user behavior in user-centered studies. With this ...

  7. A Bluetooth low-energy capture and analysis tool using software-defined radio

    OpenAIRE

    Kilgour, Christopher David

    2013-01-01

    Wireless protocol analysis is a useful tool for researchers, engineers, and network security professionals. Exhaustive BTLE sniffing – the full capture and analysis of Bluetooth Low-Energy radio transmissions – has been out of reach for individuals to apply to research, engineering, and security analysis tasks. Discovering and following an arbitrary Bluetooth frequency-hopping pattern with a cheap narrow-band receiver is a complex undertaking with little chance of success. Further, the high-e...

  8. Automatic Tools for Software Quality Analysis in a Project-Based-Learning Course

    OpenAIRE

    Montero Martínez, Juan Manuel; San Segundo Hernández, Rubén; Córdoba Herralde, Ricardo de; Marin de la Barcena, Amparo; Zlotnik, Alexander

    2009-01-01

    Over the last decade, the “Anytime, anywhere” paradigm has gained pace in Higher Education teaching, leading many universities to innovate in pedagogical strategies based on Internet and Web access technologies. Development of remote access technologies has enabled teachers to achieve higher levels of efficiency while students can access tools and resources no longer constrained by time or location. Additionally, students can submit their assignments, be evaluated and be provided feedbac...

  9. Use of slide presentation software as a tool to measure hip arthroplasty wear.

    Science.gov (United States)

    Yun, Ho Hyun; Jajodia, Nirmal K; Myung, Jae Sung; Oh, Jong Keon; Park, Sang Won; Shon, Won Yong

    2009-12-01

    The authors propose a manual measurement method for wear in total hip arthroplasty (PowerPoint method) based on the well-known Microsoft PowerPoint software (Microsoft Corporation, Redmond, Wash). In addition, the accuracy and reproducibility of the devised method were quantified and compared with two methods previously described by Livermore and Dorr, and accuracies were determined at different degrees of wear. The 57 hips recruited were allocated to: class 1 (retrieval series), class 2 (clinical series), and class 3 (a repeat film analysis series). The PowerPoint method was found to have good reproducibility and to better detect wear differences between classes. The devised method can be easily used for recording wear at follow-up visits and could be used as a supplementary method when computerized methods cannot be employed. PMID:19896061

  10. SIGSAC Software: A tool for the Management of Chronic Disease and Telecare.

    Science.gov (United States)

    Claudia, Bustamante; Claudia, Alcayaga; Ilta, Lange; Iñigo, Meza

    2012-01-01

    Chronic disease management is highly complex because multiple interventions are required to improve clinical outcomes. From the patient's perspective, the main problems are dealing with self-management without support and feeling isolated between clinical visits. A strategy for providing continuous self-management support is the use of communication technologies, such as the telephone. However, to be efficient and effective, an information system is required for telecare planning and follow-up. The use of electronic clinical records facilitates the implementation of telecare, but those systems often do not allow combining usual care (visits to the health clinics) with telecare. This paper presents the experience of developing an application called SIGSAC (Software de Información, Gestión y Seguimiento para el Autocuidado Crónico) for Chronic Disease Management and Telecare follow-up.

  11. Combining On-Line Characterization Tools with Modern Software Environments for Optimal Operation of Polymerization Processes

    Directory of Open Access Journals (Sweden)

    Navid Ghadipasha

    2016-02-01

    Full Text Available This paper discusses the initial steps towards the formulation and implementation of a generic and flexible model centric framework for integrated simulation, estimation, optimization and feedback control of polymerization processes. For the first time it combines the powerful capabilities of the automatic continuous on-line monitoring of polymerization system (ACOMP, with a modern simulation, estimation and optimization software environment towards an integrated scheme for the optimal operation of polymeric processes. An initial validation of the framework was performed for modelling and optimization using literature data, illustrating the flexibility of the method to apply under different systems and conditions. Subsequently, off-line capabilities of the system were fully tested experimentally for model validations, parameter estimation and process optimization using ACOMP data. Experimental results are provided for free radical solution polymerization of methyl methacrylate.

  12. On a Formal Tool for Reasoning About Flight Software Cost Analysis

    Science.gov (United States)

    Spagnuolo, John N., Jr.; Stukes, Sherry A.

    2013-01-01

    A report focuses on the development of flight software (FSW) cost estimates for 16 Discovery-class missions at JPL. The techniques and procedures developed enabled streamlining of the FSW analysis process, and provided instantaneous confirmation that the data and processes used for these estimates were consistent across all missions. The research provides direction as to how to build a prototype rule-based system for FSW cost estimation that would provide (1) FSW cost estimates, (2) explanation of how the estimates were arrived at, (3) mapping of costs, (4) mathematical trend charts with explanations of why the trends are what they are, (5) tables with ancillary FSW data of interest to analysts, (6) a facility for expert modification/enhancement of the rules, and (7) a basis for conceptually convenient expansion into more complex, useful, and general rule-based systems.

  13. Using Teamcenter engineering software for a successive punching tool lifecycle management

    Science.gov (United States)

    Blaga, F.; Pele, A.-V.; Stǎnǎşel, I.; Buidoş, T.; Hule, V.

    2015-11-01

    The paper presents the results of studies and research on the implementation of Teamcenter (TC) for integrated product lifecycle management in a virtual enterprise. The results can also be applied in a real enterprise. The product considered was a successive punching and cutting tool designed to produce a sheet metal part. The paper defines the technical documentation flow (flow of information) in the process of constructive computer-aided design of the tool. After the design phase is completed, a list of parts is generated containing standard or manufactured components (BOM, Bill of Materials). The BOM may be exported to MS Excel (.xls) format and can be transferred to other departments of the company in order to supply the necessary materials and resources to achieve the final product. This paper describes the procedure to modify or change certain dimensions of the sheet metal part obtained by punching. After 3D and 2D design, the digital prototype of the punching tool moves to the following lifecycle phase, the manufacturing process. For each operation of the technological process the corresponding phases are described in detail. Teamcenter makes it possible to describe the manufacturing company structure, including the workstations that carry out the various operations of the manufacturing process. The paper reveals that the implementation of Teamcenter PDM in a company improves the efficiency of managing product information, eliminating time spent searching, verifying and correcting documentation, while ensuring the uniqueness and completeness of the product data.

  14. YANA – a software tool for analyzing flux modes, gene-expression and enzyme activities

    Directory of Open Access Journals (Sweden)

    Engels Bernd

    2005-06-01

    Full Text Available Abstract Background A number of algorithms for steady state analysis of metabolic networks have been developed over the years. Of these, Elementary Mode Analysis (EMA) has proven especially useful. Despite its low user-friendliness, METATOOL, a reliable high-performance implementation of the algorithm, has been the instrument of choice up to now. As reported here, the analysis of metabolic networks has been improved by an editor and analyzer of metabolic flux modes. Analysis routines for expression levels and for the most central, well-connected metabolites and their metabolic connections are of particular interest. Results YANA features a platform-independent, dedicated toolbox for metabolic networks with a graphical user interface to calculate (integrating METATOOL), edit (including support for the SBML format), visualize, centralize, and compare elementary flux modes. Further, YANA calculates expected flux distributions for a given Elementary Mode (EM) activity pattern and vice versa. Moreover, a dissection algorithm, a centralization algorithm, and an average diameter routine can be used to simplify and analyze complex networks. Proteomics or gene expression data give a rough indication of some individual enzyme activities, whereas the complete flux distribution in the network is often not known. As such data are noisy, YANA features a fast evolutionary algorithm (EA) for the prediction of EM activities with minimum error, including alerts for inconsistent experimental data. We offer the possibility to include further known constraints (e.g. growth constraints) in the EA calculation process. The redox metabolism around glutathione reductase serves as an illustration example. All software and documentation are available for download at http://yana.bioapps.biozentrum.uni-wuerzburg.de. Conclusion A graphical toolbox and an editor for METATOOL as well as a series of additional routines for metabolic network analyses constitute a new user
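    The flux-distribution calculation mentioned above (fluxes from a given EM activity pattern) reduces, in principle, to a weighted sum of the mode vectors. The toy network, mode vectors, and activities below are invented for illustration and are not taken from YANA.

```python
# Each elementary mode is a vector of relative fluxes over the reactions.
modes = [
    [1, 1, 0],   # EM1 uses reactions R1 and R2
    [1, 0, 1],   # EM2 uses reactions R1 and R3
]
activities = [2.0, 0.5]  # assumed EM activity pattern

# Net flux of each reaction = sum over modes of (activity * mode coefficient).
n_reactions = len(modes[0])
flux = [sum(a * m[r] for a, m in zip(activities, modes))
        for r in range(n_reactions)]
# flux == [2.5, 2.0, 0.5]
```

    The inverse problem (EM activities from noisy measured fluxes) is the harder direction, which is why YANA resorts to an evolutionary algorithm.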

  15. Models, methods and software tools to evaluate the quality of informational and educational resources

    International Nuclear Information System (INIS)

    The paper studies modern methods and tools for evaluating the quality of data systems, which make it possible to determine the specificity of informational and educational resources (IER). The author has developed a model of IER quality management covering all stages of the life cycle, and an integrated multi-level hierarchical system of IER quality assessment that takes into account both information properties and the targeted assignment of the resource. The author presents a mathematical and algorithmic justification for solving the problem of IER quality management, and offers a data system to assess IER quality

  16. Software tools for 3d modeling as a part of design and technology in primary school

    OpenAIRE

    Mihovec, Nastja

    2013-01-01

    There are numerous programs that enable 3D modeling. We can choose from various free programs or ones that must be paid for. Many designers and engineers use paid programs such as AutoCad, Maya, ProEngineer, Cinema 3D, SolidWorks, etc. In their opinion these programs give their users more than the free ones, mainly because of their better modeling quality, tools, functions, ease of use, support, maintenance, etc. Free program developers try very hard to convince these users to reconsider,...

  17. A Critical Study of Effect of Web-Based Software Tools in Finding and Sharing Digital Resources--A Literature Review

    Science.gov (United States)

    Baig, Muntajeeb Ali

    2010-01-01

    The purpose of this paper is to review the effect of web-based software tools for finding and sharing digital resources. A positive correlation between learning and studying through online tools has been found in recent research. In the traditional classroom, the search for resources is limited to the library, and the sharing of resources is limited to the…

  18. Safety assessment driving radioactive waste management solutions (SADRWMS Methodology) implemented in a software tool (SAFRAN)

    Energy Technology Data Exchange (ETDEWEB)

    Kinker, M., E-mail: M.Kinker@iaea.org [International Atomic Energy Agency (IAEA), Vienna (Austria); Avila, R.; Hofman, D., E-mail: rodolfo@facilia.se [FACILIA AB, Stockholm (Sweden); Jova Sed, L., E-mail: jovaluis@gmail.com [Centro Nacional de Seguridad Nuclear (CNSN), La Habana (Cuba); Ledroit, F., E-mail: frederic.ledroit@irsn.fr [IRSN PSN-EXP/SSRD/BTE, (France)

    2013-07-01

    In 2004, the International Atomic Energy Agency (IAEA) organized the International Project on Safety Assessment Driving Radioactive Waste Management Solutions (SADRWMS) to examine international approaches to safety assessment for predisposal management of radioactive waste. The initial outcome of the SADRWMS Project was achieved through the development of flowcharts which could be used to improve the mechanisms for applying safety assessment methodologies to predisposal management of radioactive waste. These flowcharts have since been incorporated into DS284 (General Safety Guide on the Safety Case and Safety Assessment for Predisposal Management of Radioactive Waste), and were also considered during the early development stages of the Safety Assessment Framework (SAFRAN) Tool. In 2009 the IAEA presented DS284 to the IAEA Waste Safety Standards Committee, during which it was proposed that the graded approach to safety case and safety assessment be illustrated through the development of Safety Reports for representative predisposal radioactive waste management facilities and activities. To oversee the development of these reports, it was agreed to establish the International Project on Complementary Safety Reports: Development and Application to Waste Management Facilities (CRAFT). The goal of the CRAFT project is to develop complementary reports by 2014, which the IAEA could then publish as IAEA Safety Reports. The present work describes how the DS284 methodology and SAFRAN Tool can be applied in the development and review of the safety case and safety assessment to a range of predisposal waste management facilities or activities within the Region. (author)

  19. Regional Economic Accounting (REAcct). A software tool for rapidly approximating economic impacts

    Energy Technology Data Exchange (ETDEWEB)

    Ehlen, Mark Andrew; Vargas, Vanessa N.; Loose, Verne William; Starks, Shirley J.; Ellebracht, Lory A.

    2011-07-01

    This paper describes the Regional Economic Accounting (REAcct) analysis tool, which has been in use for the last 5 years to rapidly estimate approximate economic impacts of disruptions due to natural or manmade events. It is based on and derived from the well-known and extensively documented input-output modeling technique initially presented by Leontief and further developed by numerous contributors. REAcct provides county-level economic impact estimates in terms of gross domestic product (GDP) and employment for any area in the United States. The process for using REAcct incorporates geospatial computational tools and site-specific economic data, permitting the identification of geographic impact zones, which allows differential magnitude and duration estimates to be specified for the regions affected by a simulated or actual event. Using these data as input to REAcct, the number of employees in 39 directly affected economic sectors (37 industry production sectors and 2 government sectors) is calculated and aggregated to provide direct impact estimates. Indirect estimates are then calculated using Regional Input-Output Modeling System (RIMS II) multipliers. The interdependent relationships between critical infrastructures, industries, and markets are captured by the relationships embedded in the input-output modeling structure.
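    The core REAcct-style arithmetic can be sketched as direct impacts per sector scaled by input-output multipliers. The sector names, employment counts, GDP figures, duration, and multiplier values below are hypothetical stand-ins, not actual RIMS II data.

```python
# Hypothetical direct impacts in the affected zone.
direct_employment = {"manufacturing": 1200, "retail": 400}        # jobs idled
gdp_per_employee  = {"manufacturing": 150_000, "retail": 60_000}  # $/job/year
multipliers       = {"manufacturing": 2.0, "retail": 1.5}         # total/direct

duration_years = 0.25  # a three-month disruption

# Total (direct + indirect) GDP impact: direct GDP scaled by the
# sector multiplier and prorated by the disruption duration.
total_gdp_impact = sum(
    direct_employment[s] * gdp_per_employee[s] * multipliers[s] * duration_years
    for s in direct_employment
)
# total_gdp_impact == 99_000_000.0  (90M manufacturing + 9M retail)
```

    The real tool performs the same kind of aggregation over 39 sectors, with the zone membership coming from the geospatial step.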

  20. Determination of Flux linkage Characteristics and Inductance of a Submersible Switched Reluctance Motor using Software Tools

    Directory of Open Access Journals (Sweden)

    Sundaram Maruthachalam

    2011-01-01

    Full Text Available Problem statement: The Switched Reluctance Motor (SRM) is an old member of the electric machines family. Its simple structure, ruggedness and inexpensive manufacturability make it attractive for industrial applications. The applications of switched reluctance motors in various industrial fields are now being explored by many engineers. However, switched reluctance motors have not so far been used in submersible underwater motors for agricultural purposes. The torque developed by an SRM depends on the change of flux linkage with rotor position. The flux linkage characteristic of the motor is required to design the control circuit. Since the SRM is non-linear in nature, estimation and calculation of the flux linkage characteristics is very difficult. Approach: Using the flux tube method, a simple algorithm was developed in MATLAB. ANSYS software was used to determine the flux distribution at various rotor positions. Results: The theoretically calculated aligned and unaligned flux linkage values at a current of 7 A are 72.7 mWb and 13.79 mWb respectively; the corresponding values obtained with FEA simulation are 92.73 mWb and 19.175 mWb. Conclusion: A simplified method for the determination of the flux linkage characteristics of a submersible SRM using MATLAB has been presented. The obtained values have been validated against the ANSYS FEM method: the calculated unaligned and aligned inductance values of a 4-phase, 3 hp, 220 V submersible SRM obtained with the simplified MATLAB method match the ANSYS FEM results closely.
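    At a single operating point, an apparent inductance can be read off the flux-linkage characteristic as L = λ/i. The sketch below applies this to the paper's analytical figures (72.7 mWb aligned, 13.79 mWb unaligned at 7 A); it is only a first-cut check, not the paper's full MATLAB procedure, and ignores saturation along the magnetization curve.

```python
def inductance_mH(flux_linkage_mWb, current_A):
    """Apparent inductance in mH from flux linkage (mWb) and current (A).

    mWb / A has the same numeric value as mH, so no unit conversion is needed.
    """
    return flux_linkage_mWb / current_A

L_aligned   = inductance_mH(72.7, 7.0)    # ~10.39 mH
L_unaligned = inductance_mH(13.79, 7.0)   # ~1.97 mH
```

    The large aligned/unaligned inductance ratio is what makes reluctance torque production possible in the first place.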

  1. Development of a new software tool, based on ANN technology, in neutron spectrometry and dosimetry research

    International Nuclear Information System (INIS)

    Artificial Intelligence is a branch of study which enhances the capability of computers by giving them human-like intelligence. The brain architecture has been extensively studied, and attempts have been made to emulate it, as in the Artificial Neural Network (ANN) technology. A large variety of neural network architectures have been developed, and they have gained widespread popularity over the last few decades. Their application is considered a substitute for many classical techniques that have been used for many years, as in the neutron spectrometry and dosimetry research areas. In previous works, a new approach called Robust Design of Artificial Neural Networks was applied to build an ANN topology capable of solving the neutron spectrometry and dosimetry problems within the MATLAB programming environment. In this work, the knowledge stored in the MATLAB ANN's synaptic weights was extracted in order to develop, for the first time, a customized software application based on ANN technology, which is proposed for use in the neutron spectrometry and simultaneous dosimetry fields. (Author)

  2. A software tool for geostatistical analysis of thermal response test data: GA-TRT

    Science.gov (United States)

    Focaccia, Sara; Tinti, Francesco; Bruno, Roberto

    2013-09-01

    In this paper we present a new method (DCE - Drift and Conditional Estimation), coupling Infinite Line Source (ILS) theory with geostatistics, to interpret thermal response test (TRT) data, together with the user-friendly software that implements it (GA-TRT). Many methods (analytical and numerical) currently exist to analyze TRT data. The innovation derives from the fact that we use a probabilistic approach, able to overcome, without excessively complicated calculations, many interpretation problems (choice of the guess value of ground volumetric heat capacity, identification of the fluctuations of recorded data, inability to provide a measure of the precision of the estimates obtained) that cannot be solved otherwise. The new procedure is based on a geostatistical drift analysis of temperature records, which leads to a precise estimate of the equivalent ground thermal conductivity (λg), confirmed by the calculation of its estimation variance. Afterwards, based on λg, a monovariate regression on the original data allows for the identification of the theoretical relationship between ground volumetric heat capacity (cg) and borehole thermal resistance (Rb). By assuming a monovariate Probability Distribution Function (PDF) for each variable, the joint PDF conditional on the cg-Rb relationship is found; finally, the conditional expectation allows for the identification of the correct and optimal couple of estimated cg-Rb values.
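    The ILS starting point that DCE builds on can be sketched as follows: the equivalent ground thermal conductivity follows from the slope k of mean fluid temperature against ln(time), λg = Q / (4πHk). The heat rate, borehole length, and temperature record below are synthetic assumptions, and the geostatistical drift analysis itself is not reproduced.

```python
import math

Q = 5000.0   # injected heat rate, W (assumed)
H = 100.0    # borehole length, m (assumed)

# Synthetic late-time record: temperature rises linearly in ln(t).
times_s = [3600 * h for h in (10, 20, 40, 80)]
temps_C = [20.0, 20.8, 21.6, 22.4]

# Ordinary least-squares slope of T against ln(t).
x = [math.log(t) for t in times_s]
mx, my = sum(x) / len(x), sum(temps_C) / len(temps_C)
k = sum((xi - mx) * (yi - my) for xi, yi in zip(x, temps_C)) \
    / sum((xi - mx) ** 2 for xi in x)

lambda_g = Q / (4 * math.pi * H * k)  # equivalent conductivity, W/(m K)
# lambda_g ~ 3.45 W/(m K) for this synthetic record
```

    DCE replaces this plain regression with a drift analysis that also yields an estimation variance, i.e. a measure of the precision of λg.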

  3. CAGO: a software tool for dynamic visual comparison and correlation measurement of genome organization.

    Directory of Open Access Journals (Sweden)

    Yi-Feng Chang

    Full Text Available CAGO (Comparative Analysis of Genome Organization) is developed to address two critical shortcomings of conventional genome atlas plotters: lack of dynamic exploratory functions and absence of signal analysis for genomic properties. With dynamic exploratory functions, users can directly manipulate the chromosome tracks of a genome atlas and intuitively identify distinct genomic signals by visual comparison. Signal analysis of genomic properties can further detect inconspicuous patterns in noisy genomic properties and calculate correlations between genomic properties across various genomes. To implement the dynamic exploratory functions, CAGO presents each genome atlas in Scalable Vector Graphics (SVG) format and allows users to interact with it in an SVG viewer through JavaScript. Signal analysis functions are implemented using the R statistical software and the discrete wavelet transform package waveslim. CAGO is not only a plotter for generating complex genome atlases, but also a platform for exploring genome atlases with dynamic exploratory functions for visual comparison and with signal analysis for comparing genomic properties across multiple organisms. The web-based application of CAGO, its source code, user guides, video demos, and live examples are publicly available and can be accessed at http://cbs.ym.edu.tw/cago.

  4. Development of a new software tool, based on ANN technology, in neutron spectrometry and dosimetry research

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz R, J.M.; Martinez B, M.R.; Vega C, H.R. [Universidad Autonoma de Zacatecas, Av. Ramon Lopez Velarde 801, A.P. 336, 98000 Zacatecas (Mexico)

    2007-07-01

    Artificial Intelligence is a branch of study which enhances the capability of computers by giving them human-like intelligence. The brain architecture has been extensively studied, and attempts have been made to emulate it, as in the Artificial Neural Network (ANN) technology. A large variety of neural network architectures have been developed, and they have gained widespread popularity over the last few decades. Their application is considered a substitute for many classical techniques that have been used for many years, as in the neutron spectrometry and dosimetry research areas. In previous works, a new approach called Robust Design of Artificial Neural Networks was applied to build an ANN topology capable of solving the neutron spectrometry and dosimetry problems within the MATLAB programming environment. In this work, the knowledge stored in the MATLAB ANN's synaptic weights was extracted in order to develop, for the first time, a customized software application based on ANN technology, which is proposed for use in the neutron spectrometry and simultaneous dosimetry fields. (Author)

  5. Data and software tools for gamma radiation spectral threat detection and nuclide identification algorithm development and evaluation

    International Nuclear Information System (INIS)

    The detection of radiological and nuclear threats is extremely important to national security. The federal government is spending significant resources developing new detection systems and attempting to increase the performance of existing ones. The detection of illicit radionuclides that may pose a radiological or nuclear threat is a challenging problem complicated by benign radiation sources (e.g., cat litter and medical treatments), shielding, and large variations in background radiation. Although there is a growing acceptance within the community that concentrating efforts on algorithm development (independent of the specifics of fully assembled systems) has the potential for significant overall system performance gains, there are two major hindrances to advancements in gamma spectral analysis algorithms under the current paradigm: access to data and common performance metrics along with baseline performance measures. Because many of the signatures collected during performance measurement campaigns are classified, dissemination to algorithm developers is extremely limited. This leaves developers no choice but to collect their own data if they are lucky enough to have access to material and sensors. This is often combined with their own definition of metrics for measuring performance. These two conditions make it all but impossible for developers and external reviewers to make meaningful comparisons between algorithms. Without meaningful comparisons, performance advancements become very hard to achieve and (more importantly) recognize. The objective of this work is to overcome these obstacles by developing and freely distributing real and synthetically generated gamma-spectra data sets as well as software tools for performance evaluation with associated performance baselines to national labs, academic institutions, government agencies, and industry. At present, datasets for two tracks, or application domains, have been developed: one that includes temporal

  6. Data and software tools for gamma radiation spectral threat detection and nuclide identification algorithm development and evaluation

    Science.gov (United States)

    Portnoy, David; Fisher, Brian; Phifer, Daniel

    2015-06-01

    The detection of radiological and nuclear threats is extremely important to national security. The federal government is spending significant resources developing new detection systems and attempting to increase the performance of existing ones. The detection of illicit radionuclides that may pose a radiological or nuclear threat is a challenging problem complicated by benign radiation sources (e.g., cat litter and medical treatments), shielding, and large variations in background radiation. Although there is a growing acceptance within the community that concentrating efforts on algorithm development (independent of the specifics of fully assembled systems) has the potential for significant overall system performance gains, there are two major hindrances to advancements in gamma spectral analysis algorithms under the current paradigm: access to data and common performance metrics along with baseline performance measures. Because many of the signatures collected during performance measurement campaigns are classified, dissemination to algorithm developers is extremely limited. This leaves developers no choice but to collect their own data if they are lucky enough to have access to material and sensors. This is often combined with their own definition of metrics for measuring performance. These two conditions make it all but impossible for developers and external reviewers to make meaningful comparisons between algorithms. Without meaningful comparisons, performance advancements become very hard to achieve and (more importantly) recognize. The objective of this work is to overcome these obstacles by developing and freely distributing real and synthetically generated gamma-spectra data sets as well as software tools for performance evaluation with associated performance baselines to national labs, academic institutions, government agencies, and industry. At present, datasets for two tracks, or application domains, have been developed: one that includes temporal

  7. Using Numerical Models in the Development of Software Tools for Risk Management of Accidents with Oil and Inert Spills

    Science.gov (United States)

    Fernandes, R.; Leitão, P. C.; Braunschweig, F.; Lourenço, F.; Galvão, P.; Neves, R.

    2012-04-01

    substances, helping in the management of the crisis, in the distribution of response resources, or in prioritizing specific areas. They can also be used for the detection of pollution sources. However, the resources involved, and the scientific and technological levels needed to manipulate numerical models, have both limited the interoperability between operational models, monitoring tools and decision-support software tools. The increasing predictive capacity for metocean conditions and for the fate and behaviour of pollutants spilt at sea or in coastal zones, and the presence of monitoring tools like vessel traffic control systems, can both provide safer support for decision-making in emergency or planning issues associated with pollution risk management, especially if used in an integrated way. Following this approach, and taking advantage of an integrated framework developed in the ARCOPOL (www.arcopol.eu) and EASYCO (www.project-easy.info) projects, three innovative model-supported software tools were developed and applied in the Atlantic Area and/or the Portuguese coast. Two of these tools are used for spill model simulations: a web-based interface (EASYCO web bidirectional tool) and an advanced desktop application (MOHID Desktop Spill Simulator), both of them allowing the end user to have control over the model simulations. Parameters such as the date and time of the event, location and oil spill volume are provided by the users; these interactive tools also integrate the best available metocean forecasts (wave, meteorological, hydrodynamic) from different institutions in the Atlantic Area. Metocean data are continuously gathered from remote THREDDS data servers (using OPeNDAP) or ftp sites, and then automatically interpolated and pre-processed to be made available to the simulators. These simulation tools can also import initial data and export results from/to remote servers, using OGC WFS services. Simulations are provided to the end user in a matter of seconds, and thus can be very

  8. Software tools and preliminary design of a control system for the 40m OAN radiotelescope

    Science.gov (United States)

    de Vicente, P.; Bolaño, R.

    2004-07-01

    The Observatorio Astronómico Nacional (OAN) is building a 40m radiotelescope in its facilities in Yebes (Spain) which will be delivered by April 2004. The servosystem will be controlled by an ACU (Antenna Control Unit), a real time computer running VxWorks which will be commanded from a remote computer (RCC) or from a local computer (LCC) which will act as console. We present the tools we have chosen to develop and use the control system for the RCC and the criteria followed for the choices we made. We also present a preliminary design of the control system on which we are currently working. The RCC will run a server which communicates with the ACU using sockets and with the clients, receivers and backends using OmniOrb, a free implementation of CORBA. Clients running Python will allow the users to control the antenna from any host connected to a LAN or a secure Internet connection.

  9. MAAC: a software tool for user authentication and access control to the electronic patient record in an open distributed environment

    Science.gov (United States)

    Motta, Gustavo H.; Furuie, Sergio S.

    2004-04-01

    Designing proper models for authorization and access control for the electronic patient record (EPR) is essential to wide-scale use of the EPR in large health organizations. This work presents MAAC (Middleware for Authentication and Access Control), a tool that implements a contextual role-based access control (RBAC) authorization model. RBAC regulates users' access to computer resources based on their organizational roles. A contextual authorization uses environmental information available at access-request time, like the user/patient relationship, in order to decide whether a user has the right to access an EPR resource. The software architecture in which MAAC is implemented uses the Lightweight Directory Access Protocol, the Java programming language, and the CORBA/OMG standards CORBA Security Service and Resource Access Decision Facility. With those open and distributed standards, heterogeneous EPR components can request user authentication and access authorization services in a unified and consistent fashion across multiple platforms.
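    A contextual RBAC decision of the kind described can be sketched as a two-stage check: the role must allow the action, and the context (here, an assumed user-patient care relationship) must also permit it. The role names, actions, and policy below are invented for illustration and are not MAAC's actual API.

```python
# Hypothetical role-permission and context tables.
ROLE_PERMISSIONS = {
    "physician": {"read_epr", "write_epr"},
    "clerk": {"read_demographics"},
}
CARE_RELATIONSHIPS = {("dr_silva", "patient_42")}  # (user, patient) pairs

def authorize(user, role, action, patient):
    """Grant only if the role allows the action AND the context permits it."""
    if action not in ROLE_PERMISSIONS.get(role, set()):
        return False  # plain RBAC check fails
    if action in {"read_epr", "write_epr"}:
        # Contextual check: EPR access requires a care relationship.
        return (user, patient) in CARE_RELATIONSHIPS
    return True

assert authorize("dr_silva", "physician", "read_epr", "patient_42")
assert not authorize("dr_gomez", "physician", "read_epr", "patient_42")
```

    In MAAC the second stage is evaluated by the Resource Access Decision service against information gathered at access-request time, rather than a static table.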

  10. IMPACT OF THE USE OF THE CMAP-TOOLS SOFTWARE ON THE CONCEPT MAPPING TECHNIQUE

    OpenAIRE

    Susy Karina Dávila Panduro; Carlos Antonio Li Loo Kung

    2012-01-01

    The research aimed to apply the Cmap-Tools software in the use of concept maps for Social Sciences courses at the Faculty of Education Sciences and Humanities of the UNAP, in the city of Iquitos, in 2011. The study is of the experimental type, and the design was pre-experimental, of the static-group comparison (comparison of groups only) type. The population consisted of the students of the Social Sciences specialty of the Faculty ...

  11. Neurophysiological analytics for all! Free open-source software tools for documenting, analyzing, visualizing, and sharing using electronic notebooks.

    Science.gov (United States)

    Rosenberg, David M; Horn, Charles C

    2016-08-01

    Neurophysiology requires an extensive workflow of information analysis routines, which often includes incompatible proprietary software, introducing limitations based on financial costs, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format, such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to display locally or remotely in an internet browser, which can be saved as an HTML, PDF, or other file format for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus, a model system to investigate gut-brain communication, for example, in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software, an implementation of reproducible research. This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase efficiency, transparency, and the collaborative potential of neurophysiological research. PMID:27098025
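    One of the notebook steps described, spike detection, can be reduced to a threshold-crossing scan over the raw trace. The sketch below uses synthetic data and is far simpler than the statistically validated spike sorting the authors perform.

```python
def detect_spikes(trace, threshold):
    """Return sample indices where the trace crosses the threshold upward."""
    return [i for i in range(1, len(trace))
            if trace[i - 1] < threshold <= trace[i]]

# Synthetic voltage trace with two spike-like events.
trace = [0.1, 0.0, 2.5, 0.2, 0.1, 3.1, 2.9, 0.0]
spikes = detect_spikes(trace, 2.0)
# spikes == [2, 5]
```

    In a Jupyter notebook this cell would sit next to its narrative documentation and a plot of the trace, which is what makes the workflow shareable and reproducible.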

  12. Pol(F)lux software, a dedicated tool to stream nutrient fluxes and uncertainties calculations for survey optimization

    Science.gov (United States)

    Moatar, F.; Curie, F.; Meybeck, M.

    2015-12-01

    Data on stream material fluxes are essential for calculating element cycles (carbon, nutrients, and pollutants) and erosion rates from local to global scales. In most water-quality stations throughout the world, stream fluxes are calculated from daily flow data (Q) and discrete concentration data (C), the latter often being the main cause of large uncertainties. This paper presents the Pol(F)lux software tool, which addresses two major issues: i) the selection of the optimal (minimum-uncertainty) flux calculation method among 8 methods, based on the flux variability matrix; ii) for the discharge-weighted concentration method (the most commonly used method, recommended in the international convention for the protection of the North Sea and the Northeast Atlantic, the OSPAR Convention), the sampling frequency needed to achieve a specified level of precision can be predicted from the flux variability indicator (M2%, the cumulative material flux discharged during the upper 2% of highest daily fluxes) through a nomograph for sampling intervals of 3 to 60 days. The software was validated for water-quality stations in medium to large basins (basin area > 500 km²). The flux variability matrix, the cornerstone of the Pol(F)lux software, is based on two indicators: (a) the cumulative flow volume discharged during the upper 2% of highest daily flows, W2%, which characterizes the hydrological reactivity of the catchment during the highest flows, and (b) the truncated b50sup exponent, calculated as the exponent of the relationship between concentration and discharge (on logarithmic scales) at high-water stages (discharges greater than the median flow), which characterizes the behaviour of the stream material. We postulate that performance is similar for stream materials in the same flux variability class; the matrix comprises 4 classes of hydrological reactivity (W2%) and 5 classes of biogeochemical behaviour (b50sup), defining 20 potential variability classes.
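    The discharge-weighted concentration method named above can be sketched in a few lines: the discharge-weighted mean of the sampled concentrations is multiplied by the mean discharge from the complete daily flow record. The sample values and flow figure below are synthetic, and the uncertainty analysis that is Pol(F)lux's actual contribution is not reproduced.

```python
# Discrete samples: (concentration in mg/L, discharge on the sampling day in m3/s).
samples = [(2.0, 10.0), (4.0, 30.0), (1.0, 5.0)]

# Stand-in for the mean of the complete daily discharge record, m3/s.
mean_daily_discharge = 15.0

# Discharge-weighted mean concentration, mg/L.
cw = sum(c * q for c, q in samples) / sum(q for _, q in samples)

# Annual flux: mg/L x m3/s = g/s; scale to tonnes per year.
flux = cw * mean_daily_discharge * 86400 * 365 / 1e6
# flux ~ 1524 tonnes/year for these synthetic numbers
```

    Because high-flow days dominate both W2% and M2%, infrequent sampling that misses flood events is exactly what the nomograph approach is designed to guard against.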

  13. Neurophysiological analytics for all! Free open-source software tools for documenting, analyzing, visualizing, and sharing using electronic notebooks.

    Science.gov (United States)

    Rosenberg, David M; Horn, Charles C

    2016-08-01

    Neurophysiology requires an extensive workflow of information analysis routines, which often includes incompatible proprietary software, introducing limitations based on financial costs, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format, such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to display locally or remotely in an internet browser, which can be saved as an HTML, PDF, or other file format for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus, a model system to investigate gut-brain communication, for example, in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software, an implementation of reproducible research. This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase efficiency, transparency, and the collaborative potential of neurophysiological research.

  14. A novel tool for user-friendly estimation of natural, diagnostic and professional radiation risk: Radio-Risk software

    Energy Technology Data Exchange (ETDEWEB)

    Carpeggiani, Clara; Paterni, Marco [CNR, Institute of Clinical Physiology (Italy); Caramella, Davide [Radiology Department, Pisa University, Pisa (Italy); Vano, Eliseo [San Carlos Hospital, Radiology Department, Complutense University, Madrid (Spain); Semelka, Richard C. [University of North Carolina, Chapel Hill, NC (United States); Picano, Eugenio, E-mail: picano@ifc.cnr.it [CNR, Institute of Clinical Physiology (Italy)

    2012-11-15

    Background: Awareness of radiological risk is low among doctors and patients. An educational/decision tool that considers each patient's cumulative lifetime radiation exposure would facilitate provider-patient communication. Aim: The purpose of this work was to develop user-friendly software for simple estimation and communication of radiological risk to patients and doctors as a part of the SUIT-Heart (Stop Useless Imaging Testing in Heart disease) Project of the Tuscany Region. Methods: We developed a novel software program (PC platform, Windows OS, fully downloadable at http://suit-heart.ifc.cnr.it) considering reference dose estimates from American Heart Association Radiological Imaging 2009 guidelines and UK Royal College of Radiology 2007 guidelines. Age- and gender-weighted cancer risks were derived from the Biological Effects of Ionising Radiation VII Committee, 2006. Results: With simple input functions (demographics, age, gender) the user selects from a predetermined menu variables relating to natural (e.g., airplane flights and geo-tracked background exposure), professional (e.g., cath lab workers) and medical (e.g., CT, cardiac scintigraphy, coronary stenting) sources. The program provides a simple numeric (cumulative effective dose in millisievert, mSv, and equivalent number of chest X-rays) and graphic (cumulative temporal trends of exposure, cancer cases out of 100 exposed persons) display. Conclusions: A simple software program allows straightforward estimation of cumulative dose (in multiples of chest X-rays) and risk (as extra % lifetime cancer risk). Pictorial display of radiation risk may be valuable for increasing radiological awareness in cardiologists.
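    The bookkeeping the program performs can be sketched as follows. The per-exam doses, the 0.02 mSv chest X-ray reference, and the ~5%-per-sievert linear risk figure are typical published order-of-magnitude values used here for illustration only, not the exact tables used by Radio-Risk.

```python
# Illustrative effective doses per exam, in mSv (typical literature values).
exams_msv = {"chest CT": 7.0, "cardiac scintigraphy": 9.0, "coronary stenting": 15.0}

# Cumulative effective dose, expressed also as chest X-ray equivalents
# (one chest X-ray taken as roughly 0.02 mSv).
cumulative_msv = sum(exams_msv.values())
chest_xray_equivalents = cumulative_msv / 0.02

# Crude linear-no-threshold extra lifetime cancer risk, assuming
# about 5% per sievert (BEIR VII order of magnitude).
extra_lifetime_risk_pct = cumulative_msv / 1000 * 5
```

    For this invented history the cumulative dose is 31 mSv, i.e. 1550 chest X-ray equivalents, which is the kind of "simple number" the abstract argues communicates risk effectively.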

  15. STUDY REGARDING THE USE OF THE TOOLS OFFERED BY MICROSOFT EXCEL SOFTWARE IN THE ACTIVITY OF THE BIHOR COUNTY COMPANIES

    Directory of Open Access Journals (Sweden)

    Țarca Naiana

    2014-07-01

    Full Text Available A business activity involves many decision situations, which include the organization, aggregation, processing, modelling and analysis of large volumes of information. By using specialized software that provides tools for analysis, aggregation and data modelling, company managers can quickly obtain significant information about company activity. For different reasons, some companies opt for a summary analysis of data, while others choose a complex analysis. Many companies use spreadsheet applications for inventory, data processing, modelling and analysis in their business activities, and Microsoft Excel is used by many of those who know and use spreadsheet applications in their work. Using the tools for organizing, aggregating, modelling and analysing data provided by a spreadsheet application, these companies can make complex economic analyses and prognoses that lead to better decisions. For example, Pivot tables are a simple way to extract relevant information from complex data sets quickly; Solver can be used to solve optimization problems; and Correlation is useful for interpreting the relationships between various indicators. Based on these considerations, we conducted a study seeking information on how instruments such as Pivot tables, Solver and Correlation are used in the business activities of companies. The companies that attach high importance to Pivot tables are medium and large ones. Among the companies using the Solver tool, very few use the GRG Nonlinear and Evolutionary algorithms; Solver is therefore used mostly to solve optimization problems involving linear models. The Correlation tool, which could help decision makers understand more easily why an increase in one factor implies a change in other factors, and consequently help them make better decisions, is used far too little. Still too many companies attach little importance to organizing their data.

  16. A software tool enabling the analysis of small lateral features without the use of a micro-beam

    Science.gov (United States)

    Healy, M. J. F.; Torres, M.; Painter, J. D.

    2006-08-01

    A new method is developed that allows samples whose composition varies rapidly across the surface, such as actual microelectronic devices, to be composition depth profiled without the use of a micro-beam or other special equipment. This is achieved by extending the traditional simulation method to an extra dimension where lateral position is also accommodated. The tool is a software shell to SIMNRA [M. Mayer, SIMNRA User's Guide, Report IPP 9/113, Max-Planck-Institut für Plasmaphysik, Garching, Germany, 1997] that allows a multi-dimensional model of the sample to be created, simulated and iterated towards experiment. It is demonstrated on a silicon-dioxide-coated wafer embedded with narrowly spaced sub-micron wide metal tracks probed with a conventional beam-spot of millimetre proportions, and is supported by electron microscopy studies. The software shell also eases the analysis of laterally homogeneous samples where complementary ion beam analysis techniques must be employed, by allowing a single model to control multiple simulations based on different geometries or techniques.
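    The extra lateral dimension can be sketched as a weighted sum of per-region simulated spectra, each weight being the fraction of the millimetre-scale beam spot covered by that region. The spectra and coverage fractions below are invented placeholders, not SIMNRA output.

```python
import numpy as np

# Illustrative simulated spectra (counts per channel) for two lateral
# regions of the sample: the oxide background and the metal tracks.
spectrum_oxide = np.array([10.0, 40.0, 80.0, 20.0])
spectrum_metal = np.array([5.0, 15.0, 60.0, 90.0])

# Hypothetical areal coverage: sub-micron metal tracks cover 30% of the
# area probed by the broad beam spot.
weights = {"oxide": 0.7, "metal": 0.3}

# The measured spectrum is modelled as the coverage-weighted combination
# of the laterally distinct regions.
combined = weights["oxide"] * spectrum_oxide + weights["metal"] * spectrum_metal
```

    Iterating the per-region models and weights until `combined` matches the experimental spectrum is the fitting loop the shell automates.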

  17. GPU-FS-kNN: a software tool for fast and scalable kNN computation using GPUs.

    Directory of Open Access Journals (Sweden)

    Ahmed Shamsul Arefin

    Full Text Available BACKGROUND: The analysis of biological networks has become a major challenge due to the recent development of high-throughput techniques that are rapidly producing very large data sets. The exploding volumes of biological data call for extreme computational power and special computing facilities (i.e. supercomputers). An inexpensive solution, such as general-purpose computation on graphics processing units (GPGPU), can be adapted to tackle this challenge, but the limited internal memory of the device can pose a new problem of scalability. Efficient data and computational parallelism with partitioning is required to provide a fast and scalable solution to this problem. RESULTS: We propose an efficient parallel formulation of the k-Nearest Neighbour (kNN) search problem, which is a popular method for classifying objects in several fields of research, such as pattern recognition, machine learning and bioinformatics. Although very simple and straightforward, the performance of kNN search degrades dramatically for large data sets, since the task is computationally intensive. The proposed approach is not only fast but also scalable to large-scale instances. Based on our approach, we implemented a software tool, GPU-FS-kNN (GPU-based Fast and Scalable k-Nearest Neighbour), for CUDA-enabled GPUs. The basic approach is simple and adaptable to other available GPU architectures. We observed speed-ups of 50-60 times compared with a CPU implementation on a well-known breast microarray study and its associated data sets. CONCLUSION: Our GPU-based fast and scalable k-nearest neighbour search technique (GPU-FS-kNN) provides a significant performance improvement for nearest neighbour computation in large-scale networks. Source code and the software tool are available under the GNU Public License (GPL) at https://sourceforge.net/p/gpufsknn/.
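    The partitioning idea, processing query chunks so that the full distance matrix never has to fit in (GPU or host) memory at once, can be sketched on the CPU with NumPy. The chunk size, data, and function name are illustrative; GPU-FS-kNN does this with CUDA kernels.

```python
import numpy as np

def knn_chunked(data, k, chunk=2):
    """Brute-force kNN, computed one chunk of query points at a time."""
    n = data.shape[0]
    neighbours = np.empty((n, k), dtype=int)
    for start in range(0, n, chunk):
        block = data[start:start + chunk]
        # Squared Euclidean distances from this chunk to all points.
        d2 = ((block[:, None, :] - data[None, :, :]) ** 2).sum(axis=2)
        # Exclude each point itself before taking the k smallest.
        self_idx = np.arange(start, start + block.shape[0])[:, None]
        np.put_along_axis(d2, self_idx, np.inf, axis=1)
        neighbours[start:start + block.shape[0]] = np.argsort(d2, axis=1)[:, :k]
    return neighbours

# Two well-separated pairs of points: each point's nearest neighbour
# should be its pair partner.
points = np.array([[0.0, 0.0], [0.0, 1.0], [5.0, 5.0], [5.0, 6.0]])
nn = knn_chunked(points, k=1)
```

    Only a `chunk x n` distance block exists at any time, which is what makes the GPU version scale past the device memory limit.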

  18. The DSET Tool Library: A software approach to enable data exchange between climate system models

    Energy Technology Data Exchange (ETDEWEB)

    McCormick, J. [Lawrence Livermore National Lab., CA (United States)]

    1994-12-01

    Climate modeling is a computationally intensive process. Until recently computers were not powerful enough to perform the complex calculations required to simulate the earth's climate. As a result, standalone programs were created that represent components of the earth's climate (e.g., Atmospheric Circulation Model). However, recent advances in computing, including massively parallel computing, make it possible to couple the components forming a complete earth climate simulation. The ability to couple different climate model components will significantly improve our ability to predict climate accurately and reliably. Historically, each major component of the coupled earth simulation has been a standalone program designed independently, with different coordinate systems and data representations. In order for two component models to be coupled, the data of one model must be mapped to the coordinate system of the second model. The focus of this project is to provide a general tool to facilitate the mapping of data between simulation components, with an emphasis on using object-oriented programming techniques to provide polynomial interpolation, line and area weighting, and aggregation services.
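    The regridding problem the library addresses can be sketched in one dimension: mapping a field from one model's coordinates onto another's by interpolation. The grids and temperature values are invented, and piecewise-linear interpolation stands in for the library's polynomial interpolation and area-weighting services.

```python
import numpy as np

# Field defined on one model's grid (an atmosphere model, say).
atmos_lat = np.array([0.0, 10.0, 20.0, 30.0])          # source latitudes
temp_atmos = np.array([300.0, 295.0, 290.0, 285.0])    # temperature (K)

# A second model (an ocean model, say) needs the field on its own grid.
ocean_lat = np.array([5.0, 15.0, 25.0])                # target latitudes

# Map the field from the source grid to the target grid.
temp_ocean = np.interp(ocean_lat, atmos_lat, temp_atmos)
```

    Generalizing this mapping to multi-dimensional grids, with weighting and aggregation, is the service a coupling tool library provides so each component model can keep its native coordinates.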

  19. freeQuant: A Mass Spectrometry Label-Free Quantification Software Tool for Complex Proteome Analysis.

    Science.gov (United States)

    Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong

    2015-01-01

    The study of complex proteomes places higher demands on quantification methods based on mass spectrometry. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis that makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and introduces a new method for shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, the MS/MS total ion count coupled with the spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant improve the accuracy of quantification with a better dynamic range.
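    One common way to handle shared peptides in spectral counting, apportioning their counts among isoforms in proportion to the evidence from each isoform's unique peptides, can be sketched as below. The numbers are invented and this is a generic scheme, not necessarily freeQuant's exact algorithm.

```python
# Unique-peptide spectral counts observed for two isoforms of a protein.
unique_counts = {"isoform_A": 30, "isoform_B": 10}

# Spectral counts for a peptide shared by both isoforms.
shared_count = 20

# Distribute the shared counts in proportion to the unique-count evidence,
# then add them to each isoform's total.
total_unique = sum(unique_counts.values())
apportioned = {
    iso: count + shared_count * count / total_unique
    for iso, count in unique_counts.items()
}
```

    Isoform A, with three times the unique evidence, receives three quarters of the shared counts; ignoring shared peptides entirely would systematically underestimate both isoforms.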

  20. freeQuant: A Mass Spectrometry Label-Free Quantification Software Tool for Complex Proteome Analysis

    Directory of Open Access Journals (Sweden)

    Ning Deng

    2015-01-01

    Full Text Available The study of complex proteomes places higher demands on quantification methods based on mass spectrometry. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis that makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and introduces a new method for shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, the MS/MS total ion count coupled with the spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant improve the accuracy of quantification with a better dynamic range.

  1. The Use of Automated Software Tools in Evaluating an e-Learning Platform Quality

    Directory of Open Access Journals (Sweden)

    George Suciu

    2012-09-01

    Full Text Available

    This paper proposes an expert system which can be used to evaluate the quality of an e-learning platform. The proposed expert system uses a modified version of the SEEQUEL Core Quality Framework and was built using the CLIPS expert system generator. The SEEQUEL Core Quality Framework, which originated from the collaboration between the e-learning Industry Group (eLIG) and a number of European expert organizations and associations, coordinated by the MENON Network, is a framework used to build the quality tree by selecting quality characteristics from a list of common characteristics applicable to the whole e-learning experience. CLIPS is a productive development and delivery expert system tool which provides a complete environment for the construction of rule-based expert systems.

    In the first part of this paper the SEEQUEL Core Quality Framework and CLIPS expert system generator are presented showing the advantage of using an expert system for this task. In the second part, a case study of evaluating an e-learning platform is presented. The final conclusion of the experiment was that an expert system can successfully replace a human expert for the proposed task.

  2. An automatic approach for calibrating dielectric bone properties by combining finite-element and optimization software tools.

    Science.gov (United States)

    Su, Yukun; Kluess, Daniel; Mittelmeier, Wolfram; van Rienen, Ursula; Bader, Rainer

    2016-09-01

    The dielectric properties of human bone are one of the most essential inputs required for electromagnetic stimulation aimed at improved bone regeneration. Measuring the electric properties of bone is a difficult task because of the complexity of the bone structure. Therefore, an automatic approach to calibrating the electric properties of bone is presented. The numerical method consists of three steps: generating input from experimental data, performing the numerical simulation, and calibrating the bone dielectric properties. As an example, the dielectric properties of a rabbit distal femur at 20 Hz were calibrated. The calibration was treated as an optimization process, with the aim of finding the dielectric bone properties that best match the numerically simulated and experimentally measured data sets. The optimization was carried out automatically by the optimization software tool iSIGHT in combination with the finite-element solver COMSOL Multiphysics. As a result, the optimum conductivity and relative permittivity of the rabbit distal femur at 20 Hz were found to be 0.09615 S/m and 19522 for cortical bone and 0.14913 S/m and 1561507 for cancellous bone, respectively. The proposed method is a potential tool for the identification of realistic dielectric properties of the entire bone volume. The presented approach combining iSIGHT with COMSOL is applicable to, amongst others, the design of implantable electro-stimulative devices or the optimization of electrical stimulation parameters for improved bone regeneration. PMID:26777343
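    The calibration loop can be sketched as an optimization that minimizes the mismatch between a forward model and measured data. Here the forward model is a deliberately simple linear relation (current proportional to conductivity times field), so the least-squares optimum has a closed form; iSIGHT and COMSOL solve the real, nonlinear finite-element case. All numbers are illustrative.

```python
import numpy as np

# Hypothetical applied fields (V/m) and "measured" current densities (A/m^2).
applied_field = np.array([1.0, 2.0, 3.0, 4.0])
measured_current = np.array([0.10, 0.19, 0.31, 0.40])

def mismatch(sigma):
    """Sum of squared differences between model prediction and measurement."""
    return np.sum((sigma * applied_field - measured_current) ** 2)

# For this linear toy model the least-squares conductivity is closed-form;
# a general optimizer would search for it iteratively.
sigma_opt = (applied_field @ measured_current) / (applied_field @ applied_field)
```

    The abstract's three steps map directly onto this sketch: the arrays are the experimental input, `mismatch` wraps the simulation, and minimizing it is the calibration.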

  3. Design and Implementation of a Software Testing Process Management Tool

    Institute of Scientific and Technical Information of China (English)

    李亚伟; 严宏君

    2013-01-01

    Software testing is receiving increasing attention in the domestic software industry, so using standardized, efficient tools to manage the software testing process is becoming ever more important. This paper studies the design and implementation of a software testing process management tool based on a C/S architecture. It first briefly describes the system architecture, the design model, and the preliminary software design, then details the implementation of the main functions of both the server and the client. Finally, it analyses the tool's practical application and prospects in the context of the test process management workflow. Practical use shows that the tool can effectively standardize the management of the software testing process and greatly improve testing efficiency.

  4. Software Testing Techniques and Strategies

    OpenAIRE

    Isha; Sunita Sangwan

    2014-01-01

    Software testing provides a means to reduce errors and cut maintenance and overall software costs. Numerous software development and testing methodologies, tools, and techniques have emerged over the last few decades promising to enhance software quality. This paper describes software testing, the need for software testing, and software testing goals and principles. It further describes different software testing techniques and strategies.

  5. Visualization of 5D Assimilation Data for Meteorological Forecasting and Its Related Disaster Mitigations Utilizing Vis5D of Software Tool

    Directory of Open Access Journals (Sweden)

    Kohei Arai

    2013-09-01

    Full Text Available A method for visualizing 5D assimilation data for meteorological forecasting and its related disaster mitigation, utilizing the Vis5D software tool, is proposed. To mitigate severe weather-related disasters, meteorological forecasting and prediction are needed. Numerical weather forecasting produces several kinds of data, in particular assimilation data: time series of three-dimensional geophysical parameters that have to be presented visually on a computer display in a comprehensible manner. Among the available visualization software tools, Vis5D in particular can display animated three-dimensional imagery. Experiments with NCEP/GDAS assimilation data show that the proposed method is appropriate for representing 5D assimilation data in a comprehensible manner.

  6. Medical SisRadiologia: a new software tool for analysis of radiological accidents and incidents in medical radiology

    Energy Technology Data Exchange (ETDEWEB)

    Lima, Camila M. Araujo; Silva, Francisco C.A. da, E-mail: araujocamila@yahoo.com.br, E-mail: dasilva@ird.gov.br [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil); Araujo, Rilton A.; Pelegrineli, Samuel Q., E-mail: consultoria@maximindustrial.com.br, E-mail: samuelfisica@maximindustrial.com.br [Maxim Industrial, Rio de Janeiro, RJ (Brazil)

    2013-07-01

    Man's exposure to ionizing radiation in health care has increased considerably, due not only to the great demand for medical examinations but also to the improvement of the techniques used in diagnostic imaging, for example equipment for conventional X-rays, CT scans, mammography, haemodynamics and others. Although the benefits of using radiology techniques are unquestionable, the lack of radiation protection training among workers, associated with procedural errors, has been responsible for the increasing number of radiation overexposures of these workers. Sometimes these high doses are real and there is a true radiological accident. Radiation workers, termed Occupationally Exposed Individuals (IOE), must comply with two national regulations: Governmental Decree 453/1998 of the National Agency of Sanitary Surveillance (Portaria 453/1998 ANVISA, Agencia Nacional de Vigilancia Sanitaria), which establishes the basic guidelines for radiation protection in medical and dental radiology, and Governmental Decree NR-32/2002 of the Ministry of Labour and Employment (Ministerio do Trabalho e Emprego), which establishes the basic guidelines for workers' health. Both mandatory regulations require a detailed investigation in the event of radiation overexposure of an IOE. In order to help diagnostic institutions perform an efficient analysis, investigation and report of high doses, the use of a computational tool named 'Medical SisRadiologia' is proposed. This software tool enables the compilation and recording of abnormal radiological data occurring in a diagnostic institution. It also facilitates the detailed analysis of the event and increases the effectiveness of the work performed by the Radiation Protection Service. At the end, a technical report is issued in accordance with the technical regulations, which can also be used as a training tool to avoid similar events in the future. (author)

  7. Medical SisRadiologia: a new software tool for analysis of radiological accidents and incidents in medical radiology

    International Nuclear Information System (INIS)

    Man's exposure to ionizing radiation in health care has increased considerably, due not only to the great demand for medical examinations but also to the improvement of the techniques used in diagnostic imaging, for example equipment for conventional X-rays, CT scans, mammography, haemodynamics and others. Although the benefits of using radiology techniques are unquestionable, the lack of radiation protection training among workers, associated with procedural errors, has been responsible for the increasing number of radiation overexposures of these workers. Sometimes these high doses are real and there is a true radiological accident. Radiation workers, termed Occupationally Exposed Individuals (IOE), must comply with two national regulations: Governmental Decree 453/1998 of the National Agency of Sanitary Surveillance (Portaria 453/1998 ANVISA, Agencia Nacional de Vigilancia Sanitaria), which establishes the basic guidelines for radiation protection in medical and dental radiology, and Governmental Decree NR-32/2002 of the Ministry of Labour and Employment (Ministerio do Trabalho e Emprego), which establishes the basic guidelines for workers' health. Both mandatory regulations require a detailed investigation in the event of radiation overexposure of an IOE. In order to help diagnostic institutions perform an efficient analysis, investigation and report of high doses, the use of a computational tool named 'Medical SisRadiologia' is proposed. This software tool enables the compilation and recording of abnormal radiological data occurring in a diagnostic institution. It also facilitates the detailed analysis of the event and increases the effectiveness of the work performed by the Radiation Protection Service. At the end, a technical report is issued in accordance with the technical regulations, which can also be used as a training tool to avoid similar events in the future. (author)

  8. Assessment of the effect of Nd:YAG laser pulse operating parameters on the metallurgical characteristics of different tool steels using DOE software

    Directory of Open Access Journals (Sweden)

    T. Muhič

    2011-04-01

    Full Text Available To ensure the reliability of repair-welded tool surfaces, clad quality should be improved. The relationships between the metallurgical characteristics of the cladding and the laser welding input parameters were studied using design-of-experiments (DOE) software. The influence of laser power, welding speed, focal point position and welding wire diameter on the weld-bead geometry (i.e. penetration, cladding zone width and heat-affected-zone width), microstructural homogeneity, dilution and bond strength was investigated on the commonly used tool steels 1.2083, 1.2312 and 1.2343.

  9. Methodology and software tools used by IPSN crisis centre experts during an emergency in a French PWR

    International Nuclear Information System (INIS)

    The French nuclear power plants presently in operation are standard pressurized water reactors. Due to the potential consequences of an accident in this type of installation, a national emergency organization was constituted which has the capacity to implement countermeasures to control the risks for the surrounding population. The Institute for Nuclear Safety and Protection (IPSN), the technical support of the French nuclear safety authority, has thus defined and constituted a support system which could, in case of an emergency occurring in a French PWR, help to reach this aim. First of all, IPSN has defined, and applies in its emergency technical centre, a methodology to evaluate the plant status and to estimate the evolution of the accident, in order to be able to calculate the consequences of releases in the medium term. To apply this methodology, a support system consisting of software tools was developed and is used in such cases. Two systems currently support decision-making: SESAME, for the evaluation of the installation status and potential releases, and CONRAD, for radiological consequence calculations. The diagnosis of the status of a PWR during an accident is based on the analysis of plant-specific data. The information transmitted from the plant is organized in such a way that the expert team rapidly assesses the status of the different safety functions and barriers. The specific tools of the SESAME system are used to quantify parameters such as break size or potential fission product release within or outside the plant. The prognosis of the evolution of safety functions and barriers is based on the assessment of the current and future availability of the safety systems and on extrapolations to forecast the evolution of the accident (time to core uncovery and core degradation, fission product release prognosis, etc.). The CONRAD system is mainly oriented towards the prediction of the consequences in the early phase of the accident for the short term.

  10. IMPACT OF THE USE OF THE CMAP-TOOLS SOFTWARE ON THE CONCEPT MAP TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Susy Karina Dávila Panduro

    2012-12-01

    Full Text Available The objective of the research was to apply the Cmap-Tools software to the use of concept maps in Social Sciences courses at the Faculty of Educational Sciences and Humanities of the UNAP in the city of Iquitos, in 2011. The study was experimental, with a pre-experimental static-group comparison design. The population consisted of the 147 students of the Social Sciences specialty of the Faculty of Educational Sciences and Humanities of the UNAP; the sample, selected non-probabilistically and intentionally, comprised 44 students. The data collection technique was the survey, with a questionnaire as the instrument. The data were processed with SPSS version 17 (Spanish edition), producing a data matrix from which the data were organized into tables and graphs. Descriptive statistics (frequencies, simple averages and percentages) and the non-parametric inferential chi-square statistic (X2) were used for analysis and interpretation. To test the main hypothesis, the non-parametric inferential chi-square test was applied with µ = 0.01 and gl = 2, obtaining X2c = 25.83 and X2t = 9.21; since X2c > X2t, the research hypothesis was accepted: "Applying the Cmap-Tools software improves the use of concept maps in Social Sciences courses at the Faculty of Educational Sciences and Humanities of the UNAP in the city of Iquitos in 2011."
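    The decision rule used in the study can be sketched directly: compute the chi-square statistic from observed and expected frequencies and compare it with the critical value for alpha = 0.01 and 2 degrees of freedom (9.21, the X2t the abstract cites). The observed and expected counts below are invented for illustration; the paper reports X2c = 25.83.

```python
# Hypothetical observed and expected frequencies across three categories.
observed = [30, 10, 4]
expected = [20, 14, 10]

# Pearson chi-square statistic: sum of (O - E)^2 / E over categories.
chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Critical value of the chi-square distribution, df = 2, alpha = 0.01.
critical_value = 9.21

# Reject the null hypothesis when the statistic exceeds the critical value.
reject_null = chi2 > critical_value
```
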

  11. Developing a Generic Risk Assessment Simulation Modelling Software Tool for Assessing the Risk of Foot and Mouth Virus Introduction.

    Science.gov (United States)

    Tameru, B; Gebremadhin, B; Habtemariam, T; Nganwa, D; Ayanwale, O; Wilson, S; Robnett, V; Wilson, W

    2008-06-01

    Foot and Mouth Disease (FMD) is a highly contagious viral disease that affects all cloven-hoofed animals. Because of its devastating effects on the agricultural industry, many countries take measures to stop the introduction of the FMD virus. Decision makers at multiple levels of the United States Department of Agriculture (USDA) use risk assessments (RAs), both quantitative and qualitative, to make better and more informed, scientifically based decisions to prevent the accidental or intentional introduction of the disease. There is a need for a generic RA that can be applied to any country (whether FMD-free or not) and to any product (FMD-infected animals and animal products). We developed a user-friendly generic RA tool (software) that can be used to conduct and examine different scenarios of quantitative/qualitative risk assessments for countries with varying FMD statuses in relation to the reintroduction of the FMD virus into the USA. The program was written in Microsoft Visual Basic 6.0 (Microsoft Corporation, Redmond, Washington, USA). The @Risk 6.1 Developer Kit (RDK) and @Risk 6.1 Best Fit Kit library (Palisade Corporation, Newfield, NY, USA) were used to build Monte Carlo simulation models. Microsoft Access 2000 (Microsoft Corporation, Redmond, Washington, USA) and SQL were used to query the data. Different input probability distributions can be selected for the nodes in the scenario tree; the output for each end-state of the simulation is given in several graphical formats, and statistical values are used to describe the likelihood of FMD virus introduction. Sensitivity analysis, which determines which input factors have the greatest effect on the total risk outputs, is also provided. The developed generic RA tool can eventually be extended and modified to conduct RAs for other animal diseases and animal products. PMID:25411550
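    A minimal scenario-tree Monte Carlo in the spirit of the tool can be sketched as follows: the risk of introduction is the product of the probabilities along one pathway of the tree, and the simulation samples each step per trial. The three step probabilities are invented placeholders, not values from the tool.

```python
import random

random.seed(42)  # reproducible runs

# Hypothetical single-pathway scenario tree.
p_infected_shipment = 0.02   # a shipment carries FMD virus
p_evades_inspection = 0.10   # contaminated product passes the border
p_reaches_livestock = 0.05   # product contacts susceptible animals

def one_trial():
    """One Monte Carlo trial: the pathway succeeds only if every step does."""
    return (random.random() < p_infected_shipment
            and random.random() < p_evades_inspection
            and random.random() < p_reaches_livestock)

trials = 200_000
estimated_risk = sum(one_trial() for _ in range(trials)) / trials

# For independent steps the analytic risk is just the product, which the
# simulation estimate should approach.
analytic_risk = p_infected_shipment * p_evades_inspection * p_reaches_livestock
```

    Replacing the fixed probabilities with sampled distributions (as @Risk does) turns the point estimate into a full risk distribution, and perturbing one input at a time gives the sensitivity analysis the abstract describes.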

  12. ImaSim, a software tool for basic education of medical x-ray imaging in radiotherapy and radiology

    Science.gov (United States)

    Landry, Guillaume; deBlois, François; Verhaegen, Frank

    2013-11-01

    Introduction: X-ray imaging is an important part of medicine and plays a crucial role in radiotherapy. Education in this field is mostly limited to textbook teaching due to equipment restrictions. A novel simulation tool, ImaSim, for teaching the fundamentals of the x-ray imaging process based on ray-tracing is presented in this work. ImaSim is used interactively via a graphical user interface (GUI). Materials and methods: The software package covers the main x-ray based medical modalities: planar kilovoltage (kV), planar (portal) megavoltage (MV), fan-beam computed tomography (CT) and cone-beam CT (CBCT) imaging. The user can modify the photon source, the object to be imaged and the imaging setup with three-dimensional editors. Objects are currently obtained by combining blocks with variable shapes. The imaging of three-dimensional voxelized geometries is currently not implemented, but can be added in a later release. The program follows a ray-tracing approach, ignoring photon scatter in its current implementation. Simulations of a phantom CT scan were generated in ImaSim and compared to measured data in terms of CT number accuracy. Spatial variations in the photon fluence and mean energy from an x-ray tube caused by the heel effect were estimated from ImaSim and Monte Carlo simulations and compared. Results: In this paper we describe ImaSim and provide two examples of its capabilities. CT numbers were found to agree within 36 Hounsfield Units (HU) for bone, which corresponds to a 2% attenuation coefficient difference. ImaSim reproduced the heel effect reasonably well when compared to Monte Carlo simulations. Discussion: An x-ray imaging simulation tool is made available for teaching and research purposes. ImaSim provides a means to facilitate the teaching of medical x-ray imaging.
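    Because scatter is ignored, each ray in such a ray-tracing simulator reduces to the Beer-Lambert law: the transmitted intensity is the entrance intensity attenuated exponentially by mu times thickness, summed over the materials the ray crosses. The attenuation coefficients below are rough illustrative values, not ImaSim's tables, and the function is a sketch rather than the program's actual implementation.

```python
import math

# Illustrative linear attenuation coefficients (1/mm) at roughly
# diagnostic beam energies.
mu = {"soft_tissue": 0.02, "bone": 0.06}

def transmitted(i0, path):
    """Beer-Lambert transmission along one ray.

    path: list of (material, thickness_mm) segments the ray crosses.
    """
    total_attenuation = sum(mu[material] * thickness for material, thickness in path)
    return i0 * math.exp(-total_attenuation)

# One detector pixel: a ray crossing 40 mm of soft tissue and 10 mm of bone.
pixel = transmitted(1000.0, [("soft_tissue", 40.0), ("bone", 10.0)])
```

    Repeating this per detector pixel yields a planar image; repeating it over many projection angles and reconstructing yields the simulated CT scans mentioned in the abstract.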

  13. Software Reviews.

    Science.gov (United States)

    Dwyer, Donna; And Others

    1989-01-01

    Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)

  14. Evaluation of three methods for retrospective correction of vignetting on medical microscopy images utilizing two open source software tools.

    Science.gov (United States)

    Babaloukas, Georgios; Tentolouris, Nicholas; Liatis, Stavros; Sklavounou, Alexandra; Perrea, Despoina

    2011-12-01

    Correction of vignetting on images obtained by a digital camera mounted on a microscope is essential before applying image analysis. The aim of this study is to evaluate three methods for retrospective correction of vignetting on medical microscopy images and compare them with a prospective correction method. One digital image from each of four different tissues was used, and a vignetting effect was applied to each of these images. The resulting vignetted image was replicated four times, and in each replica a different method for vignetting correction was applied with the Fiji and GIMP software tools. The highest peak signal-to-noise ratio from the comparison of each method to the original image was obtained from the prospective method in all tissues. The morphological filtering method provided the highest peak signal-to-noise ratio value amongst the retrospective methods. The prospective method is suggested as the method of choice for correction of vignetting; if it is not applicable, then morphological filtering may be suggested as the retrospective alternative. PMID:21950542
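Peak signal-to-noise ratio, the figure of merit used in the study, is straightforward to compute. A pure-Python sketch with a synthetic radial vignette and an idealized division-based correction (the morphological-filtering method itself is not reproduced here; all image values are invented):

```python
import math
import random

def psnr(reference, corrected, max_val=255.0):
    """Peak signal-to-noise ratio between a reference image and a
    corrected replica (flat lists of pixel values); higher is better."""
    mse = sum((r - c) ** 2 for r, c in zip(reference, corrected)) / len(reference)
    return float("inf") if mse == 0 else 10.0 * math.log10(max_val ** 2 / mse)

# Synthetic 64x64 image with a radial vignette applied, then corrected
# by dividing out the known shading (an idealized flat-field correction).
random.seed(0)
n = 64
image, shading = [], []
for y in range(n):
    for x in range(n):
        image.append(random.uniform(100, 200))
        r2 = (x - n / 2) ** 2 + (y - n / 2) ** 2
        shading.append(1.0 - 0.4 * r2 / (2 * (n / 2) ** 2))  # darker edges

vignetted = [p * s for p, s in zip(image, shading)]
corrected = [p / s for p, s in zip(vignetted, shading)]
print(psnr(image, vignetted), psnr(image, corrected))
```

The corrected replica scores a far higher PSNR than the vignetted one, which is exactly the comparison the study performs across its four correction methods.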

  15. Claire, a tool used for the simulation of events in software tests; Claire, un outil de simulation evenementielle pour le test des logiciels

    Energy Technology Data Exchange (ETDEWEB)

    Henry, J.Y.; Boulc'h, J. [CEA Centre d'Etudes de Fontenay-aux-Roses, 92 (France). Dept. d'Evaluation de Surete]; Raguideau, J.; Schoen, D. [CEA Centre d'Etudes de Saclay, 91 - Gif-sur-Yvette (France). Dept. d'Electronique et d'Instrumentation Nucleaire]

    1994-06-01

    CLAIRE provides a purely software-based system which makes it possible to validate on-line applications against either their specifications or their code. This tool offers easy graphic design of the application and of its environment. It carries out the simulation of any loaded model quite efficiently and controls its evolution either dynamically or against prerecorded time. (TEC).

  16. Semi-automatic measurement of left ventricular function on dual source computed tomography using five different software tools in comparison with magnetic resonance imaging

    NARCIS (Netherlands)

    de Jonge, G. J.; van der Vleuten, P. A.; Overbosch, J.; Lubbers, D. D.; Jansen-van der Weide, M. C.; Zijlstra, F.; van Ooijen, P. M. A.; Oudkerk, M.

    2011-01-01

    Purpose: To compare left ventricular (LV) function assessment using five different software tools on the same dual source computed tomography (DSCT) datasets with the results of MRI. Materials and methods: Twenty-six patients undergoing cardiac contrast-enhanced DSCT were included (20 men, mean age

  17. The FRISBEE tool, a software for optimising the trade-off between food quality, energy use, and global warming impact of cold chains

    NARCIS (Netherlands)

    Gwanpua, S.G.; Verboven, P.; Leducq, D.; Brown, T.; Verlinden, B.E.; Bekele, E.; Aregawi, W.; Evans, J.; Foster, A.; Duret, S.; Hoang, H.M.; Sluis, S. van der; Wissink, E.; Hendriksen, L.J.A.M.; Taoukis, P.; Gogou, E.; Stahl, V.; El Jabri, M.; Le Page, J.F.; Claussen, I.; Indergård, E.; Nicolai, B.M.; Alvarez, G.; Geeraerd, A.H.

    2015-01-01

    Food quality (including safety) along the cold chain, energy use and global warming impact of refrigeration systems are three key aspects in assessing cold chain sustainability. In this paper, we present the framework of a dedicated software, the FRISBEE tool, for optimising quality of refrigerated

  18. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.

  19. Development of a web GIS application for emissions inventory spatial allocation based on open source software tools

    Science.gov (United States)

    Gkatzoflias, Dimitrios; Mellios, Giorgos; Samaras, Zissis

    2013-03-01

    Combining emission inventory methods and geographic information systems (GIS) remains a key issue for environmental modelling and management purposes. This paper examines the development of a web GIS application as part of an emission inventory system that produces maps and files with spatially allocated emissions in a grid format. The study is not confined to the maps produced but also presents the features and capabilities of a web application that can be used by any user, even without prior knowledge of the GIS field. The development of the application was based on open source software tools such as MapServer for the GIS functions, PostgreSQL and PostGIS for data management, and HTML, PHP and JavaScript as programming languages. In addition, background processes are used in an innovative manner to handle the time-consuming and computationally costly procedures of the application. Furthermore, a web map service was created to provide maps to other clients, such as the Google Maps API v3 that is used as part of the user interface. The output of the application includes maps in vector and raster format, maps with temporal resolution on a daily and hourly basis, grid files that can be used by air quality management systems, and grid files consistent with the European Monitoring and Evaluation Programme grid. Although the system was developed and validated for the Republic of Cyprus, covering a remarkably wide range of pollutants and emission sources, it can easily be customized for use in other countries or smaller areas, as long as geospatial and activity data are available.
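The core of spatial allocation as described above is simple: a source total is distributed over grid cells in proportion to an activity proxy (road length, population, land use). A hedged Python sketch; the cell identifiers, weights and totals below are invented for illustration:

```python
def allocate(total_emission, cell_weights):
    """Distribute a total emission over grid cells proportionally to
    per-cell activity weights; returns the emission assigned to each cell."""
    total_weight = sum(cell_weights.values())
    return {cell: total_emission * weight / total_weight
            for cell, weight in cell_weights.items()}

# 100 t of a pollutant split over two grid cells by, say, road length
grid = allocate(100.0, {"cell_50N_33E": 3.0, "cell_50N_34E": 1.0})
print(grid)
```

In the real system the weights would come from PostGIS queries over geospatial layers, but the proportional split itself is the same.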

  20. DESASS: A software tool for designing, simulating and optimising WWTPs; DESASS: una herramienta informatica para el diseno, simulacion y optimizacion de EDARs

    Energy Technology Data Exchange (ETDEWEB)

    Ferrer Polo, J.; Seco Torecillas, A.; Serralta Sevilla, J.; Ribes Bertomeu, J.; Manga Certain, J.; Asensi Dasi, E.; Morenilla Martinez, J. J.; Llavador Colomer, F.

    2005-07-01

    This paper presents a very useful tool for designing, simulating and optimising wastewater treatment plants (WWTPs). This software, called DESASS (Design and Simulation of Activated Sludge Systems), has been developed by the Calagua research group with financial support from the Entidad Publica de Saneamiento de Aguas Residuales de la Comunidad Valenciana and the companies Sear and Aquagest. The software allows designing, simulating and optimising the whole performance of a WWTP, and the mathematical model implemented considers most of the physical, chemical and biological processes taking place in WWTPs. (Author) 15 refs.

  1. A Visual Basic simulation software tool for performance analysis of a membrane-based advanced water treatment plant.

    Science.gov (United States)

    Pal, P; Kumar, R; Srivastava, N; Chowdhury, J

    2014-02-01

    A Visual Basic simulation software package (WATTPPA) has been developed to analyse the performance of an advanced wastewater treatment plant. This user-friendly, menu-driven software is based on a dynamic mathematical model for an industrial wastewater treatment scheme that integrates chemical, biological and membrane-based unit operations. The software-predicted results agree very well with the experimental findings, as indicated by an overall correlation coefficient of the order of 0.99. The software permits pre-analysis and manipulation of input data, helps in optimization and exhibits the performance of the integrated plant visually on a graphical platform. It allows quick performance analysis of the whole system as well as of the individual units. The software, the first of its kind in its domain and built in the well-known Microsoft Excel environment, is likely to be very useful in the successful design, optimization and operation of an advanced hybrid treatment plant for hazardous wastewater.

  2. A Decision Tool that Combines Discrete Event Software Process Models with System Dynamics Pieces for Software Development Cost Estimation and Analysis

    Science.gov (United States)

    Mizell, Carolyn Barrett; Malone, Linda

    2007-01-01

    The development process for a large software development project is very complex and dependent on many variables that are dynamic and interrelated. Factors such as size, productivity and defect injection rates will have substantial impact on the project in terms of cost and schedule. These factors can be affected by the intricacies of the process itself as well as human behavior because the process is very labor intensive. The complex nature of the development process can be investigated with software development process models that utilize discrete event simulation to analyze the effects of process changes. The organizational environment and its effects on the workforce can be analyzed with system dynamics that utilizes continuous simulation. Each has unique strengths and the benefits of both types can be exploited by combining a system dynamics model and a discrete event process model. This paper will demonstrate how the two types of models can be combined to investigate the impacts of human resource interactions on productivity and ultimately on cost and schedule.
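The coupling the paper describes can be caricatured in a few lines: discrete events (task completions) advance a simulation clock, while a continuously evolving productivity variable, standing in for the system-dynamics piece, feeds back into event durations. All rates and the feedback rule below are invented for illustration and are not taken from the authors' model:

```python
def simulate(task_sizes, base_productivity=1.0, defect_rate=0.1):
    """Discrete-event piece: sequential task completions advance the clock.
    Continuous piece: productivity erodes under schedule pressure,
    stretching later tasks. Returns the completion time of each task."""
    clock, productivity, completions = 0.0, base_productivity, []
    for size in task_sizes:
        effective = size * (1.0 + defect_rate)   # rework from injected defects
        clock += effective / productivity        # event: task completes
        completions.append(round(clock, 2))
        productivity = max(0.5, productivity * 0.95)  # feedback loop
    return completions

print(simulate([10.0, 10.0, 10.0]))
```

Even this toy version shows the qualitative behaviour of interest: identical tasks take progressively longer as the workforce variable degrades, compounding cost and schedule impacts.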

  3. Development of Software for Analyzing Breakage Cutting Tools Based on Image Processing%基于图像技术分析刀具破损的软件开发

    Institute of Scientific and Technical Information of China (English)

    赵彦玲; 刘献礼; 王鹏; 王波; 王红运

    2004-01-01

    As present-day digital microsystems do not provide specialized microscopes that can detect cutting-tool breakage, analysis software has been developed using VC++. A module for edge detection and image segmentation is designed specifically for cutting tools. Known calibration relations and given postulates are used in scale measurement. Practical operation shows that the software performs accurate detection.

  4. Software reengineering

    Science.gov (United States)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. A beta version of the environment was released in March 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.

  5. TCV software test and validation tools and technique. [Terminal Configured Vehicle program for commercial transport aircraft operation

    Science.gov (United States)

    Straeter, T. A.; Williams, J. R.

    1976-01-01

    The paper describes techniques for testing and validating software for the TCV (Terminal Configured Vehicle) program which is intended to solve problems associated with operating a commercial transport aircraft in the terminal area. The TCV research test bed is a Boeing 737 specially configured with digital computer systems to carry out automatic navigation, guidance, flight controls, and electronic displays research. The techniques developed for time and cost reduction include automatic documentation aids, an automatic software configuration, and an all software generation and validation system.

  6. Design and Development of a Nanoscale Multi Probe System Using Open Source SPM Controller and GXSM Software: A Tool of Nanotechnology

    CERN Document Server

    Babu, S K Suresh; Moni, D Jackuline; Devaprakasam, D

    2014-01-01

    We report our design, development, installation and troubleshooting of an open-source Gnome X Scanning Microscopy (GXSM) software package for controlling and processing in modern scanning probe microscopy (SPM) systems, as a development tool for nanotechnology. GXSM is a full-featured analysis tool for the characterization of nanomaterials supporting different measurement modes such as atomic force microscopy (AFM), scanning tunneling spectroscopy (STS), scanning tunneling microscopy (STM) and nanoindentation. The developed package comprises the digital signal processing and image processing system of the SPM. A digital signal processor (DSP) subsystem runs the feedback loop, generates the scanning signals and acquires the data during SPM measurements. With the SR-Hwl plug-in installed, the package was tested in no-hardware mode.

  7. Visualization of scientific data for high energy physics: PAW, a general-purpose portable software tool for data analysis and presentation

    International Nuclear Information System (INIS)

    Visualization of scientific data, although a fashionable term in the world of computer graphics, is not a new invention: it is hundreds of years old. With the advent of computer graphics, the visualization of scientific data has become a well understood and widely used technology, with hundreds of applications in very different fields, ranging from media applications to strictly scientific ones. In the present paper, we discuss the design concepts of visualization of scientific data systems, in particular in the specific field of high energy physics. During the last twenty years, CERN has played a leading role as the focus for development of packages and software libraries to solve problems related to high energy physics (HEP). The results of the integration of resources from many different laboratories can be expressed in several million lines of code written at CERN during this period, used at CERN and distributed to collaborating laboratories. Nowadays, this role of software developer is considered very important by the entire HEP community. In this paper a large software package in which man-machine interaction and graphics play a key role (PAW, the Physics Analysis Workstation) is described. PAW is essentially an interactive system which includes many different software tools, strongly oriented towards data analysis and data presentation. Some of these tools have been available in different forms and with different human interfaces for several years. 6 figs

  8. Pipe dream? Envisioning a grassroots Python ecosystem of open, common software tools and data access in support of river and coastal biogeochemical research (Invited)

    Science.gov (United States)

    Mayorga, E.

    2013-12-01

    Practical, problem oriented software developed by scientists and graduate students in domains lacking a strong software development tradition is often balkanized into the scripting environments provided by dominant, typically proprietary tools. In environmental fields, these tools include ArcGIS, Matlab, SAS, Excel and others, and are often constrained to specific operating systems. While this situation is the outcome of rational choices, it limits the dissemination of useful tools and their integration into loosely coupled frameworks that can meet wider needs and be developed organically by groups addressing their own needs. Open-source dynamic languages offer the advantages of an accessible programming syntax, a wealth of pre-existing libraries, multi-platform access, linkage to community libraries developed in lower level languages such as C or FORTRAN, and access to web service infrastructure. Python in particular has seen a large and increasing uptake in scientific communities, as evidenced by the continued growth of the annual SciPy conference. Ecosystems with distinctive physical structures and organization, and mechanistic processes that are well characterized, are both factors that have often led to the grass-roots development of useful code meeting the needs of a range of communities. In aquatic applications, examples include river and watershed analysis tools (River Tools, Taudem, etc), and geochemical modules such as CO2SYS, PHREEQ and LOADEST. I will review the state of affairs and explore the potential offered by a Python tool ecosystem in supporting aquatic biogeochemistry and water quality research. This potential is multi-faceted and broadly involves accessibility to lone grad students, access to a wide community of programmers and problem solvers via online resources such as StackExchange, and opportunities to leverage broader cyberinfrastructure efforts and tools, including those from widely different domains. Collaborative development of such

  9. Visualization of 5D Assimilation Data for Meteorological Forecasting and Its Related Disaster Mitigations Utilizing Vis5D of Software Tool

    OpenAIRE

    Kohei Arai

    2013-01-01

    Method for visualization of 5D assimilation data for meteorological forecasting and its related disaster mitigations utilizing Vis5D of software tool is proposed. In order to mitigate severe weather related disaster, meteorological forecasting and prediction is needed. There are some numerical weather forecasting data, in particular, assimilation data. Time series of three dimensional geophysical parameters have to be represented visually onto computer display in a comprehensive manner. On th...

  10. Flammable Gas Refined Safety Analysis Tool Software Verification and Validation Report for Resolve Version 2.5

    Energy Technology Data Exchange (ETDEWEB)

    BRATZEL, D.R.

    2000-09-28

    The purpose of this report is to document all software verification and validation activities, results, and findings related to the development of Resolve Version 2.5 for the analysis of flammable gas accidents in Hanford Site waste tanks.

  11. Flammable Gas Refined Safety Analysis Tool Software Verification and Validation Report for Resolve Version 2.5

    International Nuclear Information System (INIS)

    The purpose of this report is to document all software verification and validation activities, results, and findings related to the development of Resolve Version 2.5 for the analysis of flammable gas accidents in Hanford Site waste tanks

  12. Educational software tool for protection system engineers: distance relay; Herramienta educativa para la formacion de ingenieros en protecciones electricas: relevador de distancia

    Energy Technology Data Exchange (ETDEWEB)

    Trujillo-Guajardo, L.A.; Conde-Enriquez, A. [Universidad Autonoma de Nuevo Leon, Nuevo Leon (Mexico)]. E-mail: luistrujillo84@gmail.com; con_de@yahoo.com

    2012-04-15

    In this article, a graphical software tool aimed at the education of protection system engineers is presented. The theoretical fundamentals used for the design of the operating characteristics of distance relays, as well as the programming routines of a distance relay, are presented. The software allows the evaluation of the relay design stages and the analysis of operation under real or simulated events. Some example cases are presented to illustrate the activities that can be carried out with the graphical software tool developed. [Spanish] En este articulo se presenta una herramienta computacional grafica para apoyar la formacion de ingenieros en protecciones electricas. Los fundamentos teoricos para el diseno de caracteristicas de operacion de relevadores de distancia, asi como las rutinas de programacion de un relevador de distancia son presentados. La herramienta desarrollada permite la evaluacion de las etapas de diseno de relevadores y el analisis de la operacion ante eventos reales o simulados. Se presentan algunos casos de ejemplo para ilustrar las actividades didacticas que son posibles de realizar con la herramienta presentada.

  13. System Software and Tools for High Performance Computing Environments: A report on the findings of the Pasadena Workshop, April 14--16, 1992

    Energy Technology Data Exchange (ETDEWEB)

    Sterling, T. [Universities Space Research Association, Washington, DC (United States); Messina, P. [Jet Propulsion Lab., Pasadena, CA (United States); Chen, M. [Yale Univ., New Haven, CT (United States)] [and others]

    1993-04-01

    The Pasadena Workshop on System Software and Tools for High Performance Computing Environments was held at the Jet Propulsion Laboratory from April 14 through April 16, 1992. The workshop was sponsored by a number of Federal agencies committed to the advancement of high performance computing (HPC) both as a means to advance their respective missions and as a national resource to enhance American productivity and competitiveness. Over a hundred experts in related fields from industry, academia, and government were invited to participate in this effort to assess the current status of software technology in support of HPC systems. The overall objectives of the workshop were to understand the requirements and current limitations of HPC software technology and to contribute to a basis for establishing new directions in research and development for software technology in HPC environments. This report includes reports written by the participants of the workshop's seven working groups. Materials presented at the workshop are reproduced in appendices. Additional chapters summarize the findings and analyze their implications for future directions in HPC software technology development.

  14. SOFAS: Software Analysis Services

    OpenAIRE

    Ghezzi, G

    2010-01-01

    We propose a distributed and collaborative software analysis platform to enable seamless interoperability of software analysis tools across platform, geographical and organizational boundaries. In particular, we devise software analysis tools as services that can be accessed and composed over the Internet. These distributed services shall be widely accessible through a software analysis broker where organizations and research groups can register and share their tools. To enable (semi)-automat...

  15. Digital-flight-control-system software written in automated-engineering-design language: A user's guide of verification and validation tools

    Science.gov (United States)

    Saito, Jim

    1987-01-01

    This user guide for the verification and validation (V&V) tools for the Automated Engineering Design (AED) language was written specifically to update the information found in several documents pertaining to the automated verification of flight software tools. The intent is to provide, in one document, all the information necessary to adequately prepare a run using the AED V&V tools. No attempt is made to discuss the FORTRAN V&V tools, since they were not updated and are not currently active. Additionally, current descriptions of the AED V&V tools are included, augmenting the information in NASA TM 84276. The AED V&V tools are accessed from the digital flight control systems verification laboratory (DFCSVL) via a PDP-11/60 digital computer. The AED V&V tool interface handlers on the PDP-11/60 generate a Univac run stream which is transmitted to the Univac via a Remote Job Entry (RJE) link. Job execution takes place on the Univac 1100, and the job output is transmitted back to the DFCSVL and stored as a PDP-11/60 printfile.

  16. Software testing

    Science.gov (United States)

    Price-Whelan, Adrian M.

    2016-01-01

    Now more than ever, scientific results are dependent on sophisticated software and analysis. Why should we trust code written by others? How do you ensure your own code produces sensible results? How do you make sure it continues to do so as you update, modify, and add functionality? Software testing is an integral part of code validation and writing tests should be a requirement for any software project. I will talk about Python-based tools that make managing and running tests much easier and explore some statistics for projects hosted on GitHub that contain tests.
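The abstract argues that writing tests should be a requirement for any software project and mentions Python-based tools for managing and running them; pytest is the most widely used. A minimal test module (the function under test and the file name are invented for the example), run with `pytest test_running_mean.py`:

```python
# test_running_mean.py -- pytest discovers files and functions named
# test_* automatically; each assert failure is reported individually.

def running_mean(values):
    """Cumulative mean of a sequence -- the code under test."""
    total, out = 0.0, []
    for i, v in enumerate(values, start=1):
        total += v
        out.append(total / i)
    return out

def test_basic():
    assert running_mean([2, 4, 6]) == [2.0, 3.0, 4.0]

def test_empty():
    assert running_mean([]) == []
```

Running such a suite on every change, for example via continuous integration on GitHub, is what keeps code producing sensible results as it is updated and extended, which is exactly the point the talk makes.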

  17. SOTEA, a software tool for ascertaining the efficiency potential of electrical drives - Final report; SOTEA, Softwaretool zur Ermittlung des Effizienzpotenzials bei elektrischen Antrieben - Schlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Brunner, C. U. [S.A.F.E. Schweizerische Agentur fuer Energieeffizienz, Zuerich (Switzerland); Heldstab, T. [hematik, Heldstab Systemoptimierung und Informatik, Zuerich (Switzerland)

    2009-08-15

    As a scientific basis for the Swiss electric motor efficiency implementation programme Topmotors, a software tool for industry managers has been developed and tested. The software allows an energy efficiency engineer, in a first contact with industrial managers and with a few simple data on plant operation, to estimate the energy efficiency potential of electric motor systems, including payback and investment. The data can be fed into a laptop computer on site and the results shown immediately. The software has been programmed and tested with five prime users. The generally positive reactions were evaluated and the tool subsequently improved. 11 industrial objects with a total of 77.6 GWh electricity consumption and 7.9 million CHF electricity cost were studied. The SOTEA estimate is an annual efficiency improvement of the electric motor systems of 6.9 GWh (11% of the electricity for motors) with an average payback time of 1.7 years. The SOTEA software tool has been publicly available since September 2008 at www.topmotors.ch, and since 1 April 2009 in a Beta-2b version. It had been downloaded 218 times by 132 persons as of 28 June 2009. It will be improved with results from new pilot studies. To avoid problems with different update versions, a direct internet solution will be studied. The program will also be made available internationally for English-speaking users for the IEA 4E EMSA project: International Energy Agency, Implementing Agreement for Efficient Electrical End-Use Equipment, Electric Motor Systems Annex, www.motorsystems.org. (authors)
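The payback figure such a tool reports reduces, in its simplest form, to the ratio of investment to annual cost savings. A sketch with invented numbers; SOTEA's actual data model is not described in the report:

```python
def simple_payback_years(investment_chf, annual_savings_kwh, tariff_chf_per_kwh):
    """Simple (undiscounted) payback time of an efficiency measure."""
    return investment_chf / (annual_savings_kwh * tariff_chf_per_kwh)

# e.g. a CHF 50,000 drive retrofit saving 200,000 kWh/a at CHF 0.15/kWh
print(simple_payback_years(50_000, 200_000, 0.15))
```

A real assessment would discount future savings and include maintenance effects; the simple ratio is what makes an on-site, few-inputs estimate feasible.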

  18. Development of a software tool for the management of quality control in a helical tomotherapy unit; Desarrollo de una herramienta de software para la gestion integral del control de calidad en una unidad de tomoterapia helicoidal

    Energy Technology Data Exchange (ETDEWEB)

    Garcia Repiso, S.; Hernandez Rodriguez, J.; Martin Rincon, C.; Ramos Pacho, J. A.; Verde Velasco, J. M.; Delgado Aparacio, J. M.; Perez Alvarez, M. E.; Gomez Gonzalez, N.; Cons Perez, V.; Saez Beltran, M.

    2013-07-01

    The large amount of data and information managed in the quality control tests of an external radiotherapy unit makes it necessary to use tools that facilitate, on the one hand, the handling of measurements and results in real time, and on the other, the tasks of managing, filing, querying and reporting the stored data. This paper presents an in-house software application used for the integral management of a helical TomoTherapy unit in the aspects related to the roles and responsibilities of hospital Radiophysics. (Author)

  19. User Acceptance of a Software Tool for Decision Making in IT Outsourcing: A Qualitative Study in Large Companies from Sweden

    Science.gov (United States)

    Andresen, Christoffer; Hodosi, Georg; Saprykina, Irina; Rusu, Lazar

    Decisions on IT outsourcing are very complex and need to be supported by considerations based on multiple criteria. In order to facilitate the use of a specific tool by decision-makers in IT outsourcing, we need to find out whether such a tool will be accepted or rejected, and what improvements must be made for it to be accepted by IT decision makers in large companies in Sweden.

  20. A Component Approach to Collaborative Scientific Software Development: Tools and Techniques Utilized by the Quantum Chemistry Science Application Partnership

    Directory of Open Access Journals (Sweden)

    Joseph P. Kenny

    2008-01-01

    Cutting-edge scientific computing software is complex, increasingly involving the coupling of multiple packages to combine advanced algorithms or simulations at multiple physical scales. Component-based software engineering (CBSE) has been advanced as a technique for managing this complexity, and complex component applications have been created in the quantum chemistry domain, as well as several other simulation areas, using the component model advocated by the Common Component Architecture (CCA) Forum. While programming models do indeed enable sound software engineering practices, the selection of a programming model is just one building block in a comprehensive approach to large-scale collaborative development, which must also address interface and data standardization, and language and package interoperability. We provide an overview of the development approach utilized within the Quantum Chemistry Science Application Partnership, identifying design challenges, describing the techniques which we have adopted to address these challenges, and highlighting the advantages which the CCA approach offers for collaborative development.

  1. Evaluation of software tools for automated identification of neuroanatomical structures in quantitative β-amyloid PET imaging to diagnose Alzheimer's disease

    Energy Technology Data Exchange (ETDEWEB)

    Tuszynski, Tobias; Luthardt, Julia; Butzke, Daniel; Tiepolt, Solveig; Seese, Anita; Barthel, Henryk [Leipzig University Medical Centre, Department of Nuclear Medicine, Leipzig (Germany); Rullmann, Michael; Hesse, Swen; Sabri, Osama [Leipzig University Medical Centre, Department of Nuclear Medicine, Leipzig (Germany); Leipzig University Medical Centre, Integrated Treatment and Research Centre (IFB) Adiposity Diseases, Leipzig (Germany); Gertz, Hermann-Josef [Leipzig University Medical Centre, Department of Psychiatry, Leipzig (Germany); Lobsien, Donald [Leipzig University Medical Centre, Department of Neuroradiology, Leipzig (Germany)

    2016-06-15

    For regional quantification of nuclear brain imaging data, defining volumes of interest (VOIs) by hand is still the gold standard. As this procedure is time-consuming and operator-dependent, a variety of software tools for automated identification of neuroanatomical structures were developed. As the quality and performance of those tools are poorly investigated so far in analyzing amyloid PET data, we compared in this project four algorithms for automated VOI definition (HERMES Brass, two PMOD approaches, and FreeSurfer) against the conventional method. We systematically analyzed florbetaben brain PET and MRI data of ten patients with probable Alzheimer's dementia (AD) and ten age-matched healthy controls (HCs) collected in a previous clinical study. VOIs were manually defined on the data as well as through the four automated workflows. Standardized uptake value ratios (SUVRs) with the cerebellar cortex as a reference region were obtained for each VOI. SUVR comparisons between ADs and HCs were carried out using Mann-Whitney-U tests, and effect sizes (Cohen's d) were calculated. SUVRs of automatically generated VOIs were correlated with SUVRs of conventionally derived VOIs (Pearson's tests). The composite neocortex SUVRs obtained by manually defined VOIs were significantly higher for ADs vs. HCs (p=0.010, d=1.53). This was also the case for the four tested automated approaches which achieved effect sizes of d=1.38 to d=1.62. SUVRs of automatically generated VOIs correlated significantly with those of the hand-drawn VOIs in a number of brain regions, with regional differences in the degree of these correlations. Best overall correlation was observed in the lateral temporal VOI for all tested software tools (r=0.82 to r=0.95, p<0.001). Automated VOI definition by the software tools tested has a great potential to substitute for the current standard procedure to manually define VOIs in β-amyloid PET data analysis. (orig.)
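The quantities compared above are easy to state in code: an SUVR is the mean uptake in a target VOI divided by the mean uptake in the reference region (the cerebellar cortex in this study), and Cohen's d is a pooled-standard-deviation effect size. A pure-Python sketch with made-up uptake values, not data from the study:

```python
from statistics import mean, stdev

def suvr(target_voxels, cerebellum_voxels):
    """Standardized uptake value ratio against the reference region."""
    return mean(target_voxels) / mean(cerebellum_voxels)

def cohens_d(group_a, group_b):
    """Effect size with pooled standard deviation."""
    na, nb = len(group_a), len(group_b)
    pooled_var = ((na - 1) * stdev(group_a) ** 2 +
                  (nb - 1) * stdev(group_b) ** 2) / (na + nb - 2)
    return (mean(group_a) - mean(group_b)) / pooled_var ** 0.5

ad_suvrs = [1.8, 1.9, 2.1, 1.7, 2.0]    # invented composite neocortex SUVRs
hc_suvrs = [1.2, 1.3, 1.1, 1.4, 1.25]
print(cohens_d(ad_suvrs, hc_suvrs))
```

The group comparison in the paper additionally uses Mann-Whitney U tests for significance; only the descriptive quantities are sketched here.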

  2. TOWARD DEVELOPMENT OF A COMMON SOFTWARE APPLICATION PROGRAMMING INTERFACE (API) FOR UNCERTAINTY, SENSITIVITY, AND PARAMETER ESTIMATION METHODS AND TOOLS

    Science.gov (United States)

    The final session of the workshop considered the subject of software technology and how it might be better constructed to support those who develop, evaluate, and apply multimedia environmental models. Two invited presentations were featured along with an extended open discussio...

  3. Mathematical and software tools for models of investment and innovation development of financial and industrial corporate structures

    International Nuclear Information System (INIS)

    The paper describes the operation of software that solves the problem of calculating the integral total discounted incomes of a financial and industrial corporate structure, as well as the search for the optimal control for each participant, depending on input parameters characterizing the participants' activities.

  4. Software distribution using xnetlib

    Energy Technology Data Exchange (ETDEWEB)

    Dongarra, J.J. [Univ. of Tennessee, Knoxville, TN (US). Dept. of Computer Science]|[Oak Ridge National Lab., TN (US); Rowan, T.H. [Oak Ridge National Lab., TN (US); Wade, R.C. [Univ. of Tennessee, Knoxville, TN (US). Dept. of Computer Science

    1993-06-01

    Xnetlib is a new tool for software distribution. Whereas its predecessor netlib uses e-mail as the user interface to its large collection of public-domain mathematical software, xnetlib uses an X Window interface and socket-based communication. Xnetlib makes it easy to search through a large distributed collection of software and to retrieve requested software in seconds.

  5. The effects of text-based and graphics-based software tools on planning and organizing of stories.

    Science.gov (United States)

    Bahr, C M; Nelson, N W; Van Meter, A M

    1996-07-01

    This article describes a research study comparing the effects of two computer-based writing tools on the story-writing skills of fourth- through eighth-grade students with language-related learning disabilities. The first tool, the prompted writing feature of FrEdWriter (Rogers, 1985), allowed students to answer story grammar questions, then type stories using those responses as the plan; the second tool, Once Upon a Time (Urban, Rushing, & Star, 1990), allowed students to create graphic scenes, then type stories about those scenes. Nine students attended a series of after-school writing labs twice weekly for 11 weeks, using each tool for half of the writing sessions. Group results did not clearly favor either tool; however, individual differences suggested that use of planning features should be linked to student needs. Students who had less internal organizational ability benefited from the computer-presented story grammar prompts and wrote less mature stories when using the graphics-based tool. Students with relatively strong organizational skills wrote more mature stories with the graphics-based tool.

  6. Verifying nuclear fuel assemblies in wet storages on a partial defect level: A software simulation tool for evaluating the capabilities of the Digital Cherenkov Viewing Device

    Energy Technology Data Exchange (ETDEWEB)

    Grape, Sophie, E-mail: sophie.grape@physics.uu.se [Department of Physics and Astronomy, Uppsala University, Box 516, SE-75120 Uppsala (Sweden); Jacobsson Svärd, Staffan [Department of Physics and Astronomy, Uppsala University, Box 516, SE-75120 Uppsala (Sweden); Lindberg, Bo [Lens-Tech AB, Box 733, SE-93127 Skellefteå (Sweden)

    2013-01-11

    The Digital Cherenkov Viewing Device (DCVD) is an instrument that records the Cherenkov light emitted from irradiated nuclear fuels in wet storages. The presence, intensity and pattern of the Cherenkov light can be used by International Atomic Energy Agency (IAEA) inspectors to verify that the fuel properties comply with declarations. The DCVD has for several years been approved by the IAEA for gross defect verification, i.e. to control whether an item in a storage pool is a nuclear fuel assembly or a non-fuel item [1]. Recently, it has also been endorsed as a tool for partial defect verification, i.e. to identify whether a fraction of the fuel rods in an assembly has been removed or replaced. The latter recognition was based on investigations of experimental studies on authentic fuel assemblies and of simulation studies on hypothetical cases of partial defects [2]. This paper describes the simulation methodology and software which was used in the partial defect capability evaluations. The developed simulation procedure uses three stand-alone software packages: the ORIGEN-ARP code [3] used to obtain the gamma-ray spectrum from the fission products in the fuel, the Monte Carlo toolkit Geant4 [4] for simulating the gamma-ray transport in and around the fuel and the emission of Cherenkov light, and the ray-tracing programme Zemax [5] used to model the light transport through the assembly geometry to the DCVD and to mimic the behaviour of its lens system. Furthermore, the software allows for detailed information from the plant operator on power and/or burnup distributions to be taken into account to enhance the authenticity of the simulated images. To demonstrate the results of the combined software packages, simulated and measured DCVD images are presented. A short discussion on the usefulness of the simulation tool is also included.

  7. Verifying nuclear fuel assemblies in wet storages on a partial defect level: A software simulation tool for evaluating the capabilities of the Digital Cherenkov Viewing Device

    Science.gov (United States)

    Grape, Sophie; Jacobsson Svärd, Staffan; Lindberg, Bo

    2013-01-01

    The Digital Cherenkov Viewing Device (DCVD) is an instrument that records the Cherenkov light emitted from irradiated nuclear fuels in wet storages. The presence, intensity and pattern of the Cherenkov light can be used by International Atomic Energy Agency (IAEA) inspectors to verify that the fuel properties comply with declarations. The DCVD has for several years been approved by the IAEA for gross defect verification, i.e. to control whether an item in a storage pool is a nuclear fuel assembly or a non-fuel item [1]. Recently, it has also been endorsed as a tool for partial defect verification, i.e. to identify whether a fraction of the fuel rods in an assembly has been removed or replaced. The latter recognition was based on investigations of experimental studies on authentic fuel assemblies and of simulation studies on hypothetical cases of partial defects [2]. This paper describes the simulation methodology and software which was used in the partial defect capability evaluations. The developed simulation procedure uses three stand-alone software packages: the ORIGEN-ARP code [3] used to obtain the gamma-ray spectrum from the fission products in the fuel, the Monte Carlo toolkit Geant4 [4] for simulating the gamma-ray transport in and around the fuel and the emission of Cherenkov light, and the ray-tracing programme Zemax [5] used to model the light transport through the assembly geometry to the DCVD and to mimic the behaviour of its lens system. Furthermore, the software allows for detailed information from the plant operator on power and/or burnup distributions to be taken into account to enhance the authenticity of the simulated images. To demonstrate the results of the combined software packages, simulated and measured DCVD images are presented. A short discussion on the usefulness of the simulation tool is also included.

  8. MieLab: A Software Tool to Perform Calculations on the Scattering of Electromagnetic Waves by Multilayered Spheres

    Directory of Open Access Journals (Sweden)

    Ovidio Peña-Rodríguez

    2011-01-01

    Full Text Available In this paper, we present MieLab, a free computational package for simulating the scattering of electromagnetic radiation by multilayered spheres or an ensemble of particles with a normal size distribution. It has been designed as a virtual laboratory, including a friendly graphical user interface (GUI), an optimization algorithm (to fit the simulations to experimental results) and scripting capabilities. The paper is structured in five sections: the introduction is a perspective on the importance of the software for the study of light scattering. In the second section, various approaches used for modeling the scattering of electromagnetic radiation by small particles are discussed. The third and fourth sections are devoted to providing an overview of MieLab and to describing the main features of its architectural model and functional behavior, respectively. Finally, several examples are provided to illustrate the main characteristics of the software.
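    For readers who want a feel for the underlying computation, here is a minimal NumPy/SciPy sketch of the Mie efficiencies for a homogeneous sphere with a real refractive index — the simplest case a package like MieLab handles, not its multilayer algorithm — written assuming the Bohren & Huffman conventions:

```python
import numpy as np
from scipy.special import spherical_jn, spherical_yn

def mie_efficiencies(x, m):
    """Extinction and scattering efficiencies of a homogeneous sphere;
    x = size parameter 2*pi*r/lambda, m = relative refractive index
    (real here). Riccati-Bessel functions: psi_n(z) = z*j_n(z),
    xi_n(z) = z*h1_n(z)."""
    nmax = int(x + 4 * x ** (1.0 / 3.0) + 2)   # Wiscombe truncation rule
    qext = qsca = 0.0
    for n in range(1, nmax + 1):
        psi_x = x * spherical_jn(n, x)
        dpsi_x = spherical_jn(n, x) + x * spherical_jn(n, x, derivative=True)
        psi_mx = m * x * spherical_jn(n, m * x)
        dpsi_mx = (spherical_jn(n, m * x)
                   + m * x * spherical_jn(n, m * x, derivative=True))
        h1 = spherical_jn(n, x) + 1j * spherical_yn(n, x)
        dh1 = (spherical_jn(n, x, derivative=True)
               + 1j * spherical_yn(n, x, derivative=True))
        xi_x = x * h1
        dxi_x = h1 + x * dh1
        # Mie coefficients a_n, b_n (Bohren & Huffman, eq. 4.53)
        a_n = ((m * psi_mx * dpsi_x - psi_x * dpsi_mx)
               / (m * psi_mx * dxi_x - xi_x * dpsi_mx))
        b_n = ((psi_mx * dpsi_x - m * psi_x * dpsi_mx)
               / (psi_mx * dxi_x - m * xi_x * dpsi_mx))
        qext += (2 * n + 1) * (a_n + b_n).real
        qsca += (2 * n + 1) * (abs(a_n) ** 2 + abs(b_n) ** 2)
    return 2.0 / x ** 2 * qext, 2.0 / x ** 2 * qsca
```

    For a non-absorbing sphere (real m) the extinction and scattering efficiencies coincide, which makes a convenient sanity check for an implementation like this.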

  9. Powerfarm: A power and emergency management thread-based software tool for the ATLAS Napoli Tier2

    Science.gov (United States)

    Doria, Alessandra; Carlino, Gianpaolo; Iengo, Salvatore; Merola, Leonardo; Ricciardi, Sergio; Staffa, Mariacarla

    2010-04-01

    The computing power and storage systems available in a Grid site or computing center are composed of several servers and devices, each with its own specific role. A management and fault recovery system is of primary importance for management operations and for preserving the integrity of the systems in case of emergencies such as power outages or temperature peaks. We developed Powerfarm, a customizable thread-based software system that monitors several parameters, such as the status of power supplies and room and CPU temperatures, and promptly reacts to out-of-range values with appropriate actions. Powerfarm enforces hardware and software dependencies between devices and is able to switch them on/off in the particular order induced by those dependencies, which also helps the site administrator speed up the management of scheduled downtimes.
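    The dependency-ordered switching described above is essentially a topological sort of the device graph. A minimal sketch using Python's standard-library `graphlib` (3.9+); the device names and the dependency map are hypothetical, not Powerfarm's configuration:

```python
from graphlib import TopologicalSorter

# Hypothetical dependency map: each device lists the devices that must
# already be up before it can be powered on (e.g. a storage array before
# the file server that mounts it).
DEPENDS_ON = {
    "storage_array": [],
    "file_server": ["storage_array"],
    "worker_node_01": ["file_server"],
    "worker_node_02": ["file_server"],
}

def power_on_order(deps):
    """Devices in an order that respects the dependencies."""
    return list(TopologicalSorter(deps).static_order())

def power_off_order(deps):
    """An emergency shutdown proceeds in the reverse order."""
    return list(reversed(power_on_order(deps)))
```

    In a real system each name would map to an action (IPMI call, relay switch, service stop), executed in the order these functions return.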

  10. Improving Software Reliability Forecasting

    NARCIS (Netherlands)

    Burtsy, Bernard; Albeanu, Grigore; Boros, Dragos N.; Popentiu, Florin; Nicola, Victor

    1997-01-01

    This work investigates some methods for software reliability forecasting. A supermodel is presented as a suitable tool for the prediction of reliability in software project development. Time series forecasting of cumulative interfailure times is also proposed and illustrated.
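    As an illustration of reliability forecasting — using the standard Goel-Okumoto NHPP model as a stand-in, since the abstract does not specify the supermodel's form — one can fit observed cumulative failure counts and extrapolate:

```python
import numpy as np
from scipy.optimize import curve_fit

def goel_okumoto(t, a, b):
    """Expected cumulative number of failures by time t under the
    Goel-Okumoto NHPP model: m(t) = a * (1 - exp(-b*t))."""
    return a * (1.0 - np.exp(-b * t))

# Synthetic failure-count data generated from known parameters,
# purely for illustration.
t = np.linspace(0.0, 100.0, 40)
observed = goel_okumoto(t, 120.0, 0.04)

# Fit the model, then forecast the failure count at a future time.
(a_hat, b_hat), _ = curve_fit(goel_okumoto, t, observed, p0=(50.0, 0.1))
forecast = goel_okumoto(150.0, a_hat, b_hat)   # failures expected by t = 150
```

    The fitted parameter `a_hat` estimates the total number of latent defects, which is the kind of quantity a reliability forecast feeds into release decisions.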

  11. Methods and software tools for mitochondrial genome assembly and annotation

    Institute of Scientific and Technical Information of China (English)

    李雪娟; 杨婧; 王俊红; 任倩俐; 李霞; 黄原

    2013-01-01

    With the increasing popularity of mitochondrial genome studies, the correct assembly and annotation of genomes are the basis of all subsequent research into a species. Here we describe protocols using the Staden Package software to assemble and annotate the mitochondrial genome, along with other commonly used software such as ContigExpress, DNAMAN, DNASTAR, BioEdit and Sequencher. In addition, methods for the use of different software packages (including DOGMA, MOSAS, MITOS, GOBASE, OGRe, MitoZoa, tRNAscan-SE, ARWEN, BLAST and MiTFi) to annotate mitochondrial genomic protein-coding genes, rRNA, tRNA and the A+T-rich region are briefly introduced. Finally, the application of MEGA5 software to analyze the composition of mitochondrial genomes, Sequin software to submit sequences to GenBank, and mitochondrial genome data visualization tools (CGView, MTviz and OGDRAW) are also briefly introduced.

  12. CoCoTools: open-source software for building connectomes using the CoCoMac anatomical database.

    Science.gov (United States)

    Blumenfeld, Robert S; Bliss, Daniel P; Perez, Fernando; D'Esposito, Mark

    2014-04-01

    Neuroanatomical tracer studies in the nonhuman primate macaque monkey are a valuable resource for cognitive neuroscience research. These data ground theories of cognitive function in anatomy, and with the emergence of graph theoretical analyses in neuroscience, there is high demand for these data to be consolidated into large-scale connection matrices ("macroconnectomes"). Because manual review of the anatomical literature is time consuming and error prone, computational solutions are needed to accomplish this task. Here we describe the "CoCoTools" open-source Python library, which automates collection and integration of macaque connectivity data for visualization and graph theory analysis. CoCoTools both interfaces with the CoCoMac database, which houses a vast amount of annotated tracer results from 100 years (1905-2005) of neuroanatomical research, and implements coordinate-free registration algorithms, which allow studies that use different parcellations of the brain to be translated into a single graph. We show that using CoCoTools to translate all of the data stored in CoCoMac produces graphs with properties consistent with what is known about global brain organization. Moreover, in addition to describing CoCoTools' processing pipeline, we provide worked examples, tutorials, links to on-line documentation, and detailed appendices to aid scientists interested in using CoCoTools to gather and analyze CoCoMac data. PMID:24116839
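    To make the "macroconnectome as a graph" idea concrete, here is a toy pure-Python sketch of assembling a connection matrix from a directed edge list. The region names and edges are illustrative, not CoCoMac data, and CoCoTools itself does considerably more (notably coordinate-free registration across parcellations):

```python
# A toy macroconnectome as a directed edge list (source region -> target),
# loosely in the spirit of what CoCoTools assembles from CoCoMac.
edges = [("V1", "V2"), ("V2", "V4"), ("V4", "TEO"), ("TEO", "TE"), ("V2", "V1")]

def connection_matrix(edge_list):
    """Square 0/1 connection matrix plus the node ordering used."""
    nodes = sorted({n for e in edge_list for n in e})
    index = {n: i for i, n in enumerate(nodes)}
    matrix = [[0] * len(nodes) for _ in nodes]
    for src, dst in edge_list:
        matrix[index[src]][index[dst]] = 1
    return matrix, nodes

def density(matrix):
    """Fraction of possible directed connections that are present,
    one of the global graph properties mentioned above."""
    n = len(matrix)
    return sum(map(sum, matrix)) / (n * (n - 1))
```

    Once the data are in matrix form, standard graph-theory metrics (degree distributions, path lengths, clustering) follow directly.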

  13. Proposal for the Award of a Contract, without competitive Tendering, for the Provision of the ORACLE Database Management System Software together with Tools for Application Development and System Exploitation

    CERN Document Server

    1995-01-01

    Proposal for the Award of a Contract, without competitive Tendering, for the Provision of the ORACLE Database Management System Software together with Tools for Application Development and System Exploitation

  14. Tools for developing software for different types of microprocessors, to be used, in particular, for the acquisition and processing of nuclear data

    International Nuclear Information System (INIS)

    It is difficult to imagine building a microprocessor-based system without the use of an adapted development system; as such systems are prohibitively expensive, it is difficult for a laboratory to acquire them. A computer such as the Multi 20 is provided with program-generating tools and supervisors that put the computer's resources at the microprocessor's disposal. An electronic crate provides the interface functions with the computer, the emulation of the microprocessor, and the loading of the EPROM and the live-memory bank; because the different units are integrated into the crate, the crate can also be used as a portable repair and maintenance outfit for the installed equipment. The first part of the text presents the principles of the development tools, showing how they are used to realize microprocessor equipment. In the second part, the software and the choice of hardware are optimized in order to define a low-cost development system.

  15. A COMPETITIVE MODEL BASED IN FREE SOFTWARE TOOLS FOR THE TECHNOLOGICAL MANAGEMENT OF ORGANIZATIONS - THE PROMOTION OF THE CORPORATIVE KNOWLEDGE AND THE TECHNOLOGICAL INNOVATION IN A TECHNOLOGICAL UNDERGRADUATE COURSE

    OpenAIRE

    Rubens Araújo de Oliveira; Mário Lucio Roloff

    2007-01-01

    This article addresses the theme of technological management, with research focused on choosing the best free software tools for promoting knowledge management. The article supports the hypothesis that it is possible to adopt knowledge management through the combination and customization of free software tools. In this way, any organization can engage in technological management and apply knowledge management policies, either to a micro-c...

  16. Intelligent Software Agents as tools for managing ethical issues in organisations caused by the introduction of new Information Technology

    DEFF Research Database (Denmark)

    Abolfazlian, Ali Reza Kian

    1996-01-01

    This article describes how employees' values in and for organisations develop together with the technological tools with which they do their jobs. Against this background, it describes some of the ethical problems that arise as a consequence of the introduction of new information technology in organisations, and how Intelligent Software Agents (ISAs) can actively help managers overcome these problems.

  17. DrCell – A Software Tool for the Analysis of Cell Signals Recorded with Extracellular Microelectrodes

    Directory of Open Access Journals (Sweden)

    Christoph Nick

    2013-09-01

    Full Text Available Microelectrode arrays (MEAs) have been applied for in vivo and in vitro recording and stimulation of electrogenic cells, namely neurons and cardiac myocytes, for almost four decades. Extracellular recordings using the MEA technique inflict minimal adverse effects on cells and enable long-term applications such as implants in brain or heart tissue. Hence, MEAs are a powerful tool for studying the processes of learning and memory, investigating the pharmacological impact of drugs, and probing the fundamentals of the electrical interface between novel electrode materials and biological tissue. Yet in order to study the areas mentioned above, powerful signal processing and data analysis tools are necessary. In this paper a novel toolbox for the offline analysis of cell signals is presented that allows a variety of parameters to be detected and analyzed. We developed an intuitive graphical user interface (GUI) that enables users to perform high-quality data analysis. The presented MATLAB® based toolbox gives the opportunity to examine a multitude of parameters, such as spike and neural burst timestamps, network bursts, heart beat frequency and signal propagation for cardiomyocytes, signal-to-noise ratio and many more. Additionally, a spike-sorting tool is included, offering a powerful tool for cases of multiple cell recordings on a single microelectrode. For stimulation purposes, artifacts caused by the stimulation signal can be removed from the recording, allowing the detection of field potentials as early as 5 ms after the stimulation.
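    As a flavour of the kind of offline analysis such a toolbox performs, below is a minimal amplitude-threshold spike detector with a median-absolute-deviation noise estimate — a common textbook approach, not DrCell's actual algorithm:

```python
import numpy as np

def detect_spikes(signal, fs, threshold_sd=5.0, dead_time_ms=1.0):
    """Timestamps (in seconds) of threshold crossings in an
    extracellular trace. The threshold is a multiple of the noise SD
    estimated from the median absolute deviation (robust to the spikes
    themselves); the dead time suppresses multiple detections of one
    spike."""
    noise_sd = np.median(np.abs(signal)) / 0.6745
    threshold = threshold_sd * noise_sd
    dead = int(dead_time_ms * 1e-3 * fs)
    timestamps, last = [], -dead
    for i, v in enumerate(signal):
        if abs(v) > threshold and i - last >= dead:
            timestamps.append(i / fs)
            last = i
    return timestamps
```

    Burst detection and spike sorting then operate on timestamp lists like the one this function returns.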

  18. Sleep scoring made easy-Semi-automated sleep analysis software and manual rescoring tools for basic sleep research in mice.

    Science.gov (United States)

    Kreuzer, M; Polta, S; Gapp, J; Schuler, C; Kochs, E F; Fenzl, T

    2015-01-01

    Studying sleep behavior in animal models demands a clear separation of vigilance states. Pure manual scoring is time-consuming and commercial scoring software is costly. We present a LabVIEW-based, semi-automated scoring routine using recorded EEG and EMG signals. This scoring routine is:
    • designed to reliably assign the vigilance/sleep states wakefulness (WAKE), non-rapid eye movement sleep (NREMS) and rapid eye movement sleep (REMS) to defined EEG/EMG episodes;
    • straightforward to use even for beginners in the field of sleep research;
    • freely available upon request.
    Chronic recordings from mice were used to design and evaluate the scoring routine, which consists of an artifact-removal routine, a scoring routine and a rescoring routine. The scoring routine processes the EMG and different EEG frequency bands. Amplitude-based thresholds for EEG and EMG parameters trigger a decision tree that automatically assigns each EEG episode to a defined vigilance/sleep state. Using the rescoring routine, individual episodes or particular state transitions can be re-evaluated manually. High agreement between auto-scored and manual sleep scoring was shown, quickly and reliably, for experienced scorers and for beginners alike. With small modifications to the software, it can be easily adapted for sleep analysis in other animal models.
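    A decision tree of this kind might be sketched as follows; the thresholds, feature names and exact branching here are hypothetical, chosen only to illustrate the idea, and would be tuned per animal in practice:

```python
def score_epoch(emg_amplitude, delta_power, theta_power,
                emg_threshold=1.0, delta_threshold=1.0,
                theta_delta_ratio=1.5):
    """Assign one EEG/EMG epoch to WAKE, NREMS or REMS with a simple
    amplitude-threshold decision tree, in the spirit of (but not
    identical to) the routine described above."""
    if emg_amplitude > emg_threshold:
        return "WAKE"                  # high muscle tone -> awake
    if theta_power / delta_power > theta_delta_ratio:
        return "REMS"                  # atonia + theta-dominated EEG
    if delta_power > delta_threshold:
        return "NREMS"                 # atonia + high delta power
    return "WAKE"                      # low-amplitude quiet wake
```

    Applying this to every epoch yields a hypnogram that a rescoring step can then correct at individual episodes or state transitions.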

  19. A software tool to estimate the dynamic behaviour of the IP²C samples as sensors for didactic purposes

    Energy Technology Data Exchange (ETDEWEB)

    Graziani, S.; Pagano, F.; Pitrone, N.; Umana, E., E-mail: nicola.pitrone@diees.unict.i [Dipartimento di Ingegneria Elettrica Elettronica e dei Sistemi -University of Catania V.le A. Doria 6, 95125, Catania (Italy)

    2010-07-01

    Ionic Polymer Polymer Composites (IP²Cs) are emerging materials used to realize motion actuators and sensors. In the former case a voltage input causes the membrane to bend, while in the latter case bending an IP²C membrane produces a voltage output. In this paper the authors introduce a software tool able to estimate the dynamic behaviour of sensors based on IP²Cs working in air. In the proposed tool, the geometrical quantities that rule the sensing properties of IP²C-based transducers are taken into account together with their dynamic characteristics. A graphical user interface (GUI) has been developed in order to give the user a useful tool for understanding the behaviour and the role of the parameters involved in the transduction phenomena. The tool is based on the idea that a graphical user interface will allow persons not skilled in IP²C materials to observe their behaviour and to analyze their characteristics. This could greatly increase the interest of researchers in this new class of transducers; moreover, it can support the educational activity of students in advanced academic courses.

  20. Software Model Of Software-Development Process

    Science.gov (United States)

    Lin, Chi Y.; Synott, Debra J.; Levary, Reuven R.

    1990-01-01

    Collection of computer programs constitutes software tool for simulation of medium- to large-scale software-development projects. Model includes such easily identifiable and readily quantifiable characteristics as costs, times, and numbers of errors. Mathematical model incorporating these and other factors of dynamics of software-development process implemented in the Software Life Cycle Simulator (SLICS) computer program. Simulates dynamics of software-development process. In combination with input and output expert software systems and knowledge-based management software system, develops information for use in managing large software-development project. Intended to aid managers in planning, managing, and controlling software-development processes by reducing uncertainties in budgets, required personnel, and schedules.
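    A toy version of such a dynamics model — with made-up parameters and far simpler than SLICS — might look like:

```python
def simulate_project(weeks, staff, productivity=150.0,
                     defects_per_kloc=20.0, fix_rate=25.0):
    """Week-by-week toy model of a software project: code is produced,
    defects are injected in proportion to new code and removed at a
    fixed fixing capacity. All parameters are hypothetical, chosen only
    to illustrate the kind of dynamics a simulator like SLICS tracks."""
    loc, open_defects, history = 0.0, 0.0, []
    for week in range(weeks):
        new_loc = staff * productivity                    # lines written
        loc += new_loc
        open_defects += defects_per_kloc * new_loc / 1000.0
        open_defects = max(0.0, open_defects - fix_rate)  # fixes applied
        history.append((week + 1, loc, open_defects))
    return history
```

    Running the loop under different staffing or fix-rate assumptions is the simulation analogue of the uncertainty reduction in budgets and schedules mentioned above.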

  1. Virtual chromoendoscopy can be a useful software tool in capsule endoscopy

    Directory of Open Access Journals (Sweden)

    Gabriela Duque

    2012-05-01

    Full Text Available Background: capsule endoscopy (CE) has revolutionized the study of the small bowel. One major drawback of this technique is that we cannot interfere with the image acquisition process. Therefore, the development of new software tools that could modify the images and increase both detection and diagnosis of small-bowel lesions would be very useful. Flexible Spectral Imaging Color Enhancement (FICE), which allows for virtual chromoendoscopy, is one of these software tools. Aims: to evaluate the reproducibility and diagnostic accuracy of the FICE system in CE. Methods: this prospective study involved 20 patients. First, four physicians interpreted 150 static FICE images and the overall agreement between them was determined using the Fleiss kappa test. Second, two experienced gastroenterologists, blinded to each other's results, analyzed the complete 20 video streams. One interpreted conventional capsule videos and the other the CE-FICE videos at setting 2. All findings were reported, regardless of their clinical value. Non-concordant findings between the two interpretations were analyzed by a consensus panel of four gastroenterologists, who reached a final result (positive or negative finding). Results: in the first arm of the study the overall concordance between the four gastroenterologists was substantial (0.650). In the second arm, the conventional mode identified 75 findings and the CE-FICE mode 95. The CE-FICE mode did not miss any lesions identified by the conventional mode and allowed the identification of a higher number of angiodysplasias (35 vs. 32) and erosions (41 vs. 24). Conclusions: interpretation of CE-FICE images is reproducible between different observers experienced in conventional CE. The use of virtual chromoendoscopy in CE seems to increase its diagnostic accuracy by highlighting small-bowel erosions and angiodysplasias that were not identified by the conventional mode.
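    Fleiss' kappa, used in the first arm above, can be computed as follows — a generic implementation of the published formula, not the study's statistics code:

```python
from collections import Counter

def fleiss_kappa(ratings, categories):
    """Fleiss' kappa for agreement among a fixed number of raters.
    `ratings` is a list of per-subject lists of category labels, one
    label per rater (same rater count for every subject)."""
    n_subjects = len(ratings)
    n_raters = len(ratings[0])
    counts = [Counter(r) for r in ratings]   # n_ij per subject
    # Per-subject agreement P_i and overall category proportions p_j
    p_i = [(sum(c ** 2 for c in cnt.values()) - n_raters)
           / (n_raters * (n_raters - 1)) for cnt in counts]
    p_j = [sum(cnt[cat] for cnt in counts) / (n_subjects * n_raters)
           for cat in categories]
    p_bar = sum(p_i) / n_subjects            # mean observed agreement
    p_e = sum(p ** 2 for p in p_j)           # chance agreement
    return (p_bar - p_e) / (1 - p_e)
```

    With four raters and 150 images, a value like the 0.650 reported above falls in the conventional "substantial agreement" band.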

  2. Sleep scoring made easy—Semi-automated sleep analysis software and manual rescoring tools for basic sleep research in mice

    Directory of Open Access Journals (Sweden)

    M. Kreuzer

    2015-01-01

    Chronic recordings from mice were used to design and evaluate the scoring routine, which consists of an artifact-removal routine, a scoring routine and a rescoring routine. The scoring routine processes the EMG and different EEG frequency bands. Amplitude-based thresholds for EEG and EMG parameters trigger a decision tree that automatically assigns each EEG episode to a defined vigilance/sleep state. Using the rescoring routine, individual episodes or particular state transitions can be re-evaluated manually. High agreement between auto-scored and manual sleep scoring was shown, quickly and reliably, for experienced scorers and for beginners alike. With small modifications to the software, it can be easily adapted for sleep analysis in other animal models.

  3. Development of a software tool for evaluating driving assistance systems; Entwicklung eines Softwaretools zur Bewertung von Fahrerassistenzsystemen

    Energy Technology Data Exchange (ETDEWEB)

    Marstaller, R.; Bubb, H. [Technische Universitaet Muenchen (Germany). Lehrstuhl fuer Ergonomie

    2002-07-01

    The increase in road safety in Germany is indicated, for example, by the declining number of seriously injured and killed people (/6/) despite the growing number of cars and the total number of kilometres driven. The measures taken rest on four points: improvement of active security, improvement of passive security, and direct and indirect psychological measures. In developing systems that assist the driver at the guidance level, the question of the safety of these measures arises more and more. This led to the development of software that contains a so-called normative driver model and compares actual driving data against it. Situations that deviate from the situational normative model can thereby be identified and consequently classified as critical. Practical application to driving data recorded with active assistance systems controlling the longitudinal and lateral direction showed a significant improvement in driving safety compared with data recorded without system usage. (orig.)

  4. MIAWARE Software

    DEFF Research Database (Denmark)

    Wilkowski, Bartlomiej; Pereira, Oscar N. M.; Dias, Paulo;

    2008-01-01

    This article presents MIAWARE, a software for Medical Image Analysis With Automated Reporting Engine, which was designed and developed for doctor/radiologist assistance. It allows to analyze an image stack from a computed axial tomography scan of the lungs (thorax) and, at the same time, to mark all pathologies on images and report their characteristics. The reporting process is normalized - radiologists cannot describe pathological changes with their own words, but can only use some terms from a specific vocabulary set provided by the software. Consequently, a normalized radiological report ... a deductive report search was obtained, which may be helpful for doctors while diagnosing patients' cases. Finally, the MIAWARE software can be considered also as a teaching tool for future radiologists and physicians.

  5. Software design of maintenance system for acoustic logging-while-drilling tool

    Institute of Scientific and Technical Information of China (English)

    陈鹏; 师奕兵; 张伟

    2012-01-01

    To address the difficulty of maintaining logging-while-drilling tools, a software application for a maintenance system based on the embedded Linux operating system was designed around Samsung's S3C2440 chip. The application gives the maintenance system two-way communication with a host PC over USB or the network; at the same time, the system can transfer software programs to the downhole tool and read data back from it through a serial port. The work focuses on the development of the application on embedded Linux with Qt, the design of the USB slave-device firmware, and the network server application. Data-transmission tests show that the system achieves two-way transmission of large amounts of data with high speed, a stable rate and high reliability.

  6. A software tool for teaching and training how to build and use matrixes in strategic information systems planning

    Directory of Open Access Journals (Sweden)

    Javier Andrés Arias Sanabria

    2010-10-01

    Full Text Available Strategic information systems planning (SISP) allows an organisation to determine a portfolio of computer-based applications to help it achieve its business objectives. IBM's business system planning for strategic alignment (BSP/SA) is an important technique for developing a strategic plan for an entire company's information resource. BSP/SA has been described in terms of stages and the specific tasks within them; tasks are usually done manually and require some experience. This work was thus aimed at presenting a computer-based application that automates two of the most important tasks in the BSP/SA methodology: the process-organisation matrix (POM) and the processes-data classes matrix (PDM). Special emphasis was placed on the analysis, design and implementation phases of the systems development life-cycle for developing the software. An important part of the analysis consisted of conducting a literature review and semi-structured interviews with some experts in SISP. A special contribution of the present work is the design and implementation of statistical reports associated with each matrix. Automating these tasks has enabled students to analyse the POM and PDM during SISP workshops forming part of the Information Systems Management course (Systems Engineering, Universidad Nacional de Colombia). Results arising from the workshops have also improved.

  7. 'nparACT' package for R: A free software tool for the non-parametric analysis of actigraphy data.

    Science.gov (United States)

    Blume, Christine; Santhi, Nayantara; Schabus, Manuel

    2016-01-01

    For many studies, participants' sleep-wake patterns are monitored and recorded prior to, during and following an experimental or clinical intervention using actigraphy, i.e. the recording of data generated by movements. Often, these data are merely inspected visually without computation of descriptive parameters, in part due to the lack of user-friendly software. To address this deficit, we developed a package for R [6] that allows computing several non-parametric measures from actigraphy data. Specifically, it computes the interdaily stability (IS), intradaily variability (IV) and relative amplitude (RA) of activity and gives the start times and average activity values of M10 (i.e. the ten hours with maximal activity) and L5 (i.e. the five hours with least activity). Two functions compute these 'classical' parameters and handle either single or multiple files. Two other functions additionally allow computing an L-value (i.e. the least activity value) for a user-defined time span, termed the 'Lflex' value. A plotting option is included in all functions. The package can be downloaded from the Comprehensive R Archive Network (CRAN).
    • The package 'nparACT' for R serves the non-parametric analysis of actigraphy data.
    • Computed parameters include interdaily stability (IS), intradaily variability (IV) and relative amplitude (RA), as well as start times and average activity during the 10 h with maximal and the 5 h with minimal activity (i.e. M10 and L5).
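    The 'classical' parameters have simple closed forms; a NumPy re-implementation of IS and IV — illustrative only, since nparACT itself is an R package — might look like:

```python
import numpy as np

def nonparametric_measures(activity, samples_per_hour=1):
    """Interdaily stability (IS) and intradaily variability (IV) of an
    activity series, using the standard non-parametric formulas. The
    series must cover whole 24-h days at a fixed sampling rate."""
    x = np.asarray(activity, float)
    n = x.size
    mean = x.mean()
    total_var = np.sum((x - mean) ** 2)
    # IS: variance of the average 24-h profile over the total variance
    per_day = x.reshape(-1, 24 * samples_per_hour)
    profile = per_day.mean(axis=0)
    is_val = n * np.sum((profile - mean) ** 2) / (profile.size * total_var)
    # IV: mean squared successive difference over the variance
    iv_val = n * np.sum(np.diff(x) ** 2) / ((n - 1) * total_var)
    return is_val, iv_val
```

    A perfectly 24-h-periodic record gives IS = 1 and a smoothly varying one gives a small IV, which matches the interpretation of these measures as rhythm stability and fragmentation.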

  8. Development of TUF-ELOCA - a software tool for integrated single-channel thermal-hydraulic and fuel element analyses

    International Nuclear Information System (INIS)

    The TUF-ELOCA tool couples the TUF and ELOCA codes to enable an integrated thermal-hydraulic and fuel element analysis for a single channel during transient conditions. The coupled architecture is based on TUF as the parent process controlling multiple ELOCA executions that simulate the fuel elements' behaviour, and is scalable to different fuel channel designs. The coupling ensures proper feedback between the coolant conditions and the fuel elements' response, eliminates model duplication, and improves prediction accuracy. The communication interfaces are based on PVM and allow parallelization of the fuel element simulations. Developmental testing results are presented showing realistic predictions of fuel channel behaviour during a transient. (author)
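
    The parent/child coupling pattern described above can be sketched in a few lines. This is an illustrative toy, not the actual TUF-ELOCA code: Python's multiprocessing stands in for PVM, and the physics is replaced by simple relaxation formulas. All names and coefficients are invented.

    ```python
    # Toy sketch: a parent thermal-hydraulics loop driving several
    # fuel-element worker processes, exchanging boundary conditions
    # each time step (multiprocessing stands in for PVM).
    from multiprocessing import Pipe, Process

    def fuel_element(conn, fuel_temp):
        # Child: relax fuel temperature toward the received coolant
        # temperature and report the resulting heat flux to the parent.
        while True:
            msg = conn.recv()
            if msg is None:                 # shutdown signal
                break
            flux = 0.05 * (fuel_temp - msg)  # toy heat transfer
            fuel_temp -= flux
            conn.send(flux)
        conn.close()

    def run_channel(n_elements=3, n_steps=50, coolant_temp=300.0):
        pipes, procs = [], []
        for _ in range(n_elements):
            parent_end, child_end = Pipe()
            p = Process(target=fuel_element, args=(child_end, 1000.0))
            p.start()
            pipes.append(parent_end)
            procs.append(p)
        for _ in range(n_steps):
            for conn in pipes:               # broadcast coolant condition
                conn.send(coolant_temp)
            total_flux = sum(conn.recv() for conn in pipes)
            coolant_temp += 0.01 * total_flux  # toy coolant feedback
        for conn, p in zip(pipes, procs):
            conn.send(None)
            p.join()
        return coolant_temp

    if __name__ == "__main__":
        print(run_channel(n_elements=2, n_steps=10))
    ```

    The key property mirrored here is the two-way feedback per step: the parent's coolant state depends on the children's heat flux, and vice versa.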

  9. IDENTIFYING THE ROTATIONAL AXES OF DISTAL FEMUR IN SOUTH ANDHRA POPULATION OF INDIA USING IMAGE TOOL SOFTWARE

    Directory of Open Access Journals (Sweden)

    Sharmila Bhanu

    2015-08-01

    Full Text Available AIM: To study the normal relationship of the anteroposterior (APA), transepicondylar (TEA) and posterior condylar (PCA) axes of normal cadaveric femoral bones using digital technology and a special computer program. MATERIAL AND METHOD: The study comprised 196 dry adult femora, 98 from the right and 98 from the left side, irrespective of sex and age, belonging to the Andhra Pradesh population of India. The bone collections were obtained from the Anatomy department, Narayana Medical College, Nellore, India. The femurs were kept in normal anatomical position on OB. Photographs of the distal end of each femur were taken with a digital camera, with the lens kept at a constant distance of 10 cm. Using the reference points, the angles between APA-TEA, APA-PCA and TEA-PCA were identified. The statistical significance of the difference between the right and left groups was evaluated using Student's paired t-test. Data are presented as mean±SD; p-values less than 0.05 were considered statistically significant. RESULTS: On the right side, the AP-TE, AP-PC and TE-PC angles were 94.84±3.43°, 87.64±1.62° and 6.84±2.71°, respectively. On the left side, they were 92.36±4.06°, 93.61±2.54° and 3.19±0.99°, respectively. CONCLUSION: Normal femoral rotational alignment data from this cadaveric bone study using computer-aided software can help the surgeon select an appropriate reference axis in knee surgery. In this regard, the present data can be taken into consideration for femoral rotational alignment during intraoperative knee surgery and in total knee arthroplasty.
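
    The underlying measurement is the angle between two lines, each defined by a pair of digitised landmark points on the distal-femur photograph. A minimal Python sketch of that computation follows; the landmark coordinates are fabricated for illustration and are not from the study.

    ```python
    # Angle between two axes, each given as a line through two landmark
    # points digitised on a 2D photograph.
    import math

    def axis_angle(p1, p2, q1, q2):
        """Angle in degrees between line p1-p2 and line q1-q2."""
        v = (p2[0] - p1[0], p2[1] - p1[1])
        w = (q2[0] - q1[0], q2[1] - q1[1])
        dot = v[0] * w[0] + v[1] * w[1]
        cos_t = dot / (math.hypot(*v) * math.hypot(*w))
        # Clamp to [-1, 1] to guard against floating-point drift
        return math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))

    # Fabricated landmarks: transepicondylar vs. posterior condylar axis
    tea = ((0.0, 0.0), (80.0, 5.0))
    pca = ((5.0, -20.0), (82.0, -18.0))
    print(round(axis_angle(*tea, *pca), 2))
    ```

    The same function applied to each femur's digitised points would yield the AP-TE, AP-PC and TE-PC angles reported above.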

  10. A method for creating teaching movie clips using screen recording software: usefulness of teaching movies as self-learning tools for medical students

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Seong Su [The Catholic University of Korea, Suwon (Korea, Republic of)

    2007-04-15

    I describe a method for creating teaching movies with screen recording software, and I assessed whether such self-learning movies are useful for medical students. Teaching movies were created by directly recording the screen activity and voice narration during the interpretation of educational cases on a PACS system, using screen recording software (CamStudio, Rendersoft, U.S.A.). The usefulness of the teaching movies for self-learning of abdominal CT anatomy was evaluated by medical students. Creating teaching movie clips with screen recording software was simple and easy. Survey responses were collected from 43 medical students. The content of the teaching movies was rated adequately understandable by 52% and useful for learning by 47%; only 23% of students agreed that the movies helped motivate them to learn. Teaching movies were more useful than still photographs of the teaching image files. The students wanted teaching movies on the cross-sectional CT anatomy of other body regions (82%) and on the radiological interpretation of various diseases (42%). Creating a teaching movie by directly recording a radiologist's interpretation process on screen is easy and simple. The resulting clips capture the radiologist's interpretation process, or the explanation of teaching cases, with his or her own voice narration, and they are an effective self-learning tool for medical students and residents.

  11. Challenges in Individualizing Drug Dosage for Intensive Care Unit Patients: Is Augmented Renal Clearance What We Really Want to Know? Some Suggested Management Approaches and Clinical Software Tools.

    Science.gov (United States)

    Jelliffe, Roger

    2016-08-01

    Acutely ill intensive care unit (ICU) patients often have large apparent volumes of distribution of drugs and, because of this, their drug clearance (CL) is usually also increased. 'Augmented renal clearance' is a current issue in the management of drug therapy for acutely ill and unstable ICU patients; however, CL, the product of the volume of distribution and the rate constant for excretion, describes only a theoretical volume of drug cleared per unit of time. Information about the actual rate of movement of the drug itself is obscured. It is suggested that the most useful clinical information is given by describing the drug's volume of distribution and elimination rate constant separately. This also permits a better understanding of the patient's separate issues of fluid balance and drug elimination, especially when dialysis, renal replacement therapy, or extracorporeal membrane oxygenation (ECMO) may be used, and facilitates management of these two important but separate clinical issues. Optimal management of drug therapy also requires optimal methods, embodied in clinical software, to describe drug behavior in these highly unstable patients, and considerably more data than for ordinary patients. The interacting multiple model (IMM) clinical software facilitates management of both fluid balance and drug therapy in these unstable patients. Illustrative cases are discussed, and new monitoring and management strategies are suggested. As with other ICU skills, physicians need to learn optimal tools for managing drug therapy in the ICU. Further work should help evaluate these new approaches. PMID:26914772
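
    The point about CL obscuring the separate roles of volume and rate constant can be shown numerically. In the standard one-compartment IV bolus model, C(t) = (dose/V)·exp(−k_el·t) and CL = V·k_el. The sketch below compares two hypothetical patients with identical CL but different V and k_el; all doses and parameter values are invented for illustration.

    ```python
    # Same clearance, different volume and rate constant: the concentration
    # profiles differ markedly, which is the clinical point made above.
    import math

    def concentration(dose_mg, V_L, k_el_per_h, t_h):
        """One-compartment IV bolus: C(t) = (dose/V) * exp(-k_el * t)."""
        return dose_mg / V_L * math.exp(-k_el_per_h * t_h)

    # Patient A: normal volume; Patient B: ICU patient with doubled volume.
    # Both have CL = V * k_el = 7 L/h.
    A = dict(V_L=35.0, k_el_per_h=0.20)
    B = dict(V_L=70.0, k_el_per_h=0.10)
    for name, p in (("A", A), ("B", B)):
        cl = p["V_L"] * p["k_el_per_h"]
        half_life = math.log(2) / p["k_el_per_h"]
        c0 = concentration(1000, p["V_L"], p["k_el_per_h"], 0)
        print(f"{name}: CL={cl:.1f} L/h, t1/2={half_life:.1f} h, C0={c0:.1f} mg/L")
    ```

    Patient B's peak concentration is half of A's and the half-life is doubled, even though a CL-only description would call the two patients identical.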

  12. TopCAT and PySESA: Open-source software tools for point cloud decimation, roughness analyses, and quantitative description of terrestrial surfaces

    Science.gov (United States)

    Hensleigh, J.; Buscombe, D.; Wheaton, J. M.; Brasington, J.; Welcker, C. W.; Anderson, K.

    2015-12-01

    The increasing use of high-resolution topography (HRT) constructed from point clouds obtained from technologies such as LiDAR, SoNAR, SAR, SfM and a variety of range-imaging techniques has created a demand for custom analytical tools and software for point cloud decimation (data thinning and gridding) and spatially explicit statistical analysis of terrestrial surfaces. We present a number of analytical and computational tools designed to quantify surface roughness and texture directly from point clouds in a variety of ways (using spatial- and frequency-domain statistics). TopCAT (Topographic Point Cloud Analysis Toolkit; Brasington et al., 2012) and PySESA (Python program for Spatially Explicit Spectral Analysis) both work by applying a small moving window to the (x,y,z) data to calculate a suite of spatial- and spectral-domain statistics, which are then spatially referenced on a regular (x,y) grid at a user-defined resolution. Collectively, these tools facilitate the quantitative description of surfaces and may allow, for example, fully automated texture characterization and segmentation, roughness and grain size calculation, and feature detection and classification on very large point clouds with great computational efficiency. Using such tools, it may be possible to detect geomorphic change in surfaces that have undergone minimal elevation difference, for example deflation surfaces that have coarsened but undergone no net elevation change, or surfaces that have eroded and accreted, leaving behind a different textural surface expression than before. The functionalities of the two toolboxes are illustrated with example high-resolution bathymetric point cloud data collected with a multibeam echosounder and topographic data collected with LiDAR.
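
    The window-and-grid idea can be sketched in a few lines. This is not the TopCAT or PySESA code, only an illustration under simplifying assumptions: points are binned into square cells, and the per-cell roughness is the standard deviation of elevation about the cell mean (a planar detrend would replace the mean on sloping surfaces).

    ```python
    # Sketch: grid (x, y, z) points and compute a simple per-cell
    # roughness statistic (std of z), as a stand-in for the richer
    # spatial- and spectral-domain statistics the tools compute.
    import numpy as np

    def grid_roughness(points, cell=1.0):
        pts = np.asarray(points, dtype=float)
        ix = np.floor(pts[:, 0] / cell).astype(int)
        iy = np.floor(pts[:, 1] / cell).astype(int)
        out = {}
        for key in set(zip(ix, iy)):
            z = pts[(ix == key[0]) & (iy == key[1]), 2]
            out[key] = z.std() if z.size > 1 else 0.0
        return out

    # Synthetic demo: a flat surface vs. one with 0.1 m of vertical noise
    rng = np.random.default_rng(0)
    xy = np.c_[rng.uniform(0, 2, 500), rng.uniform(0, 1, 500)]
    flat = np.c_[xy, np.zeros(500)]
    rough = np.c_[xy, rng.normal(0, 0.1, 500)]
    print(max(grid_roughness(flat).values()),
          max(grid_roughness(rough).values()))
    ```

    Note how the statistic distinguishes the two surfaces even though both have zero mean elevation, which is exactly the "coarsened but no net elevation change" case discussed above.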

  13. FlowCal: A User-Friendly, Open Source Software Tool for Automatically Converting Flow Cytometry Data from Arbitrary to Calibrated Units.

    Science.gov (United States)

    Castillo-Hair, Sebastian M; Sexton, John T; Landry, Brian P; Olson, Evan J; Igoshin, Oleg A; Tabor, Jeffrey J

    2016-07-15

    Flow cytometry is widely used to measure gene expression and other molecular biological processes with single cell resolution via fluorescent probes. Flow cytometers output data in arbitrary units (a.u.) that vary with the probe, instrument, and settings. Arbitrary units can be converted to the calibrated unit molecules of equivalent fluorophore (MEF) using commercially available calibration particles. However, there is no convenient, nonproprietary tool available to perform this calibration. Consequently, most researchers report data in a.u., limiting interpretation. Here, we report a software tool named FlowCal to overcome current limitations. FlowCal can be run using an intuitive Microsoft Excel interface, or customizable Python scripts. The software accepts Flow Cytometry Standard (FCS) files as inputs and is compatible with different calibration particles, fluorescent probes, and cell types. Additionally, FlowCal automatically gates data, calculates common statistics, and produces publication quality plots. We validate FlowCal by calibrating a.u. measurements of E. coli expressing superfolder GFP (sfGFP) collected at 10 different detector sensitivity (gain) settings to a single MEF value. Additionally, we reduce day-to-day variability in replicate E. coli sfGFP expression measurements due to instrument drift by 33%, and calibrate S. cerevisiae Venus expression data to MEF units. Finally, we demonstrate a simple method for using FlowCal to calibrate fluorescence units across different cytometers. FlowCal should ease the quantitative analysis of flow cytometry data within and across laboratories and facilitate the adoption of standard fluorescence units in synthetic biology and beyond.
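
    The core of the bead-based calibration FlowCal automates is a standard curve relating arbitrary units to MEF. A hedged sketch of that step follows; the bead MEF values and measured medians are invented, and the log-log linear fit is one common choice, not necessarily FlowCal's exact model.

    ```python
    # Sketch of a.u. -> MEF calibration: fit log10(MEF) vs. log10(a.u.)
    # over the calibration bead subpopulations, then apply to cells.
    import numpy as np

    # Known MEF values of the bead subpopulations (hypothetical datasheet)
    bead_mef = np.array([100.0, 1_000.0, 10_000.0, 100_000.0])
    # Median fluorescence of the same beads on this instrument, in a.u.
    bead_au = np.array([20.0, 190.0, 2_050.0, 19_800.0])

    # Standard curve: log10(MEF) = m * log10(a.u.) + b
    m, b = np.polyfit(np.log10(bead_au), np.log10(bead_mef), 1)

    def au_to_mef(au):
        return 10 ** (m * np.log10(au) + b)

    print(au_to_mef(np.array([50.0, 500.0])))
    ```

    Because the curve is fitted per instrument and per acquisition, data taken at different gain settings or on different cytometers map onto the same MEF scale, which is what makes cross-laboratory comparison possible.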

  15. ATLAS software packaging

    CERN Document Server

    Rybkin, G

    2012-01-01

    Software packaging is an indispensable part of the build process and a prerequisite for deployment. The full ATLAS software stack consists of TDAQ, HLT, and Offline software, and these software groups depend on some 80 external software packages. We present PackDist, a package of tools developed and used to package all of this software except the TDAQ project. PackDist is based on and driven by CMT, the ATLAS software configuration and build tool, and consists of shell and Python scripts. The packaging unit used is the CMT project. Each CMT project is packaged as several packages - platform-dependent (one per available platform), source code excluding header files, other platform-independent files, documentation, and debug information packages (the last two being built optionally). Packaging can be done recursively to package all the dependencies. The whole set of packages for one software release, the distribution kit, also includes configuration packages and contains some 120 packages per platform. Also packaged are physics analysis pro...
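
    Recursive packaging over project dependencies, as described above, amounts to a depth-first traversal that packages each project once, dependencies first. A toy Python sketch follows; the project names and dependency graph are invented and do not reflect the actual ATLAS project layout.

    ```python
    # Toy sketch: package a project and, recursively, everything it
    # depends on, exactly once each (dependencies first).
    deps = {
        "HLT": ["Offline"],
        "Offline": ["External-ROOT", "External-Geant4"],
        "External-ROOT": [],
        "External-Geant4": [],
    }

    def package(project, done=None):
        done = set() if done is None else done
        if project in done:
            return done            # already packaged in this run
        for dep in deps.get(project, []):
            package(dep, done)     # dependencies first
        done.add(project)
        # Here the real tool would emit the platform, source, doc,
        # and debug-info packages for the project.
        print(f"packaging {project}")
        return done

    package("HLT")
    ```

    The `done` set is what keeps shared dependencies (here the two external packages) from being packaged more than once.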

  16. DEVELOPMENT OF A SOFTWARE DESIGN TOOL FOR HYBRID SOLAR-GEOTHERMAL HEAT PUMP SYSTEMS IN HEATING- AND COOLING-DOMINATED BUILDINGS

    Energy Technology Data Exchange (ETDEWEB)

    Yavuzturk, C. C. [Univ. of Hartford, West Hartford, CT (United States); Chiasson, A. D. [Univ. of Hartford, West Hartford, CT (United States); Filburn, T. P. [Univ. of Hartford, West Hartford, CT (United States)

    2012-11-29

    This project provides an easy-to-use, menu-driven software tool for designing hybrid solar-geothermal heat pump (GHP) systems for both heating- and cooling-dominated buildings. No such design tool currently exists. In heating-dominated buildings, the design approach takes advantage of glazed solar collectors to effectively balance the annual thermal loads on the ground with renewable solar energy. In cooling-dominated climates, it takes advantage of relatively low-cost, unglazed solar collectors as the heat-rejecting component. The primary benefit of hybrid GHPs is the reduced initial cost of the ground heat exchanger (GHX). Furthermore, solar thermal collectors can be used to balance the ground loads over the annual cycle, making the GHX fully sustainable. In heating-dominated buildings, the hybrid energy source (i.e., solar) is renewable, in contrast to a typical fossil fuel boiler or electric resistance heating as the hybrid component; in cooling-dominated buildings, the use of unglazed solar collectors as a heat rejecter allows for passive heat rejection, in contrast to a cooling tower, which consumes a significant amount of energy to operate. Hybrid GHPs can also expand the market by allowing a reduced GHX footprint in both heating- and cooling-dominated climates. The design tool allows for the straightforward design of innovative GHP systems that currently pose a significant design challenge. The project lays the foundations for the proper and reliable design of hybrid GHP systems, overcoming a series of difficult and cumbersome steps without the use of a system simulation approach or an automated optimization scheme. As new technologies and design concepts emerge, sophisticated design tools and methodologies must accompany them and be made usable for practitioners; a lack of reliable design tools results in reluctance to implement more complex systems. A menu-driven software tool for the design of hybrid solar GHP systems is
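
    The load-balancing idea can be made concrete with a back-of-envelope calculation: size the collector array so that the annual heat extracted from the ground roughly equals the heat rejected to it. The sketch below is only an illustration of that reasoning; the COP, EER, and solar-yield numbers are invented placeholders, not design values from the project.

    ```python
    # Back-of-envelope sizing of a solar array to balance annual ground loads.
    def collector_area_for_balance(annual_heating_kwh, annual_cooling_kwh,
                                   cop_heating=4.0, eer_cooling=5.0,
                                   solar_yield_kwh_per_m2=400.0):
        # Heat extracted from ground = building heating minus compressor input
        extracted = annual_heating_kwh * (1 - 1 / cop_heating)
        # Heat rejected to ground = building cooling plus compressor input
        rejected = annual_cooling_kwh * (1 + 1 / eer_cooling)
        imbalance = extracted - rejected   # > 0 means heating-dominated
        if imbalance <= 0:
            return 0.0                     # no solar recharge needed
        return imbalance / solar_yield_kwh_per_m2

    # Heating-dominated example building (invented loads, kWh/year)
    print(collector_area_for_balance(60_000, 15_000))
    ```

    A real design tool would of course resolve this on an hourly or monthly basis and account for ground temperature response, but the annual balance above is the quantity the hybrid component is sized to close.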

  17. The Solid Earth Research and Teaching Environment, a new software framework to share research tools in the classroom and across disciplines

    Science.gov (United States)

    Milner, K.; Becker, T. W.; Boschi, L.; Sain, J.; Schorlemmer, D.; Waterhouse, H.

    2009-12-01

    The Solid Earth Teaching and Research Environment (SEATREE) is a modular and user-friendly software framework to facilitate the use of solid Earth research tools in the classroom and in interdisciplinary research collaboration. SEATREE is open source and community developed, distributed freely under the GNU General Public License. It is a fully contained package that lets users operate in a graphical mode, while giving more advanced users the opportunity to view and modify the source code. The top-level graphical user interfaces, which initiate the calculations and visualize results, are written in the Python programming language using an object-oriented, modern design. Results are plotted with either Matlab-like Python libraries or SEATREE's own Generic Mapping Tools wrapper. The underlying computational codes used to produce the results can be written in any programming language and are accessed through Python wrappers. There are currently four fully developed science modules for SEATREE: (1) HC, a global geodynamics tool built around a semi-analytical mantle-circulation program based on work by B. Steinberger, Becker, and C. O'Neill. HC can compute velocities and tractions for global, spherical Stokes flow and radial viscosity variations, and is fast enough for classroom instruction, for example to let students interactively explore the role of radial viscosity variations in global geopotential (geoid) anomalies. (2) ConMan wraps Scott King's 2D finite element mantle convection code, allowing users to quickly observe how modifications to input parameters affect heat flow over time. As seismology modules, SEATREE includes (3) Larry, a global surface-wave phase-velocity inversion tool, and (4) Syn2D, a Cartesian tomography teaching tool for ray-theory wave propagation in synthetic, arbitrary velocity structure in the presence of noise. Both underlying programs were contributed by Boschi. Using Syn2D, students can explore, for example, how well a given

  18. SisRadiologia: a new software tool for analysis of radiological accidents and incidents in industrial radiography

    Energy Technology Data Exchange (ETDEWEB)

    Lima, Camila M. Araujo; Silva, Francisco C.A. da, E-mail: araujocamila@yahoo.com.br, E-mail: dasilva@ird.gov.br [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil); Araujo, Rilton A., E-mail: consultoria@maximindustrial.com.br [Maxim Industrial Assessoria TI, Rio de Janeiro, RJ (Brazil)

    2013-07-01

    According to the International Atomic Energy Agency (IAEA), Member States have made many efforts towards better control of radioactive sources. Accidents have mostly happened in practices regarded as high radiological risk and classified by the IAEA in categories 1 and 2, notably radiotherapy, large irradiators and industrial radiography. Worldwide, more than 40 radiological accidents have been recorded in the industrial radiography area, involving 37 workers, 110 members of the public and 12 fatalities. Records show 5 severe radiological accidents in industrial radiography activities in Brazil, involving 7 workers and 19 members of the public. These events led to radiodermatitis of the hands and fingers, but no deaths. The purpose of this study is to present a computational program that allows data acquisition and recording within the company, in order to ease further detailed analysis of a radiological event and to provide the cornerstones of learning aimed at avoiding future occurrences. After one year of application of the 'Industrial SisRadiologia' computational program - and based largely on the workshop on Analysis and Dose Calculation of Radiological Accidents in Industrial Radiography (Workshop sobre Analise e Calculo de Dose de Acidentes Radiologicos em Radiografia Industrial - IRD 2012), in which several radiation protection officers took part - it can be concluded that the program is a powerful tool for data acquisition and for recording and surveying accident and incident events in industrial radiography. The program proved efficient for preparing reports to the Brazilian regulatory authority, and very useful in training workers to fix the lessons learned from radiological events.

  19. Presentation of IngeniumTM, a software tool for managing and sharing information and knowledge, and some applications in the nuclear domain with the CEA

    International Nuclear Information System (INIS)

    Full text: Principles. New technologies allow many kinds of information to be communicated, exchanged and shared. Search engines fit users' profiles ever more closely, providing relevant documents. But that is not enough to create a real collective dynamic in which everyone can express a point of view, confront it with others', and enrich it while keeping track of how the argument evolves. Furthermore, we face an over-abundance of information, the difficulty of exploiting it, and the growing importance of the immaterial capital made up of the knowledge and know-how of a firm's staff. This is a very concrete, daily problem in many domains, wherever information is needed in order to act. The quality and relevance of the solutions found contribute to the success of the firm or group concerned: how not to lose information, not to redo what has already been done, not to waste time finding what already exists; how to share, to think with others, and to keep track of that thinking and the decisions that follow from it? Facing the several dimensions of knowledge management - procedural (organisational), cognitive (power lies in the capability to exploit information) and instrumental (software tools, linguistic search engines and networks) - we offer multiple answers: methodological coaching and the deployment of tools best fitted to individual and collective requirements. Like AI, knowledge management focuses on the management of thought; but unlike AI, instead of trying to formalise the resolution of a problem by automating a line of reasoning, we now seek to provide operators with the information they need to resolve the problem themselves, individually or collectively. We thus bet, resolutely, on the user's intelligence, relying on their own cognitive capacity to make the best use of the information provided. Ingenium software. It is on the above ideas that the Ingenium software was built, trying to answer the requirements outlined above while ensuring ease of use, sharing, subjectivity and relevance.
It is inside Jean Michel Penalva's laboratory (CEA) that

  20. Bioinformatics resource manager v2.3: an integrated software environment for systems biology with microRNA and cross-species analysis tools

    Directory of Open Access Journals (Sweden)

    Tilton Susan C

    2012-11-01

    Full Text Available Abstract Background MicroRNAs (miRNAs) are noncoding RNAs that direct post-transcriptional regulation of protein-coding genes. Recent studies have shown miRNAs are important for controlling many biological processes, including nervous system development, and are highly conserved across species. Given their importance, computational tools are necessary for the analysis, interpretation and integration of high-throughput (HTP) miRNA data in an increasing number of model species. The Bioinformatics Resource Manager (BRM) v2.3 is a software environment for data management, mining, integration and functional annotation of HTP biological data. In this study, we report recent updates to BRM for miRNA data analysis and cross-species comparisons across datasets. Results BRM v2.3 has the capability to query predicted miRNA targets from multiple databases, retrieve potential regulatory miRNAs for known genes, integrate experimentally derived miRNA and mRNA datasets, perform ortholog mapping across species, and retrieve annotation and cross-reference identifiers for an expanded number of species. Here we use BRM to show that developmental exposure of zebrafish to 30 uM nicotine from 6-48 hours post fertilization (hpf) results in behavioral hyperactivity in larval zebrafish and alteration of putative miRNA gene targets in whole embryos at developmental stages that encompass early neurogenesis. We show typical workflows for using BRM to integrate experimental zebrafish miRNA and mRNA microarray datasets, with example retrievals for zebrafish, including pathway annotation and mapping to human orthologs. Functional analysis of differentially regulated (p Conclusions BRM provides the ability to mine complex data for the identification of candidate miRNAs or pathways that drive phenotypic outcome and, therefore, is a useful hypothesis generation tool for systems biology. The miRNA workflow in BRM allows for efficient processing of multiple miRNA and mRNA datasets in a single
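
    The integration step described above (predicted miRNA targets crossed with a differentially expressed mRNA set) reduces, at its core, to a set intersection over gene identifiers. A toy Python sketch follows; the miRNA names, target lists, and gene names are all invented, and real target databases would of course supply the sets.

    ```python
    # Toy sketch: nominate candidate miRNA-mRNA regulatory pairs by
    # intersecting predicted targets with differentially expressed genes.
    mirna_targets = {               # predicted targets per miRNA (invented)
        "miR-9":   {"geneA", "geneB", "geneC"},
        "miR-132": {"geneB", "geneD"},
    }
    diff_expressed_mrna = {"geneB", "geneD", "geneE"}

    candidate_pairs = sorted(
        (mir, gene)
        for mir, targets in mirna_targets.items()
        for gene in targets & diff_expressed_mrna
    )
    print(candidate_pairs)
    ```

    Each surviving pair is a hypothesis (this miRNA may regulate that transcript) to be followed up with pathway annotation or experimental validation, which is the "hypothesis generation" role the abstract ascribes to BRM.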

  1. Software tools and e-infrastructure services to support the long term preservation of earth science data - new functionality from the SCIDIP-ES project

    Science.gov (United States)

    Riddick, Andrew; Glaves, Helen; Crompton, Shirley; Giaretta, David; Ritchie, Brian; Pepler, Sam; De Smet, Wim; Marelli, Fulvio; Mantovani, Pier-Luca

    2014-05-01

    The ability to preserve earth science data for the long-term is a key requirement to support on-going research and collaboration within and between earth science disciplines. A number of critically important current research initiatives (e.g. understanding climate change or ensuring sustainability of natural resources) typically rely on the continuous availability of data collected over several decades in a form which can be easily accessed and used by scientists. In many earth science disciplines the capture of key observational data may be difficult or even impossible to repeat. For example, a specific geological exposure or subsurface borehole may be only temporarily available, and earth observation data derived from a particular satellite mission is often unique. Another key driver for long-term data preservation is that the grand challenges of the kind described above frequently involve cross-disciplinary research utilising raw and interpreted data from a number of related earth science disciplines. Adopting effective data preservation strategies supports this requirement for interoperability as well as ensuring long term usability of earth science data, and has the added potential for stimulating innovative earth science research. The EU-funded SCIDIP-ES project seeks to address these challenges by developing a Europe-wide e-infrastructure for long-term data preservation by providing appropriate software tools and infrastructure services to enable and promote long-term preservation of earth science data. This poster will describe the current status of this e-infrastructure and outline the integration of the prototype SCIDIP-ES software components into the existing systems used by earth science archives and data providers. These prototypes utilise a system architecture which stores preservation information in a standardised OAIS-compliant way, and connects and adds value to existing earth science archives. A SCIDIP-ES test-bed has been implemented by the

  2. Computer Software.

    Science.gov (United States)

    Kay, Alan

    1984-01-01

    Discusses the nature and development of computer software. Programing, programing languages, types of software (including dynamic spreadsheets), and software of the future are among the topics considered. (JN)

  3. Overview of Environmental Transport Models Contained in the Risk Analysis, Communication, Evaluation, and Reduction (RACER) Software Tools at Los Alamos National Laboratory

    International Nuclear Information System (INIS)

    The objective of the Risk Analysis, Communication, Evaluation, and Reduction (RACER) project is to provide more relevant and timely access to information related to chemicals and radionuclides in the environment around Los Alamos National Laboratory (LANL), and to develop tools to support an effective and logical evaluation and reduction of human health risks and ecological impacts associated with exposures to these materials. The guiding principle of RACER is an open and transparent process that considers community input as an integral part of making decisions about how to most effectively reduce risks related to LANL operations. Tools and resources include a database of geo-referenced environmental data, mapping software to display spatial data, environmental transport models, a risk assessment module, and various options to assist with interpreting the results. Human health risk assessment is performed for a user-defined exposure scenario using current environmental measurements, environmental transfer functions to estimate contaminant concentrations in environmental media that do not have measurements, and environmental transport models to estimate contaminant concentrations in the future. Environmental transport and transfer models address transport in air, vadose zone, groundwater, and the food chain. Recognizing that environmental transport models are generally developed on a site-specific basis, the RACER software tools incorporate methodology to distill complex site-specific model behavior into simple functional forms that are stored within the RACER database tables and are executed either by external dynamic-linked libraries or within Visual Basic code. Groundwater model computer run times can be excessively long and construction and operation of the model require specialized expertise. 
Instead of incorporating a complex groundwater model directly into the tool, a response surface model was developed that abstracts the behavior of an external groundwater
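
    The response-surface approach described above can be sketched in miniature: run the expensive model offline at sampled inputs, fit a cheap surrogate, and ship only the surrogate inside the tool. The "groundwater model" below is a stand-in function and the polynomial-in-log-space form is one plausible choice, not RACER's actual functional form.

    ```python
    # Sketch: replace a slow model with a fitted response surface.
    import numpy as np

    def expensive_groundwater_model(source_kg):
        # Stand-in for a long-running simulation: concentration at a well
        return 0.3 * source_kg ** 0.8

    # Offline step: sample the model, fit a quadratic in log space
    s = np.linspace(1, 100, 20)
    coeffs = np.polyfit(np.log(s), np.log(expensive_groundwater_model(s)), 2)

    def response_surface(source_kg):
        # Cheap surrogate evaluated at run time inside the tool
        return np.exp(np.polyval(coeffs, np.log(source_kg)))

    print(response_surface(50.0), expensive_groundwater_model(50.0))
    ```

    The trade-off is the usual one: the surrogate evaluates in microseconds and needs no modelling expertise to run, at the cost of being valid only over the sampled input range.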

  4. Spreadsheet Auditing Software

    CERN Document Server

    Nixon, David

    2010-01-01

    It is now widely accepted that errors in spreadsheets are both common and potentially dangerous. Further research has taken place to investigate how frequently these errors occur, what impact they have, how the risk of spreadsheet errors can be reduced by following spreadsheet design guidelines and methodologies, and how effective auditing of a spreadsheet is in the detection of these errors. However, little research exists to establish the usefulness of software tools in the auditing of spreadsheets. This paper documents and tests office software tools designed to assist in the audit of spreadsheets. The test was designed to identify the success of software tools in detecting different types of errors, to identify how the software tools assist the auditor and to determine the usefulness of the tools.

  5. SOFTWARE TOOL DEVELOPMENT FOR MULTIMEDIA

    Science.gov (United States)

    A long term goal of multimedia environmental management is to achieve sustainable ecological resources. Progress towards this goal rests on a foundation of science-based methods and data integrated into predictive multimedia, multi-stressor open architecture modeling systems. The...

  6. Orbit Software Suite

    Science.gov (United States)

    Osgood, Cathy; Williams, Kevin; Gentry, Philip; Brownfield, Dana; Hallstrom, John; Stuit, Tim

    2012-01-01

    Orbit Software Suite is used to support a variety of NASA/DM (Dependable Multiprocessor) mission planning and analysis activities on the IPS (Intrusion Prevention System) platform. The suite of Orbit software tools (Orbit Design and Orbit Dynamics) resides on IPS/Linux workstations and is used to perform mission design and analysis tasks corresponding to the trajectory/launch window, rendezvous, and proximity operations flight segments. The list of tools in Orbit Software Suite reflects the tool versions established during and after the Equipment Rehost-3 Project.

  7. Easily Learn the Clone Stamp Tool in Photoshop Software: Removing Under-Eye Bags as an Example

    Institute of Scientific and Technical Information of China (English)

    刘鑫

    2014-01-01

    Taking the removal of under-eye bags from a photograph as an example, this article introduces the use of the Clone Stamp tool in the Photoshop toolbox.

  8. The cost-effectiveness of monitoring strategies for antiretroviral therapy of HIV infected patients in resource-limited settings: software tool.

    Directory of Open Access Journals (Sweden)

    Janne Estill

    Full Text Available The cost-effectiveness of routine viral load (VL) monitoring of HIV-infected patients on antiretroviral therapy (ART) depends on various factors that differ between settings and across time. Low-cost point-of-care (POC) tests for VL are in development and may make routine VL monitoring affordable in resource-limited settings. We developed a software tool to study the cost-effectiveness of switching to second-line ART with different monitoring strategies, and focused on POC-VL monitoring. We used a mathematical model to simulate cohorts of patients from start of ART until death. We modeled 13 strategies (no second-line ART; clinical monitoring; CD4 monitoring, with or without targeted VL; POC-VL; and laboratory-based VL monitoring, with different frequencies). We included a scenario with identical failure rates across strategies, and one in which routine VL monitoring reduces the risk of failure. We compared lifetime costs and averted disability-adjusted life-years (DALYs), and calculated incremental cost-effectiveness ratios (ICERs). We developed an Excel tool to update the results of the model for varying unit costs and cohort characteristics, and conducted several sensitivity analyses varying the input costs. Introducing second-line ART had an ICER of US$1651-1766/DALY averted. Compared with clinical monitoring, the ICER of CD4 monitoring was US$1896-5488/DALY averted and of VL monitoring US$951-5813/DALY averted. We found no difference between POC- and laboratory-based VL monitoring, except at the highest measurement frequency (every 6 months), where laboratory-based testing was more effective. Targeted VL monitoring was on the cost-effectiveness frontier only if the difference between first- and second-line costs remained large, and if we assumed that routine VL monitoring does not prevent failure. Compared with the less expensive strategies, the cost-effectiveness of routine VL monitoring essentially depends on the cost of second-line ART. 
    Our Excel tool is useful for determining optimal

  9. Tool Selection and Height-Direction Precision Control Based on MasterCAM Software

    Institute of Scientific and Technical Information of China (English)

    吴平峰

    2013-01-01

    This article studies tool selection and precision control in the height direction based on MasterCAM software. Taking MasterCAM as the primary research tool, it gives an overview of the software, explores the precision control of tool selection and of the height direction, and analyzes the importance of controlling both. The paper reviews MasterCAM process issues at home and abroad and the current state of CNC machining with MasterCAM, and proposes strategies for optimizing MasterCAM processes and precision control, in order to improve the efficiency of tool selection and height-direction precision control.

  10. Bridging the Gap Between Software Process and Software Development

    OpenAIRE

    Rouillé, Emmanuelle; Combemale, Benoit; Barais, Olivier; David, Touzet; Jézéquel, Jean-Marc

    2011-01-01

    Model Driven Engineering (MDE) benefits software development (a.k.a. Model Driven Software Development) as well as software processes (a.k.a. Software Process Modeling). Nevertheless, the gap between processes and development is still too great. Indeed, information from processes is not always used to improve development, and vice versa. For instance, it is possible to define the development tools used in a process description without linking them to the real tools. This p...

  11. TS Tools

    Directory of Open Access Journals (Sweden)

    Yvette Linders

    2012-12-01

    Full Text Available In this installment of TS Tools, doctoral candidate Yvette Linders (Radboud University Nijmegen) shows how software for qualitative data analysis can be applied in research on literary criticism.

  12. The Software Management Environment (SME)

    Science.gov (United States)

    Valett, Jon D.; Decker, William; Buell, John

    1988-01-01

    The Software Management Environment (SME) is a research effort designed to utilize the past experiences and results of the Software Engineering Laboratory (SEL) and to incorporate this knowledge into a tool for managing projects. SME provides the software development manager with the ability to observe, compare, predict, analyze, and control key software development parameters such as effort, reliability, and resource utilization. The major components of the SME, the architecture of the system, and examples of the functionality of the tool are discussed.

  13. Visual assessment of software evolution

    OpenAIRE

    Voinea, Lucian; Lukkien, Johan; Telea, Alexandru

    2007-01-01

    Configuration management tools have become well and widely accepted by the software industry. Software Configuration Management (SCM) systems hold minute information about the entire evolution of complex software systems and thus represent a good source for process accounting and auditing. However, it is still difficult to use the entire spectrum of information such tools maintain. Currently, significant effort is being made to mine this kind of software repository for ex...

  14. eSoftwareList

    Data.gov (United States)

    US Agency for International Development — USAID Software Database reporting tool created in Oracle Application Express (APEX). This version provides read-only access to a database view of the JIRA SAR...

  15. Measuring software technology

    Science.gov (United States)

    Agresti, W. W.; Card, D. N.; Church, V. E.; Page, G.; Mcgarry, F. E.

    1983-01-01

    Results are reported from a series of investigations into the effectiveness of various methods and tools used in a software production environment. The basis for the analysis is a project data base, built through extensive data collection and process instrumentation. The project profiles become an organizational memory, serving as a reference point for an active program of measurement and experimentation on software technology.

  16. Computer Software tool for heart rate variability (HRV), T-wave alternans (TWA) and heart rate turbulence (HRT) analysis from ECGs

    OpenAIRE

    Kudryński, Krzysztof; Strumiłło, Paweł; Ruta, Jan

    2011-01-01

    Summary Background This paper presents a software package for quantitative evaluation of heart rate variability (HRV), heart rate turbulence (HRT), and T-wave alternans (TWA) from ECG recordings. The software has been developed for the purpose of scientific research rather than clinical diagnosis. Material/Methods The software is written in Matlab Mathematical Language. Procedures for evaluation of HRV, HRT and TWA were implemented. HRV analysis was carried out by applying statistical and spe...
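    The statistical (time-domain) part of HRV analysis mentioned above conventionally computes measures such as SDNN and RMSSD over a series of RR intervals. The sketch below shows these two standard statistics; it is a generic illustration and is not taken from the package's Matlab code:

```python
import math

def sdnn(rr_ms):
    """Standard deviation of RR (NN) intervals -- a basic time-domain HRV measure."""
    mean = sum(rr_ms) / len(rr_ms)
    return math.sqrt(sum((x - mean) ** 2 for x in rr_ms) / len(rr_ms))

def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr = [812, 790, 805, 830, 818, 795]  # hypothetical RR intervals in milliseconds
print(round(sdnn(rr), 1), round(rmssd(rr), 1))
```

    In practice the RR series would first be cleaned of ectopic beats and artifacts before these statistics are computed.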

  17. ATLAS software packaging

    Science.gov (United States)

    Rybkin, Grigory

    2012-12-01

    Software packaging is indispensable part of build and prerequisite for deployment processes. Full ATLAS software stack consists of TDAQ, HLT, and Offline software. These software groups depend on some 80 external software packages. We present tools, package PackDist, developed and used to package all this software except for TDAQ project. PackDist is based on and driven by CMT, ATLAS software configuration and build tool, and consists of shell and Python scripts. The packaging unit used is CMT project. Each CMT project is packaged as several packages—platform dependent (one per platform available), source code excluding header files, other platform independent files, documentation, and debug information packages (the last two being built optionally). Packaging can be done recursively to package all the dependencies. The whole set of packages for one software release, distribution kit, also includes configuration packages and contains some 120 packages for one platform. Also packaged are physics analysis projects (currently 6) used by particular physics groups on top of the full release. The tools provide an installation test for the full distribution kit. Packaging is done in two formats for use with the Pacman and RPM package managers. The tools are functional on the platforms supported by ATLAS—GNU/Linux and Mac OS X. The packaged software is used for software deployment on all ATLAS computing resources from the detector and trigger computing farms, collaboration laboratories computing centres, grid sites, to physicist laptops, and CERN VMFS and covers the use cases of running all applications as well as of software development.

  18. Social software in global software development

    DEFF Research Database (Denmark)

    Giuffrida, Rosalba; Dittrich, Yvonne

    2010-01-01

    Social software (SoSo) is defined by Farkas as tools that (1) allow people to communicate, collaborate, and build community online, (2) can be syndicated, shared, reused or remixed, and (3) let people learn easily from and capitalize on the behavior and knowledge of others [1]. SoSo includes a wide variety of tools such as instant messaging, internet forums, mailing lists, blogs, wikis, social network sites, social bookmarking, social libraries, and virtual worlds. Though normally belonging rather to the private realm, the use of social software in a corporate context has been reported, e.g. as a way...

  19. TESTING FOR OBJECT ORIENTED SOFTWARE

    Directory of Open Access Journals (Sweden)

    Jitendra S. Kushwah

    2011-02-01

    Full Text Available This paper deals with the design and development of an automated testing tool for object-oriented software. By an automated testing tool, we mean a tool that automates a part of the testing process. It can include one or more of the following processes: test strategy generation, test case generation, test case execution, test data generation, and reporting and logging of results. By object-oriented software we mean software designed using the OO approach and implemented in an OO language. Testing OO software is different from testing software created using procedural languages, and poses several new challenges. In the past, most methods for testing OO software were simple extensions of existing methods for conventional software; however, these have been shown to be not very appropriate, and new techniques have been developed. This work has mainly focused on testing design specifications for OO software. As described later, there is a lack of specification-based testing tools for OO software. An advantage of testing software specifications rather than program code is that specifications are generally correct whereas code is flawed. Moreover, with software engineering principles firmly established in the industry, most software developed nowadays follows all the steps of the Software Development Life Cycle (SDLC). For this work, UML specifications created in Rational Rose are used. UML has become the de facto standard for analysis and design of OO software. Testing is conducted at three levels: unit, integration, and system. At the system level there is no difference between the testing techniques used for OO software and software created in a procedural language, so conventional techniques can be used. This tool provides features for testing at the unit (class) level as well as the integration level; a maintenance-level component has also been incorporated. 
    Results of applying this tool to sample Rational Rose files have
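    At the unit level, "class-level" testing of OO software means exercising one class in isolation through its public interface. A minimal sketch using Python's standard `unittest` framework (the `Stack` class is a hypothetical example, not the paper's subject system):

```python
import unittest

class Stack:
    """Toy class under test (hypothetical example)."""
    def __init__(self):
        self._items = []
    def push(self, x):
        self._items.append(x)
    def pop(self):
        return self._items.pop()
    def __len__(self):
        return len(self._items)

class TestStack(unittest.TestCase):
    """Class-level (unit) tests -- the lowest of the three testing levels."""
    def test_push_then_pop_returns_last_item(self):
        s = Stack()
        s.push(42)
        self.assertEqual(s.pop(), 42)
    def test_len_counts_pushed_items(self):
        s = Stack()
        s.push(1)
        s.push(2)
        self.assertEqual(len(s), 2)
```

    Integration-level testing would then combine several such classes and test their interactions rather than each class alone.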

  20. PhasePlot: An Interactive Software Tool for Visualizing Phase Relations, Performing Virtual Experiments, and for Teaching Thermodynamic Concepts in Petrology

    Science.gov (United States)

    Ghiorso, M. S.

    2012-12-01

    The computer program PhasePlot was developed for Macintosh computers and released via the Mac App Store in December 2011. It permits the visualization of phase relations calculated from internally consistent thermodynamic data-model collections, including those from MELTS (Ghiorso and Sack, 1995, CMP 119, 197-212), pMELTS (Ghiorso et al., 2002, G-cubed 3, 10.1029/2001GC000217) and the deep mantle database of Stixrude and Lithgow-Bertelloni (2011, GJI 184, 1180-1213). The software allows users to enter a system bulk composition and a range of reference conditions, and then calculate a grid of phase relations. These relations may be visualized in a variety of ways including pseudosections, phase diagrams, phase proportion plots, and contour diagrams of phase compositions and abundances. The program interface is user friendly and the computations are fast on laptop-scale machines, which makes PhasePlot amenable to in-class demonstrations, as a tool in instructional laboratories, and as an aid in support of out-of-class exercises and research. Users focus on problem specification and interpretation of results rather than on manipulation and mechanics of computation. The software has been developed with NSF support and is free. The PhasePlot web site is at phaseplot.org where extensive user documentation, video tutorials and examples of use may be found. The original release of PhasePlot permitted calculations to be performed on pressure-, temperature-grids (P-T), by direct minimization of the Gibbs free energy of the system at each grid point. A revision of PhasePlot (scheduled for release to the Mac App Store in December 2012) extends capabilities to include pressure-, entropy-grids (P-S) by system enthalpy minimization, volume-, temperature-grids (V-T) by system Helmholtz energy minimization, and volume-, entropy-grids (V-S) by minimization of the internal energy of the system. P-S gridded results may be utilized to visualize phase relations as a function of heat
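    The four grid types correspond to the standard thermodynamic extremum principles: for each pair of fixed natural variables, equilibrium minimizes the associated potential. A textbook summary (not taken from the paper itself):

```latex
\begin{align*}
G(T,P) &= U - TS + PV && \text{minimized on } P\text{-}T \text{ grids}\\
H(S,P) &= U + PV      && \text{minimized on } P\text{-}S \text{ grids}\\
A(T,V) &= U - TS      && \text{minimized on } V\text{-}T \text{ grids}\\
U(S,V) &              && \text{minimized on } V\text{-}S \text{ grids}
\end{align*}
```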