WorldWideScience

Sample records for bartab software tools

  1. Software engineering tools.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1994-01-01

    We have looked at general descriptions and illustrations of several software development tools, such as tools for prototyping, developing DFDs, testing, and maintenance. Many others are available, and new ones are being developed. However, you have at least seen some examples of powerful CASE tools for systems development.

  2. Machine Tool Software

    Science.gov (United States)

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using APT (Automatically Programmed Tool) software since 1969 in his CAD/CAM (Computer Aided Design and Manufacturing) curriculum. Professor Hack teaches the use of APT programming languages for control of metal cutting machines. Machine tool instructions are geometry definitions written in the APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  3. Software Tool Issues

    Science.gov (United States)

    Hennell, Michael

    This chapter relies on experience with tool development gained over the last thirty years. It shows that there are a large number of techniques that contribute to any successful project, and that formality is always the key: a modern software test tool is based on a firm mathematical foundation. After a brief introduction, Section 2 recalls and extends the terminology of Chapter 1. Section 3 discusses the design of different sorts of static and dynamic analysis tools. Nine important issues to be taken into consideration when evaluating such tools are presented in Section 4. Section 5 investigates the interplay between testing and proof. In Section 6, we call for developers to take their own medicine and verify their tools. Finally, we conclude in Section 7 with a summary of our main messages, emphasising the important role of testing.

  4. CSAM Metrology Software Tool

    Science.gov (United States)

    Vu, Duc; Sandor, Michael; Agarwal, Shri

    2005-01-01

    CSAM Metrology Software Tool (CMeST) is a computer program for analysis of false-color CSAM images of plastic-encapsulated microcircuits. (CSAM signifies C-mode scanning acoustic microscopy.) The colors in the images indicate areas of delamination within the plastic packages. Heretofore, the images have been interpreted by human examiners, so interpretations have not been entirely consistent and objective. CMeST processes the color information in image-data files to detect areas of delamination without incurring inconsistencies of subjective judgement. CMeST can be used to create a database of baseline images of packages acquired at given times for comparison with images of the same packages acquired at later times. Any area within an image can be selected for analysis, which can include examination of different delamination types by location. CMeST can also be used to perform statistical analyses of image data. Results of analyses are available in a spreadsheet format for further processing. The results can be exported to any database-processing software.
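
    The record does not disclose CMeST's actual algorithms; as a hedged illustration of the kind of processing it describes, the sketch below estimates a delaminated-area fraction of a false-color image by thresholding one assumed color band (the color rule and all thresholds are hypothetical):

    ```python
    # Hypothetical sketch: the color convention (strong red = delamination)
    # and the thresholds are assumptions, not CMeST's actual rules.
    import numpy as np

    def delamination_fraction(image, region=None):
        """image: H x W x 3 uint8 RGB array; region: optional (y0, y1, x0, x1)."""
        if region is not None:
            y0, y1, x0, x1 = region
            image = image[y0:y1, x0:x1]
        r = image[..., 0].astype(int)
        g = image[..., 1].astype(int)
        b = image[..., 2].astype(int)
        delaminated = (r > 180) & (g < 100) & (b < 100)
        return float(delaminated.mean())

    # Baseline-vs-later comparison for the same package area:
    # growth = delamination_fraction(later_img) - delamination_fraction(baseline_img)
    ```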

  5. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical and electronic engineering. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed, and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that software development productivity has increased only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for a general evaluation of computer assisted software engineering (CASE) tools, based on actual installation of and experimentation with some specific tools.

  6. Design of parametric software tools

    DEFF Research Database (Denmark)

    Sabra, Jakob Borrits; Mullins, Michael

    2011-01-01

    The studies investigate the field of evidence-based design used in architectural design practice and propose a method using 2D/3D CAD applications to: 1) enhance integration of evidence-based design knowledge in architectural design phases, with a focus on lighting and interior design, and 2) assess fulfilment of evidence-based design criteria regarding light distribution and location in relation to patient safety in architectural health care design proposals. The study uses the 2D/3D CAD modelling software Rhinoceros 3D with the plug-in Grasshopper to create parametric tool prototypes that exemplify the operations and functions of the design method. To evaluate the prototype potentials, surveys with architectural and healthcare design companies are conducted. Evaluation is done by the administration of questionnaires forming part of the development of the tools. The results show that architects and designers...

  7. Tools for Embedded Computing Systems Software

    Science.gov (United States)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talk and the key figures of each workshop presentation, together with chairmen's summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  8. Software Tools Streamline Project Management

    Science.gov (United States)

    2009-01-01

    Three innovative software inventions from Ames Research Center (NETMARK, Program Management Tool, and Query-Based Document Management) are finding their way into NASA missions as well as industry applications. The first, NETMARK, is a program that enables integrated searching of data stored in a variety of databases and documents, meaning that users no longer have to look in several places for related information. NETMARK allows users to search and query information across all of these sources in one step. This cross-cutting capability in information analysis has reduced the amount of time needed to mine data from days or weeks to mere seconds. NETMARK has been used widely throughout NASA, enabling this automatic integration of information across many documents and databases. NASA projects that use NETMARK include the internal reporting system and project performance dashboard, Erasmus, NASA's enterprise management tool, which enhances organizational collaboration and information sharing through document routing and review; the Integrated Financial Management Program; International Space Station Knowledge Management; the Mishap and Anomaly Information Reporting System; and management of the Mars Exploration Rovers. Approximately $1 billion worth of NASA's projects are currently managed using Program Management Tool (PMT), which is based on NETMARK. PMT is a comprehensive, Web-enabled application used to assist program and project managers within NASA enterprises in monitoring, disseminating, and tracking the progress of program and project milestones and other relevant resources. PMT consists of an integrated knowledge repository built upon advanced enterprise-wide database integration techniques and the latest Web-enabled technologies. The current system is in a pilot operational mode, allowing users to automatically manage, track, define, update, and view customizable milestone objectives and goals. The third software invention, Query-Based Document Management...

  9. Tools & training for more secure software

    CERN Document Server

    CERN. Geneva

    2017-01-01

    Just by fate of nature, software today is shipped out as “beta”, coming with vulnerabilities and weaknesses that should already have been fixed at the programming stage. This presentation will show the consequences of suboptimal software, why good programming, thorough software design, and a proper software development process are imperative for the overall security of the Organization, and how a few simple tools and training are supposed to make CERN software more secure.

  10. Software management tools: Lessons learned from use

    Science.gov (United States)

    Reifer, D. J.; Valett, J.; Knight, J.; Wenneson, G.

    1985-01-01

    Experience in inserting software project planning tools into more than 100 projects producing mission-critical software is discussed. The problems the software project manager faces are listed, along with methods and tools available to handle them. Experience is reported with the Project Manager's Workstation (PMW) and the SoftCost-R cost estimating package. Finally, the results of a survey, which looked at what could be done in the future to overcome the problems experienced and build a set of truly useful tools, are presented.
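
    SoftCost-R's internal equations are not given in the record; as a hedged stand-in, the sketch below uses the classic COCOMO organic-mode formula, effort = 2.4 x KLOC^1.05 person-months, to show the kind of parametric estimate such a cost package produces:

    ```python
    # Generic COCOMO "organic" effort model (Boehm, 1981), shown only as an
    # illustration of parametric cost estimation; these constants are
    # COCOMO's, not SoftCost-R's.
    def cocomo_organic_effort(kloc):
        """Estimated effort in person-months for an organic-mode project."""
        return 2.4 * kloc ** 1.05

    print(f"{cocomo_organic_effort(32):.0f} person-months")  # ~91 for 32 KLOC
    ```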

  11. Assessment and Development of Software Engineering Tools

    Science.gov (United States)

    1991-01-16

    Assessment (REA) tool would advise a potential software reuser on the tradeoffs between reusing a RSC versus developing a brand new software product...of memberships in the key RSC reusability attributes; e.g., size, structure, or documentation, etc., all of which would be weighted by reuser

  12. A Software Tool for Legal Drafting

    CERN Document Server

    Gorín, Daniel; Schapachnik, Fernando; 10.4204/EPTCS.68.7

    2011-01-01

    Although many attempts at automated aids for legal drafting have been made, they were based on the construction of a new tool, completely from scratch. This is at least curious, considering that a strong parallelism can be established between a normative document and a software specification: both describe what an entity should or should not do, can or cannot do. In this article we compare normative documents and software specifications to find out their similarities and differences. The comparison shows that there are distinctive particularities, but they are restricted to a very specific subclass of normative propositions. The rest, we postulate, can be dealt with by software tools. For such an enterprise the FormaLex tool set was devised: an LTL-based language and companion tools that utilize model checking to find normative incoherences in regulations, contracts and other legal documents. A feature-rich case study is analyzed with the presented tools.
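
    FormaLex itself rests on an LTL-based language and model checking; the toy sketch below (not the authors' method) conveys only the simplest form of normative incoherence such tools hunt for, an obligation and a prohibition on the same action:

    ```python
    # Toy sketch of normative incoherence, far simpler than FormaLex's LTL
    # model checking: a norm either obliges ('O') or forbids ('F') an action,
    # and an obligation plus a prohibition on one action is a conflict.
    from collections import defaultdict

    def find_conflicts(norms):
        """norms: iterable of (article_id, modality, action), modality in {'O', 'F'}."""
        by_action = defaultdict(lambda: {"O": [], "F": []})
        for article, modality, action in norms:
            by_action[action][modality].append(article)
        return [
            (ob, fb, action)
            for action, groups in by_action.items()
            for ob in groups["O"]
            for fb in groups["F"]
        ]

    # Hypothetical contract clauses:
    norms = [("art. 3", "O", "file_report"), ("art. 7", "F", "file_report")]
    print(find_conflicts(norms))  # [('art. 3', 'art. 7', 'file_report')]
    ```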

  13. A Software Tool for Legal Drafting

    Directory of Open Access Journals (Sweden)

    Daniel Gorín

    2011-09-01

    Although many attempts at automated aids for legal drafting have been made, they were based on the construction of a new tool, completely from scratch. This is at least curious, considering that a strong parallelism can be established between a normative document and a software specification: both describe what an entity should or should not do, can or cannot do. In this article we compare normative documents and software specifications to find out their similarities and differences. The comparison shows that there are distinctive particularities, but they are restricted to a very specific subclass of normative propositions. The rest, we postulate, can be dealt with by software tools. For such an enterprise the FormaLex tool set was devised: an LTL-based language and companion tools that utilize model checking to find normative incoherences in regulations, contracts and other legal documents. A feature-rich case study is analyzed with the presented tools.

  14. Tool Use Within NASA Software Quality Assurance

    Science.gov (United States)

    Shigeta, Denise; Port, Dan; Nikora, Allen P.; Wilf, Joel

    2013-01-01

    As space mission software systems become larger and more complex, it is increasingly important for the software assurance effort to have the ability to effectively assess both the artifacts produced during software system development and the development process itself. Conceptually, assurance is a straightforward idea - it is the result of activities carried out by an organization independent of the software developers to better inform project management of potential technical and programmatic risks, and thus increase management's confidence in the decisions they ultimately make. In practice, effective assurance for large, complex systems often entails assessing large, complex software artifacts (e.g., requirements specifications, architectural descriptions) as well as substantial amounts of unstructured information (e.g., anomaly reports resulting from testing activities during development). In such an environment, assurance engineers can benefit greatly from appropriate tool support. In order to do so, an assurance organization will need accurate and timely information on the tool support available for various types of assurance activities. In this paper, we investigate the current use of tool support for assurance organizations within NASA, and describe on-going work at JPL for providing assurance organizations with the information about tools they need to use them effectively.

  15. CPAchecker: A Tool for Configurable Software Verification

    CERN Document Server

    Beyer, Dirk

    2009-01-01

    Configurable software verification is a recent concept for expressing different program analysis and model checking approaches in one single formalism. This paper presents CPAchecker, a tool and framework that aims at easy integration of new verification components. Every abstract domain, together with the corresponding operations, is required to implement the interface of configurable program analysis (CPA). The main algorithm is configurable to perform a reachability analysis on arbitrary combinations of existing CPAs. The major design goal during the development was to provide a framework for developers that is flexible and easy to extend. We hope that researchers find it convenient and productive to implement new verification ideas and algorithms using this platform and that it advances the field by making it easier to perform practical experiments. The tool is implemented in Java and runs as a command-line tool or as an Eclipse plug-in. We evaluate the efficiency of our tool on benchmarks from the software mo...
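
    A CPA fixes an abstract domain plus transfer, merge, and stop operators, and the reachability algorithm is then generic over those operators. Below is a hedged Python paraphrase of that loop; CPAchecker itself is implemented in Java, and the operator signatures here are illustrative, not its actual interfaces:

    ```python
    # Hedged paraphrase of the CPA reachability loop: abstract states are
    # explored from an initial state; merge may combine a new state with
    # states already reached, and stop decides whether it is already covered.
    def cpa_reachability(initial, transfer, merge, stop):
        """States must be hashable; transfer(state) yields abstract successors."""
        reached = {initial}
        waitlist = [initial]
        while waitlist:
            state = waitlist.pop()
            for succ in transfer(state):
                for old in list(reached):
                    merged = merge(succ, old)
                    if merged != old:          # merge combined information
                        reached.discard(old)
                        reached.add(merged)
                        waitlist.append(merged)
                if not stop(succ, reached):    # succ not yet covered
                    reached.add(succ)
                    waitlist.append(succ)
        return reached
    ```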

  16. Software Tools Used for Continuous Assessment

    Directory of Open Access Journals (Sweden)

    Corina SBUGHEA

    2016-04-01

    The present paper addresses the subject of continuous evaluation and of the IT tools that support it. The approach starts from the main concepts and methods used in the teaching process, according to the assessment methodology, and then it focuses on their implementation in the Wondershare QuizCreator software.

  17. ATLAS software configuration and build tool optimisation

    CERN Document Server

    Rybkin, G; The ATLAS collaboration

    2014-01-01

    ATLAS software code base is over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories in 6 continents. To meet the challenge of configuration and building of this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance that was optimised within several approaches: reduction of the number of re...

  18. ATLAS software configuration and build tool optimisation

    CERN Document Server

    Rybkin, G; The ATLAS collaboration

    2013-01-01

    ATLAS software code base is over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories in 6 continents. To meet the challenge of configuration and building of this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance that was optimised within several approaches: reduction of the number of re...

  19. Software Engineering Tools for Scientific Models

    Science.gov (United States)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere-ocean model called ModelE, written in fixed format Fortran.

  20. ATLAS software configuration and build tool optimisation

    Science.gov (United States)

    Rybkin, Grigory; Atlas Collaboration

    2014-06-01

    ATLAS software code base is over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories in 6 continents. To meet the challenge of configuration and building of this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance that was optimised within several approaches: reduction of the number of reads of requirements files, which are now read once per package by a CMT build command that generates cached requirements files for subsequent CMT build commands; introduction of more fine-grained build parallelism at package task level, i.e., dependent applications and libraries are compiled in parallel; code optimisation of CMT commands used for build; introduction of package-level build parallelism, i.e., parallelising the build of independent packages. By default, CMT launches NUMBER-OF-PROCESSORS build commands in parallel. The other focus was on optimisation of CMT commands in general, which made them approximately 2 times faster. CMT can generate a cached requirements file for the environment setup command, which is especially useful for deployment on distributed file systems like AFS or CERN VMFS. The use of parallelism, caching and code optimisation reduced software build time and environment setup time by several times, and increased the efficiency of...
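
    CMT's implementation is not shown in the record; as an illustration of the package-level parallelism described above, the sketch below schedules builds so that a package starts as soon as all of its prerequisite packages have finished (the dependency-map format and worker count are assumptions):

    ```python
    # Illustration of package-level build parallelism (not CMT's code): a
    # package is submitted once every package it depends on has been built.
    from concurrent.futures import FIRST_COMPLETED, ThreadPoolExecutor, wait

    def parallel_build(deps, build, workers=8):
        """deps: {package: set of prerequisite packages}; build: builds one package."""
        remaining = {pkg: set(d) for pkg, d in deps.items()}
        done, running = set(), {}
        with ThreadPoolExecutor(max_workers=workers) as pool:
            while remaining or running:
                ready = [pkg for pkg, d in remaining.items() if d <= done]
                if not ready and not running:
                    raise RuntimeError("dependency cycle detected")
                for pkg in ready:
                    del remaining[pkg]
                    running[pool.submit(build, pkg)] = pkg
                finished, _ = wait(running, return_when=FIRST_COMPLETED)
                for fut in finished:
                    fut.result()               # propagate build failures
                    done.add(running.pop(fut))
        return done
    ```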

  1. Towards E-CASE Tools for Software Engineering

    OpenAIRE

    Nabil Arman

    2013-01-01

    CASE tools play an important role in all phases of software systems development and engineering. This is evident in the huge benefits obtained from using these tools, including their cost-effectiveness, rapid software application development, and improving the possibility of software reuse, to name just a few. In this paper, the idea of moving towards E-CASE tools, rather than traditional CASE tools, is advocated, since these E-CASE tools have all the benefits and advantages of traditional...

  2. Selecting and effectively using a computer aided software engineering tool

    Energy Technology Data Exchange (ETDEWEB)

    Kuhn, D.L.

    1989-01-01

    Software engineering is a science by which user requirements are translated into a quality software product. Computer Aided Software Engineering (CASE) is the scientific application of a set of tools and methods to software, which results in high-quality, defect-free, and maintainable software products. The Computer Systems Engineering (CSE) group of Separations Technology at the Savannah River Site has successfully used CASE tools to produce high-quality, reliable, and maintainable software products. This paper details the selection process CSE used to acquire a commonly available CASE product and how the CSE group effectively used this CASE tool to consistently produce quality software. 9 refs.

  3. A Mock-Up Tool for Software Component Reuse Repository

    OpenAIRE

    P.Niranjan; C.V.Guru Rao

    2010-01-01

    Software reuse effectiveness can be improved by reducing cost and investment. Software reuse costs can be reduced when reusable components are easy to locate, adapt and integrate into new efficient applications. Reuse is the key paradigm for increasing software quality in software development. This paper focuses on the implementation of a software tool with a new integrated classification scheme to support the classification and build of software components and effective software reuse repositories to faci...

  4. Tool Support for Software Lookup Table Optimization

    Directory of Open Access Journals (Sweden)

    Chris Wilcox

    2011-01-01

    A number of scientific applications are performance-limited by expressions that repeatedly call costly elementary functions. Lookup table (LUT) optimization accelerates the evaluation of such functions by reusing previously computed results. LUT methods can speed up applications that tolerate an approximation of function results, thereby achieving a high level of fuzzy reuse. One problem with LUT optimization is the difficulty of controlling the tradeoff between performance and accuracy. The current practice of manual LUT optimization adds programming effort by requiring extensive experimentation to make this tradeoff, and such hand tuning can obfuscate algorithms. In this paper we describe a methodology and tool implementation to improve the application of software LUT optimization. Our Mesa tool implements source-to-source transformations for C or C++ code to automate the tedious and error-prone aspects of LUT generation such as domain profiling, error analysis, and code generation. We evaluate Mesa with five scientific applications. Our results show a performance improvement of 3.0× and 6.9× for two molecular biology algorithms, 1.4× for a molecular dynamics program, 2.1× to 2.8× for a neural network application, and 4.6× for a hydrology calculation. We find that Mesa enables LUT optimization with more control over accuracy and less effort than manual approaches.
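
    Mesa's generated C/C++ is not reproduced in the record; the sketch below shows the core LUT technique it automates: sample an expensive function once over its domain, then answer later calls by linear interpolation, with table size controlling the performance/accuracy tradeoff.

    ```python
    # Core LUT technique: precompute an expensive function on a uniform grid
    # and answer later calls by interpolating between neighboring entries.
    import math
    import numpy as np

    class Lut:
        def __init__(self, fn, lo, hi, size=1024):
            self.lo, self.step = lo, (hi - lo) / (size - 1)
            self.table = np.array([fn(lo + i * self.step) for i in range(size)])

        def __call__(self, x):
            pos = (x - self.lo) / self.step
            i = min(int(pos), len(self.table) - 2)  # clamp so i + 1 stays in range
            frac = pos - i
            return self.table[i] * (1.0 - frac) + self.table[i + 1] * frac

    fast_exp = Lut(math.exp, 0.0, 1.0, size=4096)
    print(fast_exp(0.5), math.exp(0.5))  # agree to within the table's error bound
    ```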

  5. Sandia software guidelines: Volume 5, Tools, techniques, and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    1989-07-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. This volume describes software tools and methodologies available to Sandia personnel for the development of software, and outlines techniques that have proven useful within the Laboratories and elsewhere. References and evaluations by Sandia personnel are included. 6 figs.

  6. Tools and Behavioral Abstraction: A Direction for Software Engineering

    Science.gov (United States)

    Leino, K. Rustan M.

    As in other engineering professions, software engineers rely on tools. Such tools can analyze program texts and design specifications more automatically and in more detail than ever before. While many tools today are applied to find new defects in old code, I predict that more software-engineering tools of the future will be available to software authors at the time of authoring. If such analysis tools can be made to be fast enough and easy enough to use, they can help software engineers better produce and evolve programs.

  7. Educational Software Tool for Protection System Engineers. Distance Relay

    Directory of Open Access Journals (Sweden)

    Trujillo-Guajardo L.A.

    2012-04-01

    In this article, a graphical software tool is presented; the tool is aimed at the education of protection system engineers. The theoretical fundamentals used for the design of the operation characteristics of distance relays and their algorithms are presented. The software allows the evaluation and analysis of real-time or simulated events at every stage of the design of the distance relay. Some example cases are presented to illustrate the activities that can be done with the graphical software tool developed.

  8. Herramientas libres para modelar software Free tools to model software

    Directory of Open Access Journals (Sweden)

    Mauro Callejas Cuervo; Óscar Yovany Baquero Moreno

    2010-11-01

    An observation on free software and its implications for software development processes with 4G tools, by organizations or individuals without astronomical capital and without the market-cornering mentality of dominating the market with costly products that make their vendors multimillionaires, offer no real guarantee, provide not even the possibility of knowing the software one has paid for, much less of modifying it if it does not meet our expectations.

  9. SRGM Analyzers Tool of SDLC for Software Improving Quality

    Directory of Open Access Journals (Sweden)

    Mr. Girish Nille

    2014-11-01

    Software Reliability Growth Models (SRGM) have been developed to estimate software reliability measures such as the software failure rate, the number of remaining faults, and software reliability. In this paper, a software analyzer tool is proposed for deriving several software reliability growth models based on the Enhanced Non-homogeneous Poisson Process (ENHPP) in the presence of imperfect debugging and error generation. The proposed models are initially formulated for the case when there is no differentiation between failure observation and fault removal testing processes, and then extended for the case when there is a clear differentiation between the two. Many SRGM have been developed to describe software failures as a random process and can be used to measure the development status during testing. With SRGM, software consultants can easily measure (or evaluate) the software reliability (or quality) and plot software reliability growth charts.
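
    The paper's ENHPP models with imperfect debugging are more elaborate than can be shown here; as a hedged baseline, the sketch below fits the classic Goel-Okumoto NHPP mean value function m(t) = a(1 - e^(-bt)) to invented cumulative failure counts, which is the basic shape of what "fitting an SRGM" means:

    ```python
    # Classic Goel-Okumoto NHPP model fitted to cumulative failure counts;
    # a is the expected total number of faults, b the per-fault detection
    # rate. The failure data below are invented for illustration.
    import numpy as np
    from scipy.optimize import curve_fit

    def mean_value(t, a, b):
        return a * (1.0 - np.exp(-b * t))

    weeks = np.arange(1, 11)
    failures = np.array([8, 15, 21, 26, 30, 33, 35, 37, 38, 39])  # invented

    (a, b), _ = curve_fit(mean_value, weeks, failures, p0=(40.0, 0.3))
    print(f"estimated total faults a = {a:.1f}, detection rate b = {b:.3f}")
    print(f"estimated faults remaining after week 10: {a - mean_value(10, a, b):.1f}")
    ```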

  10. A Taxonomy of Knowledge Management Software Tools: Origins and Applications.

    Science.gov (United States)

    Tyndale, Peter

    2002-01-01

    Examines, evaluates, and organizes a wide variety of knowledge management software tools by examining the literature related to the selection and evaluation of knowledge management tools. (Author/SLD)

  11. Towards E-CASE Tools for Software Engineering

    Directory of Open Access Journals (Sweden)

    Nabil Arman

    2013-02-01

    CASE tools play an important role in all phases of software systems development and engineering. This is evident in the huge benefits obtained from using these tools, including their cost-effectiveness, rapid software application development, and improving the possibility of software reuse, to name just a few. In this paper, the idea of moving towards E-CASE tools, rather than traditional CASE tools, is advocated, since these E-CASE tools have all the benefits and advantages of traditional CASE tools and add to that all the benefits of web technology. This is presented by focusing on the role of E-CASE tools in facilitating the trend of telecommuting and virtual workplaces among software engineering and information technology professionals. In addition, E-CASE tools integrate smoothly with the trend of E-learning in conducting software engineering courses. Finally, two surveys were conducted among a group of software engineering professionals and students of software engineering courses. The surveys show that E-CASE tools are of great value to both communities of students and professionals of software engineering.

  12. TAUS:A File—Based Software Understanding Tool

    Institute of Scientific and Technical Information of China (English)

    费翔林; 汪承藻; et al.

    1990-01-01

    A program called TAUS, a Tool for Analyzing and Understanding Software, was developed. It is designed to help the programmer analyze and understand software interactively. Its aim is to reduce the dependence on human intelligence in software understanding and improve the programmer's understanding productivity. The design and implementation of TAUS and its applications are described.

  13. Estimation of toxicity using a Java based software tool

    Science.gov (United States)

    A software tool has been developed that will allow a user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be accessed using a web browser (or alternatively downloaded and run as a stand-alone applic...

  14. EISA 432 Energy Audits Best Practices: Software Tools

    Energy Technology Data Exchange (ETDEWEB)

    Maryl Fisher

    2014-11-01

    Five whole building analysis software tools that can aid an energy manager with fulfilling energy audit and commissioning/retro-commissioning requirements were selected for review in this best practices study. A description of each software tool is provided as well as a discussion of the user interface and level of expertise required for each tool, a review of how to use the tool for analyzing energy conservation opportunities, the format and content of reports generated by the tool, and a discussion on the applicability of the tool for commissioning.

  15. Software Tools for Fault Management Technologies Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Fault Management (FM) is a key requirement for safety, efficient onboard and ground operations, maintenance, and repair. QSI's TEAMS Software suite is a leading...

  16. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Science.gov (United States)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool proved to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation (also provided with the ETA Tool software release package) that were used to generate the reports presented in the manual.

  17. Innovative Software Tools Measure Behavioral Alertness

    Science.gov (United States)

    2014-01-01

    To monitor astronaut behavioral alertness in space, Johnson Space Center awarded Philadelphia-based Pulsar Informatics Inc. SBIR funding to develop software to be used onboard the International Space Station. Now used by the government and private companies, the technology has increased revenues for the firm by an average of 75 percent every year.

  18. An Evaluation Format for "Open" Software Tools.

    Science.gov (United States)

    Murphy, Cheryl A.

    1995-01-01

    Evaluates six "open" (empty of content and customized by users) software programs using the literature-based characteristics of documentation, learner control, branching capabilities, portability, ease of use, and cost-effectiveness. Interviewed computer-knowledgeable individuals to confirm the legitimacy of the evaluative characteristics. (LRW)

  19. Computer- Aided Design in Power Engineering Application of Software Tools

    CERN Document Server

    Stojkovic, Zlatan

    2012-01-01

    This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation, guiding through complex problems on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design. Several design example calculations are carried out using standard engineering tools like MATLAB, EMTP/ATP, Excel & Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools in project management in power systems ...

  20. ISWHM: Tools and Techniques for Software and System Health Management

    Science.gov (United States)

    Schumann, Johann; Mengshoel, Ole J.; Darwiche, Adnan

    2010-01-01

    This presentation presents status and results of research on Software Health Management done within the NRA "ISWHM: Tools and Techniques for Software and System Health Management." Topics include: Ingredients of a Guidance, Navigation, and Control System (GN and C); Selected GN and C Testbed example; Health Management of major ingredients; ISWHM testbed architecture; and Conclusions and next Steps.

  1. Use of Software Tools in Teaching Relational Database Design.

    Science.gov (United States)

    McIntyre, D. R.; And Others

    1995-01-01

    Discusses the use of state-of-the-art software tools in teaching a graduate, advanced, relational database design course. Results indicated a positive student response to the prototype of expert systems software and a willingness to utilize this new technology both in their studies and in future work applications. (JKP)

  2. iPhone examination with modern forensic software tools

    Science.gov (United States)

    Höne, Thomas; Kröger, Knut; Luttenberger, Silas; Creutzburg, Reiner

    2012-06-01

    The aim of the paper is to show the usefulness of modern forensic software tools for iPhone examination. In particular, we focus on the new version of Elcomsoft iOS Forensic Toolkit and compare it with Oxygen Forensics Suite 2012 regarding functionality, usability and capabilities. It is shown how these software tools work and how capable they are in examining non-jailbroken and jailbroken iPhones.

  3. Generating DEM from LIDAR data - comparison of available software tools

    Science.gov (United States)

    Korzeniowska, K.; Lacka, M.

    2011-12-01

    In recent years many software tools and applications have appeared that offer procedures, scripts and algorithms to process and visualize ALS data. This variety of software tools and of "point cloud" processing methods led to the aim of this study: to assess algorithms available in various software tools that are used to classify LIDAR "point cloud" data, through a careful examination of Digital Elevation Models (DEMs) generated from LIDAR data on the basis of these algorithms. The work focused on the most important available software tools, both commercial and open source. Two sites in a mountain area were selected for the study; the area of each site is 0.645 sq km. DEMs generated with the analysed software tools were compared with a reference dataset generated using manual methods to eliminate non-ground points. Surfaces were analysed using raster analysis. Minimum, maximum and mean differences between the reference DEM and the DEMs generated with the analysed software tools were calculated, together with the Root Mean Square Error.
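
    A minimal sketch of the raster comparison described above, computing the minimum, maximum, and mean differences plus RMSE between a tool-generated DEM and the reference DEM; both rasters are assumed to be co-registered arrays of equal shape, and loading from GeoTIFF or similar is omitted:

    ```python
    # Difference statistics between a generated DEM and a reference DEM,
    # both given as equally sized, co-registered numpy arrays.
    import numpy as np

    def dem_stats(dem, reference):
        diff = dem - reference
        return {
            "min": float(diff.min()),
            "max": float(diff.max()),
            "mean": float(diff.mean()),
            "rmse": float(np.sqrt(np.mean(diff ** 2))),
        }
    ```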

  4. The evolution of CACSD tools-a software engineering perspective

    DEFF Research Database (Denmark)

    Ravn, Ole; Szymkat, Maciej

    1992-01-01

    The earlier evolution of computer-aided control system design (CACSD) tools is discussed from a software engineering perspective. A model of the design process is presented as the basis for principles and requirements of future CACSD tools. Combinability, interfacing in memory, and an open workspace are seen as important concepts in CACSD. Some points are made about the problem of buy or make when new software is required, and the idea of buy and make is put forward. Emphasis is put on the time perspective and the life cycle of the software...

  5. Classifying Desirable Features of Software Visualization Tools for Corrective Maintenance

    NARCIS (Netherlands)

    Sensalire, Mariam; Ogao, Patrick; Telea, Alexandru

    2008-01-01

    We provide an evaluation of 15 software visualization tools applicable to corrective maintenance. The tasks supported as well as the techniques used are presented and graded based on the support level. By analyzing user acceptance of current tools, we aim to help developers to select what to consid...

  6. Software project management tools in global software development: a systematic mapping study.

    Science.gov (United States)

    Chadli, Saad Yasser; Idri, Ali; Ros, Joaquín Nicolás; Fernández-Alemán, José Luis; de Gea, Juan M Carrillo; Toval, Ambrosio

    2016-01-01

    Global software development (GSD), which is a growing trend in the software industry, is characterized by a highly distributed environment. Performing software project management (SPM) in such conditions implies the need to overcome new limitations resulting from cultural, temporal and geographic separation. The aim of this research is to discover and classify the various tools mentioned in the literature that provide GSD project managers with support, and to identify in what way they support group interaction. A systematic mapping study has been performed by means of automatic searches in five sources. We have then synthesized the data extracted and presented the results of this study. A total of 102 tools were identified as being used in SPM activities in GSD. We have classified these tools according to the software life cycle process on which they focus and how they support the 3C collaboration model (communication, coordination and cooperation). The majority of the tools found are standalone tools (77%). A small number of platforms (8%) also offer a set of interacting tools that cover the software development lifecycle. Results also indicate that SPM areas in GSD are not adequately supported by corresponding tools and deserve more attention from tool builders.

  7. Software for systems biology: from tools to integrated platforms.

    Science.gov (United States)

    Ghosh, Samik; Matsuoka, Yukiko; Asai, Yoshiyuki; Hsin, Kun-Yi; Kitano, Hiroaki

    2011-11-03

    Understanding complex biological systems requires extensive support from software tools. Such tools are needed at each step of a systems biology computational workflow, which typically consists of data handling, network inference, deep curation, dynamical simulation and model analysis. In addition, there are now efforts to develop integrated software platforms, so that tools that are used at different stages of the workflow and by different researchers can easily be used together. This Review describes the types of software tools that are required at different stages of systems biology research and the current options that are available for systems biology researchers. We also discuss the challenges and prospects for modelling the effects of genetic changes on physiology and the concept of an integrated platform.

  8. Runtime software adaptation: approaches and a programming tool

    Directory of Open Access Journals (Sweden)

    Jarosław Rudy

    2012-03-01

    Software systems steadily tend to be bigger and more complex, making it more difficult to change them, especially during runtime. Several types of runtime software adaptation approaches have been proposed to increase the adaptation capability of applications and turn them into evolving software. Many of these approaches (using software architectural models, for example) are implemented during the design phase of the software development life cycle, making them ineffective or difficult to use for already existing applications. Moreover, the overhead caused by the use of these approaches has not been determined in many cases. In this paper the author presents a taxonomy of high- and low-level approaches to runtime software adaptation and then introduces a lightweight prototype programming tool used to add runtime code modification capability (via function hotswapping) to existing applications written in C++ and run under Linux. The tool also enables a defective function to be replaced by its older or corrected version at runtime. Several tests were prepared to compare traditional C++ applications with the same applications developed with the aforementioned programming tool. Applications were compared in terms of execution time, size of executable code and memory usage; different sizes and numbers of functions have been considered. The paper also researches the constant overhead caused by the programming tool regardless of the target application, and ends with a summary of the presented approaches and their characteristics, including effects on the targeted systems, capabilities, ease of use, level of abstraction, etc.
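
    The prototype tool described above targets C++ under Linux; as a language-neutral illustration of the function-hotswapping idea, the Python sketch below replaces a "defective" function at runtime so existing callers immediately pick up the corrected version (the module and function names are invented):

    ```python
    # Python stand-in for function hotswapping: because the call goes through
    # a module attribute looked up at call time, rebinding that attribute
    # swaps the implementation for all callers without a restart.
    import types

    pricing = types.ModuleType("pricing")  # invented example module
    exec("def discount(total): return total * 0.8", pricing.__dict__)  # defective

    def checkout(total):
        return pricing.discount(total)  # late-bound lookup enables the swap

    print(checkout(100))                # 80.0, the buggy behavior
    exec("def discount(total): return total * 0.9", pricing.__dict__)  # corrected
    print(checkout(100))                # 90.0, swapped at runtime
    ```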

  9. Software tool for data mining and its applications

    Science.gov (United States)

    Yang, Jie; Ye, Chenzhou; Chen, Nianyi

    2002-03-01

    A software tool for data mining is introduced, which integrates pattern recognition (PCA, Fisher, clustering, hyperenvelop, regression), artificial intelligence (knowledge representation, decision trees), statistical learning (rough set, support vector machine), and computational intelligence (neural network, genetic algorithm, fuzzy systems). It consists of nine function models: pattern recognition, decision trees, association rule, fuzzy rule, neural network, genetic algorithm, Hyper Envelop, support vector machine, and visualization. The principle and knowledge representation of some function models of data mining are described. The software tool is implemented in Visual C++ under Windows 2000. Nonmonotony in data mining is dealt with by concept hierarchy and layered mining. The software tool has been satisfactorily applied in the prediction of regularities of the formation of ternary intermetallic compounds in alloy systems and in the diagnosis of brain glioma.
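
    As one concrete example of the function models listed above, the sketch below reduces PCA to its core, the SVD of mean-centered data; it is a generic illustration, not the tool's Visual C++ implementation:

    ```python
    # PCA in brief: the principal components are the right singular vectors
    # of the mean-centered data matrix.
    import numpy as np

    def pca(X, k):
        """Return the k leading components and the data projected onto them."""
        Xc = X - X.mean(axis=0)
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Vt[:k], Xc @ Vt[:k].T
    ```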

  10. Concepts and Tools for the Software Life Cycle

    Science.gov (United States)

    Tausworthe, R. C.

    1985-01-01

    The tools, techniques, and aids needed to engineer, manage, and administer a large software-intensive task are themselves parts of a large software base, and are acquired only at great expense. The needs of the software life cycle in terms of such supporting tools and methodologies are highlighted. The concept of a distributed network for engineering, management, and administrative functions is outlined, and the key characteristics of localized subnets in high-communications-traffic areas of software activity are discussed. A formal, deliberate, structured, systems-engineering approach to the construction of a uniform, coordinated tool set is proposed as a means to reduce development and maintenance costs, foster adaptability, enhance reliability, and promote standardization.

  11. Software Tools for High-Performance Computiing: Survey and Recommendations

    Directory of Open Access Journals (Sweden)

    Bill Appelbe

    1996-01-01

    Applications programming for high-performance computing is notoriously difficult. Although parallel programming is intrinsically complex, the principal reason why high-performance computing is difficult is the lack of effective software tools. We believe that the lack of tools in turn is largely due to market forces rather than our inability to design and build such tools. Unfortunately, the poor availability and utilization of parallel tools hurt the entire supercomputing industry and the U.S. high performance computing initiative, which is focused on applications. A disproportionate amount of resources is being spent on faster hardware and architectures, while tools are being neglected. This article introduces a taxonomy of tools, analyzes the major factors that contribute to this situation, and suggests ways that the imbalance could be redressed and the likely evolution of tools.

  12. PAnalyzer: A software tool for protein inference in shotgun proteomics

    Directory of Open Access Journals (Sweden)

    Prieto Gorka

    2012-11-01

    Background: Protein inference from peptide identifications in shotgun proteomics must deal with ambiguities that arise due to the presence of peptides shared between different proteins, which is common in higher eukaryotes. Recently, data independent acquisition (DIA) approaches have emerged as an alternative to the traditional data dependent acquisition (DDA) in shotgun proteomics experiments. MSE is the term used to name one of the DIA approaches used in QTOF instruments. MSE data require specialized software to process acquired spectra and to perform peptide and protein identifications. However, the software available at the moment does not group the identified proteins in a transparent way by taking into account peptide evidence categories. Furthermore, the inspection, comparison and reporting of the obtained results require tedious manual intervention. Here we report a software tool to address these limitations for MSE data. Results: We present PAnalyzer, a software tool focused on the protein inference process of shotgun proteomics. Our approach considers all the identified proteins and groups them when necessary, indicating their confidence using different evidence categories. PAnalyzer can read protein identification files in the XML output format of the ProteinLynx Global Server (PLGS) software provided by Waters Corporation for their MSE data, and also in the mzIdentML format recently standardized by HUPO-PSI. Multiple files can also be read simultaneously and are considered as technical replicates. Results are saved to CSV, HTML and mzIdentML (in the case of a single mzIdentML input file) files. An MSE analysis of a real sample is presented to compare the results of PAnalyzer and ProteinLynx Global Server. Conclusions: We present a software tool to deal with the ambiguities that arise in the protein inference process. Key contributions are support for MSE data analysis by ProteinLynx Global Server and technical replicates...
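
    PAnalyzer's evidence categories are richer than can be shown here; the sketch below illustrates only the basic grouping step of protein inference, merging proteins whose identified peptide sets are identical and therefore indistinguishable by the evidence (the example data are invented):

    ```python
    # Basic grouping step of protein inference: proteins with identical
    # identified peptide sets cannot be distinguished and form one group.
    from collections import defaultdict

    def group_indistinguishable(protein_peptides):
        """protein_peptides: {protein_id: set of identified peptide sequences}."""
        groups = defaultdict(list)
        for protein, peptides in protein_peptides.items():
            groups[frozenset(peptides)].append(protein)
        return list(groups.values())

    evidence = {"P1": {"AAK", "GLR"}, "P2": {"AAK", "GLR"}, "P3": {"AAK"}}
    print(group_indistinguishable(evidence))  # [['P1', 'P2'], ['P3']]
    ```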

  13. Automotive Software Engineering. Fundamentals, processes, methods, tools; Automotive Software Engineering. Grundlagen, Prozesse, Methoden und Werkzeuge

    Energy Technology Data Exchange (ETDEWEB)

    Schaeuffele, J.; Zurawka, T.

    2003-07-01

    The book presents fundamentals and practical examples of processes, methods and tools to ensure the safe operation of electronic systems and software in motor vehicles. The focus is on the electronic systems of the powertrain, suspension and car body. Contents: The overall system of car, driver and environment; Fundamentals; Processes for development of electronic systems and software; Methods and tools for the development, production and servicing of electronic systems. The book addresses staff members of motor car producers and suppliers of electronic systems and software, as well as students of computer science, electrical and mechanical engineering specializing in car engineering, control engineering, mechatronics and software engineering.

  14. The Systems Biology Research Tool: evolvable open-source software

    Directory of Open Access Journals (Sweden)

    Wright Jeremiah

    2008-06-01

    Background: Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions at the system level. Results: We introduce a free, easy-to-use, open-source, integrated software platform called the Systems Biology Research Tool (SBRT) to facilitate the computational aspects of systems biology. The SBRT currently performs 35 methods for analyzing stoichiometric networks and 16 methods from fields such as graph theory, geometry, algebra, and combinatorics. New computational techniques can be added to the SBRT via process plug-ins, providing a high degree of evolvability and a unifying framework for software development in systems biology. Conclusion: The Systems Biology Research Tool represents a technological advance for systems biology. This software can be used to make sophisticated computational techniques accessible to everyone (including those with no programming ability), to facilitate cooperation among researchers, and to expedite progress in the field of systems biology.

  15. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify the relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. To assess its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), an area of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
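
    Knickpoint Finder implements the Hack (1973) approach on the ArcGIS platform; the simplified sketch below is not the tool's published method, only a minimal slope-break detector along a stream's longitudinal profile extracted from a DEM (the median-multiple threshold is an assumption):

    ```python
    # Simplified knickpoint detector: flag profile samples whose local channel
    # slope exceeds a multiple of the median slope along the profile.
    import numpy as np

    def find_knickpoints(distance, elevation, break_factor=3.0):
        """distance, elevation: 1-D arrays sampled along the drainage profile."""
        slope = np.abs(np.gradient(elevation, distance))
        return np.flatnonzero(slope > break_factor * (np.median(slope) + 1e-12))
    ```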

  16. Software Tools: A One-Semester Secondary School Computer Course.

    Science.gov (United States)

    Bromley, John; Lakatos, John

    1985-01-01

    Provides a course outline, describes equipment and teacher requirements, discusses student evaluation and course outcomes, and details the computer programs used in a high school course. The course is designed to teach students use of the microcomputer as a tool through hands-on experience with a variety of commercial software programs. (MBR)

  17. Metabolic interrelationships software application: Interactive learning tool for intermediary metabolism

    NARCIS (Netherlands)

    A.J.M. Verhoeven (Adrie); M. Doets (Mathijs); J.M.J. Lamers (Jos); J.F. Koster (Johan)

    2005-01-01

    We developed and implemented the software application titled Metabolic Interrelationships as a self-learning and -teaching tool for intermediary metabolism. It is used by undergraduate medical students in an integrated organ systems-based and disease-oriented core curriculum, which start...

  18. Metamodelling Approach and Software Tools for Physical Modelling and Simulation

    Directory of Open Access Journals (Sweden)

    Vitaliy Mezhuyev

    2015-02-01

    In computer science, the metamodelling approach is becoming increasingly popular for software systems development. In this paper, we discuss the applicability of the metamodelling approach to the development of software tools for physical modelling and simulation. To define a metamodel for physical modelling, an analysis of physical models is carried out. The results of this analysis reveal the invariant physical structures that we propose to use as the basic abstractions of the physical metamodel. This is a system of geometrical objects allowing one to build the spatial structure of physical models and to set a distribution of physical properties. To such a geometry of distributed physical properties, different mathematical methods can be applied. To validate the proposed metamodelling approach, we consider developed prototypes of software tools.

  19. A Brief Review of Software Tools for Pangenomics

    Institute of Scientific and Technical Information of China (English)

    Jingfa Xiao; Zhewen Zhang; Jiayan Wu; Jun Yu

    2015-01-01

    Since the proposal for pangenomic study, there have been a dozen software tools actively in use for pangenomic analysis. By the end of 2014, Panseq and the pan-genomes analysis pipeline (PGAP) ranked as the top two most popular packages according to cumulative citations of peer-reviewed scientific publications. The functions of the software packages and tools, albeit variable among them, include categorizing orthologous genes, calculating pangenomic profiles, integrating gene annotations, and constructing phylogenies. As epigenomic elements are being gradually revealed in prokaryotes, it is expected that pangenomic databases and toolkits will have to be extended to handle information of detailed functional annotations for genes and non-protein-coding sequences, including non-coding RNAs, insertion elements, and conserved structural elements. To develop better bioinformatic tools, user feedback and integration of novel features are both of the essence.

  20. A brief review of software tools for pangenomics.

    Science.gov (United States)

    Xiao, Jingfa; Zhang, Zhewen; Wu, Jiayan; Yu, Jun

    2015-02-01

    Since the proposal for pangenomic study, there have been a dozen software tools actively in use for pangenomic analysis. By the end of 2014, Panseq and the pan-genomes analysis pipeline (PGAP) ranked as the top two most popular packages according to cumulative citations of peer-reviewed scientific publications. The functions of the software packages and tools, albeit variable among them, include categorizing orthologous genes, calculating pangenomic profiles, integrating gene annotations, and constructing phylogenies. As epigenomic elements are being gradually revealed in prokaryotes, it is expected that pangenomic databases and toolkits will have to be extended to handle information of detailed functional annotations for genes and non-protein-coding sequences, including non-coding RNAs, insertion elements, and conserved structural elements. To develop better bioinformatic tools, user feedback and integration of novel features are both of the essence.

  1. A Brief Review of Software Tools for Pangenomics

    Directory of Open Access Journals (Sweden)

    Jingfa Xiao

    2015-02-01

    Full Text Available Since the proposal for pangenomic study, there have been a dozen software tools actively in use for pangenomic analysis. By the end of 2014, Panseq and the pan-genomes analysis pipeline (PGAP) ranked as the top two most popular packages according to cumulative citations of peer-reviewed scientific publications. The functions of the software packages and tools, albeit variable among them, include categorizing orthologous genes, calculating pangenomic profiles, integrating gene annotations, and constructing phylogenies. As epigenomic elements are being gradually revealed in prokaryotes, it is expected that pangenomic databases and toolkits will have to be extended to handle information of detailed functional annotations for genes and non-protein-coding sequences, including non-coding RNAs, insertion elements, and conserved structural elements. To develop better bioinformatic tools, user feedback and integration of novel features are both of the essence.

  2. Software tools at the Rome CMS/ECAL Regional Center

    CERN Document Server

    Organtini, G

    2001-01-01

    The construction of the CMS electromagnetic calorimeter is under way in Rome and at CERN. To this purpose, two Regional Centers were set up in both sites. In Rome, the project was entirely carried out using new software technologies such as object oriented programming, object databases, CORBA programming and Web tools. It can be regarded as a use case for the evaluation of the benefits of new software technologies in high energy physics. Our experience is positive and encouraging for the future. (10 refs).

  3. COSTMODL: An automated software development cost estimation tool

    Science.gov (United States)

    Roush, George B.

    1991-01-01

    The cost of developing computer software continues to consume an increasing portion of many organizations' total budgets, in both the public and private sectors. As this trend develops, the capability to produce reliable estimates of the effort and schedule required to develop a candidate software product takes on increasing importance. The COSTMODL program was developed to provide an in-house capability to perform development cost estimates for NASA software projects. COSTMODL is an automated software development cost estimation tool which incorporates five cost estimation algorithms including the latest models for the Ada language and incrementally developed products. The principal characteristic which sets COSTMODL apart from other software cost estimation programs is its capacity to be completely customized to a particular environment. The estimation equations can be recalibrated to reflect the programmer productivity characteristics demonstrated by the user's organization, and the set of significant factors which affect software development costs can be customized to reflect any unique properties of the user's development environment. Careful use of a capability such as COSTMODL can significantly reduce the risk of cost overruns and failed projects.

  4. Software Tools to Support the Assessment of System Health

    Science.gov (United States)

    Melcher, Kevin J.

    2013-01-01

    This presentation provides an overview of three software tools that were developed by the NASA Glenn Research Center to support the assessment of system health: the Propulsion Diagnostic Method Evaluation Strategy (ProDIMES), the Systematic Sensor Selection Strategy (S4), and the Extended Testability Analysis (ETA) tool. Originally developed to support specific NASA projects in aeronautics and space, these software tools are currently available to U.S. citizens through the NASA Glenn Software Catalog. The ProDIMES software tool was developed to support a uniform comparison of propulsion gas path diagnostic methods. Methods published in the open literature are typically applied to dissimilar platforms with different levels of complexity. They often address different diagnostic problems and use inconsistent metrics for evaluating performance. As a result, it is difficult to perform a one-to-one comparison of the various diagnostic methods. ProDIMES solves this problem by serving as a theme problem to aid in propulsion gas path diagnostic technology development and evaluation. The overall goal is to provide a tool that will serve as an industry standard and will truly facilitate the development and evaluation of significant Engine Health Management (EHM) capabilities. ProDIMES has been developed under a collaborative project of The Technical Cooperation Program (TTCP) based on feedback provided by individuals within the aircraft engine health management community. The S4 software tool provides a framework that supports the optimal selection of sensors for health management assessments. S4 is structured to accommodate user-defined applications, diagnostic systems, search techniques, and system requirements/constraints. The result is one or more sensor suites that maximize diagnostic performance while meeting other user-defined system requirements that are presumed to exist. S4 provides a systematic approach for evaluating combinations of sensors to determine the set or sets of
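
    The sensor-selection problem that S4 addresses can be illustrated with a generic greedy set-cover sketch; this is not the S4 algorithm, and the fault-signature matrix below is hypothetical.

    ```python
    # Illustrative sketch only (not S4): greedy selection of a sensor suite
    # that maximizes fault coverage under a sensor-count budget.
    # detect[f][s] is truthy if sensor s can detect fault f (invented input).

    def select_sensors(detect, budget):
        faults = range(len(detect))
        sensors = range(len(detect[0]))
        chosen, covered = [], set()
        while len(chosen) < budget:
            best, best_gain = None, 0
            for s in sensors:
                if s in chosen:
                    continue
                gain = sum(1 for f in faults if f not in covered and detect[f][s])
                if gain > best_gain:
                    best, best_gain = s, gain
            if best is None:            # no remaining sensor adds coverage
                break
            chosen.append(best)
            covered.update(f for f in faults if detect[f][best])
        return chosen, covered

    detect = [[1, 0, 0], [1, 1, 0], [0, 0, 1], [0, 1, 1]]
    suite, covered = select_sensors(detect, budget=2)
    print(suite, covered)   # [0, 2], covering all four faults
    ```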

  5. Software Tools for Electrical Quality Assurance in the LHC

    CERN Document Server

    Bednarek, Mateusz

    2011-01-01

    There are over 1600 superconducting magnet circuits in the LHC machine. Many of them consist of a large number of components electrically connected in series. This enhances the sensitivity of the whole circuit to electrical faults in individual components. Furthermore, the circuits are equipped with a large number of instrumentation wires, which are exposed to accidental damage or swapping. In order to ensure safe operation, an Electrical Quality Assurance (ELQA) campaign is needed after each thermal cycle. Due to the complexity of the circuits, as well as their distant geographical distribution (a tunnel of 27 km circumference divided into 8 sectors), suitable software and hardware platforms had to be developed. The software combines an Oracle database, LabView data acquisition applications and PHP-based web follow-up tools. This paper describes the software used for the ELQA of the LHC.

  6. Development of the software generation method using model driven software engineering tool

    Energy Technology Data Exchange (ETDEWEB)

    Jang, H. S.; Jeong, J. C.; Kim, J. H.; Han, H. W.; Kim, D. Y.; Jang, Y. W. [KOPEC, Taejon (Korea, Republic of); Moon, W. S. [NEXTech Inc., Seoul (Korea, Republic of)

    2003-10-01

    This work develops methodologies for generating automated software design specifications and source code for nuclear I&C systems software using a model-driven language. For qualitative analysis of the algorithm, the activity diagram is modeled and generated using the Unified Modeling Language (UML), and the sequence diagram is then designed for automated source code generation. To validate the generated code, code audits and module tests are performed using a test and QA tool; the code coverage and complexity of the example code are examined in this stage. The low-pressure pressurizer reactor trip module of the Plant Protection System was programmed as the subject for this task. The test results show that errors in the generated source code were easily detected using the test tool, and the accuracy of input/output processing by the execution modules was clearly verified.
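
    To make the idea of model-driven code generation concrete, here is a toy sketch that renders a trip-logic function from a declarative model; the module name, fields and setpoint are hypothetical and do not come from the toolchain described above.

    ```python
    # Toy illustration of model-driven code generation: a declarative model is
    # rendered into executable source code via a template. All values invented.
    TEMPLATE = '''def {name}(value):
        """Auto-generated trip logic: trip when value {op} {setpoint}."""
        return value {op} {setpoint}
    '''

    def generate(model):
        """Render the model dictionary into Python source code."""
        return TEMPLATE.format(**model)

    model = {"name": "low_pressure_trip", "op": "<", "setpoint": 118.0}
    source = generate(model)
    print(source)                        # inspect the generated code
    exec(source)                         # compile it into this namespace
    print(low_pressure_trip(110.0))      # True: below the setpoint, trip
    ```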

  7. Constructing an advanced software tool for planetary atmospheric modeling

    Science.gov (United States)

    Keller, Richard M.; Sims, Michael; Podolak, Ester; Mckay, Christopher

    1990-01-01

    Scientific model building can be an intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot be easily distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. In this paper, we describe a prototype for a scientific modeling software tool that serves as an aid to the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high level domain specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities. Our prototype has been developed in the domain of planetary atmospheric modeling, and is being used to construct models of Titan's atmosphere.

  8. The RUMBA software: tools for neuroimaging data analysis.

    Science.gov (United States)

    Bly, Benjamin Martin; Rebbechi, Donovan; Hanson, Stephen Jose; Grasso, Giorgio

    2004-01-01

    The enormous scale and complexity of data sets in functional neuroimaging make it crucial to have well-designed and flexible software for image processing, modeling, and statistical analysis. At present, researchers must choose between general-purpose scientific computing environments (e.g., Splus and Matlab), and specialized human brain mapping packages that implement particular analysis strategies (e.g., AFNI, SPM, VoxBo, FSL or FIASCO). For the vast majority of users in Human Brain Mapping and Cognitive Neuroscience, general-purpose computing environments provide an insufficient framework for a complex data-analysis regime. On the other hand, the operational particulars of more specialized neuroimaging analysis packages are difficult or impossible to modify and provide little transparency or flexibility to the user for approaches other than massively multiple comparisons based on inferential statistics derived from linear models. In order to address these problems, we have developed open-source software that allows a wide array of data analysis procedures. The RUMBA software includes programming tools that simplify the development of novel methods, and accommodates data in several standard image formats. A scripting interface, along with programming libraries, defines a number of useful analytic procedures and provides an interface to data analysis procedures. The software also supports a graphical functional programming environment for implementing data analysis streams based on modular functional components. With these features, the RUMBA software provides researchers with programmability, reusability, modular analysis tools, novel data analysis streams, and an analysis environment in which multiple approaches can be contrasted and compared. The RUMBA software retains the flexibility of general scientific computing environments while adding a framework in which both experts and novices can develop and adapt neuroimaging-specific analyses.

  9. Northwestern University Schizophrenia Data and Software Tool (NUSDAST)

    OpenAIRE

    Wang, Lei; Kogan, Alex; Cobia, Derin; Alpert, Kathryn; Kolasny, Anthony; Miller, Michael I.; Marcus, Daniel

    2013-01-01

    The schizophrenia research community has invested substantial resources in collecting, managing and sharing large neuroimaging datasets. As part of this effort, our group has collected high resolution magnetic resonance (MR) datasets from individuals with schizophrenia, their non-psychotic siblings, healthy controls and their siblings. This effort has resulted in a growing resource, the Northwestern University Schizophrenia Data and Software Tool (NUSDAST), an NIH-funded data sharing project ...

  10. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens

    2004-01-01

    This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended for use as part of an industrial systems design cycle. Structural analysis is a graph-based technique where principal relations between variables express the system...... of the graph. SaTool analyses the structure graph to provide knowledge about fundamental properties of the system in normal and faulty conditions. Salient features of SaTool include rapid analysis of the possibility to diagnose faults and of the ability to make autonomous recovery should faults occur.

  11. Software Information Base(SIB)and Its Integration with Data Flow Diagram(DFD)Tool

    Institute of Scientific and Technical Information of China (English)

    董士海

    1989-01-01

    The software information base is the main technique for the integration of a software engineering environment, and the data flow diagram tool is an important software tool supporting the software requirements analysis phase. This article first introduces the functions and structure of a Software Information Base (SIB) and a data flow diagram tool. The E-R data model of the SIB and its integration with the data flow diagram tool are then described in detail.

  12. Regulatory network operations in the Pathway Tools software

    Directory of Open Access Journals (Sweden)

    Paley Suzanne M

    2012-09-01

    Full Text Available Abstract Background Biologists are elucidating complex collections of genetic regulatory data for multiple organisms. Software is needed to store and manipulate such regulatory network data. Results The Pathway Tools software supports storage and manipulation of regulatory information through a variety of strategies. The Pathway Tools regulation ontology captures transcriptional and translational regulation, substrate-level regulation of enzyme activity, post-translational modifications, and regulatory pathways. Regulatory visualizations include a novel diagram that summarizes all regulatory influences on a gene; a transcription-unit diagram; and an interactive visualization of a full transcriptional regulatory network that can be painted with gene expression data to probe correlations between gene expression and regulatory mechanisms. We introduce a novel type of enrichment analysis that asks whether a gene-expression dataset is over-represented for known regulators. We present algorithms for ranking the degree of regulatory influence of genes, and for computing the net positive and negative regulatory influences on a gene. Conclusions Pathway Tools provides a comprehensive environment for manipulating molecular regulatory interactions that integrates regulatory data with an organism’s genome and metabolic network. Curated collections of regulatory data authored using Pathway Tools are available for Escherichia coli, Bacillus subtilis, and Shewanella oneidensis.
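
    The regulator-enrichment question described above can be posed with a standard hypergeometric test, as in this sketch; this is generic code, not Pathway Tools, and the gene sets and genome size are invented.

    ```python
    # Sketch of regulator-enrichment analysis: is a hit list over-represented
    # for the known targets (regulon) of one regulator? Hypergeometric test.
    from scipy.stats import hypergeom

    def regulator_enrichment(hits, regulon, genome_size):
        """P-value that `hits` contains >= the observed regulon members by chance."""
        overlap = len(hits & regulon)
        # survival function at overlap-1 gives P(X >= overlap)
        return hypergeom.sf(overlap - 1, genome_size, len(regulon), len(hits))

    hits = {"g1", "g2", "g3", "g7"}        # differentially expressed genes
    regulon = {"g1", "g2", "g3", "g9"}     # known targets of one regulator
    print(regulator_enrichment(hits, regulon, genome_size=4000))
    ```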

  13. Classroom Live: a software-assisted gamification tool

    Science.gov (United States)

    de Freitas, Adrian A.; de Freitas, Michelle M.

    2013-06-01

    Teachers have come to rely on a variety of approaches in order to elicit and sustain student interest in the classroom. One particular approach, known as gamification, seeks to improve student engagement by transforming the traditional classroom experience into a competitive multiplayer game. Initial attempts at classroom gamification relied on the teacher manually tracking student progress. At the US Air Force Academy, we wanted to experiment with a software gamification tool. Our client/server suite, dubbed Classroom Live, streamlines the gamification process for the teacher by simplifying common tasks. Simultaneously, the tool provides students with an esthetically pleasing user interface that offers in-game rewards in exchange for their participation. Classroom Live is still in development, but our initial experience using the tool has been extremely positive and confirms our belief that students respond positively to gamification, even at the undergraduate level.

  14. A NEO population generation and observation simulation software tool

    Science.gov (United States)

    Müller, Sven; Gelhaus, Johannes; Hahn, Gerhard; Franco, Raffaella

    One of the main targets of ESA's Space Situational Awareness (SSA) program is to build a wide knowledge base about objects that can potentially harm Earth (Near-Earth Objects, NEOs). An important part of this effort is to create the Small Bodies Data Centre (SBDC), which is going to aggregate measurement data from a fully-integrated NEO observation sensor network. Until this network is developed, artificial NEO measurement data is needed in order to validate SBDC algorithms. Moreover, to establish a functioning NEO observation sensor network, it has to be determined where to place sensors, what technical requirements have to be met in order to be able to detect NEOs, and which observation strategies work best. For these reasons, a sensor simulation software was needed. This paper presents a software tool which allows users to create and analyse NEO populations and to simulate and analyse population observations. It is a console program written in Fortran and comes with a Graphical User Interface (GUI) written in Java and C. The tool comprises two components, the "Population Generator" and the "Observation Simulator". The Population Generator component is responsible for generating and analysing a NEO population. Users can choose between creating fictitious (random) and synthetic populations. The latter are based on one of two models describing the orbital and size distribution of observed NEOs: the existing so-called "Bottke Model" (Bottke et al. 2000, 2002) and the new "Granvik Model" (Granvik et al. 2014, in preparation), which has been developed in parallel to the tool. Generated populations can be analysed by defining 2D, 3D and scatter plots using various NEO attributes. As a result, the tool creates the appropriate files for the plotting tool "gnuplot". The tool's Observation Simulator component provides the Observation Simulation and Observation Analysis functions. Users can define sensor systems using ground- or space-based locations as well as

  15. Tools and Behavior Abstraction: A Future for Software Engineering

    Directory of Open Access Journals (Sweden)

    Wilson Solís

    2012-06-01

    Full Text Available Software engineers rely on tools to automatically analyze code and design specifications in detail. Although many such tools are still used only to find new defects in old code, they are expected to find wider application in software engineering in the future and to be available to developers while they edit their products. If tools could be built quickly enough and made easy to use, software engineers would apply them to improve design and product development. To solve problems, traditional engineering relies on programming languages; however, the level of abstraction of the most popular languages is not much greater than that of C programs from several decades ago. Moreover, this level is the same throughout the code and leaves no room for abstraction of behavior, in which the design is divided into phases and more detail is introduced gradually. This article presents a study of the need for a larger set of analysis tools to create languages and development environments that provide good support for this abstraction.

  16. Software tool for horizontal-axis wind turbine simulation

    Energy Technology Data Exchange (ETDEWEB)

    Vitale, A.J. [Instituto Argentino de Oceanografia, Camino La Carrindanga Km. 7, 5 CC 804, B8000FWB Bahia Blanca (Argentina); Universidad Tecnologica Nacional Facultad Regional Bahia Blanca, GESE, 11 de Abril 461, B8000LMI Bahia Blanca (Argentina); Dpto. de Ing. Electrica y de Computadoras, Universidad Nacional del Sur, Av. Alem 1253, 8000 Bahia Blanca (Argentina); Rossi, A.P. [Universidad Tecnologica Nacional Facultad Regional Bahia Blanca, GESE, 11 de Abril 461, B8000LMI Bahia Blanca (Argentina); Dpto. de Ing. Electrica y de Computadoras, Universidad Nacional del Sur, Av. Alem 1253, 8000 Bahia Blanca (Argentina)

    2008-07-15

    The main problem of a wind turbine generator design project is the design of blades capable of satisfying the specific energy requirement of an electric system with optimum performance. Once the blade has been designed for optimum operation at a particular rotor angular speed, it is necessary to determine the overall performance of the rotor under the range of wind speeds that it will encounter. A software tool that simulates low-power, horizontal-axis wind turbines was developed for this purpose. With this program, the user can calculate the rotor power output for any combination of wind and rotor speeds, for a given blade shape and airfoil characteristics. The software also provides information about the distribution of forces along the blade span for different operational conditions. (author)
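
    A minimal sketch of the rotor power calculation such a simulator performs, assuming the textbook relation P = 0.5 * rho * A * v^3 * Cp(lambda); the Cp curve and operating numbers below are placeholders, not data from the cited tool.

    ```python
    # Sketch: rotor power output for combinations of wind and rotor speed.
    import math

    RHO = 1.225                      # air density, kg/m^3

    def cp(tip_speed_ratio):
        """Placeholder power coefficient curve peaking near lambda = 7."""
        return max(0.0, 0.45 - 0.012 * (tip_speed_ratio - 7.0) ** 2)

    def rotor_power(wind_speed, rotor_speed_rpm, radius):
        omega = rotor_speed_rpm * 2.0 * math.pi / 60.0   # rad/s
        lam = omega * radius / wind_speed                # tip-speed ratio
        area = math.pi * radius ** 2                     # swept area, m^2
        return 0.5 * RHO * area * wind_speed ** 3 * cp(lam)

    for v in (4.0, 6.0, 8.0, 10.0):                      # wind speeds, m/s
        print(v, round(rotor_power(v, rotor_speed_rpm=120, radius=2.5), 1))
    ```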

  17. Northwestern University Schizophrenia Data and Software Tool (NUSDAST).

    Science.gov (United States)

    Wang, Lei; Kogan, Alex; Cobia, Derin; Alpert, Kathryn; Kolasny, Anthony; Miller, Michael I; Marcus, Daniel

    2013-01-01

    The schizophrenia research community has invested substantial resources in collecting, managing and sharing large neuroimaging datasets. As part of this effort, our group has collected high resolution magnetic resonance (MR) datasets from individuals with schizophrenia, their non-psychotic siblings, healthy controls and their siblings. This effort has resulted in a growing resource, the Northwestern University Schizophrenia Data and Software Tool (NUSDAST), an NIH-funded data sharing project to stimulate new research. This resource resides on XNAT Central, and it contains neuroimaging (MR scans, landmarks and surface maps for deep subcortical structures, and FreeSurfer cortical parcellation and measurement data), cognitive (cognitive domain scores for crystallized intelligence, working memory, episodic memory, and executive function), clinical (demographic, sibling relationship, SAPS and SANS psychopathology), and genetic (20 polymorphisms) data, collected from more than 450 subjects, most with 2-year longitudinal follow-up. A neuroimaging mapping, analysis and visualization software tool, CAWorks, is also part of this resource. Moreover, in making our existing neuroimaging data along with the associated meta-data and computational tools publicly accessible, we have established a web-based information retrieval portal that allows the user to efficiently search the collection. This research-ready dataset meaningfully combines neuroimaging data with other relevant information, and it can be used to help facilitate advancing neuroimaging research. It is our hope that this effort will help to overcome some of the commonly recognized technical barriers in advancing neuroimaging research such as lack of local organization and standard descriptions.

  18. Northwestern University Schizophrenia Data and Software Tool (NUSDAST

    Directory of Open Access Journals (Sweden)

    Lei eWang

    2013-11-01

    Full Text Available The schizophrenia research community has invested substantial resources in collecting, managing and sharing large neuroimaging datasets. As part of this effort, our group has collected high resolution magnetic resonance (MR) datasets from individuals with schizophrenia, their non-psychotic siblings, healthy controls and their siblings. This effort has resulted in a growing resource, the Northwestern University Schizophrenia Data and Software Tool (NUSDAST), an NIH-funded data sharing project to stimulate new research. This resource resides on XNAT Central, and it contains neuroimaging (MR scans, landmarks and surface maps for deep subcortical structures, and FreeSurfer cortical parcellation and measurement data), cognitive (cognitive domain scores for crystallized intelligence, working memory, episodic memory, and executive function), clinical (demographic, sibling relationship, SAPS and SANS psychopathology), and genetic (20 polymorphisms) data, collected from more than 450 subjects, most with 2-year longitudinal follow-up. A neuroimaging mapping, analysis and visualization software tool, CAWorks, is also part of this resource. Moreover, in making our existing neuroimaging data along with the associated meta-data and computational tools publicly accessible, we have established a web-based information retrieval portal that allows the user to efficiently search the collection. This research-ready dataset meaningfully combines neuroimaging data with other relevant information, and it can be used to help facilitate advancing neuroimaging research. It is our hope that this effort will help to overcome some of the commonly recognized technical barriers in advancing neuroimaging research such as lack of local organization and standard descriptions.

  19. Object-Oriented Software Tools for the Construction of Preconditioners

    Directory of Open Access Journals (Sweden)

    Eva Mossberg

    1997-01-01

    Full Text Available In recent years, there has been considerable progress concerning preconditioned iterative methods for large and sparse systems of equations arising from the discretization of differential equations. Such methods are particularly attractive in the context of high-performance (parallel) computers. However, the implementation of a preconditioner is a nontrivial task. The focus of the present contribution is on a set of object-oriented software tools that support the construction of a family of preconditioners based on fast transforms. By combining objects of different classes, it is possible to conveniently construct any preconditioner within this family.

  20. QUALITY SERVICES EVALUATION MODEL BASED ON DEDICATED SOFTWARE TOOL

    Directory of Open Access Journals (Sweden)

    ANDREEA CRISTINA IONICĂ

    2012-10-01

    Full Text Available In this paper we introduce a new model, called Service Quality (SQ), which combines the QFD and SERVQUAL methods. This model takes from the SERVQUAL method the five dimensions of requirements and three dimensions of characteristics, and from the QFD method the application methodology. The originality of the SQ model consists in computing a global index that reflects the degree to which the quality characteristics fulfil the customers' requirements. To demonstrate the viability of the SQ model, a software tool was developed and applied to the evaluation of a health care services provider.
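
    The global index idea can be sketched as a weighted aggregation of SERVQUAL dimension scores; this is a plausible illustration under stated assumptions, not the published SQ formula, and the weights and scores below are invented.

    ```python
    # Sketch of a global quality index: requirement scores (1-5 survey scale)
    # weighted by importance, e.g. derived from a QFD relationship matrix.
    weights = {"tangibles": 0.15, "reliability": 0.30, "responsiveness": 0.20,
               "assurance": 0.20, "empathy": 0.15}          # must sum to 1
    scores = {"tangibles": 4.1, "reliability": 3.6, "responsiveness": 3.9,
              "assurance": 4.4, "empathy": 4.0}             # survey averages

    global_index = sum(weights[d] * scores[d] for d in weights)
    print(f"global quality index: {global_index:.2f} / 5")
    ```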

  1. Software Tools for In-Situ Documentation of Built Heritage

    Science.gov (United States)

    Smars, P.

    2013-07-01

    The paper presents open source software tools developed by the author to facilitate in-situ documentation of architectural and archæological heritage. The design choices are exposed and related to a general issue in conservation and documentation: taking decisions about a valuable object under threat. The question of the level of objectivity is central to the three steps of this process. It is our belief that in-situ documentation has to be favoured in this demanding context, full of potential discoveries. The very powerful surveying techniques in rapid development nowadays enhance our vision but often tend to bring a critical part of the documentation process back to the office. The software presented facilitates direct treatment of the data on the site. Emphasis is given to flexibility, interoperability and simplicity. Key features of the software are listed and illustrated with examples (3D model of Gothic vaults, analysis of the shape of a column, deformation of a wall, direct interaction with AutoCAD).
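
    One analysis mentioned above, the shape of a column, can be illustrated with a generic algebraic least-squares circle fit (the Kasa method); this is a standard technique, not code from the author's tools, and the survey points are synthetic.

    ```python
    # Kasa circle fit: (x-a)^2 + (y-b)^2 = r^2 rearranges to the linear system
    # 2ax + 2by + c = x^2 + y^2 with c = r^2 - a^2 - b^2.
    import numpy as np

    def fit_circle(x, y):
        """Return centre (a, b) and radius r of the best-fit circle."""
        A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
        rhs = x ** 2 + y ** 2
        (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
        return a, b, np.sqrt(c + a ** 2 + b ** 2)

    # Noisy points on a column of radius ~0.45 m centred at (2, 3).
    np.random.seed(0)
    theta = np.linspace(0, 2 * np.pi, 40, endpoint=False)
    x = 2 + 0.45 * np.cos(theta) + np.random.normal(0, 0.003, theta.size)
    y = 3 + 0.45 * np.sin(theta) + np.random.normal(0, 0.003, theta.size)
    print(fit_circle(x, y))   # approximately (2.0, 3.0, 0.45)
    ```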

  2. The Software Improvement Process - Tools And Rules To Encourage Quality

    CERN Document Server

    Sigerud, K

    2011-01-01

    The Applications section of the CERN accelerator Controls group has decided to apply a systematic approach to quality assurance (QA), the “Software Improvement Process”, SIP. This process focuses on three areas: the development process itself, suitable QA tools, and how to practically encourage developers to do QA. For each stage of the development process we have agreed on the recommended activities and deliverables, and identified tools to automate and support the task. For example we do more code reviews. As peer reviews are resource-intensive, we only do them for complex parts of a product. As a complement, we are using static code checking tools, like FindBugs and Checkstyle. We also encourage unit testing and have agreed on a minimum level of test coverage recommended for all products, measured using Clover. Each of these tools is well integrated with our IDE (Eclipse) and give instant feedback to the developer about the quality of their code. The major challenges of SIP have been to 1) agree on com...

  3. Software tools to aid Pascal and Ada program design

    Energy Technology Data Exchange (ETDEWEB)

    Jankowitz, H.T.

    1987-01-01

    This thesis describes a software tool which analyses the style and structure of Pascal and Ada programs by ensuring that some minimum design requirements are fulfilled. The tool is used in much the same way as a compiler is used to teach students the syntax of a language, only in this case issues related to the design and structure of the program are of paramount importance. The tool operates by analyzing the design and structure of a syntactically correct program, automatically generating a report detailing changes that need to be made in order to ensure that the program is structurally sound. The author discusses how the model gradually evolved from a plagiarism detection system, which extracted several measurable characteristics of a program, to a model that analyzed the style of Pascal programs. In order to incorporate more sophisticated concepts like data abstraction, information hiding and data protection, this model was then extended to analyze the composition of Ada programs. The Ada model takes full advantage of facilities offered in the language, and by using this tool the standard and quality of written programs are raised whilst the fundamental principles of program design are grasped through a process of self-tuition.

  4. Evaluating, selecting and relevance software tools in technology monitoring

    Directory of Open Access Journals (Sweden)

    Óscar Fernando Castellanos Domínguez

    2010-07-01

    Full Text Available The current setting for industrial and entrepreneurial development has created the need to incorporate differentiating elements into the production apparatus that lead to anticipating technological change. Technology monitoring (TM) emerges as a methodology focused on analysing these changes to identify challenges and opportunities, mainly supported by information technology (IT), through the search for, capture and analysis of data and information. This article proposes criteria for choosing and efficiently using software tools having different characteristics, requirements, capacities and costs which could be used in monitoring. Different TM models are examined, emphasising the identification and analysis of the different information sources that support monitoring. Some evaluation, selection and analysis criteria are given for using these types of tools according to each production system's individual profile and needs. Some of the software packages available on the market for carrying out monitoring projects are described, relating them to their complexity, process characteristics and cost.

  5. A Software Tool for Quantitative Seismicity Analysis - ZMAP

    Science.gov (United States)

    Wiemer, S.; Gerstenberger, M.

    2001-12-01

    Earthquake catalogs are probably the most basic product of seismology, and remain arguably the most useful for tectonic studies. Modern seismograph networks can locate up to 100,000 earthquakes annually, providing a continuous and sometimes overwhelming stream of data. ZMAP is a set of tools driven by a graphical user interface (GUI), designed to help seismologists analyze catalog data. ZMAP is primarily a research tool suited to the evaluation of catalog quality and to addressing specific hypotheses; however, it can also be useful in routine network operations. Examples of ZMAP features include catalog quality assessment (artifacts, completeness, explosion contamination), interactive data exploration, mapping transients in seismicity (rate changes, b-values, p-values), fractal dimension analysis and stress tensor inversions. Roughly 100 scientists worldwide have used the software at least occasionally, and about 30 peer-reviewed publications have made use of ZMAP. ZMAP code is open source, written in Matlab by The MathWorks, a commercial language widely used in the natural sciences. ZMAP was first published in 1994 and has continued to grow over the past 7 years. Recently, we released ZMAP v.6. The poster will introduce the features of ZMAP. We will specifically focus on ZMAP features related to time-dependent probabilistic hazard assessment. We are currently implementing a ZMAP-based system that computes probabilistic hazard maps, which combine the stationary background hazard as well as aftershock and foreshock hazard into a comprehensive time-dependent probabilistic hazard map. These maps will be displayed in near real time on the Internet. This poster is also intended as a forum for ZMAP users to provide feedback and discuss the future of ZMAP.
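
    One ZMAP-style computation, the maximum-likelihood b-value of the Gutenberg-Richter law (Aki, 1965), can be sketched as follows; the catalog is synthetic and this is not ZMAP's Matlab code.

    ```python
    # Maximum-likelihood b-value (Aki, 1965): b = log10(e) / (mean(M) - Mc),
    # where Mc is the magnitude of completeness of the catalog.
    import math, random

    def b_value(magnitudes, mc):
        above = [m for m in magnitudes if m >= mc]
        return math.log10(math.e) / (sum(above) / len(above) - mc)

    # Synthetic catalog: exponential magnitude distribution with b ~ 1.
    random.seed(1)
    catalog = [2.0 + random.expovariate(math.log(10)) for _ in range(5000)]
    print(round(b_value(catalog, mc=2.0), 2))   # close to 1.0
    ```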

  6. Learning Photogrammetry with Interactive Software Tool PhoX

    Science.gov (United States)

    Luhmann, T.

    2016-06-01

    Photogrammetry is a complex topic in high-level university teaching, especially in the fields of geodesy, geoinformatics and metrology, where high quality results are demanded. In addition, more and more black-box solutions for 3D image processing and point cloud generation are available that generate nice results easily, e.g. by structure-from-motion approaches. Within this context, the classical approach of teaching photogrammetry (e.g. focusing on aerial stereophotogrammetry) has to be reformed in order to educate students and professionals in new topics and provide them with more information behind the scenes. For around 20 years, photogrammetry courses at the Jade University of Applied Sciences in Oldenburg, Germany, have included the use of digital photogrammetry software that provides individual exercises, deep analysis of calculation results and a wide range of visualization tools for almost all standard tasks in photogrammetry. In recent years, the software package PhoX has been developed as part of a new didactic concept in photogrammetry and related subjects. It also serves as an analysis tool in recent research projects. PhoX consists of a project-oriented data structure for images, image data, measured points and features, and 3D objects. It allows for almost all basic photogrammetric measurement tools, image processing, calculation methods, graphical analysis functions, simulations and much more. Students use the program to conduct predefined exercises where they have the opportunity to analyse results at a high level of detail. This includes the analysis of statistical quality parameters but also the meaning of transformation parameters, rotation matrices, calibration and orientation data. As one specific advantage, PhoX allows for the interactive modification of single parameters and direct viewing of the resulting effect in image or object space.

  7. Learning Photogrammetry with Interactive Software Tool PhoX

    Directory of Open Access Journals (Sweden)

    T. Luhmann

    2016-06-01

    Full Text Available Photogrammetry is a complex topic in high-level university teaching, especially in the fields of geodesy, geoinformatics and metrology, where high quality results are demanded. In addition, more and more black-box solutions for 3D image processing and point cloud generation are available that generate nice results easily, e.g. by structure-from-motion approaches. Within this context, the classical approach of teaching photogrammetry (e.g. focusing on aerial stereophotogrammetry) has to be reformed in order to educate students and professionals in new topics and provide them with more information behind the scenes. For around 20 years, photogrammetry courses at the Jade University of Applied Sciences in Oldenburg, Germany, have included the use of digital photogrammetry software that provides individual exercises, deep analysis of calculation results and a wide range of visualization tools for almost all standard tasks in photogrammetry. In recent years, the software package PhoX has been developed as part of a new didactic concept in photogrammetry and related subjects. It also serves as an analysis tool in recent research projects. PhoX consists of a project-oriented data structure for images, image data, measured points and features, and 3D objects. It allows for almost all basic photogrammetric measurement tools, image processing, calculation methods, graphical analysis functions, simulations and much more. Students use the program to conduct predefined exercises where they have the opportunity to analyse results at a high level of detail. This includes the analysis of statistical quality parameters but also the meaning of transformation parameters, rotation matrices, calibration and orientation data. As one specific advantage, PhoX allows for the interactive modification of single parameters and direct viewing of the resulting effect in image or object space.
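
    One concept the abstract highlights, inspecting rotation matrices as parameters change, can be illustrated with the standard photogrammetric omega-phi-kappa rotation; this is generic code, not part of PhoX.

    ```python
    # Photogrammetric rotation matrix from omega-phi-kappa angles, so that the
    # effect of changing a single parameter can be inspected directly.
    import numpy as np

    def rotation_opk(omega, phi, kappa):
        """R = Rx(omega) @ Ry(phi) @ Rz(kappa), angles in radians."""
        co, so = np.cos(omega), np.sin(omega)
        cp, sp = np.cos(phi), np.sin(phi)
        ck, sk = np.cos(kappa), np.sin(kappa)
        Rx = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
        Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
        Rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])
        return Rx @ Ry @ Rz

    R = rotation_opk(0.01, -0.02, 1.57)      # small tilts, ~90 degree kappa
    print(np.round(R, 4))
    print(round(float(np.linalg.det(R)), 6)) # determinant must be 1
    ```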

  8. Desired characteristics of a generic 'no frills' software engineering tools package

    Energy Technology Data Exchange (ETDEWEB)

    Rhodes, J.J.

    1986-07-29

    Increasing numbers of vendors are developing software engineering tools to meet the demands of increasingly complex software systems, higher reliability goals for software products, higher programming labor costs, and management's desire to more closely associate software lifecycle costs with the estimated development schedule. Some vendors have chosen a dedicated workstation approach to achieve high user interactivity through windowing and mousing. Other vendors are using multi-user mainframes with low cost terminals to economize on the costs of the hardware and the tools software. For all potential customers of software tools, the question remains: what are the minimum functional requirements that a software engineering tools package must have in order to be considered useful throughout the entire software lifecycle? This paper describes the desired characteristics of a non-existent but realistic 'no frills' software engineering tools package. 3 refs., 5 figs.

  9. The ultimate CASE (Computer-Aided Software Engineering) tool

    Energy Technology Data Exchange (ETDEWEB)

    Sharp, J.K.

    1990-01-01

    The theory and practice of information engineering is being actively developed at Sandia National Laboratories. The main output of Sandia is information. Information is created, analyzed and distributed. It is the lifeblood of our design laboratory. The proper management of information will have a large, positive impact on staff productivity. In order to achieve the potential benefits of shared information, a commonly understood approach is needed, and the approach must be implemented in a CASE (Computer-Aided Software Engineering) tool that spans the entire life cycle of information. The commonly understood approach used at Sandia is natural language; more specifically, it is a structured subset of English. Users and system developers communicate requirements and commitments that they both understand. The approach is based upon NIAM (Nijssen's Information Analysis Methodology). In the last three years four NIAM training classes have been given at Sandia. The classes were all at the introductory level, with the latest class last October having an additional seminar highlighting successful projects. The continued growth in applications using NIAM requires an advanced class. The class will develop an information model for the "Ultimate CASE Tool." This paper presents the requirements that have been established for the "Ultimate CASE Tool" and presents initial models. 4 refs., 1 tab.

  10. A Survey Identifying Trends on Use of Software Development Tools in Different Indian SMEs

    Directory of Open Access Journals (Sweden)

    Nomi Baruah Ashima

    2012-10-01

    Full Text Available Software process improvement means identifying the current state of practice of processes within an organization and then improving it. Software process improvement is an everlasting, never-ending and ever-changing process. Some of the issues which force an organization to undergo software process improvement are customer dissatisfaction, inadequate software quality, inability to deliver on time and within budget, and excessive rework. SMEs use software process improvement models, but due to limited resources and the cost involved they are not able to follow all the processes needed to improve their productivity and the quality of their products. A survey of 18 SMEs catering to the software market was carried out to identify prevailing software development scenarios. The intent of the study was to find the tools and techniques SMEs are using to automate the software development process and to incorporate software project management. The survey identifies four different types of software development tools which are proving effective in the current software development scenario: requirements management tools, process modelling tools, software configuration management tools and cost estimating tools. This paper summarizes the trends in the usage of software development tools in SMEs, shown graphically as well.

  11. User Guide for the STAYSL PNNL Suite of Software Tools

    Energy Technology Data Exchange (ETDEWEB)

    Greenwood, Lawrence R.; Johnson, Christian D.

    2013-02-27

    The STAYSL PNNL software suite provides a set of tools for working with neutron activation rates measured in a nuclear fission reactor, an accelerator-based neutron source, or any neutron field to determine the neutron flux spectrum through a generalized least-squares approach. This process is referred to as neutron spectral adjustment since the preferred approach is to use measured data to adjust neutron spectra provided by neutron physics calculations. The input data consist of the reaction rates based on measured activities, an initial estimate of the neutron flux spectrum, neutron activation cross sections and their associated uncertainties (covariances), and relevant correction factors. The output consists of the adjusted neutron flux spectrum and associated covariance matrix, which is useful for neutron dosimetry and radiation damage calculations.
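
    The generalized least-squares adjustment described here follows a standard update formula; the sketch below illustrates it on an invented three-group toy problem and is not the STAYSL PNNL implementation.

    ```python
    # Standard GLS update: adjust a prior flux spectrum phi so that predicted
    # reaction rates A @ phi agree with measurements within their covariances.
    import numpy as np

    def gls_adjust(phi0, cov_phi, A, rates, cov_rates):
        """Return adjusted spectrum and covariance."""
        S = A @ cov_phi @ A.T + cov_rates          # innovation covariance
        K = cov_phi @ A.T @ np.linalg.inv(S)       # gain
        phi = phi0 + K @ (rates - A @ phi0)
        cov = cov_phi - K @ A @ cov_phi
        return phi, cov

    # Tiny 3-group toy problem with 2 dosimetry reactions (numbers invented).
    phi0 = np.array([1.0, 2.0, 0.5])               # prior group fluxes
    cov_phi = np.diag([0.04, 0.16, 0.01])          # prior flux covariance
    A = np.array([[0.9, 0.1, 0.0],                 # cross sections per group
                  [0.0, 0.3, 0.7]])
    rates = np.array([1.25, 1.05])                 # measured reaction rates
    cov_rates = np.diag([0.01, 0.01])
    phi, cov = gls_adjust(phi0, cov_phi, A, rates, cov_rates)
    print(np.round(phi, 3))
    ```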

  12. Evaluation of The Virtual Cells Software: a Teaching Tool

    Directory of Open Access Journals (Sweden)

    C.C.P. da Silva

    2005-07-01

    handling, having an accessible language, and supporting the software as an educational tool capable of facilitating the learning of fundamental concepts of the subject. Other workshops are planned with participants from different educational institutions of the city of Sao Carlos, with the goal of broadening our sample.

  13. SNPdetector: a software tool for sensitive and accurate SNP detection.

    Directory of Open Access Journals (Sweden)

    Jinghui Zhang

    2005-10-01

    Full Text Available Identification of single nucleotide polymorphisms (SNPs) and mutations is important for the discovery of genetic predisposition to complex diseases. PCR resequencing is the method of choice for de novo SNP discovery. However, manual curation of putative SNPs has been a major bottleneck in the application of this method to high-throughput screening. Therefore it is critical to develop a more sensitive and accurate computational method for automated SNP detection. We developed a software tool, SNPdetector, for automated identification of SNPs and mutations in fluorescence-based resequencing reads. SNPdetector was designed to model the process of human visual inspection and has a very low false positive and false negative rate. We demonstrate the superior performance of SNPdetector in SNP and mutation analysis by comparing its results with those derived by human inspection, PolyPhred (a popular SNP detection tool), and independent genotype assays in three large-scale investigations. The first study identified and validated inter- and intra-subspecies variations in 4,650 traces of 25 inbred mouse strains that belong to either the Mus musculus species or the M. spretus species. Unexpected heterozygosity in the CAST/Ei strain was observed in two out of 1,167 mouse SNPs. The second study identified 11,241 candidate SNPs in five ENCODE regions of the human genome covering 2.5 Mb of genomic sequence. Approximately 50% of the candidate SNPs were selected for experimental genotyping; the validation rate exceeded 95%. The third study detected ENU-induced mutations (at 0.04% allele frequency) in 64,896 traces of 1,236 zebra fish. Our analysis of three large and diverse test datasets demonstrated that SNPdetector is an effective tool for genome-scale research and for large-sample clinical studies. SNPdetector runs on the Unix/Linux platform and is available publicly (http://lpg.nci.nih.gov).

  14. SNPdetector: A Software Tool for Sensitive and Accurate SNP Detection.

    Directory of Open Access Journals (Sweden)

    2005-10-01

    Full Text Available Identification of single nucleotide polymorphisms (SNPs) and mutations is important for the discovery of genetic predisposition to complex diseases. PCR resequencing is the method of choice for de novo SNP discovery. However, manual curation of putative SNPs has been a major bottleneck in the application of this method to high-throughput screening. Therefore it is critical to develop a more sensitive and accurate computational method for automated SNP detection. We developed a software tool, SNPdetector, for automated identification of SNPs and mutations in fluorescence-based resequencing reads. SNPdetector was designed to model the process of human visual inspection and has a very low false positive and false negative rate. We demonstrate the superior performance of SNPdetector in SNP and mutation analysis by comparing its results with those derived by human inspection, PolyPhred (a popular SNP detection tool), and independent genotype assays in three large-scale investigations. The first study identified and validated inter- and intra-subspecies variations in 4,650 traces of 25 inbred mouse strains that belong to either the Mus musculus species or the M. spretus species. Unexpected heterozygosity in the CAST/Ei strain was observed in two out of 1,167 mouse SNPs. The second study identified 11,241 candidate SNPs in five ENCODE regions of the human genome covering 2.5 Mb of genomic sequence. Approximately 50% of the candidate SNPs were selected for experimental genotyping; the validation rate exceeded 95%. The third study detected ENU-induced mutations (at 0.04% allele frequency) in 64,896 traces of 1,236 zebra fish. Our analysis of three large and diverse test datasets demonstrated that SNPdetector is an effective tool for genome-scale research and for large-sample clinical studies. SNPdetector runs on the Unix/Linux platform and is available publicly (http://lpg.nci.nih.gov).
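
    For contrast with SNPdetector's trace-based approach, the simplest form of automated SNP screening can be sketched as a pileup caller with depth and allele-fraction thresholds; this generic illustration is not SNPdetector's algorithm, and the thresholds are invented.

    ```python
    # Generic pileup-based SNP screen: call a candidate SNP when an alternate
    # base reaches a minimum depth and minor-allele fraction at a site.
    from collections import Counter

    def call_snp(ref_base, pileup, min_depth=8, min_fraction=0.2):
        """Return the alternate base if the site looks polymorphic, else None."""
        if len(pileup) < min_depth:
            return None
        counts = Counter(pileup.upper())
        alt, alt_n = max(((b, n) for b, n in counts.items()
                          if b != ref_base.upper()),
                         key=lambda t: t[1], default=(None, 0))
        if alt and alt_n / len(pileup) >= min_fraction:
            return alt
        return None

    print(call_snp("A", "AAAAATTTAA"))   # 'T': 30% of 10 reads -> candidate
    print(call_snp("A", "AAAAAAAAAT"))   # None: 10% is below the threshold
    ```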

  15. A software tool for rapid flood inundation mapping

    Science.gov (United States)

    Verdin, James; Verdin, Kristine; Mathis, Melissa L.; Magadzire, Tamuka; Kabuchanga, Eric; Woodbury, Mark; Gadain, Hussein

    2016-06-02

    The GIS Flood Tool (GFT) was developed by the U.S. Geological Survey with support from the U.S. Agency for International Development’s Office of U.S. Foreign Disaster Assistance to provide a means for production of reconnaissance-level flood inundation mapping for data-sparse and resource-limited areas of the world. The GFT has also attracted interest as a tool for rapid assessment flood inundation mapping for the Flood Inundation Mapping Program of the U.S. Geological Survey. The GFT can fill an important gap for communities that lack flood inundation mapping by providing a first estimate of inundation zones, pending availability of resources to complete an engineering study. The tool can also help identify priority areas for application of scarce flood inundation mapping resources. The technical basis of the GFT is an application of the Manning equation for steady flow in an open channel, operating on specially processed digital elevation data. The GFT is implemented as a software extension in ArcGIS. Output maps from the GFT were validated at 11 sites with inundation maps produced previously by the Flood Inundation Mapping Program using standard one-dimensional hydraulic modeling techniques. In 80 percent of the cases, the GFT inundation patterns matched 75 percent or more of the one-dimensional hydraulic model inundation patterns. Lower rates of pattern agreement were seen at sites with low relief and subtle surface water divides. Although the GFT is simple to use, it should be applied with the oversight or review of a qualified hydraulic engineer who understands the simplifying assumptions of the approach.
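
    The Manning equation that underpins the GFT can be demonstrated directly; the sketch below idealizes the channel as rectangular (an assumption the real tool does not make) and inverts the equation for flow depth by bisection.

    ```python
    # Manning's equation for steady open-channel flow (SI units):
    # Q = (1/n) * A * R^(2/3) * sqrt(S), with A the flow area, R = A/P the
    # hydraulic radius, S the slope and n the roughness coefficient.
    def manning_discharge(depth, width, slope, n):
        area = width * depth
        wetted_perimeter = width + 2.0 * depth
        hydraulic_radius = area / wetted_perimeter
        return (1.0 / n) * area * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5

    def depth_for_discharge(q_target, width, slope, n, lo=1e-3, hi=50.0):
        """Invert Manning's equation for flow depth by bisection."""
        for _ in range(100):
            mid = 0.5 * (lo + hi)
            if manning_discharge(mid, width, slope, n) < q_target:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    # 500 m3/s in a 100 m wide channel, slope 0.001, n = 0.035 (natural channel).
    print(round(depth_for_discharge(500.0, 100.0, 0.001, 0.035), 2), "m")
    ```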

  16. Technical Data Exchange Software Tools Adapted to Distributed Microsatellite Design

    Science.gov (United States)

    Pache, Charly

    2002-01-01

    One critical issue concerning the distributed design of satellites is the collaborative work it requires. In particular, the exchange of data between the groups responsible for each subsystem can be complex and very time-consuming. The goal of this paper is to present a collaborative design tool, the SSETI Design Model (SDM), specifically developed to enable distributed satellite design. SDM is actually used in the ongoing Student Space Exploration & Technology (SSETI) initiative (www.sseti.net). SSETI is led by the European Space Agency (ESA) outreach office (http://www.estec.esa.nl/outreach), involving student groups from all over Europe in the design, construction and launch of a microsatellite. The first part of this paper presents the current version of the SDM tool, a collection of linked Microsoft Excel worksheets, one for each subsystem. An overview of the project framework/structure is given, explaining the different actors, the flows between them, as well as the different types of data and the links - formulas - between data sets. Unified Modeling Language (UML) diagrams give an overview of the different parts. Then SDM's functionalities, developed in VBA scripts (Visual Basic for Applications), are introduced, as well as the interactive features, user interfaces and administration tools. The second part discusses the capabilities and limitations of SDM's current version. Taking these capabilities and limitations into account, the third part outlines the next version of SDM, a web-oriented, database-driven evolution of the current version. This new approach will enable real-time data exchange and processing between the different actors of the mission. Comprehensive UML diagrams will guide the audience through the entire modeling process of such a system. Trade-off simulation capabilities, security, reliability, hardware and software issues will also be thoroughly discussed.

  17. Software Development Outsourcing Decision Support Tool with Neural Network Learning

    Science.gov (United States)

    2004-03-01

    software domain, enterprise scripting software domain, and outsourcing (maintenance and training) processes found to be included in the new model but not in...accounting and order entry) software domains, and outsourcing (maintenance, configuration management and software engineer support) processes were...found in the original model but not in the new model included: enterprise (scripting and order entry) software domains and outsourcing maintenance process

  18. 76 FR 5832 - International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools QA...

    Science.gov (United States)

    2011-02-02

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF LABOR Employment and Training Administration International Business Machines (IBM), Software Group Business Unit... at International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools...

  19. Performance Evaluation of a Software Engineering Tool for Automated Design of Cooling Systems in Injection Moulding

    DEFF Research Database (Denmark)

    Jauregui-Becker, Juan M.; Tosello, Guido; van Houten, Fred J.A.M.;

    2013-01-01

    This paper presents a software tool for automating the design of cooling systems for injection moulding and a validation of its performance. Cooling system designs were automatically generated by the proposed software tool and by applying a best practice tool engineering design approach. The two ...

  20. Can agile software tools bring the benefits of a task board to globally distributed teams?

    NARCIS (Netherlands)

    Katsma, Christiaan; Amrit, Chintan; Hillegersberg, van Jos; Sikkel, Klaas; Oshri, Ilan; Kotlarsky, Julia; Willcocks, Leslie P.

    2013-01-01

    Software-based tooling has become an essential part of globally distributed software development. In this study we focus on the usage of such tools, and task boards in particular. We investigate the deployment of these tools through field research in 4 different companies that feature agile and globally distributed software development

  1. Decision graphs: a tool for developing real-time software

    Energy Technology Data Exchange (ETDEWEB)

    Kozubal, A.J.

    1981-01-01

    The use of decision graphs in the preparation of, in particular, real-time software is briefly described. The usefulness of decision graphs in software design, testing, and maintenance is pointed out. 2 figures. (RWR)

  2. Software Reuse in Agile Development Organizations - A Conceptual Management Tool

    NARCIS (Netherlands)

    Spoelstra, Wouter; Iacob, Maria; Sinderen, van Marten

    2011-01-01

    The reuse of knowledge is considered a major factor for increasing productivity and quality. In the software industry knowledge is embodied in software assets such as code components, functional designs and test cases. This kind of knowledge reuse is also referred to as software reuse. Although the

  3. Introduction to format: the software tools text formatting program

    Energy Technology Data Exchange (ETDEWEB)

    Agazzi, C.

    1984-12-01

    Format is the name of the Software Tools formatter. It allows you to format text according to instructions that you place within the text. The text and instructions for each document you wish to create are kept in files. Each instruction, called a request line, makes changes in the way your document is laid out. For example, you can change the margins within your document to visually set off lists of items or topics. You can also boldface or underline words or sentences to highlight them. Throughout this manual are examples of how to use the Format request lines along with illustrations of the effects request lines have on an example letter. The request lines begin with a period in the first column on the screen. Each request line performs a specific function and is placed on the line immediately in front of the text to be formatted. Output lines are automatically filled; that is, their right margins are justified, without regard to the format of the input text lines.
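
    The fill-and-justify behaviour described in this record is easy to picture in code. The following Python fragment is purely illustrative (Format itself is a Software Tools program, and its request-line processing is not shown): it greedily fills words into output lines and then pads the inter-word gaps so the right margin comes out even.

        # Illustrative sketch only: greedy line filling plus right-margin
        # justification, the behaviour the Format abstract describes.

        def fill_and_justify(words, width=40):
            """Fill words into lines of at most 'width' columns, justified."""
            lines, current = [], []
            for word in words:
                if current and len(" ".join(current + [word])) > width:
                    lines.append(justify(current, width))
                    current = []
                current.append(word)
            if current:                      # final line stays ragged-right
                lines.append(" ".join(current))
            return lines

        def justify(words, width):
            """Distribute extra spaces between words to span 'width' columns."""
            if len(words) == 1:
                return words[0]
            gaps = len(words) - 1
            spare = width - sum(len(w) for w in words)
            q, r = divmod(spare, gaps)
            out = ""
            for i, w in enumerate(words[:-1]):
                out += w + " " * (q + (1 if i < r else 0))
            return out + words[-1]

        text = "Output lines are automatically filled and their right margins are justified."
        print("\n".join(fill_and_justify(text.split())))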

  4. Towards a Reference Architecture to Provision Tools as a Service for Global Software Development

    DEFF Research Database (Denmark)

    Chauhan, Aufeef; Babar, Muhammad Ali

    2014-01-01

    Organizations involved in Global Software Development (GSD) face challenges in terms of having access to an appropriate set of tools for performing distributed engineering and development activities, integration between heterogeneous desktop and web-based tools, management of artifacts developed and ...

  5. Software tools overview : process integration, modelling and optimisation for energy saving and pollution reduction

    OpenAIRE

    Lam, Hon Loong; Klemeš, Jiri; Kravanja, Zdravko; Varbanov, Petar

    2012-01-01

    This paper provides an overview of software tools based on long experience and applications in the area of process integration, modelling and optimisation. The first part reviews the current design practice and the development of supporting software tools. Those are categorised as: (1) process integration and retrofit analysis tools, (2) general mathematical modelling suites with optimisation libraries, (3) flowsheeting simulation and (4) graph-based process optimisation tools. The second part...

  6. Systems and software variability management concepts, tools and experiences

    CERN Document Server

    Capilla, Rafael; Kang, Kyo-Chul

    2013-01-01

    The success of product line engineering techniques in the last 15 years has popularized the use of software variability as a key modeling approach for describing the commonality and variability of systems at all stages of the software lifecycle. Software product lines enable a family of products to share a common core platform, while allowing product-specific functionality to be built on top of the platform. Many companies have exploited the concept of software product lines to increase the resources that focus on highly differentiating functionality and thus improve their competitiveness

  7. Software for predictive microbiology and risk assessment: a description and comparison of tools presented at the ICPMF8 Software Fair.

    Science.gov (United States)

    Tenenhaus-Aziza, Fanny; Ellouze, Mariem

    2015-02-01

    The 8th International Conference on Predictive Modelling in Food was held in Paris, France in September 2013. One of the major topics of this conference was the transfer of knowledge and tools between academics and stakeholders of the food sector. During the conference, a "Software Fair" was held to provide information and demonstrations of predictive microbiology and risk assessment software. This article presents an overall description of the 16 software tools demonstrated at the session and provides a comparison based on several criteria such as the modeling approach, the different modules available (e.g. databases, predictors, fitting tools, risk assessment tools), the studied environmental factors (temperature, pH, aw, etc.), the type of media (broth or food) and the number and type of the provided micro-organisms (pathogens and spoilers). The present study is a guide to help users select the software tools which are most suitable to their specific needs, before they test and explore the tool(s) in more depth.

  8. Nik Software Captured The Complete Guide to Using Nik Software's Photographic Tools

    CERN Document Server

    Corbell, Tony L

    2011-01-01

    Learn all the features and functionality of the complete Nik family of products Styled in such a way as to resemble the way photographers think, Nik Software Captured aims to help you learn to apply all the features and functionality of the Nik software products. With Nik Software Captured, authors and Nik Software, Inc. insiders Tony Corbell and Josh Haftel help you use after-capture software products more easily and more creatively. Their sole aim is to ensure that you can apply the techniques discussed in the book while gaining a thorough understanding of the capabilities of programs such as Dfi

  9. Software development tool for PicoBlaze multi-processor implementation

    OpenAIRE

    Claudiu Lung; Buchman Attila

    2012-01-01

    This paper presents a useful software tool for projects with multiple PicoBlaze microprocessors implemented in FPGA circuits. The application presented in this paper, which uses the PicoBlaze SDK tool for software development, is an Automatic Packet Report System (APRS) with three PicoBlaze microprocessors implemented in an FPGA circuit.

  10. Development of Software for Analyzing Breakage Cutting Tools Based on Image Processing

    Institute of Scientific and Technical Information of China (English)

    赵彦玲; 刘献礼; 王鹏; 王波; 王红运

    2004-01-01

    As present-day digital microsystems do not provide specialized microscopes that can detect cutting-tool breakage, analysis software has been developed using VC++. A module for verge test and image segmentation is designed specifically for cutting tools. Known calibration relations and given postulates are used in scale measurements. Practical operations show that the software can perform accurate detection.

  11. Kid Tools: Self-Management, Problem-Solving, Organizational, and Planning Software for Children and Teachers

    Science.gov (United States)

    Miller, Kevin J.; Fitzgerald, Gail E.; Koury, Kevin A.; Mitchem, Herine J.; Hollingsead, Candice

    2007-01-01

    This article provides an overview of KidTools, an electronic performance software system designed for elementary and middle school children to use independently on classroom or home computers. The software system contains 30 computerized research-based strategy tools that can be implemented in a classroom or home environment. Through the…

  12. Possibilities for using software tools in the process of security design

    Directory of Open Access Journals (Sweden)

    Ladislav Mariš

    2013-07-01

    Full Text Available The authors deal with the use of software to support the process of security design. The article proposes a theoretical basis for implementing software tools in design activities. Based on selected design standards for electrical safety systems, it presents application design solutions, especially in drawing documentation. The article should serve the needs of project team members who wish to use the selected software tools and thereby increase the degree of automation of design activities.

  13. Software Engineering Practices and Tool Support: an exploratory study in New Zealand

    Directory of Open Access Journals (Sweden)

    Chris Phillips

    2003-11-01

    Full Text Available This study was designed as a preliminary investigation of the practices of software engineers within New Zealand, including their use of development tools. The project involved a review of relevant literature on software engineering and CASE tools, the development and testing of an interview protocol, and structured interviews with five software engineers. This paper describes the project, presents the findings, examines the results in the context of the literature and outlines on-going funded work involving a larger survey.

  14. Tuning COCOMO-II for Software Process Improvement: A Tool Based Approach

    Directory of Open Access Journals (Sweden)

    SYEDA UMEMA HANI

    2016-10-01

    Full Text Available In order to compete in the international software development market, software organizations have to adopt internationally accepted software practices, i.e. standards like ISO (International Standard Organization) or CMMI (Capability Maturity Model Integration), in spite of having scarce resources and tools. The aim of this study is to develop a tool which could be used to present an actual picture of the benefits of Software Process Improvement in front of software development companies. However, the few tools available to assist in making predictions are too expensive and do not cover datasets that reflect the cultural behavior of organizations for software development in developing countries. In extension to our previously reported research on Pakistani software development organizations, which quantified the benefits of SDPI (Software Development Process Improvement), this research used sixty-two datasets from three different software development organizations against the set of metrics used in COCOMO-II (Constructive Cost Model 2000). It derived a verifiable equation for calculating ISF (Ideal Scale Factor) and tuned the COCOMO-II model to bring prediction capability for SDPI benefit-measurement classes such as ESCP (Effort, Schedule, Cost, and Productivity). This research has contributed towards the software industry by giving a reliable and low-cost mechanism for generating prediction models with high prediction accuracy. Hopefully, this study will help software organizations use this tool not only to predict ESCP but also to predict the exact impact of SDPI.
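
    For readers unfamiliar with the model being tuned, the nominal COCOMO-II effort equation is compact enough to show directly. The Python sketch below uses the published calibration constants A = 2.94 and B = 0.91; the scale-factor and effort-multiplier inputs are illustrative, and re-estimating such constants from local data is exactly what tuning studies like this one do.

        # Nominal COCOMO-II effort equation (Boehm et al., 2000):
        #   PM = A * Size^E * prod(EM_i),  with  E = B + 0.01 * sum(SF_j)
        # Tuning studies re-calibrate A, B and the scale factors locally.

        from math import prod

        def cocomo2_effort(ksloc, scale_factors, effort_multipliers,
                           A=2.94, B=0.91):
            """Estimated effort in person-months for 'ksloc' thousand lines."""
            E = B + 0.01 * sum(scale_factors)
            return A * ksloc ** E * prod(effort_multipliers)

        # Illustrative inputs: five scale factors, three cost drivers.
        print(round(cocomo2_effort(100, [3.72, 3.04, 4.24, 3.29, 4.68],
                                   [1.0, 1.1, 0.9]), 1))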

  15. Using Software Development Tools and Practices in Acquisition

    Science.gov (United States)

    2013-12-01

    provement like CMMI® and software development processes like Rational Unified Process (RUP). Somewhat orthogonal to these structured methods are software...

  16. A Web-based Tool for Automatizing the Software Process Improvement Initiatives in Small Software Enterprises

    NARCIS (Netherlands)

    Garcia, I.; Pacheco, C.

    2010-01-01

    Top-down process improvement approaches provide a high-level model of what the process of a software development organization should be. Such models are based on the consensus of a designated working group on how software should be developed or maintained. They are very useful in that they provide g

  17. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    Directory of Open Access Journals (Sweden)

    Marilyn Wilhelmina Leonora Monster

    2015-12-01

    Full Text Available The multispecimen protocol (MSP) is a method to estimate the Earth's magnetic field's past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.

  18. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    Science.gov (United States)

    Monster, Marilyn; de Groot, Lennart; Dekkers, Mark

    2015-12-01

    The multispecimen protocol (MSP) is a method to estimate the Earth's magnetic field's past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.
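
    To make the statistics concrete, here is a minimal Python sketch of the bootstrap-regression idea the two records above describe: resample the per-specimen (laboratory field, ratio) pairs with replacement, refit a line each time, and collect the distribution of the x-intercept (the intensity estimate) and of the y-intercept (the quantity checked against the theoretically prescribed range). The data points are invented, and MSP-Tool itself is an Excel/VBA program, so this is only an illustration of the method.

        # Hedged sketch of bootstrap statistics for MSP-style data.
        # The (field, ratio) pairs below are made up for illustration.

        import numpy as np

        rng = np.random.default_rng(0)
        fields = np.array([10.0, 20.0, 30.0, 40.0, 50.0])    # lab fields (uT)
        ratios = np.array([-0.52, -0.21, 0.02, 0.27, 0.55])  # MSP ratios

        x_ints, y_ints = [], []
        for _ in range(2000):
            idx = rng.integers(0, len(fields), len(fields))  # resample with replacement
            if len(set(fields[idx])) < 2:
                continue                   # degenerate resample, cannot fit a line
            slope, intercept = np.polyfit(fields[idx], ratios[idx], 1)
            y_ints.append(intercept)
            x_ints.append(-intercept / slope)   # field where the line crosses zero

        lo, hi = np.percentile(x_ints, [2.5, 97.5])
        print(f"intensity estimate: {np.mean(x_ints):.1f} uT "
              f"(95% CI {lo:.1f}-{hi:.1f}), mean y-intercept {np.mean(y_ints):.2f}")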

  19. C++ Software Quality in the ATLAS Experiment: Tools and Experience

    CERN Document Server

    Kluth, Stefan; The ATLAS collaboration; Obreshkov, Emil; Roe, Shaun; Seuster, Rolf; Snyder, Scott; Stewart, Graeme

    2016-01-01

    The ATLAS experiment at CERN uses about six million lines of code and currently has about 420 developers whose background is largely from physics. In this paper we explain how the C++ code quality is managed using a range of tools from compile-time through to run time testing and reflect on the great progress made in the last year largely through the use of static analysis tools such as Coverity®, an industry-standard tool which enables quality comparison with general open source C++ code. Other tools including cppcheck, Include-What-You-Use and run-time 'sanitizers' are also discussed.

  20. C++ Software Quality in the ATLAS experiment: Tools and Experience

    CERN Document Server

    Martin-Haugh, Stewart; The ATLAS collaboration

    2017-01-01

    In this paper we explain how the C++ code quality is managed in ATLAS using a range of tools from compile-time through to run time testing and reflect on the substantial progress made in the last two years largely through the use of static analysis tools such as Coverity®, an industry-standard tool which enables quality comparison with general open source C++ code. Other available code analysis tools are also discussed, as is the role of unit testing with an example of how the GoogleTest framework can be applied to our codebase.

  1. Software engineering capability for Ada (GRASP/Ada Tool)

    Science.gov (United States)

    Cross, James H., II

    1995-01-01

    The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) has successfully created and prototyped a new algorithmic level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis has been on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada Source code. A new Motif compliant graphical user interface has been developed for the GRASP/Ada prototype.

  2. PTaaS: Platform for Providing Software Developing Applications and Tools as a Service

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali

    2014-01-01

    of a large number of complex activities that include not only technological aspects but also social aspects. A large number of applications and tools have been devised for providing solutions to the challenges of the GSD that emerge as a result of distributed development teams. However... technological support for it that is not limited to one specific tool and a particular phase of the software development life cycle. In this thesis, we have explored the possibility of offering software development applications and tools as services that can be acquired on demand according to the software... development process in a globally distributed environment. We have performed a structured review of the literature on GSD tools to identify attributes of the software development tools that have been introduced for addressing GSD challenges, and we have discussed the significance of technology alignment...

  3. Vulnerability management tools for COTS software - A comparison

    NARCIS (Netherlands)

    Welberg, S.M.

    2008-01-01

    In this paper, we compare vulnerability management tools in two stages. In the first stage, we perform a global comparison involving thirty tools available in the market. A framework composed of several criteria based on scope and analysis is used for this comparison. From this global view of the to

  4. Simulation and visualization tool design for robot software

    NARCIS (Netherlands)

    Lu, Zhou; Ran, Tjalling; Broenink, Jan F.; Chalmers, K.; Pedersen, J.B.

    2016-01-01

    Modern embedded systems are designed for multiple and increasingly demanding tasks. Complex concurrent software is required by multi-task automated service robotics for implementing their challenging (control) algorithms. TERRA is a Communicating Sequential Processes (CSP) algebra-based Eclipse grap

  5. A SOFTWARE TOOL FOR EXPERIMENTAL STUDY LEAP MOTION

    Directory of Open Access Journals (Sweden)

    Georgi Krastev

    2015-12-01

    Full Text Available The paper aims to present computer application that illustrates Leap Motion controller’s abilities. It is a peripheral and software for PC, which enables control by natural user interface based on gestures. The publication also describes how the controller works and its main advantages/disadvantages. Some apps using leap motion controller are discussed.

  6. A Survey of Reliability, Maintainability, Supportability, and Testability Software Tools

    Science.gov (United States)

    1991-04-01

    Industrial Reliability Program ABSTRACT: Program is utilized for analysis of industrial equipment for which military requirements are not applicable...Can communicate with other TECNASA software and has British or Portuguese menus. MACHINES: IBM PC POC: TECNASA Attn: Jose L. Barletta Electronica ...termed "performability". Models both repairable and nonrepairable systems. MACHINES: No Data POC: Industrial Technology Institute 20 NAME: METFAC

  7. Use of software tools for calculating flow accelerated corrosion of nuclear power plant equipment and pipelines

    Science.gov (United States)

    Naftal', M. M.; Baranenko, V. I.; Gulina, O. M.

    2014-06-01

    The results obtained from calculations of flow accelerated corrosion of equipment and pipelines operating at nuclear power plants constructed on the basis of PWR, VVER, and RBMK reactors carried out using the EKI-02 and EKI-03 software tools are presented. It is shown that the calculation error does not exceed its value indicated in the qualification certificates for these software tools. It is pointed out that calculations aimed at predicting the service life of pipelines and efficient surveillance of flow accelerated corrosion wear are hardly possible without using the above-mentioned software tools.

  8. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Habib, Salman [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Roser, Robert [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2015-10-28

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  9. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    CERN Document Server

    Habib, Salman; LeCompte, Tom; Marshall, Zach; Borgland, Anders; Viren, Brett; Nugent, Peter; Asai, Makoto; Bauerdick, Lothar; Finkel, Hal; Gottlieb, Steve; Hoeche, Stefan; Sheldon, Paul; Vay, Jean-Luc; Elmer, Peter; Kirby, Michael; Patton, Simon; Potekhin, Maxim; Yanny, Brian; Calafiura, Paolo; Dart, Eli; Gutsche, Oliver; Izubuchi, Taku; Lyon, Adam; Petravick, Don

    2015-01-01

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  10. Survivability as a Tool for Evaluating Open Source Software

    Science.gov (United States)

    2015-06-01

    ...the Massachusetts Institute of Technology (MIT) License, the GNU General Public License, and the Mozilla Public License 1.1. It is good to have a...law. For example, the GNU General Public License requires that modified code be redistributed under the same license. For a government project, such a

  11. Contingency Contractor Optimization Phase 3 Sustainment Third-Party Software List - Contingency Contractor Optimization Tool - Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa

    2016-05-01

    The Contingency Contractor Optimization Tool - Prototype (CCOT-P) requires several third-party software packages. These are documented below for each of the CCOT-P elements: client, web server, database server, solver, web application and polling application.

  12. Toolkit of Available EPA Green Infrastructure Modeling Software: Watershed Management Optimization Support Tool (WMOST)

    Science.gov (United States)

    Watershed Management Optimization Support Tool (WMOST) is a software application designed to facilitate integrated water resources management across wet and dry climate regions. It allows water resources managers and planners to screen a wide range of practices across their watersh...

  13. An Overview of Public Access Computer Software Management Tools for Libraries

    Science.gov (United States)

    Wayne, Richard

    2004-01-01

    An IT decision maker gives an overview of public access PC software that's useful in controlling session length and scheduling, Internet access, print output, security, and the latest headaches: spyware and adware. In this article, the author describes a representative sample of software tools in several important categories such as setup…

  14. Managing clinical research data: software tools for hypothesis exploration.

    Science.gov (United States)

    Starmer, C F; Dietz, M A

    1990-07-01

    Data representation, data file specification, and the communication of data between software systems are playing increasingly important roles in clinical data management. This paper describes the concept of a self-documenting file that contains annotations or comments that aid visual inspection of the data file. We describe access of data from annotated files and illustrate data analysis with a few examples derived from the UNIX operating environment. Use of annotated files provides the investigator with both a useful representation of the primary data and a repository of comments that describe some of the context surrounding data capture.
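
    A minimal sketch of the idea, assuming a '#'-prefixed annotation convention (the paper's own file layout may differ): data rows are read by the software while annotation lines are skipped, so the same file serves both the analysis pipeline and the human reader.

        # Sketch: read a self-documenting data file, skipping annotations.
        # The '#' convention and the file name are assumptions made for
        # illustration; the paper's UNIX tools may use another marker.

        def read_annotated(path):
            """Yield whitespace-split data rows; skip blank/comment lines."""
            with open(path) as fh:
                for line in fh:
                    line = line.strip()
                    if not line or line.startswith("#"):
                        continue      # annotation: context for human readers
                    yield line.split()

        # Example contents of a hypothetical 'bp_data.txt':
        #   # patient 17, baseline visit, recorded 1990-07-02
        #   120 80 72
        for row in read_annotated("bp_data.txt"):
            print(row)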

  15. PhosphoHunter: An Efficient Software Tool for Phosphopeptide Identification

    Directory of Open Access Journals (Sweden)

    Alessandra Tiengo

    2015-01-01

    Full Text Available Phosphorylation is a protein posttranslational modification. It is responsible for the activation/inactivation of disease-related pathways, thanks to its role as a "molecular switch." The study of phosphorylated proteins has become a key point for proteomic analyses focused on the identification of diagnostic/therapeutic targets. Liquid chromatography coupled to tandem mass spectrometry (LC-MS/MS) is the most widely used analytical approach. Although unmodified peptides are automatically identified by consolidated algorithms, phosphopeptides still require automated tools to avoid time-consuming manual interpretation. To improve phosphopeptide identification efficiency, a novel procedure was developed and implemented in a Perl/C tool called PhosphoHunter, here proposed and evaluated. It includes a preliminary heuristic step for filtering out the MS/MS spectra produced by nonphosphorylated peptides before sequence identification. A method to assess the statistical significance of identified phosphopeptides was also formulated. PhosphoHunter performance was tested on a dataset of 1500 MS/MS spectra and it was compared with two other tools: Mascot and Inspect. Comparisons demonstrated that a strong point of PhosphoHunter is sensitivity, suggesting that it is able to identify real phosphopeptides with superior performance. Performance indexes depend on a single parameter (intensity threshold) that users can tune according to the study aim. All three tools localized >90% of phosphosites.
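
    The abstract does not spell out its heuristic, so the sketch below should be read as an illustration of the kind of pre-filter such a step could apply, not as PhosphoHunter's actual rule: it flags spectra containing an intense neutral-loss peak of phosphoric acid (about 97.977 Da) below the precursor, gated by the tunable intensity threshold the abstract mentions.

        # Hedged sketch of a phosphopeptide pre-filter. The neutral-loss
        # signature is a common phosphopeptide feature; PhosphoHunter's
        # real heuristic is not reproduced here.

        H3PO4 = 97.9769  # monoisotopic mass of the phosphoric acid loss (Da)

        def looks_phosphorylated(precursor_mz, charge, peaks,
                                 tol=0.5, intensity_threshold=0.3):
            """peaks: list of (mz, relative_intensity). True if a neutral-loss
            peak above the tunable intensity threshold is present."""
            target = precursor_mz - H3PO4 / charge
            return any(abs(mz - target) <= tol and inten >= intensity_threshold
                       for mz, inten in peaks)

        # Toy spectrum: the peak near 600.0 - 48.99 = 551.01 marks a candidate.
        print(looks_phosphorylated(600.0, 2, [(551.0, 0.8), (300.2, 0.4)]))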

  16. Development of a Software Tool to Automate ADCO Flight Controller Console Planning Tasks

    Science.gov (United States)

    Anderson, Mark G.

    2011-01-01

    This independent study project covers the development of the International Space Station (ISS) Attitude Determination and Control Officer (ADCO) Planning Exchange APEX Tool. The primary goal of the tool is to streamline existing manual and time-intensive planning tools into a more automated, user-friendly application that interfaces with existing products and allows the ADCO to produce accurate products and timelines more effectively. This paper will survey the current ISS attitude planning process and its associated requirements, goals, documentation and software tools and how a software tool could simplify and automate many of the planning actions which occur at the ADCO console. The project will be covered from inception through the initial prototype delivery in November 2011 and will include development of design requirements and software as well as design verification and testing.

  17. The Web Interface Template System (WITS), a software developer's tool

    Energy Technology Data Exchange (ETDEWEB)

    Lauer, L.J.; Lynam, M.; Muniz, T. [Sandia National Labs., Albuquerque, NM (United States). Financial Systems Dept.

    1995-11-01

    The Web Interface Template System (WITS) is a tool for software developers. WITS is a three-tiered, object-oriented system operating in a Client/Server environment. This tool can be used to create software applications that have a Web browser as the user interface and access a Sybase database. Development, modification, and implementation are greatly simplified because the developer can change and test definitions immediately, without writing or compiling any code. This document explains WITS functionality, the system structure and components of WITS, and how to obtain, install, and use the software system.

  18. lipID--a software tool for automated assignment of lipids in mass spectra.

    Science.gov (United States)

    Hübner, Göran; Crone, Catharina; Lindner, Buko

    2009-12-01

    A new software tool called lipID is reported, which supports the identification of glycerophospholipids, glycosphingolipids, fatty acids and small oligosaccharides in mass spectra. The user-extendable software is a Microsoft (MS) Excel Add-In developed using Visual Basic for Applications and is compatible with all versions of MS Excel since MS Excel 97. It processes singly given mass-to-charge values as well as mass lists, considering a number of user-defined options. The software's mode of operation, usage and options are explained, and the benefits and limitations of the tool are illustrated by means of three typical analytical examples of lipid analyses.
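
    The core assignment step such a tool performs can be sketched in a few lines: compare each measured mass-to-charge value against a library of reference masses within a tolerance. The two library entries and the 10 ppm tolerance below are illustrative only; lipID itself runs inside Excel/VBA with a user-extendable library.

        # Minimal sketch of mass-list assignment against a lipid library.
        # Reference m/z values below are illustrative entries, not lipID's
        # own database.

        LIPID_LIBRARY = {
            "PC(34:1) [M+H]+": 760.5851,
            "PE(34:1) [M+H]+": 718.5381,
        }

        def assign(mz_list, tol_ppm=10.0):
            """Map each measured m/z to library entries within tol_ppm."""
            hits = {}
            for mz in mz_list:
                for name, ref in LIPID_LIBRARY.items():
                    if abs(mz - ref) / ref * 1e6 <= tol_ppm:
                        hits.setdefault(mz, []).append(name)
            return hits

        print(assign([760.585, 500.123]))   # second value finds no match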

  19. Software tool for the prosthetic foot modeling and stiffness optimization.

    Science.gov (United States)

    Strbac, Matija; Popović, Dejan B

    2012-01-01

    We present the procedure for the optimization of the stiffness of the prosthetic foot. The procedure allows the selection of the elements of the foot and the materials used for the design. The procedure is based on the optimization where the cost function is the minimization of the difference between the knee joint torques of healthy walking and walking with the transfemoral prosthesis. We present a simulation environment that allows the user to interactively vary the foot geometry and track the changes in the knee torque that arise from these adjustments. The software allows the estimation of the optimal prosthetic foot elasticity and geometry. We show that altering model attributes such as the length of the elastic foot segment or its elasticity leads to significant changes in the estimated knee torque required for a given trajectory.

  20. Improving Fund Risk Management by Using New Software Tools Technology

    Directory of Open Access Journals (Sweden)

    Stephanos Papadamou

    2004-01-01

    Full Text Available This paper introduces a new MATLAB-based toolbox for computer-aided mutual fund risk evaluation. In the age of computerized trading, financial services companies and independent investors must quickly investigate fund investment style and market risk. The Fund Risk toolbox is financial software that includes a set of functions based on value-at-risk (VaR) and expected tail loss (ETL) methodology for graphical presentation of risk forecasts, evaluation of different risk models and identification of fund investment style. The sample of historical data can be divided into an estimation rolling window and the back-testing period. MATLAB's vast built-in mathematical and financial functionality, along with the fact that it is both an interpreted and compiled programming language, makes this toolbox easily extendable by adding new complicated risk models with minimum programming effort.
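
    The two risk measures at the core of the toolbox are simple to state: historical VaR at level alpha is the negated alpha-quantile of the return series, and ETL is the negated mean of the returns at or beyond that quantile. A Python sketch of the computation follows (the MATLAB toolbox itself is not reproduced; the return series is simulated):

        # Historical VaR and ETL, reported as positive loss figures.
        # The daily returns are simulated data for illustration.

        import numpy as np

        def var_etl(returns, alpha=0.05):
            """Historical VaR and ETL at confidence level 1 - alpha."""
            q = np.quantile(returns, alpha)       # alpha-quantile of returns
            var = -q
            etl = -returns[returns <= q].mean()   # mean loss beyond the VaR
            return var, etl

        rng = np.random.default_rng(1)
        daily = rng.normal(0.0004, 0.01, size=1000)   # simulated fund returns
        var, etl = var_etl(daily)
        print(f"95% VaR: {var:.4f}, ETL: {etl:.4f}")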

  1. Software Tool for the Prosthetic Foot Modeling and Stiffness Optimization

    Directory of Open Access Journals (Sweden)

    Matija Štrbac

    2012-01-01

    Full Text Available We present the procedure for the optimization of the stiffness of the prosthetic foot. The procedure allows the selection of the elements of the foot and the materials used for the design. The procedure is based on the optimization where the cost function is the minimization of the difference between the knee joint torques of healthy walking and walking with the transfemoral prosthesis. We present a simulation environment that allows the user to interactively vary the foot geometry and track the changes in the knee torque that arise from these adjustments. The software allows the estimation of the optimal prosthetic foot elasticity and geometry. We show that altering model attributes such as the length of the elastic foot segment or its elasticity leads to significant changes in the estimated knee torque required for a given trajectory.
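
    The optimization loop described in these two records can be sketched as follows. The biomechanical model is replaced here by a stand-in function with two hypothetical foot parameters; in the papers themselves the predicted torque comes from a full simulation of walking with the transfemoral prosthesis.

        # Hedged sketch: pick foot parameters that minimize the gap between
        # a healthy-walking knee-torque profile and a modelled one. The
        # target profile and the model function are stand-ins.

        import numpy as np
        from scipy.optimize import minimize

        t = np.linspace(0, 1, 50)                    # one normalized gait cycle
        healthy_torque = 30 * np.sin(2 * np.pi * t)  # illustrative target

        def predicted_torque(params):
            stiffness, segment_length = params       # hypothetical parameters
            # Stand-in for the biomechanical simulation:
            return stiffness * segment_length * np.sin(2 * np.pi * t)

        def cost(params):
            return np.sum((predicted_torque(params) - healthy_torque) ** 2)

        res = minimize(cost, x0=[100.0, 0.2], method="Nelder-Mead")
        print(res.x, cost(res.x))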

  2. Collaborative Software Development in Support of Fast Adaptive AeroSpace Tools (FAAST)

    Science.gov (United States)

    Kleb, William L.; Nielsen, Eric J.; Gnoffo, Peter A.; Park, Michael A.; Wood, William A.

    2003-01-01

    A collaborative software development approach is described. The software product is an adaptation of proven computational capabilities combined with new capabilities to form the Agency's next generation aerothermodynamic and aerodynamic analysis and design tools. To efficiently produce a cohesive, robust, and extensible software suite, the approach uses agile software development techniques; specifically, project retrospectives, the Scrum status meeting format, and a subset of Extreme Programming's coding practices are employed. Examples are provided which demonstrate the substantial benefits derived from employing these practices. Also included is a discussion of issues encountered when porting legacy Fortran 77 code to Fortran 95 and a Fortran 95 coding standard.

  3. Computer-aided design in power engineering. Application of software tools

    Energy Technology Data Exchange (ETDEWEB)

    Stojkovic, Zlatan

    2012-07-01

    Demonstrates the use of software tools in the practice of design in the field of power systems. Presents many applications of design in the field of power systems. Useful for educational purposes and practical work. This textbook demonstrates the application of software tools in solving a series of problems in the design of power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation. It guides the reader through complex problems, presented on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design. Several example design calculations are carried out using standard software such as MATLAB, EMTP/ATP, Excel and Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation, using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools in project management in power systems is discussed. Here, the emphasis is put on the standard software MS Excel and MS Project.

  4. A free software tool for the development of decision support systems

    Directory of Open Access Journals (Sweden)

    COLONESE, G

    2008-06-01

    Full Text Available This article describes PostGeoOlap, a free, open-source software tool for decision support that integrates OLAP (On-Line Analytical Processing) and GIS (Geographical Information Systems). Besides describing the tool, we show how it can be used to achieve effective and low-cost decision support that is adequate for small and medium companies and for small public offices.

  5. D-VASim: A Software Tool to Simulate and Analyze Genetic Logic Circuits

    DEFF Research Database (Denmark)

    Baig, Hasan; Madsen, Jan

    2016-01-01

    -stage researchers with limited experience in the field of biology. The Solution: Using LabVIEW to develop a user-friendly simulation tool named Dynamic Virtual Analyzer and Simulator (D-VASim), which is the first software tool in the domain of synthetic biology that provides a virtual laboratory environment...

  6. Data Visualization: An Exploratory Study into the Software Tools Used by Businesses

    Science.gov (United States)

    Diamond, Michael; Mattia, Angela

    2015-01-01

    Data visualization is a key component to business and data analytics, allowing analysts in businesses to create tools such as dashboards for business executives. Various software packages allow businesses to create these tools in order to manipulate data for making informed business decisions. The focus is to examine what skills employers are…

  7. Development of tools for safety analysis of control software in advanced reactors

    Energy Technology Data Exchange (ETDEWEB)

    Guarro, S.; Yau, M.; Motamed, M. [Advanced Systems Concepts Associates, El Segundo, CA (United States)

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described.

  8. Development of a software tool for an internal dosimetry using MIRD method

    Science.gov (United States)

    Chaichana, A.; Tocharoenchai, C.

    2016-03-01

    Currently, many software packages for internal radiation dosimetry have been developed. Many of them do not provide sufficient tools to perform all of the necessary steps, from nuclear medicine image analysis to dose calculation. For this reason, we developed CALRADDOSE, a software tool that can perform internal dosimetry using the MIRD method within a single environment. MATLAB version 2015a was used as the development tool. The calculation process of this software proceeds from collecting time-activity data from image data, followed by residence time calculation and absorbed dose calculation using the MIRD method. To evaluate the accuracy of this software, we calculated residence times and absorbed doses of 5 Ga-67 studies and 5 I-131 MIBG studies and then compared the results with those obtained from the OLINDA/EXM software. The results showed no statistically significant differences between the residence times and absorbed doses obtained from the two software packages. The CALRADDOSE software is a user-friendly, graphical user interface-based software for internal dosimetry. It provides fast and accurate results, which may be useful for routine work.
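
    The calculation pipeline the abstract lists (time-activity data, residence time, MIRD absorbed dose) can be sketched briefly. The sample counts and the S-value below are invented for illustration; CALRADDOSE itself is a MATLAB program and may fit different curve models.

        # Hedged sketch of the MIRD chain: fit a mono-exponential to
        # time-activity samples, integrate to get cumulated activity and
        # residence time, then multiply by an S-value for absorbed dose.
        # All numbers below are invented for illustration.

        import numpy as np
        from scipy.optimize import curve_fit

        t = np.array([1.0, 4.0, 24.0, 48.0])    # hours post-injection
        A = np.array([35.0, 30.0, 12.0, 4.5])   # organ activity (MBq)
        A0 = 100.0                               # injected activity (MBq)

        mono_exp = lambda t, a, lam: a * np.exp(-lam * t)
        (a, lam), _ = curve_fit(mono_exp, t, A, p0=(40.0, 0.05))

        cumulated = a / lam                      # MBq*h, integral over 0..inf
        residence_time = cumulated / A0          # hours
        S_VALUE = 3.0e-2                         # hypothetical S-value, mGy/(MBq*h)
        print(f"residence time {residence_time:.1f} h, "
              f"dose {cumulated * S_VALUE:.0f} mGy")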

  9. A Software Tool for the Evaluation of the Behaviour of Bioelectrical Currents

    Directory of Open Access Journals (Sweden)

    Gianluca Fabbri

    2011-06-01

    Full Text Available A software tool has been developed in order to evaluate bioelectrical currents. The tool is able to provide a graphical representation of the behaviour of small currents emitted by characteristic points of the human body and captured through a non invasive probe previously developed. The software implementation combines a variety of graphical techniques to create a powerful system that will enable users to perform an accurate and reliable analysis of the emitted currents and to easily go on to further applications and research. This paper introduces the design and the main characteristics of the tool and shows significant measurement results.

  10. Nmag micromagnetic simulation tool - software engineering lessons learned

    CERN Document Server

    Fangohr, Hans; Franchin, Matteo

    2016-01-01

    We review design decisions and their impact for the open source code Nmag from a software engineering in computational science point of view. Key lessons to learn include that the approach of encapsulating the simulation functionality in a library of a general purpose language, here Python, eliminates the need for configuration files, provides greatest flexibility in using the simulation, allows mixing of multiple simulations, pre- and post-processing in the same (Python) file, and allows to benefit from the rich Python ecosystem of scientific packages. The choice of programming language (OCaml) for the computational core did not resonate with the users of the package (who are not computer scientists) and was suboptimal. The choice of Python for the top-level user interface was very well received by users from the science and engineering community. The from-source installation in which key requirements were compiled from a tarball was remarkably robust. In places, the code is a lot more ambitious than necessa...

  11. Towards a Reference Architecture to Provision Tools as a Service for Global Software Development

    DEFF Research Database (Denmark)

    Chauhan, Aufeef; Babar, Muhammad Ali

    2014-01-01

    Organizations involved in Global Software Development (GSD) face challenges in terms of having access to an appropriate set of tools for performing distributed engineering and development activities, integration between heterogeneous desktop and web-based tools, management of artifacts developed and maintained over distant locations using different kinds of tools, traceability among artifacts, and access to artifacts and data of a sensitive nature. These challenges pose additional constraints on specific projects and reduce the possibility to carry out their engineering and development in globally...

  12. Software Tools for Robust Analysis of High-Dimensional Data

    Directory of Open Access Journals (Sweden)

    Valentin Todorov

    2014-06-01

    Full Text Available The present work discusses robust multivariate methods specifically designed for high dimensions. Their implementation in R is presented and their application is illustrated on examples. The first group are algorithms for outlier detection, already introduced elsewhere and implemented in other packages. The value added of the new package is that all methods follow the same design pattern and thus can use the same graphical and diagnostic tools. The next topic covered is sparse principal components, including an object-oriented interface to the standard method proposed by Zou, Hastie, and Tibshirani (2006) and the robust one proposed by Croux, Filzmoser, and Fritz (2013). Robust partial least squares (see Hubert and Vanden Branden 2003) as well as partial least squares for discriminant analysis conclude the scope of the new package.

  13. DAE Tools: equation-based object-oriented modelling, simulation and optimisation software

    Directory of Open Access Journals (Sweden)

    Dragan D. Nikolić

    2016-04-01

    Full Text Available In this work, DAE Tools modelling, simulation and optimisation software, its programming paradigms and main features are presented. The current approaches to mathematical modelling such as the use of modelling languages and general-purpose programming languages are analysed. The common set of capabilities required by the typical simulation software are discussed, and the shortcomings of the current approaches recognised. A new hybrid approach is introduced, and the modelling languages and the hybrid approach are compared in terms of the grammar, compiler, parser and interpreter requirements, maintainability and portability. The most important characteristics of the new approach are discussed, such as: (1) support for the runtime model generation; (2) support for the runtime simulation set-up; (3) support for complex runtime operating procedures; (4) interoperability with the third party software packages (i.e. NumPy/SciPy); (5) suitability for embedding and use as a web application or software as a service; and (6) code-generation, model exchange and co-simulation capabilities. The benefits of an equation-based approach to modelling, implemented in a fourth generation object-oriented general purpose programming language such as Python are discussed. The architecture and the software implementation details as well as the type of problems that can be solved using DAE Tools software are described. Finally, some applications of the software at different levels of abstraction are presented, and its embedding capabilities and suitability for use as a software as a service is demonstrated.
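
    As a generic illustration of the equation-based, object-oriented style the record describes: a model is declared as a class whose equations are set up and solved at runtime. The sketch below is plain Python/SciPy and deliberately does not use the DAE Tools API; the tank model is an arbitrary example.

        # NOT the DAE Tools API: a generic equation-based model declared as
        # a class, with parameters set at runtime and the simulation
        # generated on the fly, to illustrate the style of workflow.

        import numpy as np
        from scipy.integrate import solve_ivp

        class TankModel:
            """Mass balance for a draining tank: dV/dt = F_in - k*sqrt(V)."""
            def __init__(self, F_in=1.0, k=0.5):
                self.F_in, self.k = F_in, k      # parameters set at runtime

            def equations(self, t, y):
                V = y[0]
                return [self.F_in - self.k * np.sqrt(max(V, 0.0))]

        model = TankModel(F_in=1.2, k=0.4)              # runtime model set-up
        sol = solve_ivp(model.equations, (0.0, 50.0), [0.5], max_step=0.5)
        print(f"volume approaches (F_in/k)**2 = 9: {sol.y[0, -1]:.2f}")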

  14. Teaching structure: student use of software tools for understanding macromolecular structure in an undergraduate biochemistry course.

    Science.gov (United States)

    Jaswal, Sheila S; O'Hara, Patricia B; Williamson, Patrick L; Springer, Amy L

    2013-01-01

    Because understanding the structure of biological macromolecules is critical to understanding their function, students of biochemistry should become familiar not only with viewing, but also with generating and manipulating structural representations. We report a strategy from a one-semester undergraduate biochemistry course to integrate use of structural representation tools into both laboratory and homework activities. First, early in the course we introduce the use of readily available open-source software for visualizing protein structure, coincident with modules on amino acid and peptide bond properties. Second, we use these same software tools in lectures and incorporate images and other structure representations in homework tasks. Third, we require a capstone project in which teams of students examine a protein-nucleic acid complex and then use the software tools to illustrate for their classmates the salient features of the structure, relating how the structure helps explain biological function. To ensure engagement with a range of software and database features, we generated a detailed template file that can be used to explore any structure, and that guides students through specific applications of many of the software tools. In presentations, students demonstrate that they are successfully interpreting structural information, and using representations to illustrate particular points relevant to function. Thus, over the semester students integrate information about structural features of biological macromolecules into the larger discussion of the chemical basis of function. Together these assignments provide an accessible introduction to structural representation tools, allowing students to add these methods to their biochemical toolboxes early in their scientific development.

  15. SaTool - a Software Tool for Structural Analysis of Complex Automation Systems

    DEFF Research Database (Denmark)

    Blanke, Mogens; Lorentzen, Torsten

    2006-01-01

    The paper introduces SaTool, a tool for structural analysis; the use of the Matlab (R)-based implementation is presented and special features are introduced, which were motivated by industrial users. Salient features of the tool are presented, including the ability to specify the behavior of a comple

  16. A Software Tool for Optimal Sizing of PV Systems in Malaysia

    OpenAIRE

    Tamer Khatib; Azah Mohamed; K. Sopian

    2012-01-01

    This paper presents a MATLAB-based user-friendly software tool called PV.MY for optimal sizing of photovoltaic (PV) systems. The software has the capability of predicting meteorological variables such as solar energy, ambient temperature and wind speed using an artificial neural network (ANN); it optimizes the PV module/array tilt angle, optimizes the inverter size and calculates optimal capacities of the PV array, battery, wind turbine and diesel generator in hybrid PV systems. The ANN based mo...

  17. A System Identification Software Tool for General MISO ARX-Type of Model Structures

    OpenAIRE

    Lindskog, Peter

    1996-01-01

    The typical system identification procedure requires powerful and versatile software means. In this paper we describe and exemplify the use of a prototype identification software tool, applicable to the rather broad class of multi-input single-output model structures with regressors that are formed by delayed in- and outputs. Interesting special instances of this model structure category include, e.g., linear ARX and many semi-physical structures, feed-forward neural networks, radial basis fu...
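
    The simplest member of this model class is a first-order SISO ARX structure, which makes the regressor idea concrete: stack delayed outputs and inputs into a regression matrix and solve by least squares. The sketch below simulates data from known coefficients and recovers them; it is illustrative only, as the paper's tool covers far more general structures.

        # Least-squares fit of y(t) = 0.8*y(t-1) + 0.5*u(t-1) + e(t),
        # the simplest ARX instance of the regressor-based model class.

        import numpy as np

        rng = np.random.default_rng(0)
        N = 500
        u = rng.normal(size=N)
        y = np.zeros(N)
        for t in range(1, N):               # simulate the true system
            y[t] = 0.8 * y[t - 1] + 0.5 * u[t - 1] + 0.01 * rng.normal()

        # Regressors formed from delayed outputs and inputs.
        Phi = np.column_stack([y[:-1], u[:-1]])       # [y(t-1), u(t-1)]
        theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
        print("estimated [0.8, 0.5]:", theta)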

  18. MyETL: A Java Software Tool to Extract, Transform, and Load Your Business

    OpenAIRE

    Michele Nuovo

    2015-01-01

    The project follows the development of a Java software tool that extracts data from Flat File (Fixed Length Record Type), CSV (Comma Separated Values), and XLS (Microsoft Excel 97-2003 Worksheet) files, applies transformations to those sources, and finally loads the data into the end target RDBMS. The software implements a process known as ETL (Extract, Transform and Load); such systems are called ETL systems.

  19. MyETL: A Java Software Tool to Extract, Transform, and Load Your Business

    Directory of Open Access Journals (Sweden)

    Michele Nuovo

    2015-12-01

    Full Text Available The project follows the development of a Java software tool that extracts data from Flat File (Fixed Length Record Type), CSV (Comma Separated Values), and XLS (Microsoft Excel 97-2003 Worksheet) files, applies transformations to those sources, and finally loads the data into the end target RDBMS. The software implements a process known as ETL (Extract, Transform and Load); such systems are called ETL systems.
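
    The extract-transform-load flow these two records describe is compact enough to sketch with the Python standard library (the tool itself is written in Java; the file names, table schema and the upper-casing transformation here are invented):

        # Minimal ETL sketch: extract rows from a CSV file, transform them,
        # and load them into a SQLite database. Names are hypothetical.

        import csv
        import sqlite3

        def etl(csv_path, db_path):
            con = sqlite3.connect(db_path)
            con.execute("CREATE TABLE IF NOT EXISTS sales (region TEXT, amount REAL)")
            with open(csv_path, newline="") as fh:
                for row in csv.DictReader(fh):              # extract
                    region = row["region"].strip().upper()  # transform
                    amount = float(row["amount"])
                    con.execute("INSERT INTO sales VALUES (?, ?)", (region, amount))
            con.commit()
            con.close()

        # sales.csv (hypothetical):
        #   region,amount
        #   north,100.5
        etl("sales.csv", "warehouse.db")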

  20. A Reference Architecture for Providing Tools as a Service to Support Global Software Development

    DEFF Research Database (Denmark)

    Chauhan, Aufeef

    2014-01-01

    Global Software Development (GSD) teams encounter challenges that are associated with the distribution of software development activities across multiple geographic regions. The limited support for performing collaborative development and engineering activities and the lack of sufficient support for maintaining and resolving dependencies and traceability across heterogeneous tools are major challenges for GSD teams. The lack of sufficient support for cross-platform tools integration also makes it hard to address the stated challenges using existing paradigms that are based upon desktop and web-based solutions. The restricted ability of the organizations to have the desired alignment of tools with software engineering and development processes results in administrative and managerial overhead that incurs increased development cost and poor product quality. Moreover, stakeholders involved in the projects have...

  1. A software tool for design of process monitoring and analysis systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    A well designed process monitoring and analysis system is necessary to consistently achieve any predefined end product quality. Systematic computer aided methods and tools provide the means to design the necessary process monitoring and analysis systems and/or to validate any existing monitoring and analysis system. A software to achieve this has been developed. Two developed supporting tools for the design, a knowledge base (consisting of the process knowledge as well as the knowledge on measurement methods & tools) and a model library (consisting of the process operational models), have been extended rigorously and integrated with the user interface, which made the software more generic and applicable to a wide range of problems. The software for the design of a process monitoring and analysis system is presented and illustrated with a tablet manufacturing process example.

  2. [Software CMAP TOOLS ™ to build concept maps: an evaluation by nursing students].

    Science.gov (United States)

    Ferreira, Paula Barreto; Cohrs, Cibelli Rizzo; De Domenico, Edvane Birelo Lopes

    2012-08-01

    Concept mapping (CM) is a teaching strategy that can be used to solve clinical cases, but the maps are difficult to write. The objective of this study was to describe the challenges and contributions of the Cmap Tools® software in building concept maps to solve clinical cases. To do this, a descriptive and qualitative method was used with junior nursing students from the Federal University of São Paulo. The teaching strategy was applied and the data were collected using the focus group technique. The results showed that the software facilitates and guarantees the organization, visualization, and correlation of the data, but there are initial difficulties related to handling its tools. In conclusion, the formatting and auto-formatting resources of Cmap Tools® facilitated the construction of concept maps; however, orientation strategies should be implemented for the initial stage of software utilization.

  3. Data Mining for Secure Software Engineering – Source Code Management Tool Case Study

    Directory of Open Access Journals (Sweden)

    A.V.Krishna Prasad,

    2010-07-01

    Full Text Available As data mining for secure software engineering improves software productivity and quality, software engineers are increasingly applying data mining algorithms to various software engineering tasks. However, mining software engineering data poses several challenges, requiring various algorithms to effectively mine sequences, graphs and text from such data. Software engineering data includes code bases, execution traces, historical code changes, mailing lists and bug databases. They contain a wealth of information about a project's status, progress and evolution. Using well established data mining techniques, practitioners and researchers can explore the potential of this valuable data in order to better manage their projects and to produce higher-quality software systems that are delivered on time and within budget. Data mining can be used in gathering and extracting latent security requirements, extracting algorithms and business rules from code, mining legacy applications for requirements and business rules for new projects, etc. Mining algorithms for software engineering fall into four main categories: frequent pattern mining – finding commonly occurring patterns; pattern matching – finding data instances for given patterns; clustering – grouping data into clusters; and classification – predicting labels of data based on already labeled data. In this paper, we discuss an overview of strategies for data mining for secure software engineering, with the implementation of a case study of text mining for a source code management tool.

  4. Towards early software reliability prediction for computer forensic tools (case study).

    Science.gov (United States)

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component based system. It is used, for instance, to analyze the reliability of the state machines of real time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
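
    As a rough illustration of the kind of computation such an approach builds on, the sketch below estimates the reliability of a component-based tool from a discrete-time Markov chain, in the style of classic architecture-based (Cheung-type) models. All component reliabilities and transfer probabilities are hypothetical; the paper's actual method additionally incorporates COSMIC-FFP functional size.

        import numpy as np

        # Per-component reliabilities R_i and control-transfer probabilities
        # P[i][j] between components (all numbers hypothetical).
        R = np.array([0.99, 0.97, 0.98])
        P = np.array([[0.0, 0.6, 0.4],
                      [0.0, 0.0, 1.0],
                      [0.0, 0.0, 0.0]])   # component 2 exits to "success"

        # Q[i][j]: execute component i correctly, then transfer to j.
        Q = np.diag(R) @ P
        N = np.linalg.inv(np.eye(len(R)) - Q)   # expected visits per component
        exit_ok = R * (1.0 - P.sum(axis=1))     # execute correctly, then exit
        print(f"system reliability ~ {N[0] @ exit_ok:.4f}")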

  5. Managing the Testing Process Practical Tools and Techniques for Managing Hardware and Software Testing

    CERN Document Server

    Black, Rex

    2011-01-01

    New edition of one of the most influential books on managing software and hardware testing In this new edition of his top-selling book, Rex Black walks you through the steps necessary to manage rigorous testing programs of hardware and software. The preeminent expert in his field, Mr. Black draws upon years of experience as president of both the International and American Software Testing Qualifications boards to offer this extensive resource of all the standards, methods, and tools you'll need. The book covers core testing concepts and thoroughly examines the best test management practices

  6. A Reference Architecture for Providing Tools as a Service to Support Global Software Development

    DEFF Research Database (Denmark)

    Chauhan, Aufeef

    2014-01-01

    Global Software Development (GSD) teams encounter challenges that are associated with the distribution of software development activities across multiple geographic regions. The limited support for performing collaborative development and engineering activities and the lack of sufficient support for ... impose specific constraints regarding availability and deployment of the tools. The artifacts and data produced or consumed by the tools need to be governed according to these constraints and corresponding quality of service (QoS) parameters. In this paper, we present a research agenda for leveraging the cloud-computing paradigm to address the above-mentioned issues by providing a framework to select appropriate tools as well as associated services, and a reference architecture of the cloud-enabled middleware platform that allows on-demand provisioning of software engineering Tools as a Service (TaaS) with focus...

  7. Activity Theory applied to Global Software Engineering: Theoretical Foundations and Implications for Tool Builders

    DEFF Research Database (Denmark)

    Tell, Paolo; Ali Babar, Muhammad

    2012-01-01

    Although a plethora of tools is available for Global Software Engineering (GSE) teams, it is being realized increasingly that the desktop metaphor underpinning the majority of these tools has several inherent limitations. We have proposed that Activity-Based Computing (ABC) can be a promising alternative for building tools for GSE. However, significant effort is required to introduce a new paradigm; there is a need for a sound theoretical foundation based on activity theory to address the challenges faced by tools in GSE. This paper reports our effort aimed at building such theoretical foundations ... in building supporting infrastructure for GSE, and describes a proof-of-concept prototype.

  8. "Blogs" and "wikis" are valuable software tools for communication within research groups.

    Science.gov (United States)

    Sauer, Igor M; Bialek, Dominik; Efimova, Ekaterina; Schwartlander, Ruth; Pless, Gesine; Neuhaus, Peter

    2005-01-01

    Appropriate software tools may improve communication and ease access to knowledge for research groups. A weblog is a website which contains periodic, chronologically ordered posts on a common webpage, whereas a wiki is hypertext-based collaborative software that enables documents to be authored collectively using a web browser. Although not primarily intended for use as an intranet-based collaborative knowledge warehouse, both blogs and wikis have the potential to offer all the features of complex and expensive IT solutions. These tools enable the team members to share knowledge simply and quickly-the collective knowledge base of the group can be efficiently managed and navigated.

  9. Proceedings of the Workshop on software tools for distributed intelligent control systems

    Energy Technology Data Exchange (ETDEWEB)

    Herget, C.J. (ed.)

    1990-09-01

    The Workshop on Software Tools for Distributed Intelligent Control Systems was organized by Lawrence Livermore National Laboratory for the United States Army Headquarters Training and Doctrine Command and the Defense Advanced Research Projects Agency. The goals of the workshop were to identify the current state of the art in tools which support control systems engineering design and implementation; identify research issues associated with writing software tools which would provide a design environment to assist engineers in multidisciplinary control design and implementation; formulate a potential investment strategy to resolve the research issues and develop public domain code which can form the core of more powerful engineering design tools; and recommend test cases to focus the software development process and test associated performance metrics. Recognizing that the development of software tools for distributed intelligent control systems will require a multidisciplinary effort, experts in systems engineering, control systems engineering, and computer science were invited to participate in the workshop. In particular, experts who could address the following topics were selected: operating systems, engineering data representation and manipulation, emerging standards for manufacturing data, mathematical foundations, coupling of symbolic and numerical computation, user interface, system identification, system representation at different levels of abstraction, system specification, system design, verification and validation, automatic code generation, and integration of modular, reusable code.

  10. Review of free software tools for image analysis of fluorescence cell micrographs.

    Science.gov (United States)

    Wiesmann, V; Franz, D; Held, C; Münzenmayer, C; Palmisano, R; Wittenberg, T

    2015-01-01

    An increasing number of free software tools have been made available for the evaluation of fluorescence cell micrographs. The main users are biologists and related life scientists with no or little knowledge of image processing. In this review, we give an overview of available tools and guidelines about which tools the users should use to segment fluorescence micrographs. We selected 15 free tools and divided them into stand-alone, Matlab-based, ImageJ-based, free demo versions of commercial tools and data sharing tools. The review consists of two parts: First, we developed a criteria catalogue and rated the tools regarding structural requirements, functionality (flexibility, segmentation and image processing filters) and usability (documentation, data management, usability and visualization). Second, we performed an image processing case study with four representative fluorescence micrograph segmentation tasks with figure-ground and cell separation. The tools display a wide range of functionality and usability. In the image processing case study, we were able to perform figure-ground separation in all micrographs using mainly thresholding. Cell separation was not possible with most of the tools, because cell separation methods are provided only by a subset of the tools and are difficult to parametrize and to use. Most important is that the usability matches the functionality of a tool. To be usable, specialized tools with less functionality need to fulfill less usability criteria, whereas multipurpose tools need a well-structured menu and intuitive graphical user interface.
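
    As an illustration of the figure-ground step that the review found achievable in all tools with simple thresholding, here is a minimal sketch using scikit-image. The synthetic random image is a stand-in for a fluorescence micrograph; any single-channel intensity array would do.

        import numpy as np
        from skimage import filters, measure

        # Stand-in for a fluorescence micrograph (single-channel intensities).
        image = np.random.default_rng(0).poisson(5.0, (256, 256)).astype(float)

        threshold = filters.threshold_otsu(image)   # automatic global threshold
        foreground = image > threshold              # binary figure-ground mask
        labels = measure.label(foreground)          # label connected regions
        print(f"threshold={threshold:.2f}, objects={labels.max()}")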

  11. A case study on the comparison of different software tools for automated quantification of peptides.

    Science.gov (United States)

    Colaert, Niklaas; Vandekerckhove, Joël; Martens, Lennart; Gevaert, Kris

    2011-01-01

    MS-driven proteomics has evolved over the past two decades into a high-tech and high-impact research field. Two distinct factors clearly influenced its expansion: the rapid growth of an arsenal of instruments and proteomic techniques that led to an explosion of high quality data, and the development of software tools to analyze and interpret these data, which boosted the number of scientific discoveries. In analogy with the benchmarking of new instruments and proteomic techniques, such software tools must be thoroughly tested and analyzed. Recently, new tools were developed for automatic peptide quantification in quantitative proteomic experiments. Here we present a case study where the most recent and frequently used tools are analyzed and compared.

  12. Evaluation of computer-aided software engineering tools for data base development

    Energy Technology Data Exchange (ETDEWEB)

    Woyna, M.A.; Carlson, C.R.

    1989-02-01

    More than 80 computer-aided software engineering (CASE) tools were evaluated to determine their usefulness in data base development projects. The goal was to review the current state of the CASE industry and recommend one or more tools for inclusion in the uniform development environment (UDE), a programming environment being designed by Argonne National Laboratory for the US Department of Defense Organization of the Joint Chiefs of Staff, J-8 Directorate. This environment gives a computer programmer a consistent user interface and access to a full suite of tools and utilities for software development. In an effort to identify tools that would be useful in the planning, analysis, design, implementation, and maintenance of Argonne's data base development projects for the J-8 Directorate, we evaluated 83 commercially available CASE products. This report outlines the method used and presents the results of the evaluation.

  13. An Open-Source Tool Set Enabling Analog-Digital-Software Co-Design

    Directory of Open Access Journals (Sweden)

    Michelle Collins

    2016-02-01

    Full Text Available This paper presents an analog-digital hardware-software co-design environment for simulating and programming reconfigurable systems. The tool simulates and designs, as well as enables experimental measurements after compiling to configurable systems, in the same integrated design tool framework. High-level software in Scilab/Xcos (open-source programs similar to MATLAB/Simulink) converts the high-level block description by the user to blif format (sci2blif), which acts as an input to the modified VPR tool, including the code vpr2swcs, encoding the specific platform through specific architecture files, resulting in a targetable switch list on the resulting configurable analog-digital system. The resulting tool uses an analog and mixed-signal library of components, enabling users and future researchers access to the basic analog operations/computations that are possible.

  14. Contingency Contractor Optimization Phase 3 Sustainment Software Design Document - Contingency Contractor Optimization Tool - Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa; Jones, Katherine A

    2016-05-01

    This document describes the final software design of the Contingency Contractor Optimization Tool - Prototype. Its purpose is to provide the overall architecture of the software and the logic behind this architecture. Documentation for the individual classes is provided in the application Javadoc. The Contingency Contractor Optimization project is intended to address Department of Defense mandates by delivering a centralized strategic planning tool that allows senior decision makers to quickly and accurately assess the impacts, risks, and mitigation strategies associated with utilizing contract support. The Contingency Contractor Optimization Tool - Prototype was developed in Phase 3 of the OSD ATL Contingency Contractor Optimization project to support strategic planning for contingency contractors. The planning tool uses a model to optimize the Total Force mix by minimizing the combined total costs for selected mission scenarios. The model optimizes the match of personnel types (military, DoD civilian, and contractors) and capabilities to meet mission requirements as effectively as possible, based on risk, cost, and other requirements.

  15. Review of Ground Systems Development and Operations (GSDO) Tools for Verifying Command and Control Software

    Science.gov (United States)

    Aguilar, Michael L.; Bonanne, Kevin H.; Favretto, Jeffrey A.; Jackson, Maddalena M.; Jones, Stephanie L.; Mackey, Ryan M.; Sarrel, Marc A.; Simpson, Kimberly A.

    2014-01-01

    The Exploration Systems Development (ESD) Standing Review Board (SRB) requested the NASA Engineering and Safety Center (NESC) conduct an independent review of the plan developed by Ground Systems Development and Operations (GSDO) for identifying models and emulators to create a tool(s) to verify their command and control software. The NESC was requested to identify any issues or weaknesses in the GSDO plan. This document contains the outcome of the NESC review.

  16. A multicenter study benchmarks software tools for label-free proteome quantification.

    Science.gov (United States)

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
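
    To show what the core of such a benchmark computes, the sketch below derives the two headline metrics of a hybrid-proteome design with a known spike-in ratio: bias (accuracy) and spread (precision) of the measured ratios. All values are simulated stand-ins; LFQbench itself is an R package with far richer reporting.

        import numpy as np

        # Hybrid-proteome benchmark: one species spiked at a known 2:1 ratio
        # between samples A and B (expected log2 ratio = 1.0). Simulated data.
        rng = np.random.default_rng(1)
        expected = 1.0
        measured = expected + rng.normal(0.1, 0.3, 500)  # per-protein log2 ratios

        bias = measured.mean() - expected      # accuracy: systematic deviation
        spread = measured.std(ddof=1)          # precision: dispersion of ratios
        print(f"bias={bias:+.3f} log2 units, spread={spread:.3f}")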

  17. D-VASim: A Software Tool to Simulate and Analyze Genetic Logic Circuits

    DEFF Research Database (Denmark)

    Baig, Hasan; Madsen, Jan

    2016-01-01

    The Challenge: Creating a software tool for the simulation and analysis of genetic logic circuits to help researchers perform wet lab experiments virtually, because the manual process of wet lab experimentation with genetic logic circuits is time consuming and a challenging task for early ... to perform run-time interactive simulation and analysis of genetic logic circuits.

  18. Experience with case tools in the design of process-oriented software

    CERN Document Server

    Novakov, O

    1993-01-01

    In accelerator systems such as the CERN PS complex, process equipment has a lifetime which may exceed the typical life cycle of its related software. Taking into account the variety of such equipment, it is important to keep the analysis and design of the software in a system-independent form. This paper discusses the experience gathered in using commercial CASE tools for analysis, design and reverse engineering of different process-oriented software modules, with a principal emphasis on maintaining the initial analysis in a standardized form. Such tools have existed for several years, but this paper shows that they are not entirely adapted to our needs. In particular, the paper stresses the problems of integrating such a tool in an existing database-dependent development chain, the lack of real-time simulation tools and of object-oriented concepts in existing commercial packages. Finally, the paper attempts to show a broader view of software engineering needs in our particular context.

  19. Using a Self-Administered Visual Basic Software Tool To Teach Psychological Concepts.

    Science.gov (United States)

    Strang, Harold R.; Sullivan, Amie K.; Schoeny, Zahrl G.

    2002-01-01

    Introduces LearningLinks, a Visual Basic software tool that allows teachers to create individualized learning modules that use constructivist and behavioral learning principles. Describes field testing of undergraduates at the University of Virginia that tested a module designed to improve understanding of the psychological concepts of…

  20. Wiki as a Corporate Learning Tool: Case Study for Software Development Company

    Science.gov (United States)

    Milovanovic, Milos; Minovic, Miroslav; Stavljanin, Velimir; Savkovic, Marko; Starcevic, Dusan

    2012-01-01

    In our study, we attempted to further investigate how Web 2.0 technologies influence workplace learning. Our particular interest was on using Wiki as a tool for corporate exchange of knowledge with the focus on informal learning. In this study, we collaborated with a multinational software development company that uses Wiki as a corporate tool…

  1. SDMdata: A Web-Based Software Tool for Collecting Species Occurrence Records.

    Directory of Open Access Journals (Sweden)

    Xiaoquan Kong

    Full Text Available It is important to easily and efficiently obtain high quality species distribution data for predicting the potential distribution of species using species distribution models (SDMs). There is a need for a powerful software tool to automatically or semi-automatically assist in identifying and correcting errors. Here, we use Python to develop a web-based software tool (SDMdata) to easily collect occurrence data from the Global Biodiversity Information Facility (GBIF) and check species names and the accuracy of coordinates (latitude and longitude). It is open source software (GNU Affero General Public License/AGPL licensed), allowing anyone to access and manipulate the source code. SDMdata is available online free of charge from .
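
    The following is a minimal sketch of the kind of coordinate checking SDMdata automates (the records are hypothetical; the real tool also validates species names against GBIF and flags further error types).

        # Hypothetical occurrence records (species, latitude, longitude).
        records = [
            {"species": "Panthera uncia", "lat": 35.2, "lon": 76.9},
            {"species": "Panthera uncia", "lat": 135.2, "lon": 76.9},
            {"species": "Panthera uncia", "lat": 0.0, "lon": 0.0},
        ]

        def coordinate_errors(rec):
            """Collect basic plausibility errors for one occurrence record."""
            errors = []
            if not -90 <= rec["lat"] <= 90:
                errors.append("latitude out of range")
            if not -180 <= rec["lon"] <= 180:
                errors.append("longitude out of range")
            if rec["lat"] == 0 and rec["lon"] == 0:
                errors.append("(0, 0) placeholder coordinates")
            return errors

        for rec in records:
            print(rec["lat"], rec["lon"], coordinate_errors(rec) or "ok")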

  2. Astrophysics datamining in the classroom: Exploring real data with new software tools and robotic telescopes

    CERN Document Server

    Doran, Rosa; Boudier, Thomas; Delva, Pacôme; Ferlet, Roger; Almeida, Maria L T; Barbosa, Domingos; Gomez, Edward; Pennypacker, Carl; Roche, Paul; Roberts, Sarah

    2012-01-01

    Within the efforts to bring frontline interactive astrophysics and astronomy to the classroom, the Hands-On Universe (HOU) project developed a set of exercises and a platform using real data obtained by some of the most advanced ground and space observatories. The backbone of this endeavour is a new free software web tool - Such a Lovely Software for Astronomy based on ImageJ (SalsaJ). It is student-friendly, developed specifically for the HOU project, and targets middle and high schools. It allows students to display, analyze, and explore professionally obtained astronomical images, while learning concepts on gravitational dynamics, kinematics, nuclear fusion and electromagnetism. The continuously evolving set of exercises and tutorials is being completed with real (professionally obtained) data to download and detailed tutorials. The flexibility of the SalsaJ platform tool enables students and teachers to extend the exercises with their own observations. The software developed for the HOU program has been designed to...

  3. Programming heterogeneous MPSoCs tool flows to close the software productivity gap

    CERN Document Server

    Castrillón Mazo, Jerónimo

    2014-01-01

    This book provides embedded software developers with techniques for programming heterogeneous Multi-Processor Systems-on-Chip (MPSoCs), capable of executing multiple applications simultaneously. It describes a set of algorithms and methodologies to narrow the software productivity gap, as well as an in-depth description of the underlying problems and challenges of today’s programming practices. The authors present four different tool flows: a parallelism extraction flow for applications written using the C programming language, a mapping and scheduling flow for parallel applications, a special mapping flow for baseband applications in the context of Software Defined Radio (SDR), and a final flow for analyzing multiple applications at design time. The tool flows are evaluated on Virtual Platforms (VPs), which mimic different characteristics of state-of-the-art heterogeneous MPSoCs.   • Provides a novel set of algorithms and methodologies for programming heterogeneous Multi-Processor Systems-on-Chip (MPSoCs)...

  4. Software Construction and Composition Tools for Petascale Computing SCW0837 Progress Report

    Energy Technology Data Exchange (ETDEWEB)

    Epperly, T W; Hochstein, L

    2011-09-12

    The majority of scientific software is distributed as source code. As the number of library dependencies and supported platforms increases, so does the complexity of describing the rules for configuring and building software. In this project, we have performed an empirical study of the magnitude of the build problem by examining the development history of two DOE-funded scientific software projects. We have developed MixDown, a meta-build tool, to simplify the task of building applications that depend on multiple third-party libraries. The results of this research indicate that the effort that scientific programmers spend takes a significant fraction of the total development effort and that the use of MixDown can significantly simplify the task of building software with multiple dependencies.

  5. Assess/Mitigate Risk through the Use of Computer-Aided Software Engineering (CASE) Tools

    Science.gov (United States)

    Aguilar, Michael L.

    2013-01-01

    The NASA Engineering and Safety Center (NESC) was requested to perform an independent assessment of the mitigation of the Constellation Program (CxP) Risk 4421 through the use of computer-aided software engineering (CASE) tools. With the cancellation of the CxP, the assessment goals were modified to capture lessons learned and best practices in the use of CASE tools. The assessment goal was to prepare the next program for the use of these CASE tools. The outcome of the assessment is contained in this document.

  6. Life Cycle Assessment Studies of Chemical and Biochemical Processes through the new LCSoft Software-tool

    DEFF Research Database (Denmark)

    Supawanich, Perapong; Malakul, Pomthong; Gani, Rafiqul

    2015-01-01

    requirements have to be evaluated together with environmental and economic aspects. The LCSoft software-tool has been developed to perform LCA as a stand-alone tool as well as integrated with other process design tools such as process simulation, economic analysis (ECON), and sustainable process design (SustainPro). An extended version of LCSoft is presented in this paper. The development work consists of four main tasks. The first task consists of the Life Cycle Inventory (LCI) calculation function. The second task deals with the extension of the Life Cycle Inventory database and improvement of the Life...

  7. New software tools for enhanced precision in robot-assisted laser phonomicrosurgery.

    Science.gov (United States)

    Dagnino, Giulio; Mattos, Leonardo S; Caldwell, Darwin G

    2012-01-01

    This paper describes a new software package created to enhance precision during robot-assisted laser phonomicrosurgery procedures. The new software is composed of three tools for camera calibration, automatic tumor segmentation, and laser tracking. These were designed and developed to improve the outcome of this demanding microsurgical technique, and were tested herein to produce quantitative performance data. The experimental setup was based on the motorized laser micromanipulator created by Istituto Italiano di Tecnologia and the experimental protocols followed are fully described in this paper. The results show the new tools are robust and effective: The camera calibration tool reduced residual errors (RMSE) to 0.009 ± 0.002 mm under 40× microscope magnification; the automatic tumor segmentation tool resulted in deep lesion segmentations comparable to manual segmentations (RMSE= 0.160 ± 0.028 mm under 40× magnification); and the laser tracker tool proved to be reliable even during cutting procedures (RMSE= 0.073 ± 0.023 mm under 40× magnification). These results demonstrate the new software package can provide excellent improvements to the previous microsurgical system, leading to important enhancements in surgical outcome.

  8. IPAT: a freely accessible software tool for analyzing multiple patent documents with inbuilt landscape visualizer.

    Science.gov (United States)

    Ajay, Dara; Gangwal, Rahul P; Sangamwar, Abhay T

    2015-01-01

    Intelligent Patent Analysis Tool (IPAT) is an online data retrieval tool, operated based on a text mining algorithm, that extracts specific patent information in a predetermined pattern into an Excel sheet. The software is designed and developed to retrieve and analyze technology information from multiple patent documents and generate various patent landscape graphs and charts. The software is coded in C# in Visual Studio 2010; it extracts publicly available patent information from web pages like Google Patents and simultaneously studies the various technology trends based on user-defined parameters. In other words, IPAT combined with manual categorization will act as an excellent technology assessment tool in competitive intelligence and due diligence for predicting the future R&D forecast.

  9. A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.

    Science.gov (United States)

    Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy

    2016-12-01

    Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements were identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.

  10. Pathway Tools version 19.0 update: software for pathway/genome informatics and systems biology.

    Science.gov (United States)

    Karp, Peter D; Latendresse, Mario; Paley, Suzanne M; Krummenacker, Markus; Ong, Quang D; Billington, Richard; Kothari, Anamika; Weaver, Daniel; Lee, Thomas; Subhraveti, Pallavi; Spaulding, Aaron; Fulcher, Carol; Keseler, Ingrid M; Caspi, Ron

    2016-09-01

    Pathway Tools is a bioinformatics software environment with a broad set of capabilities. The software provides genome-informatics tools such as a genome browser, sequence alignments, a genome-variant analyzer and comparative-genomics operations. It offers metabolic-informatics tools, such as metabolic reconstruction, quantitative metabolic modeling, prediction of reaction atom mappings and metabolic route search. Pathway Tools also provides regulatory-informatics tools, such as the ability to represent and visualize a wide range of regulatory interactions. This article outlines the advances in Pathway Tools in the past 5 years. Major additions include components for metabolic modeling, metabolic route search, computation of atom mappings and estimation of compound Gibbs free energies of formation; addition of editors for signaling pathways, for genome sequences and for cellular architecture; storage of gene essentiality data and phenotype data; display of multiple alignments, and of signaling and electron-transport pathways; and development of Python and web-services application programming interfaces. Scientists around the world have created more than 9800 Pathway/Genome Databases by using Pathway Tools, many of which are curated databases for important model organisms.

  11. SU-E-T-27: A Tool for Routine Quality Assurance of Radiotherapy Dose Calculation Software

    Energy Technology Data Exchange (ETDEWEB)

    Popple, R; Cardan, R; Duan, J; Wu, X; Shen, S; Brezovich, I [The University of Alabama at Birmingham, Birmingham, AL (United States)

    2014-06-01

    Purpose: Dose calculation software is thoroughly evaluated when it is commissioned; however, evaluation of periodic software updates is typically limited in scope due to staffing constraints and the need to quickly return the treatment planning system to clinical service. We developed a tool for quickly and comprehensively testing and documenting dose calculation software against measured data. Methods: A tool was developed using MatLab (The MathWorks, Natick, MA) for evaluation of dose calculation algorithms against measured data. Inputs to the tool are measured data, reference DICOM RT PLAN files describing the measurements, and dose calculations in DICOM format. The tool consists of a collection of extensible modules that can perform analysis of point dose, depth dose curves, and profiles using dose difference, distance-to-agreement, and the gamma-index. Each module generates a report subsection that is incorporated into a master template, which is converted to final form in portable document format (PDF). Results: After each change to the treatment planning system, a report can be generated in approximately 90 minutes. The tool has been in use for more than 5 years, spanning 5 versions of the eMC and 4 versions of the AAA. We have detected changes to the algorithms that affected clinical practice once during this period. Conclusion: Our tool provides an efficient method for quality assurance of dose calculation software, providing a complete set of tests for an update. Future work includes the addition of plan level tests, allowing incorporation of, for example, the TG-119 test suite for IMRT, and integration with the treatment planning system via an application programming interface. Integration with the planning system will permit fully-automated testing and reporting at scheduled intervals.
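
    For readers unfamiliar with the gamma-index mentioned above, the sketch below evaluates it for two 1-D depth-dose curves under a 3%/3 mm criterion. The profiles are synthetic; a clinical QA tool such as the one described operates on measured and DICOM-imported 3-D dose grids.

        import numpy as np

        def gamma_1d(pos_ref, dose_ref, pos_eval, dose_eval, dd=0.03, dta=3.0):
            """Global gamma: minimum over evaluated points of the combined
            dose-difference / distance-to-agreement metric."""
            gammas = []
            for r, d in zip(pos_ref, dose_ref):
                g = np.sqrt(((dose_eval - d) / (dd * dose_ref.max())) ** 2
                            + ((pos_eval - r) / dta) ** 2)
                gammas.append(g.min())
            return np.array(gammas)

        x = np.linspace(0, 100, 201)                 # positions in mm
        ref = np.exp(-((x - 50) / 20) ** 2)          # synthetic measurement
        ev = 1.01 * np.exp(-((x - 50.5) / 20) ** 2)  # slightly shifted calc
        g = gamma_1d(x, ref, x, ev)
        print(f"gamma pass rate: {100 * (g <= 1).mean():.1f}%")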

  12. Cerec Smile Design--a software tool for the enhancement of restorations in the esthetic zone.

    Science.gov (United States)

    Kurbad, Andreas; Kurbad, Susanne

    2013-01-01

    Restorations in the esthetic zone can now be enhanced using software tools. In addition to the design of the restoration itself, a part or all of the patient's face can be displayed on the monitor to increase the predictability of treatment results. Using the Smile Design components of the Cerec and inLab software, a digital photograph of the patient can be projected onto a three-dimensional dummy head. In addition to its use for the enhancement of the CAD process, this technology can also be utilized for marketing purposes.

  13. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Science.gov (United States)

    Pakarinen, Jyri

    2010-12-01

    Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
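
    As a flavor of what such a distortion analysis involves, the sketch below estimates total harmonic distortion (THD) for a memoryless soft-clipping nonlinearity driven by a sine wave. The tanh "amplifier" is a hypothetical stand-in for a tube-amp model; the paper's tool implements a broader set of nonlinear analyses.

        import numpy as np

        fs, f0, n = 48_000, 1_000, 48_000        # 1 Hz per FFT bin
        t = np.arange(n) / fs
        x = np.sin(2 * np.pi * f0 * t)
        y = np.tanh(3.0 * x)                     # soft-clipping nonlinearity

        spectrum = np.abs(np.fft.rfft(y * np.hanning(n)))
        fundamental = spectrum[f0]
        harmonics = spectrum[[2 * f0, 3 * f0, 4 * f0, 5 * f0]]
        thd = np.sqrt((harmonics ** 2).sum()) / fundamental
        print(f"THD: {100 * thd:.2f}%")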

  14. A Tool for Testing of Inheritance Related Bugs in Object Oriented Software

    Directory of Open Access Journals (Sweden)

    B. G. Geetha

    2008-01-01

    Full Text Available Object-oriented software development differs from traditional development. In object-oriented software, polymorphism, inheritance and dynamic binding are the important features, and the inheritance property is the main one. Compilers usually detect only syntax-oriented errors, so some property-related errors may remain in the product. Data flow testing is an appropriate method for testing program features: this testing analyzes the structure of the software and gives the flow of a property. This study is designed to detect the hidden errors with reference to the inheritance property. Inputs of the tool are sets of classes and packages. Outputs of the tool are the hierarchies of classes, methods and attributes, along with a set of inheritance-related bugs, such as naked access and spaghetti inheritance, which are automatically detected by the tool. The tool is developed as three major modules: code analysis, knowledge base preparation and bugs analysis. The code analysis module is designed to parse and extract details from the code. The knowledge base preparation module is designed to prepare the knowledge base about the program details. The bugs analysis module is designed to extract bug-related information from the database. It is a static testing approach. This study focuses on Java programs. A toy sketch of the hierarchy-extraction step such a tool performs is shown below.
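
    The sketch scans Java class declarations and measures inheritance depth, a simple signal for "spaghetti inheritance". The sources are hypothetical strings; a production tool would use a real Java parser rather than a regular expression.

        import re

        # Hypothetical Java sources; a real tool would read files from a package.
        sources = [
            "public class Account { protected double balance; }",
            "public class Savings extends Account { }",
            "public class Premium extends Savings { }",
        ]

        decl = re.compile(r"class\s+(\w+)(?:\s+extends\s+(\w+))?")
        parents = {}
        for src in sources:
            m = decl.search(src)
            if m:
                parents[m.group(1)] = m.group(2)   # None for root classes

        def depth(cls):
            """Inheritance depth; long chains hint at spaghetti inheritance."""
            parent = parents.get(cls)
            return 0 if parent is None else 1 + depth(parent)

        for cls in parents:
            print(cls, "depth", depth(cls))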

  15. The Formalism and Language Tools for Semantics Specification of Software Libraries

    Directory of Open Access Journals (Sweden)

    V. M. Itsykson

    2016-01-01

    Full Text Available The paper is dedicated to the specification of the structure and the behaviour of software libraries. It describes the existing problems of library specifications. A brief overview of the research field concerned with formalizing the specification of libraries and library functions is presented. The requirements imposed on the designed formalism are established; the formalism based on these requirements allows specifying all the properties of the libraries needed for automation of several classes of problems: defect detection in software, migration of applications into a new environment, and generation of software documentation. The requirements on the language tools based on the developed formalism are proposed. The conclusion defines potential directions for further research.

  16. PyElph - a software tool for gel images analysis and phylogenetics

    Directory of Open Access Journals (Sweden)

    Pavel Ana Brânduşa

    2012-01-01

    Full Text Available Abstract Background This paper presents PyElph, a software tool which automatically extracts data from gel images, computes the molecular weights of the analyzed molecules or fragments, compares DNA patterns which result from experiments with molecular genetic markers and, also, generates phylogenetic trees computed by five clustering methods, using the information extracted from the analyzed gel image. The software can be successfully used for population genetics, phylogenetics, taxonomic studies and other applications which require gel image analysis. Researchers and students working in molecular biology and genetics would benefit greatly from the proposed software because it is free, open source, easy to use, has a friendly Graphical User Interface and does not depend on specific image acquisition devices like other commercial programs with similar functionalities do. Results PyElph software tool is entirely implemented in Python which is a very popular programming language among the bioinformatics community. It provides a very friendly Graphical User Interface which was designed in six steps that gradually lead to the results. The user is guided through the following steps: image loading and preparation, lane detection, band detection, molecular weights computation based on a molecular weight marker, band matching and finally, the computation and visualization of phylogenetic trees. A strong point of the software is the visualization component for the processed data. The Graphical User Interface provides operations for image manipulation and highlights lanes, bands and band matching in the analyzed gel image. All the data and images generated in each step can be saved. The software has been tested on several DNA patterns obtained from experiments with different genetic markers. Examples of genetic markers which can be analyzed using PyElph are RFLP (Restriction Fragment Length Polymorphism), AFLP (Amplified Fragment Length Polymorphism), RAPD...
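
    The molecular-weight computation step such a tool performs from a marker lane typically follows a log-linear migration model; a minimal sketch with invented ladder sizes and migration distances is given below.

        import numpy as np

        ladder_bp = np.array([1000, 750, 500, 250, 100])      # marker sizes
        ladder_mm = np.array([12.0, 16.5, 22.0, 30.5, 40.0])  # migration (mm)

        # Migration is roughly linear in log(size): log10(bp) = a*mm + b.
        a, b = np.polyfit(ladder_mm, np.log10(ladder_bp), 1)

        sample_mm = np.array([18.3, 27.9])                    # unknown bands
        sample_bp = 10 ** (a * sample_mm + b)
        print([f"{bp:.0f} bp" for bp in sample_bp])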

  17. PC Software graphics tool for conceptual design of space/planetary electrical power systems

    Science.gov (United States)

    Truong, Long V.

    1995-01-01

    This paper describes the Decision Support System (DSS), a personal computer software graphics tool for designing conceptual space and/or planetary electrical power systems. By using the DSS, users can obtain desirable system design and operating parameters, such as system weight, electrical distribution efficiency, and bus power. With this tool, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. The DSS is a user-friendly, menu-driven tool with online help and a custom graphical user interface. An example design and results are illustrated for a typical space power system with multiple types of power sources, frequencies, energy storage systems, and loads.

  18. SOFTWARE TOOL FOR LASER CUTTING PROCESS CONTROL – SOLVING REAL INDUSTRIAL CASE STUDIES

    Directory of Open Access Journals (Sweden)

    Miloš Madić

    2016-08-01

    Full Text Available Laser cutting is one of the leading non-conventional machining technologies with a wide spectrum of application in modern industry. In order to exploit the number of advantages that this technology offers for contour cutting of materials, it is necessary to carefully select laser cutting conditions for each given workpiece material, thickness and desired cut quality. In other words, there is a need for process control of laser cutting. After a comprehensive analysis of the main laser cutting parameters and process performance characteristics, the application of the developed software tool "BRUTOMIZER" for off-line control of the CO2 laser cutting process of three different workpiece materials (mild steel, stainless steel and aluminum) is illustrated. Advantages and abilities of the developed software tool are also illustrated.

  19. RAVEN AS A TOOL FOR DYNAMIC PROBABILISTIC RISK ASSESSMENT: SOFTWARE OVERVIEW

    Energy Technology Data Exchange (ETDEWEB)

    Alfonsi Andrea; Mandelli Diego; Rabiti Cristian; Joshua Cogliati; Robert Kinoshita

    2013-05-01

    RAVEN is a software tool under development at the Idaho National Laboratory (INL) that acts as the control logic driver and post-processing tool for the newly developed thermo-hydraulic code RELAP-7. The scope of this paper is to show the software structure of RAVEN and its utilization in connection with RELAP-7. A short overview of the mathematical framework behind the code is presented along with its main capabilities such as on-line controlling/monitoring and Monte Carlo sampling. A demo of a Station Black Out PRA analysis of a simplified Pressurized Water Reactor (PWR) model is shown in order to demonstrate the Monte Carlo and clustering capabilities.

  20. Development of computer-aided software engineering tool for sequential control of JT-60U

    Energy Technology Data Exchange (ETDEWEB)

    Shimono, M.; Akasaka, H.; Kurihara, K.; Kimura, T. [Japan Atomic Energy Research Inst., Naka, Ibaraki (Japan). Naka Fusion Research Establishment

    1995-12-31

    Discharge sequential control (DSC) is an essential control function for the intermittent and pulsed discharge operation of a tokamak device, so that many subsystems may work with each other in correct order and/or synchronously. In the development of the DSC program, block diagrams of the logical operations for sequential control are first drawn in the design. Then, the logical operators and I/Os which are involved in the block diagrams are compiled and converted to a certain particular form. Since the block diagrams of the sequential control amount to about 50 sheets in the case of JT-60 upgrade tokamak (JT-60U) high power discharges, and since the above steps of the development have so far been performed manually, a great effort has been required for the program development. In order to remove inefficiency from such development processes, a computer-aided software engineering (CASE) tool has been developed on a UNIX workstation. This paper reports how the authors designed it for the development of the sequential control programs. The tool is composed of the following three tools: (1) automatic drawing tool, (2) editing tool, and (3) trace tool. This CASE tool, an object-oriented programming tool having graphical formalism, can powerfully accelerate the development cycle for the sequential control function commonly associated with pulse discharge in a tokamak fusion device.

  1. Variation of densitometry on computed tomography in COPD--influence of different software tools.

    Directory of Open Access Journals (Sweden)

    Mark O Wielpütz

    Full Text Available Quantitative multidetector computed tomography (MDCT) as a potential biomarker is increasingly used for severity assessment of emphysema in chronic obstructive pulmonary disease (COPD). The aim of this study was to evaluate the user-independent measurement variability between five different fully-automatic densitometry software tools. MDCT and full-body plethysmography, including forced expiratory volume in 1 s and total lung capacity, were available for 49 patients with advanced COPD (age = 64±9 years, forced expiratory volume in 1 s = 31±6% predicted). Measurement variation regarding lung volume, emphysema volume, emphysema index, and mean lung density was evaluated for two scientific and three commercially available lung densitometry software tools designed to analyze MDCT from different scanner types. One scientific tool and one commercial tool failed to process most or all datasets, respectively, and were excluded. One scientific and another commercial tool analyzed 49, the remaining commercial tool 30 datasets. Lung volume, emphysema volume, emphysema index and mean lung density were significantly different amongst these three tools (p<0.001). Limits of agreement for lung volume were [-0.195, -0.052 l], [-0.305, -0.131 l], and [-0.123, -0.052 l] with correlation coefficients of r = 1.00 each. Limits of agreement for emphysema index were [-6.2, 2.9%], [-27.0, 16.9%], and [-25.5, 18.8%], with r = 0.79 to 0.98. Correlation of lung volume with total lung capacity was good to excellent (r = 0.77 to 0.91, p<0.001), but segmented lung volumes (6.7±1.3 to 6.8±1.3 l) were significantly lower than total lung capacity (7.7±1.7 l, p<0.001). Technical incompatibilities hindered evaluation of two of five tools. The remaining three showed significant measurement variation for emphysema, hampering quantitative MDCT as a biomarker in COPD. Follow-up studies should currently use identical software, and standardization efforts should encompass software as...
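
    For orientation, the densitometry indices whose between-tool variation the study measures are simple summaries of the segmented-lung voxel values; a sketch with simulated voxels follows (the -950 HU emphysema threshold is standard practice, everything else is hypothetical).

        import numpy as np

        rng = np.random.default_rng(2)
        lung_hu = rng.normal(-870, 60, 1_000_000)   # simulated lung voxels (HU)
        voxel_ml = 0.8 / 1000.0                     # hypothetical voxel volume

        lung_volume_l = lung_hu.size * voxel_ml / 1000.0
        emphysema_index = 100.0 * (lung_hu < -950).mean()   # % voxels < -950 HU
        mean_lung_density = lung_hu.mean()
        print(f"volume={lung_volume_l:.2f} l, EI={emphysema_index:.1f}%, "
              f"MLD={mean_lung_density:.0f} HU")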

  2. Featureous: A Tool for Feature-Centric Analysis of Java Software

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    Feature-centric comprehension of source code is necessary for incorporating user-requested modifications during software evolution and maintenance. However, such comprehension is difficult to achieve in the case of large object-oriented programs due to the size, complexity, and implicit character of mappings between features and source code. To support programmers in overcoming these difficulties, we present a feature-centric analysis tool, Featureous. Our tool extends the NetBeans IDE with mechanisms for efficient location of feature implementations in legacy source code, and an extensive analysis...

  3. Software Tools for Emittance Measurement and Matching for 12 GeV CEBAF

    Energy Technology Data Exchange (ETDEWEB)

    Turner, Dennis L. [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States)

    2016-05-01

    This paper discusses model-driven setup of the Continuous Electron Beam Accelerator Facility (CEBAF) for the 12 GeV era, focusing on qsUtility. qsUtility is a set of software tools created to perform emittance measurements, analyze those measurements, and compute optics corrections based upon them. qsUtility was developed as a toolset to help reduce machine configuration time and improve reproducibility by way of an accurate accelerator model, and to provide Operations staff with tools to measure and correct machine optics with little or no assistance from optics experts.

  4. A SOFTWARE TOOL TO COMPARE MEASURED AND SIMULATED BUILDING ENERGY PERFORMANCE DATA

    Energy Technology Data Exchange (ETDEWEB)

    Maile, Tobias; Bazjanac, Vladimir; O'Donnell, James; Garr, Matthew

    2011-11-01

    Building energy performance is often inadequate when compared to design goals. To link design goals to actual operation one can compare measured with simulated energy performance data. Our previously developed comparison approach is the Energy Performance Comparison Methodology (EPCM), which enables the identification of performance problems based on a comparison of measured and simulated performance data. In context of this method, we developed a software tool that provides graphing and data processing capabilities of the two performance data sets. The software tool called SEE IT (Stanford Energy Efficiency Information Tool) eliminates the need for manual generation of data plots and data reformatting. SEE IT makes the generation of time series, scatter and carpet plots independent of the source of data (measured or simulated) and provides a valuable tool for comparing measurements with simulation results. SEE IT also allows assigning data points on a predefined building object hierarchy and supports different versions of simulated performance data. This paper briefly introduces the EPCM, describes the SEE IT tool and illustrates its use in the context of a building case study.
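
    As one example of the numeric side of such a measured-versus-simulated comparison, the sketch below computes two widely used calibration statistics for energy series, NMBE and CV(RMSE). The hourly series are synthetic; SEE IT itself focuses on plotting and data handling rather than these specific statistics.

        import numpy as np

        rng = np.random.default_rng(3)
        measured = 50 + 10 * np.sin(np.linspace(0, 12 * np.pi, 720))  # hourly kWh
        simulated = 1.05 * measured + rng.normal(0, 2, measured.size)  # biased model

        residuals = simulated - measured
        nmbe = 100 * residuals.sum() / (measured.size * measured.mean())
        cv_rmse = 100 * np.sqrt((residuals ** 2).mean()) / measured.mean()
        print(f"NMBE={nmbe:+.1f}%  CV(RMSE)={cv_rmse:.1f}%")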

  5. A Novel Software Tool to Generate Customer Needs for Effective Design of Online Shopping Websites

    Directory of Open Access Journals (Sweden)

    Ashish K. Sharma

    2016-03-01

    Full Text Available Effective design of online shopping websites is the need of the hour, as design plays a crucial role in the success of online shopping businesses. Recently, the use of Quality Function Deployment (QFD) has been reported for the design of online shopping websites. QFD is a customer-driven process that encompasses voluminous data gathered from customers through several techniques like personal interviews, focus groups, surveys, etc. This massive, unsorted and unstructured data must be transformed into a limited amount of structured information representing the actual Customer Needs (CNs), which are then utilized in subsequent stages of the QFD process. This can be achieved through brainstorming using techniques like the Affinity Process. However, integrating the Affinity Process within QFD is tedious and time consuming and cannot be dealt with manually. This generates a pressing need for a software tool to serve the purpose. Moreover, the research carried out so far has focused on QFD application after the generation of CNs, and the available QFD software packages lack the option to generate CNs from collected data. Thus, the paper aims to develop a novel software tool that integrates the Affinity Process with QFD to generate customers' needs for effective design of online shopping websites. The software system is developed using Visual Basic .NET (VB.NET) and integrates an MS Access database.

  6. Users' manual for the Hydroecological Integrity Assessment Process software (including the New Jersey Assessment Tools)

    Science.gov (United States)

    Henriksen, James A.; Heasley, John; Kennen, Jonathan G.; Nieswand, Steven

    2006-01-01

    This manual is a user’s guide to four computer software tools that have been developed for the Hydroecological Integrity Assessment Process. The Hydroecological Integrity Assessment Process recognizes that streamflow is strongly related to many critical physiochemical components of rivers, such as dissolved oxygen, channel geomorphology, and water temperature, and can be considered a “master variable” that limits the disturbance, abundance, and diversity of many aquatic plant and animal species.

  7. statnet: Software Tools for the Representation, Visualization, Analysis and Simulation of Network Data

    Directory of Open Access Journals (Sweden)

    Mark S. Handcock

    2007-12-01

    Full Text Available statnet is a suite of software packages for statistical network analysis. The packages implement recent advances in network modeling based on exponential-family random graph models (ERGM. The components of the package provide a comprehensive framework for ERGM-based network modeling, including tools for model estimation, model evaluation, model-based network simulation, and network visualization. This broad functionality is powered by a central Markov chain Monte Carlo (MCMC algorithm. The coding is optimized for speed and robustness.
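
    For reference, the ERGM family at the heart of statnet assigns each graph y a probability of the standard exponential-family form (this is textbook ERGM background, not quoted from the record above):

        P_\theta(Y = y) = \frac{\exp\{\theta^{\top} g(y)\}}{\kappa(\theta)}

    where g(y) is a vector of network statistics (edge counts, triangles, degree terms, and so on), \theta the parameter vector estimated via MCMC, and \kappa(\theta) the normalizing constant summing over all possible graphs.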

  8. A Multiscale Software Tool for Field/Circuit Co-Simulation

    Science.gov (United States)

    2011-12-15

    This is the final report for contract W911NF-09-C-0159 (topic #A08-T004), under which a new multiscale field/circuit solver was developed by combining three efficient ... As an example, an active microwave amplifier circuit fed through a lumped port at the right end of a microstrip line is simulated, and its S-parameters, S11 and S21, are shown.

  9. phenoVein - A software tool for leaf vein segmentation and analysis

    OpenAIRE

    Bühler, Jonas; Rishmawi, Louai; Pflugfelder, Daniel; Huber, Gregor; Scharr, Hanno; Hülskamp, Martin; Koornneef, Maarten; Schurr, Ulrich; Jahnke, Siegfried

    2015-01-01

    phenoVein is a software tool dedicated to automated segmentation and analysis of leaf vein images. It includes comfortable manual correction features. Advanced image filtering automatically emphasizes veins against the background and compensates for local brightness inhomogeneities. The phenotypical leaf vein traits calculated are total vein density, vein lengths and widths, and skeleton graph statistics. For the determination of vein widths, a model-based vein edge estimation approach has been implemented...

  10. A software tool for determination of breast cancer treatment methods using data mining approach.

    Science.gov (United States)

    Cakır, Abdülkadir; Demirel, Burçin

    2011-12-01

    In this work, breast cancer treatment methods are determined using data mining. For this purpose, software was developed to help the oncology doctor suggest treatment methods for breast cancer patients. Data from 462 breast cancer patients, obtained from Ankara Oncology Hospital, are used to determine treatment methods for new patients. This dataset is processed with the Weka data mining tool. Classification algorithms are applied one by one to this dataset and the results are compared to find the proper treatment method. The developed software program, called "Treatment Assistant", uses different algorithms (IB1, Multilayer Perceptron and Decision Table) to find out which one gives the better result for each attribute to predict, using a Java NetBeans interface. Treatment methods are determined for the post-surgical treatment of breast cancer patients using this developed software tool. At the modeling step of the data mining process, different Weka algorithms are used for the output attributes. For the hormonotherapy output IB1, for the tamoxifen and radiotherapy outputs Multilayer Perceptron, and for the chemotherapy output the Decision Table algorithm shows the best accuracy performance compared to the others. In conclusion, this work shows that the data mining approach can be a useful tool for medical applications, particularly at the treatment decision step. Data mining helps the doctor to decide in a short time.
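
    A rough analogue of this per-output model comparison can be sketched outside Weka; below, scikit-learn stand-ins are used (1-nearest-neighbour for IB1, MLPClassifier for Multilayer Perceptron, and, loosely, a decision tree in place of Decision Table) on synthetic data, since the Ankara patient dataset is not public.

        from sklearn.datasets import make_classification
        from sklearn.model_selection import cross_val_score
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.neural_network import MLPClassifier
        from sklearn.tree import DecisionTreeClassifier

        # Synthetic stand-in for the 462-patient dataset.
        X, y = make_classification(n_samples=462, n_features=20, random_state=0)

        models = {
            "IB1 (1-NN)": KNeighborsClassifier(n_neighbors=1),
            "Multilayer Perceptron": MLPClassifier(max_iter=2000, random_state=0),
            "Decision tree (~Decision Table)": DecisionTreeClassifier(random_state=0),
        }
        for name, model in models.items():
            scores = cross_val_score(model, X, y, cv=10)
            print(f"{name}: mean accuracy {scores.mean():.3f}")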

  11. Ignominy: a Tool for Software Dependency and Metric Analysis with Examples from Large HEP Packages

    Institute of Scientific and Technical Information of China (English)

    Lassi A. Tuura; Lucas Taylor

    2001-01-01

    Ignominy is a tool developed in the CMS IGUANA project to analyse the structure of software systems. Its primary component is a dependency scanner that distills information into human-usable forms. It also includes several tools to visualise the collected data in the form of graphical views and numerical metrics. Ignominy was designed to adapt to almost any reasonable structure, and it has been used to analyse several large projects. The original purpose of Ignominy was to help us better ensure the quality of our own software, and in particular to warn us about possible structural problems early on. As a part of this activity it is now used as a standard part of our release procedure. We also use it to evaluate and study the quality of external packages we plan to make use of. We describe what Ignominy can find out, and how it can be used to visualise and assess a software structure. We also discuss the inherent problems of the analysis as well as the different approaches to modularity the tool makes quite evident. The focus is the illustration of these issues through the analysis results for several sizable HEP software projects.

  12. User Driven Development of Software Tools for Open Data Discovery and Exploration

    Science.gov (United States)

    Schlobinski, Sascha; Keppel, Frank; Dihe, Pascal; Boot, Gerben; Falkenroth, Esa

    2016-04-01

    The use of open data in research faces challenges that are not restricted to inherent properties such as the quality and resolution of open data sets. Open data is often catalogued insufficiently or in a fragmented way. Software tools that support effective discovery, including assessment of a data set's appropriateness for research, have shortcomings such as the lack of essential functionalities like support for data provenance. We believe that one of the reasons is the neglect of real end-user requirements in the development process of such software tools. In the context of the FP7 Switch-On project we have pro-actively engaged the relevant user community to collaboratively develop a means to publish, find and bind open data relevant for hydrologic research. Implementing key concepts of data discovery and exploration, we have used state-of-the-art web technologies to provide an interactive software tool that is easy to use yet powerful enough to satisfy the data discovery and access requirements of the hydrological research community.

  13. GOAL: A software tool for assessing biological significance of genes groups

    Directory of Open Access Journals (Sweden)

    Famili Fazel

    2010-05-01

    Full Text Available Abstract Background Modern high-throughput experimental techniques such as DNA microarrays often result in large lists of genes. Computational biology tools such as clustering are then used to group together genes based on their similarity in expression profiles. Genes in each group are probably functionally related. The functional relevance among the genes in each group is usually characterized by utilizing available biological knowledge in public databases such as Gene Ontology (GO), KEGG pathways, associations between a transcription factor (TF) and its target genes, and/or gene networks. Results We developed GOAL: Gene Ontology AnaLyzer, a software tool specifically designed for the functional evaluation of gene groups. GOAL implements and supports efficient and statistically rigorous functional interpretations of gene groups through its integration with available GO, TF-gene association data, and association with KEGG pathways. In order to facilitate more specific functional characterization of a gene group, we implement three GO-tree search strategies rather than one as in most existing GO analysis tools. Furthermore, GOAL offers flexibility in deployment. It can be used as a standalone tool, a plug-in to other computational biology tools, or a web server application. Conclusion We developed a functional evaluation software tool, GOAL, to perform functional characterization of a gene group. GOAL offers three GO-tree search strategies and combines its strength in function integration, portability and visualization, and its flexibility in deployment. Furthermore, GOAL can be used to evaluate and compare gene groups as the output from computational biology tools such as clustering algorithms.
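
    The abstract does not spell out GOAL's statistics; the workhorse of most GO enrichment tools, however, is the hypergeometric test, sketched below with invented counts.

        # Hypergeometric enrichment test, the statistic underlying most GO analysis tools.
        # The counts below are illustrative placeholders, not data from the GOAL paper.
        from scipy.stats import hypergeom

        M = 20000   # genes in the background (annotated genome)
        n = 150     # background genes annotated with the GO term of interest
        N = 300     # genes in the group being evaluated (e.g. one cluster)
        k = 12      # group genes annotated with the GO term

        # P(X >= k): chance of seeing at least k annotated genes in the group by luck
        p_value = hypergeom.sf(k - 1, M, n, N)
        print(f"enrichment p-value: {p_value:.3e}")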

  14. The anatomy of E-Learning tools: Does software usability influence learning outcomes?

    Science.gov (United States)

    Van Nuland, Sonya E; Rogers, Kem A

    2016-07-08

    Reductions in laboratory hours have increased the popularity of commercial anatomy e-learning tools. It is critical to understand how the functionality of such tools can influence the mental effort required during the learning process, also known as cognitive load. Using dual-task methodology, two anatomical e-learning tools were examined to determine the effect of their design on cognitive load during two joint learning exercises. A.D.A.M. Interactive Anatomy is a simplistic, two-dimensional tool that presents like a textbook, whereas Netter's 3D Interactive Anatomy has a more complex three-dimensional interface that allows structures to be rotated. It was hypothesized that longer reaction times on an observation task would be associated with the more complex anatomical software (Netter's 3D Interactive Anatomy), indicating a higher cognitive load imposed by the anatomy software, which would result in lower post-test scores. Undergraduate anatomy students from Western University, Canada (n = 70) were assessed using a baseline knowledge test, Stroop observation task response times (a measure of cognitive load), mental rotation test scores, and an anatomy post-test. Results showed that reaction times and post-test outcomes were similar for both tools, whereas mental rotation test scores were positively correlated with post-test values when students used Netter's 3D Interactive Anatomy (P = 0.007), but not when they used A.D.A.M. Interactive Anatomy. This suggests that a simple e-learning tool, such as A.D.A.M. Interactive Anatomy, is as effective as more complicated tools, such as Netter's 3D Interactive Anatomy, and does not academically disadvantage those with poor spatial ability. Anat Sci Educ 9: 378-390. © 2015 American Association of Anatomists.

  15. Development of a User Interface for a Regression Analysis Software Tool

    Science.gov (United States)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  16. BYMUR software: a free and open source tool for quantifying and visualizing multi-risk analyses

    Science.gov (United States)

    Tonini, Roberto; Selva, Jacopo

    2013-04-01

    The BYMUR software aims to provide an easy-to-use open source tool for both computing multi-risk and managing/visualizing/comparing all the inputs (e.g. hazard, fragilities and exposure) as well as the corresponding results (e.g. risk curves, risk indexes). For all inputs, a complete management of inter-model epistemic uncertainty is considered. The BYMUR software will be one of the final products provided by the homonymous ByMuR project (http://bymur.bo.ingv.it/) funded by the Italian Ministry of Education, Universities and Research (MIUR), focused on (i) providing a quantitative and objective general method for a comprehensive long-term multi-risk analysis in a given area, accounting for inter-model epistemic uncertainty through Bayesian methodologies, and (ii) applying the methodology to seismic, volcanic and tsunami risks in Naples (Italy). More specifically, the BYMUR software will be able to separately account for the probabilistic hazard assessment of different kinds of hazardous phenomena, the relative (time-dependent/independent) vulnerabilities and exposure data, and their possible (predefined) interactions: the software will analyze these inputs and will use them to estimate both single- and multi-risk associated with a specific target area. In addition, it will be possible to connect the software to further tools (e.g., a full hazard analysis), allowing a dynamic I/O of results. The use of the Python programming language guarantees that the final software will be open source and platform independent. Moreover, thanks to the integration of some of the most popular and feature-rich Python scientific modules (Numpy, Matplotlib, Scipy) with the wxPython graphical user toolkit, the final tool will be equipped with a comprehensive Graphical User Interface (GUI) able to control and visualize (in the form of tables, maps and/or plots) any stage of the multi-risk analysis. The additional features of importing/exporting data in MySQL databases and/or standard XML formats (for
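
    The abstract leaves the risk computation abstract; as a generic illustration only (not BYMUR's actual algorithm), single-hazard expected loss is often assembled from a hazard probability vector, a fragility vector and an exposure value, roughly as below; all numbers are invented.

        # Generic single-hazard risk assembly (illustrative; not BYMUR's actual algorithm).
        # Expected loss = sum over intensity levels of
        #   P(intensity) * P(damage | intensity) * exposed value.  All numbers invented.
        import numpy as np

        p_intensity = np.array([0.10, 0.05, 0.01])   # probabilities per intensity level
        p_damage    = np.array([0.02, 0.20, 0.70])   # fragility: damage probability per level
        exposure    = 1.0e8                          # exposed value in the target area

        expected_loss = np.sum(p_intensity * p_damage) * exposure
        print(f"expected annual loss: {expected_loss:.3e}")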

  17. Robust optimal design of experiments for model discrimination using an interactive software tool.

    Directory of Open Access Journals (Sweden)

    Johannes Stegmaier

    Full Text Available Mathematical modeling of biochemical processes significantly contributes to a better understanding of biological functionality and underlying dynamic mechanisms. To support time-consuming and costly lab experiments, kinetic reaction equations can be formulated as a set of ordinary differential equations, which in turn allows one to simulate and compare hypothetical models in silico. To identify new experimental designs that are able to discriminate between investigated models, the approach used in this work solves a semi-infinite constrained nonlinear optimization problem using derivative-based numerical algorithms. The method takes parameter variabilities into account, such that new experimental designs are robust against parameter changes while maintaining the optimal potential to discriminate between hypothetical models. In this contribution we present a newly developed software tool that offers a convenient graphical user interface for model discrimination. We demonstrate the beneficial operation of the discrimination approach and the usefulness of the software tool by analyzing a realistic benchmark experiment from the literature. New robust optimal designs that allow discrimination between the investigated model hypotheses of the benchmark experiment are successfully calculated and yield promising results. The involved robustification approach provides maximally discriminating experiments for the worst parameter configurations, which can be used to estimate the meaningfulness of upcoming experiments. A major benefit of the graphical user interface is the ability to interactively investigate the model behavior and the clear arrangement of numerous variables. In addition to a brief theoretical overview of the discrimination method and the functionality of the software tool, the importance of robustness of experimental designs against parameter variability is demonstrated on a biochemical benchmark problem. The software is licensed under the GNU
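
    As a toy illustration of in-silico model discrimination (not the paper's robust semi-infinite optimization method), the sketch below simulates two rival kinetic hypotheses with scipy and scores candidate designs by how far their predictions diverge; the models and numbers are invented.

        # Toy model discrimination (illustrative; not the paper's robust design method).
        # Two rival decay kinetics are simulated and a design (initial concentration)
        # is scored by the squared divergence of the predictions. Numbers are invented.
        import numpy as np
        from scipy.integrate import solve_ivp

        def model_a(t, y):  # hypothesis A: first-order decay
            return [-0.5 * y[0]]

        def model_b(t, y):  # hypothesis B: second-order decay
            return [-0.3 * y[0] ** 2]

        t_eval = np.linspace(0.0, 10.0, 50)
        for y0 in (0.5, 1.0, 2.0):                    # candidate experimental designs
            ya = solve_ivp(model_a, (0, 10), [y0], t_eval=t_eval).y[0]
            yb = solve_ivp(model_b, (0, 10), [y0], t_eval=t_eval).y[0]
            score = np.sum((ya - yb) ** 2)            # larger = easier to discriminate
            print(f"y0={y0}: discrimination score {score:.3f}")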

  18. Software tools of the Computis European project to process mass spectrometry images.

    Science.gov (United States)

    Robbe, Marie-France; Both, Jean-Pierre; Prideaux, Brendan; Klinkert, Ivo; Picaud, Vincent; Schramm, Thorsten; Hester, Alfons; Guevara, Victor; Stoeckli, Markus; Roempp, Andreas; Heeren, Ron M A; Spengler, Bernhard; Gala, Olivier; Haan, Serge

    2014-01-01

    Among the needs usually expressed by teams using mass spectrometry imaging, one that often arises is that for user-friendly software able to manage huge data volumes quickly and to provide efficient assistance for the interpretation of data. To answer this need, the Computis European project developed several complementary software tools to process mass spectrometry imaging data. Data Cube Explorer provides simple spatial and spectral exploration for matrix-assisted laser desorption/ionisation-time of flight (MALDI-ToF) and time of flight-secondary-ion mass spectrometry (ToF-SIMS) data. SpectViewer offers visualisation functions, assistance for the interpretation of data, classification functionalities, peak-list extraction to interrogate biological databases, and image overlay, and it can process data issued from MALDI-ToF, ToF-SIMS and desorption electrospray ionisation (DESI) equipment. EasyReg2D is able to register two images, in American Standard Code for Information Interchange (ASCII) format, issued from different technologies. The collaboration between the teams was hampered by the multiplicity of equipment and data formats, so the project also developed a common data format (imzML) to facilitate the exchange of experimental data and their interpretation by the different software tools. The BioMap platform for visualisation and exploration of MALDI-ToF and DESI images was adapted to parse imzML files, enabling its access to all project partners and, more globally, to a larger community of users. Considering the huge advantages brought by the imzML standard format, a specific editor (vBrowser) for imzML files and converters from proprietary formats to imzML were developed to enable the use of the imzML format by a broad scientific community. This initiative paves the way toward the development of a large panel of software tools able to process mass spectrometry imaging datasets in the future.
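
    imzML is an open format, and third-party readers exist outside the Computis tools; assuming the pyimzML Python package, reading an imzML file looks roughly like this (the file name is a placeholder).

        # Reading an imzML mass spectrometry imaging file with the third-party
        # pyimzML package (not a Computis tool). File name is a placeholder.
        from pyimzml.ImzMLParser import ImzMLParser

        parser = ImzMLParser("experiment.imzML")
        for idx, (x, y, z) in enumerate(parser.coordinates):
            mzs, intensities = parser.getspectrum(idx)   # one spectrum per pixel
            if idx == 0:
                print(f"pixel ({x},{y}): {len(mzs)} m/z values, "
                      f"max intensity {max(intensities):.1f}")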

  19. Integration of life cycle assessment software with tools for economic and sustainability analyses and process simulation for sustainable process design

    DEFF Research Database (Denmark)

    Kalakul, Sawitree; Malakul, Pomthong; Siemanond, Kitipat

    2014-01-01

    The sustainable future of the world challenges engineers to develop chemical process designs that are not only technically and economically feasible but also environmentally friendly. Life cycle assessment (LCA) is a tool for identifying and quantifying environmental impacts of the chemical product.... Although there are several commercial LCA tools, there is still a need for simple LCA software that can be integrated with process design tools. In this paper, a new LCA software, LCSoft, is developed for evaluation of chemical, petrochemical, and biochemical processes with options for integration...... with other process design tools such as sustainable design (SustainPro), economic analysis (ECON) and process simulation. The software framework contains four main tools: Tool-I is for life cycle inventory (LCI) knowledge management that enables easy maintenance and future expansion of the LCI database; Tool...

  20. Utilization of Software Tools for Uncertainty Calculation in Measurement Science Education

    Science.gov (United States)

    Zangl, Hubert; Zine-Zine, Mariam; Hoermaier, Klaus

    2015-02-01

    Despite its importance, uncertainty is often neglected by practitioners in the design of systems, even in safety-critical applications. Thus, problems arising from uncertainty may only be identified late in the design process and then lead to additional costs. Although numerous tools exist to support uncertainty calculation, reasons for their limited usage in early design phases may be low awareness of the tools' existence and insufficient training in their practical application. We present a teaching philosophy that addresses uncertainty from the very beginning of teaching measurement science, in particular with respect to the utilization of software tools. The developed teaching material is based on the GUM method and makes use of uncertainty toolboxes in the simulation environment. Based on examples in measurement science education, we discuss advantages and disadvantages of the proposed teaching philosophy and include feedback from students.
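
    For a flavor of such toolboxes, the third-party Python uncertainties package propagates GUM-style standard uncertainties through ordinary arithmetic; the measurement values below are invented.

        # GUM-style uncertainty propagation with the third-party "uncertainties"
        # package (one of many toolboxes; values below are invented).
        from uncertainties import ufloat

        voltage = ufloat(12.00, 0.05)    # 12.00 V with 0.05 V standard uncertainty
        current = ufloat(1.50, 0.02)     # 1.50 A with 0.02 A standard uncertainty

        resistance = voltage / current   # uncertainty propagates automatically
        print(resistance)                # nominal value +/- combined standard uncertainty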

  1. SOFI Simulation Tool: A Software Package for Simulating and Testing Super-Resolution Optical Fluctuation Imaging.

    Science.gov (United States)

    Girsault, Arik; Lukes, Tomas; Sharipov, Azat; Geissbuehler, Stefan; Leutenegger, Marcel; Vandenberg, Wim; Dedecker, Peter; Hofkens, Johan; Lasser, Theo

    2016-01-01

    Super-resolution optical fluctuation imaging (SOFI) allows one to perform sub-diffraction fluorescence microscopy of living cells. By analyzing the acquired image sequence with an advanced correlation method, i.e. a high-order cross-cumulant analysis, super-resolution in all three spatial dimensions can be achieved. Here we introduce a software tool for a simple qualitative comparison of SOFI images under simulated conditions considering parameters of the microscope setup and essential properties of the biological sample. This tool incorporates SOFI and STORM algorithms, displays and describes the SOFI image processing steps in a tutorial-like fashion. Fast testing of various parameters simplifies the parameter optimization prior to experimental work. The performance of the simulation tool is demonstrated by comparing simulated results with experimentally acquired data.
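
    At its simplest, the second-order SOFI auto-cumulant at zero time lag is just the temporal variance of each pixel's fluctuation; a minimal numpy sketch on a synthetic image stack (not the published tool's code):

        # Second-order SOFI auto-cumulant (zero time lag) = temporal variance per pixel.
        # A synthetic random stack stands in for an acquired image sequence.
        import numpy as np

        rng = np.random.default_rng(0)
        stack = rng.poisson(lam=50, size=(500, 64, 64)).astype(float)  # frames, y, x

        fluctuation = stack - stack.mean(axis=0)     # remove the per-pixel mean
        sofi2 = np.mean(fluctuation ** 2, axis=0)    # second-order cumulant image
        print(sofi2.shape, sofi2.max())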

  2. TESPI (Tool for Environmental Sound Product Innovation): a simplified software tool to support environmentally conscious design in SMEs

    Science.gov (United States)

    Misceo, Monica; Buonamici, Roberto; Buttol, Patrizia; Naldesi, Luciano; Grimaldi, Filomena; Rinaldi, Caterina

    2004-12-01

    TESPI (Tool for Environmental Sound Product Innovation) is the prototype of a software tool developed within the framework of the "eLCA" project. The project (www.elca.enea.it), financed by the European Commission, is realising "On-line green tools and services for Small and Medium sized Enterprises (SMEs)". The implementation by SMEs of environmental product innovation (as fostered by the European Integrated Product Policy, IPP) needs specific adaptation to their economic model, their knowledge of production and management processes and their relationships with innovation and the environment. In particular, quality and costs are the main driving forces of innovation in European SMEs, and well-known barriers exist to the adoption of an environmental approach in product design. Starting from these considerations, the TESPI tool has been developed to support the first steps of product design, taking into account both quality and the environment. Two main issues have been considered: (i) classic Quality Function Deployment (QFD) can hardly be proposed to SMEs; (ii) the environmental aspects of the product life cycle need to be integrated with the quality approach. TESPI is a user-friendly web-based tool, has a training approach and applies to modular products. Users are guided through the investigation of the quality aspects of their product (fulfilment of customer's needs and requirements) and the identification of the key environmental aspects in the product's life cycle. A simplified checklist allows the environmental performance of the product to be analyzed. Help is available for a better understanding of the analysis criteria. As a result, the significant aspects for the redesign of the product are identified.

  3. Creating a strategic plan for configuration management using computer aided software engineering (CASE) tools

    Energy Technology Data Exchange (ETDEWEB)

    Smith, P.R.; Sarfaty, R.

    1993-05-01

    This paper provides guidance in the definition, documentation, measurement, enhancement of processes, and validation of a strategic plan for configuration management (CM). The approach and methodology used in establishing a strategic plan is the same for any enterprise, including the Department of Energy (DOE), commercial nuclear plants, the Department of Defense (DOD), or large industrial complexes. The principles and techniques presented are used worldwide by some of the largest corporations. The authors used industry knowledge and the areas of their current employment to illustrate and provide examples. Developing a strategic configuration and information management plan for DOE Idaho Field Office (DOE-ID) facilities is discussed in this paper. A good knowledge of CM principles is the key to successful strategic planning. This paper will describe and define CM elements, and discuss how CM integrates the facility's physical configuration, design basis, and documentation. The strategic plan does not need the support of a computer aided software engineering (CASE) tool. However, the use of the CASE tool provides a methodology for consistency in approach, graphics, and database capability combined to form an encyclopedia and a method of presentation that is easily understood and aids the process of reengineering. CASE tools have much more capability than those stated above. Some examples are supporting a joint application development group (JAD) to prepare a software functional specification document and, if necessary, providing the capability to automatically generate software application code. This paper briefly discusses characteristics and capabilities of two CASE tools that use different methodologies to generate similar deliverables.

  4. Emerging role of bioinformatics tools and software in evolution of clinical research

    Directory of Open Access Journals (Sweden)

    Supreet Kaur Gill

    2016-01-01

    Full Text Available Clinical research is making tireless efforts for the promotion and wellbeing of the health status of the people. There is a rapid increase in the number and severity of diseases like cancer, hepatitis, HIV etc., resulting in high morbidity and mortality. Clinical research involves drug discovery and development, whereas clinical trials are performed to establish the safety and efficacy of drugs. Drug discovery is a long process starting with target identification, validation and lead optimization. This is followed by preclinical trials, intensive clinical trials and eventually post-marketing vigilance for drug safety. Software and bioinformatics tools play a great role not only in drug discovery but also in drug development. This involves the use of informatics in the development of new knowledge pertaining to health and disease, data management during clinical trials, and the use of clinical data for secondary research. In addition, new technologies like molecular docking, molecular dynamics simulation, proteomics and quantitative structure-activity relationships in clinical research result in a faster and easier drug discovery process. During preclinical trials, software is used for randomization to remove bias and to plan the study design. In clinical trials, software like electronic data capture, remote data capture and electronic case report forms (eCRF) is used to store the data. eClinical and Oracle Clinical are software used for clinical data management and for statistical analysis of the data. After the drug is marketed, the safety of a drug can be monitored by drug safety software like Oracle Argus or ARISg. Therefore, software is used from the very early stages of drug designing, to drug development, clinical trials and during pharmacovigilance. This review describes different aspects related to the application of computers and bioinformatics in drug designing, discovery and development, formulation designing and clinical research.

  5. DVS-SOFTWARE: An Effective Tool for Applying Highly Parallelized Hardware To Computational Geophysics

    Science.gov (United States)

    Herrera, I.; Herrera, G. S.

    2015-12-01

    Most geophysical systems are macroscopic physical systems. The behavior prediction of such systems is carried out by means of computational models whose basic building blocks are partial differential equations (PDEs) [1]. Due to the enormous size of the discretized version of such PDEs it is necessary to apply highly parallelized supercomputers. For them, at present, the most efficient software is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of the present state-of-the-art techniques is due to the kind of discretizations used in them. Recently, I. Herrera and co-workers, using 'non-overlapping discretizations', have produced the DVS-Software which overcomes this limitation [2]. The DVS-Software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (90%, or so [3]). It is therefore very suitable for effectively applying the most advanced parallel supercomputers available at present. In a parallel talk in this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MODFLOW. Key Words: Parallel Software for Geophysics, High Performance Computing, HPC, Parallel Computing, Domain Decomposition Methods (DDM). REFERENCES [1]. Herrera, Ismael and George F. Pinder, "Mathematical Modelling in Science and Engineering: An axiomatic approach", John Wiley, 243p., 2012. [2]. Herrera, I., de la Cruz, L.M. and Rosas-Medina, A., "Non-Overlapping Discretization Methods for Partial Differential Equations", NUMER METH PART D E, 30: 1427-1454, 2014, DOI 10.1002/num.21852. (Open source) [3]. Herrera, I., & Contreras, Iván, "An Innovative Tool for Effectively Applying Highly Parallelized Software To Problems of Elasticity", Geofísica Internacional, 2015 (In press)
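
    The DVS discretizations themselves are beyond the scope of an abstract; to give a generic flavor of domain decomposition, the sketch below runs the classic overlapping alternating Schwarz iteration for a 1D Poisson problem, a simpler relative of the non-overlapping methods discussed; all details are invented for illustration.

        # Classic overlapping alternating Schwarz iteration for u'' = f on [0, 1]
        # (a generic DDM illustration, not the DVS algorithms; details invented).
        import numpy as np

        n = 101
        x = np.linspace(0.0, 1.0, n)
        h = x[1] - x[0]
        f = -np.ones(n)                          # u'' = -1  ->  exact u = x(1 - x)/2

        def solve_dirichlet(lo, hi, u):
            """Solve u'' = f on x[lo..hi] with the current u[lo], u[hi] as boundary data."""
            m = hi - lo - 1                      # number of interior unknowns
            A = (np.diag(-2.0 * np.ones(m)) +
                 np.diag(np.ones(m - 1), 1) +
                 np.diag(np.ones(m - 1), -1)) / h**2
            b = f[lo + 1:hi].copy()
            b[0]  -= u[lo] / h**2                # fold boundary values into the RHS
            b[-1] -= u[hi] / h**2
            u[lo + 1:hi] = np.linalg.solve(A, b)

        u = np.zeros(n)
        for _ in range(30):                      # alternating Schwarz sweeps
            solve_dirichlet(0, 60, u)            # subdomain [0.0, 0.6]
            solve_dirichlet(40, n - 1, u)        # subdomain [0.4, 1.0] (overlap 0.4-0.6)

        print("max error vs exact:", np.abs(u - x * (1 - x) / 2).max())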

  6. A software tool of digital tomosynthesis application for patient positioning in radiotherapy.

    Science.gov (United States)

    Yan, Hui; Dai, Jian-Rong

    2016-03-08

    Digital tomosynthesis (DTS) is an imaging modality reconstructing tomographic images from two-dimensional kV projections covering a narrow scan angle. Compared with conventional cone-beam CT (CBCT), it requires less time and radiation dose in data acquisition, so it is feasible to apply this technique to patient positioning in radiotherapy. To facilitate its clinical application, a software tool was developed and the reconstruction processes were accelerated by a graphics processing unit (GPU). Two reconstruction and two registration processes are required for the DTS application, which differs from the conventional CBCT application requiring one image reconstruction process and one image registration process. The reconstruction stage consists of the production of two types of DTS. One type of DTS is reconstructed from cone-beam (CB) projections covering a narrow scan angle and is named onboard DTS (ODTS), which represents the real patient position in the treatment room. The other type of DTS is reconstructed from digitally reconstructed radiographs (DRR) and is named reference DTS (RDTS), which represents the ideal patient position in the treatment room. Prior to the reconstruction of RDTS, the DRRs are reconstructed from the planning CT using the same acquisition settings as the CB projections. The registration stage consists of two matching processes between ODTS and RDTS. The target shifts in the lateral and longitudinal axes are obtained from the matching between ODTS and RDTS in the coronal view, while the target shifts in the longitudinal and vertical axes are obtained from the matching between ODTS and RDTS in the sagittal view. In this software, both the DRR and DTS reconstruction algorithms were implemented on GPU environments for acceleration purposes. A comprehensive evaluation of this software tool was performed, including geometric accuracy, image quality, registration accuracy, and reconstruction efficiency. The average correlation coefficient between DRR/DTS generated by GPU-based algorithm
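
    The paper's matching method is not specified beyond view-wise 2D registration; one common way to estimate such in-plane shifts is phase correlation, sketched below with scikit-image on synthetic stand-ins for RDTS/ODTS slices.

        # Estimating an in-plane shift between two slices via phase correlation
        # (a generic registration approach; synthetic data stands in for RDTS/ODTS).
        import numpy as np
        from scipy.ndimage import shift as nd_shift
        from skimage.registration import phase_cross_correlation

        rng = np.random.default_rng(1)
        reference = rng.random((128, 128))             # e.g. an RDTS coronal slice
        moving = nd_shift(reference, (3.0, -2.0))      # e.g. ODTS, offset by a couch error

        detected_shift, error, _ = phase_cross_correlation(reference, moving)
        print("shift to register moving onto reference:", detected_shift)  # approx. (-3, 2)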

  7. Evaluating a digital ship design tool prototype: Designers' perceptions of novel ergonomics software.

    Science.gov (United States)

    Mallam, Steven C; Lundh, Monica; MacKinnon, Scott N

    2017-03-01

    Computer-aided solutions are essential for naval architects to manage and optimize technical complexities when developing a ship's design. Although there is an array of software solutions aimed at optimizing the human element in design, practical ergonomics methodologies and technological solutions have struggled to gain widespread application in ship design processes. This paper explores how a new ergonomics technology is perceived by naval architecture students using a mixed-methods framework. Thirteen Naval Architecture and Ocean Engineering Masters students participated in the study. Overall, results showed that participants perceived the software and its embedded ergonomics tools to benefit their design work, increasing their empathy and ability to understand the work environment and work demands end-users face. However, participants questioned whether ergonomics could be practically and efficiently implemented under real-world project constraints. This revealed underlying social biases and a fundamental lack of understanding in engineering postgraduate students regarding applied ergonomics in naval architecture.

  8. A software tool for teaching and training how to build and use a TOWS matrix

    Directory of Open Access Journals (Sweden)

    Amparo Mariño Ibáñez

    2010-05-01

    Full Text Available Strategic planning is currently used by most companies; it analyses current and expected future situations, determines company orientation and develops means or strategies for achieving stated missions. This article is aimed at reviewing general considerations in strategic planning and presenting a computational tool designed for building a TOWS matrix, matching a company's opportunities and threats with its weaknesses and, especially, its strengths. The software development life cycle (SDLC) involved analysis, design, implementation and use. The literature about strategic planning and SWOT analysis was reviewed for making the analysis. The software only automates one aspect of the whole strategic planning process and can be used for improving student and staff training in SWOT analysis. This type of work seeks to motivate interdisciplinary research.

  9. Web-based software tool for constraint-based design specification of synthetic biological systems.

    Science.gov (United States)

    Oberortner, Ernst; Densmore, Douglas

    2015-06-19

    miniEugene provides computational support for solving combinatorial design problems, enabling users to specify and enumerate designs for novel biological systems based on sets of biological constraints. This technical note presents a brief tutorial for biologists and software engineers in the field of synthetic biology on how to use miniEugene. After reading this technical note, users should know which biological constraints are available in miniEugene, understand the syntax and semantics of these constraints, and be able to follow a step-by-step guide to specify the design of a classical synthetic biological system, the genetic toggle switch [1]. We also provide links and references to more information on the miniEugene web application and the integration of the miniEugene software library into sophisticated Computer-Aided Design (CAD) tools for synthetic biology (www.eugenecad.org).
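
    miniEugene's rule language is not reproduced here; the sketch below merely illustrates constraint-based design enumeration in plain Python, with invented part names and two example rules (the promoter comes first; the RBS immediately precedes the gene).

        # Constraint-based enumeration of genetic designs (plain-Python illustration,
        # not miniEugene's rule language; part names and rules are invented).
        from itertools import permutations

        parts = ["pTet", "rbs1", "gfp", "terminator"]

        def satisfies(design):
            # rule 1: the promoter must be the first part
            # rule 2: the ribosome binding site must immediately precede the gene
            return (design[0] == "pTet"
                    and design.index("rbs1") + 1 == design.index("gfp"))

        valid = [d for d in permutations(parts) if satisfies(d)]
        for design in valid:
            print(" -> ".join(design))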

  10. Techniques and tools for measuring energy efficiency of scientific software applications

    CERN Document Server

    Abdurachmanov, David; Eulisse, Giulio; Knight, Robert; Niemi, Tapio; Nurminen, Jukka K.; Nyback, Filip; Pestana, Goncalo; Ou, Zhonghong; Khan, Kashif

    2014-01-01

    The scale of scientific High Performance Computing (HPC) and High Throughput Computing (HTC) has increased significantly in recent years, and is becoming sensitive to total energy use and cost. Energy-efficiency has thus become an important concern in scientific fields such as High Energy Physics (HEP). There has been a growing interest in utilizing alternate architectures, such as low power ARM processors, to replace traditional Intel x86 architectures. Nevertheless, even though such solutions have been successfully used in mobile applications with low I/O and memory demands, it is unclear if they are suitable and more energy-efficient in the scientific computing environment. Furthermore, there is a lack of tools and experience to derive and compare power consumption between the architectures for various workloads, and eventually to support software optimizations for energy efficiency. To that end, we have performed several physical and software-based measurements of workloads from HEP applications running o...
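
    One concrete way to take such software-based power measurements on Linux with Intel hardware is the RAPL counter exposed under /sys/class/powercap; a minimal sketch, assuming a RAPL-capable system and read permission (counter wrap-around is ignored for brevity).

        # Measuring energy of a workload via Linux's Intel RAPL powercap interface
        # (one software-based approach; requires a RAPL-capable CPU and read access).
        import time

        RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"   # package-0 energy counter

        def read_uj():
            with open(RAPL) as f:
                return int(f.read())

        start = read_uj()
        t0 = time.time()
        sum(i * i for i in range(10_000_000))   # stand-in for an HEP workload
        joules = (read_uj() - start) / 1e6      # counter is in microjoules
        print(f"{joules:.2f} J in {time.time() - t0:.2f} s")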

  11. Software tools for manipulating fe mesh, virtual surgery and post-processing

    Directory of Open Access Journals (Sweden)

    Milašinović Danko Z.

    2009-01-01

    Full Text Available This paper describes a set of software tools that we developed for the calculation of fluid flow through cardiovascular organs. Our tools work with medical data from a CT scanner, but could be used with any other 3D input data. For meshing we used the TetGen tetrahedral mesh generator, as well as a mesh re-generator that we developed for conversion of tetrahedral elements into bricks. After adequate meshing we used our PAKF solver for the calculation of fluid flow. For human-friendly presentation of results we developed a set of post-processing software tools. With modification of the 2D mesh (the boundary of the cardiovascular organ) it is possible to do virtual surgery, so in the case of an aorta with an aneurysm, which we had received from the University Clinical Center in Heidelberg from a multi-slice 64-CT scanner, we removed the aneurysm and ran calculations on both geometrical models afterwards. The main idea of this methodology is to create a system that could be used in clinics.

  12. TScratch: a novel and simple software tool for automated analysis of monolayer wound healing assays.

    Science.gov (United States)

    Gebäck, Tobias; Schulz, Martin Michael Peter; Koumoutsakos, Petros; Detmar, Michael

    2009-04-01

    Cell migration plays a major role in development, physiology, and disease, and is frequently evaluated in vitro by the monolayer wound healing assay. The assay analysis, however, is a time-consuming task that is often performed manually. In order to accelerate this analysis, we have developed TScratch, a new, freely available image analysis technique and associated software tool that uses the fast discrete curvelet transform to automate the measurement of the area occupied by cells in the images. This tool helps to significantly reduce the time needed for analysis and enables objective and reproducible quantification of assays. The software also offers a graphical user interface which allows easy inspection of analysis results and, if desired, manual modification of analysis parameters. The automated analysis was validated by comparing its results with manual-analysis results for a range of different cell lines. The comparisons demonstrate a close agreement for the vast majority of images that were examined and indicate that the present computational tool can reproduce statistically significant results in experiments with well-known cell migration inhibitors and enhancers.
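
    TScratch's curvelet-based detection is considerably more elaborate, but the quantity it reports, the cell-free fraction of the image, can be approximated with a simple Otsu threshold; the sketch below uses a synthetic image as a stand-in.

        # Estimating the cell-free (wound) area fraction by simple thresholding
        # (a crude stand-in for TScratch's curvelet-based detection).
        import numpy as np
        from skimage.filters import threshold_otsu

        rng = np.random.default_rng(2)
        image = rng.normal(0.7, 0.1, (256, 256))                # "cells": bright texture
        image[:, 100:160] = rng.normal(0.3, 0.05, (256, 60))    # "scratch": dark band

        mask = image < threshold_otsu(image)     # pixels classified as cell-free
        open_fraction = mask.mean()
        print(f"open wound area: {100 * open_fraction:.1f}% of the image")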

  13. Analyst Tools and Quality Control Software for the ARM Data System

    Energy Technology Data Exchange (ETDEWEB)

    Moore, Sean; Hughes, Gary

    2008-07-31

    Mission Research develops analyst tools and automated quality control software in order to assist the Atmospheric Radiation Measurement (ARM) Data Quality Office with their data inspection tasks. We have developed web-based data analysis and visualization tools such as the interactive plotting program NCVweb, various diagnostic plot browsers, and a datastream processing status application. These tools allow even novice ARM researchers to be productive with ARM data with only minimal effort. We also contribute to the ARM Data Quality Office by analyzing ARM data streams, developing new quality control metrics, new diagnostic plots, and integrating this information into DQ HandS - the Data Quality Health and Status web-based explorer. We have developed several ways to detect outliers in ARM data streams and have written software to run in an automated fashion to flag these outliers. We have also embarked on a system to comprehensively generate long time-series plots, frequency distributions, and other relevant statistics for scientific and engineering data in most high-level, publicly available ARM data streams. Furthermore, frequency distributions categorized by month or by season are made available to help define valid data ranges specific to those time domains. These statistics can be used to set limits that when checked, will improve upon the reporting of suspicious data and the early detection of instrument malfunction. The statistics and proposed limits are stored in a database for easy reporting, refining, and for use by other processes. Web-based applications to view the results are also available.
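
    The abstract does not say which outlier tests are used; one common automated check is a rolling z-score, sketched below on a synthetic data stream (not an actual ARM datastream).

        # Automated outlier flagging with a rolling z-score (a generic check;
        # the ARM Data Quality Office's actual tests are not described here).
        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(3)
        series = pd.Series(rng.normal(20.0, 1.0, 1000))   # synthetic temperature stream
        series.iloc[500] = 35.0                           # injected instrument glitch

        rolling = series.rolling(window=60, center=True)
        zscore = (series - rolling.mean()) / rolling.std()
        flags = zscore.abs() > 5
        print("flagged samples:", list(series.index[flags]))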

  14. Fuzzy cognitive map software tool for treatment management of uncomplicated urinary tract infection.

    Science.gov (United States)

    Papageorgiou, Elpiniki I

    2012-03-01

    Uncomplicated urinary tract infection (uUTI) is a bacterial infection that affects individuals with urinary tracts that are normal from both structural and functional perspectives. Suggesting the appropriate antibiotics and treatment for individuals suffering from uUTI is an important and complex task that demands special attention. How to decrease the unsafe use of antibiotics and their consumption is an important issue in medical treatment. Aiming to model medical decision making for uUTI treatment, an innovative and flexible approach called fuzzy cognitive maps (FCMs) is proposed to cope with uncertainty and missing information. The FCM is a promising technique for modeling knowledge and/or medical guidelines/treatment suggestions and reasoning with them. A software tool, namely FCM-uUTI DSS, is investigated in this work to produce a decision support module for uUTI treatment management. The software tool was tested (evaluated) on 38 patient cases, showing its functionality and demonstrating that the use of FCMs as dynamic models is reliable and sound. The results show that the suggested FCM-uUTI tool gives a front-end decision on antibiotic suggestions for uUTI treatment, which is considered a helpful reference for physicians and patients. Due to its easy graphical representation and simulation process, the proposed FCM formalization could be used to make medical knowledge widely available through computer consultation systems.
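
    The standard FCM inference step, on which such decision modules are built, updates each concept from the weighted activations of its neighbours through a squashing function; a generic sketch with an invented three-concept weight matrix (not the paper's uUTI knowledge base):

        # Generic fuzzy cognitive map inference (invented 3-concept weight matrix;
        # not the paper's uUTI knowledge base).
        import numpy as np

        def sigmoid(x):
            return 1.0 / (1.0 + np.exp(-x))

        # weights[i, j]: causal influence of concept i on concept j
        weights = np.array([[0.0, 0.6, -0.4],
                            [0.0, 0.0,  0.7],
                            [0.3, 0.0,  0.0]])
        state = np.array([0.8, 0.2, 0.5])     # initial concept activations

        for _ in range(20):                    # iterate until (near) steady state
            state = sigmoid(state + state @ weights)
        print("steady-state activations:", state.round(3))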

  15. A software tool for advanced MRgFUS prostate therapy planning and follow up

    Science.gov (United States)

    van Straaten, Dörte; Hoogenboom, Martijn; van Amerongen, Martinus J.; Weiler, Florian; Issawi, Jumana Al; Günther, Matthias; Fütterer, Jurgen; Jenne, Jürgen W.

    2017-03-01

    US-guided HIFU/FUS ablation for the therapy of prostate cancer is a clinically established method, while MR-guided HIFU/FUS applications for the prostate have recently started clinical evaluation. Even though MRI examination is an excellent diagnostic tool for prostate cancer, it is a time-consuming procedure and not practicable within an MRgFUS therapy session. The aim of our ongoing work is to develop software to support therapy planning and post-therapy follow-up for MRgFUS on localized prostate cancer, based on multi-parametric MR protocols. The clinical workflow of diagnosis, therapy and follow-up of MR-guided FUS on prostate cancer was analyzed in depth. Based on this, the image processing workflow was designed and all necessary components, e.g. GUI, viewer, registration tools etc., were defined and implemented. The software is based on MeVisLab with several C++ modules implemented for the image processing tasks. The developed software, called LTC (Local Therapy Control), will automatically register and visualize all images (T1w, T2w, DWI etc.) and ADC or perfusion maps gained from the diagnostic MRI session. This maximum of diagnostic information helps to segment all necessary ROIs, e.g. the tumor, for therapy planning. Final therapy planning will be performed based on these segmentation data in the following MRgFUS therapy session. In addition, the developed software should help to evaluate the therapy success by synchronization and display of pre-therapeutic, therapy and follow-up image data, including the therapy plan and thermal dose information. In this ongoing project, the first stand-alone prototype was completed and will be clinically evaluated.

  16. New tools for digital medical image processing implemented in DIP software

    Energy Technology Data Exchange (ETDEWEB)

    Araujo, Erica A.C.; Santana, Ivan E. [Instituto Federal de Educacao, Ciencia e Tecnologia de Pernambuco, Recife, PE (Brazil); Lima, Fernando R.A., E-mail: falima@cnen.gov.b [Centro Regional de Ciencias Nucleares, (CRCN/NE-CNEN-PE), Recife, PE (Brazil); Viera, Jose W. [Escola Politecnica de Pernambuco, Recife, PE (Brazil)

    2011-07-01

    The anthropomorphic models used in computational dosimetry, also called phantoms, are mostly built from stacks of CT (Computed Tomography) or MRI (Magnetic Resonance Imaging) images obtained from scans of patients or volunteers. The construction of voxel phantoms requires computational processing for transforming image formats, stacking two-dimensional (2D) images to form three-dimensional (3D) arrays, quantization, resampling, enhancement, restoration and image segmentation, among others. The computational dosimetry researcher rarely finds all these capabilities in a single software package, and often this results in slowed research or inadequate use of alternative tools. The need to integrate the various tasks of digital image processing to obtain an image that can be used in a computational exposure model led to the development of the DIP (Digital Image Processing) software. This software reads, writes and edits binary files containing the 3D matrix corresponding to a stack of cross-sectional images of a given geometry, which can be a human body or another volume of interest. It can also read any common type of computer image and perform conversions. When a task involves only one output image, it is saved in the standard Windows JPEG format. When it involves a stack of images, the binary output file is called SGI (Interactive Graphic Simulations, a symbol already used in other publications of the Research Group in Numerical Dosimetry). This paper presents the third version of the DIP software and emphasizes its newly implemented tools. Currently it has the menus Basics, Views, Spatial Domain, Frequency Domain, Segmentations and Study. Each menu contains items and subitems with features that generally require an image as input and produce an image or an attribute as output. (author)
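
    The SGI container format itself is not documented in the abstract; generically, a voxel phantom stored as a raw binary 3D matrix can be round-tripped with numpy as follows (dimensions, dtype and file name are invented).

        # Round-tripping a voxel phantom stored as a raw binary 3D matrix
        # (generic numpy I/O; dimensions, dtype and file name are invented).
        import numpy as np

        depth, rows, cols = 128, 256, 256
        phantom = np.zeros((depth, rows, cols), dtype=np.uint8)
        phantom[40:90, 100:160, 100:160] = 1          # mark an organ region

        phantom.tofile("phantom.raw")                 # headerless binary dump
        restored = np.fromfile("phantom.raw", dtype=np.uint8).reshape(depth, rows, cols)
        assert (restored == phantom).all()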

  17. DSC: software tool for simulation-based design of control strategies applied to wastewater treatment plants.

    Science.gov (United States)

    Ruano, M V; Ribes, J; Seco, A; Ferrer, J

    2011-01-01

    This paper presents a computer tool called DSC (Simulation based Controllers Design) that enables an easy design of control systems and strategies applied to wastewater treatment plants (WWTPs). Although the control systems are developed and evaluated by simulation, this tool aims to facilitate the direct implementation of the designed control system on the PC of the full-scale WWTP. The designed control system can be programmed in a dedicated control application and can be connected to either the simulation software or the SCADA of the plant. To this end, the developed DSC incorporates an OPC server (OLE for Process Control) which provides an open-standard communication protocol for different industrial process applications. The potential capabilities of the DSC tool are illustrated through the example of a full-scale application. An aeration control system applied to a nutrient-removing WWTP was designed, tuned and evaluated with the DSC tool before its implementation in the full-scale plant. The control parameters obtained by simulation were suitable for the full-scale plant with only a few modifications to improve the control performance. With the DSC tool, the performance of control systems can be easily evaluated by simulation. Once developed and tuned by simulation, the control systems can be directly applied to the full-scale WWTP.
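
    Aeration control of the kind evaluated with DSC is often a PI loop holding dissolved oxygen at a setpoint by adjusting the airflow; a toy discrete-time sketch follows, with an invented plant model and controller gains.

        # Toy discrete-time PI control of dissolved oxygen via airflow
        # (illustrative only; plant model and controller gains are invented).
        setpoint = 2.0            # mg/L dissolved oxygen target
        do, airflow = 0.5, 0.0    # initial DO and air supply
        kp, ki, integral = 2.0, 0.5, 0.0
        dt = 0.1                  # hours

        for step in range(200):
            error = setpoint - do
            integral += error * dt
            airflow = max(0.0, kp * error + ki * integral)
            # crude plant: oxygen transfer proportional to airflow, respiration sink
            do += dt * (0.8 * airflow * (8.0 - do) - 1.5 * do)
        print(f"final DO: {do:.2f} mg/L, airflow: {airflow:.2f}")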

  18. EVALUATION METRICS FOR WIRELESS SENSOR NETWORK SECURITY: ALGORITHMS REVIEW AND SOFTWARE TOOL

    Directory of Open Access Journals (Sweden)

    Qasem Abu Al-Haija

    2013-01-01

    Full Text Available Wireless Sensor Networks (WSNs) are currently receiving significant attention due to their potential impact on several real-life applications such as military and home automation technology. The work in this study is a complementary part of previously discussed work. In this study, we propose a software tool to simulate and evaluate six evaluation metrics for non-deterministic wireless sensor networks, namely: scalability, key connectivity, memory complexity, communication complexity, power consumption and confidentiality. The evaluation metrics were simulated and evaluated to help the network designer choose the best probabilistic security key management algorithm for a given randomly distributed sensor network.
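
    Key connectivity, one of the listed metrics, has a closed form for the classic Eschenauer-Gligor random key predistribution scheme: the probability that two nodes share at least one key when each draws a ring of k keys from a pool of P. A quick sketch:

        # Key connectivity for Eschenauer-Gligor random key predistribution:
        # P(two nodes share >= 1 key) with key rings of size k from a pool of P keys.
        from math import comb

        def share_probability(pool: int, ring: int) -> float:
            # the second ring avoids all of the first ring's keys with
            # probability C(P - k, k) / C(P, k)
            return 1.0 - comb(pool - ring, ring) / comb(pool, ring)

        for pool, ring in [(10000, 75), (10000, 150), (1000, 75)]:
            print(f"pool={pool}, ring={ring}: p={share_probability(pool, ring):.3f}")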

  19. Software Tool for Automated Failure Modes and Effects Analysis (FMEA) of Hydraulic Systems

    DEFF Research Database (Denmark)

    Stecki, J. S.; Conrad, Finn; Oh, B.

    2002-01-01

    Offshore, marine, aircraft and other complex engineering systems operate in harsh environmental and operational conditions and must meet stringent requirements of reliability, safety and maintainability. To reduce the high costs of development of new systems in these fields, improved design...... management techniques and a vast array of computer-aided techniques are applied during the design and testing stages. The paper presents and discusses the research and development of a software tool for automated failure modes and effects analysis - FMEA - of hydraulic systems. The paper explains the underlying
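
    Automated FMEA tools typically rank failure modes by a risk priority number (RPN), the product of severity, occurrence and detection scores; a minimal sketch with invented hydraulic-system entries:

        # Risk priority number (RPN) ranking, the usual core of automated FMEA
        # (failure modes and 1-10 scores below are invented examples).
        failure_modes = [
            # (description, severity, occurrence, detection)
            ("pump seal leak",        7, 4, 3),
            ("valve stuck closed",    9, 2, 5),
            ("hose burst",           10, 2, 2),
        ]

        for name, sev, occ, det in sorted(failure_modes,
                                          key=lambda m: m[1] * m[2] * m[3],
                                          reverse=True):
            print(f"{name:20s} RPN = {sev * occ * det}")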

  20. Computation of Internal Fluid Flows in Channels Using the CFD Software Tool FlowVision

    CERN Document Server

    Kochevsky, A N

    2004-01-01

    The article describes the CFD software tool FlowVision (OOO "Tesis", Moscow). The model equations used for this research are the set of Reynolds and continuity equations and the equations of the standard k-ε turbulence model. The aim of the paper was to test FlowVision by comparing the computational results for a number of simple internal channel fluid flows with known experimental data. The test cases are non-swirling and swirling flows in pipes and diffusers, and flows in stationary and rotating bends. Satisfactory correspondence of results was obtained both for flow patterns and the respective quantitative values.

  1. A Tale of Two Cultures: Cross Cultural Comparison in Learning the Prezi Presentation Software Tool in the US and Norway

    Science.gov (United States)

    Brock, Sabra; Brodahl, Cornelia

    2013-01-01

    Presentation software is an important tool for both student and professorial communicators. PowerPoint has been the standard since it was introduced in 1990. However, new "improved" software platforms are emerging. Prezi is one of these, claiming to remedy the linear thinking that underlies PowerPoint by creating one canvas and…

  2. Virtual Power Electronics: Novel Software Tools for Design, Modeling and Education

    Science.gov (United States)

    Hamar, Janos; Nagy, István; Funato, Hirohito; Ogasawara, Satoshi; Dranga, Octavian; Nishida, Yasuyuki

    The current paper is dedicated to presenting browser-based, multimedia-rich software tools and an e-learning curriculum to support the design and modeling process of power electronics circuits and to explain sometimes rather sophisticated phenomena. Two projects will be discussed. The so-called Inetele project is financed by the Leonardo da Vinci program of the European Union (EU). It is a collaborative project between numerous EU universities and institutes to develop a state-of-the-art curriculum in Electrical Engineering. Another cooperative project, with the participation of Japanese, European and Australian institutes, focuses especially on developing e-learning curriculum and interactive design and modeling tools, and furthermore on the development of a virtual laboratory. Snapshots from these two projects will be presented.

  3. Verification of visual odometry algorithms with an OpenGL-based software tool

    Science.gov (United States)

    Skulimowski, Piotr; Strumillo, Pawel

    2015-05-01

    We present a software tool called a stereovision egomotion sequence generator that was developed for testing visual odometry (VO) algorithms. Various approaches to single- and multicamera VO algorithms are reviewed first, and then a reference VO algorithm that has served to demonstrate the program's features is described. The program offers simple tools for defining virtual static three-dimensional scenes and arbitrary six-degrees-of-freedom motion paths within such scenes, and outputs sequences of stereovision images, disparity ground-truth maps, and segmented scene images. A simple script language is proposed that simplifies tests of VO algorithms for user-defined scenarios. The program's capabilities are demonstrated by testing a reference VO technique that employs stereoscopy and feature tracking.

  4. NEuronMOrphological analysis tool: open-source software for quantitative morphometrics

    Directory of Open Access Journals (Sweden)

    Lucia Billeci

    2013-02-01

    Full Text Available Morphometric analysis of neurons and brain tissue is relevant to the study of neuron circuitry development during the first phases of brain growth or for probing the link between microstructural morphology and degenerative diseases. As neural imaging techniques become ever more sophisticated, so do the amount and complexity of the data generated. The NEuronMOrphological analysis tool NEMO was purposely developed to handle and process large numbers of optical microscopy image files of neurons in culture or slices, in order to automatically run batch routines, store data and apply multivariate classification and feature extraction using 3-way principal component analysis. Here we describe the software's main features, underlining the differences between NEMO and other commercial and non-commercial image processing tools, and show an example of how NEMO can be used to classify neurons from wild-type mice and from animal models of autism.

  5. Easyverifier 1.0: a software tool for revising scientific articles’ bibliographical citations

    Directory of Open Access Journals (Sweden)

    Freddy Alberto Correa Riveros

    2010-05-01

    Full Text Available The first academic revolution, which occurred in developed countries during the late 19th century, made research a university function in addition to the traditional task of teaching. A second academic revolution has tried to transform the university into a teaching, research and socio-economic development enterprise. The scientific article has become an excellent practical means for the movement of new knowledge between the university and the socio-economic environment. This work had two purposes. One was to present some general considerations regarding research and the scientific article. The second was to provide information about a computational tool that supports the revision of scientific articles' citations; this step is usually done manually and requires some experience. The software reads two text files, one containing the scientific article's content and the other the bibliography. A report is then generated identifying the authors mentioned in the text but not indexed in the bibliography, and determining which authors have been mentioned in the bibliography but not in the text of the article. The software allows researchers and journal coordinators to detect reference errors between the citations in the text and the bibliographical references. The steps to develop the software were: analysis, design, implementation and use. For the analysis, a review of the literature on the preparation of citations in scientific documents was important.
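
    The core cross-check such a tool performs can be sketched in a few lines, assuming author-year citations of the form (Author, 2009) and a references file with one surname-led entry per line; the file names and citation pattern are assumptions.

        # Cross-checking in-text author-year citations against a reference list
        # (a simplified sketch; file names and the (Author, year) pattern are assumed).
        import re

        body = open("article.txt", encoding="utf-8").read()
        refs = open("bibliography.txt", encoding="utf-8").read().splitlines()

        cited = {m.group(1) for m in re.finditer(r"\(([A-Z][a-z]+),\s*\d{4}\)", body)}
        listed = {line.split(",")[0].strip() for line in refs if line.strip()}

        print("cited but missing from bibliography:", sorted(cited - listed))
        print("listed but never cited in the text:", sorted(listed - cited))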

  6. MyView2, a new visualization software tool for analysis of LHD data

    Energy Technology Data Exchange (ETDEWEB)

    Moon, Chanho, E-mail: moon@nifs.ac.jp; Yoshinuma, Mikirou; Emoto, Masahiko; Ida, Katsumi

    2016-03-15

    The Large Helical Device (LHD) at the National Institute for Fusion Science (NIFS) is the world's largest superconducting helical fusion device, providing a scientific research center to elucidate important physics topics such as plasma transport and turbulence dynamics. Furthermore, many types of advanced diagnostic devices are used to measure the confinement plasma characteristics, and these valuable physical data are registered over the 131,000 discharges in the LHD database. However, it is difficult to investigate the experimental data even though much physical data has been registered. In order to improve the efficiency of investigating plasma physics in LHD, we have developed a new data visualization software, MyView2, which consists of Python-based modules that can be easily set up and updated. MyView2 provides immediate access to experimental results, cross-shot analysis, and a collaboration point for scientific research. In particular, the MyView2 software has a portable structure that makes LHD experimental data viewable on on- and off-site web servers, a capability not previously available in any general-use tool. We will also discuss the benefits of using the MyView2 software for in-depth analysis of LHD experimental data.

  7. ConfocalCheck--a software tool for the automated monitoring of confocal microscope performance.

    Directory of Open Access Journals (Sweden)

    Keng Imm Hng

    Full Text Available Laser scanning confocal microscopy has become an invaluable tool in biomedical research, but regular quality testing is vital to maintain the system's performance for diagnostic and research purposes. Although many methods have been devised over the years to characterise specific aspects of a confocal microscope, like measuring the optical point spread function or the field illumination, only very few analysis tools are available. Our aim was to develop a comprehensive quality assurance framework ranging from image acquisition to automated analysis and documentation. We created standardised test data to assess the performance of the lasers, the objective lenses and other key components required for optimum confocal operation. The ConfocalCheck software presented here analyses the data fully automatically. It creates numerous visual outputs indicating potential issues requiring further investigation. By storing results in a web browser compatible file format, the software greatly simplifies record keeping, allowing the operator to quickly compare old and new data and to spot developing trends. We demonstrate that the systematic monitoring of confocal performance is essential in a core facility environment and show how the quantitative measurements obtained can be used for the detailed characterisation of system components as well as for comparisons across multiple instruments.

  8. Software tools for quantification of X-ray microtomography at the UGCT

    Energy Technology Data Exchange (ETDEWEB)

    Vlassenbroeck, J. [Department of Subatomic and Radiation Physics, Ghent University, Proeftuinstraat 86, B-9000 Gent (Belgium)], E-mail: jelle.vlassenbroeck@ugent.be; Dierick, M.; Masschaele, B. [Department of Subatomic and Radiation Physics, Ghent University, Proeftuinstraat 86, B-9000 Gent (Belgium); Cnudde, V. [Department of Geology and Soil Science, Ghent University, Krijgslaan 281/S8, B-9000 Gent (Belgium); Van Hoorebeke, L. [Department of Subatomic and Radiation Physics, Ghent University, Proeftuinstraat 86, B-9000 Gent (Belgium); Jacobs, P. [Department of Geology and Soil Science, Ghent University, Krijgslaan 281/S8, B-9000 Gent (Belgium)

    2007-09-21

    The technique of X-ray microtomography using X-ray tube radiation offers an interesting tool for the non-destructive investigation of a wide range of materials. A major challenge lies in the analysis and quantification of the resulting data, allowing for a full characterization of the sample under investigation. In this paper, we discuss the software tools for reconstruction and analysis of tomographic data that are being developed at the UGCT. The tomographic reconstruction is performed using Octopus, a high-performance and user-friendly software package. The reconstruction process transforms the raw acquisition data into a stack of 2D cross-sections through the sample, resulting in a 3D data set. A number of artifact and noise reduction algorithms are integrated to reduce ring artifacts, beam hardening artifacts, COR misalignment, detector or stage tilt, pixel non-linearities, etc. These corrections are very important to facilitate the analysis of the 3D data. The analysis of the 3D data focuses primarily on the characterization of pore structures, but will be extended to other applications. A first package for the analysis of pore structures in three dimensions was developed under Matlab. A new package, called Morpho+, is being developed in a C++ environment, with optimizations and extensions of the previously used algorithms. The current status of this project will be discussed. Examples of pore analysis can be found in pharmaceuticals, material science, geology and numerous other fields.

  9. Proofreading Using an Assistive Software Homophone Tool: Compensatory and Remedial Effects on the Literacy Skills of Students with Reading Difficulties

    Science.gov (United States)

    Lange, Alissa A.; Mulhern, Gerry; Wylie, Judith

    2009-01-01

    The present study investigated the effects of using an assistive software homophone tool on the assisted proofreading performance and unassisted basic skills of secondary-level students with reading difficulties. Students aged 13 to 15 years proofread passages for homophonic errors under three conditions: with the homophone tool, with homophones…

  10. Application of the PredictAD Software Tool to Predict Progression in Patients with Mild Cognitive Impairment

    DEFF Research Database (Denmark)

    Simonsen, Anja H; Mattila, Jussi; Hejl, Anne-Mette

    2012-01-01

    Background: The PredictAD tool integrates heterogeneous data such as imaging, cerebrospinal fluid biomarkers and results from neuropsychological tests for compact visualization in an interactive user interface. This study investigated whether the software tool could assist physicians in the early … [text missing] … of incremental data presentation using the software tool. A 5th phase was done with all available patient data presented on paper charts. Classifications by the clinical raters were compared to the clinical diagnoses made by the Alzheimer's Disease Neuroimaging Initiative investigators. Results: A statistical …

  11. Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model.

    Science.gov (United States)

    Damij, Nadja; Boškoski, Pavle; Bohanec, Marko; Mileva Boshkoska, Biljana

    2016-01-01

    The omnipresent need for optimisation requires constant improvement of companies' business processes (BPs). Minimising the risk of implementing an inappropriate BP is usually done by simulating the newly developed BP under various initial conditions and "what-if" scenarios. An effective business process simulation software (BPSS) tool is a prerequisite for accurate analysis of a BP. Characterising a BPSS tool is a challenging task due to the complex selection criteria, which include the quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for ranking BPSS tools by their technical characteristics, employing DEX and qualitative-to-quantitative (QQ) methodology. Consequently, the decision expert feeds in the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible for adding new criteria to the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with currently available results.

  12. Quality-driven multi-objective optimization of software architecture design : method, tool, and application

    NARCIS (Netherlands)

    Etemadi Idgahi (Etemaadi), Ramin

    2014-01-01

    Software architecting is a non-trivial and demanding task for software engineers to perform. The architecture is a key enabler for software systems. Besides being crucial for user functionality, the software architecture has a deep impact on software qualities such as performance, safety, and cost.

  13. BioBrick assembly standards and techniques and associated software tools.

    Science.gov (United States)

    Røkke, Gunvor; Korvald, Eirin; Pahr, Jarle; Oyås, Ove; Lale, Rahmi

    2014-01-01

    The BioBrick idea was developed to introduce the engineering principles of abstraction and standardization into synthetic biology. BioBricks are DNA sequences that serve a defined biological function and can be readily assembled with any other BioBrick parts to create new BioBricks with novel properties. To achieve this, several assembly standards can be used. Which assembly standards a BioBrick is compatible with depends on the prefix and suffix sequences surrounding the part. In this chapter, five of the most common assembly standards are described, along with some of the most used assembly techniques and cloning procedures, and a presentation of the available software tools that can be used for deciding on the best method for assembling different BioBricks and for searching for BioBrick parts in the Registry of Standard Biological Parts database.
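
    Compatibility checking of the kind these software tools perform can be mechanised easily. As a small, hypothetical sketch (not one of the Registry's own tools): under the original BioBrick standard RFC[10], a part's internal sequence must avoid the standard's restriction sites, which can be scanned for directly.

```python
# Restriction sites that must not occur inside an RFC[10]-compatible part.
RFC10_SITES = {
    "EcoRI": "GAATTC",
    "XbaI":  "TCTAGA",
    "SpeI":  "ACTAGT",
    "PstI":  "CTGCAG",
    "NotI":  "GCGGCCGC",
}

def rfc10_conflicts(sequence):
    """Return {enzyme: [positions]} for every forbidden site found."""
    seq = sequence.upper()
    hits = {}
    for enzyme, site in RFC10_SITES.items():
        positions = [i for i in range(len(seq) - len(site) + 1)
                     if seq[i:i + len(site)] == site]
        if positions:
            hits[enzyme] = positions
    return hits

print(rfc10_conflicts("ATGGAATTCAAA"))   # -> {'EcoRI': [3]}
```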

  14. A TAXONOMY FOR TOOLS, PROCESSES AND LANGUAGES IN AUTOMOTIVE SOFTWARE ENGINEERING

    Directory of Open Access Journals (Sweden)

    Florian Bock

    2016-01-01

    Full Text Available Within the growing domain of software engineering in the automotive sector, the number of used tools, processes, methods and languages has increased distinctly in the past years. To be able to choose proper methods for particular development use cases, factors like the intended use, key-features and possible limitations have to be evaluated. This requires a taxonomy that aids the decision making. An analysis of the main existing taxonomies revealed two major deficiencies: the lack of the automotive focus and the limitation to particular engineering method types. To face this, a graphical taxonomy is proposed based on two well-established engineering approaches and enriched with additional classification information. It provides a self-evident and -explanatory overview and comparison technique for engineering methods in the automotive domain. The taxonomy is applied to common automotive engineering methods. The resulting diagram classifies each method and enables the reader to select appropriate solutions for given project requirements.

  15. A software tool to evaluate crystal types and morphological developments of accessory zircon

    Science.gov (United States)

    Sturm, Robert

    2014-08-01

    Computer programs for the appropriate visualization of crystal types and morphological developments of accessory zircon have hitherto been unavailable; typological computations are usually conducted with simple calculation tools or spreadsheet programs. In practice, however, large numbers of data sets containing information on numerous zircon populations have to be processed and stored. This paper describes the software ZIRCTYP, a macro-driven program within the Microsoft Access database management system. It allows the computation of zircon morphologies occurring in specific rock samples and their presentation in typology diagrams. In addition, morphological developments within a given zircon population are presented (1) statistically and (2) graphically, as crystal sequences showing initial, intermediate, and final growth stages.

  16. Development and validation of evolutionary algorithm software as an optimization tool for biological and environmental applications.

    Science.gov (United States)

    Sys, K; Boon, N; Verstraete, W

    2004-06-01

    A flexible, extendable tool for the optimization of (micro)biological processes and protocols using evolutionary algorithms was developed. It was tested on three theoretical optimization problems: two two-dimensional problems, one with three maxima and one with five, and a river autopurification optimization problem with boundary conditions. For each problem, different evolutionary parameter settings were used for the optimization, and for each combination of evolutionary parameters, 15 generations were run 20 times. In all cases the evolutionary algorithm gave rise to valuable results. Generally, the algorithms were able to detect the more stable sub-maximum even when less stable maxima existed; from a practical point of view, this is usually the more desirable outcome. The most important factors influencing the convergence process were the parameter-value randomization rate and distribution. The software described in this work is available for free.
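
    The abstract does not give the implementation itself; as a minimal sketch of the kind of evolutionary loop evaluated above (a small population, 15 generations, and the randomization rate exposed as the key tuning parameter), one might write:

```python
import random

def evolve(fitness, bounds, pop_size=20, generations=15,
           mutation_rate=0.2, seed=None):
    """Minimal real-valued evolutionary algorithm (illustrative only)."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]              # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            child = rng.choice(parents)
            if rng.random() < mutation_rate:       # randomization step
                child += rng.gauss(0.0, 0.1 * (hi - lo))
                child = min(hi, max(lo, child))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# One-dimensional stand-in for the multi-modal test problems:
best = evolve(lambda x: -(x - 2.0) ** 2 + 3.0, bounds=(-10.0, 10.0), seed=1)
```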

  17. Introduction of software tools for epidemiological surveillance in infection control in Colombia

    Science.gov (United States)

    Motoa, Gabriel; Vallejo, Marta; Blanco, Víctor M; Correa, Adriana; de la Cadena, Elsa; Villegas, María Virginia

    2015-01-01

    Introduction: Healthcare-Associated Infections (HAI) are a challenge for patient safety in hospitals, and infection control committees (ICC) should follow CDC definitions when monitoring HAI. Manual methods of epidemiological surveillance (ES) may affect the sensitivity and specificity of the monitoring system, whereas electronic surveillance can improve the performance, quality and traceability of recorded information. Objective: To assess the implementation of a strategy for electronic surveillance of HAI, bacterial resistance and antimicrobial consumption by the ICC of 23 high-complexity clinics and hospitals in Colombia during the period 2012-2013. Methods: An observational study evaluating the introduction of electronic tools in the ICC was performed; we evaluated the structure and operation of the ICC, the degree of incorporation of the HAI Solutions software and the adherence to recording the required information. Results: Thirty-eight percent of hospitals (8/23) had active surveillance strategies with standard CDC criteria, and 87% of institutions adhered to the case-identification module of the HAI Solutions software. In contrast, compliance in recording the risk factors for device-associated HAIs was 33%. Conclusions: The introduction of ES could achieve greater adherence to an active, standardized and prospective surveillance model, helping to improve the validity and quality of the recorded information. PMID:26309340

  18. A Software Tool for Optimal Sizing of PV Systems in Malaysia

    Directory of Open Access Journals (Sweden)

    Tamer Khatib

    2012-01-01

    Full Text Available This paper presents a MATLAB-based, user-friendly software tool called PV.MY for the optimal sizing of photovoltaic (PV) systems. The software can predict meteorological variables such as solar energy, ambient temperature and wind speed using an artificial neural network (ANN), optimizes the PV module/array tilt angle, optimizes the inverter size, and calculates optimal capacities of the PV array, battery, wind turbine and diesel generator in hybrid PV systems. The ANN-based model for meteorological prediction uses four variables, namely sunshine ratio, day number and location coordinates. As for PV system sizing, iterative methods are used to determine the optimal sizing of three types of PV system: standalone PV, hybrid PV/wind, and hybrid PV/diesel generator. The loss of load probability (LLP) technique is used for optimization, in which the energy source capacities are the variables to be optimized subject to a very low LLP. As for determining the optimal PV panel tilt angle and inverter size, the Liu and Jordan model for solar energy incident on a tilted surface is used to optimize the monthly tilt angle, while a model of the inverter efficiency curve is used in the optimization of inverter size.
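
    A minimal sketch of the iterative LLP-based sizing idea follows; it is not the PV.MY code, and the daily energy balance, efficiency figure and step size are illustrative assumptions.

```python
def size_pv_array(daily_solar_kwh_m2, daily_load_kwh, target_llp=0.01,
                  efficiency=0.15, battery_kwh=10.0, step_m2=0.5,
                  max_area_m2=500.0):
    """Grow the PV array until the loss of load probability (unmet
    energy / total demand) falls below the target, or a cap is hit."""
    area = step_m2
    while area <= max_area_m2:
        soc, deficit = battery_kwh, 0.0
        for solar in daily_solar_kwh_m2:           # one value per day
            balance = soc + solar * area * efficiency - daily_load_kwh
            if balance < 0.0:
                deficit += -balance                # unmet demand that day
                soc = 0.0
            else:
                soc = min(battery_kwh, balance)
        llp = deficit / (daily_load_kwh * len(daily_solar_kwh_m2))
        if llp <= target_llp:
            return area, llp
        area += step_m2
    return area, llp

# Example: a year of synthetic daily irradiation values (kWh/m^2/day).
area, llp = size_pv_array([5.0, 4.2, 5.5, 3.8] * 90, daily_load_kwh=6.0)
```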

  19. RadNotes: a novel software development tool for radiology education.

    Science.gov (United States)

    Baxter, A B; Klein, J S; Oesterle, E V

    1997-01-01

    RadNotes is a novel software development tool that enables physicians to develop teaching materials incorporating text and images in an intelligent, highly usable format. Projects undertaken in the RadNotes environment require neither programming expertise nor the assistance of a software engineer. The first of these projects, Thoracic Imaging, integrates image teaching files, concise disease and topic summaries, references, and flash card quizzes into a single program designed to provide an overview of chest radiology. RadNotes is intended to support the academic goals of teaching radiologists by enabling authors to create, edit, and electronically distribute image-oriented presentations. RadNotes also supports the educational goals of physicians who wish to quickly review selected imaging topics, as well as to develop a visual vocabulary of corresponding radiologic anatomy and pathologic conditions. Although Thoracic Imaging was developed with the aim of introducing chest radiology to residents, RadNotes can be used to develop tutorials and image-based tests for all levels; create corresponding World Wide Web sites; and organize notes, images, and references for individual use.

  20. Cloud-Based SimJavaWeb Software Tool to Learn Simulation

    Directory of Open Access Journals (Sweden)

    A. Yu. Bykov

    2017-01-01

    Full Text Available Currently there is a trend in simulation towards distributed software tools, particularly ones that use cloud technologies and the Internet. This article considers an educational simulation tool implemented as a web application in the Java language, with a special Java class library developed for simulation. It is focused on a discrete-event approach to modeling, similar to the GPSS language, and is intended for queuing-system simulation. The structure of the models obtained using this class library is similar to that of GPSS models. An interpreter for a simulation language similar to GPSS, with some differences in individual statements, has also been created using the class library. Simulation experiments are performed on the server side; on the client side, a browser with standard functions is used to enter the source code into an HTML form, so mobile devices can be used as clients. The source code of a model can be written either in Java using the class library or in the GPSS-like language. The simulation system implements functions specifically for the educational process: for example, a student can upload learning materials to the server, send developed software and test-control reports to the teacher via the Internet, and receive a detailed assessment of their results from the teacher; detailed results of tests passed in learning modules are also recorded, among other functions. As examples, the article considers models of the M/M/n/0 queuing system written in Java with the class library and in the GPSS-like language, shows simulation results, and presents the analytical model and calculations for this system. Analytical calculations confirmed that the modeling system is sound, as its simulation results agree with them to within acceptable error. Some approaches to interaction with students through the Internet, used in the modeling environment, can
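
    The analytical check mentioned for the M/M/n/0 system can be reproduced with the Erlang B formula, which gives the blocking probability of a loss system directly. The sketch below uses the standard numerically stable recursion (written in Python here for illustration, although the tool itself is Java-based):

```python
def erlang_b(servers, offered_load):
    """Blocking probability of an M/M/n/0 loss system (Erlang B),
    via the recursion B(k) = a*B(k-1) / (k + a*B(k-1)), B(0) = 1."""
    b = 1.0
    for k in range(1, servers + 1):
        b = offered_load * b / (k + offered_load * b)
    return b

# Offered load a = arrival rate / service rate; e.g. a = 2 Erlang, n = 3:
print(erlang_b(3, 2.0))   # ~0.2105: about 21% of arrivals are lost
```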

  1. Techniques and software tools for estimating ultrasonic signal-to-noise ratios

    Science.gov (United States)

    Chiou, Chien-Ping; Margetan, Frank J.; McKillip, Matthew; Engle, Brady J.; Roberts, Ronald A.

    2016-02-01

    At Iowa State University's Center for Nondestructive Evaluation (ISU CNDE), the use of models to simulate ultrasonic inspections has played a key role in R&D efforts for over 30 years. To this end a series of wave propagation models, flaw response models, and microstructural backscatter models have been developed to address inspection problems of interest. One use of the combined models is the estimation of signal-to-noise ratios (S/N) in circumstances where backscatter from the microstructure (grain noise) acts to mask sonic echoes from internal defects. Such S/N models have been used in the past to address questions of inspection optimization and reliability. Under the sponsorship of the National Science Foundation's Industry/University Cooperative Research Center at ISU, an effort was recently initiated to improve existing research-grade software by adding a graphical user interface (GUI), turning it into a user-friendly tool for the rapid estimation of S/N for ultrasonic inspections of metals. The software combines: (1) a Python-based GUI for specifying an inspection scenario and displaying results; and (2) a Fortran-based engine for computing defect signal and backscattered grain noise characteristics. The latter makes use of several models, including the Multi-Gaussian Beam Model for computing sonic fields radiated by commercial transducers; the Thompson-Gray Model for the response from an internal defect; the Independent Scatterer Model for backscattered grain noise; and the Stanke-Kino Unified Model for attenuation. The initial emphasis was on reformulating the research-grade code into a suitable modular form, adding the graphical user interface and performing computations rapidly and robustly. Thus the initial inspection problem being addressed is relatively simple: a normal-incidence pulse/echo immersion inspection is simulated for a curved metal component having a non-uniform microstructure, specifically an equiaxed, untextured microstructure in which the average

  2. Software development for dynamic position emission tomography: Dynamic image analysis (DIA) tool

    Energy Technology Data Exchange (ETDEWEB)

    Pyeon, Do Yeong; Jung, Young Jin [Dongseo University, Busan (Korea, Republic of); Kim, Jung Su [Dept. of Radilogical Science, Dongnam Health University, Suwon (Korea, Republic of)

    2016-09-15

    Positron Emission Tomography (PET) is a nuclear medicine examination in which compounds labeled with a radioactive isotope are injected into the body to quantitatively measure metabolic rates. In particular, the increased glucose metabolism of cancer tissue, imaged with 18F-FDG (fluorodeoxyglucose), is widely exploited in cancer diagnosis, and numerous studies have reported high utility in the modern diagnosis of brain diseases such as dementia and Parkinson's disease. Using dynamic PET images, which add time information to the static information normally provided for diagnosis, can increase diagnostic accuracy. For this reason dynamic PET attracts great attention from clinical researchers, but research has been hindered by the lack of suitable tools and by the complex mathematical algorithms and programming skills required. In this study, to make dynamic PET (dPET) research easier and more accessible, we developed software based on a graphical user interface (GUI). We expect the use of DIA-Tool by many clinical researchers to be of great help to dPET research in the future.

  3. MilQuant: a free, generic software tool for isobaric tagging-based quantitation.

    Science.gov (United States)

    Zou, Xiao; Zhao, Minzhi; Shen, Hongyan; Zhao, Xuyang; Tong, Yuanpeng; Wang, Qingsong; Wei, Shicheng; Ji, Jianguo

    2012-09-18

    Isobaric tagging techniques such as iTRAQ and TMT are widely used in quantitative proteomics and are especially useful for samples that demand in vitro labeling. Due to the diversity of MS acquisition approaches, identification algorithms, and relative-abundance deduction strategies, researchers are faced with a plethora of possibilities when it comes to data analysis. However, the lack of a generic and flexible software tool often makes it cumbersome for researchers to perform the analysis entirely as desired. In this paper, we present MilQuant, an mzXML-based isobaric labeling quantitator: a pipeline of freely available programs that supports native acquisition files produced by all mass spectrometer types and collection approaches currently used in isobaric-tagging-based MS data collection. Moreover, aside from effective normalization and abundance-ratio deduction algorithms, MilQuant exports various intermediate results along each step of the pipeline, making it easy for researchers to customize the analysis. The functionality of MilQuant was demonstrated on four distinct datasets from different laboratories. The compatibility and extendibility of MilQuant make it a generic and flexible tool that can serve as a full solution to data analysis of isobaric-tagging-based quantitation.

  4. PentaPlot: A software tool for the illustration of genome mosaicism

    Directory of Open Access Journals (Sweden)

    Zhaxybayeva Olga

    2005-06-01

    Full Text Available Abstract Background Dekapentagonal maps depict the phylogenetic relationships of five genomes in a visually appealing diagram and can be viewed as an alternative to a single evolutionary consensus tree. In particular, the generated maps focus attention on those gene families that significantly deviate from the consensus or plurality phylogeny. PentaPlot is a software tool that computes such dekapentagonal maps given an appropriate probability support matrix. Results The visualization with dekapentagonal maps critically depends on the optimal layout of unrooted tree topologies representing different evolutionary relationships among five organisms along the vertices of the dekapentagon. This is a difficult optimization problem given the large number of possible layouts. At its core our tool utilizes a genetic algorithm with demes and a local search strategy to search for the optimal layout. The hybrid genetic algorithm performs satisfactorily even in those cases where the chosen genomes are so divergent that little phylogenetic information has survived in the individual gene families. Conclusion PentaPlot is being made publicly available as an open source project at http://pentaplot.sourceforge.net.

  5. Streamlining the Design-to-Build Transition with Build-Optimization Software Tools.

    Science.gov (United States)

    Oberortner, Ernst; Cheng, Jan-Fang; Hillson, Nathan J; Deutsch, Samuel

    2017-03-17

    Scaling up capabilities for the design, build, and test of synthetic biology constructs holds great promise for the development of new applications in fuels, chemical production, or cellular-behavior engineering. Construct design is an essential component of this process; however, not every designed DNA sequence can be readily manufactured, even using state-of-the-art DNA synthesis methods. Current biological computer-aided design and manufacture tools (bioCAD/CAM) do not adequately consider the limitations of DNA synthesis technologies when generating their outputs. Designed sequences that violate DNA synthesis constraints may require substantial sequence redesign or lead to price premiums and temporal delays, which adversely impact the efficiency of the DNA manufacturing process. We have developed a suite of build-optimization software tools (BOOST) to streamline the design-build transition in synthetic biology engineering workflows. BOOST incorporates knowledge of DNA synthesis success determinants into the design process to output ready-to-build sequences, preempting the need for sequence redesign. The BOOST web application is available at https://boost.jgi.doe.gov and its Application Program Interfaces (APIs) enable integration into automated, customized DNA design processes. The results presented here highlight the effectiveness of BOOST in reducing DNA synthesis costs and timelines.
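
    BOOST's actual rule set is not reproduced in the abstract; as a hedged sketch of the kind of synthesis-constraint screening described, one can flag windowed GC content outside vendor limits and overly long homopolymer runs (thresholds below are illustrative):

```python
def synthesis_issues(seq, gc_lo=0.25, gc_hi=0.75, window=100,
                     max_homopolymer=8):
    """Flag two common DNA synthesis constraints (illustrative only)."""
    seq = seq.upper()
    issues = []
    # Windowed GC content outside the allowed band.
    for start in range(max(1, len(seq) - window + 1)):
        w = seq[start:start + window]
        gc = (w.count("G") + w.count("C")) / len(w)
        if not gc_lo <= gc <= gc_hi:
            issues.append(f"GC fraction {gc:.2f} outside limits at {start}")
            break                         # one report is enough for a sketch
    # Homopolymer runs longer than the synthesis limit.
    run, prev = 0, ""
    for i, base in enumerate(seq):
        run = run + 1 if base == prev else 1
        prev = base
        if run == max_homopolymer + 1:
            issues.append(f"homopolymer run of {base} ending at {i}")
    return issues

print(synthesis_issues("AT" * 30 + "A" * 12))   # flags GC and poly-A run
```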

  6. GraphCrunch 2: Software tool for network modeling, alignment and clustering

    Directory of Open Access Journals (Sweden)

    Hayes Wayne

    2011-01-01

    Full Text Available Abstract Background Recent advancements in experimental biotechnology have produced large amounts of protein-protein interaction (PPI) data. The topology of PPI networks is believed to have a strong link to their function. Hence, the abundance of PPI data for many organisms stimulates the development of computational techniques for the modeling, comparison, alignment, and clustering of networks. In addition, finding representative models for PPI networks will improve our understanding of the cell, just as a model of gravity has helped us understand planetary motion. To decide if a model is representative, we need quantitative comparisons of model networks to real ones. However, exact network comparison is computationally intractable, and therefore several heuristics are used instead. Some of these heuristics are easily computable "network properties," such as the degree distribution or the clustering coefficient. An important special case of network comparison is the network alignment problem: analogous to sequence alignment, it asks for the "best" mapping between regions in two networks. It is expected that network alignment might have as strong an impact on our understanding of biology as sequence alignment has had. Topology-based clustering of nodes in PPI networks is another important network analysis problem that can uncover relationships between interaction patterns and phenotype. Results We introduce the GraphCrunch 2 software tool, which addresses these problems. It is a significant extension of GraphCrunch; it implements the most popular random network models and compares them with data networks with respect to many network properties. GraphCrunch 2 also implements the GRAph ALigner ("GRAAL") algorithm for purely topological network alignment. GRAAL can align any pair of networks and exposes large, dense, contiguous regions of topological and functional similarities far larger than any other
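
    The "easily computable network properties" heuristic can be illustrated in a few lines with the networkx library; this stand-in sketch (not GraphCrunch 2 itself) compares a data network's clustering coefficient against Erdos-Renyi random graphs of matching size and density:

```python
import networkx as nx

def compare_to_er_model(data_graph, trials=10, seed=0):
    """Average clustering of the data network vs. matched ER graphs."""
    n = data_graph.number_of_nodes()
    m = data_graph.number_of_edges()
    p = 2.0 * m / (n * (n - 1))              # edge density of the data
    data_cc = nx.average_clustering(data_graph)
    model_cc = sum(
        nx.average_clustering(nx.erdos_renyi_graph(n, p, seed=seed + t))
        for t in range(trials)) / trials
    return data_cc, model_cc

g = nx.karate_club_graph()                   # stand-in for a PPI network
print(compare_to_er_model(g))                # data clustering >> ER model
```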

  7. TIDE TOOL: Open-Source Sea-Level Monitoring Software for Tsunami Warning Systems

    Science.gov (United States)

    Weinstein, S. A.; Kong, L. S.; Becker, N. C.; Wang, D.

    2012-12-01

    A tsunami warning center (TWC) typically decides to issue a tsunami warning bulletin when initial estimates of earthquake source parameters suggest the earthquake may be capable of generating a tsunami. A TWC, however, relies on sea-level data to provide prima facie evidence for the existence or non-existence of destructive tsunami waves and to constrain tsunami wave-height forecast models. In the aftermath of the 2004 Sumatra disaster, the International Tsunami Information Center asked the Pacific Tsunami Warning Center (PTWC) to develop a platform-independent, easy-to-use software package to give nascent TWCs the ability to process WMO Global Telecommunications System (GTS) sea-level messages and to analyze the resulting sea-level curves (marigrams). In response PTWC developed TIDE TOOL, which has since steadily grown in sophistication to become PTWC's operational sea-level processing system. TIDE TOOL has two main parts: a decoder that reads GTS sea-level message logs, and a graphical user interface (GUI) written in the open-source, platform-independent graphical toolkit scripting language Tcl/Tk. This GUI consists of dynamic map-based clients that allow the user to select and analyze a single station or groups of stations by displaying their marigrams in strip-chart or screen-tiled form. TIDE TOOL also includes detail maps of each station to show the station's geographical context, and reverse tsunami travel-time contours to each station. TIDE TOOL can also be coupled to the GEOWARE™ TTT program to plot tsunami travel times and to indicate the expected tsunami arrival time on the marigrams. Because sea-level messages are structured in a rich variety of formats, TIDE TOOL includes a metadata file, COMP_META, that contains all of the information needed by TIDE TOOL to decode sea-level data as well as basic information such as the geographical coordinates of each station. TIDE TOOL can therefore continuously decode these sea-level messages in real time and display the time

  8. Should we have blind faith in bioinformatics software? Illustrations from the SNAP web-based tool.

    Directory of Open Access Journals (Sweden)

    Sébastien Robiou-du-Pont

    Full Text Available Bioinformatics tools have gained popularity in biology, but little is known about their validity. We aimed to assess the early contribution of 415 single nucleotide polymorphisms (SNPs) associated with eight cardio-metabolic traits at the genome-wide significance level in adults in the Family Atherosclerosis Monitoring In earLY Life (FAMILY) birth cohort. We used the popular web-based tool SNAP to assess the availability of the 415 SNPs on the Illumina Cardio-Metabochip genotyped in the FAMILY study participants. We then compared the SNAP output with the Cardio-Metabochip file provided by Illumina, using the chromosome and chromosomal positions of SNPs from the NCBI Human Genome Browser (Genome Reference Consortium Human Build 37). With the HapMap 3 release 2 reference, 201 of the 415 SNPs were reported as missing from the Cardio-Metabochip by the SNAP output. However, the Cardio-Metabochip file revealed that 152 of these 201 SNPs were in fact present on the Cardio-Metabochip array (a false-negative rate of 36.6%). With the more recent 1000 Genomes Project release, we found a false-negative rate of 17.6% when comparing the outputs of SNAP and the Illumina product file. We did not find any 'false positive' SNPs (SNPs specified as available on the Cardio-Metabochip by SNAP but not by the Illumina file). The Cohen's kappa coefficient, which measures the agreement between the two methods, indicated that the validity of SNAP was fair to moderate depending on the reference used (HapMap 3 or 1000 Genomes). In conclusion, we demonstrate that the SNAP outputs for the Cardio-Metabochip are invalid. This study illustrates the importance of systematically assessing the validity of bioinformatics tools in an independent manner. We propose a series of guidelines to improve practices in the fast-moving field of bioinformatics software implementation.
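
    Cohen's kappa, used above to grade the agreement between SNAP and the Illumina product file, corrects raw agreement for chance. A minimal sketch for two binary raters (1 = "SNP present on the chip"; the toy data below are invented for illustration):

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two binary raters."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal "present" frequency.
    pa = sum(rater_a) / n
    pb = sum(rater_b) / n
    expected = pa * pb + (1 - pa) * (1 - pb)
    return (observed - expected) / (1 - expected)

snap     = [1, 0, 0, 1, 0, 1]   # toy availability calls from SNAP
illumina = [1, 1, 0, 1, 1, 1]   # toy calls from the product file
print(cohens_kappa(snap, illumina))   # 0.33: fair agreement
```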

  9. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    Science.gov (United States)

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

    A user-friendly, menu-driven simulation software tool has been developed, for the first time, to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data, which helps in optimization, and presents its results visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model developed for a systematically integrated forward osmosis-nanofiltration process for removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values were observed to corroborate well with extensive experimental investigations and were consistent under varying operating conditions such as operating pressure, operating flow rate, and draw solute concentration. A low relative error (RE = 0.09) and a high Willmott d-index (d_will = 0.981) reflect a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater.
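
    The two accuracy figures quoted (RE = 0.09 and d_will = 0.981) can be computed as follows; the abstract does not spell out the exact RE formula, so a common mean-absolute-relative-error form is assumed here:

```python
import numpy as np

def willmott_d(observed, predicted):
    """Willmott index of agreement d (1.0 = perfect agreement)."""
    o = np.asarray(observed, float)
    p = np.asarray(predicted, float)
    o_bar = o.mean()
    return 1.0 - np.sum((p - o) ** 2) / np.sum(
        (np.abs(p - o_bar) + np.abs(o - o_bar)) ** 2)

def relative_error(observed, predicted):
    """Mean absolute relative error (assumed form of the paper's RE)."""
    o = np.asarray(observed, float)
    p = np.asarray(predicted, float)
    return float(np.mean(np.abs(p - o) / np.abs(o)))

obs = [10.2, 11.0, 9.8, 10.5]        # e.g. measured permeate flux (toy data)
pred = [10.0, 11.3, 9.9, 10.4]       # model-predicted values (toy data)
print(relative_error(obs, pred), willmott_d(obs, pred))
```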

  10. Diva software, a tool for European regional seas and Ocean climatologies production

    Science.gov (United States)

    Ouberdous, M.; Troupin, C.; Barth, A.; Alvera-Azcàrate, A.; Beckers, J.-M.

    2012-04-01

    Diva (Data-Interpolating Variational Analysis) is software based on a method designed to perform data-gridding (or analysis) tasks, with the asset of taking into account the intrinsic nature of oceanographic data, i.e., the uncertainty of in situ measurements and the anisotropy due to advection, irregular coastlines and topography. The Variational Inverse Method (VIM, Brasseur et al., 1996) implemented in Diva consists of minimizing a variational principle which accounts for the differences between the observations and the reconstructed field, and for the gradients and variability of the reconstructed field. The numerical problem is solved with a finite-element method, which allows great numerical efficiency and the handling of complicated contours. Along with the analysis, Diva also provides error fields (Brankart and Brasseur, 1998; Rixen et al., 2000) based on the data coverage and noise. Diva is used for the production of climatologies in the pan-European network SeaDataNet. SeaDataNet connects the existing marine data centres of more than 30 countries and has set up a data-management infrastructure consisting of a standardized distributed system. The consortium has elaborated integrated products, using common procedures and methods, and uses the Diva software as the reference tool for computing climatologies for various European regional seas, the Atlantic and the global ocean. During the first phase of the SeaDataNet project, a number of additional tools were developed to make climatology production easier for users. Among these tools: the advection constraint during field reconstruction, through the specification of a velocity field on a regular grid, forcing the analysis to align with the velocity vectors; Generalized Cross Validation for the determination of analysis parameters (signal-to-noise ratio); the creation of contours at selected depths; the detection of possible outliers; the

  11. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    Directory of Open Access Journals (Sweden)

    Kühne Titus

    2010-07-01

    Full Text Available Abstract Background In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now no universal software program has been available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. Results After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse-sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. Conclusions MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet.
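
    Pixel-wise mapping of this kind reduces to fitting a relaxation model at every pixel. A minimal sketch for the mono-exponential T2/T2* case, S(TE) = S0 * exp(-TE/T2), using SciPy (MRmap itself is written in a high-level graphics language, so this only illustrates the underlying fit):

```python
import numpy as np
from scipy.optimize import curve_fit

def fit_t2(echo_times_ms, signal):
    """Fit S(TE) = S0 * exp(-TE / T2) and return T2 in milliseconds."""
    model = lambda te, s0, t2: s0 * np.exp(-te / t2)
    (s0, t2), _ = curve_fit(model, echo_times_ms, signal,
                            p0=(float(signal[0]), 50.0))
    return t2

te = np.array([10.0, 20.0, 40.0, 80.0])      # echo times in ms
sig = 1000.0 * np.exp(-te / 45.0)            # synthetic multi-echo decay
print(fit_t2(te, sig))                        # ~45.0
```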

  12. Development and evaluation of an open source software tool for deidentification of pathology reports

    Directory of Open Access Journals (Sweden)

    Mahaadevan Rajeshwarri

    2006-03-01

    Full Text Available Abstract Background Electronic medical records, including pathology reports, are often used for research purposes. Currently, few programs are freely available to remove identifiers while leaving the remainder of the pathology report text intact. Our goal was to produce an open-source, Health Insurance Portability and Accountability Act (HIPAA)-compliant deidentification tool tailored for pathology reports. We designed a three-step process for removing potential identifiers. The first step looks for identifiers known to be associated with the patient, such as name, medical record number, pathology accession number, etc. Next, a series of pattern matches look for predictable patterns likely to represent identifying data, such as dates, accession numbers and addresses, as well as patient, institution and physician names. Finally, individual words are compared with a database of proper names and geographic locations. Pathology reports from three institutions were used to design and test the algorithms. The software was improved iteratively on training sets until it exhibited good performance. 1800 new pathology reports were then processed; each report was reviewed manually before and after deidentification to catalog all identifiers and note those that were not removed. Results 1254 (69.7%) of 1800 pathology reports contained identifiers in the body of the report. 3439 (98.3%) of 3499 unique identifiers in the test set were removed. Only 19 HIPAA-specified identifiers (mainly consult accession numbers and misspelled names) were missed. Of 41 non-HIPAA identifiers missed, the majority were partial institutional addresses and ages. Outside consultation case reports typically contain numerous identifiers and were the most challenging to deidentify comprehensively. There was variation in performance among reports from the three institutions, highlighting the need for site-specific customization, which is easily accomplished with our tool.
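
    The second step of the three-step process, pattern matching for predictable identifier formats, might look like the hedged sketch below; the patterns are illustrative stand-ins, and the published tool's rules and its name/location dictionaries are far more extensive.

```python
import re

# Illustrative patterns for predictable identifier formats (step 2).
PATTERNS = [
    (re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"), "[DATE]"),
    (re.compile(r"\b[A-Z]{1,3}-?\d{2}-\d{3,6}\b"), "[ACCESSION]"),
    (re.compile(r"\bMRN[:#]?\s*\d+\b", re.IGNORECASE), "[MRN]"),
    (re.compile(r"\bDr\.\s+[A-Z][a-z]+\b"), "[PHYSICIAN]"),
]

def scrub(report):
    """Replace pattern-matched identifiers with category tokens."""
    for pattern, token in PATTERNS:
        report = pattern.sub(token, report)
    return report

print(scrub("Seen by Dr. Smith on 3/14/2005, MRN: 445566, case S-05-1234."))
# -> Seen by [PHYSICIAN] on [DATE], [MRN], case [ACCESSION].
```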

  13. Data-Driven Decision Making as a Tool to Improve Software Development Productivity

    Science.gov (United States)

    Brown, Mary Erin

    2013-01-01

    The worldwide software project failure rate, based on a survey of information technology software manager's view of user satisfaction, product quality, and staff productivity, is estimated to be between 24% and 36% and software project success has not kept pace with the advances in hardware. The problem addressed by this study was the limited…

  14. TRANSIT--A Software Tool for Himar1 TnSeq Analysis.

    Directory of Open Access Journals (Sweden)

    Michael A DeJesus

    2015-10-01

    Full Text Available TnSeq has become a popular technique for determining the essentiality of genomic regions in bacterial organisms. Several methods have been developed to analyze the wealth of data obtained through TnSeq experiments. We developed a tool for analyzing Himar1 TnSeq data called TRANSIT. TRANSIT provides a graphical interface to three different statistical methods for analyzing TnSeq data. These methods cover a variety of approaches capable of identifying essential genes in individual datasets as well as comparative analysis between conditions. We demonstrate the utility of this software by analyzing TnSeq datasets of M. tuberculosis grown on glycerol and cholesterol, and show that TRANSIT can be used to discover genes previously implicated in growth on these carbon sources. TRANSIT is written in Python and thus can be run on Windows, OSX and Linux platforms. The source code is distributed under the GNU GPL v3 license and can be obtained from the following GitHub repository: https://github.com/mad-lab/transit.

  15. UNBizPlanner: a software tool for preparing a business plan

    Directory of Open Access Journals (Sweden)

    Oscar Ávila Cifuentes

    2010-04-01

    Full Text Available Universities are currently expected to play a new role in society (in addition to research and teaching) by engaging in a third mission concerning socio-economic development. Universities also play an important role in encouraging entrepreneurs by training them in business planning. A business plan is a document summarising how an entrepreneur will create an organisation to exploit a business opportunity. Preparing a business plan draws on a wide range of knowledge from many business disciplines (e.g. finance, human resource management, intellectual property management, supply chain management, operations management and marketing). This article presents a computational tool for drawing up a business plan from a Colombian viewpoint, identifying the most relevant stages borne in mind by the national entities with the most experience in creating and consolidating companies. Special emphasis was placed on analysing, designing and implementing a systems development life cycle for developing the software. Reviewing the literature concerning business plans formed an important part of the analysis stage (bearing the Colombian viewpoint in mind).

  16. Prognostic 2.0: software tool for heart rate variability analysis and QT interval dispersion

    Science.gov (United States)

    Mendoza, Alfonso; Rueda, Oscar L.; Bautista, Lola X.; Martinez, Víctor E.; Lopez, Eddie R.; Gomez, Mario F.; Alvarez, Alexander

    2007-09-01

    Cardiovascular diseases, in particular Acute Myocardial Infarction (AMI), are the leading cause of death in industrialized countries. Measurements of indicators of autonomic nervous system behavior, such as Heart Rate Variability (HRV) and QT Interval Dispersion (QTD), in the acute phase of the AMI (first 48 hours after the event) give a good estimation of the subsequent cardiac events that a person who has suffered an AMI may present. This paper describes the implementation of the second version of Prognostic-AMI, a software tool that automates the calculation of such indicators. It uses the Discrete Wavelet Transform (DWT) to de-noise the signals and to detect the QRS complex and the T-wave from a conventional 12-lead electrocardiogram. Indicators are measured in both the time and frequency domains. A pilot trial performed on a sample population of 76 patients shows that people who had cardiac complications in the acute phase of the AMI have low values for the HRV and QTD indicators.
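
    Once the QRS complexes have been detected, the classic time-domain HRV indicators are simple functions of the RR-interval series. A minimal sketch using the standard definitions (not the Prognostic 2.0 source code):

```python
import numpy as np

def hrv_time_domain(rr_ms):
    """SDNN and RMSSD from consecutive RR intervals in milliseconds.

    SDNN reflects overall variability; RMSSD reflects short-term,
    beat-to-beat variability. Low values after an AMI are associated
    with worse outcomes.
    """
    rr = np.asarray(rr_ms, float)
    sdnn = rr.std(ddof=1)
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))
    return sdnn, rmssd

print(hrv_time_domain([812, 790, 845, 830, 801]))   # toy RR series
```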

  17. Software Tool for Analysis of Breathing-Related Errors in Transthoracic Electrical Bioimpedance Spectroscopy Measurements

    Science.gov (United States)

    Abtahi, F.; Gyllensten, I. C.; Lindecrantz, K.; Seoane, F.

    2012-12-01

    During the last decades, Electrical Bioimpedance Spectroscopy (EBIS) has been applied in a range of different applications, mainly using the frequency-sweep technique. Traditionally the tissue under study is considered to be time-invariant, and dynamic changes of tissue activity are ignored and instead treated as a noise source. This assumption has not been adequately tested and could have a negative impact on, and limit the accuracy of, impedance monitoring systems. In order to successfully use frequency-sweep EBIS for monitoring time-variant systems, it is paramount to study the effect of the frequency-sweep delay on Cole-model-based analysis. In this work, we present a software tool that can be used to simulate the influence of respiratory activity on frequency-sweep EBIS measurements of the human thorax and to analyse the effects of the different error sources. Preliminary results indicate that the deviation in the EBIS measurement might be significant at any frequency, especially in the impedance plane. Therefore the impact on Cole-model analysis might differ depending on the method applied for Cole parameter estimation.
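
    A small simulation makes the sweep-delay problem concrete: if the Cole impedance model Z(f) = R_inf + (R0 - R_inf)/(1 + (jf/fc)^alpha) is sampled while R0 is modulated by respiration, each frequency in the sweep sees a slightly different tissue state. The sketch below is illustrative only; the parameter values are not taken from the paper.

```python
import numpy as np

def cole_impedance(freq_hz, r0, r_inf, fc, alpha):
    """Cole model: Z(f) = R_inf + (R0 - R_inf) / (1 + (j*f/fc)**alpha)."""
    jf = 1j * freq_hz / fc
    return r_inf + (r0 - r_inf) / (1.0 + jf ** alpha)

freqs = np.logspace(3, 6, 50)                 # 1 kHz .. 1 MHz sweep
t = np.linspace(0.0, 0.5, freqs.size)         # 0.5 s sweep duration
r0_t = 500.0 * (1 + 0.02 * np.sin(2 * np.pi * 0.3 * t))  # breathing ~0.3 Hz
z_swept = cole_impedance(freqs, r0_t, 200.0, 50e3, 0.8)   # time-variant R0
z_ideal = cole_impedance(freqs, 500.0, 200.0, 50e3, 0.8)  # instantaneous
deviation = np.abs(z_swept - z_ideal)         # delay-induced error per freq
```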

  18. Development of a Kinect Software Tool to Classify Movements during Active Video Gaming.

    Science.gov (United States)

    Rosenberg, Michael; Thornton, Ashleigh L; Lay, Brendan S; Ward, Brodie; Nathan, David; Hunt, Daniel; Braham, Rebecca

    2016-01-01

    While it has been established that using full-body motion to play active video games results in increased levels of energy expenditure, there is little information on the classification of human movement during active video game play in relation to fundamental movement skills. The aim of this study was to validate software utilising Kinect sensor motion-capture technology to recognise fundamental movement skills (FMS) during active video game play. Two human assessors rated jumping and side-stepping, and these assessments were compared to the Kinect Action Recognition Tool (KART) to establish a level of agreement and determine the number of movements completed during five minutes of active video game play for 43 children (mean age 12 years 7 months ± 1 year 6 months). Inter-rater reliability between the two human raters was found to be higher for the jump (r = 0.94, p … [text missing] … the Kinect sensor can be used to count the number of jumps and side-steps during five minutes of active video game play with a similar level of accuracy as human raters. However, in contrast to humans, the KART system required a fraction of the time to analyse and tabulate the results.

  19. CancellationTools: All-in-one software for administration and analysis of cancellation tasks.

    Science.gov (United States)

    Dalmaijer, Edwin S; Van der Stigchel, Stefan; Nijboer, Tanja C W; Cornelissen, Tim H W; Husain, Masud

    2015-12-01

    In a cancellation task, a participant is required to search for and cross out ("cancel") targets, which are usually embedded among distractor stimuli. The number of cancelled targets and their location can be used to diagnose the neglect syndrome after stroke. In addition, the organization of search provides a potentially useful way to measure executive control over multitarget search. Although many useful cancellation measures have been introduced, most fail to make their way into research studies and clinical practice due to the practical difficulty of acquiring such parameters from traditional pen-and-paper measures. Here we present new, open-source software that is freely available to all. It allows researchers and clinicians to flexibly administer computerized cancellation tasks using stimuli of their choice, and to directly analyze the data in a convenient manner. The automated analysis suite provides output that includes almost all of the currently existing measures, as well as several new ones introduced here. All tasks can be performed using either a computer mouse or a touchscreen as an input device, and an online version of the task runtime is available for tablet devices. A summary of the results is produced in a single A4-sized PDF document, including high quality data visualizations. For research purposes, batch analysis of large datasets is possible. In sum, CancellationTools allows users to employ a flexible, computerized cancellation task, which provides extensive benefits and ease of use.

  20. GTest: a software tool for graphical assessment of empirical distributions' Gaussianity.

    Science.gov (United States)

    Barca, E; Bruno, E; Bruno, D E; Passarella, G

    2016-03-01

    … their request for an effective tool for addressing such difficulties motivated us to adopt the inference-by-eye paradigm and implement an easy-to-use, quick and reliable statistical tool. GTest visualizes its outcomes as a modified version of the Q-Q plot. The application has been developed in Visual Basic for Applications (VBA) within MS Excel 2010, which proved to have all the robustness and reliability needed. GTest provides true graphical normality tests which are as reliable as any quantitative statistical approach but much easier to understand. The Q-Q plots have been augmented with an acceptance region outlined around the representation of the theoretical distribution, defined in accordance with the alpha level of significance and the data sample size. The test decision rule is the following: if the empirical scatterplot falls completely within the acceptance region, then it can be concluded that the empirical distribution fits the theoretical one at the given alpha level. A comprehensive case study has been carried out with simulated and real-world data in order to check the robustness and reliability of the software.

  1. A fluoroscopy-based planning and guidance software tool for minimally invasive hip refixation by cement injection

    NARCIS (Netherlands)

    Malan, D.F.; Van der Walt, S.J.; Raidou, R.G.; Van den Berg, B.; Stoel, B.C.; Botha, C.P.; Nelissen, R.G.H.H.; Valstar, E.R.

    2015-01-01

    Purpose In orthopaedics, minimally invasive injection of bone cement is an established technique. We present HipRFX, a software tool for planning and guiding a cement injection procedure for stabilizing a loosening hip prosthesis. HipRFX works by analysing a pre-operative CT and intraoperative C-arm

  2. Corganiser: a web-based software tool for planning time-sensitive sampling of whole rounds during scientific drilling

    DEFF Research Database (Denmark)

    Marshall, Ian

    2014-01-01

    Corganiser is a software tool developed to simplify the process of preparing whole-round sampling plans for time-sensitive microbiology and geochemistry sampling during scientific drilling. It was developed during the Integrated Ocean Drilling Program (IODP) Expedition 347, but is designed to work...

  3. Plagiarism Detection: A Comparison of Teaching Assistants and a Software Tool in Identifying Cheating in a Psychology Course

    Science.gov (United States)

    Seifried, Eva; Lenhard, Wolfgang; Spinath, Birgit

    2015-01-01

    Essays that are assigned as homework in large classes are prone to cheating via unauthorized collaboration. In this study, we compared the ability of a software tool based on Latent Semantic Analysis (LSA) and student teaching assistants to detect plagiarism in a large group of students. To do so, we took two approaches: the first approach was…
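
    A generic sketch of LSA-style similarity scoring for essays, using scikit-learn (a stand-in written for this article, not the tool evaluated in the study): unusually high pairwise similarity in the latent space flags candidate plagiarism pairs.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

def lsa_similarity(essays, n_components=2):
    """Pairwise cosine similarity of essays in a latent semantic space."""
    tfidf = TfidfVectorizer(stop_words="english").fit_transform(essays)
    latent = TruncatedSVD(n_components=n_components,
                          random_state=0).fit_transform(tfidf)
    return cosine_similarity(latent)

essays = ["The mirror neuron system supports imitation learning.",
          "Imitation learning is supported by the mirror neuron system.",
          "Working memory capacity predicts reading comprehension."]
print(lsa_similarity(essays).round(2))   # first two essays score highest
```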

  4. The Design and Development of a Computerized Tool Support for Conducting Senior Projects in Software Engineering Education

    Science.gov (United States)

    Chen, Chung-Yang; Teng, Kao-Chiuan

    2011-01-01

    This paper presents a computerized tool support, the Meetings-Flow Project Collaboration System (MFS), for designing, directing and sustaining the collaborative teamwork required in senior projects in software engineering (SE) education. Among many schools' SE curricula, senior projects serve as a capstone course that provides comprehensive…

  5. Productivity, part 2: cloud storage, remote meeting tools, screencasting, speech recognition software, password managers, and online data backup.

    Science.gov (United States)

    Lackey, Amanda E; Pandey, Tarun; Moshiri, Mariam; Lalwani, Neeraj; Lall, Chandana; Bhargava, Puneet

    2014-06-01

    It is an opportune time for radiologists to focus on personal productivity. The ever increasing reliance on computers and the Internet has significantly changed the way we work. Myriad software applications are available to help us improve our personal efficiency. In this article, the authors discuss some tools that help improve collaboration and personal productivity, maximize e-learning, and protect valuable digital data.

  6. State transition storyboards: A tool for designing the Goldstone solar system radar data acquisition system user interface software

    Science.gov (United States)

    Howard, S. D.

    1987-01-01

    Effective user interface design in software systems is a complex task that takes place without adequate modeling tools. By combining state transition diagrams and the storyboard technique of filmmakers, State Transition Storyboards were developed to provide a detailed modeling technique for the Goldstone Solar System Radar Data Acquisition System human-machine interface. Illustrations are included with a description of the modeling technique.

  7. EPA's science blog: "It All Starts with Science"; Article title: "EPA's Solvent Substitution Software Tool, PARIS III"

    Science.gov (United States)

    EPA's solvent substitution software tool, PARIS III, is provided by the EPA for free and can be effectively and efficiently used to help environmentally conscious individuals find better and greener solvent mixtures for many common industrial processes. People can downlo...

  8. Mars, accessing the third dimension: a software tool to exploit Mars ground penetrating radars data.

    Science.gov (United States)

    Cantini, Federico; Ivanov, Anton B.

    2016-04-01

    The Mars Advanced Radar for Subsurface and Ionosphere Sounding (MARSIS), on board ESA's Mars Express, and the SHAllow RADar (SHARAD), on board NASA's Mars Reconnaissance Orbiter, are two ground-penetrating radars (GPRs) designed to probe the crust of Mars and explore the subsurface structure of the planet. They have now been collecting data for about 10 years, covering a large fraction of the Martian surface. GPRs collect data by sending electromagnetic (EM) pulses toward the surface and listening for the return echoes produced at dielectric discontinuities at and below the planet's surface. The wavelengths used allow MARSIS EM pulses to penetrate the crust for several kilometers. The data products (radargrams) are matrices in which the x-axis spans different sampling points on the planet's surface and the y-axis is the power of the echoes over time in the listening window. No standard way of managing this kind of data is established in the planetary science community, and data analysis and interpretation very often require some knowledge of radar signal processing. Our software tool aims to ease access to these data, in particular for scientists without a specific background in signal processing. MARSIS and SHARAD geometrical data, such as probing-point latitude and longitude and spacecraft altitude, are stored, together with relevant acquisition metadata, in a geo-enabled relational database implemented using PostgreSQL and PostGIS. Data are extracted from the officially released ESA and NASA products using purpose-built Python classes and scripts and inserted into the database using OGR utilities. This software is also intended to be the core of a collection of classes and scripts for more complex GPR data analysis. Geometrical data and metadata are exposed as WFS layers using a QGIS server, which can be further integrated with other data, such as imaging, spectroscopy and topography. Radar geometry data will be available as a part of the iMars Web

  9. A Software Tool for Atmospheric Correction and Surface Temperature Estimation of Landsat Infrared Thermal Data

    Directory of Open Access Journals (Sweden)

    Benjamin Tardy

    2016-08-01

    Full Text Available Land surface temperature (LST) is an important variable involved in the Earth's surface energy and water budgets and a key component in many aspects of environmental research. The Landsat program, jointly carried out by NASA and the USGS, has been recording thermal infrared data for the past 40 years. Nevertheless, LST data products for Landsat remain unavailable. The atmospheric correction (AC) method commonly used for mono-window Landsat thermal data requires detailed information concerning the vertical structure (temperature, pressure) and the composition (water vapor, ozone) of the atmosphere. For a given coordinate, this information is generally obtained through either radio-sounding or atmospheric model simulations and is passed to the radiative transfer model (RTM) to estimate the local atmospheric correction parameters. Although this approach yields accurate LST data, the results are relevant only near the given coordinate. To meet the scientific community's demand for high-resolution LST maps, we developed a new software tool dedicated to processing Landsat thermal data. The proposed tool improves on the commonly used AC algorithm by incorporating the spatial variations occurring in the Earth's atmospheric composition. The ERA-Interim dataset (from the ECMWF meteorological organization) was used to retrieve vertical atmospheric conditions, which are available at a global scale with a spatial resolution of 0.125 degrees and a temporal resolution of 6 h. A temporal and spatial linear interpolation of meteorological variables was performed to match the acquisition dates and coordinates of the Landsat images. The atmospheric correction parameters were then estimated on the basis of this reconstructed atmospheric grid using the commercial RTM software MODTRAN. The needed surface emissivity was derived from the common vegetation index NDVI, obtained from the red and near-infrared (NIR) bands of the same Landsat image. This permitted an estimation of LST for the entire
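
    The NDVI-based emissivity step can be illustrated with the widely used NDVI-thresholds scheme; the coefficients below are typical textbook values and are not claimed to be those of the published tool.

```python
import numpy as np

def emissivity_from_ndvi(red, nir, e_soil=0.97, e_veg=0.99):
    """Simplified NDVI-thresholds emissivity: bare soil below NDVI 0.2,
    full canopy above 0.5, and a fractional-vegetation mix in between."""
    red = np.asarray(red, float)
    nir = np.asarray(nir, float)
    ndvi = (nir - red) / (nir + red + 1e-9)
    pv = np.clip((ndvi - 0.2) / (0.5 - 0.2), 0.0, 1.0) ** 2  # veg fraction
    mixed = e_veg * pv + e_soil * (1.0 - pv)
    return np.where(ndvi < 0.2, e_soil,
                    np.where(ndvi > 0.5, e_veg, mixed))

# Toy reflectances for three pixels: bare soil, mixed, dense vegetation.
print(emissivity_from_ndvi([0.30, 0.20, 0.05], [0.32, 0.35, 0.60]))
```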

  10. Open Software Tools Applied to Jordan's National Multi-Agent Water Management Model

    Science.gov (United States)

    Knox, Stephen; Meier, Philipp; Harou, Julien; Yoon, Jim; Selby, Philip; Lachaut, Thibaut; Klassert, Christian; Avisse, Nicolas; Khadem, Majed; Tilmant, Amaury; Gorelick, Steven

    2016-04-01

    Jordan is the fourth most water-scarce country in the world, where demand exceeds supply in a politically and demographically unstable context. The Jordan Water Project (JWP) aims to perform policy evaluation by modelling the hydrology, economics, and governance of Jordan's water resource system. The multidisciplinary nature of the project requires a modelling software system capable of integrating submodels from multiple disciplines into a single decision-making process and communicating results to stakeholders. This requires a tool for building an integrated model and a system where diverse data sets can be managed and visualised. The integrated Jordan model is built using Pynsim, an open-source multi-agent simulation framework implemented in Python. Pynsim operates on network structures of nodes and links and supports institutional hierarchies, where an institution represents a grouping of nodes, links or other institutions. At each time step, code within each node, link and institution can be executed independently, allowing for their fully autonomous behaviour. Additionally, engines (sub-models) perform actions over the entire network or on a subset of the network, such as taking a decision on a set of nodes. Pynsim is modular in design, allowing distinct modules to be modified easily without affecting others. Data management and visualisation is performed using Hydra (www.hydraplatform.org), an open software platform allowing users to manage network structure and data. The Hydra data manager connects to Pynsim, providing the necessary input parameters for the integrated model. By providing a high-level portal to the model, Hydra removes a barrier between the users of the model (researchers, stakeholders, planners, etc.) and the model itself, allowing them to manage data, run the model and visualise results all through a single user interface. Pynsim's ability to represent institutional hierarchies, inter-network communication and the separation of node, link and
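
    The following is an illustrative miniature of the node/institution/engine design described above; the class and method names are assumptions for the example, not Pynsim's actual API:

        # Nodes act autonomously each time step; institutions group nodes and can
        # act on all of them; engines are sub-models run over the whole network.
        class Node:
            def __init__(self, name):
                self.name = name
                self.storage = 0.0

            def step(self, timestep):
                pass  # autonomous per-node behaviour goes here

        class Institution:
            """A grouping of nodes that can take decisions over its members."""
            def __init__(self, name, nodes):
                self.name, self.nodes = name, nodes

            def step(self, timestep):
                for n in self.nodes:            # e.g. an allocation decision
                    n.storage = max(0.0, n.storage - 1.0)

        class Simulator:
            def __init__(self, nodes, institutions, engines):
                self.nodes, self.institutions, self.engines = nodes, institutions, engines

            def run(self, timesteps):
                for t in timesteps:
                    for agent in self.nodes + self.institutions:
                        agent.step(t)           # independent agent code
                    for engine in self.engines:
                        engine(self.nodes, t)   # sub-models over the network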

  11. SAGES: a suite of freely-available software tools for electronic disease surveillance in resource-limited settings.

    Directory of Open Access Journals (Sweden)

    Sheri L Lewis

    Full Text Available Public health surveillance is undergoing a revolution driven by advances in the field of information technology. Many countries have experienced vast improvements in the collection, ingestion, analysis, visualization, and dissemination of public health data. Resource-limited countries have lagged behind due to challenges in information technology infrastructure, public health resources, and the costs of proprietary software. The Suite for Automated Global Electronic bioSurveillance (SAGES) is a collection of modular, flexible, freely-available software tools for electronic disease surveillance in resource-limited settings. One or more SAGES tools may be used in concert with existing surveillance applications, or the SAGES tools may be used en masse for an end-to-end biosurveillance capability. This flexibility allows for the development of an inexpensive, customized, and sustainable disease surveillance system. The ability to rapidly assess anomalous disease activity may lead to more efficient use of limited resources and better compliance with World Health Organization International Health Regulations.

  12. Sustaining an Online, Shared Community Resource for Models, Robust Open source Software Tools and Data for Volcanology - the Vhub Experience

    Science.gov (United States)

    Patra, A. K.; Valentine, G. A.; Bursik, M. I.; Connor, C.; Connor, L.; Jones, M.; Simakov, N.; Aghakhani, H.; Jones-Ivey, R.; Kosar, T.; Zhang, B.

    2015-12-01

    Over the last 5 years we have created the community collaboratory Vhub.org [Palma et al., J. App. Volc. 3:2, doi:10.1186/2191-5040-3-2] as a place to find volcanology-related resources, a venue for users to disseminate tools, teaching resources and data, and an online platform to support collaborative efforts. As the community (current active users > 6000, from an estimated community of comparable size) has embedded the collaboratory's tools into educational and research workflows, it has become imperative to: a) redesign tools into robust, open-source, reusable software for online and offline usage/enhancement; b) share large datasets with remote collaborators and other users seamlessly and securely; c) support complex workflows for uncertainty analysis, validation and verification, and data assimilation with large data. The focus on tool development/redevelopment has been twofold: first, to use best practices in software engineering and new hardware like multi-core and graphics processing units; second, to enhance capabilities supporting inverse modeling, uncertainty quantification using large ensembles and design of experiments, calibration, and validation. The software engineering practices we follow include open-source licensing to facilitate community contributions, modularity and reusability. Our initial targets are four popular tools on Vhub - TITAN2D, TEPHRA2, PUFF and LAVA. Use of tools like these requires many observation-driven data sets, e.g. digital elevation models of topography, satellite imagery, field observations on deposits, etc. These data are often maintained in private repositories and shared by "sneaker-net". As a partial solution to this we tested mechanisms using iRODS software for online sharing of private data with public metadata and access limits. Finally, we adapted workflow engines (e.g. Pegasus) to support the complex data and computing workflows needed for usage like uncertainty quantification for hazard analysis using physical

  13. Linking meteoroid streams to their parent bodies by means of orbital association software tools

    Science.gov (United States)

    Madiedo, Jose Maria; Trigo-Rodriguez, Josep Maria

    2013-01-01

    A Microsoft Windows-compatible software package, called ORAS (ORbital Association Software), has been developed for verifying possible associations between orbits using different existing criteria. Applying this software revealed a likely association between the orbit of a Northern Chi-Orionid (ORN) fireball and asteroid (PHA) 2008XM1. A numerical integration of the orbital parameters 4000 years backwards in time shows that this asteroid is a better match for this Northern Chi-Orionid than asteroid (NEO) 2002XM35.
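
    The record does not list the association criteria ORAS implements; a commonly used one is the Southworth-Hawkins D-criterion (1963). A minimal sketch, assuming the standard form of that formula and glossing over its edge-case sign conventions:

        import math

        def d_sh(q1, e1, i1, w1, Om1, q2, e2, i2, w2, Om2):
            """Southworth-Hawkins D between two orbits (angles in radians, q in AU)."""
            # Mutual inclination term (2 sin(I21/2))^2.
            sin_I_half_sq = ((2 * math.sin((i2 - i1) / 2)) ** 2
                             + math.sin(i1) * math.sin(i2)
                             * (2 * math.sin((Om2 - Om1) / 2)) ** 2)
            I21 = 2 * math.asin(math.sqrt(sin_I_half_sq) / 2)

            # Difference of longitudes of perihelion measured from the
            # intersection of the two orbital planes.
            pi21 = w2 - w1 + 2 * math.asin(
                math.cos((i2 + i1) / 2) * math.sin((Om2 - Om1) / 2)
                / math.cos(I21 / 2))

            d2 = ((e2 - e1) ** 2 + (q2 - q1) ** 2 + sin_I_half_sq
                  + (((e1 + e2) / 2) * 2 * math.sin(pi21 / 2)) ** 2)
            return math.sqrt(d2)

    Orbit pairs with D below a small cutoff (values around 0.1-0.2 are typical in the meteor literature) are treated as candidate associations.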

  14. Software sensors as a tool for optimization of animal-cell cultures.

    OpenAIRE

    Dorresteijn, P.C.

    1997-01-01

    In this thesis software sensors are introduced that predict the biomass activity and the concentrations of glucose, glutamine, lactic acid, and ammonium on-line. The software sensors for biomass activity, glucose and lactic acid can be applied to any type of animal cell grown in a bioreactor system. The glutamine and ammonium software sensors are determined experimentally by correlating them to the acid-production rate; therefore, they can only be used for Vero cells. In the developme...

  15. A flexible, interactive software tool for fitting the parameters of neuronal models

    Directory of Open Access Journals (Sweden)

    Péter eFriedrich

    2014-07-01

    Full Text Available The construction of biologically relevant neuronal models, as well as model-based analysis of experimental data, often requires the simultaneous fitting of multiple model parameters, so that the behavior of the model in a certain paradigm matches (as closely as possible) the corresponding output of a real neuron according to some predefined criterion. Although the task of model optimization is often computationally hard, and the quality of the results depends heavily on technical issues such as the appropriate choice (and implementation) of cost functions and optimization algorithms, no existing program provides access to the best available methods while also guiding the user through the process effectively. Our software, called Optimizer, implements a modular and extensible framework for the optimization of neuronal models, and also features a graphical interface which makes it easy for even non-expert users to handle many commonly occurring scenarios. Meanwhile, educated users can extend the capabilities of the program and customize it according to their needs with relatively little effort. Optimizer has been developed in Python, takes advantage of open-source Python modules for nonlinear optimization, and interfaces directly with the NEURON simulator to run the models. Other simulators are supported through an external interface. We have tested the program on several different types of problem of varying complexity, using different model classes. As targets, we used simulated traces from the same or a more complex model class, as well as experimental data. We successfully used Optimizer to determine passive parameters and conductance densities in compartmental models, and to fit simple (adaptive exponential) integrate-and-fire neuronal models to complex biological data. Our detailed comparisons show that Optimizer can handle a wider range of problems, and delivers equally good or better performance than any other existing neuronal model fitting
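
    A minimal sketch of the fitting loop such a tool automates, using a toy leaky-integrator model and SciPy's Nelder-Mead optimiser; both are assumptions for the example, not Optimizer's internals:

        import numpy as np
        from scipy.optimize import minimize

        t = np.linspace(0.0, 0.5, 500)                 # s
        target = 20.0 * (1 - np.exp(-t / 0.03))        # "experimental" voltage trace

        def model(params):
            gain, tau = params
            return gain * (1 - np.exp(-t / tau))

        def cost(params):
            return np.mean((model(params) - target) ** 2)  # mean squared error

        res = minimize(cost, x0=[10.0, 0.01], method="Nelder-Mead")
        print(res.x)   # recovered (gain, tau), approximately (20.0, 0.03)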

  16. Development of a Kinect Software Tool to Classify Movements during Active Video Gaming.

    Directory of Open Access Journals (Sweden)

    Michael Rosenberg

    Full Text Available While it has been established that using full-body motion to play active video games results in increased levels of energy expenditure, there is little information on the classification of human movement during active video game play in relation to fundamental movement skills. The aim of this study was to validate software utilising Kinect sensor motion capture technology to recognise fundamental movement skills (FMS) during active video game play. Two human assessors rated jumping and side-stepping, and these assessments were compared to the Kinect Action Recognition Tool (KART) to establish a level of agreement and determine the number of movements completed during five minutes of active video game play, for 43 children (m = 12 years 7 months ± 1 year 6 months). During five minutes of active video game play, inter-rater reliability between the two human raters was found to be higher for the jump (r = 0.94, p < .01) than the sidestep (r = 0.87, p < .01), although both were excellent. Excellent reliability was also found between human raters and the KART system for the jump (r = 0.84, p < .01) and moderate reliability for the sidestep (r = 0.6983, p < .01) during game play, demonstrating that both humans and KART had higher agreement for jumps than sidesteps in the game play condition. The results of the study provide confidence that the Kinect sensor can be used to count the number of jumps and sidesteps during five minutes of active video game play with a similar level of accuracy as human raters. However, in contrast to humans, the KART system required a fraction of the time to analyse and tabulate the results.
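
    The agreement statistic reported above is a Pearson correlation between two raters' movement counts; a short example with made-up counts:

        from scipy.stats import pearsonr

        human_rater = [12, 8, 15, 10, 7, 14, 9, 11]   # jumps counted per child
        kart_system = [11, 8, 16, 10, 6, 14, 10, 11]

        r, p = pearsonr(human_rater, kart_system)
        print(f"r = {r:.2f}, p = {p:.3f}")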

  17. Tools to Support the Reuse of Software Assets for the NASA Earth Science Decadal Survey Missions

    Science.gov (United States)

    Mattmann, Chris A.; Downs, Robert R.; Marshall, James J.; Most, Neal F.; Samadi, Shahin

    2011-01-01

    The NASA Earth Science Data Systems (ESDS) Software Reuse Working Group (SRWG) is chartered with the investigation, production, and dissemination of information related to the reuse of NASA Earth science software assets. One major current objective is to engage the NASA decadal missions in areas relevant to software reuse. In this paper we report on the current status of these activities. First, we provide some background on the SRWG in general and then discuss the group's flagship recommendation, the NASA Reuse Readiness Levels (RRLs). We continue by describing areas in which mission software may be reused in the context of NASA decadal missions. We conclude the paper with pointers to future directions.

  18. Development of the software tool for generation and visualization of the finite element head model with bone conduction sounds

    Science.gov (United States)

    Nikolić, Dalibor; Milošević, Žarko; Saveljić, Igor; Filipović, Nenad

    2015-12-01

    Vibration of the skull causes a hearing sensation, which we call bone conduction (BC) sound. Several investigations have addressed the transmission properties of bone-conducted sound. The aim of this study was to develop a software tool for easy generation of a finite element (FE) model of the human head with different materials based on human head anatomy, and to calculate sound conduction through the head. The developed software tool generates a model in a few steps. The first step is segmentation of CT medical images (DICOM) to generate surface mesh files (STL). Each STL file represents a different layer of the human head with different material properties (brain, CSF, different layers of the skull bone, skin, etc.). The next steps are to build a tetrahedral mesh from the obtained STL files, to define the FE model boundary conditions and to solve the FE equations. The tool uses the PAK solver, open-source software developed in the SIFEM FP7 project, for calculations of the head vibration. The purpose of this tool is to show the impact of bone-conducted sound on the hearing system and to assess how well the obtained results match experimental measurements.

  19. A new online software tool for pressure ulcer monitoring as an educational instrument for unified nursing assessment in clinical settings

    Directory of Open Access Journals (Sweden)

    Andrea Pokorná

    2016-07-01

    Full Text Available Data collection, and the evaluation of those data, is crucial for effective quality management and naturally also for the prevention and treatment of pressure ulcers. Data collected in a uniform manner by nurses in clinical practice could be used for further analyses. Data about pressure ulcers are collected to differing degrees of quality, based on the local policy of the given health care facility and in relation to the nurse's actual level of knowledge concerning pressure ulcer identification and the use of objective scales (i.e. categorization of pressure ulcers). Therefore, we have developed software suitable for data collection which includes educational tools to promote unified reporting of data by nurses. A description of this software and of some educational and learning components of the tool is presented herein. The planned process of clinical application of the newly developed software is also briefly mentioned. The discussion focuses on the usability of the online reporting tool and possible further development of the tool.

  20. Software sensors as a tool for optimization of animal-cell cultures.

    NARCIS (Netherlands)

    Dorresteijn, P.C.

    1997-01-01

    In this thesis software sensors are introduced that predict the biomass activity and the concentrations of glucose, glutamine, lactic acid, and ammonium on-line. The software sensors for biomass activity, glucose and lactic acid can be applied to any type of animal cell that is grown in a bioreacto

  1. A Process Framework for Designing Software Reference Architectures for Providing Tools as a Service

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali; Probst, Christian W.

    2016-01-01

    Software Reference Architecture (SRA), which is a generic architecture solution for a specific type of software system, provides the foundation for the design of concrete architectures in terms of architecture design guidelines and architecture elements. The complexity and size of certain types of s...

  2. Using McIDAS-V data analysis and visualization software as an educational tool for understanding the atmosphere

    Science.gov (United States)

    Achtor, T. H.; Rink, T.

    2010-12-01

    The University of Wisconsin's Space Science and Engineering Center (SSEC) has been at the forefront of developing data analysis and visualization tools for environmental satellites and other geophysical data. The fifth generation of the Man-computer Interactive Data Access System (McIDAS-V) is Java-based, open-source, freely available software that operates on Linux, Macintosh and Windows systems. The software tools provide powerful new data manipulation and visualization capabilities that work with geophysical data in research, operational and educational environments. McIDAS-V provides unique capabilities to support innovative techniques for evaluating research results, teaching and training. McIDAS-V is based on three powerful software elements. VisAD is a Java library for building interactive, collaborative, 4-dimensional visualization and analysis tools. The Integrated Data Viewer (IDV) is a reference application based on the VisAD system and developed by the Unidata program; it demonstrates the flexibility that is needed in this evolving environment, using a modern, object-oriented software design approach. The third tool, HYDRA, allows users to build, display and interrogate multi- and hyperspectral environmental satellite data in powerful ways. The McIDAS-V software is being used for training and education in several settings. The McIDAS User Group provides training workshops at its annual meeting, and numerous online tutorials with training data sets have been developed to aid users in learning simple and more complex operations in McIDAS-V; all are available online. In a University of Wisconsin-Madison undergraduate course in Radar and Satellite Meteorology, McIDAS-V is used to create and deliver laboratory exercises using case study and real-time data. At the high school level, McIDAS-V is used in several exercises in our annual Summer Workshop in Earth and Atmospheric Sciences to provide young scientists the opportunity to examine data with friendly and

  3. Algorithms and software tools for ordering clone libraries: application to the mapping of the genome of Schizosaccharomyces pombe.

    Science.gov (United States)

    Mott, R; Grigoriev, A; Maier, E; Hoheisel, J; Lehrach, H

    1993-04-25

    A complete set of software tools to aid the physical mapping of a genome has been developed and successfully applied to the genomic mapping of the fission yeast Schizosaccharomyces pombe. Two approaches were used for ordering single-copy hybridisation probes: one was based on the simulated annealing algorithm to order all probes, and another on inferring the minimum-spanning subset of the probes using a heuristic filtering procedure. Both algorithms produced almost identical maps, with minor differences in the order of repetitive probes and those having identical hybridisation patterns. A separate algorithm fitted the clones to the established probe order. Approaches for handling experimental noise and repetitive elements are discussed. In addition to these programs and the database management software, tools for visualizing and editing the data are described. The issues of combining the information from different libraries are addressed. Also, ways of handling multiple-copy probes and non-hybridisation data are discussed.
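
    A toy sketch of the first approach, ordering binary hybridisation fingerprints by simulated annealing so that neighbouring probes have similar patterns; the data, move set and cooling schedule are illustrative assumptions:

        import math
        import random

        def hamming(a, b):
            return sum(x != y for x, y in zip(a, b))

        def order_cost(order, probes):
            # Total dissimilarity between adjacent probes in the ordering.
            return sum(hamming(probes[order[k]], probes[order[k + 1]])
                       for k in range(len(order) - 1))

        def anneal(probes, steps=20000, t0=5.0):
            order = list(range(len(probes)))
            cost = order_cost(order, probes)
            for step in range(steps):
                temp = t0 * (1.0 - step / steps) + 1e-9   # linear cooling
                i, j = sorted(random.sample(range(len(order)), 2))
                order[i:j + 1] = reversed(order[i:j + 1])  # propose: reverse a segment
                new_cost = order_cost(order, probes)
                if (new_cost <= cost
                        or random.random() < math.exp((cost - new_cost) / temp)):
                    cost = new_cost                        # accept the move
                else:
                    order[i:j + 1] = reversed(order[i:j + 1])  # reject: undo
            return order, cost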

  4. Numerical arc segmentation algorithm for a radio conference - A software tool for communication satellite systems planning

    Science.gov (United States)

    Whyte, W. A.; Heyward, A. O.; Ponchak, D. S.; Spence, R. L.; Zuzek, J. E.

    1988-01-01

    A detailed description of the Numerical Arc Segmentation Algorithm for a Radio Conference (NASARC) software package for communication satellite systems planning is presented. This software provides a method of generating predetermined arc segments for use in the development of an allotment planning procedure to be carried out at the 1988 World Administrative Radio Conference (WARC-88) on the use of the geostationary orbit (GEO) and the planning of space services utilizing it. The features of the NASARC software package are described, and detailed information is given about the function of each of the four NASARC program modules. The results of a sample world scenario are presented and discussed.

  5. A user-friendly software tool for the solution of the time-dependent Schroedinger and Gross-Pitaevskii equations

    Energy Technology Data Exchange (ETDEWEB)

    Serafini, Thomas; Bertoni, Andrea, E-mail: andrea.bertoni@unimore.i [S3 National Research Center, INFM-CNR, 41125 Modena (Italy)

    2009-11-15

    In this work we present TDStool, a general-purpose, easy-to-use software tool for the solution of the time-dependent Schroedinger equation in 2D and 3D domains with arbitrary time-dependent potentials. The numerical algorithms adopted in the code, namely the Fourier split-step and box-integration methods, are sketched and the main characteristics of the tool are illustrated. As an example, the dynamics of a single electron in systems of two and three coupled quantum dots is obtained. The code is released as an open-source project and has a built-in graphical interface for the visualization of the results.
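
    A minimal sketch of one split-step Fourier time step for the 1D time-dependent Schroedinger equation (in units hbar = m = 1); the grid, potential and step size are illustrative, and the tool itself works in 2D and 3D:

        import numpy as np

        N, L, dt = 1024, 50.0, 0.01
        x = np.linspace(-L / 2, L / 2, N, endpoint=False)
        k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)
        V = 0.5 * x**2                                   # harmonic potential (example)
        psi = np.exp(-(x - 2.0) ** 2)                    # displaced Gaussian packet
        psi /= np.sqrt(np.sum(np.abs(psi) ** 2) * (L / N))

        def step(psi):
            psi = np.exp(-0.5j * V * dt) * psi           # half step in the potential
            psi = np.fft.ifft(np.exp(-0.5j * k**2 * dt) * np.fft.fft(psi))  # kinetic
            return np.exp(-0.5j * V * dt) * psi          # second potential half step

        for _ in range(1000):
            psi = step(psi)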

  6. The Seismic Tool-Kit (STK): an open source software for seismology and signal processing.

    Science.gov (United States)

    Reymond, Dominique

    2016-04-01

    We present an open-source software project (GNU public license), named STK: Seismic ToolKit, dedicated mainly to seismology and signal processing. The STK project, which started in 2007, is hosted by SourceForge.net and counts more than 19,500 downloads at the time of writing. The STK project is composed of two main branches. The first is a graphical interface dedicated to signal processing in the SAC format (SAC_ASCII and SAC_BIN), where the signal can be plotted, zoomed, filtered, integrated, differentiated, etc. (a large variety of IIR and FIR filters is provided). The spectral density of the signal is estimated via the Fourier transform, with visualization of the power spectral density (PSD) in linear or log scale, and also as an evolutive time-frequency representation (or sonogram). Three-component signals can also be processed to estimate their polarization properties, either for a given window or for evolutive windows along time. This polarization analysis is useful for extracting polarized noise and for differentiating P waves, Rayleigh waves, Love waves, etc. The second branch is a panel of utility programs for working in terminal mode, with basic programs for computing azimuth and distance in spherical geometry, inter/auto-correlation, spectral density, time-frequency representations for an entire directory of signals, focal planes and main component axes, radiation patterns of P waves, polarization analysis of different waves (including noise), under/over-sampling of signals, cubic-spline smoothing, and linear/nonlinear regression analysis of data sets. A MINimum library of Linear AlGebra (MIN-LINAG) is also provided for computing the main matrix operations: QR/QL decomposition, Cholesky solution of linear systems, finding eigenvalues/eigenvectors, QR-solve/eigen-solve of linear equation systems, etc. STK is developed in C/C++, mainly under Linux OS, and has also been partially implemented under MS-Windows. Useful links: http
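
    A short example of the kind of spectral estimate the STK interface exposes, here computed with SciPy's Welch method on a synthetic trace standing in for a seismic signal:

        import numpy as np
        from scipy.signal import welch

        fs = 100.0                                   # sampling rate, Hz
        t = np.arange(0, 60, 1 / fs)
        trace = np.sin(2 * np.pi * 1.5 * t) + 0.5 * np.random.randn(t.size)

        f, psd = welch(trace, fs=fs, nperseg=1024)   # power spectral density
        print(f[np.argmax(psd)])                     # dominant frequency, ~1.5 Hz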

  7. GUM2DFT—a software tool for uncertainty evaluation of transient signals in the frequency domain

    Science.gov (United States)

    Eichstädt, S.; Wilkens, V.

    2016-05-01

    The Fourier transform and its counterpart for discrete-time signals, the discrete Fourier transform (DFT), are common tools in measurement science and application. Although almost every scientific software package offers ready-to-use implementations of the DFT, the propagation of uncertainties in line with the Guide to the Expression of Uncertainty in Measurement (GUM) is typically neglected. This is of particular importance in dynamic metrology, when input estimation is carried out by deconvolution in the frequency domain. To this end, we present the new open-source software tool GUM2DFT, which utilizes closed formulas for the efficient propagation of uncertainties through the DFT, the inverse DFT and input estimation in the frequency domain. It handles different frequency domain representations, accounts for autocorrelation and takes advantage of the symmetry inherent in the DFT result for real-valued time domain signals. All tools are presented in terms of examples which form part of the software package. GUM2DFT will foster GUM-compliant evaluation of uncertainty in DFT-based analyses and enable metrologists to include uncertainty evaluations in their routine work.
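
    GUM2DFT uses closed formulas; as a rough illustration of what is being propagated, the sketch below does the same job by brute-force Monte Carlo for a toy transient signal with uncorrelated sample uncertainties:

        import numpy as np

        rng = np.random.default_rng(1)
        n, runs = 256, 5000
        t = np.arange(n) / n
        signal = np.exp(-((t - 0.3) / 0.05) ** 2)    # transient pulse estimate
        u_signal = 0.01                              # standard uncertainty per sample

        # Perturb the signal, transform each draw, and look at the spread of the
        # amplitude spectrum across draws.
        spectra = np.abs(np.fft.rfft(
            signal + u_signal * rng.standard_normal((runs, n))))
        amp_mean = spectra.mean(axis=0)
        amp_unc = spectra.std(axis=0, ddof=1)        # MC amplitude uncertainty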

  8. METHODS AND TOOLS FOR DEVELOPING COMPUTER LEARNING SOFTWARE ON ELECTRICAL ENGINEERING BASED ON WIKI TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    Roman V. Alekhin

    2014-01-01

    Full Text Available This work is devoted to issues related to the development of learning software and, in particular, digital libraries and knowledge bases in different fields and disciplines. The main attention is paid to the development of computer learning software for electrical engineering on the basis of the rapidly growing and popular wiki technology. This work was supported by RFBR (projects 12-08-00358, 14-01-00427, 12-07-00508).

  9. Particle Loss Calculator – a new software tool for the assessment of the performance of aerosol inlet systems

    Directory of Open Access Journals (Sweden)

    S.-L. von der Weiden

    2009-09-01

    Full Text Available Most aerosol measurements require an inlet system to transport aerosols from a selected sampling location to a suitable measurement device through some length of tubing. Such inlet systems must be optimized to minimize aerosol sampling artifacts and maximize sampling efficiency. In this study we introduce a new multifunctional software tool (Particle Loss Calculator, PLC) that can be used to quickly determine aerosol sampling efficiency and particle transport losses due to passage through arbitrary tubing systems. The software employs relevant empirical and theoretical relationships found in established literature and accounts for the most important sampling and transport effects that might be encountered during deployment of typical, ground-based ambient aerosol measurements through a constant-diameter sampling probe. The software treats non-isoaxial and non-isokinetic aerosol sampling, aerosol diffusion and sedimentation, as well as turbulent inertial deposition and inertial deposition in bends and contractions of tubing. The software was validated through comparison with experimentally determined particle losses for several tubing systems bent to create various diffusion, sedimentation and inertial deposition properties. As long as the tube geometries are not "too extreme", agreement is satisfactory. We discuss the conclusions of these experiments, the limitations of the software and present three examples of the use of the Particle Loss Calculator in the field.
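
    A back-of-envelope sketch of one loss mechanism such a calculator treats, gravitational settling in a horizontal sampling line; the Stokes settling velocity (slip correction neglected) and the crude plug-flow geometry factor are illustrative assumptions, not PLC's actual formulation:

        def settling_velocity(d_p, rho_p=1000.0, eta=1.81e-5, g=9.81):
            """Stokes terminal velocity for particle diameter d_p [m]; valid only
            for particles much larger than the gas mean free path."""
            return rho_p * d_p**2 * g / (18.0 * eta)

        def sedimentation_penetration(d_p, length, d_tube, flow_velocity):
            """Fraction of particles surviving a horizontal tube (plug flow)."""
            fall = settling_velocity(d_p) * length / flow_velocity
            return max(0.0, 1.0 - fall / d_tube)

        # 5 um particles through 2 m of 8 mm tubing at 1 m/s: roughly 0.8 survive.
        print(sedimentation_penetration(5e-6, 2.0, 8e-3, 1.0))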

  10. Particle Loss Calculator – a new software tool for the assessment of the performance of aerosol inlet systems

    Directory of Open Access Journals (Sweden)

    S.-L. von der Weiden

    2009-04-01

    Full Text Available Most aerosol measurements require an inlet system to transport aerosols from a selected sampling location to a suitable measurement device through some length of tubing. Such inlet systems must be optimized to minimize aerosol sampling artifacts and maximize sampling efficiency. In this study we introduce a new multifunctional software tool (Particle Loss Calculator, PLC) that can be used to quickly determine aerosol sampling efficiency and particle transport losses due to passage through arbitrary tubing systems. The software employs relevant empirical and theoretical relationships found in established literature and accounts for the most important sampling and transport effects that might be encountered during deployment of typical, ground-based ambient aerosol measurements. The software treats non-isoaxial and non-isokinetic aerosol sampling, aerosol diffusion and sedimentation, as well as turbulent inertial deposition and inertial deposition in bends and contractions of tubing. The software was validated through comparison with experimentally determined particle losses for several tubing systems bent to create various diffusion, sedimentation and inertial deposition properties. As long as the tube geometries are not "too extreme", agreement is satisfactory. We discuss the conclusions of these experiments, the limitations of the software and present three examples of the use of the Particle Loss Calculator in the field.

  11. Lilith: A software framework for the rapid development of scalable tools for distributed computing

    Energy Technology Data Exchange (ETDEWEB)

    Gentile, A.C.; Evensky, D.A.; Armstrong, R.C.

    1998-03-01

    Lilith is a general-purpose framework, written in Java, that provides highly scalable distribution of user code across a heterogeneous computing platform. By creation of suitable user code, the Lilith framework can be used for tool development. The scalable performance provided by Lilith is crucial to the development of effective tools for large distributed systems. Furthermore, since Lilith handles the details of code distribution and communication, the user code can focus primarily on the tool functionality, greatly decreasing the time required for tool development. In this paper, the authors concentrate on the use of the Lilith framework to develop scalable tools. The authors review the functionality of Lilith and introduce a typical tool capitalizing on the features of the framework. They present new Objects directly involved with tool creation, explain details of development and illustrate with an example. They present timing results demonstrating scalability.

  12. TU-C-17A-03: An Integrated Contour Evaluation Software Tool Using Supervised Pattern Recognition for Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Chen, H; Tan, J; Kavanaugh, J; Dolly, S; Gay, H; Thorstad, W; Anastasio, M; Altman, M; Mutic, S; Li, H [Washington University School of Medicine, Saint Louis, MO (United States)

    2014-06-15

    Purpose: Radiotherapy (RT) contours delineated either manually or semiautomatically require verification before clinical usage. Manual evaluation is very time consuming. A new integrated software tool using supervised pattern contour recognition was thus developed to facilitate this process. Methods: The contouring tool was developed using the object-oriented programming language C# and application programming interfaces such as the Visualization Toolkit (VTK). The C# language served as the tool design basis. The Accord.Net scientific computing libraries were utilized for the required statistical data processing and pattern recognition, while VTK was used to build and render 3-D mesh models of critical RT structures in real time with 360° visualization. Principal component analysis (PCA) was used so that the system could learn the geometric variations of normal structures from physician-approved RT contours serving as a training dataset. An in-house supervised PCA-based contour recognition method was used to automatically evaluate contour normality/abnormality. The function for reporting the contour evaluation results was implemented using C# and the Windows Forms Designer. Results: The software input was RT simulation images and RT structures from commercial clinical treatment planning systems. Several abilities were demonstrated: automatic assessment of RT contours, file loading/saving of various modality medical images and RT contours, and generation/visualization of 3-D images and anatomical models. Moreover, it supported 360° rendering of the RT structures in a multi-slice view, which allows physicians to visually check and edit abnormally contoured structures. Conclusion: This new software integrates the supervised learning framework with image processing and graphical visualization modules for RT contour verification. This tool has great potential for facilitating treatment planning with the assistance of an automatic contour evaluation module in avoiding
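
    A minimal sketch of the supervised-PCA idea: learn the normal variation of contour descriptors from approved examples, then flag a new contour whose PCA reconstruction error is unusually large. Feature extraction from real RT structures is omitted and the inputs are toy vectors:

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        approved = rng.normal(size=(200, 40))          # training shape descriptors
        approved[:, 0] *= 5.0                          # one dominant mode of variation

        pca = PCA(n_components=5).fit(approved)

        def reconstruction_error(x):
            z = pca.inverse_transform(pca.transform(x.reshape(1, -1)))
            return float(np.linalg.norm(x - z))

        # Threshold from the training set's own error distribution.
        threshold = np.percentile(
            [reconstruction_error(x) for x in approved], 95)

        new_contour = rng.normal(size=40) + 8.0        # grossly shifted: abnormal
        print(reconstruction_error(new_contour) > threshold)   # True: flag for review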

  13. Lilith: A software framework for the rapid development of scalable tools for distributed computing

    Energy Technology Data Exchange (ETDEWEB)

    Gentile, A.C.; Evensky, D.A.; Armstrong, R.C.

    1997-12-31

    Lilith is a general-purpose tool that provides highly scalable, easy distribution of user code across a heterogeneous computing platform. By handling the details of code distribution and communication, such a framework allows for the rapid development of tools for the use and management of large distributed systems. This speed-up in development not only enables the easy creation of tools as needed but also facilitates the ultimate development of more refined, hard-coded tools as well. Lilith is written in Java, providing platform independence and further facilitating rapid tool development through Object reuse and ease of development. The authors present the user-involved objects in the Lilith Distributed Object System and the Lilith User API. They present an example of tool development, illustrating the user calls, and present results demonstrating Lilith's scalability.

  14. Development of a case tool to support decision based software development

    Science.gov (United States)

    Wild, Christian J.

    1993-01-01

    A summary of the accomplishments of the research over the past year is presented. Achievements include: demonstrations of DHC, a prototype supporting the decision based software development (DBSD) methodology, for Paramax personnel at ODU; meetings with Paramax personnel to discuss DBSD issues, the process of integrating DBSD and Refinery, and the porting process model; completion and submission of a paper describing the DBSD paradigm to IFIP '92; completion and presentation of a paper describing the approach to software reuse at the Software Reuse Workshop in April 1993; extension of DHC with a project agenda, a facility necessary for better project management; completion of a first draft of the re-engineering process model for porting; creation of a logging form to trace all the activities involved in solving the reengineering problem; and development of a preliminary chart of the problems involved in the reengineering process.

  15. Validation of a Low-Thrust Mission Design Tool Using Operational Navigation Software

    Science.gov (United States)

    Englander, Jacob A.; Knittel, Jeremy M.; Williams, Ken; Stanbridge, Dale; Ellison, Donald H.

    2017-01-01

    Design of flight trajectories for missions employing solar electric propulsion requires a suitably high-fidelity design tool. In this work, the Evolutionary Mission Trajectory Generator (EMTG) is presented as a medium-high fidelity design tool that is suitable for mission proposals. EMTG is validated against the high-heritage deep-space navigation tool MIRAGE, demonstrating both the accuracy of EMTG's model and an operational mission design and navigation procedure using both tools. The validation is performed using a benchmark mission to the Jupiter Trojans.

  16. ARCHER, a New Monte Carlo Software Tool for Emerging Heterogeneous Computing Environments

    Science.gov (United States)

    Xu, X. George; Liu, Tianyu; Su, Lin; Du, Xining; Riblett, Matthew; Ji, Wei; Gu, Deyang; Carothers, Christopher D.; Shephard, Mark S.; Brown, Forrest B.; Kalra, Mannudeep K.; Liu, Bob

    2014-06-01

    The Monte Carlo radiation transport community faces a number of challenges associated with peta- and exa-scale computing systems that rely increasingly on heterogeneous architectures involving hardware accelerators such as GPUs. Existing Monte Carlo codes and methods must be strategically upgraded to meet emerging hardware and software needs. In this paper, we describe the development of a software package, called ARCHER (Accelerated Radiation-transport Computations in Heterogeneous EnviRonments), which is designed as a versatile testbed for future Monte Carlo codes. Preliminary results from five projects in nuclear engineering and medical physics are presented.

  17. New EPA 'PLUS' software: A useful tool for local emergency planners

    Energy Technology Data Exchange (ETDEWEB)

    Anastas, P.T.; Tobin, P.S.

    1993-09-01

    EPA's Office of Pollution Prevention and Toxics has produced a new bibliographic database designed for local emergency planners. The "PLUS" (Planner's Library in User-friendly Software) system is a resource database containing abstracts of hundreds of references useful in planning for hazardous substances emergencies. The software's structure allows planners to construct customized search strategies for generating lists of emergency planning-related references on subjects ranging from chemical profiles to personal protective gear.

  18. Software Tools for Software Maintenance

    Science.gov (United States)

    1988-10-01

    [Fragmentary scanned table: a catalog of software maintenance tools for IBM mainframes under DOS and OS, listing COBOL and Assembler debugging and tracing utilities including Quikjob, Cobol Debug, QUODS (Quick Online Debugging System), Superbug and Tracer.]

  19. APASVO: A free software tool for automatic P-phase picking and event detection in seismic traces

    Science.gov (United States)

    Romero, José Emilio; Titos, Manuel; Bueno, Ángel; Álvarez, Isaac; García, Luz; Torre, Ángel de la; Benítez, M.ª Carmen

    2016-05-01

    The accurate estimation of the arrival time of seismic waves, or picking, is a problem of major interest in seismic research given its relevance in many seismological applications, such as earthquake source location and active seismic tomography. In the last decades, several automatic picking methods have been proposed with the ultimate goal of implementing picking algorithms whose results are comparable to those obtained by manual picking. In order to facilitate the use of these automated methods in the analysis of seismic traces, this paper presents a new free, open-source graphical software tool, named APASVO, which allows picking tasks to be performed in an easy and user-friendly way. The tool also provides event detection functionality, where a relatively imprecise estimation of the onset time is sufficient. The application implements the STA-LTA detection algorithm and the AMPA picking algorithm. An autoregressive AIC-based picking method can also be applied. In addition, the graphical tool is complemented with two command line tools: an event picking tool and a synthetic earthquake generator. APASVO is a multiplatform tool that works on Windows, Linux and OS X. The application can process data in a large variety of file formats. It is implemented in Python and relies on well-known scientific computing packages such as ObsPy, NumPy, SciPy and Matplotlib.
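
    A minimal NumPy sketch of the STA/LTA idea APASVO implements: the ratio of a short-term to a long-term average of signal energy rises sharply at an event onset. Window lengths and the trigger threshold are illustrative (ObsPy ships a production implementation):

        import numpy as np

        def sta_lta(trace, nsta, nlta):
            energy = trace.astype(float) ** 2
            csum = np.cumsum(energy)
            sta = (csum[nsta:] - csum[:-nsta]) / nsta      # short-term average
            lta = (csum[nlta:] - csum[:-nlta]) / nlta      # long-term average
            n = min(sta.size, lta.size)
            return sta[-n:] / (lta[-n:] + 1e-12)           # characteristic function

        rng = np.random.default_rng(3)
        trace = rng.normal(size=5000)
        trace[3000:3400] += 5.0 * rng.normal(size=400)     # synthetic "event"

        cft = sta_lta(trace, nsta=50, nlta=1000)
        onset = np.argmax(cft > 4.0)                       # first triggered sample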

  20. In-depth evaluation of software tools for data-independent acquisition based label-free quantification.

    Science.gov (United States)

    Kuharev, Jörg; Navarro, Pedro; Distler, Ute; Jahn, Olaf; Tenzer, Stefan

    2015-09-01

    Label-free quantification (LFQ) based on data-independent acquisition workflows is currently gaining popularity. Several software tools have recently been published or are commercially available. The present study focuses on the evaluation of three different software packages (Progenesis, synapter, and ISOQuant) supporting ion-mobility-enhanced data-independent acquisition data. In order to benchmark the LFQ performance of the different tools, we generated two hybrid proteome samples of defined quantitative composition containing tryptically digested proteomes of three different species (mouse, yeast, Escherichia coli). This model dataset simulates complex biological samples containing large numbers of both unregulated (background) proteins and up- and downregulated proteins with exactly known ratios between samples. We determined the number and dynamic range of quantifiable proteins and analyzed the influence of the applied algorithms (retention time alignment, clustering, normalization, etc.) on the quantification results. Analysis of technical reproducibility revealed median coefficients of variation of reported protein abundances below 5% for MS(E) data for Progenesis and ISOQuant. Regarding accuracy of LFQ, evaluation with synapter and ISOQuant yielded superior results compared to Progenesis. In addition, we discuss reporting formats and user friendliness of the software packages. The data generated in this study have been deposited to the ProteomeXchange Consortium with identifier PXD001240 (http://proteomecentral.proteomexchange.org/dataset/PXD001240).

  1. Software tools for data modelling and processing of human body temperature circadian dynamics.

    Science.gov (United States)

    Petrova, Elena S; Afanasova, Anastasia I

    2015-01-01

    This paper presents software developed for simulating and processing thermometry data. The motivation for this research is the miniaturization of body-attached devices that allow frequent temperature measurements and improve medical diagnosis procedures related to circadian dynamics.

  2. Cyber-physical systems software development: way of working and tool suite

    NARCIS (Netherlands)

    Bezemer, Maarten Matthijs

    2013-01-01

    Designing embedded control software for modern cyber-physical systems becomes more and more difficult, because of the increasing amount and complexity of their requirements. The regular requirements are extended with modern requirements, for example, to get a general purpose cyber-physical system ca

  3. A Systematic Mapping Study of Tools for Distributed Software Development Teams

    DEFF Research Database (Denmark)

    Tell, Paolo; Ali Babar, Muhammad

    Context: A wide variety of technologies have been developed to support Global Software Development (GSD). However, the information about the dozens of available solutions is quite diverse and scattered making it quite difficult to have an overview able to identify common trends and unveil researc...

  4. Application of the PSP Model, Manually and Supported by a CASE Tool, in a Case Study of a Brazilian Software Factory

    Directory of Open Access Journals (Sweden)

    Denis Ávila Montini

    2006-06-01

    Full Text Available In a context of continuous quality improvement in software development projects, the PSP experimental process was applied to discipline some of the processes suggested by CMMI level 2, following two different strategies. The first consists of observing the behavior of a software factory while the data needed to support the PSP model are collected manually; in the second, collection was assisted by a CASE tool. The results show the impacts on performance and on quality standards of the two strategies, with their respective advantages and vulnerabilities. In both cases, deadlines were met once the specification and the course of the activities were controlled by the two suggested PSP strategies. Key words: CMMI, PSP, CASE, Factory, Process improvement.

  5. The Application of Intentional Subjective Properties and Mediated Communication Tools to Software Agents in Online Disputes Resolution Environments

    Directory of Open Access Journals (Sweden)

    Renzo Gobbin

    2004-11-01

    Full Text Available This paper examines the use of subjective properties in modeling an architecture for cooperative agents using an Agent Communication Language (ACL), which serves as a mediating tool for cooperative communication activities between and within software agents. The role that subjective and objective properties have in explaining and modeling agent internalization and externalization of ACL messages is investigated and related to Vygotsky's developmental learning theories, such as Mediated Activity Theory. A novel agent architecture, ALMA (Agent Language Mediated Activity), based on the integration of agents' subjective and objective properties within an agent communication activity framework, is presented. The relevance of software agents' subjective properties in modeling applications such as e-Law Online Dispute Resolution for e-business contractual arrangements, using natural-language subject/object relations in their communication patterns, is discussed.

  6. A System Level Tool for Translating Software to Reconfigurable Hardware Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In this research we will develop a system level tool to translate binary code of a general-purpose processor into Register Transfer Level VHDL code to be mapped onto...

  7. Establishing a Web-Based DICOM Teaching File Authoring Tool Using Open-Source Public Software

    OpenAIRE

    Lee, Wen-Jeng; Yang, Chung-Yi; Liu, Kao-Lang; Liu, Hon-Man; Ching, Yu-Tai; Chen, Shyh-Jye

    2005-01-01

    Online teaching files are an important source of educational and referential materials in the radiology community. The commonly used Digital Imaging and Communications in Medicine (DICOM) file format of the radiology community is not natively supported by common Web browsers. The ability of the Web server to convert and parse DICOM is important when the DICOM-converting tools are not available. In this paper, we describe our approach to develop a Web-based teaching file authoring tool. Our se...

  8. The Image-Guided Surgery ToolKit IGSTK: an open source C++ software toolkit

    Science.gov (United States)

    Cheng, Peng; Ibanez, Luis; Gobbi, David; Gary, Kevin; Aylward, Stephen; Jomier, Julien; Enquobahrie, Andinet; Zhang, Hui; Kim, Hee-su; Blake, M. Brian; Cleary, Kevin

    2007-03-01

    The Image-Guided Surgery Toolkit (IGSTK) is an open source C++ software library that provides the basic components needed to develop image-guided surgery applications. The focus of the toolkit is on robustness using a state machine architecture. This paper presents an overview of the project based on a recent book which can be downloaded from igstk.org. The paper includes an introduction to open source projects, a discussion of our software development process and the best practices that were developed, and an overview of requirements. The paper also presents the architecture framework and main components. This presentation is followed by a discussion of the state machine model that was incorporated and the associated rationale. The paper concludes with an example application.

  9. STEM_CELL: a software tool for electron microscopy: part 2--analysis of crystalline materials.

    Science.gov (United States)

    Grillo, Vincenzo; Rossi, Francesca

    2013-02-01

    A new graphical software package (STEM_CELL) for the analysis of HRTEM and STEM-HAADF images is introduced here in detail. The advantage of the software, beyond its graphical interface, is that it brings together different analysis algorithms and simulation (described in an associated article) to produce novel analysis methodologies. Different implementations of, and improvements to, state-of-the-art approaches are reported for image analysis, filtering, normalization and background subtraction. In particular, two important methodological results are highlighted: (i) the definition of a procedure for atomic-scale quantitative analysis of HAADF images; (ii) the extension of geometric phase analysis to large regions, up to potentially 1 μm, through the use of undersampled images with aliasing effects.

  10. Mid-water Software Tools and the Application to Processing and Analysis of the Latest Generation Multibeam Sonars

    Science.gov (United States)

    Gee, L.; Doucet, M.

    2010-12-01

    The latest generation of multibeam sonars now has the ability to map the water column along with the seafloor. Currently, the users of these sonars have a limited view of the mid-water data in real time, and if they do store the data, they are restricted to replaying it only, with no ability for further analysis. The water-column data has the potential to address a number of research areas, including detection of small targets (wrecks, etc.) above the seabed, mapping of fish and marine mammals, and a wide range of physical oceanographic processes. However, researchers have been required to develop their own in-house software tools before they can even begin their study of the water-column data. This paper describes the development of more general software tools for the full processing of raw sonar data (bathymetry, backscatter and water column) to yield output products suitable for visualization in a 4D time-synchronized environment. The huge water-column data volumes generated by the new sonars, combined with the variety of data formats from the different sonar manufacturers, provide a significant challenge in the design and development of tools that can be applied to a wide variety of applications. The development of the mid-water tools in this project addressed this problem by using a unified way of storing the water-column data in a generic water column format (GWC). The sonar data are converted into the GWC by re-integrating the water-column packets with time-based navigation and attitude, such that downstream in the workflow the tools have access to all relevant data for any particular ping. Depending on the application and the resolution requirements, the conversion process also allows simple sub-sampling. Additionally, each file is indexed to enable fast non-linear lookup and extraction of any packet type or packet-type collection in the sonar file. These tools also fully exploit multi-core and hyper-threading technologies to maximize the throughput
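
    A sketch of the per-file packet index described above: record the byte offset of every packet by type during conversion, so any packet collection can later be extracted without a linear scan. The binary layout assumed here (a 4-byte type tag plus a 4-byte length prefix) is illustrative, not the actual GWC format:

        import struct
        from collections import defaultdict

        def build_index(path):
            index = defaultdict(list)            # packet type -> [byte offsets]
            with open(path, "rb") as f:
                while True:
                    offset = f.tell()
                    header = f.read(8)
                    if len(header) < 8:
                        break
                    ptype, length = struct.unpack("<II", header)
                    index[ptype].append(offset)
                    f.seek(length, 1)            # skip the packet body
            return index

        def read_packets(path, index, ptype):
            with open(path, "rb") as f:
                for offset in index[ptype]:      # non-linear lookup via the index
                    f.seek(offset)
                    _, length = struct.unpack("<II", f.read(8))
                    yield f.read(length)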

  11. Techniques and Tools for Trustworthy Composition of Pre-Designed Embedded Software Components

    Science.gov (United States)

    2012-07-01

    that was developed as a part of the Arduino open-source electronics prototyping platform. The Ardupilot system consists of the hardware which is placed...hand of the user is used to communicate the roll, pitch and the throttle information to the UAV. The firmware for the system is written in the Arduino ...Measurement Unit (IMU) sensors it can be used to develop an Unmanned Aerial Vehicle. Software for the Ardupilot can be programmed using the Arduino

  12. Providing a Connection between a Bayesian Inverse Modeling Tool and a Coupled Hydrogeological Processes Modeling Software

    Science.gov (United States)

    Frystacky, H.; Osorio-Murillo, C. A.; Over, M. W.; Kalbacher, T.; Gunnell, D.; Kolditz, O.; Ames, D.; Rubin, Y.

    2013-12-01

    The Method of Anchored Distributions (MAD) is a Bayesian technique for characterizing the uncertainty in geostatistical model parameters. Open-source software has been developed in a modular framework such that this technique can be applied to any forward model software via a driver. This presentation is about the driver that has been developed for OpenGeoSys (OGS), open-source software that can simulate many hydrogeological processes, including coupled processes. MAD allows the use of multiple data types for conditioning the spatially random fields and assessing model parameter likelihood. For example, if simulating flow and mass transport, the inversion target variable could be hydraulic conductivity and the inversion data types could be head, concentration, or both. The driver detects from the OGS files which processes and variables are being used in a given project and allows MAD to prompt the user to choose those that are to be modeled or treated deterministically. In this way, any combination of processes allowed by OGS can have MAD applied. As for the software, there are two versions, each with its own OGS driver. A Windows desktop version is available as a graphical user interface and is ideal for the learning and teaching environment. High-throughput computing can even be achieved with this version via HTCondor when large projects are pursued in a computer lab. In addition to this desktop application, a Linux version is available, equipped with MPI such that it can be run in parallel on a computer cluster. All releases can be downloaded from the MAD Codeplex site given below.

  13. Instrument-independent software tools for the analysis of MS-MS and LC-MS lipidomics data.

    Science.gov (United States)

    Haimi, Perttu; Chaithanya, Krishna; Kainu, Ville; Hermansson, Martin; Somerharju, Pentti

    2009-01-01

    Mass spectrometry (MS), particularly electrospray-MS, is the key tool in modern lipidomics. However, as even a modest-scale experiment produces a great amount of data, data processing often becomes limiting. Notably, the software provided with MS instruments is not well suited for quantitative analysis of lipidomes because of the great variety of species present and complexities in response calibration. Here we describe the use of two recently introduced software tools: lipid mass spectrum analysis (LIMSA) and spectrum extraction from chromatographic data (SECD), which significantly increase the speed and reliability of mass spectrometric analysis of complex lipidomes. LIMSA is a Microsoft Excel add-on that (1) finds and integrates the peaks in an imported spectrum, (2) identifies the peaks, (3) corrects the peak areas for overlap by isotopic peaks of other species and (4) quantifies the identified species using included internal standards. LIMSA is instrument-independent because it processes text-format MS spectra. Typically, the analysis of one spectrum takes only a few seconds. The SECD software allows one to display MS chromatograms as two-dimensional maps, which is useful for visual inspection of the data. More importantly, however, SECD allows one to extract mass spectra from user-defined regions of the map for further analysis with, e.g., LIMSA. The use of selected regions rather than simple time-range averaging significantly improves the signal-to-noise ratio, as signals outside the region of interest are more efficiently excluded. LIMSA and SECD have proven to be robust and convenient tools and are available free of charge from the authors.
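
    A toy sketch of step (4), single-point quantification of identified species against a spiked internal standard of known amount; the peak areas and amounts are made-up numbers:

        # Internal standard of known spiked amount, plus integrated peak areas.
        internal_standard = {"name": "PC 28:0 (IS)", "area": 1.8e6, "pmol": 100.0}
        species_areas = {"PC 34:1": 5.4e6, "PC 36:2": 3.1e6, "PC 38:4": 0.9e6}

        response = internal_standard["pmol"] / internal_standard["area"]  # pmol/area
        amounts = {name: area * response for name, area in species_areas.items()}
        print(amounts)   # estimated pmol per species, before isotope-overlap correction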

  14. Genoviz Software Development Kit: Java tool kit for building genomics visualization applications

    Directory of Open Access Journals (Sweden)

    Chervitz Stephen A

    2009-08-01

    Full Text Available Abstract. Background: Visualization software can expose previously undiscovered patterns in genomic data and advance biological science. Results: The Genoviz Software Development Kit (SDK) is an open-source, Java-based framework designed for rapid assembly of visualization software applications for genomics. The Genoviz SDK framework provides a mechanism for incorporating adaptive, dynamic zooming into applications, a desirable feature of genome viewers. Visualization capabilities of the Genoviz SDK include automated layout of features along genetic or genomic axes; support for user interactions with graphical elements (Glyphs) in a map; a variety of Glyph subclasses that promote experimentation with new ways of representing data in graphical formats; and support for adaptive, semantic zooming, whereby objects change their appearance depending on zoom level and zooming rate adapts to the current scale. Freely available demonstration and production-quality applications, including the Integrated Genome Browser, illustrate Genoviz SDK capabilities. Conclusion: Separation between graphics components and genomic data models makes it easy for developers to add visualization capability to pre-existing applications or build new applications using third-party data models. Source code, documentation, sample applications, and tutorials are available at http://genoviz.sourceforge.net/.

  15. The evaluation of Computed Tomography hard- and software tools for micropaleontologic studies on foraminifera

    Science.gov (United States)

    van Loo, D.; Speijer, R.; Masschaele, B.; Dierick, M.; Cnudde, V.; Boone, M.; de Witte, Y.; Dewanckele, J.; van Hoorebeke, L.; Jacobs, P.

    2009-04-01

    Foraminifera (forams) are single-celled amoeba-like organisms in the sea which build a tiny calcareous multi-chambered shell for protection. Their enormous abundance, great variation of shape through time and presence in all marine deposits have made these tiny microfossils the oil companies' best friend, facilitating the detection of new oil wells. Besides their success in the oil and gas industry, forams are also a most powerful tool for reconstructing past climate change. The shell of a foraminifer is a tiny gold mine of information, both geometrical and chemical. Until recently, however, the best information on this architecture was obtained by imaging the outside of a shell with Scanning Electron Microscopy (SEM), giving no clues about internal structures other than single snapshots obtained by breaking a specimen apart. With X-ray computed tomography (CT) it is possible to overcome this problem and uncover a huge amount of geometrical information without destroying the samples. Using the latest generation of micro-CTs, called nano-CTs because of their sub-micron resolution, it is now possible to perform adequate imaging even on these tiny samples without needing huge facilities. In this research, a comparison is made between different X-ray sources and X-ray detectors and the resulting image resolution. Sharpness, noise and contrast are all important parameters that will have significant effects on the accuracy of the results and on the speed of data processing. Combining this tomography technique with specific image-processing software for segmentation, it is possible to obtain a 3D virtual representation of the entire foram shell. This 3D virtual object can then be used for many purposes, of which automatic measurement of chamber size is one of the most important. The segmentation process is a combination of several algorithms that are often used in CT evaluation; in this work an evaluation of those algorithms is

  16. Plots, Calculations and Graphics Tools (PCG2). Software Transfer Request Presentation

    Science.gov (United States)

    Richardson, Marilou R.

    2010-01-01

    This slide presentation reviews the development of the Plots, Calculations and Graphics Tools (PCG2) system. PCG2 is an easy-to-use tool that provides a single user interface for viewing data in a pictorial, tabular or graphical format. It allows the user to view the same display and data in the Control Room, engineering office area, or at remote sites. PCG2 supports extensive and regular engineering needs, both planned and unplanned, and it supports the ability to compare, contrast and perform ad hoc data mining over the entire domain of a program's test data.

  17. A Tool for Optimizing the Build Performance of Large Software Code Bases

    NARCIS (Netherlands)

    Telea, Alexandru; Voinea, Lucian; Kontogiannis, K; Tjortjis, C; Winter, A

    2008-01-01

    We present Build Analyzer, a tool that helps developers optimize the build performance of huge systems written in C. Due to complex C header dependencies, even small code changes can cause extremely long rebuilds, which are problematic when code is shared and modified by teams of hundreds of individuals.
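
    A hedged sketch of the kind of analysis such a tool performs: given a header-inclusion graph, compute how many translation units a change to each header forces to rebuild. The graph and file names below are invented.

```python
# Rebuild-impact analysis over a toy header-inclusion graph.
from collections import defaultdict

includes = {                       # file -> headers it includes (directly)
    "a.c": {"util.h"},
    "b.c": {"util.h", "io.h"},
    "c.c": {"io.h"},
    "io.h": {"util.h"},
}

# Invert to "header -> direct includers", then walk transitively.
included_by = defaultdict(set)
for f, hdrs in includes.items():
    for h in hdrs:
        included_by[h].add(f)

def rebuild_cost(header):
    seen, stack = set(), [header]
    while stack:
        for f in included_by[stack.pop()]:
            if f not in seen:
                seen.add(f)
                stack.append(f)
    return sorted(f for f in seen if f.endswith(".c"))  # affected .c files

for h in ("util.h", "io.h"):
    print(h, "->", rebuild_cost(h))   # util.h forces a.c, b.c and c.c to rebuild
```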

  18. The Viability of a Software Tool to Assist Students in the Review of Literature

    Science.gov (United States)

    Anderson, Timothy R.

    2013-01-01

    Most doctoral students are novice researchers and may not possess the skills to effectively conduct a comprehensive review of the literature and frame a problem suitable for original research. Students need proper training and the tools necessary to critically evaluate, synthesize and organize literature. The purpose of this concurrent mixed…

  19. Software Tools for the Analysis of the Photocathode Response of Photomultiplier Vacuum Tubes

    CERN Document Server

    Fabbri, R

    2013-01-01

    The Central Institute of Electronics (ZEA-2) at Forschungszentrum Juelich (FZJ) has developed a system to scan the response of the photocathode of photomultiplier tubes (PMT). The PMT is held firmly on a supporting structure, while a blue light-emitting diode is moved along its surface by two stepper motors, spanning both the x and y coordinates. The whole system is housed in a light-tight wooden box. Graphical software was developed in-house to perform the scan operations under different configurations (e.g., the step size of the scan and the number of measurements per point). During each point measurement the current output generated in the vacuum photomultiplier is processed in sequence by a pre-amplifier (mainly to convert the current signal into a voltage signal), an amplifier, and an ADC module (typically a CAEN N957). The measurement data are saved to files at the end of the scan. Recently, software based on CERN ROOT and on the Qt libraries was developed to help the user analyze the scan data.
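
    The scan loop described above might look like the following sketch; the motor-stage and ADC interfaces are hypothetical placeholders, not the ZEA-2 code.

```python
# Raster scan over the photocathode with stand-in hardware interfaces.
import time

class MotorStage:                       # stand-in for a stepper-motor driver
    def move_to(self, x_mm, y_mm):
        time.sleep(0.01)                # pretend the stage is moving

def read_adc(n_samples=100):            # stand-in for the CAEN N957 readout
    return [0.5] * n_samples            # volts, after pre-amp + amplifier

stage, results = MotorStage(), []
step = 5.0                              # configurable scan step size in mm
for ix in range(10):
    for iy in range(10):
        stage.move_to(ix * step, iy * step)
        samples = read_adc()            # several measurements per point
        results.append((ix * step, iy * step, sum(samples) / len(samples)))

with open("scan.csv", "w") as f:        # save measurements at end of scan
    f.writelines(f"{x},{y},{v}\n" for x, y, v in results)
```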

  20. Numerical arc segmentation algorithm for a radio conference: A software tool for communication satellite systems planning

    Science.gov (United States)

    Whyte, W. A.; Heyward, A. O.; Ponchak, D. S.; Spence, R. L.; Zuzek, J. E.

    1988-01-01

    The Numerical Arc Segmentation Algorithm for a Radio Conference (NASARC) provides a method of generating predetermined arc segments for use in the development of an allotment planning procedure to be carried out at the 1988 World Administrative Radio Conference (WARC) on the Use of the Geostationary Satellite Orbit and the Planning of Space Services Utilizing It. Through careful selection of the predetermined arc (PDA) for each administration, flexibility can be increased in terms of choice of system technical characteristics and specific orbit location while reducing the need for coordination among administrations. The NASARC software determines pairwise compatibility between all possible service areas at discrete arc locations. NASARC then exhaustively enumerates groups of administrations whose satellites can be closely located in orbit, and finds the arc segment over which each such compatible group exists. From the set of all possible compatible groupings, groups and their associated arc segments are selected using a heuristic procedure such that a PDA is identified for each administration. Various aspects of the NASARC concept and how the software accomplishes specific features of allotment planning are discussed.
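
    A much-simplified sketch of the grouping step: from a pairwise compatibility matrix, enumerate the groups of administrations whose satellites can be co-located. The toy data stand in for NASARC's full compatibility analysis and heuristic selection.

```python
# Enumerate pairwise-compatible groups from a toy compatibility matrix.
import itertools

admins = ["A", "B", "C", "D"]
compatible = {frozenset(p) for p in [("A", "B"), ("A", "C"), ("B", "C")]}

def group_ok(group):
    # A group is feasible only if every pair within it is compatible.
    return all(frozenset(p) in compatible
               for p in itertools.combinations(group, 2))

groups = [g for r in range(len(admins), 1, -1)
          for g in itertools.combinations(admins, r) if group_ok(g)]
print(groups)   # [('A', 'B', 'C'), ('A', 'B'), ('A', 'C'), ('B', 'C')]
```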

  1. Internet-Based Software Tools for Analysis and Processing of LIDAR Point Cloud Data via the OpenTopography Portal

    Science.gov (United States)

    Nandigam, V.; Crosby, C. J.; Baru, C.; Arrowsmith, R.

    2009-12-01

    LIDAR is an excellent example of the new generation of powerful remote sensing data now available to Earth science researchers. Capable of producing digital elevation models (DEMs) with more than an order of magnitude higher resolution than those previously available, LIDAR data allow Earth scientists to study the processes that contribute to landscape evolution at resolutions not previously possible, yet essential for their appropriate representation. Along with these high-resolution datasets comes an increase in the volume and complexity of data that the user must efficiently manage and process in order for it to be scientifically useful. Although expensive commercial LIDAR software applications are available, processing and analysis of these datasets are typically computationally inefficient on the conventional hardware and software currently available to most of the Earth science community. We have designed and implemented an Internet-based system, the OpenTopography Portal, that provides integrated access to high-resolution LIDAR data as well as web-based tools for processing of these datasets. By using remote data storage and high-performance compute resources, the OpenTopography Portal attempts to simplify data access and standard LIDAR processing tasks for the Earth science community. The OpenTopography Portal allows users to access massive amounts of raw point cloud LIDAR data as well as a suite of DEM generation tools that enable users to generate custom digital elevation models that best fit their science applications. The cyberinfrastructure software tools for processing the data are freely available via the portal and conveniently integrated with the data selection in a single user-friendly interface. The ability to run these tools on powerful cyberinfrastructure resources instead of in their own labs provides users a huge advantage in terms of performance and compute power. The system also encourages users to explore data processing methods.
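
    One standard DEM-generation step such a portal offers can be sketched compactly: binning a point cloud onto a regular grid and averaging the elevations per cell (one of several local-gridding methods). The points below are synthetic.

```python
# Grid a synthetic LIDAR point cloud into a simple mean-elevation DEM.
import numpy as np

rng = np.random.default_rng(1)
x, y = rng.uniform(0, 100, 10_000), rng.uniform(0, 100, 10_000)
z = 50 + 0.1 * x + rng.normal(0, 0.2, x.size)    # synthetic ground returns

cell = 5.0                                       # DEM resolution in metres
nx = ny = int(100 / cell)
ix = (x / cell).astype(int).clip(0, nx - 1)      # cell index per point
iy = (y / cell).astype(int).clip(0, ny - 1)

sums, counts = np.zeros((ny, nx)), np.zeros((ny, nx))
np.add.at(sums, (iy, ix), z)                     # accumulate elevations
np.add.at(counts, (iy, ix), 1)
dem = np.where(counts > 0, sums / np.maximum(counts, 1), np.nan)
print(dem.shape, np.nanmin(dem), np.nanmax(dem))
```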

  2. The GlycanBuilder: a fast, intuitive and flexible software tool for building and displaying glycan structures

    Directory of Open Access Journals (Sweden)

    Dell Anne

    2007-08-01

    Abstract Background Carbohydrates play a critical role in human diseases and their potential utility as biomarkers for pathological conditions is a major driver for characterization of the glycome. However, the additional complexity of glycans compared to proteins and nucleic acids has slowed the advancement of glycomics in comparison to genomics and proteomics. The branched nature of carbohydrates, the great diversity of their constituents and the numerous alternative symbolic notations make the input and display of glycans far less straightforward than, for example, the amino-acid sequence of a protein. Every glycoinformatic tool providing a user interface would benefit from a fast, intuitive, appealing mechanism for input and output of glycan structures in a computer readable format. Results A software tool for building and displaying glycan structures using a chosen symbolic notation is described here. The "GlycanBuilder" uses an automatic rendering algorithm to draw the saccharide symbols and to place them on the drawing board. The information about the symbolic notation is derived from a configurable graphical model as a set of rules governing the aspect and placement of residues and linkages. The algorithm is able to represent a structure using only a few traversals of the tree and is inherently fast. The tool uses an XML format for import and export of encoded structures. Conclusion The rendering algorithm described here is able to produce high-quality representations of glycan structures in a chosen symbolic notation. The automated rendering process enables the "GlycanBuilder" to be used both as a user-independent component for displaying glycans and as an easy-to-use drawing tool. The "GlycanBuilder" can be integrated in web pages as a Java applet for the visual editing of glycans. The same component is available as a web service to render an encoded structure into a graphical format. Finally, the "GlycanBuilder" can be integrated into other applications.
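
    The layout idea, a traversal that assigns each residue a drawing position, can be sketched as follows; the tree encoding and coordinate scheme are illustrative, not the GlycanBuilder format.

```python
# Depth-first layout of a branched residue tree: leaves take successive
# rows, parents are centred over their children.
from itertools import count

tree = {"GlcNAc": {"Man": {"Man'": {}, "Man''": {}}}}   # tiny toy glycan
rows = count()
positions = {}

def layout(node, depth=0):
    for residue, children in node.items():
        if not children:
            row = next(rows)                    # leaf: next free row
        else:
            layout(children, depth + 1)         # place the subtree first
            rs = [positions[c][1] for c in children]
            row = sum(rs) / len(rs)             # centre parent over children
        positions[residue] = (depth, row)

layout(tree)
print(positions)   # residue -> (column along axis, vertical row)
```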

  3. The GLEaMviz computational tool, a publicly available software to explore realistic epidemic spreading scenarios at the global scale

    Directory of Open Access Journals (Sweden)

    Quaggiotto Marco

    2011-02-01

    Abstract Background Computational models play an increasingly important role in the assessment and control of public health crises, as demonstrated during the 2009 H1N1 influenza pandemic. Much research has been done in recent years in the development of sophisticated data-driven models for realistic computer-based simulations of infectious disease spreading. However, only a few computational tools are presently available for assessing scenarios, predicting epidemic evolutions, and managing health emergencies that can benefit a broad audience of users including policy makers and health institutions. Results We present "GLEaMviz", a publicly available software system that simulates the spread of emerging human-to-human infectious diseases across the world. The GLEaMviz tool comprises three components: the client application, the proxy middleware, and the simulation engine. The latter two components constitute the GLEaMviz server. The simulation engine leverages the Global Epidemic and Mobility (GLEaM) framework, a stochastic computational scheme that integrates worldwide high-resolution demographic and mobility data to simulate disease spread on the global scale. The GLEaMviz design aims at maximizing flexibility in defining the disease compartmental model and configuring the simulation scenario; it allows the user to set a variety of parameters including compartment-specific features, transition values, and environmental effects. The output is a dynamic map and a corresponding set of charts that quantitatively describe the geo-temporal evolution of the disease. The software is designed as a client-server system. The multi-platform client, which can be installed on the user's local machine, is used to set up simulations that will be executed on the server, thus avoiding specific requirements for large computational capabilities on the user side. Conclusions The user-friendly graphical interface of the GLEaMviz tool, along with its high level of flexibility in scenario configuration, makes realistic epidemic simulations accessible to a broad audience of users, including policy makers and health institutions.
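
    A minimal stochastic compartmental model in the spirit of the simulation engine can be sketched as below; this is a single-population chain-binomial SIR, whereas GLEaM couples thousands of subpopulations with demographic and mobility data.

```python
# Single-population stochastic SIR with chain-binomial transitions.
import numpy as np

rng = np.random.default_rng(42)
N, I, R = 1_000_000, 10, 0
beta, mu = 0.6, 0.2                      # transmission and recovery rates/day

history = []
for day in range(120):
    S = N - I - R
    p_inf = 1 - np.exp(-beta * I / N)    # per-susceptible infection prob.
    new_inf = rng.binomial(S, p_inf)     # stochastic daily transitions
    new_rec = rng.binomial(I, 1 - np.exp(-mu))
    I += new_inf - new_rec
    R += new_rec
    history.append((day, S, I, R))

peak = max(history, key=lambda t: t[2])
print(f"epidemic peak: day {peak[0]}, infectious {peak[2]}")
```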

  4. The pyPHaz software, an interactive tool to analyze and visualize results from probabilistic hazard assessments

    Science.gov (United States)

    Tonini, Roberto; Selva, Jacopo; Costa, Antonio; Sandri, Laura

    2014-05-01

    Probabilistic Hazard Assessment (PHA) is becoming an essential tool for risk mitigation policies, since it quantifies the hazard due to hazardous phenomena and, unlike the deterministic approach, accounts for both aleatory and epistemic uncertainties. On the other hand, one of the main disadvantages of PHA methods is that their results are not easy to understand and interpret for people who are not specialists in probabilistic tools. For scientists, this raises the issue of providing tools that can be easily used and understood by decision makers (i.e., risk managers or local authorities). The work presented here addresses the problem of simplifying the transfer between scientific knowledge and land protection policies by providing an interface between scientists, who produce PHA results, and decision makers, who use PHA results for risk analyses. In this framework we present pyPHaz, an open tool developed and designed to visualize and analyze PHA results due to one or more phenomena affecting a specific area of interest. The software has been fully developed with the free and open-source Python programming language and several featured Python-based libraries and modules. The pyPHaz tool can visualize the Hazard Curves (HC) calculated in a selected target area together with different levels of uncertainty (mean and percentiles) on maps that can be interactively created and modified by the user, thanks to a dedicated Graphical User Interface (GUI). Moreover, the tool can be used to compare the results of different PHA models and to merge them, creating ensemble models. The pyPHaz software has been designed to store and access all the data through a MySQL database and to read as input the XML-based standard file formats defined in the frame of GEM (Global Earthquake Model). This format model is easy to extend to any other kind of hazard, as will be shown in the applications.
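
    The ensemble operations described, mean and percentile hazard curves across models, reduce to simple array statistics; the sketch below uses invented exponential hazard curves in place of real PHA output.

```python
# Ensemble mean and percentile hazard curves from three toy models.
import numpy as np

intensities = np.linspace(0, 2, 21)            # intensity measure levels
models = np.array([np.exp(-intensities / s) for s in (0.3, 0.5, 0.8)])

mean_curve = models.mean(axis=0)               # epistemic mean
p16, p84 = np.percentile(models, [16, 84], axis=0)
for im, m, lo, hi in zip(intensities[:3], mean_curve, p16, p84):
    print(f"IM={im:.1f}  mean={m:.3f}  16-84%: [{lo:.3f}, {hi:.3f}]")
```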

  5. ATAQS: A computational software tool for high throughput transition optimization and validation for selected reaction monitoring mass spectrometry

    Directory of Open Access Journals (Sweden)

    Ramos Hector

    2011-03-01

    Targeted proteomics via SRM is a powerful new technique that enables the reproducible and accurate identification and quantification of sets of proteins of interest. ATAQS is the first open-source software that supports all steps of the targeted proteomics workflow. ATAQS also provides software API (Application Program Interface) documentation that enables the addition of new algorithms to each of the workflow steps. The software, installation guide and sample dataset can be found at http://tools.proteomecenter.org/ATAQS/ATAQS.html

  6. Data Assimilation Tools for CO2 Reservoir Model Development – A Review of Key Data Types, Analyses, and Selected Software

    Energy Technology Data Exchange (ETDEWEB)

    Rockhold, Mark L.; Sullivan, E. C.; Murray, Christopher J.; Last, George V.; Black, Gary D.

    2009-09-30

    Pacific Northwest National Laboratory (PNNL) has embarked on an initiative to develop world-class capabilities for performing experimental and computational analyses associated with geologic sequestration of carbon dioxide. The ultimate goal of this initiative is to provide science-based solutions for helping to mitigate the adverse effects of greenhouse gas emissions. This Laboratory-Directed Research and Development (LDRD) initiative currently has two primary focus areas: advanced experimental methods and computational analysis. The experimental methods focus area involves the development of new experimental capabilities, supported in part by the U.S. Department of Energy's (DOE) Environmental Molecular Science Laboratory (EMSL) housed at PNNL, for quantifying mineral reaction kinetics with CO2 under high temperature and pressure (supercritical) conditions. The computational analysis focus area involves numerical simulation of coupled, multi-scale processes associated with CO2 sequestration in geologic media, and the development of software to facilitate building and parameterizing conceptual and numerical models of subsurface reservoirs that represent geologic repositories for injected CO2. This report describes work in support of the computational analysis focus area. The computational analysis focus area currently consists of several collaborative research projects. These are all geared towards the development and application of conceptual and numerical models for geologic sequestration of CO2. The software being developed for this focus area is referred to as the Geologic Sequestration Software Suite or GS3. A wiki-based software framework is being developed to support GS3. This report summarizes work performed in FY09 on one of the LDRD projects in the computational analysis focus area. The title of this project is Data Assimilation Tools for CO2 Reservoir Model Development. Some key objectives of this project in FY09 were to assess the current state of the art in data assimilation for reservoir model development.

  7. A Dedicated Computational Platform for Cellular Monte Carlo T-CAD Software Tools

    Science.gov (United States)

    2015-07-14

    ...as shown in figure 8. The actual synthesized load impedance is determined in post-processing through a Fourier transform. The magnitude and phase... ...for simulation purposes is arduous or incomplete. The acquired equipment has been successfully deployed and is making possible the extraction of device parameters from simulation data performed at molecular resolution. For the first time, the parameters extracted with our tools will supply...

  8. On a Formal Tool for Reasoning About Flight Software Cost Analysis

    Science.gov (United States)

    Spagnuolo, John N., Jr.; Stukes, Sherry A.

    2013-01-01

    A report focuses on the development of flight software (FSW) cost estimates for 16 Discovery-class missions at JPL. The techniques and procedures developed enabled streamlining of the FSW analysis process, and provided instantaneous confirmation that the data and processes used for these estimates were consistent across all missions. The research provides direction as to how to build a prototype rule-based system for FSW cost estimation that would provide (1) FSW cost estimates, (2) explanation of how the estimates were arrived at, (3) mapping of costs, (4) mathematical trend charts with explanations of why the trends are what they are, (5) tables with ancillary FSW data of interest to analysts, (6) a facility for expert modification/enhancement of the rules, and (7) a basis for conceptually convenient expansion into more complex, useful, and general rule-based systems.
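
    A toy sketch of the envisioned rule-based estimator, covering items (1) and (2) above: each rule contributes a cost and an explanation. The rules and numbers are invented for illustration, not JPL's.

```python
# Minimal rule-based estimator: every matching rule adds a cost and a
# human-readable reason, so the estimate comes with its own explanation.

rules = [
    (lambda m: m["heritage"] < 0.5,
     lambda m: (40.0, "low heritage: largely new FSW development")),
    (lambda m: m["instruments"] > 2,
     lambda m: (10.0 * (m["instruments"] - 2), "extra instrument interfaces")),
    (lambda m: True,
     lambda m: (25.0, "baseline command & data handling FSW")),
]

def estimate(mission):
    total, reasons = 0.0, []
    for condition, action in rules:            # fire every matching rule
        if condition(mission):
            cost, why = action(mission)
            total += cost
            reasons.append(f"+{cost:5.1f}  {why}")
    return total, reasons

cost, trace = estimate({"heritage": 0.3, "instruments": 4})
print(f"estimated FSW cost: {cost:.1f} (arbitrary units)")
print("\n".join(trace))                        # the 'explanation' output
```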

  9. SIGSAC Software: A tool for the Management of Chronic Disease and Telecare.

    Science.gov (United States)

    Claudia, Bustamante; Claudia, Alcayaga; Ilta, Lange; Iñigo, Meza

    2012-01-01

    Chronic disease management is highly complex because multiple interventions are required to improve clinical outcomes. From the patient's perspective, the main problems are dealing with self-management without support and feeling isolated between clinical visits. A strategy for providing continuous self-management support is the use of communication technologies, such as the telephone. However, to be efficient and effective, an information system is required for telecare planning and follow-up. The use of electronic clinical records facilitates the implementation of telecare, but those systems often do not allow usual care (visits to the health clinic) to be combined with telecare. This paper presents the experience of developing an application called SIGSAC (Software de Información, Gestión y Seguimiento para el Autocuidado Crónico) for Chronic Disease Management and Telecare follow-up.

  10. Combining On-Line Characterization Tools with Modern Software Environments for Optimal Operation of Polymerization Processes

    Directory of Open Access Journals (Sweden)

    Navid Ghadipasha

    2016-02-01

    This paper discusses the initial steps towards the formulation and implementation of a generic and flexible model-centric framework for integrated simulation, estimation, optimization and feedback control of polymerization processes. For the first time it combines the powerful capabilities of the automatic continuous on-line monitoring of polymerization (ACOMP) system with a modern simulation, estimation and optimization software environment towards an integrated scheme for the optimal operation of polymeric processes. An initial validation of the framework was performed for modelling and optimization using literature data, illustrating the flexibility of the method to be applied to different systems and conditions. Subsequently, off-line capabilities of the system were fully tested experimentally for model validation, parameter estimation and process optimization using ACOMP data. Experimental results are provided for free radical solution polymerization of methyl methacrylate.
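
    The off-line parameter-estimation step can be sketched with a one-parameter kinetic model fitted to synthetic conversion data; the real framework fits full polymerization models to ACOMP measurements.

```python
# Fit a first-order rate constant to (synthetic) monomer-conversion data.
import numpy as np
from scipy.optimize import curve_fit

def conversion(t, k):
    return 1 - np.exp(-k * t)            # first-order monomer consumption

t = np.linspace(0, 120, 25)              # minutes
data = conversion(t, 0.03) + np.random.default_rng(3).normal(0, 0.01, t.size)

(k_fit,), cov = curve_fit(conversion, t, data, p0=[0.01])
print(f"estimated k = {k_fit:.4f} 1/min  (true 0.0300)")
```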

  11. Use of slide presentation software as a tool to measure hip arthroplasty wear.

    Science.gov (United States)

    Yun, Ho Hyun; Jajodia, Nirmal K; Myung, Jae Sung; Oh, Jong Keon; Park, Sang Won; Shon, Won Yong

    2009-12-01

    The authors propose a manual measurement method for wear in total hip arthroplasty (PowerPoint method) based on the well-known Microsoft PowerPoint software (Microsoft Corporation, Redmond, Wash). In addition, the accuracy and reproducibility of the devised method were quantified and compared with two methods previously described by Livermore and Dorr, and accuracies were determined at different degrees of wear. The 57 hips recruited were allocated to: class 1 (retrieval series), class 2 (clinical series), and class 3 (a repeat film analysis series). The PowerPoint method was found to have good reproducibility and to better detect wear differences between classes. The devised method can be easily used for recording wear at follow-up visits and could be used as a supplementary method when computerized methods cannot be employed.

  12. Using Teamcenter engineering software for a successive punching tool lifecycle management

    Science.gov (United States)

    Blaga, F.; Pele, A.-V.; Stǎnǎşel, I.; Buidoş, T.; Hule, V.

    2015-11-01

    The paper presents studies and researches results of the implementation of Teamcenter (TC) integrated management of a product lifecycle, in a virtual enterprise. The results are able to be implemented also in a real enterprise. The product was considered a successive punching and cutting tool, designed to materialize a metal sheet part. The paper defines the technical documentation flow (flow of information) in the process of constructive computer aided design of the tool. After the design phase is completed a list of parts is generated containing standard or manufactured components (BOM, Bill of Materials). The BOM may be exported to MS Excel (.xls) format and can be transferred to other departments of the company in order to supply the necessary materials and resources to achieve the final product. This paper describes the procedure to modify or change certain dimensions of sheet metal part obtained by punching. After 3D and 2D design, the digital prototype of punching tool moves to following lifecycle phase of the manufacturing process. For each operation of the technological process the corresponding phases are described in detail. Teamcenter enables to describe manufacturing company structure, underlying workstations that carry out various operations of manufacturing process. The paper revealed that the implementation of Teamcenter PDM in a company, improves efficiency of managing product information, eliminating time working with search, verification and correction of documentation, while ensuring the uniqueness and completeness of the product data.

  13. YANA – a software tool for analyzing flux modes, gene-expression and enzyme activities

    Directory of Open Access Journals (Sweden)

    Engels Bernd

    2005-06-01

    Abstract Background A number of algorithms for steady state analysis of metabolic networks have been developed over the years. Of these, Elementary Mode Analysis (EMA) has proven especially useful. Despite its low user-friendliness, METATOOL, a reliable high-performance implementation of the algorithm, has been the instrument of choice up to now. As reported here, the analysis of metabolic networks has been improved by an editor and analyzer of metabolic flux modes. Analysis routines for expression levels and the most central, well connected metabolites and their metabolic connections are of particular interest. Results YANA features a platform-independent, dedicated toolbox for metabolic networks with a graphical user interface to calculate (integrating METATOOL), edit (including support for the SBML format), visualize, centralize, and compare elementary flux modes. Further, YANA calculates expected flux distributions for a given Elementary Mode (EM) activity pattern and vice versa. Moreover, a dissection algorithm, a centralization algorithm, and an average diameter routine can be used to simplify and analyze complex networks. Proteomics or gene expression data give a rough indication of some individual enzyme activities, whereas the complete flux distribution in the network is often not known. As such data are noisy, YANA features a fast evolutionary algorithm (EA) for the prediction of EM activities with minimum error, including alerts for inconsistent experimental data. We offer the possibility to include further known constraints (e.g. growth constraints) in the EA calculation process. The redox metabolism around glutathione reductase serves as an illustration example. All software and documentation are available for download at http://yana.bioapps.biozentrum.uni-wuerzburg.de. Conclusion A graphical toolbox and an editor for METATOOL as well as a series of additional routines for metabolic network analyses constitute a new user-friendly platform for metabolic network analysis.
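
    The core linear-algebra relation, fluxes as a non-negative combination of elementary modes (v = E a), can be sketched as below; non-negative least squares stands in here for YANA's evolutionary search, and the network is a toy.

```python
# Recover elementary-mode activities from noisy flux measurements.
import numpy as np
from scipy.optimize import nnls

E = np.array([[1.0, 0.0],                # columns = elementary modes,
              [1.0, 1.0],                # rows = reactions
              [0.0, 1.0]])
a_true = np.array([2.0, 3.0])            # hidden EM activity pattern
v_meas = E @ a_true + np.array([0.05, -0.02, 0.01])   # noisy fluxes

a_est, residual = nnls(E, v_meas)        # stand-in for YANA's EA
print("estimated activities:", a_est, " residual:", round(residual, 3))
```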

  14. A Critical Study of Effect of Web-Based Software Tools in Finding and Sharing Digital Resources--A Literature Review

    Science.gov (United States)

    Baig, Muntajeeb Ali

    2010-01-01

    The purpose of this paper is to review the effect of web-based software tools for finding and sharing digital resources. A positive correlation between learning and studying through online tools has been found in recent research. In the traditional classroom, the search for resources is limited to the library and the sharing of resources is limited to the…

  15. Safety assessment driving radioactive waste management solutions (SADRWMS Methodology) implemented in a software tool (SAFRAN)

    Energy Technology Data Exchange (ETDEWEB)

    Kinker, M., E-mail: M.Kinker@iaea.org [International Atomic Energy Agency (IAEA), Vienna (Austria); Avila, R.; Hofman, D., E-mail: rodolfo@facilia.se [FACILIA AB, Stockholm (Sweden); Jova Sed, L., E-mail: jovaluis@gmail.com [Centro Nacional de Seguridad Nuclear (CNSN), La Habana (Cuba); Ledroit, F., E-mail: frederic.ledroit@irsn.fr [IRSN PSN-EXP/SSRD/BTE, (France)

    2013-07-01

    In 2004, the International Atomic Energy Agency (IAEA) organized the International Project on Safety Assessment Driving Radioactive Waste Management Solutions (SADRWMS) to examine international approaches to safety assessment for predisposal management of radioactive waste. The initial outcome of the SADRWMS Project was achieved through the development of flowcharts which could be used to improve the mechanisms for applying safety assessment methodologies to predisposal management of radioactive waste. These flowcharts have since been incorporated into DS284 (General Safety Guide on the Safety Case and Safety Assessment for Predisposal Management of Radioactive Waste), and were also considered during the early development stages of the Safety Assessment Framework (SAFRAN) Tool. In 2009 the IAEA presented DS284 to the IAEA Waste Safety Standards Committee, during which it was proposed that the graded approach to safety case and safety assessment be illustrated through the development of Safety Reports for representative predisposal radioactive waste management facilities and activities. To oversee the development of these reports, it was agreed to establish the International Project on Complementary Safety Reports: Development and Application to Waste Management Facilities (CRAFT). The goal of the CRAFT project is to develop complementary reports by 2014, which the IAEA could then publish as IAEA Safety Reports. The present work describes how the DS284 methodology and SAFRAN Tool can be applied in the development and review of the safety case and safety assessment to a range of predisposal waste management facilities or activities within the Region. (author)

  16. Regional Economic Accounting (REAcct). A software tool for rapidly approximating economic impacts

    Energy Technology Data Exchange (ETDEWEB)

    Ehlen, Mark Andrew; Vargas, Vanessa N.; Loose, Verne William; Starks, Shirley J.; Ellebracht, Lory A.

    2011-07-01

    This paper describes the Regional Economic Accounting (REAcct) analysis tool that has been in use for the last 5 years to rapidly estimate approximate economic impacts for disruptions due to natural or manmade events. It is based on and derived from the well-known and extensively documented input-output modeling technique initially presented by Leontief and more recently further developed by numerous contributors. REAcct provides county-level economic impact estimates in terms of gross domestic product (GDP) and employment for any area in the United States. The process for using REAcct incorporates geospatial computational tools and site-specific economic data, permitting the identification of geographic impact zones that allow differential magnitude and duration estimates to be specified for regions affected by a simulated or actual event. Using these data as input to REAcct, the number of employees for 39 directly affected economic sectors (including 37 industry production sectors and 2 government sectors) is calculated and aggregated to provide direct impact estimates. Indirect estimates are then calculated using Regional Input-Output Modeling System (RIMS II) multipliers. The interdependent relationships between critical infrastructures, industries, and markets are captured by the relationships embedded in the input-output modeling structure.
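
    The arithmetic at the heart of the method is compact: direct impacts per sector scaled by multipliers. The employment counts and multipliers below are invented for illustration, not actual RIMS II values.

```python
# Direct and multiplier-scaled total employment impacts for a toy event.
disrupted = {                 # sector: (directly affected jobs, multiplier)
    "manufacturing": (1_200, 2.1),
    "retail":        (  800, 1.5),
    "government":    (  300, 1.3),
}

direct = sum(jobs for jobs, _ in disrupted.values())
total = sum(jobs * m for jobs, m in disrupted.values())
print(f"direct job-loss estimate:   {direct:,}")
print(f"total (direct + indirect):  {total:,.0f}")
```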

  17. Determination of Flux linkage Characteristics and Inductance of a Submersible Switched Reluctance Motor using Software Tools

    Directory of Open Access Journals (Sweden)

    Sundaram Maruthachalam

    2011-01-01

    Problem statement: The Switched Reluctance Motor (SRM) is an old member of the electric machines family. Its simple structure, ruggedness and inexpensive manufacturing capability make it attractive for industrial applications. The applications of switched reluctance motors in various industrial fields are now being explored by many engineers. However, switched reluctance motors have not so far been used in submersible underwater motors for agricultural purposes. The torque developed by an SRM depends on the change of flux linkage and rotor position. The flux linkage characteristic of the motor is required to design the control circuit. Since the SRM is non-linear in nature, estimation and calculation of the flux linkage characteristics is very difficult. Approach: With the flux tube method concept, a simple algorithm was developed in MATLAB. ANSYS software was used to determine the flux distribution at various rotor positions. Results: The aligned and unaligned flux linkage values from the theoretical calculation at a current of 7 A are 72.7 mWb and 13.79 mWb respectively. With FEA simulation the obtained values are 92.73 mWb and 19.175 mWb. Conclusion: In this study, a simplified method for the determination of the flux linkage characteristics of a submersible SRM using MATLAB has been presented. The obtained values have been validated with the ANSYS FEM method. The calculated unaligned and aligned inductance values of a 4-phase, 3 hp, 220 V submersible SRM obtained with the simplified MATLAB method closely match those from the ANSYS FEM method.
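
    At a fixed operating point the abstract's numbers give the inductances directly via L = λ/i, as the short sketch below shows (a linearization at the 7 A operating point).

```python
# Inductance from flux linkage: L = lambda / i at the 7 A operating point.
def inductance(flux_linkage_mwb, current_a):
    return flux_linkage_mwb * 1e-3 / current_a   # mWb -> Wb, result in H

for label, lam in [("aligned (FEA)", 92.73), ("unaligned (FEA)", 19.175)]:
    print(f"{label}: L = {inductance(lam, 7.0) * 1e3:.2f} mH")
# aligned: ~13.25 mH; unaligned: ~2.74 mH
```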

  18. Development of a new software tool, based on ANN technology, in neutron spectrometry and dosimetry research

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz R, J.M.; Martinez B, M.R.; Vega C, H.R. [Universidad Autonoma de Zacatecas, Av. Ramon Lopez Velarde 801, A.P. 336, 98000 Zacatecas (Mexico)

    2007-07-01

    Artificial Intelligence is a branch of study which enhances the capability of computers by giving them human-like intelligence. The brain architecture has been extensively studied and attempts have been made to emulate it, as in Artificial Neural Network (ANN) technology. A large variety of neural network architectures have been developed and they have gained widespread popularity over the last few decades. Their application is considered a substitute for many classical techniques that have been used for many years, as in the neutron spectrometry and dosimetry research areas. In previous works, a new approach called Robust Design of Artificial Neural Networks was applied to build an ANN topology capable of solving the neutron spectrometry and dosimetry problems within the MATLAB programming environment. In this work, the knowledge stored in the MATLAB ANN's synaptic weights was extracted in order to develop, for the first time, a customized software application based on ANN technology, which is proposed for use in the neutron spectrometry and simultaneous dosimetry fields. (Author)
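
    What extracting the synaptic weights enables can be sketched simply: once exported, the trained network is plain matrix algebra and can be re-implemented in any language. The weights, layer sizes and input interpretation below are random placeholders, not the trained values.

```python
# Forward pass of a small MLP using exported (here: random) weights.
import numpy as np

rng = np.random.default_rng(7)
W1, b1 = rng.normal(size=(10, 7)), rng.normal(size=10)   # hidden layer
W2, b2 = rng.normal(size=(31, 10)), rng.normal(size=31)  # output layer

def forward(counts):                     # e.g. detector count rates (assumed)
    h = np.tanh(W1 @ counts + b1)        # hidden activation
    return W2 @ h + b2                   # reconstructed spectrum bins

spectrum = forward(rng.uniform(0, 1, 7))
print(spectrum.shape)                    # (31,)
```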

  19. Building an infrastructure at PICKSC for the educational use of kinetic software tools

    Science.gov (United States)

    Mori, W. B.; Decyk, V. K.; Tableman, A.; Fonseca, R. A.; Tsung, F. S.; Hu, Q.; Winjum, B. J.; Amorim, L. D.; An, W.; Dalichaouch, T. N.; Davidson, A.; Joglekar, A.; Li, F.; May, J.; Touati, M.; Xu, X. L.; Yu, P.

    2016-10-01

    One aim of the Particle-In-Cell and Kinetic Simulation Center (PICKSC) at UCLA is to coordinate a community development of educational software for undergraduate and graduate courses in plasma physics and computer science. The rich array of physical behaviors exhibited by plasmas can be difficult to grasp by students. If they are given the ability to quickly and easily explore plasma physics through kinetic simulations, and to make illustrative visualizations of plasma waves, particle motion in electromagnetic fields, instabilities, or other phenomena, then they can be equipped with first-hand experiences that inform and contextualize conventional texts and lectures. We are developing an infrastructure for any interested persons to take our kinetic codes, run them without any prerequisite knowledge, and explore desired scenarios. Furthermore, we are actively interested in any ideas or input from other plasma physicists. This poster aims to illustrate what we have developed and gather a community of interested users and developers. Supported by NSF under Grant ACI-1339893.

  20. CAGO: a software tool for dynamic visual comparison and correlation measurement of genome organization.

    Science.gov (United States)

    Chang, Yi-Feng; Chang, Chuan-Hsiung

    2011-01-01

    CAGO (Comparative Analysis of Genome Organization) is developed to address two critical shortcomings of conventional genome atlas plotters: lack of dynamic exploratory functions and absence of signal analysis for genomic properties. With dynamic exploratory functions, users can directly manipulate chromosome tracks of a genome atlas and intuitively identify distinct genomic signals by visual comparison. Signal analysis of genomic properties can further detect inconspicuous patterns from noisy genomic properties and calculate correlations between genomic properties across various genomes. To implement dynamic exploratory functions, CAGO presents each genome atlas in Scalable Vector Graphics (SVG) format and allows users to interact with it using a SVG viewer through JavaScript. Signal analysis functions are implemented using R statistical software and a discrete wavelet transformation package waveslim. CAGO is not only a plotter for generating complex genome atlases, but also a platform for exploring genome atlases with dynamic exploratory functions for visual comparison and with signal analysis for comparing genomic properties across multiple organisms. The web-based application of CAGO, its source code, user guides, video demos, and live examples are publicly available and can be accessed at http://cbs.ym.edu.tw/cago.

  1. CAGO: a software tool for dynamic visual comparison and correlation measurement of genome organization.

    Directory of Open Access Journals (Sweden)

    Yi-Feng Chang

    CAGO (Comparative Analysis of Genome Organization) is developed to address two critical shortcomings of conventional genome atlas plotters: lack of dynamic exploratory functions and absence of signal analysis for genomic properties. With dynamic exploratory functions, users can directly manipulate chromosome tracks of a genome atlas and intuitively identify distinct genomic signals by visual comparison. Signal analysis of genomic properties can further detect inconspicuous patterns from noisy genomic properties and calculate correlations between genomic properties across various genomes. To implement dynamic exploratory functions, CAGO presents each genome atlas in Scalable Vector Graphics (SVG) format and allows users to interact with it using a SVG viewer through JavaScript. Signal analysis functions are implemented using R statistical software and a discrete wavelet transformation package waveslim. CAGO is not only a plotter for generating complex genome atlases, but also a platform for exploring genome atlases with dynamic exploratory functions for visual comparison and with signal analysis for comparing genomic properties across multiple organisms. The web-based application of CAGO, its source code, user guides, video demos, and live examples are publicly available and can be accessed at http://cbs.ym.edu.tw/cago.
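
    The signal-analysis idea can be sketched without R: a one-level Haar wavelet decomposition denoises a noisy genomic property before correlating it with another. This self-contained NumPy stand-in mimics the role the waveslim package plays inside CAGO; the GC-content signal is synthetic.

```python
# One-level Haar wavelet denoising of a noisy genomic property.
import numpy as np

rng = np.random.default_rng(5)
gc = 0.5 + 0.1 * np.sin(np.linspace(0, 6 * np.pi, 256)) \
     + rng.normal(0, 0.05, 256)                 # synthetic GC per window

approx = (gc[0::2] + gc[1::2]) / np.sqrt(2)     # Haar approximation coeffs
detail = (gc[0::2] - gc[1::2]) / np.sqrt(2)     # Haar detail (noise) coeffs

# Reconstruct with the detail coefficients zeroed -> denoised signal.
smooth = np.empty_like(gc)
smooth[0::2] = approx / np.sqrt(2)
smooth[1::2] = approx / np.sqrt(2)

r = np.corrcoef(gc, smooth)[0, 1]               # property-vs-property corr.
print(f"correlation raw vs. denoised GC: {r:.3f}")
```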

  2. Using Numerical Models in the Development of Software Tools for Risk Management of Accidents with Oil and Inert Spills

    Science.gov (United States)

    Fernandes, R.; Leitão, P. C.; Braunschweig, F.; Lourenço, F.; Galvão, P.; Neves, R.

    2012-04-01

    Numerical models can forecast the fate and behaviour of spilled substances, helping in the management of the crisis, in the distribution of response resources, or in prioritizing specific areas. They can also be used for the detection of pollution sources. However, the resources involved and the scientific and technological levels needed to operate numerical models have both limited the interoperability between operational models, monitoring tools and decision-support software tools. The increasing predictive capacity for metocean conditions and for the fate and behaviour of pollutants spilt at sea or in coastal zones, together with monitoring tools like vessel traffic control systems, can provide safer support for decision-making in emergency or planning issues associated with pollution risk management, especially if used in an integrated way. Following this approach, and taking advantage of an integrated framework developed in the ARCOPOL (www.arcopol.eu) and EASYCO (www.project-easy.info) projects, three innovative model-supported software tools were developed and applied in the Atlantic Area and/or the Portuguese coast. Two of these tools are used for spill model simulations: a web-based interface (the EASYCO web bidirectional tool) and an advanced desktop application (the MOHID Desktop Spill Simulator), both of which give the end user control over the model simulations. Parameters such as date and time of the event, location and oil spill volume are provided by the users; these interactive tools also integrate the best available metocean forecasts (waves, meteorological, hydrodynamics) from different institutions in the Atlantic Area. Metocean data are continuously gathered from remote THREDDS data servers (using OPENDAP) or FTP sites, and then automatically interpolated and pre-processed to be available to the simulators. These simulation tools can also import initial data and export results from/to remote servers, using OGC WFS services. Simulations are provided to the end user in a matter of seconds and can thus be very useful for rapid decision-making in an emergency.
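
    The core of such spill simulators is Lagrangian particle tracking; a heavily simplified sketch is shown below, with an invented current field standing in for the real metocean forecasts the tools ingest.

```python
# Lagrangian spill sketch: advection by a current field plus a random-walk
# term for turbulent diffusion (the basic scheme behind spill simulators,
# vastly simplified here).
import numpy as np

rng = np.random.default_rng(2)
n, dt, steps = 1_000, 600.0, 144          # particles, 10-min step, 24 h
x = np.zeros(n); y = np.zeros(n)          # spill released at origin (m)

def current(x, y, t):                     # placeholder metocean forecast
    return 0.3 * np.ones_like(x), 0.1 * np.sin(t / 7200.0) * np.ones_like(y)

K = 1.0                                   # horizontal diffusivity, m^2/s
for step in range(steps):
    u, v = current(x, y, step * dt)
    x += u * dt + rng.normal(0, np.sqrt(2 * K * dt), n)
    y += v * dt + rng.normal(0, np.sqrt(2 * K * dt), n)

print(f"slick centre after 24 h: ({x.mean():.0f} m, {y.mean():.0f} m)")
```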

  3. Software tools and preliminary design of a control system for the 40m OAN radiotelescope

    Science.gov (United States)

    de Vicente, P.; Bolaño, R.

    2004-07-01

    The Observatorio Astronómico Nacional (OAN) is building a 40m radiotelescope at its facilities in Yebes (Spain), which will be delivered by April 2004. The servosystem will be controlled by an ACU (Antenna Control Unit), a real-time computer running VxWorks, which will be commanded from a remote computer (RCC) or from a local computer (LCC) acting as console. We present the tools we have chosen to develop and run the control system on the RCC, and the criteria behind our choices. We also present a preliminary design of the control system on which we are currently working. The RCC will run a server which communicates with the ACU using sockets, and with the clients, receivers and backends using omniORB, a free implementation of CORBA. Clients written in Python will allow users to control the antenna from any host connected to a LAN or a secure Internet connection.
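
    The socket side of the described server can be sketched with the Python standard library; the command names and reply protocol below are invented for illustration, not the OAN interface.

```python
# Bare-bones command server: accepts textual commands over TCP and would
# forward them to the ACU in a real system.
import socketserver

class AntennaHandler(socketserver.StreamRequestHandler):
    def handle(self):
        for line in self.rfile:                      # one command per line
            cmd = line.decode().strip()
            if cmd.startswith("SLEW"):               # e.g. "SLEW 120.5 45.0"
                reply = "ACK " + cmd                 # would be relayed to ACU
            else:
                reply = "ERR unknown command"
            self.wfile.write((reply + "\n").encode())

if __name__ == "__main__":
    with socketserver.TCPServer(("127.0.0.1", 9000), AntennaHandler) as srv:
        srv.serve_forever()                          # clients connect via LAN
```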

  4. MAAC: a software tool for user authentication and access control to the electronic patient record in an open distributed environment

    Science.gov (United States)

    Motta, Gustavo H.; Furuie, Sergio S.

    2004-04-01

    Designing proper models for authorization and access control for the electronic patient record (EPR) is essential to wide-scale use of the EPR in large health organizations. This work presents MAAC (Middleware for Authentication and Access Control), a tool that implements a contextual role-based access control (RBAC) authorization model. RBAC regulates users' access to computer resources based on their organizational roles. A contextual authorization uses environmental information available at access-request time, like the user/patient relationship, in order to decide whether a user has the right to access an EPR resource. The software architecture in which MAAC is implemented uses the Lightweight Directory Access Protocol, the Java programming language, and the CORBA/OMG standards CORBA Security Service and Resource Access Decision Facility. With those open and distributed standards, heterogeneous EPR components can request user authentication and access authorization services in a unified and consistent fashion across multiple platforms.
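
    A contextual RBAC decision of the kind described combines a role check with a request-time context predicate; the sketch below uses invented roles, permissions and care relations, not MAAC's actual model.

```python
# Contextual RBAC: the role must grant the permission AND a context
# predicate (evaluated at request time) must hold.

role_permissions = {
    "physician": {"read_epr", "write_epr"},
    "nurse":     {"read_epr"},
}

def is_treating(user, patient, context):        # contextual condition
    return patient in context["care_relations"].get(user, set())

def authorize(user, role, action, patient, context):
    if action not in role_permissions.get(role, set()):
        return False                            # role alone is insufficient
    return is_treating(user, patient, context)  # plus request-time context

ctx = {"care_relations": {"dr_silva": {"patient_42"}}}
print(authorize("dr_silva", "physician", "read_epr", "patient_42", ctx))  # True
print(authorize("dr_silva", "physician", "read_epr", "patient_99", ctx))  # False
```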

  5. Neurophysiological analytics for all! Free open-source software tools for documenting, analyzing, visualizing, and sharing using electronic notebooks.

    Science.gov (United States)

    Rosenberg, David M; Horn, Charles C

    2016-08-01

    Neurophysiology requires an extensive workflow of information analysis routines, which often includes incompatible proprietary software, introducing limitations based on financial costs, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format, such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to display locally or remotely in an internet browser, which can be saved as an HTML, PDF, or other file format for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus, a model system to investigate gut-brain communication, for example, in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software, an implementation of reproducible research. This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase efficiency, transparency, and the collaborative potential of neurophysiological research.
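
    A typical notebook cell for the spike-detection step might look like the sketch below (synthetic trace, robust threshold, refractory lockout); the real notebooks operate on the vagal recordings described above.

```python
# Threshold-based spike detection with a robust noise estimate.
import numpy as np

rng = np.random.default_rng(11)
fs = 20_000                                   # sampling rate, Hz
trace = rng.normal(0, 1, fs)                  # 1 s of synthetic noise
for t in (2000, 9000, 15000):
    trace[t:t + 20] += 8.0                    # injected "spikes"

thresh = 5 * np.median(np.abs(trace)) / 0.6745   # robust sigma estimate
crossings = np.flatnonzero(trace > thresh)
spikes = [c for i, c in enumerate(crossings)
          if i == 0 or c - crossings[i - 1] > 30]  # ~1.5 ms refractory
print("detected spike samples:", spikes)
```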

  6. STUDY REGARDING THE USE OF THE TOOLS OFFERED BY MICROSOFT EXCEL SOFTWARE IN THE ACTIVITY OF THE BIHOR COUNTY COMPANIES

    Directory of Open Access Journals (Sweden)

    Țarca Naiana

    2014-07-01

    A business activity involves many decision situations, which require the organization, aggregation, processing, modelling and analysis of large volumes of information. By using specialized software that provides tools for analysis, aggregation and data modelling, company managers can quickly obtain significant information about company activity. For different reasons some companies opt for a summary analysis of data, while others opt for a complex analysis. Many companies use spreadsheet applications for inventory, data processing, modelling and analysis in support of their business activities. Microsoft Excel is the software used by many of those who know and use spreadsheet applications in their work. Using the tools for organizing, aggregating, modelling and analyzing data provided by a spreadsheet application, these companies can make complex economic analyses and prognoses that lead to better decisions. For example, Pivot tables are a quick and simple way to extract relevant information from complex data sets. Solver can be used to solve optimization problems. Correlation is useful in interpreting the relationships between various indicators. Based on these considerations we conducted a study in which we sought information on how instruments such as Pivot tables, Solver and Correlation are used in the business activities of companies. The companies that attach high importance to the use of Pivot tables are medium and large companies. Among the companies using the Solver tool, very few use the GRG Nonlinear and Evolutionary algorithms; therefore, Solver is used mostly to solve optimization problems involving linear models. The Correlation tool, which could help decision makers understand more easily how a change in one factor implies changes in other factors, and consequently help them make better decisions, is used far too little. Still, too many companies attach little importance to data organizing.
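
    A pandas analogue of two of the operations discussed (the study itself concerns Excel, so this is only a parallel illustration) is shown below on a small invented sales table.

```python
# Pivot table and correlation on a toy sales data set.
import pandas as pd

df = pd.DataFrame({
    "region":  ["N", "N", "S", "S", "N", "S"],
    "quarter": ["Q1", "Q2", "Q1", "Q2", "Q1", "Q2"],
    "sales":   [120, 150, 90, 110, 130, 95],
    "ads":     [10, 14, 7, 9, 12, 8],
})

pivot = pd.pivot_table(df, values="sales", index="region",
                       columns="quarter", aggfunc="sum")
print(pivot)
print("sales-vs-ads correlation:", df["sales"].corr(df["ads"]).round(2))
```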

  7. GPU-FS-kNN: a software tool for fast and scalable kNN computation using GPUs.

    Directory of Open Access Journals (Sweden)

    Ahmed Shamsul Arefin

    BACKGROUND: The analysis of biological networks has become a major challenge due to the recent development of high-throughput techniques that are rapidly producing very large data sets. The exploding volumes of biological data are craving for extreme computational power and special computing facilities (i.e. super-computers). An inexpensive solution, such as General Purpose computation based on Graphics Processing Units (GPGPU), can be adapted to tackle this challenge, but the limitation of the device internal memory can pose a new problem of scalability. An efficient data and computational parallelism with partitioning is required to provide a fast and scalable solution to this problem. RESULTS: We propose an efficient parallel formulation of the k-Nearest Neighbour (kNN) search problem, which is a popular method for classifying objects in several fields of research, such as pattern recognition, machine learning and bioinformatics. Being very simple and straightforward, the performance of the kNN search degrades dramatically for large data sets, since the task is computationally intensive. The proposed approach is not only fast but also scalable to large-scale instances. Based on our approach, we implemented a software tool GPU-FS-kNN (GPU-based Fast and Scalable k-Nearest Neighbour) for CUDA-enabled GPUs. The basic approach is simple and adaptable to other available GPU architectures. We observed speed-ups of 50-60 times compared with CPU implementation on a well-known breast microarray study and its associated data sets. CONCLUSION: Our GPU-based Fast and Scalable k-Nearest Neighbour search technique (GPU-FS-kNN) provides a significant performance improvement for nearest neighbour computation in large-scale networks. Source code and the software tool are available under GNU Public License (GPL) at https://sourceforge.net/p/gpufsknn/.
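
    The chunked brute-force pattern that the tool runs as CUDA kernels can be sketched on the CPU: process the queries in chunks so the full distance matrix never has to be held in (device) memory at once.

```python
# Chunked brute-force kNN with NumPy (CPU stand-in for the GPU kernels).
import numpy as np

rng = np.random.default_rng(8)
data = rng.normal(size=(5_000, 16)).astype(np.float32)
k, chunk = 5, 512

neighbours = np.empty((data.shape[0], k), dtype=np.int64)
for start in range(0, data.shape[0], chunk):
    q = data[start:start + chunk]
    # Squared Euclidean distances of this chunk against all points.
    d2 = (q**2).sum(1)[:, None] + (data**2).sum(1)[None, :] - 2 * q @ data.T
    np.fill_diagonal(d2[:, start:start + chunk], np.inf)   # exclude self
    neighbours[start:start + chunk] = np.argsort(d2, axis=1)[:, :k]

print(neighbours[:3])
```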

  8. The DSET Tool Library: A software approach to enable data exchange between climate system models

    Energy Technology Data Exchange (ETDEWEB)

    McCormick, J. [Lawrence Livermore National Lab., CA (United States)

    1994-12-01

    Climate modeling is a computationally intensive process. Until recently computers were not powerful enough to perform the complex calculations required to simulate the earth's climate. As a result standalone programs were created that represent components of the earth's climate (e.g., Atmospheric Circulation Model). However, recent advances in computing, including massively parallel computing, make it possible to couple the components forming a complete earth climate simulation. The ability to couple different climate model components will significantly improve our ability to predict climate accurately and reliably. Historically each major component of the coupled earth simulation is a standalone program designed independently with different coordinate systems and data representations. In order for two component models to be coupled, the data of one model must be mapped to the coordinate system of the second model. The focus of this project is to provide a general tool to facilitate the mapping of data between simulation components, with an emphasis on using object-oriented programming techniques to provide polynomial interpolation, line and area weighting, and aggregation services.
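
    The mapping service can be sketched as grid-to-grid interpolation; the bilinear scheme below stands in for the library's polynomial interpolation, weighting and aggregation options, with invented toy grids.

```python
# Map a field from an "atmosphere" grid to an "ocean" grid by interpolation.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

lat_a = np.linspace(-90, 90, 46)                  # atmosphere model grid
lon_a = np.linspace(0, 357.5, 144)
field = np.cos(np.deg2rad(lat_a))[:, None] * np.ones((46, 144))  # toy field

interp = RegularGridInterpolator((lat_a, lon_a), field)

lat_o = np.linspace(-80, 80, 33)                  # ocean model grid
lon_o = np.linspace(0, 355, 72)
pts = np.array(np.meshgrid(lat_o, lon_o, indexing="ij")).reshape(2, -1).T
ocean_field = interp(pts).reshape(33, 72)         # field on the ocean grid
print(ocean_field.shape)
```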

  9. freeQuant: A Mass Spectrometry Label-Free Quantification Software Tool for Complex Proteome Analysis.

    Science.gov (United States)

    Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong

    2015-01-01

    The study of complex proteomes places higher demands on quantification methods using mass spectrometry technology. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis which makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and introduces a new method for shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, MS/MS total ion count coupled with spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant can improve the accuracy of quantification with better dynamic range.
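
    The shared-peptide idea can be sketched in a few lines: spectra from a shared peptide are redistributed in proportion to each protein's unique-peptide evidence. The counts are invented, and freeQuant's actual scheme additionally folds in sequence length and ion intensity.

```python
# Spectral-count quantification with proportional shared-peptide splitting.
counts_unique = {"P1": 30, "P2": 10}      # spectra from unique peptides
shared = {("P1", "P2"): 8}                # spectra shared between isoforms

abundance = dict(counts_unique)
for proteins, n in shared.items():
    total_unique = sum(counts_unique[p] for p in proteins)
    for p in proteins:                    # proportional redistribution
        abundance[p] += n * counts_unique[p] / total_unique

print(abundance)    # {'P1': 36.0, 'P2': 12.0}
```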

  10. The Use of Automated Software Tools in Evaluating an e-Learning Platform Quality

    Directory of Open Access Journals (Sweden)

    George Suciu

    2012-09-01

    This paper proposes an expert system which can be used to evaluate the quality of an e-learning platform. The proposed expert system uses a modified version of the SEEQUEL Core Quality Framework and was built using the CLIPS expert system generator. The SEEQUEL Core Quality Framework, which originated from the collaboration between the e-learning Industry Group (eLIG) and a number of European expert organizations and associations coordinated by the MENON Network, is used to build a quality tree by selecting quality characteristics from a list of common characteristics applicable to the whole e-learning experience. CLIPS is a productive development and delivery expert system tool which provides a complete environment for the construction of rule-based expert systems.

    In the first part of this paper the SEEQUEL Core Quality Framework and the CLIPS expert system generator are presented, showing the advantage of using an expert system for this task. In the second part, a case study of evaluating an e-learning platform is presented. The final conclusion of the experiment was that an expert system can successfully replace a human expert for the proposed task.

  11. The Use of Automated Software Tools in Evaluating an e-Learning Platform Quality

    Directory of Open Access Journals (Sweden)

    Traian Lucian Militaru

    2012-09-01

    This paper proposes an expert system which can be used to evaluate the quality of an e-learning platform. The proposed expert system uses a modified version of the SEEQUEL Core Quality Framework and was built using the CLIPS expert system generator. The SEEQUEL Core Quality Framework, which originated from the collaboration between the e-learning Industry Group (eLIG) and a number of European expert organizations and associations coordinated by the MENON Network, is used to build a quality tree by selecting quality characteristics from a list of common characteristics applicable to the whole e-learning experience. CLIPS is a productive development and delivery expert system tool which provides a complete environment for the construction of rule-based expert systems. In the first part of this paper the SEEQUEL Core Quality Framework and the CLIPS expert system generator are presented, showing the advantage of using an expert system for this task. In the second part, a case study of evaluating an e-learning platform is presented. The final conclusion of the experiment was that an expert system can successfully replace a human expert for the proposed task.
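
    A tiny forward-chaining sketch of CLIPS-style evaluation: rules fire on facts about the platform and assert quality conclusions until a fixed point is reached. The criteria and conclusions are invented, not SEEQUEL's.

```python
# Minimal forward-chaining rule engine in the spirit of a CLIPS rule base.
facts = {"has_accessibility_statement", "tracks_learner_progress"}

rules = [
    ({"has_accessibility_statement"}, "accessibility: satisfactory"),
    ({"tracks_learner_progress"}, "learner monitoring: satisfactory"),
    ({"accessibility: satisfactory", "learner monitoring: satisfactory"},
     "overall quality: acceptable"),
]

changed = True
while changed:                        # keep firing until a fixed point
    changed = False
    for conditions, conclusion in rules:
        if conditions <= facts and conclusion not in facts:
            facts.add(conclusion)
            changed = True

print(sorted(f for f in facts if ":" in f))
```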

  12. UMMPerfusion: an open source software tool towards quantitative MRI perfusion analysis in clinical routine.

    Science.gov (United States)

    Zöllner, Frank G; Weisser, Gerald; Reich, Marcel; Kaiser, Sven; Schoenberg, Stefan O; Sourbron, Steven P; Schad, Lothar R

    2013-04-01

    To develop a generic open-source MRI perfusion analysis tool for quantitative parameter mapping to be used in a clinical workflow, together with methods for quality management of perfusion data. We implemented a classic, pixel-by-pixel deconvolution approach to quantify T1-weighted contrast-enhanced dynamic MR imaging (DCE-MRI) perfusion data as an OsiriX plug-in. It features parallel computing capabilities and an automated reporting scheme for quality management. Furthermore, by our implementation design, it can easily be extended to other perfusion algorithms. Obtained results are saved as DICOM objects and directly added to the patient study. The plug-in was evaluated on ten MR perfusion data sets of the prostate and a calibration data set by comparing the obtained parametric maps (plasma flow, volume of distribution, and mean transit time) to a widely used reference implementation in IDL. For all data, parametric maps could be calculated and the plug-in worked correctly and stably. On average, a deviation of 0.032 ± 0.02 ml/100 ml/min for plasma flow, 0.004 ± 0.0007 ml/100 ml for the volume of distribution, and 0.037 ± 0.03 s for the mean transit time was observed between our implementation and the reference implementation. By using computer hardware with eight CPU cores, calculation time could be reduced by a factor of 2.5. We successfully developed an open-source OsiriX plug-in for T1-DCE-MRI perfusion analysis in a routine, quality-managed clinical environment. Using model-free deconvolution, it allows for perfusion analysis in various clinical applications. With our plug-in, information about measured physiological processes can be obtained and transferred into clinical practice.
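
    For a single pixel, model-free deconvolution can be sketched as building the arterial-input convolution matrix and inverting it with a truncated SVD; the curves and truncation level below are synthetic choices for illustration, not the plug-in's algorithmic details.

```python
# Truncated-SVD deconvolution of a single-pixel tissue curve.
import numpy as np

dt, n = 1.0, 40                                  # 1 s sampling, 40 frames
t = np.arange(n) * dt
aif = np.exp(-((t - 10) / 3.0) ** 2)             # toy arterial input function
irf_true = 0.8 * np.exp(-t / 8.0)                # tissue impulse response

# Lower-triangular Toeplitz convolution matrix of the AIF.
A = dt * np.array([[aif[i - j] if i >= j else 0.0
                    for j in range(n)] for i in range(n)])
tissue = A @ irf_true                            # simulated pixel curve

U, s, Vt = np.linalg.svd(A)
s_inv = np.where(s > 0.15 * s[0], 1 / s, 0.0)    # truncation regularizes
irf_est = Vt.T @ (s_inv * (U.T @ tissue))
print(f"plasma-flow proxy: true {irf_true.max():.2f}, est {irf_est.max():.2f}")
```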

  13. The Liege Acromegaly Survey (LAS): a new software tool for the study of acromegaly.

    Science.gov (United States)

    Petrossians, Patrick; Tichomirowa, Maria A; Stevenaert, Achile; Martin, Didier; Daly, Adrian F; Beckers, Albert

    2012-06-01

    Acromegaly is a chronic rare disease associated with negative pathological effects on multiple systems and organs. We designed a new informatics tool to study data from patients with acromegaly, the Liege Acromegaly Survey (LAS). This relational database permits the inclusion of anonymous historical and prospective data on patients and includes pathophysiology, clinical features, responses to therapy and long term outcomes of acromegaly. We deployed the LAS in a validation study at a single center in order to study the characteristics of patients with acromegaly diagnosed at our center from 1970-2011. A total of 290 patients with acromegaly were included (147 males and 143 females). There was a linear relationship between age at diagnosis and the date of diagnosis, indicating that older patients are being diagnosed with acromegaly more frequently. A majority presented with macroadenomas (77.5%) and the median diameter was 14 mm. Patients with macroadenomas were significantly younger than patients with microadenomas (P=0.01). GH values at diagnosis decreased with the age of the patients (P=0.01) and there was a correlation between GH values and tumor size at diagnosis (P=0.02). No correlation existed between insulin-like growth factor 1 (IGF-1) levels and tumor characteristics. The prevalence of diabetes was 21.4% in this population and 41.0% had hypertension. The presence of hypertension and diabetes were significantly associated with one another (P<0.001). There was a linear relation between initial GH and IGF-1 levels at diagnosis and those obtained during SSA analog treatment and the lowest GH and IGF-1 values following SSA therapy were obtained in older patients (GH: P<0.001; IGF-1: P<0.001). The LAS is a new relational database that is feasible to use in the clinical research setting and permits ready pooling of anonymous patient data from multiple study sites to undertake robust statistical analyses of clinical and therapeutic characteristics.

  14. Integrative Biological Chemistry Program Includes The Use Of Informatics Tools, GIS And SAS Software Applications.

    Science.gov (United States)

    D'Souza, Malcolm J; Kashmar, Richard J; Hurst, Kent; Fiedler, Frank; Gross, Catherine E; Deol, Jasbir K; Wilson, Alora

    Wesley College is a private, primarily undergraduate, minority-serving institution located in the historic district of Dover, Delaware (DE). The College recently revised its baccalaureate biological chemistry program requirements to include a one-semester Physical Chemistry for the Life Sciences course and project-based experiential learning courses using instrumentation, data collection, data storage, statistical modeling, visualization, and computational techniques. In this revised curriculum, students begin with a traditional set of biology, chemistry, physics, and mathematics core requirements, a geographic information systems (GIS) course, and a choice of an instrumental analysis course or a statistical analysis systems (SAS) programming course; students can then add major electives that deepen and add value to their future post-graduate specialty areas. In a public health surveillance project based on US county ZIP codes, open-source georeferenced census, health, and health-disparity data were coupled with GIS and SAS tools to develop use-cases for chronic adult obesity, with income, poverty status, health insurance coverage, education, and age as categorical variables. Across the 48 contiguous states, obesity rates were found to be directly related to high poverty and inversely related to median income and educational attainment. For the State of Delaware, age and educational attainment were found to be the limiting obesity risk factors in its adult population. Furthermore, the 2004-2010 obesity trends showed that in two of the less densely populated Delaware counties, Sussex and Kent, adult obesity was rising at much higher rates than the national average.

  15. Visualization of 5D Assimilation Data for Meteorological Forecasting and Its Related Disaster Mitigations Utilizing Vis5D of Software Tool

    Directory of Open Access Journals (Sweden)

    Kohei Arai

    2013-09-01

    Full Text Available A method for visualizing 5D assimilation data for meteorological forecasting and related disaster mitigation, utilizing the Vis5D software tool, is proposed. In order to mitigate severe-weather-related disasters, meteorological forecasting and prediction are needed. Numerical weather forecasting produces several kinds of data, in particular assimilation data: time series of three-dimensional geophysical parameters that have to be represented visually on a computer display in a comprehensible manner. Among the available visualization software tools, Vis5D in particular can display animations of three-dimensional imagery data. Through experiments with NCEP/GDAS assimilation data, the proposed method is found to be appropriate for representing 5D assimilation data in a comprehensible manner.

  16. Assessment of the effect of Nd:YAG laser pulse operating parameters on the metallurgical characteristics of different tool steels using DOE software

    Directory of Open Access Journals (Sweden)

    T. Muhič

    2011-04-01

    Full Text Available To ensure the reliability of repair-welded tool surfaces, clad quality should be improved. The relationships between the metallurgical characteristics of cladding and the laser welding input parameters were studied using design of experiments (DOE) software. The influence of laser power, welding speed, focal point position and welding wire diameter on the weld-bead geometry (i.e. penetration, cladding zone width and heat-affected-zone width), microstructural homogeneity, dilution and bond strength was investigated on the commonly used tool steels 1.2083, 1.2312 and 1.2343, using DOE software.
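
    The design-of-experiments analysis described above can be sketched in a few lines of Python: build a two-level full factorial design for the four welding parameters and fit a main-effects linear model by least squares. The response values below are simulated; the factor effects and noise level are invented, not taken from the study.

        import numpy as np
        from itertools import product

        # Hypothetical 2-level full factorial design in coded units (-1/+1)
        factors = ["power", "speed", "focus", "wire_d"]
        X = np.array(list(product([-1, 1], repeat=4)), dtype=float)

        # Simulated responses, e.g. weld penetration [mm] for each of 16 runs
        rng = np.random.default_rng(0)
        y = (1.2 + 0.40 * X[:, 0] - 0.25 * X[:, 1] + 0.10 * X[:, 2]
             + rng.normal(0, 0.05, len(X)))

        # Fit y = b0 + sum(bi * xi) by least squares to estimate main effects
        A = np.column_stack([np.ones(len(X)), X])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        for name, b in zip(["intercept"] + factors, coef):
            print(f"{name:>9}: {b:+.3f}")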

  17. IMPACT OF THE USE OF THE CMAP-TOOLS SOFTWARE ON THE CONCEPT MAP TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Susy Karina Dávila Panduro

    2012-12-01

    Full Text Available The research aimed to apply the Cmap-Tools software to the use of concept maps in Social Sciences courses at the Faculty of Education Sciences and Humanities of the UNAP, in the city of Iquitos, in 2011. The study was experimental, with a pre-experimental static-group comparison design. The population consisted of the 147 students of the Social Sciences specialty of the Faculty of Education Sciences and Humanities of the UNAP; the sample was selected non-probabilistically, by intention, and comprised 44 students. Data were collected through a survey, with a questionnaire as the instrument. The data were processed with the SPSS software, version 17 (Spanish edition), to obtain a data matrix from which the data were organized into tables and graphs. Descriptive statistics (frequencies, simple averages and percentages) and the non-parametric inferential chi-square (X2) statistic were used for the analysis and interpretation of the data. The main hypothesis was tested with the non-parametric inferential chi-square (X2) test with α = 0.01 and df = 2, obtaining X2c = 25.83 and X2t = 9.21; since X2c > X2t, the research hypothesis was accepted: "Through the application of the Cmap-Tools software, the use of concept maps in Social Sciences courses at the Faculty of Education Sciences and Humanities of the UNAP in the city of Iquitos in 2011 will be improved."
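
    The decision rule reported above (X2c = 25.83 compared against a critical value X2t = 9.21 at α = 0.01 with 2 degrees of freedom) can be reproduced in a few lines of Python; this is only an illustration of the test, not the study's SPSS procedure.

        from scipy.stats import chi2

        alpha, df = 0.01, 2
        x2_computed = 25.83                      # value reported in the study
        x2_critical = chi2.ppf(1 - alpha, df)    # ~9.21
        print(f"critical value: {x2_critical:.2f}")
        if x2_computed > x2_critical:
            print("reject H0: the research hypothesis is accepted")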

  19. Software Reviews.

    Science.gov (United States)

    Dwyer, Donna; And Others

    1989-01-01

    Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; "Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)

  20. Detecting variants with Metabolic Design, a new software tool to design probes for explorative functional DNA microarray development

    Directory of Open Access Journals (Sweden)

    Gravelat Fabrice

    2010-09-01

    Full Text Available Abstract Background Microorganisms display vast diversity, and each one has its own set of genes, cell components and metabolic reactions. To assess their huge unexploited metabolic potential in different ecosystems, we need high-throughput tools, such as functional microarrays, that allow the simultaneous analysis of thousands of genes. However, most classical functional microarrays use specific probes that monitor only known sequences, and so fail to cover the full microbial gene diversity present in complex environments. We have thus developed an algorithm, implemented in the user-friendly program Metabolic Design, to design efficient explorative probes. Results First, we validated our approach by studying eight enzymes involved in the degradation of polycyclic aromatic hydrocarbons from the model strain Sphingomonas paucimobilis sp. EPA505, using a designed microarray of 8,048 probes. As expected, microarray assays identified the targeted set of genes induced during biodegradation kinetics experiments with various pollutants. We then confirmed the identity of these new genes by sequencing, and corroborated the quantitative discrimination of our microarray by quantitative real-time PCR. Finally, we assessed the metabolic capacities of microbial communities in soil contaminated with aromatic hydrocarbons. Results show that our probe design (in terms of sensitivity and explorative quality) can be used to study a complex environment efficiently. Conclusions We successfully used our microarray to detect the expression of genes encoding enzymes involved in polycyclic aromatic hydrocarbon degradation in the model strain. In addition, DNA microarray experiments performed on soil polluted by organic pollutants, without prior sequence assumptions, demonstrated high specificity and sensitivity for gene detection. Metabolic Design is thus a powerful, efficient tool that can be used to design explorative probes and monitor metabolic pathways in complex environments.

  1. ImaSim, a software tool for basic education of medical x-ray imaging in radiotherapy and radiology

    Science.gov (United States)

    Landry, Guillaume; deBlois, François; Verhaegen, Frank

    2013-11-01

    Introduction: X-ray imaging is an important part of medicine and plays a crucial role in radiotherapy. Education in this field is mostly limited to textbook teaching due to equipment restrictions. A novel simulation tool, ImaSim, for teaching the fundamentals of the x-ray imaging process based on ray tracing is presented in this work. ImaSim is used interactively via a graphical user interface (GUI). Materials and methods: The software package covers the main x-ray based medical modalities: planar kilovoltage (kV), planar (portal) megavoltage (MV), fan-beam computed tomography (CT) and cone-beam CT (CBCT) imaging. The user can modify the photon source, the object to be imaged and the imaging setup with three-dimensional editors. Objects are currently obtained by combining blocks with variable shapes. The imaging of three-dimensional voxelized geometries is currently not implemented, but can be added in a later release. The program follows a ray-tracing approach, ignoring photon scatter in its current implementation. Simulations of a phantom CT scan were generated in ImaSim and compared to measured data in terms of CT number accuracy. Spatial variations in the photon fluence and mean energy from an x-ray tube caused by the heel effect were estimated from ImaSim and Monte Carlo simulations and compared. Results: In this paper we describe ImaSim and provide two examples of its capabilities. CT numbers were found to agree within 36 Hounsfield Units (HU) for bone, which corresponds to a 2% attenuation coefficient difference. ImaSim reproduced the heel effect reasonably well when compared to Monte Carlo simulations. Discussion: An x-ray imaging simulation tool is made available for teaching and research purposes. ImaSim provides a means to facilitate the teaching of medical x-ray imaging.
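
    The ray-tracing idea (primary transmission only, scatter ignored, as in ImaSim's current implementation) can be sketched with the Beer-Lambert law; the phantom geometry, attenuation coefficients and detector layout below are invented and are not ImaSim's internals.

        import numpy as np

        # Hypothetical 2D phantom: water block with a bone insert (mu in 1/cm)
        MU_WATER, MU_BONE = 0.20, 0.50
        nx, ny, pix = 200, 200, 0.1              # grid size and pixel size [cm]
        mu = np.zeros((ny, nx))
        mu[50:150, 50:150] = MU_WATER
        mu[80:120, 80:120] = MU_BONE

        # Parallel-beam rays along x: the line integral is a sum over columns
        path = mu.sum(axis=1) * pix              # integral of mu dl per detector row
        I0 = 1.0                                 # incident fluence (arbitrary units)
        I = I0 * np.exp(-path)                   # Beer-Lambert attenuation

        print("minimum transmission (through bone):", I.min().round(4))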

  3. Developing a Generic Risk Assessment Simulation Modelling Software Tool for Assessing the Risk of Foot and Mouth Virus Introduction.

    Science.gov (United States)

    Tameru, B; Gebremadhin, B; Habtemariam, T; Nganwa, D; Ayanwale, O; Wilson, S; Robnett, V; Wilson, W

    2008-06-01

    Foot and Mouth Disease (FMD) is a highly contagious viral disease that affects all cloven-hoofed animals. Because of its devastating effects on the agricultural industry, many countries take measures to stop the introduction of FMD virus. Decision makers at multiple levels of the United States Department of Agriculture (USDA) use Risk Assessments (RAs), both quantitative and qualitative, to make better and more informed, scientifically based decisions to prevent the accidental or intentional introduction of the disease. There is a need for a generic RA that can be applied to any country (whether FMD-free or not) and to any product (FMD-infected animals and animal products). We developed a user-friendly generic RA software tool that can be used to conduct and examine different scenarios of quantitative/qualitative risk assessment for countries with varying FMD statuses in relation to the reintroduction of FMD virus into the USA. The program was written in Microsoft Visual Basic 6.0 (Microsoft Corporation, Redmond, Washington, USA). The @Risk 6.1 Developer Kit (RDK) and the @Risk 6.1 Best Fit Kit library (Palisade Corporation, Newfield, NY, USA) were used to build Monte Carlo simulation models. Microsoft Access 2000 (Microsoft Corporation, Redmond, Washington, USA) and SQL were used to query the data. Different input probability distributions can be selected for the nodes of the scenario tree; the output for each end-state of the simulation is given in several graphical formats, and statistical values are used to describe the likelihood of FMD virus introduction. A sensitivity analysis, determining which input factors have the greatest effect on the total risk outputs, is also provided. The generic RA tool can eventually be extended and modified to conduct RAs for other animal diseases and animal products.
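
    A toy version of the scenario-tree Monte Carlo approach might look like the sketch below; the pathway, distributions and parameter values are invented for illustration and do not come from the USDA tool or the @Risk models.

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(42)
        N = 100_000                                  # Monte Carlo iterations

        # Hypothetical release pathway: infected shipment -> escapes detection
        # -> exposes susceptible livestock; each node has an uncertain probability
        p_infected = rng.beta(2, 200, N)             # prevalence in imports
        p_undetected = rng.uniform(0.01, 0.10, N)    # inspection failure rate
        p_exposure = rng.triangular(0.05, 0.2, 0.5, N)

        risk = p_infected * p_undetected * p_exposure
        print(f"mean risk {risk.mean():.2e}, 95th pct {np.percentile(risk, 95):.2e}")

        # Crude sensitivity analysis: rank correlation of inputs with the output
        for name, x in [("prevalence", p_infected),
                        ("detection failure", p_undetected),
                        ("exposure", p_exposure)]:
            rho, _ = spearmanr(x, risk)
            print(f"{name:>17}: rho = {rho:+.2f}")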

  4. WASI-2D: A software tool for regionally optimized analysis of imaging spectrometer data from deep and shallow waters

    Science.gov (United States)

    Gege, Peter

    2014-01-01

    Image processing software has been developed which allows quantitative analysis of multi- and hyperspectral data from oceanic, coastal and inland waters. It has been implemented in the Water Colour Simulator WASI, a tool for the simulation and analysis of the optical properties and light field parameters of deep and shallow waters. The new module WASI-2D can import atmospherically corrected images from airborne sensors and satellite instruments in various data formats and units, such as remote sensing reflectance or radiance. It can easily be adapted by the user to different sensors and to the optical properties of the studied area. Data analysis is done by inverse modelling using established analytical models. The bio-optical model of the water column accounts for gelbstoff (coloured dissolved organic matter, CDOM), detritus, and mixtures of up to 6 phytoplankton classes and 2 spectrally different types of suspended matter. The reflectance of the sea floor is treated as the sum of up to 6 substrate types. An analytic model of downwelling irradiance allows wavelength-dependent modelling of sun glint and sky glint at the water surface. The provided database covers the spectral range from 350 to 1000 nm in 1 nm intervals. It can easily be exchanged to represent the optical properties of the water constituents, bottom types and atmosphere of the studied area.
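
    Inverse modelling of a measured reflectance spectrum amounts to a nonlinear least-squares fit of a forward model; the toy two-parameter model below (chlorophyll and suspended-matter scalings of invented basis spectra) is purely illustrative and far simpler than WASI-2D's bio-optical model.

        import numpy as np
        from scipy.optimize import least_squares

        wl = np.linspace(400, 800, 200)                   # wavelengths [nm]
        chl_basis = np.exp(-0.5 * ((wl - 675) / 20) ** 2) # invented absorption band
        spm_basis = (wl / 550.0) ** -1.0                  # invented scattering slope

        def forward(p):
            chl, spm = p                                  # concentration proxies
            return 0.02 + 0.001 * spm * spm_basis - 0.003 * chl * chl_basis

        # Synthetic "measurement" with noise; true parameters are (3.0, 5.0)
        rng = np.random.default_rng(1)
        r_meas = forward([3.0, 5.0]) + rng.normal(0, 1e-4, wl.size)

        fit = least_squares(lambda p: forward(p) - r_meas, x0=[1.0, 1.0],
                            bounds=([0, 0], [100, 100]))
        print("retrieved parameters:", fit.x.round(2))    # ~[3.0, 5.0]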

  5. Using a Software Tool in Forecasting: a Case Study of Sales Forecasting Taking into Account Data Uncertainty

    Science.gov (United States)

    Fabianová, Jana; Kačmáry, Peter; Molnár, Vieroslav; Michalik, Peter

    2016-10-01

    Forecasting is one of the logistics activities, and a sales forecast is the starting point for the elaboration of business plans. Forecast accuracy affects business outcomes and ultimately may significantly affect the economic stability of the company. The accuracy of the prediction depends on the suitability of the forecasting methods used, experience, the quality of input data, the time period and other factors. The input data are usually not deterministic but are often of a random nature, affected by uncertainties of the market environment and many other factors. By taking the input data uncertainty into account, the forecast error can be reduced. This article deals with the use of a software tool for incorporating data uncertainty into forecasting. A forecasting approach is proposed, and the impact of uncertain input parameters on the target forecast value is simulated in a case study model. Statistical analysis and risk analysis of the forecast results are carried out, including sensitivity analysis and variable-impact analysis.
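
    Incorporating input uncertainty into a forecast can be sketched as a simple Monte Carlo propagation: sample the uncertain inputs, run the deterministic forecast for each sample, and summarize the resulting distribution. All figures below are invented and unrelated to the case study.

        import numpy as np

        rng = np.random.default_rng(7)
        N = 50_000                               # Monte Carlo samples

        # Hypothetical uncertain inputs for a 12-month sales forecast
        base = 1000.0                            # current monthly unit sales
        growth = rng.normal(0.02, 0.01, N)       # uncertain monthly growth rate
        season = 1 + 0.1 * np.sin(2 * np.pi * np.arange(12) / 12)

        # Propagate: units per month for every sampled growth trajectory
        units = base * (1 + growth)[:, None] ** np.arange(1, 13) * season
        annual = units.sum(axis=1)

        lo, med, hi = np.percentile(annual, [5, 50, 95])
        print(f"annual units: P5={lo:,.0f}  median={med:,.0f}  P95={hi:,.0f}")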

  6. Semi-automatic measurement of left ventricular function on dual source computed tomography using five different software tools in comparison with magnetic resonance imaging

    NARCIS (Netherlands)

    de Jonge, G. J.; van der Vleuten, P. A.; Overbosch, J.; Lubbers, D. D.; Jansen-van der Weide, M. C.; Zijlstra, F.; van Ooijen, P. M. A.; Oudkerk, M.

    2011-01-01

    Purpose: To compare left ventricular (LV) function assessment using five different software tools on the same dual source computed tomography (DSCT) datasets with the results of MRI. Materials and methods: Twenty-six patients, undergoing cardiac contrast-enhanced DSCT were included (20 men, mean age

  7. The FRISBEE tool, a software for optimising the trade-off between food quality, energy use, and global warming impact of cold chains

    NARCIS (Netherlands)

    Gwanpua, S.G.; Verboven, P.; Leducq, D.; Brown, T.; Verlinden, B.E.; Bekele, E.; Aregawi, W.; Evans, J.; Foster, A.; Duret, S.; Hoang, H.M.; Sluis, S. van der; Wissink, E.; Hendriksen, L.J.A.M.; Taoukis, P.; Gogou, E.; Stahl, V.; El Jabri, M.; Le Page, J.F.; Claussen, I.; Indergård, E.; Nicolai, B.M.; Alvarez, G.; Geeraerd, A.H.

    2015-01-01

    Food quality (including safety) along the cold chain, energy use and global warming impact of refrigeration systems are three key aspects in assessing cold chain sustainability. In this paper, we present the framework of a dedicated software, the FRISBEE tool, for optimising quality of refrigerated

  8. Claire, a tool used for the simulation of events in software tests; Claire, un outil de simulation evenementielle pour le test des logiciels

    Energy Technology Data Exchange (ETDEWEB)

    Henry, J.Y.; Boulc'h, J. [CEA Centre d'Etudes de Fontenay-aux-Roses, 92 (France). Dept. d'Evaluation de Surete; Raguideau, J.; Schoen, D. [CEA Centre d'Etudes de Saclay, 91 - Gif-sur-Yvette (France). Dept. d'Electronique et d'Instrumentation Nucleaire

    1994-06-01

    CLAIRE provides a purely software-based system which makes it possible to validate on-line applications against their specifications or their code. This tool offers easy graphic design of the application and of its environment. It carries out quite efficiently the simulation of any given model and controls its evolution either dynamically or against prerecorded time. (TEC).

  9. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.
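
    The idea of monitoring an executing program against prespecified requirement constraints can be illustrated with a small run-time checker; the decorator below enforces simple pre- and postconditions and is only a schematic stand-in for the SAGE-based mechanism described above.

        import functools

        def monitor(pre=None, post=None):
            # Wrap a function with run-time requirement checks (illustrative)
            def wrap(fn):
                @functools.wraps(fn)
                def inner(*args, **kwargs):
                    if pre is not None and not pre(*args, **kwargs):
                        raise AssertionError(f"precondition violated: {fn.__name__}")
                    result = fn(*args, **kwargs)
                    if post is not None and not post(result):
                        raise AssertionError(f"postcondition violated: {fn.__name__}")
                    return result
                return inner
            return wrap

        @monitor(pre=lambda x: x >= 0, post=lambda r: r >= 0)
        def isqrt(x):
            return int(x ** 0.5)

        print(isqrt(16))   # 4; isqrt(-1) would raise AssertionError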

  10. in silico identification of genetic variants in glucocerebrosidase (GBA gene involved in Gaucher’s disease using multiple software tools.

    Directory of Open Access Journals (Sweden)

    Madhumathi eManickam

    2014-05-01

    Full Text Available Gaucher's disease is an autosomal recessive disorder caused by a deficiency of glucocerebrosidase, a lysosomal enzyme that catalyses the hydrolysis of the glycolipid glucocerebroside to ceramide and glucose. Polymorphisms in the GBA gene have been associated with the development of Gaucher's disease. We hypothesize that predicting SNPs with multiple state-of-the-art software tools will increase confidence in the identification of SNPs involved in Gaucher's disease. Enzyme replacement therapy is the only treatment option for GD. Our goal is to use several state-of-the-art SNP algorithms to predict harmful SNPs through comparative studies. In this study, seven different algorithms (SIFT, MutPred, nsSNP Analyzer, PANTHER, PMUT, PROVEAN and SNPs&GO) were used to predict harmful polymorphisms. Among the seven programs, SIFT identified 47 nsSNPs as deleterious and MutPred identified 46 nsSNPs as harmful. The nsSNP Analyzer program found 43 of the 47 nsSNPs to be disease-causing, whereas PANTHER classified 32 of 47 as highly deleterious; 22 of 47 were classified as pathological mutations by PMUT, 44 of 47 were predicted to be deleterious by the PROVEAN server, and all 47 were flagged as disease-related mutations by SNPs&GO. Twenty-two nsSNPs were commonly predicted by all seven algorithms. These 22 targeted mutations are F251L, C342G, W312C, P415R, R463C, D127V, A309V, G46E, G202E, P391L, Y363C, Y205C, W378C, I402T, S366R, F397S, Y418C, P401L, G195E, W184R, R48W and T43R.
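
    The consensus step described above (variants flagged by every one of the seven predictors) is a simple set intersection; the sketch below shows the idea with a few of the reported variants and abbreviated, hypothetical per-tool result sets.

        from functools import reduce

        # Hypothetical per-tool prediction sets (heavily abbreviated; the
        # study's full lists contain up to 47 nsSNPs per tool)
        predictions = {
            "SIFT":    {"F251L", "C342G", "W312C", "R463C", "T43R"},
            "MutPred": {"F251L", "C342G", "R463C", "T43R"},
            "PROVEAN": {"F251L", "C342G", "W312C", "T43R"},
        }

        # Variants called deleterious by every tool form the consensus set
        consensus = reduce(set.intersection, predictions.values())
        print(sorted(consensus))   # ['C342G', 'F251L', 'T43R']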

  11. Development of a web GIS application for emissions inventory spatial allocation based on open source software tools

    Science.gov (United States)

    Gkatzoflias, Dimitrios; Mellios, Giorgos; Samaras, Zissis

    2013-03-01

    Combining emission inventory methods and geographic information systems (GIS) remains a key issue for environmental modelling and management purposes. This paper examines the development of a web GIS application, as part of an emission inventory system, that produces maps and files with spatially allocated emissions in a grid format. The study is not confined to the maps produced but also presents the features and capabilities of a web application that can be used by any user, even without prior knowledge of the GIS field. The development of the application was based on open-source software tools such as MapServer for the GIS functions, PostgreSQL and PostGIS for data management, and HTML, PHP and JavaScript as programming languages. In addition, background processes are used in an innovative manner to handle the time-consuming and computationally costly procedures of the application. Furthermore, a web map service was created to provide maps to other clients, such as the Google Maps API v3 that is used as part of the user interface. The output of the application includes maps in vector and raster format, maps with temporal resolution on a daily and hourly basis, grid files that can be used by air quality management systems, and grid files consistent with the European Monitoring and Evaluation Programme grid. Although the system was developed and validated for the Republic of Cyprus, covering a remarkably wide range of pollutants and emission sources, it can easily be customized for use in other countries or smaller areas, as long as geospatial and activity data are available.
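
    Spatial allocation of emissions onto a regular grid is, at its core, a weighted 2D histogram; the sketch below bins hypothetical point sources into grid cells, with coordinates, grid resolution and emission values invented for illustration.

        import numpy as np

        # Hypothetical point emission sources: longitude, latitude, annual NOx [t]
        sources = np.array([[33.05, 34.70, 12.0],
                            [33.40, 35.10,  4.5],
                            [32.90, 34.95,  7.2]])

        # Allocate emissions onto a regular 0.1-degree grid by summing per cell
        lon_edges = np.arange(32.0, 34.7, 0.1)
        lat_edges = np.arange(34.5, 35.8, 0.1)
        grid, _, _ = np.histogram2d(sources[:, 0], sources[:, 1],
                                    bins=[lon_edges, lat_edges],
                                    weights=sources[:, 2])
        print("total allocated [t]:", grid.sum())   # equals the sum of sources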

  12. Hypertextuality, Complexity, Creativity: Using Linguistic Software Tools to Uncover New Information about the Food and Drink of Historic Mayans

    Directory of Open Access Journals (Sweden)

    Rose Lema

    2012-05-01

    Full Text Available In this essay on natural language I present a computer-supported study of words, sentences and hypertexts concerning bromatology (the study of food and drink) in a XVI century Maya-Spanish Calepin—the most complete and extended dictionary ever written on the culture of the constructors of the wonderful and prestigious Mayan cities of Uxmal, Kalakmul and Chichén-Itzá (ARZÁPALO, 1995). For constructing a complex corpus, I apply concepts of the three-body and fractal dimension theories (POINCARÉ, 1908; MANDELBROT, 1975). First, I register an initial body of text by simply searching, via the find key, for abbreviations of bromatology and botany already recorded by the citation word in the Calepin. Then, I arbitrarily shorten the Spanish form corresponding to tasty and gather it through the whole dictionary. This way I obtain three bodies of interpretative meaning, lexias (BARTHES, 2002). Second, I establish the second- and third-dimensional hypertextual relations between the gleaned words or sentences of text, as well as their co-occurrences, by using the comprehensive linguistics software Tropes, a mixed lexical and content analysis tool, which brings up the qualitative and quantitative data pertinent to the research. Third, to bring back the colonial Maya voices of the Calepin, I surf the Internet and add to both written bodies of text a third text composed of beautiful colored images presenting food, drinks and tasty dishes that are still enjoyed by the Maya today and have been appreciated for almost five centuries. Notwithstanding the above, none of the three bodies (corpora), nested fractally one inside the other, is exhaustive. Nonetheless, the study of their interrelations could lead to the deepening of our knowledge of the complex juxtaposition of Siglo de Oro and Maya languages and cultures in the Yucatán Peninsula. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1202215

  13. DESASS: A software tool for designing, simulating and optimising WWTPs; DESASS: una herramienta informatica para el diseno, simulacion y optimizacion de EDARs

    Energy Technology Data Exchange (ETDEWEB)

    Ferrer Polo, J.; Seco Torecillas, A.; Serralta Sevilla, J.; Ribes Bertomeu, J.; Manga Certain, J.; Asensi Dasi, E.; Morenilla Martinez, J. J.; Llavador Colomer, F.

    2005-07-01

    This paper presents a very useful tool for designing, simulating and optimising wastewater treatment plants (WWTPs). This software, called DESASS (Design and Simulation of Activated Sludge Systems), has been developed by the Calagua research group with financial support from the Entidad Publica de Saneamiento de Aguas Residuales de la Comunidad Valenciana and the companies Sear and Aquagest. The software allows designing, simulating and optimising the overall performance of a WWTP, and the mathematical model implemented considers most of the physical, chemical and biological processes taking place in WWTPs. (Author) 15 refs.

  14. Software tool for resolution of inverse problems using artificial intelligence techniques: an application in neutron spectrometry; Herramienta en software para resolucion de problemas inversos mediante tecnicas de inteligencia artificial: una aplicacion en espectrometria neutronica

    Energy Technology Data Exchange (ETDEWEB)

    Castaneda M, V. H.; Martinez B, M. R.; Solis S, L. O.; Castaneda M, R.; Leon P, A. A.; Hernandez P, C. F.; Espinoza G, J. G.; Ortiz R, J. M.; Vega C, H. R. [Universidad Autonoma de Zacatecas, 98000 Zacatecas, Zac. (Mexico); Mendez, R. [CIEMAT, Departamento de Metrologia de Radiaciones Ionizantes, Laboratorio de Patrones Neutronicos, Av. Complutense 22, 28040 Madrid (Spain); Gallego, E. [Universidad Politecnica de Madrid, Departamento de Ingenieria Nuclear, C. Jose Gutierrez Abascal 2, 28006 Madrid (Spain); Sousa L, M. A. [Comision Nacional de Energia Nuclear, Centro de Investigacion de Tecnologia Nuclear, Av. Pte. Antonio Carlos 6627, Pampulha, 31270-901 Belo Horizonte, Minas Gerais (Brazil)

    2016-10-15

    The Taguchi methodology has proved to be highly efficient in solving inverse problems, in which the values of some parameters of a model must be obtained from observed data. Intrinsic mathematical characteristics make a problem an inverse one, and inverse problems appear in many branches of science, engineering and mathematics. Researchers have used different techniques to solve this type of problem, and recently the use of techniques based on Artificial Intelligence is being explored. This paper presents the use of a software tool based on generalized regression artificial neural networks for the solution of inverse problems, with application in high-energy physics, specifically the problem of neutron spectrometry. To solve this problem we use a software tool developed in the MATLAB programming environment, which provides a friendly, intuitive and easy-to-use interface. This computational tool solves the inverse problem involved in the reconstruction of a neutron spectrum based on measurements made with a Bonner sphere spectrometric system. Given this information, the neural network is able to reconstruct the neutron spectrum with high performance and generalization capability. The tool does not require the end user to have extensive training or technical knowledge in the development and/or use of software, which facilitates its use for the resolution of inverse problems arising in several areas of knowledge. Artificial Intelligence techniques are particularly well suited to solving inverse problems, given the characteristics of artificial neural networks and their network topology; the tool developed has therefore been very useful, since the results generated by the artificial neural network require little time compared with other techniques and agree with the actual data of the experiment. (Author)
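
    A generalized regression neural network is essentially a Gaussian-kernel weighted average of the training outputs (a Nadaraya-Watson estimator), which can be written in a few lines; the training pairs below are random placeholders, not Bonner sphere data.

        import numpy as np

        def grnn_predict(X_train, Y_train, x, sigma=0.3):
            # Gaussian-kernel weighted average of training outputs (illustrative)
            d2 = np.sum((X_train - x) ** 2, axis=1)
            w = np.exp(-d2 / (2 * sigma ** 2))
            return w @ Y_train / w.sum()

        # Hypothetical training pairs: sphere count rates -> spectrum bins
        rng = np.random.default_rng(3)
        X = rng.random((200, 7))          # 7 Bonner sphere readings per example
        Y = rng.random((200, 31))         # 31-bin neutron spectra

        x_new = rng.random(7)             # new set of measured count rates
        spectrum = grnn_predict(X, Y, x_new)
        print(spectrum.shape, spectrum[:5].round(3))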

  15. An open CAM system for dentistry on the basis of China-made 5-axis simultaneous contouring CNC machine tool and industrial CAM software.

    Science.gov (United States)

    Lu, Li; Liu, Shusheng; Shi, Shenggen; Yang, Jianzhong

    2011-10-01

    A China-made 5-axis simultaneous contouring CNC machine tool and domestically developed industrial computer-aided manufacturing (CAM) technology were used for full crown fabrication and measurement of crown accuracy, in an attempt to establish an open CAM system for dental processing and to promote the introduction of domestic dental computer-aided design (CAD)/CAM systems. Commercially available scanning equipment was used to make a basic digital tooth model after crown preparation, and the CAD software that comes with the scanning device was employed to design the crown; domestic industrial CAM software then processed the crown data to generate a solid model for machining, the China-made 5-axis simultaneous contouring CNC machine tool completed the machining of the whole crown, and the internal accuracy of the crown was measured using 3D-MicroCT. The results showed that the China-made 5-axis simultaneous contouring CNC machine tool, in combination with domestic industrial CAM technology, can be used for crown making, and the crown was well positioned on the die; the internal accuracy was successfully measured using 3D-MicroCT. It is concluded that an open CAM system for dentistry, based on a China-made 5-axis simultaneous contouring CNC machine tool and domestic industrial CAM software, has been established, and development of the system will promote the introduction of domestically produced dental CAD/CAM systems.

  16. Evaluation Model of Collaborative Software Engineering Tools

    Institute of Scientific and Technical Information of China (English)

    南磊

    2011-01-01

    Collaborative Software Engineering (CSE) is a rapidly developing field, and CSE tools keep emerging one after another, ranging from commercial integrated development environments to experimental prototypes. The nature of their support for collaboration, however, still requires deeper study. Based on an analysis of domestic and international research on, and development of, CSE tools, this paper identifies the key factors that should be considered in developing such tools and puts forward an evaluation framework, the Co-Workstyle model, which defines four core indexes for evaluating CSE tools: awareness, synchronization, artifact and coordination. The model is then used to analyze existing collaborative software engineering tools.

  17. A Visual Basic simulation software tool for performance analysis of a membrane-based advanced water treatment plant.

    Science.gov (United States)

    Pal, P; Kumar, R; Srivastava, N; Chowdhury, J

    2014-02-01

    A Visual Basic simulation software package (WATTPPA) has been developed to analyse the performance of an advanced wastewater treatment plant. This user-friendly, menu-driven software is based on a dynamic mathematical model of an industrial wastewater treatment scheme that integrates chemical, biological and membrane-based unit operations. The software-predicted results agree very well with the experimental findings, as indicated by an overall correlation coefficient of the order of 0.99. The software permits pre-analysis and manipulation of input data, helps in optimization, and presents the performance of the integrated plant visually on a graphical platform. It allows quick performance analysis of the whole system as well as of the individual units. The software, the first of its kind in its domain and built in the well-known Microsoft Excel environment, is likely to be very useful in the successful design, optimization and operation of an advanced hybrid treatment plant for hazardous wastewater.

  18. Development of Software for Analyzing Breakage of Cutting Tools Based on Image Processing

    Institute of Scientific and Technical Information of China (English)

    赵彦玲; 刘献礼; 王鹏; 王波; 王红运

    2004-01-01

    As present-day digital microscopy systems do not provide specialized microscopes that can detect cutting-tool breakage, analysis software has been developed using VC++. A module for edge detection and image segmentation is designed specifically for cutting tools. Known calibration relations and given postulates are used in scale measurements. Practical operation shows that the software can perform accurate detection.

  19. A Decision Tool that Combines Discrete Event Software Process Models with System Dynamics Pieces for Software Development Cost Estimation and Analysis

    Science.gov (United States)

    Mizell, Carolyn Barrett; Malone, Linda

    2007-01-01

    The development process for a large software development project is very complex and dependent on many variables that are dynamic and interrelated. Factors such as size, productivity and defect injection rates will have substantial impact on the project in terms of cost and schedule. These factors can be affected by the intricacies of the process itself as well as human behavior because the process is very labor intensive. The complex nature of the development process can be investigated with software development process models that utilize discrete event simulation to analyze the effects of process changes. The organizational environment and its effects on the workforce can be analyzed with system dynamics that utilizes continuous simulation. Each has unique strengths and the benefits of both types can be exploited by combining a system dynamics model and a discrete event process model. This paper will demonstrate how the two types of models can be combined to investigate the impacts of human resource interactions on productivity and ultimately on cost and schedule.
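
    A minimal sketch of the combination described above: a discrete event loop completes work packages while a continuously integrated "fatigue" state (the system dynamics piece) degrades productivity between events; all rates, units and state variables are invented.

        # Continuous (system dynamics) piece: fatigue lowers productivity
        def productivity(fatigue):
            return 1.0 / (1.0 + fatigue)             # tasks/day, invented relation

        # Discrete event piece: one completion event per work package
        tasks = [10.0] * 20                           # 20 packages of 10 effort units
        t, fatigue = 0.0, 0.0
        for size in tasks:
            dt = size / productivity(fatigue)         # event: package completion
            t += dt
            fatigue = (fatigue + 0.002 * dt) * 0.98   # Euler step of the SD state
        print(f"project finished on day {t:.1f}, final fatigue {fatigue:.2f}")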

  20. Design and Development of a Nanoscale Multi Probe System Using Open Source SPM Controller and GXSM Software: A Tool of Nanotechnology

    CERN Document Server

    Babu, S K Suresh; Moni, D Jackuline; Devaprakasam, D

    2014-01-01

    We report the design, development, installation and troubleshooting of the open-source Gnome X Scanning Microscopy (GXSM) software package for controlling modern Scanning Probe Microscopy (SPM) systems and processing their data, as a development tool for nanotechnology. GXSM is a full-featured analysis tool for the characterization of nanomaterials through different measurement modes such as atomic force microscopy (AFM), scanning tunneling spectroscopy (STS), scanning tunneling microscopy (STM) and nanoindentation. The package comprises the digital signal processing (DSP) and image processing components of the SPM system. A digital signal processor subsystem runs the feedback loop, generates the scanning signals and acquires the data during SPM measurements. With the SR-Hwl plug-in installed, the package was tested in no-hardware mode.

  1. TCV software test and validation tools and technique. [Terminal Configured Vehicle program for commercial transport aircraft operation

    Science.gov (United States)

    Straeter, T. A.; Williams, J. R.

    1976-01-01

    The paper describes techniques for testing and validating software for the TCV (Terminal Configured Vehicle) program, which is intended to solve problems associated with operating a commercial transport aircraft in the terminal area. The TCV research test bed is a Boeing 737 specially configured with digital computer systems to carry out research on automatic navigation, guidance, flight controls, and electronic displays. The techniques developed for time and cost reduction include automatic documentation aids, automatic software configuration, and an all-software generation and validation system.

  2. Pipe dream? Envisioning a grassroots Python ecosystem of open, common software tools and data access in support of river and coastal biogeochemical research (Invited)

    Science.gov (United States)

    Mayorga, E.

    2013-12-01

    Practical, problem oriented software developed by scientists and graduate students in domains lacking a strong software development tradition is often balkanized into the scripting environments provided by dominant, typically proprietary tools. In environmental fields, these tools include ArcGIS, Matlab, SAS, Excel and others, and are often constrained to specific operating systems. While this situation is the outcome of rational choices, it limits the dissemination of useful tools and their integration into loosely coupled frameworks that can meet wider needs and be developed organically by groups addressing their own needs. Open-source dynamic languages offer the advantages of an accessible programming syntax, a wealth of pre-existing libraries, multi-platform access, linkage to community libraries developed in lower level languages such as C or FORTRAN, and access to web service infrastructure. Python in particular has seen a large and increasing uptake in scientific communities, as evidenced by the continued growth of the annual SciPy conference. Ecosystems with distinctive physical structures and organization, and mechanistic processes that are well characterized, are both factors that have often led to the grass-roots development of useful code meeting the needs of a range of communities. In aquatic applications, examples include river and watershed analysis tools (River Tools, Taudem, etc), and geochemical modules such as CO2SYS, PHREEQ and LOADEST. I will review the state of affairs and explore the potential offered by a Python tool ecosystem in supporting aquatic biogeochemistry and water quality research. This potential is multi-faceted and broadly involves accessibility to lone grad students, access to a wide community of programmers and problem solvers via online resources such as StackExchange, and opportunities to leverage broader cyberinfrastructure efforts and tools, including those from widely different domains. Collaborative development of such

  3. Application of a Computer-Aided Software Engineering Tool in the Teaching Practice of a Software Engineering Course

    Institute of Scientific and Technical Information of China (English)

    李智

    2013-01-01

    CASE tools can support software development activities across all phases of the software life cycle. This paper investigates how to effectively apply a self-developed CASE tool for requirements analysis in the teaching practice of an advanced Software Engineering course for master's students. Through a series of case-based applications of this teaching mode, the students' skills and capabilities in analyzing software requirements are improved.

  4. Educational software tool for protection system engineers: distance relay; Herramienta educativa para la formacion de ingenieros en protecciones electricas: relevador de distancia

    Energy Technology Data Exchange (ETDEWEB)

    Trujillo-Guajardo, L.A.; Conde-Enriquez, A. [Universidad Autonoma de Nuevo Leon, Nuevo Leon (Mexico)]. E-mail: luistrujillo84@gmail.com; con_de@yahoo.com

    2012-04-15

    In this article, a graphical software tool for the education of protection system engineers is presented. The theoretical foundations used for the design of the operating characteristics of distance relays, and the relays' algorithms, are presented. The software allows the evaluation and analysis of real-time or simulated events at every stage of the design of the distance relay. Some example cases are presented to illustrate the activities that can be carried out with the graphical software tool developed.

  5. System Software and Tools for High Performance Computing Environments: A report on the findings of the Pasadena Workshop, April 14--16, 1992

    Energy Technology Data Exchange (ETDEWEB)

    Sterling, T. [Universities Space Research Association, Washington, DC (United States); Messina, P. [Jet Propulsion Lab., Pasadena, CA (United States); Chen, M. [Yale Univ., New Haven, CT (United States)] [and others]

    1993-04-01

    The Pasadena Workshop on System Software and Tools for High Performance Computing Environments was held at the Jet Propulsion Laboratory from April 14 through April 16, 1992. The workshop was sponsored by a number of Federal agencies committed to the advancement of high performance computing (HPC) both as a means to advance their respective missions and as a national resource to enhance American productivity and competitiveness. Over a hundred experts in related fields from industry, academia, and government were invited to participate in this effort to assess the current status of software technology in support of HPC systems. The overall objectives of the workshop were to understand the requirements and current limitations of HPC software technology and to contribute to a basis for establishing new directions in research and development for software technology in HPC environments. This report includes reports written by the participants of the workshop's seven working groups. Materials presented at the workshop are reproduced in appendices. Additional chapters summarize the findings and analyze their implications for future directions in HPC software technology development.

  6. SOTEA, a software tool for ascertaining the efficiency potential of electrical drives - Final report; SOTEA, Softwaretool zur Ermittlung des Effizienzpotenzials bei elektrischen Antrieben - Schlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Brunner, C. U. [S.A.F.E. Schweizerische Agentur fuer Energieeffizienz, Zuerich (Switzerland); Heldstab, T. [hematik, Heldstab Systemoptimierung und Informatik, Zuerich (Switzerland)

    2009-08-15

    As a scientific basis for the Swiss electric motor efficiency implementation program Topmotors, a software tool for industry managers has been developed and tested. The software allows an energy efficiency engineer, in a first contact with industrial managers and with only a few simple data on plant operation, to estimate the energy efficiency potential of electric motor systems, including payback and investment. The data can be entered on a laptop computer on site and the results shown immediately. The software was programmed and tested with five prime users; the generally positive reactions were evaluated and the tool subsequently improved. Eleven industrial sites with a total of 77.6 GWh electricity consumption and 7.9 million CHF electricity cost were studied. The SOTEA estimate is an annual efficiency improvement of the electric motor systems of 6.9 GWh (11% of the electricity used by motors) with an average payback time of 1.7 years. The SOTEA software tool has been publicly available since September 2008 at www.topmotors.ch, and since 1 April 2009 in a Beta-2b version. As of 28 June 2009 it had been downloaded 218 times by 132 persons. It will be improved with results from new pilot studies, and to avoid problems with different update versions, a direct internet solution will be studied. The program will also be made available internationally for English-speaking users through the IEA 4E EMSA project: International Energy Agency, Implementing Agreement for Efficient Electrical End-Use Equipment, Electric Motor Systems Annex, www.motorsystems.org. (authors)

  7. Development of a software tool for the management of quality control in a helical tomotherapy unit; Desarrollo de una herramienta de software para la gestion integral del control de calidad en una unidad de tomoterapia helicoidal

    Energy Technology Data Exchange (ETDEWEB)

    Garcia Repiso, S.; Hernandez Rodriguez, J.; Martin Rincon, C.; Ramos Pacho, J. A.; Verde Velasco, J. M.; Delgado Aparacio, J. M.; Perez Alvarez, M. e.; Gomez Gonzalez, N.; Cons Perez, V.; Saez Beltran, M.

    2013-07-01

    The large amount of data and information managed in the quality control tests of an external radiotherapy unit makes it necessary to use tools that facilitate, on the one hand, the management of measurements and results in real time and, on the other, the filing, querying and reporting of stored data. This paper presents an in-house software application used for the integral management of a helical TomoTherapy unit in the aspects related to the roles and responsibilities of hospital radiophysics. (Author)

  8. User Acceptance of a Software Tool for Decision Making in IT Outsourcing: A Qualitative Study in Large Companies from Sweden

    Science.gov (United States)

    Andresen, Christoffer; Hodosi, Georg; Saprykina, Irina; Rusu, Lazar

    Decisions on IT outsourcing are very complex and need to be supported by considerations based on multiple criteria. In order to facilitate the use of a specific tool by decision-makers in IT outsourcing, we need to find out whether such a tool would be accepted or rejected, and what improvements must be added for it to be accepted by IT decision makers in large companies in Sweden.

  9. A Component Approach to Collaborative Scientific Software Development: Tools and Techniques Utilized by the Quantum Chemistry Science Application Partnership

    Directory of Open Access Journals (Sweden)

    Joseph P. Kenny

    2008-01-01

    Full Text Available Cutting-edge scientific computing software is complex, increasingly involving the coupling of multiple packages to combine advanced algorithms or simulations at multiple physical scales. Component-based software engineering (CBSE has been advanced as a technique for managing this complexity, and complex component applications have been created in the quantum chemistry domain, as well as several other simulation areas, using the component model advocated by the Common Component Architecture (CCA Forum. While programming models do indeed enable sound software engineering practices, the selection of programming model is just one building block in a comprehensive approach to large-scale collaborative development which must also address interface and data standardization, and language and package interoperability. We provide an overview of the development approach utilized within the Quantum Chemistry Science Application Partnership, identifying design challenges, describing the techniques which we have adopted to address these challenges and highlighting the advantages which the CCA approach offers for collaborative development.

  10. Evaluation of software tools for automated identification of neuroanatomical structures in quantitative β-amyloid PET imaging to diagnose Alzheimer's disease

    Energy Technology Data Exchange (ETDEWEB)

    Tuszynski, Tobias; Luthardt, Julia; Butzke, Daniel; Tiepolt, Solveig; Seese, Anita; Barthel, Henryk [Leipzig University Medical Centre, Department of Nuclear Medicine, Leipzig (Germany); Rullmann, Michael; Hesse, Swen; Sabri, Osama [Leipzig University Medical Centre, Department of Nuclear Medicine, Leipzig (Germany); Leipzig University Medical Centre, Integrated Treatment and Research Centre (IFB) Adiposity Diseases, Leipzig (Germany); Gertz, Hermann-Josef [Leipzig University Medical Centre, Department of Psychiatry, Leipzig (Germany); Lobsien, Donald [Leipzig University Medical Centre, Department of Neuroradiology, Leipzig (Germany)

    2016-06-15

    For the regional quantification of nuclear brain imaging data, defining volumes of interest (VOIs) by hand is still the gold standard. As this procedure is time-consuming and operator-dependent, a variety of software tools for the automated identification of neuroanatomical structures have been developed. As the quality and performance of those tools in analyzing amyloid PET data have so far been poorly investigated, in this project we compared four algorithms for automated VOI definition (HERMES Brass, two PMOD approaches, and FreeSurfer) against the conventional method. We systematically analyzed florbetaben brain PET and MRI data of ten patients with probable Alzheimer's dementia (AD) and ten age-matched healthy controls (HCs) collected in a previous clinical study. VOIs were defined manually on the data as well as through the four automated workflows. Standardized uptake value ratios (SUVRs) with the cerebellar cortex as a reference region were obtained for each VOI. SUVR comparisons between ADs and HCs were carried out using Mann-Whitney U tests, and effect sizes (Cohen's d) were calculated. SUVRs of automatically generated VOIs were correlated with SUVRs of conventionally derived VOIs (Pearson's tests). The composite neocortex SUVRs obtained from manually defined VOIs were significantly higher for ADs vs. HCs (p=0.010, d=1.53). This was also the case for the four tested automated approaches, which achieved effect sizes of d=1.38 to d=1.62. SUVRs of automatically generated VOIs correlated significantly with those of the hand-drawn VOIs in a number of brain regions, with regional differences in the degree of these correlations. The best overall correlation was observed in the lateral temporal VOI for all tested software tools (r=0.82 to r=0.95, p<0.001). Automated VOI definition by the software tools tested has great potential to substitute for the current standard procedure of manually defining VOIs in β-amyloid PET data analysis. (orig.)
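
    The group comparison reported above (Mann-Whitney U test plus Cohen's d on composite SUVRs) is easy to reproduce on synthetic numbers; the SUVR values below are randomly generated stand-ins, not the study's data.

        import numpy as np
        from scipy.stats import mannwhitneyu

        rng = np.random.default_rng(5)
        suvr_ad = rng.normal(1.6, 0.2, 10)   # hypothetical AD patients
        suvr_hc = rng.normal(1.2, 0.2, 10)   # hypothetical healthy controls

        u, p = mannwhitneyu(suvr_ad, suvr_hc, alternative="two-sided")

        # Cohen's d with a pooled standard deviation
        pooled = np.sqrt((suvr_ad.var(ddof=1) + suvr_hc.var(ddof=1)) / 2)
        d = (suvr_ad.mean() - suvr_hc.mean()) / pooled
        print(f"Mann-Whitney p = {p:.4f}, Cohen's d = {d:.2f}")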

  11. Clinical evaluation of a dose monitoring software tool based on Monte Carlo Simulation in assessment of eye lens doses for cranial CT scans

    Energy Technology Data Exchange (ETDEWEB)

    Guberina, Nika; Suntharalingam, Saravanabavaan; Nassenstein, Kai; Forsting, Michael; Theysohn, Jens; Wetter, Axel; Ringelstein, Adrian [University Hospital Essen, Institute of Diagnostic and Interventional Radiology and Neuroradiology, Essen (Germany)

    2016-10-15

    The aim of this study was to verify the results of a dose monitoring software tool based on Monte Carlo Simulation (MCS) in assessment of eye lens doses for cranial CT scans. In cooperation with the Federal Office for Radiation Protection (Neuherberg, Germany), phantom measurements were performed with thermoluminescence dosimeters (TLD LiF:Mg,Ti) using cranial CT protocols: (I) CT angiography; (II) unenhanced, cranial CT scans with gantry angulation at a single and (III) without gantry angulation at a dual source CT scanner. Eye lens doses calculated by the dose monitoring tool based on MCS and assessed with TLDs were compared. Eye lens doses are summarized as follows: (I) CT angiography (a) MCS 7 mSv, (b) TLD 5 mSv; (II) unenhanced, cranial CT scan with gantry angulation, (c) MCS 45 mSv, (d) TLD 5 mSv; (III) unenhanced, cranial CT scan without gantry angulation (e) MCS 38 mSv, (f) TLD 35 mSv. Intermodality comparison shows an inaccurate calculation of eye lens doses in unenhanced cranial CT protocols at the single source CT scanner due to the disregard of gantry angulation. On the contrary, the dose monitoring tool showed an accurate calculation of eye lens doses at the dual source CT scanner without gantry angulation and for CT angiography examinations. The dose monitoring software tool based on MCS gave accurate estimates of eye lens doses in cranial CT protocols. However, knowledge of protocol and software specific influences is crucial for correct assessment of eye lens doses in routine clinical use. (orig.)

  12. TOWARD DEVELOPMENT OF A COMMON SOFTWARE APPLICATION PROGRAMMING INTERFACE (API) FOR UNCERTAINTY, SENSITIVITY, AND PARAMETER ESTIMATION METHODS AND TOOLS

    Science.gov (United States)

    The final session of the workshop considered the subject of software technology and how it might be better constructed to support those who develop, evaluate, and apply multimedia environmental models. Two invited presentations were featured along with an extended open discussion.

  13. Software distribution using xnetlib

    Energy Technology Data Exchange (ETDEWEB)

    Dongarra, J.J. [Univ. of Tennessee, Knoxville, TN (US). Dept. of Computer Science; Oak Ridge National Lab., TN (US)]; Rowan, T.H. [Oak Ridge National Lab., TN (US)]; Wade, R.C. [Univ. of Tennessee, Knoxville, TN (US). Dept. of Computer Science]

    1993-06-01

    Xnetlib is a new tool for software distribution. Whereas its predecessor netlib uses e-mail as the user interface to its large collection of public-domain mathematical software, xnetlib uses an X Window interface and socket-based communication. Xnetlib makes it easy to search through a large distributed collection of software and to retrieve requested software in seconds.

  14. Decision peptide-driven: a free software tool for accurate protein quantification using gel electrophoresis and matrix assisted laser desorption ionization time of flight mass spectrometry.

    Science.gov (United States)

    Santos, Hugo M; Reboiro-Jato, Miguel; Glez-Peña, Daniel; Nunes-Miranda, J D; Fdez-Riverola, Florentino; Carvallo, R; Capelo, J L

    2010-09-15

    The decision peptide-driven (DPD) tool implements a software application for assisting the user in a protocol for accurate protein quantification based on the following steps: (1) protein separation through gel electrophoresis; (2) in-gel protein digestion; (3) direct and inverse (18)O-labeling; and (4) matrix-assisted laser desorption ionization time-of-flight (MALDI) mass spectrometry analysis. The DPD software compares the MALDI results of the direct and inverse (18)O-labeling experiments and quickly identifies those peptides with paralleled losses in different sets of a typical proteomic workflow. Those peptides are used for subsequent accurate protein quantification. The interpretation of the MALDI data from direct and inverse labeling experiments is time-consuming, requiring a significant amount of time to do all comparisons manually. The DPD software shortens and simplifies the search for the peptides that must be used for quantification from a week to just minutes. To do so, it takes several MALDI spectra as input and aids the researcher in an automatic mode (i) to compare data from direct and inverse (18)O-labeling experiments, calculating the corresponding ratios to determine those peptides with paralleled losses throughout different sets of experiments; and (ii) to use those peptides as internal standards for subsequent accurate protein quantification using (18)O-labeling. In this work the DPD software is presented and illustrated with the quantification of the protein carbonic anhydrase.
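
    The core of the comparison step can be sketched in a few lines. Assuming, as a simplification of the DPD workflow, that a well-behaved peptide's direct and inverse labeling ratios should be roughly reciprocal, peptides whose product of ratios stays near 1 qualify as internal standards; all peptide names, ratios and the tolerance below are invented:

```python
# Minimal sketch of the direct/inverse 18O-labeling comparison described
# above. DPD itself works from full MALDI spectra; here the per-peptide
# light/heavy ratios are assumed to be already extracted.
direct = {"PEP1": 1.02, "PEP2": 0.55, "PEP3": 0.98}   # direct labeling
inverse = {"PEP1": 0.97, "PEP2": 0.60, "PEP3": 1.05}  # labels swapped

TOL = 0.15  # hypothetical tolerance on the product of the two ratios

# With the labels swapped, a consistent peptide's ratios should be
# reciprocal, i.e. direct * inverse ~ 1; others show unparalleled losses.
internal_standards = [p for p in direct
                      if abs(direct[p] * inverse[p] - 1.0) <= TOL]
print(internal_standards)  # -> ['PEP1', 'PEP3']
```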

  15. The effects of text-based and graphics-based software tools on planning and organizing of stories.

    Science.gov (United States)

    Bahr, C M; Nelson, N W; Van Meter, A M

    1996-07-01

    This article describes a research study comparing the effects of two computer-based writing tools on the story-writing skills of fourth- through eighth-grade students with language-related learning disabilities. The first tool, the prompted writing feature of FrEdWriter (Rogers, 1985), allowed students to answer story grammar questions, then type stories using those responses as the plan; the second tool, Once Upon a Time (Urban, Rushing, & Star, 1990), allowed students to create graphic scenes, then type stories about those scenes. Nine students attended a series of after-school writing labs twice weekly for 11 weeks, using each tool for half of the writing sessions. Group results did not clearly favor either tool; however, individual differences suggested that use of planning features should be linked to student needs. Students who had less internal organizational ability benefited from the computer-presented story grammar prompts and wrote less mature stories when using the graphics-based tool. Students with relatively strong organizational skills wrote more mature stories with the graphics-based tool.

  16. Research on Tool Software Application Based on an Electroacoustic Technology Curriculum

    Institute of Scientific and Technical Information of China (English)

    徐洊学

    2015-01-01

    In the teaching of an electroacoustic technology course, some specialized content cannot be fully demonstrated through multimedia courseware because of the courseware's inherent limitations. Professional tool software and screen-writing software can be used to assist classroom teaching and make up for these shortcomings of multimedia courseware.

  17. The perfect neuroimaging-genetics-computation storm: collision of petabytes of data, millions of hardware devices and thousands of software tools.

    Science.gov (United States)

    Dinov, Ivo D; Petrosyan, Petros; Liu, Zhizhong; Eggert, Paul; Zamanyan, Alen; Torri, Federica; Macciardi, Fabio; Hobel, Sam; Moon, Seok Woo; Sung, Young Hee; Jiang, Zhiguo; Labus, Jennifer; Kurth, Florian; Ashe-McNalley, Cody; Mayer, Emeran; Vespa, Paul M; Van Horn, John D; Toga, Arthur W

    2014-06-01

    The volume, diversity and velocity of biomedical data are increasing exponentially, providing petabytes of new neuroimaging and genetics data every year. At the same time, tens of thousands of computational algorithms are developed and reported in the literature, along with thousands of software tools and services. Users demand intuitive, quick and platform-agnostic access to data, software tools, and infrastructure from millions of hardware devices. This explosion of information, scientific techniques, computational models, and technological advances leads to enormous challenges in data analysis, evidence-based biomedical inference and reproducibility of findings. The Pipeline workflow environment provides a crowd-based distributed solution for the consistent management of these heterogeneous resources. The Pipeline allows multiple (local) clients and (remote) servers to connect, exchange protocols, control execution, monitor the states of different tools or hardware, and share complete protocols as portable XML workflows. In this paper, we demonstrate several advanced computational neuroimaging and genetics case studies and end-to-end pipeline solutions. These are implemented as graphical workflow protocols in the context of analyzing imaging (sMRI, fMRI, DTI), phenotypic (demographic, clinical), and genetic (SNP) data.

  18. Man versus Machine: Software Training for Surgeons-An Objective Evaluation of Human and Computer-Based Training Tools for Cataract Surgical Performance.

    Science.gov (United States)

    Din, Nizar; Smith, Phillip; Emeriewen, Krisztina; Sharma, Anant; Jones, Simon; Wawrzynski, James; Tang, Hongying; Sullivan, Paul; Caputo, Silvestro; Saleh, George M

    2016-01-01

    This study aimed to address two questions: first, the relationship between two cataract surgical feedback tools for training, one human and one software based; and second, the evaluation of microscope control during phacoemulsification using the software. Videos of surgeons with varying experience were enrolled and independently scored with the validated PhacoTrack motion capture software and the Objective Structured Assessment of Cataract Surgical Skill (OSACCS) human scoring tool. Microscope centration and path length travelled were also evaluated with the PhacoTrack software. Twenty-two videos were used to correlate PhacoTrack motion capture with OSACCS. The PhacoTrack path length, number of movements, and total procedure time were found to have high levels of Spearman's rank correlation with OSACCS: -0.6792619 (p = 0.001), -0.6652021 (p = 0.002), and -0.771529 (p = 0.001), respectively. Sixty-two videos were used to evaluate microscope camera control. Novice surgeons had their camera off the pupil centre at a far greater mean (SD) distance of 6.9 (3.3) mm, compared with 3.6 (1.6) mm for experts (p < 0.05). The expert surgeons maintained good microscope camera control and limited the total pupil path length travelled to 2512 (1031) mm, compared with 4049 (2709) mm for novices (p < 0.05). Good agreement exists between human and machine quantified measurements of surgical skill. Our results demonstrate that surrogate markers for camera control are predictors of surgical skills.
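
    The correlation analysis reported here is a standard Spearman rank test between the machine metric and the human score. A minimal sketch with invented per-video values (the study's actual data are not reproduced):

```python
import numpy as np
from scipy import stats

# Hypothetical per-video metrics: higher OSACCS score = more skilled,
# and skilled surgeons tend to travel a shorter instrument path.
osaccs_score = np.array([12, 18, 25, 30, 34, 38, 41, 45])            # human rating
path_length = np.array([5200, 4700, 4100, 3500, 3100, 2800, 2500, 2300])  # mm

rho, p = stats.spearmanr(osaccs_score, path_length)
print(f"Spearman rho = {rho:.2f} (p = {p:.4f})")  # negative: shorter path, higher skill
```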

  19. The International Atomic Energy Agency software package for the analysis of scintigraphic renal dynamic studies: a tool for the clinician, teacher, and researcher.

    Science.gov (United States)

    Zaknun, John J; Rajabi, Hossein; Piepsz, Amy; Roca, Isabel; Dondi, Maurizio

    2011-01-01

    Under the auspices of the International Atomic Energy Agency, a new-generation, platform-independent, x86-compatible software package was developed for the analysis of scintigraphic renal dynamic imaging studies. It provides nuclear medicine professionals with cost-free access to the most recent developments in the field and is a step toward harmonization and standardization. Embedded functionalities render it a suitable tool for education, for research, and for obtaining remote expert opinions. Another objective of this effort is to introduce clinically useful parameters of drainage, including normalized residual activity and outflow efficiency. Furthermore, the package provides an effective teaching tool for young professionals who are being introduced to dynamic kidney studies, through selected teaching case studies. The software facilitates a better understanding by letting users practically explore different variables and settings and their effect on the numerical results. An effort was made to introduce instruments of quality assurance at the various levels of the program's execution, including visual inspection, automatic detection and correction of patient motion, automatic placement of regions of interest around the kidneys and cortical regions, and placement of a reproducible background region on both the primary dynamic and the postmicturition studies. The user can calculate the differential renal function through two independent methods, the integral or the Rutland-Patlak approach. Standardized digital reports, storage and retrieval of regions of interest, and built-in database operations allow the generation and tracing of full image reports and numerical outputs. The software package is undergoing quality assurance procedures to verify its accuracy and interuser reproducibility, with the final aim of launching the program for use by professionals and teaching institutions worldwide.
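
    The Rutland-Patlak approach mentioned above fits kidney counts against the cumulative blood-pool input; the fitted slope acts as an uptake index from which differential renal function follows. A minimal sketch of that standard formulation with illustrative curves (not the package's actual implementation):

```python
import numpy as np

def patlak_slope(kidney, blood, t):
    """Rutland-Patlak fit: kidney(t)/blood(t) against the cumulative
    integral of blood(t) divided by blood(t); the slope estimates the
    kidney's uptake rate."""
    x = np.array([np.trapz(blood[:i + 1], t[:i + 1])
                  for i in range(len(t))]) / blood
    y = kidney / blood
    slope, _ = np.polyfit(x, y, 1)
    return slope

def differential_renal_function(left, right, blood, t):
    kl, kr = patlak_slope(left, blood, t), patlak_slope(right, blood, t)
    return kl / (kl + kr), kr / (kl + kr)

t = np.linspace(0, 20, 60)                 # minutes
blood = 100 * np.exp(-0.1 * t) + 5         # illustrative blood-pool curve
left = 40 * (1 - np.exp(-0.2 * t))         # illustrative renograms
right = 20 * (1 - np.exp(-0.2 * t))
print(differential_renal_function(left, right, blood, t))  # ~ (0.67, 0.33)
```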

  20. MieLab: A Software Tool to Perform Calculations on the Scattering of Electromagnetic Waves by Multilayered Spheres

    Directory of Open Access Journals (Sweden)

    Ovidio Peña-Rodríguez

    2011-01-01

    Full Text Available In this paper, we present MieLab, a free computational package for simulating the scattering of electromagnetic radiation by multilayered spheres or by an ensemble of particles with a normal size distribution. It has been designed as a virtual laboratory, including a friendly graphical user interface (GUI), an optimization algorithm (to fit the simulations to experimental results) and scripting capabilities. The paper is structured in five sections: the introduction offers a perspective on the importance of the software for the study of light scattering. In the second section, various approaches used for modeling the scattering of electromagnetic radiation by small particles are discussed. The third and fourth sections provide an overview of MieLab and describe the main features of its architectural model and functional behavior, respectively. Finally, several examples are provided to illustrate the main characteristics of the software.
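
    To illustrate the ensemble part of such a simulation, the sketch below averages a single-particle scattering efficiency over a normal size distribution. The per-particle model is the simple Rayleigh-limit formula, standing in for the full multilayer Mie solution that MieLab implements; all parameter values are arbitrary:

```python
import numpy as np

def q_sca_rayleigh(radius_nm, wavelength_nm, m=1.5 + 0j):
    """Toy single-sphere scattering efficiency (Rayleigh limit).
    MieLab replaces this with full Mie theory for multilayered spheres."""
    x = 2 * np.pi * radius_nm / wavelength_nm          # size parameter
    return (8 / 3) * x**4 * abs((m**2 - 1) / (m**2 + 2))**2

def ensemble_average(mean_r, sigma_r, wavelength_nm, n=2001):
    """Average the efficiency over a normal size distribution,
    as done for an ensemble of particles."""
    r = np.linspace(max(mean_r - 4 * sigma_r, 1e-3), mean_r + 4 * sigma_r, n)
    w = np.exp(-0.5 * ((r - mean_r) / sigma_r) ** 2)   # Gaussian weights
    return np.sum(w * q_sca_rayleigh(r, wavelength_nm)) / np.sum(w)

print(ensemble_average(mean_r=20.0, sigma_r=3.0, wavelength_nm=550.0))
```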

  1. Paladin Software Support Lab

    Data.gov (United States)

    Federal Laboratory Consortium — The Paladin Software Support Environment (SSE) occupies 2,241 square feet. It contains the hardware and software tools required to support the Paladin Automatic Fire...

  2. CoCoTools: open-source software for building connectomes using the CoCoMac anatomical database.

    Science.gov (United States)

    Blumenfeld, Robert S; Bliss, Daniel P; Perez, Fernando; D'Esposito, Mark

    2014-04-01

    Neuroanatomical tracer studies in the nonhuman primate macaque monkey are a valuable resource for cognitive neuroscience research. These data ground theories of cognitive function in anatomy, and with the emergence of graph theoretical analyses in neuroscience, there is high demand for these data to be consolidated into large-scale connection matrices ("macroconnectomes"). Because manual review of the anatomical literature is time consuming and error prone, computational solutions are needed to accomplish this task. Here we describe the "CoCoTools" open-source Python library, which automates collection and integration of macaque connectivity data for visualization and graph theory analysis. CoCoTools both interfaces with the CoCoMac database, which houses a vast amount of annotated tracer results from 100 years (1905-2005) of neuroanatomical research, and implements coordinate-free registration algorithms, which allow studies that use different parcellations of the brain to be translated into a single graph. We show that using CoCoTools to translate all of the data stored in CoCoMac produces graphs with properties consistent with what is known about global brain organization. Moreover, in addition to describing CoCoTools' processing pipeline, we provide worked examples, tutorials, links to on-line documentation, and detailed appendices to aid scientists interested in using CoCoTools to gather and analyze CoCoMac data.

  3. A COMPETITIVE MODEL BASED IN FREE SOFTWARE TOOLS FOR THE TECHNOLOGICAL MANAGEMENT OF ORGANIZATIONS - THE PROMOTION OF THE CORPORATIVE KNOWLEDGE AND THE TECHNOLOGICAL INNOVATION IN A TECHNOLOGICAL UNDERGRADUATE COURSE

    OpenAIRE

    Rubens Araújo de Oliveira; Mário Lucio Roloff

    2007-01-01

    This article presents the thematic of the technological management, where the research is focused on the choice of the best technological free software tools for the promotion of the knowledge management. This article evidences the hypothesis that it is possible to adopt the knowledge management with the union and customization of the free software tools. In such a way, any organization can act in the technological management and apply the politics of knowledge management, either to a micro-c...

  4. Software systems for astronomy

    CERN Document Server

    Conrad, Albert R

    2014-01-01

    This book covers the use and development of software for astronomy. It describes the control systems used to point the telescope and operate its cameras and spectrographs, as well as the web-based tools used to plan those observations. In addition, the book also covers the analysis and archiving of astronomical data once it has been acquired. Readers will learn about existing software tools and packages, develop their own software tools, and analyze real data sets.

  5. Intelligent Software Agents as tools for managing ethical issues in organisations caused by the introduction of new Information Technology

    DEFF Research Database (Denmark)

    Abolfazlian, Ali Reza Kian

    1996-01-01

    This article describes how employees' values in and for organisations develop together with the technological tools they use to do their jobs. Against this background, it describes some of the ethical problems that arise as a consequence of the introduction of new information technology in organisations, and how Intelligent Software Agents (ISAs) can actively help managers overcome these problems.

  6. DrCell – A Software Tool for the Analysis of Cell Signals Recorded with Extracellular Microelectrodes

    Directory of Open Access Journals (Sweden)

    Christoph Nick

    2013-09-01

    Full Text Available Microelectrode arrays (MEAs) have been applied for in vivo and in vitro recording and stimulation of electrogenic cells, namely neurons and cardiac myocytes, for almost four decades. Extracellular recordings using the MEA technique inflict minimal adverse effects on cells and enable long-term applications such as implants in brain or heart tissue. Hence, MEAs are a powerful tool for studying the processes of learning and memory, investigating the pharmacological impact of drugs, and probing the fundamentals of the electrical interface between novel electrode materials and biological tissue. Yet in order to study these areas, powerful signal processing and data analysis tools are necessary. In this paper a novel toolbox for the offline analysis of cell signals is presented that allows a variety of parameters to be detected and analyzed. We developed an intuitive graphical user interface (GUI) that enables users to perform high-quality data analysis. The presented MATLAB®-based toolbox gives the opportunity to examine a multitude of parameters, such as spike and neural burst timestamps, network bursts, heart beat frequency and signal propagation for cardiomyocytes, signal-to-noise ratio, and many more. Additionally, a spike-sorting tool is included, offering a powerful instrument for cases of multiple cell recordings on a single microelectrode. For stimulation purposes, artifacts caused by the stimulation signal can be removed from the recording, allowing the detection of field potentials as early as 5 ms after stimulation.

  7. Sleep scoring made easy-Semi-automated sleep analysis software and manual rescoring tools for basic sleep research in mice.

    Science.gov (United States)

    Kreuzer, M; Polta, S; Gapp, J; Schuler, C; Kochs, E F; Fenzl, T

    2015-01-01

    Studying sleep behavior in animal models demands a clear separation of vigilance states. Purely manual scoring is time-consuming and commercial scoring software is costly. We present a LabVIEW-based, semi-automated scoring routine using recorded EEG and EMG signals. This scoring routine is designed to reliably assign the vigilance/sleep states wakefulness (WAKE), non-rapid eye movement sleep (NREMS) and rapid eye movement sleep (REMS) to defined EEG/EMG episodes; it is straightforward to use even for beginners in the field of sleep research; and it is freely available upon request. Chronic recordings from mice were used to design and evaluate the scoring routine, which consists of an artifact-removal routine, a scoring routine and a rescoring routine. The scoring routine processes the EMG and different EEG frequency bands. Amplitude-based thresholds for EEG and EMG parameters trigger a decision tree that assigns each EEG episode to a defined vigilance/sleep state automatically. Using the rescoring routine, individual episodes or particular state transitions can be re-evaluated manually. High agreement between auto-scored and manual sleep scoring was shown quickly and reliably, for experienced scorers as well as for beginners. With small modifications to the software, it can easily be adapted for sleep analysis in other animal models.
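
    The decision-tree idea is simple enough to show in a few lines. The sketch below assigns an epoch using an EMG amplitude threshold and a theta/delta power ratio; the features and threshold values are invented placeholders, not the routine's calibrated parameters:

```python
def score_epoch(emg_rms, delta_power, theta_power,
                emg_thr=50.0, ratio_thr=1.5):
    """Assign one EEG/EMG epoch to a vigilance state with simple
    amplitude thresholds, in the spirit of the decision tree described
    above. Thresholds would be calibrated per animal in a real routine."""
    if emg_rms > emg_thr:
        return "WAKE"                     # high muscle tone
    if theta_power / delta_power > ratio_thr:
        return "REMS"                     # theta-dominated EEG with atonia
    return "NREMS"                        # delta-dominated EEG with atonia

# (EMG RMS, delta power, theta power) for three hypothetical epochs
epochs = [(80.0, 10.0, 8.0), (12.0, 30.0, 9.0), (10.0, 8.0, 20.0)]
print([score_epoch(*e) for e in epochs])  # -> ['WAKE', 'NREMS', 'REMS']
```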

  8. Influence of Software Tool and Methodological Aspects of Total Metabolic Tumor Volume Calculation on Baseline [18F]FDG PET to Predict Survival in Hodgkin Lymphoma.

    Directory of Open Access Journals (Sweden)

    Salim Kanoun

    Full Text Available To investigate the respective influence of software tool and total metabolic tumor volume (TMTV0) calculation method on the prognostic stratification of baseline 2-deoxy-2-[18F]fluoro-D-glucose positron emission tomography ([18F]FDG-PET) in newly diagnosed Hodgkin lymphoma (HL). 59 patients with newly diagnosed HL were retrospectively included. [18F]FDG-PET was performed before any treatment. Four sets of TMTV0 were calculated with Beth Israel (BI) software: based on an absolute threshold selecting voxels with standardized uptake value (SUV) >2.5 (TMTV02.5), applying a per-lesion threshold of 41% of the SUVmax (TMTV041), and using per-patient adapted thresholds based on the SUVmax of the liver (>125% and >140% of the liver background SUVmax; TMTV0125 and TMTV0140). TMTV041 was also determined with commercial software for comparison of software tools. ROC curves were used to determine the optimal threshold for each TMTV0 to predict treatment failure. Median follow-up was 39 months. There was an excellent correlation between TMTV041 determined with BI and with the commercial software (r = 0.96, p<0.0001). The median values for TMTV041, TMTV02.5, TMTV0125 and TMTV0140 were respectively 160 (used as reference), 210 ([28;154], p = 0.005), 183 ([-4;114], p = 0.06) and 143 ml ([-58;64], p = 0.9). The respective optimal TMTV0 threshold and area under the curve (AUC) for prediction of progression-free survival (PFS) were: 313 ml and 0.70, 432 ml and 0.68, 450 ml and 0.68, 330 ml and 0.68. There was no significant difference between ROC curves. A high TMTV0 value was predictive of poor PFS in all methodologies: 4-year PFS was 83% vs. 42% (p = 0.006) for TMTV02.5, 83% vs. 41% (p = 0.003) for TMTV041, 85% vs. 40% (p<0.001) for TMTV0125 and 83% vs. 42% (p = 0.004) for TMTV0140. In newly diagnosed HL, baseline metabolic tumor volume values were significantly influenced by the choice of the method used for determination of volume. However, no significant differences were found
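
    Each of the four thresholding schemes reduces to a mask over an SUV image. A minimal sketch of the volume calculation (lesion segmentation, which the real tools handle, is out of scope, and the data below are random):

```python
import numpy as np

def tmtv(suv, voxel_volume_ml, method="41pct", liver_suv_max=None):
    """Total metabolic tumor volume under the thresholding schemes
    described above. `suv` is assumed to be an SUV array already
    restricted to the lesions (the 41% rule is per-lesion in the study)."""
    if method == "2.5":
        mask = suv > 2.5                          # absolute SUV threshold
    elif method == "41pct":
        mask = suv > 0.41 * suv.max()             # %SUVmax threshold
    elif method == "125pct_liver":
        mask = suv > 1.25 * liver_suv_max         # liver-adapted threshold
    elif method == "140pct_liver":
        mask = suv > 1.40 * liver_suv_max
    else:
        raise ValueError(method)
    return mask.sum() * voxel_volume_ml

suv = np.random.default_rng(0).gamma(2.0, 1.5, size=(40, 40, 40))  # fake data
print(tmtv(suv, voxel_volume_ml=0.064, method="2.5"))
```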

  9. A software tool to estimate the dynamic behaviour of the IP{sup 2}C samples as sensors for didactic purposes

    Energy Technology Data Exchange (ETDEWEB)

    Graziani, S.; Pagano, F.; Pitrone, N.; Umana, E., E-mail: nicola.pitrone@diees.unict.i [Dipartimento di Ingegneria Elettrica Elettronica e dei Sistemi -University of Catania V.le A. Doria 6, 95125, Catania (Italy)

    2010-07-01

    Ionic Polymer Polymer Composites (IP{sup 2}Cs) are emerging materials used to realize motion actuators and sensors. In the former case a voltage input is able to cause the membrane to bend, while in the latter case a voltage output is obtained by bending an IP{sup 2}C membrane. In this paper the authors introduce a software tool able to estimate the dynamic behaviour of sensors based on IP{sup 2}Cs working in air. In the proposed tool, the geometrical quantities that rule the sensing properties of IP{sup 2}C-based transducers are taken into account together with their dynamic characteristics. A graphical user interface (GUI) has been developed in order to give a useful tool that allows the user to understand the behaviour and the role of the parameters involved in the transduction phenomena. The tool is based on the idea that a graphical user interface will allow persons not skilled in IP{sup 2}C materials to observe their behaviour and to analyze their characteristics. This could greatly increase the interest of researchers towards this new class of transducers; moreover, it can support the educational activity of students involved in advanced academic courses.

  10. Virtual chromoendoscopy can be a useful software tool in capsule endoscopy

    Directory of Open Access Journals (Sweden)

    Gabriela Duque

    2012-05-01

    Full Text Available Background: capsule endoscopy (CE) has revolutionized the study of the small bowel. One major drawback of this technique is that we cannot interfere with the image acquisition process. Therefore, the development of new software tools that could modify the images and increase both detection and diagnosis of small-bowel lesions would be very useful. The Flexible Spectral Imaging Color Enhancement (FICE) system, which allows for virtual chromoendoscopy, is one of these software tools. Aims: to evaluate the reproducibility and diagnostic accuracy of the FICE system in CE. Methods: this prospective study involved 20 patients. First, four physicians interpreted 150 static FICE images and the overall agreement between them was determined using the Fleiss kappa test. Second, two experienced gastroenterologists, blinded to each other's results, analyzed the complete 20 video streams. One interpreted conventional capsule videos and the other the CE-FICE videos at setting 2. All findings were reported, regardless of their clinical value. Non-concordant findings between both interpretations were analyzed by a consensus panel of four gastroenterologists who reached a final result (positive or negative finding). Results: in the first arm of the study the overall concordance between the four gastroenterologists was substantial (0.650). In the second arm, the conventional mode identified 75 findings and the CE-FICE mode 95. The CE-FICE mode did not miss any lesions identified by the conventional mode and allowed the identification of a higher number of angiodysplasias (35 vs. 32) and erosions (41 vs. 24). Conclusions: there is reproducibility in the interpretation of CE-FICE images between different observers experienced in conventional CE. The use of virtual chromoendoscopy in CE seems to increase its diagnostic accuracy by highlighting small-bowel erosions and angiodysplasias that were not identified by the conventional mode.

  11. Implementation and testing of a fault detection software tool for improving control system performance in a large commercial building

    Energy Technology Data Exchange (ETDEWEB)

    Salsbury, T.I.; Diamond, R.C.

    2000-05-01

    This paper describes a model-based, feedforward control scheme that can detect faults in the controlled process and improve control performance over traditional PID control. The tool uses static simulation models of the system under control to generate feedforward control action, which acts as a reference of correct operation. Faults that occur in the system cause discrepancies between the feedforward models and the controlled process. The scheme facilitates the detection of faults by monitoring the level of these discrepancies. We present results from the first phase of tests on a dual-duct air-handling unit installed in a large office building in San Francisco. We demonstrate the ability of the tool to detect a number of preexisting faults in the system and discuss practical issues related to implementation.
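
    The detection principle reduces to monitoring a residual between the feedforward model and the measurement. A minimal sketch, with invented temperatures, threshold and window length rather than the paper's tuning:

```python
import numpy as np

def detect_fault(measured, model_predicted, threshold, window=12):
    """Flag a fault when the discrepancy between the feedforward model
    (the reference of correct operation) and the measured process output
    stays above a threshold over a whole window of samples."""
    residual = np.abs(np.asarray(measured) - np.asarray(model_predicted))
    smoothed = np.convolve(residual, np.ones(window) / window, mode="valid")
    return bool(np.any(smoothed > threshold)), float(smoothed.max())

# Hypothetical supply-air temperatures drifting away from the model (deg C)
supply_temp_meas = np.array([13.0, 13.2, 14.1, 15.0, 15.8, 16.5] * 4)
supply_temp_model = np.full_like(supply_temp_meas, 13.0)
print(detect_fault(supply_temp_meas, supply_temp_model, threshold=1.5))
```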

  12. Development of a software tool for evaluating driving assistance systems; Entwicklung eines Softwaretools zur Bewertung von Fahrerassistenzsystemen

    Energy Technology Data Exchange (ETDEWEB)

    Marstaller, R.; Bubb, H. [Technische Universitaet Muenchen (Germany). Lehrstuhl fuer Ergonomie

    2002-07-01

    The increase in road safety in Germany is indicated, for example, by the declining number of people seriously injured or killed (/6/) despite growing numbers of cars and kilometres driven. The targeted measures rest on four points: improvement of active safety, improvement of passive safety, and direct and indirect psychological measures. In developing systems that assist drivers at the guidance level, the question of the safety of these measures arises more and more. This led to the development of software that contains a so-called normative driver model and compares actual driving data with this model. Situations that deviate from the situational normative model can thereby be identified and consequently classified as critical. Practical application to driving data from active assistance systems with longitudinal and lateral control showed a significant improvement in driving safety compared to data recorded without system usage. (orig.)

  13. Sleep scoring made easy—Semi-automated sleep analysis software and manual rescoring tools for basic sleep research in mice

    Directory of Open Access Journals (Sweden)

    M. Kreuzer

    2015-01-01

    Chronic recordings from mice were used to design and evaluate the scoring routine consisting of an artifact-removal, a scoring- and a rescoring routine. The scoring routine processes EMG and different EEG frequency bands. Amplitude-based thresholds for EEG and EMG parameters trigger a decision tree assigning each EEG episode to a defined vigilance/sleep state automatically. Using the rescoring routine individual episodes or particular state transitions can be re-evaluated manually. High agreements between auto-scored and manual sleep scoring could be shown for experienced scorers and for beginners quickly and reliably. With small modifications to the software, it can be easily adapted for sleep analysis in other animal models.

  14. Inter-operability and integration of industrial software tools; Interoperabilite et integration en ligne de logiciels industriels

    Energy Technology Data Exchange (ETDEWEB)

    Matania, R. [Gensym SA, 92 - Nanterre (France)

    2005-07-15

    During the eighties and early nineties, a multitude of software applications became available that addressed specific industry needs. These applications generally had very limited scope and operated independently. They were either unable to communicate with other applications or, in order to share data and resources, employed cumbersome and costly techniques such as file exchange or ad hoc application-to-application bridges. The consequences were that useful information was not always available where it was most needed, information was duplicated, and scaling applications was difficult and costly. This article first presents the trend from traditional methods of application inter-operability to the more recent messaging techniques, with the example of the recent European CHEM project. Second, it focuses on Object Linking and Embedding for Process Control (OPC) as one of the newer industry standards for communication. (author)

  15. For goodness' sake : new software tool helps companies use the Internet to channel charitable activity

    Energy Technology Data Exchange (ETDEWEB)

    McKenzie-Brown, P.

    2010-12-15

    This article discussed a new computer application that has the potential to change how companies handle their community investment. The Goodness 3.0 tool can be integrated into a number of pages on a company's web site, such as the human resources Intranet page or the customer reward page on the web. Users can then make direct contributions to nearly any certified charity in North America. The user can select a group of charities, allocate percentages among them, and establish whether the donation is a one-time or a recurring event. Matching donations take place in real time. Being a good corporate citizen helps attract and retain employees. This program allows the company to focus on the charitable interests of the employees, and it empowers employees to have direct involvement in where their money goes. Integrated companies can use the tool to strengthen their consumer brands. At present, most large corporations use an online grant management system to process grant applications from not-for-profit organizations. This type of system is practical for processing applications, but it fails to build relationships.

  16. 'nparACT' package for R: A free software tool for the non-parametric analysis of actigraphy data.

    Science.gov (United States)

    Blume, Christine; Santhi, Nayantara; Schabus, Manuel

    2016-01-01

    For many studies, participants' sleep-wake patterns are monitored and recorded prior to, during and following an experimental or clinical intervention using actigraphy, i.e. the recording of data generated by movements. Often, these data are merely inspected visually without computation of descriptive parameters, in part due to the lack of user-friendly software. To address this deficit, we developed a package for R [R Core Team, 6] that allows several non-parametric measures to be computed from actigraphy data. Specifically, it computes the interdaily stability (IS), intradaily variability (IV) and relative amplitude (RA) of activity and gives the start times and average activity values of M10 (i.e. the ten hours with maximal activity) and L5 (i.e. the five hours with least activity). Two functions compute these 'classical' parameters and handle either single or multiple files. Two other functions additionally allow computing an L-value (i.e. the least activity value) for a user-defined time span, termed the 'Lflex' value. A plotting option is included in all functions. The package can be downloaded from the Comprehensive R Archive Network (CRAN). •The package 'nparACT' for R serves the non-parametric analysis of actigraphy data.•Computed parameters include interdaily stability (IS), intradaily variability (IV) and relative amplitude (RA), as well as start times and average activity during the 10 h with maximal and the 5 h with minimal activity (i.e. M10 and L5).
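
    nparACT is an R package; for consistency with the other sketches in this listing, the same IS/IV arithmetic is shown in Python below, following the standard non-parametric definitions (hourly bins assumed; the sample series is synthetic):

```python
import numpy as np

def is_iv(activity, per_day=24):
    """Interdaily stability (IS) and intradaily variability (IV) from an
    activity series binned so that `per_day` samples make one day.
    IS = (N * sum_h (mean_h - mean)^2) / (p * sum_i (x_i - mean)^2)
    IV = (N * sum_i (x_i - x_{i-1})^2) / ((N - 1) * sum_i (x_i - mean)^2)"""
    x = np.asarray(activity, dtype=float)
    n = x.size
    days = x.reshape(-1, per_day)              # requires whole days
    hourly_mean = days.mean(axis=0)
    grand_mean = x.mean()
    ss_total = ((x - grand_mean) ** 2).sum()
    IS = n * ((hourly_mean - grand_mean) ** 2).sum() / (per_day * ss_total)
    IV = n * (np.diff(x) ** 2).sum() / ((n - 1) * ss_total)
    return IS, IV

rng = np.random.default_rng(1)
week = np.tile(np.concatenate([np.zeros(8), np.ones(16)]), 7)  # 7 synthetic days
print(is_iv(week + 0.1 * rng.standard_normal(week.size)))
```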

  17. A software tool for teaching and training how to build and use matrixes in strategic information systems planning

    Directory of Open Access Journals (Sweden)

    Javier Andrés Arias Sanabria

    2010-10-01

    Full Text Available Strategic information systems planning (SISP) allows an organisation to determine a portfolio of computer-based applications to help it achieve its business objectives. IBM's business systems planning for strategic alignment (BSP/SA) is an important technique for developing a strategic plan for an entire company's information resource. BSP/SA has been described in terms of stages and the specific tasks within them. The tasks are usually done manually and require some experience. This work therefore presents a computer-based application that automates two of the most important tasks in the BSP/SA methodology: the process-organisation matrix (POM) and the process-data class matrix (PDM). Special emphasis was placed on the analysis, design and implementation phases of the systems development life-cycle used to develop the software. An important part of the analysis consisted of a literature review and semi-structured interviews with experts in SISP. A special contribution of the present work is the design and implementation of statistical reports associated with each matrix. Automating this task has made it easier for students to analyse the POM and PDM during the SISP workshops forming part of the Information Systems Management course (Systems Engineering, Universidad Nacional de Colombia). Results arising from the workshops have also improved.

  18. Development of TUF-ELOCA - a software tool for integrated single-channel thermal-hydraulic and fuel element analyses

    Energy Technology Data Exchange (ETDEWEB)

    Popescu, A.I.; Wu, E.; Yousef, W.W.; Pascoe, J. [Nuclear Safety Solutions Ltd., Toronto, Ontario (Canada); Parlatan, Y. [Ontario Power Generation, Toronto, Ontario (Canada); Kwee, M. [Bruce Power, Tiverton, Ontario (Canada)

    2006-07-01

    The TUF-ELOCA tool couples the TUF and ELOCA codes to enable an integrated thermal-hydraulic and fuel element analysis for a single channel during transient conditions. The coupled architecture is based on TUF as the parent process controlling multiple ELOCA executions that simulate the fuel elements' behaviour, and it is scalable to different fuel channel designs. The coupling ensures proper feedback between the coolant conditions and the fuel elements' response, eliminates model duplication, and improves prediction accuracy. The communication interfaces are based on PVM and allow parallelization of the fuel element simulations. Developmental testing results are presented showing realistic predictions of the fuel channel behaviour during a transient. (author)

  19. STELLA software as a tool for modelling phosphorus removal in a constructed wetland employing dewatered alum sludge as main substrate.

    Science.gov (United States)

    Kumar, J L G; Wang, Z Y; Zhao, Y Q; Babatunde, A O; Zhao, X H; Jørgensen, S E

    2011-01-01

    A dynamic simulation model was developed for the removal of soluble reactive phosphorus (SRP) in vertical flow constructed wetlands (VFCWs) using STELLA (structural thinking, experiential learning laboratory with animation) 9.1.3, a dynamic software program that helps simulate the environmental nature of, and the succession of relationships between, the interdependent components and processes of the VFCW system. In particular, the VFCW employed dewatered alum sludge as its main substrate to enhance phosphorus (P) immobilization. Although computer modelling of P in treatment wetlands has been well studied, especially in recent years, there is still a need for simple and realistic models that can be used to investigate the dynamics of SRP in VFCWs. The state variables included in the model are dissolved phosphorus (DISP), plant phosphorus (PLAP), detritus phosphorus (DETP), plant biomass (PLBI) and adsorbed phosphorus (ADSP). The major P transformation processes considered in this study were adsorption, plant and microbial uptake, and decomposition. The forcing functions considered in the model are temperature, radiation, volume of wastewater, P concentration, contact time, flow rate and the adsorbent (i.e., alum sludge). The model results revealed that up to 72% of the SRP can be removed through the adsorption process, whereas uptake by plants accounts for about 20% and the remaining processes, such as microbial P utilization and decomposition, account for 7% of SRP removal, based on the mass balance calculations. The results indicate that the model can be used to simulate the outflow SRP concentration, and also to estimate the amount of P removed by the individual processes in a VFCW using alum sludge as a substrate.
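
    As a flavor of what such a state-variable model looks like in code, the toy sketch below integrates a single dissolved-phosphorus pool with first-order adsorption and plant-uptake sinks. The rate constants are invented; the real STELLA model tracks all five state variables and their forcing functions:

```python
import numpy as np
from scipy.integrate import odeint

def ddisp_dt(disp, t, inflow=0.5, k_ads=0.10, k_uptake=0.03):
    """Toy mass balance for dissolved P (mg/L per hour): constant inflow
    minus first-order removal by adsorption onto the alum sludge and by
    plant uptake. All rates are illustrative placeholders."""
    return inflow - (k_ads + k_uptake) * disp

t = np.linspace(0, 72, 200)                    # three days, hourly-ish grid
disp = odeint(ddisp_dt, y0=5.0, t=t).ravel()   # initial DISP = 5 mg/L

# Share of first-order removal attributable to adsorption in this toy model
adsorption_share = 0.10 / (0.10 + 0.03)
print(disp[-1], adsorption_share)              # steady state ~3.85 mg/L, ~0.77
```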

  20. A method for creating teaching movie clips using screen recording software: usefulness of teaching movies as self-learning tools for medical students

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Seong Su [The Catholic University of Korea, Suwon (Korea, Republic of)

    2007-04-15

    I wanted to describe a method for creating teaching movies by using screen recordings, and to see whether such self-learning movies are useful for medical students. Teaching movies were created by directly recording the screen activity and voice narration during the interpretation of educational cases; we used a PACS system and screen recording software (CamStudio, Rendersoft, U.S.A.). The usefulness of the teaching movies for self-learning of abdominal CT anatomy was evaluated by medical students. Creating teaching movie clips with screen recording software was simple and easy. Survey responses were collected from 43 medical students. The content of the teaching movies was adequately understandable (52%) and useful for learning (47%). Only 23% of the students agreed that these movies helped motivate them to learn. The teaching movies were more useful than still photographs of the teaching image files. The students wanted teaching movies on the cross-sectional CT anatomy of different body regions (82%) and on the radiological interpretation of various diseases (42%). Creating a teaching movie by directly recording a radiologist's interpretation process on screen is easy and simple. The teaching video clips convey a radiologist's interpretation process, or the explanation of teaching cases, with his/her own voice narration, and they are an effective self-learning tool for medical students and residents.

  1. FlowCal: A User-Friendly, Open Source Software Tool for Automatically Converting Flow Cytometry Data from Arbitrary to Calibrated Units.

    Science.gov (United States)

    Castillo-Hair, Sebastian M; Sexton, John T; Landry, Brian P; Olson, Evan J; Igoshin, Oleg A; Tabor, Jeffrey J

    2016-07-15

    Flow cytometry is widely used to measure gene expression and other molecular biological processes with single cell resolution via fluorescent probes. Flow cytometers output data in arbitrary units (a.u.) that vary with the probe, instrument, and settings. Arbitrary units can be converted to the calibrated unit molecules of equivalent fluorophore (MEF) using commercially available calibration particles. However, there is no convenient, nonproprietary tool available to perform this calibration. Consequently, most researchers report data in a.u., limiting interpretation. Here, we report a software tool named FlowCal to overcome current limitations. FlowCal can be run using an intuitive Microsoft Excel interface, or customizable Python scripts. The software accepts Flow Cytometry Standard (FCS) files as inputs and is compatible with different calibration particles, fluorescent probes, and cell types. Additionally, FlowCal automatically gates data, calculates common statistics, and produces publication quality plots. We validate FlowCal by calibrating a.u. measurements of E. coli expressing superfolder GFP (sfGFP) collected at 10 different detector sensitivity (gain) settings to a single MEF value. Additionally, we reduce day-to-day variability in replicate E. coli sfGFP expression measurements due to instrument drift by 33%, and calibrate S. cerevisiae Venus expression data to MEF units. Finally, we demonstrate a simple method for using FlowCal to calibrate fluorescence units across different cytometers. FlowCal should ease the quantitative analysis of flow cytometry data within and across laboratories and facilitate the adoption of standard fluorescence units in synthetic biology and beyond.
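
    The bead-based calibration idea can be sketched independently of FlowCal's actual API: fit a line through the calibration-bead peaks in log-log space and apply it to cell events. The bead values below are hypothetical, and real calibration handles saturation and background more carefully:

```python
import numpy as np

def calibration_curve(bead_peaks_au, bead_mef):
    """Fit a straight line in log-log space through the calibration-bead
    peaks (measured a.u. vs. known MEF) and return a function converting
    arbitrary units to MEF. A sketch of the general bead-based approach,
    not FlowCal's actual model."""
    slope, intercept = np.polyfit(np.log10(bead_peaks_au),
                                  np.log10(bead_mef), 1)
    return lambda au: 10 ** (slope * np.log10(au) + intercept)

# Hypothetical 6-peak bead set
peaks_au = np.array([85, 310, 980, 3200, 10500, 33000])     # measured
peaks_mef = np.array([700, 2500, 8000, 27000, 89000, 290000])  # certified
to_mef = calibration_curve(peaks_au, peaks_mef)
print(to_mef(5000.0))   # convert one cell event from a.u. to MEF
```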

  2. A modular assembly cloning technique (aided by the BIOF software tool for seamless and error-free assembly of long DNA fragments

    Directory of Open Access Journals (Sweden)

    Orlova Nadezhda A

    2012-06-01

    Full Text Available Abstract Background Molecular cloning of DNA fragments >5 kbp is still a complex task. When no genomic DNA library is available for the species of interest, and direct PCR amplification of the desired DNA fragment is unsuccessful or results in an incorrect sequence, molecular cloning of PCR-amplified regions of the target sequence and assembly of the cloned parts by restriction and ligation is an option. Assembled components of such DNA fragments can be connected together by ligating the compatible overhangs produced by different restriction endonucleases. However, designing the corresponding cloning scheme can be a complex task that requires a software tool to generate a list of potential connection sites. Findings The BIOF program presented here analyzes DNA fragments for all available restriction enzymes and provides a list of potential sites for ligation of DNA fragments with compatible overhangs. The cloning scheme, which is called modular assembly cloning (MAC), is aided by the BIOF program. MAC was tested on a practical dataset, namely two non-coding fragments of the translation elongation factor 1 alpha gene from Chinese hamster ovary cells. The individual fragment lengths exceeded 5 kbp, and direct PCR amplification produced no amplicons. However, separation of the target fragments into smaller regions, with downstream assembly of the cloned modules, resulted in both target DNA fragments being obtained in few subsequent steps. Conclusions Implementation of the MAC software tool and the experimental approach adopted here have great potential for simplifying the molecular cloning of long DNA fragments. This approach may be used to generate long artificial DNA fragments such as in vitro spliced cDNAs.
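
    The compatible-overhang search at the heart of such a tool is essentially a grouping of enzymes by the single-stranded ends they leave. A minimal sketch over a handful of well-known enzymes (a real tool like BIOF scans the full commercial set):

```python
# 5' overhangs left by a few common restriction enzymes
FIVE_PRIME_OVERHANGS = {
    "BamHI": "GATC",   # G^GATCC
    "BglII": "GATC",   # A^GATCT
    "EcoRI": "AATT",   # G^AATTC
    "MfeI":  "AATT",   # C^AATTG
    "NcoI":  "CATG",   # C^CATGG
}

def compatible_pairs(enzymes):
    """Pairs of distinct enzymes whose 5' overhangs can be ligated
    to each other (identical single-stranded ends)."""
    names = sorted(enzymes)
    return [(a, b) for i, a in enumerate(names) for b in names[i + 1:]
            if enzymes[a] == enzymes[b]]

print(compatible_pairs(FIVE_PRIME_OVERHANGS))
# -> [('BamHI', 'BglII'), ('EcoRI', 'MfeI')]
```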

  3. The Use of Pro/Engineer CAD Software and Fishbowl Tool Kit in Ray-tracing Analysis

    Science.gov (United States)

    Nounu, Hatem N.; Kim, Myung-Hee Y.; Ponomarev, Artem L.; Cucinotta, Francis A.

    2009-01-01

    This document is designed as a manual for a user who wants to operate Pro/ENGINEER (ProE) Wildfire 3.0 with the NASA Space Radiation Program's (SRP) custom-designed toolkit, called 'Fishbowl', for the ray tracing of complex spacecraft geometries given by a ProE CAD model. The analysis of spacecraft geometry through ray tracing is a vital part of the calculation of health risks from space radiation. Space radiation poses severe risks of cancer, degenerative diseases and acute radiation sickness during long-term exploration missions, and shielding optimization is an important component in the application of radiation risk models. Ray tracing is a technique in which 3-dimensional (3D) vehicle geometry can be represented as the input for the space radiation transport code and subsequent risk calculations. In ray tracing, a certain number of rays (on the order of 1000) are used to calculate the equivalent thickness, say of aluminum, of the spacecraft geometry seen at a point of interest called the dose point. The rays originate at the dose point and terminate at a homogeneously distributed set of points lying on a sphere that circumscribes the spacecraft and has its center at the dose point. The distance a ray traverses in each material is converted to an aluminum (or other user-selected) equivalent thickness, and these equivalent thicknesses are summed for each ray. Since each ray points in a direction, the aluminum equivalent of each ray represents the shielding that the geometry provides to the dose point from that particular direction. This manual first gives the user contact information for help in installing ProE and Fishbowl, in addition to notes on platform support and system requirements. Second, the document shows the user how to use the software to ray trace a ProE-designed 3D assembly, and it serves later as a reference for troubleshooting. The user is assumed to have previous knowledge of ProE and CAD modeling.
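
    The geometry bookkeeping described above can be sketched without the CAD layer: generate roughly uniform ray directions from the dose point and convert each ray's per-material path lengths to an aluminum-equivalent thickness by density scaling. The trace function below is a stand-in for the intersection step Fishbowl performs against the ProE model:

```python
import numpy as np

AL_DENSITY = 2.70  # g/cm^3

def random_directions(n, rng):
    """n approximately uniform directions on the unit sphere."""
    v = rng.standard_normal((n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

def aluminum_equivalent(trace_segments):
    """Convert per-material path lengths (cm) crossed by one ray into an
    aluminum-equivalent thickness by mass-density scaling."""
    return sum(length * density / AL_DENSITY
               for length, density in trace_segments)

def trace(origin, direction):
    """Hypothetical stand-in for the CAD intersection step: returns
    (path length in cm, density in g/cm^3) per material crossed."""
    return [(0.3, 2.70), (1.2, 0.92)]  # e.g. 3 mm Al wall + 12 mm polyethylene

rng = np.random.default_rng(42)
dose_point = np.zeros(3)
thicknesses = [aluminum_equivalent(trace(dose_point, d))
               for d in random_directions(1000, rng)]
print(np.mean(thicknesses))  # mean shielding seen at the dose point (cm Al)
```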

  4. The Solid Earth Research and Teaching Environment, a new software framework to share research tools in the classroom and across disciplines

    Science.gov (United States)

    Milner, K.; Becker, T. W.; Boschi, L.; Sain, J.; Schorlemmer, D.; Waterhouse, H.

    2009-12-01

    The Solid Earth Teaching and Research Environment (SEATREE) is a modular and user-friendly software framework to facilitate the use of solid Earth research tools in the classroom and for interdisciplinary research collaboration. SEATREE is open source and community developed, distributed freely under the GNU General Public License. It is a fully contained package that lets users operate in a graphical mode, while giving more advanced users the opportunity to view and modify the source code. The top-level graphical user interfaces, which initiate the calculations and visualize results, are written in the Python programming language using an object-oriented, modern design. Results are plotted with either Matlab-like Python libraries or SEATREE's own Generic Mapping Tools wrapper. The underlying computational codes used to produce the results can be written in any programming language and are accessed through Python wrappers. There are currently four fully developed science modules for SEATREE: (1) HC, a global geodynamics tool built around a semi-analytical mantle-circulation program based on work by B. Steinberger, Becker, and C. O'Neill. HC can compute velocities and tractions for global, spherical Stokes flow and radial viscosity variations, and it is fast enough for classroom instruction, for example to let students interactively explore the role of radial viscosity variations in global geopotential (geoid) anomalies. (2) ConMan wraps Scott King's 2D finite element mantle convection code, allowing users to quickly observe how modifications to input parameters affect heat flow over time. As seismology modules, SEATREE includes (3) Larry, a global surface-wave phase-velocity inversion tool, and (4) Syn2D, a Cartesian tomography teaching tool for ray-theory wave propagation in synthetic, arbitrary velocity structure in the presence of noise. Both underlying programs were contributed by Boschi. Using Syn2D, students can explore, for example, how well a given

  5. DEVELOPMENT OF A SOFTWARE DESIGN TOOL FOR HYBRID SOLAR-GEOTHERMAL HEAT PUMP SYSTEMS IN HEATING- AND COOLING-DOMINATED BUILDINGS

    Energy Technology Data Exchange (ETDEWEB)

    Yavuzturk, C. C. [Univ. of Hartford, West Hartford, CT (United States); Chiasson, A. D. [Univ. of Hartford, West Hartford, CT (United States); Filburn, T. P. [Univ. of Hartford, West Hartford, CT (United States)

    2012-11-29

    This project provides an easy-to-use, menu-driven software tool for designing hybrid solar-geothermal heat pump (GHP) systems for both heating- and cooling-dominated buildings. No such design tool previously existed. In heating-dominated buildings, the design approach takes advantage of glazed solar collectors to effectively balance the annual thermal loads on the ground with renewable solar energy. In cooling-dominated climates, the design approach takes advantage of relatively low-cost, unglazed solar collectors as the heat-rejecting component. The primary benefit of hybrid GHPs is the reduced initial cost of the ground heat exchanger (GHX). Furthermore, solar thermal collectors can be used to balance the ground loads over the annual cycle, thus making the GHX fully sustainable: in heating-dominated buildings, the hybrid energy source (i.e., solar) is renewable, in contrast to a typical fossil fuel boiler or electric resistance as the hybrid component; in cooling-dominated buildings, the use of unglazed solar collectors as a heat rejecter allows for passive heat rejection, in contrast to a cooling tower that consumes a significant amount of energy to operate; and hybrid GHPs can expand the market by allowing a reduced GHX footprint in both heating- and cooling-dominated climates. The design tool allows for the straightforward design of innovative GHP systems that currently pose a significant design challenge. The project lays the foundations for the proper and reliable design of hybrid GHP systems, overcoming a series of difficult and cumbersome steps, without the use of a system simulation approach and without an automated optimization scheme. As new technologies and design concepts emerge, sophisticated design tools and methodologies must accompany them and be made usable for practitioners. A lack of reliable design tools results in practitioners being reluctant to implement more complex systems. A menu-driven software tool for the design of hybrid solar GHP systems is

  6. SisRadiologia: a new software tool for analysis of radiological accidents and incidents in industrial radiography

    Energy Technology Data Exchange (ETDEWEB)

    Lima, Camila M. Araujo; Silva, Francisco C.A. da, E-mail: araujocamila@yahoo.com.br, E-mail: dasilva@ird.gov.br [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil); Araujo, Rilton A., E-mail: consultoria@maximindustrial.com.br [Maxim Industrial Assessoria TI, Rio de Janeiro, RJ (Brazil)

    2013-07-01

    According to the International Atomic Energy Agency (IAEA), many efforts have been made by Member States aiming at better control of radioactive sources. Accidents have mostly happened in practices of high radiological risk, classified by the IAEA in Categories 1 and 2, notably radiotherapy, large irradiators and industrial radiography. Worldwide, more than 40 radiological accidents have been recorded in the industrial radiography area, involving 37 workers, 110 members of the public and 12 fatalities. Records show 5 severe radiological accidents in industrial radiography activities in Brazil, in which 7 workers and 19 members of the public were involved. Such events led to radiodermatitis of the hands and fingers, but to no deaths. The purpose of this study is to present a computational program that allows data acquisition and recording in the company, in such a way as to ease a further detailed analysis of the radiological event, besides providing the learning cornerstones for avoiding future occurrences. After one year of application of the 'Industrial SisRadiologia' computational program - and based largely upon the workshop on Analysis and Dose Calculation of Radiological Accidents in Industrial Radiography (Workshop sobre Analise e Calculo de dose de acidentes Radiologicos em Radiografia Industrial - IRD 2012), in which several radiation protection officers took part - it can be concluded that the computational program is a powerful tool for data acquisition, as well as for recording and surveying accident and incident events in industrial radiography. The program proved to be efficient in preparing reports for the Brazilian Regulatory Authority, and very useful in training workers to fix the lessons learned from radiological events.

  7. Bioinformatics resource manager v2.3: an integrated software environment for systems biology with microRNA and cross-species analysis tools

    Directory of Open Access Journals (Sweden)

    Tilton Susan C

    2012-11-01

    Full Text Available Abstract Background MicroRNAs (miRNAs) are noncoding RNAs that direct post-transcriptional regulation of protein coding genes. Recent studies have shown miRNAs are important for controlling many biological processes, including nervous system development, and are highly conserved across species. Given their importance, computational tools are necessary for the analysis, interpretation and integration of high-throughput (HTP) miRNA data in an increasing number of model species. The Bioinformatics Resource Manager (BRM) v2.3 is a software environment for data management, mining, integration and functional annotation of HTP biological data. In this study, we report recent updates to BRM for miRNA data analysis and cross-species comparisons across datasets. Results BRM v2.3 has the capability to query predicted miRNA targets from multiple databases, retrieve potential regulatory miRNAs for known genes, integrate experimentally derived miRNA and mRNA datasets, perform ortholog mapping across species, and retrieve annotation and cross-reference identifiers for an expanded number of species. Here we use BRM to show that developmental exposure of zebrafish to 30 μM nicotine from 6-48 hours post fertilization (hpf) results in behavioral hyperactivity in larval zebrafish and alteration of putative miRNA gene targets in whole embryos at developmental stages that encompass early neurogenesis. We show typical workflows for using BRM to integrate experimental zebrafish miRNA and mRNA microarray datasets with example retrievals for zebrafish, including pathway annotation and mapping to human orthologs. Functional analysis of differentially regulated (p… Conclusions BRM provides the ability to mine complex data for the identification of candidate miRNAs or pathways that drive phenotypic outcome and, therefore, is a useful hypothesis generation tool for systems biology. The miRNA workflow in BRM allows for efficient processing of multiple miRNA and mRNA datasets in a single

  8. Computer Software.

    Science.gov (United States)

    Kay, Alan

    1984-01-01

    Discusses the nature and development of computer software. Programing, programing languages, types of software (including dynamic spreadsheets), and software of the future are among the topics considered. (JN)

  9. Computational tools and resources for metabolism-related property predictions. 1. Overview of publicly available (free and commercial) databases and software.

    Science.gov (United States)

    Peach, Megan L; Zakharov, Alexey V; Liu, Ruifeng; Pugliese, Angelo; Tawa, Gregory; Wallqvist, Anders; Nicklaus, Marc C

    2012-10-01

    Metabolism has been identified as a defining factor in drug development success or failure because of its impact on many aspects of drug pharmacology, including bioavailability, half-life and toxicity. In this article, we provide an outline and descriptions of the resources for metabolism-related property predictions that are currently either freely or commercially available to the public. These resources include databases with data on, and software for prediction of, several end points: metabolite formation, sites of metabolic transformation, binding to metabolizing enzymes and metabolic stability. We attempt to place each tool in historical context and describe, wherever possible, the data it was based on. For predictions of interactions with metabolizing enzymes, we show a typical set of results for a small test set of compounds. Our aim is to give a clear overview of the areas and aspects of metabolism prediction in which the currently available resources are useful and accurate, and the areas in which they are inadequate or missing entirely.

  10. The cost-effectiveness of monitoring strategies for antiretroviral therapy of HIV infected patients in resource-limited settings: software tool.

    Directory of Open Access Journals (Sweden)

    Janne Estill

    Full Text Available The cost-effectiveness of routine viral load (VL) monitoring of HIV-infected patients on antiretroviral therapy (ART) depends on various factors that differ between settings and across time. Low-cost point-of-care (POC) tests for VL are in development and may make routine VL monitoring affordable in resource-limited settings. We developed a software tool to study the cost-effectiveness of switching to second-line ART with different monitoring strategies, with a focus on POC-VL monitoring. We used a mathematical model to simulate cohorts of patients from the start of ART until death. We modeled 13 strategies (no 2nd-line; clinical, CD4 (with or without targeted VL), POC-VL, and laboratory-based VL monitoring, with different frequencies). We included a scenario with identical failure rates across strategies, and one in which routine VL monitoring reduces the risk of failure. We compared lifetime costs and averted disability-adjusted life-years (DALYs), and calculated incremental cost-effectiveness ratios (ICERs). We developed an Excel tool to update the results of the model for varying unit costs and cohort characteristics, and conducted several sensitivity analyses varying the input costs. Introducing 2nd-line ART had an ICER of US$1651-1766/DALY averted. Compared with clinical monitoring, the ICER of CD4 monitoring was US$1896-US$5488/DALY averted and that of VL monitoring US$951-US$5813/DALY averted. We found no difference between POC- and laboratory-based VL monitoring, except at the highest measurement frequency (every 6 months), where laboratory-based testing was more effective. Targeted VL monitoring was on the cost-effectiveness frontier only if the difference between 1st- and 2nd-line costs remained large, and if we assumed that routine VL monitoring does not prevent failure. Compared with the less expensive strategies, the cost-effectiveness of routine VL monitoring essentially depends on the cost of 2nd-line ART. Our Excel tool is useful for determining optimal
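
    The ICER arithmetic underlying these comparisons is a simple ratio of incremental cost to incremental effect. A minimal sketch with hypothetical per-patient values loosely shaped like the ranges above:

```python
def icer(cost_new, cost_base, dalys_averted_new, dalys_averted_base):
    """Incremental cost-effectiveness ratio: extra lifetime cost per
    additional DALY averted, comparing one monitoring strategy against
    the next-less-expensive one."""
    return (cost_new - cost_base) / (dalys_averted_new - dalys_averted_base)

# Hypothetical per-patient lifetime costs (US$) and DALYs averted
print(icer(cost_new=4200.0, cost_base=2500.0,
           dalys_averted_new=4.1, dalys_averted_base=3.1))  # -> 1700.0 US$/DALY
```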

  11. Tool selection and precision control in the height direction based on MasterCAM software

    Institute of Scientific and Technical Information of China (English)

    吴平峰

    2013-01-01

    This paper studies tool selection and the control of machining accuracy in the height direction, using MasterCAM software as the primary research tool. It provides an overview and analysis of MasterCAM, explores accuracy control in tool selection and in the height direction, and discusses the importance of controlling both. The paper also reviews process issues with MasterCAM reported at home and abroad and the current use of MasterCAM for CNC machining, and proposes strategies for optimizing MasterCAM process planning and accuracy control, in order to improve the efficiency of tool selection and of accuracy control in the height direction.

  12. metAlignID: a high-throughput software tool set for automated detection of trace level contaminants in comprehensive LECO two-dimensional gas chromatography time-of-flight mass spectrometry data.

    Science.gov (United States)

    Lommen, Arjen; van der Kamp, Henk J; Kools, Harrie J; van der Lee, Martijn K; van der Weg, Guido; Mol, Hans G J

    2012-11-09

    A new alternative data processing tool set, metAlignID, has been developed for automated pre-processing and library-based identification and concentration estimation of target compounds after analysis by comprehensive two-dimensional gas chromatography with mass spectrometric detection. The tool set has been developed for and tested on LECO data. The software runs multi-threaded (one thread per processor core) on a standard PC (personal computer) under different operating systems and is as such capable of processing multiple data sets simultaneously. Raw data files are converted into netCDF (network Common Data Form) format using a fast conversion tool. They are then preprocessed using previously developed algorithms originating from the metAlign software. Next, the resulting reduced data files are searched against a user-composed library (derived from user or commercial NIST-compatible libraries; NIST = National Institute of Standards and Technology) and the identified compounds, including an indicative concentration, are reported in Excel format. Data can be processed batch-wise. The overall time needed for conversion, processing and searching of 30 raw data sets for 560 compounds is routinely within an hour. The screening performance is evaluated for detection of pesticides and contaminants in raw data obtained after analysis of soil and plant samples. Results are compared to the existing data-handling routine based on proprietary software (LECO, ChromaTOF). The developed software tool set, which is freely downloadable at www.metalign.nl, greatly accelerates data analysis and offers more options for fine-tuning automated identification toward specific application needs. The quality of the results obtained is slightly better than that of the standard processing, and a quantitative concentration estimate is added. The software tool set in combination with two-dimensional gas chromatography coupled to time-of-flight mass spectrometry shows great potential as a highly
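
    The record emphasizes one-thread-per-core batch processing on a standard PC. A minimal Python sketch of that dispatch pattern follows; the file names and the processing step are placeholders, not metAlignID's actual implementation (which is not written in Python).

        import os
        from concurrent.futures import ThreadPoolExecutor

        def process_file(path):
            # Placeholder for: convert to netCDF, reduce noise, search library.
            return f"processed {os.path.basename(path)}"

        raw_files = [f"run_{i:02d}.raw" for i in range(30)]  # hypothetical batch
        with ThreadPoolExecutor(max_workers=os.cpu_count()) as pool:
            # One worker per core; results arrive as each file finishes.
            for result in pool.map(process_file, raw_files):
                print(result)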

  13. The Software Management Environment (SME)

    Science.gov (United States)

    Valett, Jon D.; Decker, William; Buell, John

    1988-01-01

    The Software Management Environment (SME) is a research effort designed to utilize the past experiences and results of the Software Engineering Laboratory (SEL) and to incorporate this knowledge into a tool for managing projects. SME provides the software development manager with the ability to observe, compare, predict, analyze, and control key software development parameters such as effort, reliability, and resource utilization. The major components of the SME, the architecture of the system, and examples of the functionality of the tool are discussed.

  14. Software For Genetic Algorithms

    Science.gov (United States)

    Wang, Lui; Bayer, Steve E.

    1992-01-01

    SPLICER computer program is genetic-algorithm software tool used to solve search and optimization problems. Provides underlying framework and structure for building genetic-algorithm application program. Written in Think C.
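
    SPLICER itself is written in Think C, but the framework it provides follows the standard genetic-algorithm loop of selection, crossover, and mutation. A toy Python sketch of that loop (maximizing the number of 1-bits in a string) is shown below; it illustrates the idea only and is not SPLICER code.

        import random

        def fitness(bits):                  # toy objective: count of 1-bits
            return sum(bits)

        def evolve(pop_size=20, length=16, generations=50):
            pop = [[random.randint(0, 1) for _ in range(length)]
                   for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=fitness, reverse=True)
                parents = pop[: pop_size // 2]          # truncation selection
                children = []
                while len(children) < pop_size - len(parents):
                    a, b = random.sample(parents, 2)
                    cut = random.randrange(1, length)   # one-point crossover
                    child = a[:cut] + b[cut:]
                    i = random.randrange(length)        # point mutation
                    child[i] ^= 1
                    children.append(child)
                pop = parents + children
            return max(pop, key=fitness)

        print(evolve())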

  15. Tier2 Submit Software

    Science.gov (United States)

    Download this tool for Windows or Mac, which helps facilities prepare a Tier II electronic chemical inventory report. The data can also be exported into the CAMEOfm (Computer-Aided Management of Emergency Operations) emergency planning software.

  16. Lean software development

    OpenAIRE

    Hefnerová, Lucie

    2011-01-01

    The main goal of this bachelor thesis is to produce clear Czech-language material on the concept of Lean Software Development, which has recently been gaining significant attention in the field of software development. Another goal of this thesis is to summarize possible approaches to categorizing the concept and to defining the relationship between Lean and Agile software development. The detailed categorization of the tools potentia...

  17. Cactus: Software Priorities

    Science.gov (United States)

    Hyde, Hartley

    2009-01-01

    The early eighties saw a period of rapid change in computing and teachers lost control of how they used computers in their classrooms. Software companies produced computer tools that looked so good that teachers forgot about writing their own classroom materials and happily purchased software--that offered much more than teachers needed--from…

  18. Social software in global software development

    DEFF Research Database (Denmark)

    2010-01-01

    Social software (SoSo) is defined by Farkas as tools that (1) allow people to communicate, collaborate, and build community online, (2) can be syndicated, shared, reused or remixed, and (3) let people learn easily from and capitalize on the behavior and knowledge of others [1]. SoSo includes a wide variety of tools such as: instant messaging, internet forums, mailing lists, blogs, wikis, social network sites, social bookmarking, social libraries, virtual worlds. Though normally rather belonging to the private realm, the use of social software in corporate contexts has been reported, e.g. as a way...

  19. PhasePlot: An Interactive Software Tool for Visualizing Phase Relations, Performing Virtual Experiments, and for Teaching Thermodynamic Concepts in Petrology

    Science.gov (United States)

    Ghiorso, M. S.

    2012-12-01

    The computer program PhasePlot was developed for Macintosh computers and released via the Mac App Store in December 2011. It permits the visualization of phase relations calculated from internally consistent thermodynamic data-model collections, including those from MELTS (Ghiorso and Sack, 1995, CMP 119, 197-212), pMELTS (Ghiorso et al., 2002, G-cubed 3, 10.1029/2001GC000217) and the deep mantle database of Stixrude and Lithgow-Bertelloni (2011, GJI 184, 1180-1213). The software allows users to enter a system bulk composition and a range of reference conditions, and then calculate a grid of phase relations. These relations may be visualized in a variety of ways, including pseudosections, phase diagrams, phase proportion plots, and contour diagrams of phase compositions and abundances. The program interface is user-friendly and the computations are fast on laptop-scale machines, which makes PhasePlot amenable to in-class demonstrations, as a tool in instructional laboratories, and as an aid in support of out-of-class exercises and research. Users focus on problem specification and interpretation of results rather than on manipulation and mechanics of computation. The software has been developed with NSF support and is free. The PhasePlot web site is at phaseplot.org, where extensive user documentation, video tutorials and examples of use may be found. The original release of PhasePlot permitted calculations to be performed on pressure-temperature (P-T) grids, by direct minimization of the Gibbs free energy of the system at each grid point. A revision of PhasePlot (scheduled for release to the Mac App Store in December 2012) extends capabilities to include pressure-entropy (P-S) grids by system enthalpy minimization, volume-temperature (V-T) grids by system Helmholtz energy minimization, and volume-entropy (V-S) grids by minimization of the internal energy of the system. P-S gridded results may be utilized to visualize phase relations as a function of heat
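
    The P-T grid computation the record describes reduces, at each grid point, to picking the assemblage with the lowest Gibbs free energy. The following toy Python sketch shows that structure; the two linear G(P, T) functions are invented for illustration and are not real thermodynamic data.

        # At each (P, T) point, the stable phase is the one minimizing G.
        def g_alpha(p_gpa, t_k):   # hypothetical phase "alpha"
            return -100.0 + 0.020 * t_k + 8.0 * p_gpa

        def g_beta(p_gpa, t_k):    # hypothetical phase "beta"
            return -90.0 + 0.005 * t_k + 4.0 * p_gpa

        for p in [0.5 * i for i in range(1, 5)]:      # pressure grid, GPa
            row = []
            for t in range(300, 1501, 300):           # temperature grid, K
                row.append("a" if g_alpha(p, t) < g_beta(p, t) else "b")
            print(f"P={p:.1f} GPa: {' '.join(row)}")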

  20. The application of LightTools software in the design of lighting systems with uniform light distribution

    Institute of Scientific and Technical Information of China (English)

    胡志威; 彭润玲; 秦汉; 林朋飞

    2012-01-01

    Taking the uniform light distribution of a single LED as an example, this paper introduces the application of LightTools software in lighting system design, so that the software can be mastered and used more effectively. With LightTools, a reflector model was built above a single LED and a lens array was placed at the exit aperture of the reflector; an intensity mesh was then defined over the target region of the illuminated surface. After optimization with the LightTools optimization module, a uniform intensity distribution was obtained in the target area. Using LightTools for computer-aided design and optimization simulation offers high credibility and can greatly shorten the lighting design cycle.
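
    A uniformity optimization such as the one described needs a merit function over the target intensity mesh. One common choice is minimum divided by mean illuminance; the Python sketch below uses made-up grid values and is not LightTools code.

        # Uniformity merit function over a target illuminance mesh
        # (values are invented; 1.0 would be perfectly uniform).
        mesh = [
            [0.82, 0.91, 0.88],
            [0.90, 1.00, 0.93],
            [0.85, 0.92, 0.86],
        ]
        cells = [v for row in mesh for v in row]
        uniformity = min(cells) / (sum(cells) / len(cells))
        print(f"uniformity = {uniformity:.3f}")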

  1. Software Safety and Security

    CERN Document Server

    Nipkow, T; Hauptmann, B

    2012-01-01

    Recent decades have seen major advances in methods and tools for checking the safety and security of software systems. Automatic tools can now detect security flaws not only in programs of the order of a million lines of code, but also in high-level protocol descriptions. There has also been something of a breakthrough in the area of operating system verification. This book presents the lectures from the NATO Advanced Study Institute on Tools for Analysis and Verification of Software Safety and Security; a summer school held at Bayrischzell, Germany, in 2011. This Advanced Study Institute was

  2. Software Tools that Control a Framework of Perceptual Interfaces and Visual Display Systems for Human-System Interaction with Robotic and Autonomous Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Turbogizmo, LLC will develop new software technology for Human-System Interaction (HSI) for NASA that increases performance and reduces the risk of conducting manned...

  3. Visual assessment of software evolution

    NARCIS (Netherlands)

    Voinea, Lucian; Lukkien, Johan; Telea, Alexandru

    2007-01-01

    Configuration management tools have become well and widely accepted by the software industry. Software Configuration Management (SCM) systems hold minute information about the entire evolution of complex software systems and thus represent a good source for process accounting and auditing. However,

  4. Proceedings of the Thirteenth Annual Software Engineering Workshop

    Science.gov (United States)

    1988-01-01

    Topics covered in the workshop included studies and experiments conducted in the Software Engineering Laboratory (SEL), a cooperative effort of NASA Goddard Space Flight Center, the University of Maryland, and Computer Sciences Corporation; software models; software products; and software tools.

  5. CoC GIS Tools (GIS Tool)

    Data.gov (United States)

    Department of Housing and Urban Development — This tool provides a no-cost downloadable software tool that allows users to interact with professional quality GIS maps. Users access pre-compiled projects through...

  6. Software engineering

    CERN Document Server

    Sommerville, Ian

    2010-01-01

    The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable, software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever.

  7. Error-Free Software

    Science.gov (United States)

    1989-01-01

    001 is an integrated tool suite for automatically developing ultra-reliable models, simulations and software systems. Developed and marketed by Hamilton Technologies, Inc. (HTI), it has been applied in engineering, manufacturing, banking and software tools development. The software provides the ability to simplify the complex. A system developed with 001 can be a prototype or fully developed with production-quality code. It is free of interface errors, consistent, logically complete and has no data or control flow errors. Systems can be designed, developed and maintained with maximum productivity. Margaret Hamilton, President of Hamilton Technologies, also directed the research and development of USE.IT, an earlier product which was the first computer-aided software engineering product in the industry to concentrate on automatically supporting the development of an ultra-reliable system throughout its life cycle. Both products originated in NASA technology developed under a Johnson Space Center contract.

  8. Design and realization of on-line monitoring software for polishing tool working conditions

    Institute of Scientific and Technical Information of China (English)

    刘健; 王绍治; 王君林

    2012-01-01

    In order to detect the polishing tool's working pressure in real time and thus guide the computer-controlled optical surfacing process, on-line monitoring software for the polishing tool was developed on the Visual C++ 2008 platform. Serial communication between the PC and the polishing tool was realized with a wireless communication module and the MSComm control. Real-time acquisition was then implemented on top of a self-defined communication protocol and a data-receiving buffer. The high-speed data stream is rendered dynamically using a queue data structure combined with double-buffered drawing and a multithreading mechanism. Finally, a joint software/hardware test was carried out. The results indicate that the software can monitor the polishing tool's working pressure stably and reliably.
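
    The acquisition/display split described above is a classic producer-consumer arrangement. Here is a minimal Python sketch of it; the sensor is simulated (a real implementation would read the serial port, e.g. with pyserial), and the original software is C++, not Python.

        import queue, random, threading, time

        samples = queue.Queue()

        def acquire():
            # Acquisition thread: fill the queue so display never blocks it.
            for _ in range(20):                        # simulated sensor frames
                samples.put(random.uniform(0.9, 1.1))  # pressure, arbitrary units
                time.sleep(0.01)
            samples.put(None)                          # end-of-stream marker

        threading.Thread(target=acquire, daemon=True).start()
        while (value := samples.get()) is not None:    # consumer / display loop
            print(f"pressure: {value:.3f}")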

  9. Solving the forward problem in electrical impedance tomography for the human head using IDEAS (integrated design engineering analysis software), a finite element modelling tool.

    Science.gov (United States)

    Bayford, R H; Gibson, A; Tizzard, A; Tidswell, T; Holder, D S

    2001-02-01

    If electrical impedance tomography is to be used as a clinical tool, the image reconstruction algorithms must yield accurate images of impedance changes. One of the keys to producing an accurate reconstructed image is the inclusion of prior information regarding the physical geometry of the object. To achieve this, many researchers have created tools for solving the forward problem by means of finite element methods (FEMs). These tools are limited, allowing only a set number of meshes to be produced from the geometric information of the object. There is a clear need for geometrically accurate FEM models to improve the quality of the reconstructed images. We present a commercial tool called IDEAS, which can be used to create FEM meshes for these models. The application of this tool is demonstrated by using segmented data from the human head to model impedance changes inside the head.

  10. Software Requirements Management

    Directory of Open Access Journals (Sweden)

    Ali Altalbe

    2015-04-01

    Full Text Available Requirements are defined as the desired set of characteristics of a product or a service. In the world of software development, it is estimated that more than half of the failures are attributable to poor requirements management. This means that although the software functions correctly, it is not what the client requested. Modern software requirements management methodologies are available to reduce the occurrence of such incidents. This paper reviews the available literature in the area and tabulates possible methods of managing requirements. It also highlights the benefits of following a proper guideline for the requirements management task. With the introduction of specific software tools for the requirements management task, better software products are now being developed with fewer resources.

  11. Developing Software Simulations

    Directory of Open Access Journals (Sweden)

    Tom Hall

    2007-06-01

    Full Text Available Programs in education and business often require learners to develop and demonstrate competence in specified areas and then be able to effectively apply this knowledge. One method to aid in developing a skill set in these areas is through the use of software simulations. These simulations can be used for learner demonstrations of competencies in a specified course as well as a review of the basic skills at the beginning of subsequent courses. The first section of this paper discusses ToolBook, the software used to develop our software simulations. The second section discusses the process of developing software simulations. The third part discusses how we have used software simulations to assess student knowledge of research design by providing simulations that allow the student to practice using SPSS and Excel.

  12. Software Reviews.

    Science.gov (United States)

    Smith, Richard L., Ed.

    1985-01-01

    Reviews software packages by providing extensive descriptions and discussions of their strengths and weaknesses. Software reviewed include (1) "VISIFROG: Vertebrate Anatomy" (grade seven-adult); (2) "Fraction Bars Computer Program" (grades three to six) and (3) four telecommunications utilities. (JN)

  13. XPOS-MOPOS Expert Software Manual

    CERN Document Server

    De Vries, J C

    1997-01-01

    This document presents the software tools available for the experts to control and check the SPS orbit and trajectory systems. Presented are the configuration and acquisition data tools for XPOS and the timing diagnostics tools for MOPOS.

  14. Speakeasy software development

    Science.gov (United States)

    Baskinger, Patricia J.; Ozarow, Larry; Chruscicki, Mary C.

    1993-08-01

    The Speakeasy Software Development Project had three primary objectives. The first objective was to perform Independent Verification and Validation (IV & V) of the software and documentation associated with the signal processor being developed by Hazeltine and TRW under the Speakeasy program. The IV & V task also included an analysis and assessment of the ability of the signal processor software to provide LPI communications functions. The second objective was to assist in the enhancement and modification of an existing Rome Lab signal processor workstation. Finally, TASC developed project management support tools and provided program management support to the Speakeasy Program Office.

  15. Proceedings of the Ninth Annual Software Engineering Workshop

    Science.gov (United States)

    1984-01-01

    Experiences in measurement, utilization, and evaluation of software methodologies, models, and tools are discussed. NASA's involvement in ever larger and more complex systems, like the space station project, provides a motive for the support of software engineering research and the exchange of ideas in such forums. The topics of current SEL research are software error studies, experiments with software development, and software tools.

  16. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  17. Free software and social projects - options used as tools of democratization in the information society

    Directory of Open Access Journals (Sweden)

    Márcia Gorett Ribeiro Grossi

    2009-01-01

    Full Text Available This article presents some contributions of social and educational programs for social and digital inclusion in Brazil, and reflects on free software and its contribution to access to knowledge and information. Important data are presented on the economic, and consequently technological, inequalities existing between countries in the world system. To this end, the article presents a study of the initiatives of the Brazilian state to reduce inequalities in access to technology and thus promote digital inclusion. It contextualizes the availability of knowledge in the information society, showing how its dissemination can be made more democratic by using information technologies and free software, since scientific and technological advances and advances in the means of communication provide greater access to information and communication technologies and an ever better position in post-industrial society. It also describes some digital inclusion program initiatives in Brazil whose goal is the reduction of digital exclusion.

  18. APEX (Aqueous Photochemistry of Environmentally occurring Xenobiotics): a free software tool to predict the kinetics of photochemical processes in surface waters.

    Science.gov (United States)

    Bodrato, Marco; Vione, Davide

    2014-04-01

    The APEX software predicts the photochemical transformation kinetics of xenobiotics in surface waters as a function of: photoreactivity parameters (direct photolysis quantum yield and second-order reaction rate constants with transient species, namely ˙OH, CO₃(-)˙, (1)O₂ and the triplet states of chromophoric dissolved organic matter, (3)CDOM*), water chemistry (nitrate, nitrite, bicarbonate, carbonate, bromide and dissolved organic carbon, DOC), and water depth (more specifically, the optical path length of sunlight in water). It applies to well-mixed surface water layers, including the epilimnion of stratified lakes, and the output data are average values over the considered water column. Based on intermediate formation yields from the parent compound via the different photochemical pathways, the software can also predict intermediate formation kinetics and overall yield. APEX is based on a photochemical model that has been validated against available field data of pollutant phototransformation, with good agreement between model predictions and field results. The APEX software makes allowance for different levels of knowledge of a photochemical system. For instance, the absorption spectrum of surface water can be used if known, or otherwise it can be modelled from the values of DOC. Also the direct photolysis quantum yield can be entered as a detailed wavelength trend, as a single value (constant or average), or it can be defined as a variable if unknown. APEX is based on the free software Octave. Additional applications are provided within APEX to assess the σ-level uncertainty of the results and the seasonal trend of photochemical processes.
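
    The kinetic bookkeeping APEX automates can be illustrated with a back-of-the-envelope calculation: each indirect pathway contributes a pseudo-first-order rate constant equal to the second-order rate constant times the steady-state transient concentration. All numbers in the Python sketch below are invented; APEX derives the transient concentrations from water chemistry and depth.

        import math

        # Second-order rate constants (M^-1 s^-1) and steady-state transient
        # concentrations (M); all values are illustrative placeholders.
        k_second_order = {"OH": 5.0e9, "CO3": 2.0e7, "1O2": 1.0e5, "3CDOM": 1.0e9}
        steady_state = {"OH": 1.0e-16, "CO3": 5.0e-14, "1O2": 1.0e-13, "3CDOM": 1.0e-15}

        k_direct = 1.0e-6  # direct photolysis contribution, s^-1 (invented)
        k_total = k_direct + sum(k_second_order[s] * steady_state[s]
                                 for s in k_second_order)
        print(f"half-life ~ {math.log(2) / k_total / 86400:.1f} days of sunlight")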

  19. Using modern software tools to design, simulate and test a Level 1 trigger sub-system for the D Zero Detector

    Energy Technology Data Exchange (ETDEWEB)

    Angstadt, R.; Borcherding, F.; Johnson, M.E. [Fermi National Accelerator Lab., Batavia, IL (United States); Moreira, L. [CBPF-LAFEX/CEFET-EN, Rio de Janeiro, (Brazil)

    1995-06-01

    This paper describes a system which uses a commercial spreadsheet program and commercial hardware on an IBM PC to develop and test a track-finding system for the D Zero Level 1 scintillating fiber trigger. The trigger system resides in a VME crate. This system allows the user to generate test input, write the pattern to the hardware, simulate the results in software, read the hardware results, compare the results, and inform the user of any differences.
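
    The test cycle described (generate a pattern, run it through the software model, read back the hardware, compare) can be sketched in Python as follows; hardware access is stubbed out, and the stand-in logic is purely illustrative.

        def simulate(pattern):       # software model of the track finder
            return sum(pattern) % 2  # stand-in logic for illustration

        def read_hardware(pattern):  # stub for the VME readback
            return sum(pattern) % 2

        patterns = [[0, 1, 1], [1, 1, 1], [0, 0, 1]]
        for p in patterns:
            hw, sw = read_hardware(p), simulate(p)
            status = "OK" if hw == sw else f"MISMATCH hw={hw} sw={sw}"
            print(p, status)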

  20. Computer software as an aid tool for the development of new asphalt products; Programa computacional como ferramenta de auxilio no desenvolvimento de novos produtos asfalticos

    Energy Technology Data Exchange (ETDEWEB)

    Souza, Flavio Vasconcelos de [University of Nebraska (United States). Coll. of Engineering; Soares, Jorge Barbosa [Universidade Federal do Ceara (UFC), Fortaleza, CE (Brazil). Dept. de Engenharia de Transportes. Lab. de Mecanica dos Pavimentos

    2008-04-15

    The use of composite materials in structural applications has been increasing significantly in the last few years, with emphasis on aerospace, biomedical, civil, mechanical and petroleum engineering. This is because composite materials may offer optimal structural features for specific applications that their components are unable to offer separately. Modified binders and asphalt mixtures are examples of composite materials. Therefore, in order to optimize the performance of these asphalt products, it is necessary to use methodologies capable of retaining the maximum information about their microstructures, so that the designer may establish the optimal fractions and distributions of their components for each type of application. A methodology that has been widely used in the scientific community for composite analysis is so-called multiscale modeling. The purpose of this paper is to model the structural behavior of asphalt mixtures using multiscale computer software and to demonstrate how this software can be used to improve quality and to aid the development of new asphalt products. The software was used to simulate a diametral compression test on an asphalt mix, and its results are deemed satisfactory when compared to the results obtained experimentally. (author)

  1. Automatically-Programed Machine Tools

    Science.gov (United States)

    Purves, L.; Clerman, N.

    1985-01-01

    Software produces cutter location files for numerically-controlled machine tools. APT, acronym for Automatically Programed Tools, is among most widely used software systems for computerized machine tools. APT developed for explicit purpose of providing effective software system for programing NC machine tools. APT system includes specification of APT programing language and language processor, which executes APT statements and generates NC machine-tool motions specified by APT statements.

  2. Improvement of E and P PETROBRAS maintenance program by an RCM tool; MCC Net: maintenance plan revision software based on RCM (qualitative method)

    Energy Technology Data Exchange (ETDEWEB)

    Frydman, Bernardo; Okada, Ricardo Yoshinori [PETROBRAS, Rio de Janeiro, RJ (Brazil). Exploracao e Producao; Souza, Arleniro Oliveira de [PETROBRAS, AM (Brazil). Unidade de Exploracao e Producao da Bacia do Solimoes; Frazao, Nelson A. [PETROBRAS, Macae, RJ (Brazil). Unidade de Negocios da Bacia de Campos

    2004-07-01

    The objectives of this paper are to recall some basic concepts necessary for understanding the Reliability Centered Maintenance (RCM) technique; to report on the RCM applications already developed in the exploration and production segment at PETROBRAS, with some examples of the results obtained; to present the premises underlying the development of the MCCNet software, a tool developed by PETROBRAS E and P for applying the RCM technique to the elaboration and/or revision of preventive maintenance plans for equipment or systems, which can be used to study any type of process; and to present the future vision for RCM in this segment of the company. (author)

  3. CMS software deployment on OSG

    Energy Technology Data Exchange (ETDEWEB)

    Kim, B; Avery, P [University of Florida, Gainesville, FL 32611 (United States); Thomas, M [California Institute of Technology, Pasadena, CA 91125 (United States); Wuerthwein, F [University of California at San Diego, La Jolla, CA 92093 (United States)], E-mail: bockjoo@phys.ufl.edu, E-mail: thomas@hep.caltech.edu, E-mail: avery@phys.ufl.edu, E-mail: fkw@fnal.gov

    2008-07-15

    A set of software deployment tools has been developed for the installation, verification, and removal of a CMS software release. The tools, which are mainly targeted at deployment on the OSG, feature instant release deployment, corrective resubmission of the initial installation job, and an independent web-based deployment portal with a Grid Security Infrastructure login mechanism. We have performed over 500 installations and found the tools reliable and adaptable in coping with changes in the Grid computing environment and in the software releases. We present the design of the tools, statistics gathered during their operation, and our experience with CMS software deployment on the OSG Grid computing environment.

  4. PET-Tool: a software suite for comprehensive processing and managing of Paired-End diTag (PET) sequence data

    Directory of Open Access Journals (Sweden)

    Wei Chia-Lin

    2006-08-01

    Full Text Available Abstract Background We recently developed the Paired End diTag (PET) strategy for efficient characterization of mammalian transcriptomes and genomes. The paired-end nature of short PET sequences derived from long DNA fragments raised a new set of bioinformatics challenges, including how to extract PETs from raw sequence reads, and how to map PETs to reference genome sequences correctly yet efficiently. To accommodate and streamline data analysis of the large volume of PET sequences generated from each PET experiment, an automated PET data processing pipeline is desirable. Results We designed an integrated computation program package, PET-Tool, to automatically process PET sequences and map them to the genome sequences. The Tool was implemented as a web-based application composed of four modules: the Extractor module for PET extraction; the Examiner module for analytic evaluation of PET sequence quality; the Mapper module for locating PET sequences in the genome sequences; and the ProjectManager module for data organization. The performance of PET-Tool was evaluated through the analyses of 2.7 million PET sequences. It was demonstrated that PET-Tool is accurate and efficient in extracting PET sequences and removing artifacts from large-volume datasets. Using optimized mapping criteria, over 70% of quality PET sequences were mapped specifically to the genome sequences. With a 2.4 GHz LINUX machine, it takes approximately six hours to process one million PETs from extraction to mapping. Conclusion The speed, accuracy, and comprehensiveness have proved that PET-Tool is an important and useful component in PET experiments, and it can be extended to accommodate other related analyses of paired-end sequences. The Tool also provides user-friendly functions for data quality checks and a system for multi-layer data management.
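
    The Extractor step can be pictured as locating a known linker inside each raw read and cutting out the flanking 5' and 3' tags. The Python sketch below is a guess at that general shape, with an invented spacer sequence and tag length; it is not PET-Tool's actual algorithm.

        SPACER = "ACGTACGT"   # hypothetical linker between the two tags
        TAG_LEN = 18          # hypothetical tag length

        def extract_pet(read):
            i = read.find(SPACER)
            if i < TAG_LEN or len(read) < i + len(SPACER) + TAG_LEN:
                return None   # no spacer or tags too short: discard as artifact
            five_prime = read[i - TAG_LEN:i]
            three_prime = read[i + len(SPACER):i + len(SPACER) + TAG_LEN]
            return five_prime, three_prime

        print(extract_pet("GATTACA" * 3 + SPACER + "CATCATCAT" * 3))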

  5. Software Engineering for Human Spaceflight

    Science.gov (United States)

    Fredrickson, Steven E.

    2014-01-01

    The Spacecraft Software Engineering Branch of NASA Johnson Space Center (JSC) provides world-class products, leadership, and technical expertise in software engineering, processes, technology, and systems management for human spaceflight. The branch contributes to major NASA programs (e.g. ISS, MPCV/Orion) with in-house software development and prime contractor oversight, and maintains the JSC Engineering Directorate CMMI rating for flight software development. Software engineering teams work with hardware developers, mission planners, and system operators to integrate flight vehicles, habitats, robotics, and other spacecraft elements. They seek to infuse automation and autonomy into missions, and apply new technologies to flight processor and computational architectures. This presentation will provide an overview of key software-related projects, software methodologies and tools, and technology pursuits of interest to the JSC Spacecraft Software Engineering Branch.

  6. User’s Guide for T.E.S.T. (version 4.2) (Toxicity Estimation Software Tool) A Program to Estimate Toxicity from Molecular Structure

    Science.gov (United States)

    The user's guide describes the methods used by TEST to predict toxicity and physical properties (including the new mode of action based method used to predict acute aquatic toxicity). It describes all of the experimental data sets included in the tool. It gives the prediction res...

  7. A software tool to model genetic regulatory networks. Applications to the modeling of threshold phenomena and of spatial patterning in Drosophila.

    Directory of Open Access Journals (Sweden)

    Rui Dilão

    Full Text Available We present a general methodology in order to build mathematical models of genetic regulatory networks. This approach is based on the mass action law and on the Jacob and Monod operon model. The mathematical models are built symbolically by the Mathematica software package GeneticNetworks. This package accepts as input the interaction graphs of the transcriptional activators and repressors of a biological process and, as output, gives the mathematical model in the form of a system of ordinary differential equations. All the relevant biological parameters are chosen automatically by the software. Within this framework, we show that concentration dependent threshold effects in biology emerge from the catalytic properties of genes and its associated conservation laws. We apply this methodology to the segment patterning in Drosophila early development and we calibrate the genetic transcriptional network responsible for the patterning of the gap gene proteins Hunchback and Knirps, along the antero-posterior axis of the Drosophila embryo. In this approach, the zygotically produced proteins Hunchback and Knirps do not diffuse along the antero-posterior axis of the embryo of Drosophila, developing a spatial pattern due to concentration dependent thresholds. This shows that patterning at the gap genes stage can be explained by the concentration gradients along the embryo of the transcriptional regulators.
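
    The package's output is a system of ordinary differential equations built from mass-action terms. The Python sketch below shows the kind of system that results, for one gene product driven by a constant activator and repressor; the rate constants are arbitrary and the model is illustrative, not one generated by GeneticNetworks.

        # Illustrative mass-action ODE for one gene product P, activated by A
        # and repressed by R (both held constant); rate constants are arbitrary.
        from scipy.integrate import solve_ivp

        K_ACT, K_REP, K_DEG = 1.0, 2.0, 0.5

        def rhs(t, y):
            a, r, p = y                       # activator, repressor, product
            dp = K_ACT * a - K_REP * r * p - K_DEG * p
            return [0.0, 0.0, dp]             # A and R do not change here

        sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.2, 0.0])
        print(f"product level at t=10: {sol.y[2, -1]:.3f}")  # approaches 1/0.9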

  8. Advanced Software Protection Now

    CERN Document Server

    Bendersky, Diego; Notarfrancesco, Luciano; Sarraute, Carlos; Waissbein, Ariel

    2010-01-01

    Software digital rights management is a pressing need for the software development industry, and it remains unmet, as no practical solutions have been acclaimed successful by the industry. We introduce a novel software-protection method, fully implemented with today's technologies, that provides traitor tracing and license enforcement and requires no additional hardware nor inter-connectivity. Our work benefits from the use of secure triggers, a cryptographic primitive that is secure assuming the existence of an ind-cpa secure block cipher. Using our framework, developers may insert license checks and fingerprints, and obfuscate the code using secure triggers. As a result, this raises the cost for software analysis tools to detect and modify protection mechanisms, thus raising the complexity of cracking this system.
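
    A secure trigger can be pictured as a payload that stays encrypted until the program receives the exact triggering input. The Python toy below substitutes a SHA-256 keystream for the ind-cpa secure block cipher the paper assumes; it conveys the idea only and is not a secure implementation.

        import hashlib

        def keystream(key: bytes, n: int) -> bytes:
            # Derive n pseudo-random bytes from the key (toy construction).
            out = b""
            counter = 0
            while len(out) < n:
                out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
                counter += 1
            return out[:n]

        def protect(payload: bytes, trigger: bytes) -> bytes:
            # XOR with a trigger-derived keystream; the same call decrypts.
            return bytes(a ^ b for a, b in
                         zip(payload, keystream(trigger, len(payload))))

        secret = protect(b"licensed feature unlocked", b"correct input")
        # A wrong input yields garbage; the right input recovers the payload.
        print(protect(secret, b"correct input").decode())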

  9. SESAME: a software tool for the numerical dosimetric reconstruction of radiological accidents involving external sources and its application to the accident in Chile in December 2005.

    Science.gov (United States)

    Huet, C; Lemosquet, A; Clairand, I; Rioual, J B; Franck, D; de Carlan, L; Aubineau-Lanièce, I; Bottollier-Depois, J F

    2009-01-01

    Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. This dose distribution can be assessed by physical dosimetric reconstruction methods. Physical dosimetric reconstruction can be achieved using experimental or numerical techniques. This article presents the laboratory-developed SESAME--Simulation of External Source Accident with MEdical images--tool specific to dosimetric reconstruction of radiological accidents through numerical simulations which combine voxel geometry with the MCNP(X) Monte Carlo radiation-matter interaction computer code. The experimental validation of the tool using a photon field and its application to a radiological accident in Chile in December 2005 are also described.

  10. Software engineering

    CERN Document Server

    Sommerville, Ian

    2016-01-01

    For courses in computer science and software engineering The Fundamental Practice of Software Engineering Software Engineering introduces readers to the overwhelmingly important subject of software programming and development. In the past few years, computer systems have come to dominate not just our technological growth, but the foundations of our world's major industries. This text seeks to lay out the fundamental concepts of this huge and continually growing subject area in a clear and comprehensive manner. The Tenth Edition contains new information that highlights various technological updates of recent years, providing readers with highly relevant and current information. Sommerville's experience in system dependability and systems engineering guides the text through a traditional plan-based approach that incorporates some novel agile methods. The text strives to teach the innovators of tomorrow how to create software that will make our world a better, safer, and more advanced place to live.

  11. In praise of open software

    CERN Multimedia

    2000-01-01

    Much scientific software is proprietary and beyond the reach of poorer scientific communities. This issue will become critical as companies build bioinformatics tools for genomics. The principle of open-source software needs to be defended by academic research institutions (1/2 p).

  12. A Software Configuration Management Course

    DEFF Research Database (Denmark)

    Asklund, U.; Bendix, Lars Gotfred

    2003-01-01

    Software Configuration Management has been a big success in research and creation of tools. There are also many vendors in the market selling courses to companies. However, in the education sector Software Configuration Management has still not quite made it - at least not into the university... and contents of such a course.

  13. Software Management Environment (SME): Components and algorithms

    Science.gov (United States)

    Hendrick, Robert; Kistler, David; Valett, Jon

    1994-01-01

    This document presents the components and algorithms of the Software Management Environment (SME), a management tool developed for the Software Engineering Branch (Code 552) of the Flight Dynamics Division (FDD) of the Goddard Space Flight Center (GSFC). The SME provides an integrated set of visually oriented experienced-based tools that can assist software development managers in managing and planning software development projects. This document describes and illustrates the analysis functions that underlie the SME's project monitoring, estimation, and planning tools. 'SME Components and Algorithms' is a companion reference to 'SME Concepts and Architecture' and 'Software Engineering Laboratory (SEL) Relationships, Models, and Management Rules.'

  14. Educational software as a tool for teaching & learning of digital mammography

    Directory of Open Access Journals (Sweden)

    Simone Elias

    2009-04-01

    Full Text Available OBJECTIVE: To evaluate the impact on residents' training of a computational tool dedicated to assessing performance in the reading of conventional and digital radiological images. MATERIALS AND METHODS: The training was accomplished in the Laboratory of Medical Images Qualification (QualIM - Laboratório de Qualificação de Imagens Médicas). Residents in radiology performed approximately 1,000 readings of a set of 60 images acquired from a statistical phantom (Alvim®) presenting fibers and microcalcifications of varying sizes. The residents' performance in detecting these structures was evaluated by means of statistical parameters. RESULTS: Detectability probabilities were 0.789 and 0.818 for the conventional and digital systems, respectively. False-positive rates were 8% and 6%, and true-positive rates were 66% and 70%, respectively. The overall kappa value was 0.553 for readings on a viewbox and 0.615 for readings on a monitor. The area under the ROC curve was 0.716 for film reading and 0.810 for monitor reading. CONCLUSION: The proposed training proved to be effective and had a positive impact on the residents' performance, constituting an interesting pedagogical tool. The results suggest that a training method based on phantom reading can improve professionals' performance in the interpretation of mammographic images.

  15. Hard- and software of real time simulation tools of Electric Power System for adequate modeling power semiconductors in voltage source convertor based HVDC and FACTS

    Directory of Open Access Journals (Sweden)

    Ufa Ruslan A.

    2014-01-01

    Full Text Available The motivation for the presented research is the need for new methods and tools for the adequate simulation of Flexible Alternating Current Transmission System (FACTS) devices and High Voltage Direct Current (HVDC) transmission systems as part of real electric power systems (EPS). To that end, a hybrid approach for the advanced simulation of FACTS and HVDC based on voltage source converters (VSC) is proposed. The presented simulation results of the developed hybrid VSC model confirm that the desired properties of the model are achieved and that the proposed solutions are effective.

  16. Generic domain models in software engineering

    Science.gov (United States)

    Maiden, Neil

    1992-01-01

    This paper outlines three research directions related to domain-specific software development: (1) reuse of generic models for domain-specific software development; (2) empirical evidence to determine these generic models, namely elicitation of mental knowledge schema possessed by expert software developers; and (3) exploitation of generic domain models to assist modelling of specific applications. It focuses on knowledge acquisition for domain-specific software development, with emphasis on tool support for the most important phases of software development.

  17. Master Pump Shutdown MPS Software Quality Assurance Plan (SQAP)

    Energy Technology Data Exchange (ETDEWEB)

    BEVINS, R.R.

    2000-09-20

    The MPSS Software Quality Assurance Plan (SQAP) describes the tools and strategy used in the development of the MPSS software. The document also describes the methodology for controlling and managing changes to the software.

  18. Software requirements

    CERN Document Server

    Wiegers, Karl E

    2003-01-01

    Without formal, verifiable software requirements-and an effective system for managing them-the programs that developers think they've agreed to build often will not be the same products their customers are expecting. In SOFTWARE REQUIREMENTS, Second Edition, requirements engineering authority Karl Wiegers amplifies the best practices presented in his original award-winning text?now a mainstay for anyone participating in the software development process. In this book, you'll discover effective techniques for managing the requirements engineering process all the way through the development cy

  19. Analyzing Two-Phase Single-Case Data with Non-overlap and Mean Difference Indices: Illustration, Software Tools, and Alternatives.

    Science.gov (United States)

    Manolov, Rumen; Losada, José L; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana

    2016-01-01

    Two-phase single-case designs, including baseline evaluation followed by an intervention, represent the most clinically straightforward option for combining professional practice and research. However, unless they are part of a multiple-baseline schedule, such designs do not allow demonstrating a causal relation between the intervention and the behavior. Although the statistical options reviewed here cannot help overcoming this methodological limitation, we aim to make practitioners and applied researchers aware of the available appropriate options for extracting maximum information from the data. In the current paper, we suggest that the evaluation of behavioral change should include visual and quantitative analyses, complementing the substantive criteria regarding the practical importance of the behavioral change. Specifically, we emphasize the need to use structured criteria for visual analysis, such as the ones summarized in the What Works Clearinghouse Standards, especially if such criteria are complemented by visual aids, as illustrated here. For quantitative analysis, we focus on the non-overlap of all pairs and the slope and level change procedure, as they offer straightforward information and have shown reasonable performance. An illustration is provided of the use of these three pieces of information: visual, quantitative, and substantive. To make the use of visual and quantitative analysis feasible, open source software is referred to and demonstrated. In order to provide practitioners and applied researchers with a more complete guide, several analytical alternatives are commented on pointing out the situations (aims, data patterns) for which these are potentially useful.
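
    Of the two indices highlighted, the non-overlap of all pairs (NAP) is simple enough to state directly: the proportion of baseline/intervention data pairs in which the intervention point improves on the baseline point, with ties counted as half. A small Python sketch, using invented data, follows.

        def nap(baseline, intervention, increase_is_improvement=True):
            # Compare every baseline point with every intervention point.
            pairs = [(a, b) for a in baseline for b in intervention]
            if not increase_is_improvement:
                pairs = [(b, a) for a, b in pairs]
            score = sum(1.0 if b > a else 0.5 if b == a else 0.0
                        for a, b in pairs)
            return score / len(pairs)

        print(nap([3, 4, 4, 5], [6, 7, 5, 8]))  # 1.0 means complete non-overlap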

  20. Software Reliability through Theorem Proving

    Directory of Open Access Journals (Sweden)

    S.G.K. Murthy

    2009-05-01

    Full Text Available Improving the software reliability of mission-critical systems is widely recognised as one of the major challenges. Early detection of errors in software requirements, designs and implementation needs rigorous verification and validation techniques. Several techniques comprising static and dynamic testing approaches are used to improve the reliability of mission-critical software; however, it is hard to balance development time and budget against software reliability. In particular, using dynamic testing techniques it is hard to ensure software reliability, as exhaustive testing is not possible. On the other hand, formal verification techniques utilise mathematical logic to prove the correctness of the software based on given specifications, which in turn improves its reliability. Theorem proving is a powerful formal verification technique that enhances software reliability for mission-critical aerospace applications. This paper discusses the issues related to software reliability and the use of theorem proving to enhance software reliability through formal verification, based on experiences with the STeP tool, using the conventional and internationally accepted methodologies, models and theorem-proving techniques available in the tool, without proposing a new model. Defence Science Journal, 2009, 59(3), pp. 314-317, DOI: http://dx.doi.org/10.14429/dsj.59.1527
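
    For readers unfamiliar with theorem proving, the following small example (in Lean 4, a modern proof assistant, not the STeP tool used in the paper) conveys the flavor: the checker accepts the file only if the proof is actually valid.

        -- Machine-checked fact: every natural number is at most its double.
        theorem le_double (n : Nat) : n ≤ n + n :=
          Nat.le_add_left n n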