WorldWideScience

Sample records for current software tools

  1. A Software Tool for the Evaluation of the Behaviour of Bioelectrical Currents

    Directory of Open Access Journals (Sweden)

    Gianluca Fabbri

    2011-06-01

    A software tool has been developed in order to evaluate bioelectrical currents. The tool is able to provide a graphical representation of the behaviour of small currents emitted by characteristic points of the human body and captured through a non-invasive probe previously developed. The software implementation combines a variety of graphical techniques to create a powerful system that will enable users to perform an accurate and reliable analysis of the emitted currents and to easily go on to further applications and research. This paper introduces the design and the main characteristics of the tool and shows significant measurement results.

  2. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical and electronic engineering. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed, and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has increased only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for a general evaluation of computer-assisted software engineering (CASE) tools, based on actual installation of and experimentation with some specific tools.

  3. Software engineering tools.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1994-01-01

    We have looked at general descriptions and illustrations of several software development tools, such as tools for prototyping, developing DFDs, testing, and maintenance. Many others are available, and new ones are being developed. However, you have at least seen some examples of powerful CASE tools for systems development.

  4. Machine Tool Software

    Science.gov (United States)

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using Automatically Programmed Tool (APT) software since 1969 in his CAD/CAM (Computer Aided Design and Manufacturing) curriculum. Professor Hack teaches the use of APT programming languages for control of metal cutting machines. Machine tool instructions are geometry definitions written in the APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  5. Modern Tools for Modern Software

    Energy Technology Data Exchange (ETDEWEB)

    Kumfert, G; Epperly, T

    2001-10-31

    This is a proposal for a new software configure/build tool for building, maintaining, deploying, and installing software. At its completion, this new tool will replace current standard tool suites such as 'autoconf', 'automake', 'libtool', and the de facto standard build tool, 'make'. This ambitious project is born out of the realization that as scientific software has grown in size and complexity over the years, the difficulty of configuring and building software has increased as well. For high performance scientific software, additional complexities often arise from the need for portability to multiple platforms (including many one-of-a-kind platforms), multilanguage implementations, use of third party libraries, and a need to adapt algorithms to the specific features of the hardware. Development of scientific software is being hampered by the quality of configuration and build tools commonly available. Inordinate amounts of time and expertise are required to develop and maintain the configure and build system for a moderately complex project. Better build and configure tools will increase developer productivity. This proposal is a first step in a process of shoring up the foundation upon which DOE software is created and used.

  6. Software Tool Issues

    Science.gov (United States)

    Hennell, Michael

    This chapter relies on experience with tool development gained over the last thirty years. It shows that there are a large number of techniques that contribute to any successful project, and that formality is always the key: a modern software test tool is based on a firm mathematical foundation. After a brief introduction, Section 2 recalls and extends the terminology of Chapter 1. Section 3 discusses the design of different sorts of static and dynamic analysis tools. Nine important issues to be taken into consideration when evaluating such tools are presented in Section 4. Section 5 investigates the interplay between testing and proof. In Section 6, we call for developers to take their own medicine and verify their tools. Finally, we conclude in Section 7 with a summary of our main messages, emphasising the important role of testing.

  7. CSAM Metrology Software Tool

    Science.gov (United States)

    Vu, Duc; Sandor, Michael; Agarwal, Shri

    2005-01-01

    CSAM Metrology Software Tool (CMeST) is a computer program for analysis of false-color CSAM images of plastic-encapsulated microcircuits. (CSAM signifies C-mode scanning acoustic microscopy.) The colors in the images indicate areas of delamination within the plastic packages. Heretofore, the images have been interpreted by human examiners. Hence, interpretations have not been entirely consistent and objective. CMeST processes the color information in image-data files to detect areas of delamination without incurring the inconsistencies of subjective judgement. CMeST can be used to create a database of baseline images of packages acquired at given times for comparison with images of the same packages acquired at later times. Any area within an image can be selected for analysis, which can include examination of different delamination types by location. CMeST can also be used to perform statistical analyses of image data. Results of analyses are available in a spreadsheet format for further processing. The results can be exported to any database-processing software.
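
    As an illustration of the kind of processing described above, the following minimal Python sketch flags delamination by color-matching pixels of a false-color image against a palette and reporting the matched fraction of a selected area. The palette, tolerance, and function names are hypothetical stand-ins, not CMeST's actual implementation.

    import numpy as np

    # Hypothetical sketch: flag delamination pixels in a false-color CSAM
    # image by matching colors assumed to mark delaminated areas. The color
    # table below is illustrative, not CMeST's actual palette.
    DELAM_COLORS = np.array([[255, 0, 0], [255, 255, 0]])  # e.g. red, yellow
    TOLERANCE = 30  # per-channel color-match tolerance

    def delamination_fraction(image, region=None):
        """Return the fraction of pixels in `region` matching a delamination color.

        image  -- H x W x 3 uint8 array (a decoded false-color CSAM image)
        region -- optional (row_slice, col_slice) selecting an area of interest
        """
        if region is not None:
            image = image[region]
        pixels = image.reshape(-1, 3).astype(int)
        # A pixel "matches" if it is within TOLERANCE of any delamination color.
        diffs = np.abs(pixels[:, None, :] - DELAM_COLORS[None, :, :])
        matches = (diffs <= TOLERANCE).all(axis=2).any(axis=1)
        return float(matches.mean())

    # Baseline comparison over time, as the abstract describes: store the
    # fraction for a baseline image and compare against later acquisitions
    # of the same package to detect growth in the delaminated area.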

  8. Biological Imaging Software Tools

    Science.gov (United States)

    Eliceiri, Kevin W.; Berthold, Michael R.; Goldberg, Ilya G.; Ibáñez, Luis; Manjunath, B.S.; Martone, Maryann E.; Murphy, Robert F.; Peng, Hanchuan; Plant, Anne L.; Roysam, Badrinath; Stuurman, Nico; Swedlow, Jason R.; Tomancak, Pavel; Carpenter, Anne E.

    2013-01-01

    Few technologies are more widespread in modern biological laboratories than imaging. Recent advances in optical technologies and instrumentation are providing hitherto unimagined capabilities. Almost all these advances have required the development of software to enable the acquisition, management, analysis, and visualization of the imaging data. We review each computational step that biologists encounter when dealing with digital images, the challenges in that domain, and the overall status of available software for bioimage informatics, focusing on open source options. PMID:22743775

  9. Software Tools Streamline Project Management

    Science.gov (United States)

    2009-01-01

    Three innovative software inventions from Ames Research Center (NETMARK, Program Management Tool, and Query-Based Document Management) are finding their way into NASA missions as well as industry applications. The first, NETMARK, is a program that enables integrated searching of data stored in a variety of databases and documents, meaning that users no longer have to look in several places for related information. NETMARK allows users to search and query information across all of these sources in one step. This cross-cutting capability in information analysis has dramatically reduced the amount of time needed to mine data, from days or weeks to mere seconds. NETMARK has been used widely throughout NASA, enabling this automatic integration of information across many documents and databases. NASA projects that use NETMARK include the internal reporting system and project performance dashboard; Erasmus, NASA's enterprise management tool, which enhances organizational collaboration and information sharing through document routing and review; the Integrated Financial Management Program; International Space Station Knowledge Management; the Mishap and Anomaly Information Reporting System; and management of the Mars Exploration Rovers. Approximately $1 billion worth of NASA's projects are currently managed using the Program Management Tool (PMT), which is based on NETMARK. PMT is a comprehensive, Web-enabled application tool used to assist program and project managers within NASA enterprises in monitoring, disseminating, and tracking the progress of program and project milestones and other relevant resources. The PMT consists of an integrated knowledge repository built upon advanced enterprise-wide database integration techniques and the latest Web-enabled technologies. The current system is in a pilot operational mode allowing users to automatically manage, track, define, update, and view customizable milestone objectives and goals. The third software invention is Query-Based Document Management.

  10. Tool Use Within NASA Software Quality Assurance

    Science.gov (United States)

    Shigeta, Denise; Port, Dan; Nikora, Allen P.; Wilf, Joel

    2013-01-01

    As space mission software systems become larger and more complex, it is increasingly important for the software assurance effort to have the ability to effectively assess both the artifacts produced during software system development and the development process itself. Conceptually, assurance is a straightforward idea - it is the result of activities carried out by an organization independent of the software developers to better inform project management of potential technical and programmatic risks, and thus increase management's confidence in the decisions they ultimately make. In practice, effective assurance for large, complex systems often entails assessing large, complex software artifacts (e.g., requirements specifications, architectural descriptions) as well as substantial amounts of unstructured information (e.g., anomaly reports resulting from testing activities during development). In such an environment, assurance engineers can benefit greatly from appropriate tool support. In order to do so, an assurance organization will need accurate and timely information on the tool support available for various types of assurance activities. In this paper, we investigate the current use of tool support for assurance organizations within NASA, and describe on-going work at JPL for providing assurance organizations with the information about tools they need to use them effectively.

  11. Design of parametric software tools

    DEFF Research Database (Denmark)

    Sabra, Jakob Borrits; Mullins, Michael

    2011-01-01

    The studies investigate the field of evidence-based design used in architectural design practice and propose a method using 2D/3D CAD applications to: 1) enhance integration of evidence-based design knowledge in architectural design phases with a focus on lighting and interior design and 2) assess fulfilment of evidence-based design criteria regarding light distribution and location in relation to patient safety in architectural health care design proposals. The study uses the 2D/3D CAD modelling software Rhinoceros 3D with the plug-in Grasshopper to create parametric tool prototypes that exemplify the operations and functions of the design method. To evaluate the prototype potentials, surveys with architectural and healthcare design companies are conducted. Evaluation is done by the administration of questionnaires being part of the development of the tools. The results show that architects, designers...

  12. Tools for Embedded Computing Systems Software

    Science.gov (United States)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talk and the key figures of each workshop presentation, together with the chairmen's summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  13. Software Engineering Tools for Scientific Models

    Science.gov (United States)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere-ocean model called ModelE, written in fixed-format Fortran.

  14. Software engineering the current practice

    CERN Document Server

    Rajlich, Vaclav

    2011-01-01

    INTRODUCTION: History of Software Engineering; Software Properties; Origins of Software; Birth of Software Engineering; Third Paradigm: Iterative Approach; Software Life Span Models; Staged Model; Variants of Staged Model; Software Technologies: Programming Languages and Compilers; Object-Oriented Technology; Version Control System; Software Models: Class Diagrams; UML Activity Diagrams; Class Dependency Graphs and Contracts. SOFTWARE CHANGE: Introduction to Software Change; Characteristics of Software Change; Phases of Software Change; Requirements and Their Elicitation; Requirements Analysis and Change Initiation; Concepts and Concept...

  15. Tools & training for more secure software

    CERN Document Server

    CERN. Geneva

    2017-01-01

    Just by fate of nature, software today is shipped out as “beta”, coming with vulnerabilities and weaknesses, which should already have been fixed at the programming stage. This presentation will show the consequences of suboptimal software, why good programming, thorough software design, and a proper software development process is imperative for the overall security of the Organization, and how a few simple tools and training are supposed to make CERN software more secure.

  16. Technology Transfer Challenges for High-Assurance Software Engineering Tools

    Science.gov (United States)

    Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.

    2003-01-01

    In this paper, we describe our experience with the challenges that we are currently facing in our effort to develop advanced software verification and validation tools. We categorize these challenges into several areas: cost-benefit modeling, tool usability, customer application domain, and organizational issues. We provide examples of challenges in each area and identify open research issues in areas which limit our ability to transfer high-assurance software engineering tools into practice.

  17. FFI: A software tool for ecological monitoring

    Science.gov (United States)

    Duncan C. Lutes; Nathan C. Benson; MaryBeth Keifer; John F. Caratti; S. Austin Streetman

    2009-01-01

    A new monitoring tool called FFI (FEAT/FIREMON Integrated) has been developed to assist managers with collection, storage and analysis of ecological information. The tool was developed through the complementary integration of two fire effects monitoring systems commonly used in the United States: FIREMON and the Fire Ecology Assessment Tool. FFI provides software...

  18. Software management tools: Lessons learned from use

    Science.gov (United States)

    Reifer, D. J.; Valett, J.; Knight, J.; Wenneson, G.

    1985-01-01

    Experience in inserting software project planning tools into more than 100 projects producing mission-critical software is discussed. The problems the software project manager faces are listed, along with methods and tools available to handle them. Experience is reported with the Project Manager's Workstation (PMW) and the SoftCost-R cost estimating package. Finally, the results of a survey, which looked at what could be done in the future to overcome the problems experienced and build a set of truly useful tools, are presented.

  19. A software tool for network intrusion detection

    CSIR Research Space (South Africa)

    Van der Walt, C

    2012-10-01

    This presentation illustrates how a recently developed software tool enables operators to easily monitor a network and detect intrusions without requiring expert knowledge of network intrusion detection.

  20. Assessment and Development of Software Engineering Tools

    Science.gov (United States)

    1991-01-16

    Assessment (REA) tool would advise a potential software reuser on the tradeoffs between reusing a RSC versus developing a brand new software product...of memberships in the key RSC reusability attributes; e.g., size, structure, or documentation, etc., all of which would be weighted by reuser

  1. Software tool for portal dosimetry research.

    Science.gov (United States)

    Vial, P; Hunt, P; Greer, P B; Oliver, L; Baldock, C

    2008-09-01

    This paper describes a software tool developed for research into the use of an electronic portal imaging device (EPID) to verify dose for intensity modulated radiation therapy (IMRT) beams. A portal dose image prediction (PDIP) model that predicts the EPID response to IMRT beams has been implemented into a commercially available treatment planning system (TPS). The software tool described in this work was developed to modify the TPS PDIP model by incorporating correction factors into the predicted EPID image to account for the difference in EPID response to open beam radiation and multileaf collimator (MLC) transmitted radiation. The processes performed by the software tool include: (i) reading the MLC file and the PDIP from the TPS; (ii) calculating the fraction of beam-on time that each point in the IMRT beam is shielded by MLC leaves; (iii) interpolating correction factors from look-up tables; (iv) creating a corrected PDIP image from the product of the original PDIP and the correction factors and writing the corrected image to file; and (v) displaying, analysing, and exporting various image datasets. The software tool was developed using the Microsoft Visual Studio .NET framework with the C# compiler. The operation of the software tool was validated. This software provided useful tools for EPID dosimetry research, and it is being utilised and further developed in ongoing EPID dosimetry and IMRT dosimetry projects.
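
    The correction step (iv) reduces to an element-wise product of the predicted image with factors interpolated from look-up tables (step iii). A minimal Python sketch of that arithmetic follows; the table values and function names are invented for illustration and are not the paper's measured data.

    import numpy as np

    # Look-up table indexed by the fraction of beam-on time a point is
    # shielded by MLC leaves (step ii). Values here are placeholders.
    lut_shielded_fraction = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
    lut_correction = np.array([1.00, 1.02, 1.05, 1.09, 1.15])

    def correct_pdip(pdip, shielded_fraction):
        """Apply MLC-transmission correction factors to a predicted EPID image.

        pdip              -- 2D array, TPS-predicted portal dose image
        shielded_fraction -- 2D array (same shape), fraction of beam-on time
                             each point is shielded by MLC leaves
        """
        factors = np.interp(shielded_fraction.ravel(),
                            lut_shielded_fraction, lut_correction)
        # Step (iv): element-wise product of prediction and correction factors.
        return pdip * factors.reshape(pdip.shape)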

  2. A Software Tool for Legal Drafting

    CERN Document Server

    Gorín, Daniel; Schapachnik, Fernando; 10.4204/EPTCS.68.7

    2011-01-01

    Although many attempts at automated aids for legal drafting have been made, they were based on the construction of a new tool, completely from scratch. This is at least curious, considering that a strong parallelism can be established between a normative document and a software specification: both describe what an entity should or should not do, can or cannot do. In this article we compare normative documents and software specifications to find out their similarities and differences. The comparison shows that there are distinctive particularities, but they are restricted to a very specific subclass of normative propositions. The rest, we postulate, can be dealt with by software tools. For such an enterprise the FormaLex tool set was devised: an LTL-based language and companion tools that utilize model checking to find out normative incoherences in regulations, contracts and other legal documents. A feature-rich case study is analyzed with the presented tools.

  3. A Software Tool for Legal Drafting

    Directory of Open Access Journals (Sweden)

    Daniel Gorín

    2011-09-01

    Although many attempts at automated aids for legal drafting have been made, they were based on the construction of a new tool, completely from scratch. This is at least curious, considering that a strong parallelism can be established between a normative document and a software specification: both describe what an entity should or should not do, can or cannot do. In this article we compare normative documents and software specifications to find out their similarities and differences. The comparison shows that there are distinctive particularities, but they are restricted to a very specific subclass of normative propositions. The rest, we postulate, can be dealt with by software tools. For such an enterprise the FormaLex tool set was devised: an LTL-based language and companion tools that utilize model checking to find out normative incoherences in regulations, contracts and other legal documents. A feature-rich case study is analyzed with the presented tools.

  4. Classifying Desirable Features of Software Visualization Tools for Corrective Maintenance

    NARCIS (Netherlands)

    Sensalire, Mariam; Ogao, Patrick; Telea, Alexandru

    2008-01-01

    We provide an evaluation of 15 software visualization tools applicable to corrective maintenance. The tasks supported as well as the techniques used are presented and graded based on the support level. By analyzing user acceptance of current tools, we aim to help developers to select what to consider...

  5. Software tool for physics chart checks.

    Science.gov (United States)

    Li, H Harold; Wu, Yu; Yang, Deshan; Mutic, Sasa

    2014-01-01

    Physics chart check has long been a central quality assurance (QA) measure in radiation oncology. The purpose of this work is to describe a software tool that aims to accomplish simplification, standardization, automation, and forced functions in the process. Nationally recognized guidelines, including American College of Radiology and American Society for Radiation Oncology guidelines and technical standards, and the American Association of Physicists in Medicine Task Group reports were identified, studied, and summarized. Meanwhile, the reported events related to the physics chart check service were analyzed using an event reporting and learning system. A number of shortfalls in the chart check process were identified. To address these problems, a software tool was designed and developed under Microsoft .NET in C# to hardwire as many components as possible at each stage of the process. The software consists of the following 4 independent modules: (1) chart check management; (2) pretreatment and during-treatment chart check assistant; (3) posttreatment chart check assistant; and (4) quarterly peer-review management. The users were a large group of physicists in the author's radiation oncology clinic. During more than a year of use, the tool has proven very helpful in chart check management, communication, documentation, and maintaining consistency. The software tool presented in this work aims to assist physicists at each stage of the physics chart check process. The software tool is potentially useful for any radiation oncology clinic that is either in the process of pursuing or maintaining American College of Radiology accreditation.

  6. Tool Support for Software Lookup Table Optimization

    Directory of Open Access Journals (Sweden)

    Chris Wilcox

    2011-01-01

    A number of scientific applications are performance-limited by expressions that repeatedly call costly elementary functions. Lookup table (LUT) optimization accelerates the evaluation of such functions by reusing previously computed results. LUT methods can speed up applications that tolerate an approximation of function results, thereby achieving a high level of fuzzy reuse. One problem with LUT optimization is the difficulty of controlling the tradeoff between performance and accuracy. The current practice of manual LUT optimization adds programming effort by requiring extensive experimentation to make this tradeoff, and such hand tuning can obfuscate algorithms. In this paper we describe a methodology and tool implementation to improve the application of software LUT optimization. Our Mesa tool implements source-to-source transformations for C or C++ code to automate the tedious and error-prone aspects of LUT generation such as domain profiling, error analysis, and code generation. We evaluate Mesa with five scientific applications. Our results show a performance improvement of 3.0× and 6.9× for two molecular biology algorithms, 1.4× for a molecular dynamics program, 2.1× to 2.8× for a neural network application, and 4.6× for a hydrology calculation. We find that Mesa enables LUT optimization with more control over accuracy and less effort than manual approaches.
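
    For readers unfamiliar with the technique, the following small Python sketch shows the core idea that Mesa automates for C/C++ code: precompute a table for a costly function over a profiled domain, replace calls with lookups, and measure the resulting error. The domain bounds, table size, and names are illustrative assumptions, not Mesa's output.

    import math
    import numpy as np

    LO, HI, SIZE = 0.0, 10.0, 4096             # profiled input domain, table size
    STEP = (HI - LO) / (SIZE - 1)
    TABLE = np.exp(np.linspace(LO, HI, SIZE))  # original costly function: exp

    def exp_lut(x):
        """Nearest-entry lookup; maximum error shrinks as SIZE grows."""
        i = int((x - LO) / STEP + 0.5)         # round to nearest table index
        return TABLE[min(max(i, 0), SIZE - 1)]

    # Error analysis, the kind of report a LUT tool would automate:
    xs = np.random.uniform(LO, HI, 100_000)
    err = max(abs(exp_lut(x) - math.exp(x)) / math.exp(x) for x in xs)
    print(f"max relative error over sampled domain: {err:.2e}")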

  7. CPAchecker: A Tool for Configurable Software Verification

    CERN Document Server

    Beyer, Dirk

    2009-01-01

    Configurable software verification is a recent concept for expressing different program analysis and model checking approaches in one single formalism. This paper presents CPAchecker, a tool and framework that aims at easy integration of new verification components. Every abstract domain, together with the corresponding operations, is required to implement the interface of configurable program analysis (CPA). The main algorithm is configurable to perform a reachability analysis on arbitrary combinations of existing CPAs. The major design goal during the development was to provide a framework for developers that is flexible and easy to extend. We hope that researchers find it convenient and productive to implement new verification ideas and algorithms using this platform and that it advances the field by making it easier to perform practical experiments. The tool is implemented in Java and runs as a command-line tool or as an Eclipse plug-in. We evaluate the efficiency of our tool on benchmarks from the software mo...

  8. Software Tools Used for Continuous Assessment

    Directory of Open Access Journals (Sweden)

    Corina SBUGHEA

    2016-04-01

    The present paper addresses the subject of continuous evaluation and of the IT tools that support it. The approach starts from the main concepts and methods used in the teaching process, according to the assessment methodology, and then it focuses on their implementation in the Wondershare QuizCreator software.

  9. Software for systems biology: from tools to integrated platforms.

    Science.gov (United States)

    Ghosh, Samik; Matsuoka, Yukiko; Asai, Yoshiyuki; Hsin, Kun-Yi; Kitano, Hiroaki

    2011-11-03

    Understanding complex biological systems requires extensive support from software tools. Such tools are needed at each step of a systems biology computational workflow, which typically consists of data handling, network inference, deep curation, dynamical simulation and model analysis. In addition, there are now efforts to develop integrated software platforms, so that tools that are used at different stages of the workflow and by different researchers can easily be used together. This Review describes the types of software tools that are required at different stages of systems biology research and the current options that are available for systems biology researchers. We also discuss the challenges and prospects for modelling the effects of genetic changes on physiology and the concept of an integrated platform.

  10. ATLAS software configuration and build tool optimisation

    CERN Document Server

    Rybkin, G; The ATLAS collaboration

    2014-01-01

    ATLAS software code base is over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories in 6 continents. To meet the challenge of configuration and building of this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance that was optimised within several approaches: reduction of the number of re...

  11. A software tool for ecosystem services assessments

    Science.gov (United States)

    Riegels, Niels; Klinting, Anders; Butts, Michael; Middelboe, Anne Lise; Mark, Ole

    2017-04-01

    The EU FP7 DESSIN project is developing methods and tools for assessment of ecosystem services (ESS) and associated economic values, with a focus on freshwater ESS in urban settings. Although the ESS approach has gained considerable visibility over the past ten years, operationalizing the approach remains a challenge. Therefore, DESSIN is also supporting development of a free software tool to support users implementing the DESSIN ESS evaluation framework. The DESSIN ESS evaluation framework is a structured approach to measuring changes in ecosystem services. The main purpose of the framework is to facilitate the application of the ESS approach in the appraisal of projects that have impacts on freshwater ecosystems and their services. The DESSIN framework helps users evaluate changes in ESS by linking biophysical, economic, and sustainability assessments sequentially. It was developed using the Common International Classification of Ecosystem Services (CICES) and the DPSIR (Drivers, Pressures, States, Impacts, Responses) adaptive management cycle. The former is a standardized system for the classification of ESS developed by the European Union to enhance the consistency and comparability of ESS assessments. The latter is a well-known concept to disentangle the biophysical and social aspects of a system under study. As part of its analytical component, the DESSIN framework also integrates elements of the Final Ecosystem Goods and Services-Classification System (FEGS-CS) of the US Environmental Protection Agency (USEPA). As implemented in the software tool, the DESSIN framework consists of five parts: • In part I of the evaluation, the ecosystem is defined and described and the local stakeholders are identified. In addition, administrative details and objectives of the assessment are defined. • In part II, drivers and pressures are identified. Once these first two elements of the DPSIR scheme have been characterized, the claimed/expected capabilities of a

  12. ATLAS software configuration and build tool optimisation

    Science.gov (United States)

    Rybkin, Grigory; Atlas Collaboration

    2014-06-01

    ATLAS software code base is over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories in 6 continents. To meet the challenge of configuration and building of this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, and dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance, which was optimised with several approaches: reduction of the number of reads of requirements files, which are now read once per package by a CMT build command that generates cached requirements files for subsequent CMT build commands; introduction of more fine-grained build parallelism at package task level, i.e., dependent applications and libraries are compiled in parallel; code optimisation of CMT commands used for build; and introduction of package-level build parallelism, i.e., parallelising the build of independent packages. By default, CMT launches NUMBER-OF-PROCESSORS build commands in parallel. The other focus was on CMT commands optimisation in general, which made them approximately 2 times faster. CMT can generate a cached requirements file for the environment setup command, which is especially useful for deployment on distributed file systems like AFS or CERN VMFS. The use of parallelism, caching and code optimisation significantly (by several times) reduced software build time and environment setup time, and increased the efficiency of
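
    The package-level parallelism described above can be pictured with a short sketch (in Python here, for brevity): packages whose dependencies are already built are launched together, with the worker count defaulting to the number of processors. The package graph and build command are placeholders, not CMT's actual mechanism.

    from concurrent.futures import ThreadPoolExecutor
    import os

    # Toy dependency graph: "app" needs "io" and "math", which need "core".
    DEPS = {"core": [], "io": ["core"], "math": ["core"], "app": ["io", "math"]}

    def build(pkg):
        return f"built {pkg}"                  # stand-in for the real build command

    def parallel_build(deps):
        done = set()
        workers = os.cpu_count() or 1          # default: number of processors
        with ThreadPoolExecutor(max_workers=workers) as pool:
            while len(done) < len(deps):
                # Launch every not-yet-built package whose deps are satisfied.
                ready = [p for p in deps
                         if p not in done and all(d in done for d in deps[p])]
                for pkg, result in zip(ready, pool.map(build, ready)):
                    print(result)
                    done.add(pkg)

    parallel_build(DEPS)   # builds core; then io and math together; then app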

  13. A Survey Identifying Trends on Use of Software Development Tools in Different Indian SMEs

    Directory of Open Access Journals (Sweden)

    Nomi Baruah Ashima

    2012-10-01

    Software Process Improvement means identifying the current state of practice of processes within an organization and then improving it. Software Process Improvement is an everlasting, never-ending and ever-changing process. Some of the issues which force an organization to undergo software process improvement are customer dissatisfaction, inadequate software quality, inability to deliver on time and within budget, and excessive rework. The SMEs are using software process models, yet they are still unable to deliver on time and within budget and suffer excessive rework. The SMEs are using software process improvement models but are not able to follow all the processes, owing to the resources and cost required, to improve their productivity and the quality of their products. A survey of 18 SMEs catering to the software market has been carried out to identify software development scenarios. The intent of the study was to find the prevailing tools and techniques the SMEs are using to automate the software development process and to incorporate software project management. The survey identifies four different types of Software Development Tools which are proving to be effective in the current scenario of software development. These are identified as Requirement Management Tools, Process Modelling Tools, Software Configuration Management Tools and Cost Estimating Tools. This paper summarizes the trends in the usage of Software Development Tools in SMEs, presented graphically as well.

  14. The Systems Biology Research Tool: evolvable open-source software

    Directory of Open Access Journals (Sweden)

    Wright Jeremiah

    2008-06-01

    Background: Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions at the system level. Results: We introduce a free, easy-to-use, open-source, integrated software platform called the Systems Biology Research Tool (SBRT) to facilitate the computational aspects of systems biology. The SBRT currently performs 35 methods for analyzing stoichiometric networks and 16 methods from fields such as graph theory, geometry, algebra, and combinatorics. New computational techniques can be added to the SBRT via process plug-ins, providing a high degree of evolvability and a unifying framework for software development in systems biology. Conclusion: The Systems Biology Research Tool represents a technological advance for systems biology. This software can be used to make sophisticated computational techniques accessible to everyone (including those with no programming ability), to facilitate cooperation among researchers, and to expedite progress in the field of systems biology.

  15. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify the relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), an area of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
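
    The underlying detection idea can be sketched compactly: a knickpoint is an abrupt break in the gradient of a stream's longitudinal profile sampled from the DEM. The Python fragment below is a simplified illustration of that idea, not the actual ArcGIS code of Knickpoint Finder; the threshold and test profile are invented.

    import numpy as np

    def find_knickpoints(distance, elevation, threshold=0.02):
        """Return sample indices where the profile gradient breaks abruptly.

        distance  -- 1D array, downstream distance of each sample (m)
        elevation -- 1D array, DEM elevation at each sample (m)
        threshold -- minimum change in slope between adjacent segments
        """
        slope = np.diff(elevation) / np.diff(distance)    # segment gradients
        slope_change = np.abs(np.diff(slope))             # gradient breaks
        return np.where(slope_change > threshold)[0] + 1

    # A profile that steepens abruptly midway yields candidates at the
    # start and end of the steep reach.
    d = np.linspace(0, 1000, 11)
    z = np.array([100, 95, 90, 85, 80, 60, 40, 38, 36, 34, 32], dtype=float)
    print(find_knickpoints(d, z))   # -> [4 6]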

  16. Selecting and effectively using a computer aided software engineering tool

    Energy Technology Data Exchange (ETDEWEB)

    Kuhn, D.L.

    1989-01-01

    Software engineering is a science by which user requirements are translated into a quality software product. Computer Aided Software Engineering (CASE) is the scientific application of a set of tools and methods to software engineering, resulting in high-quality, defect-free, and maintainable software products. The Computer Systems Engineering (CSE) group of Separations Technology at the Savannah River Site has successfully used CASE tools to produce high-quality, reliable, and maintainable software products. This paper details the selection process CSE used to acquire a commonly available CASE product and how the CSE group effectively used this CASE tool to consistently produce quality software. 9 refs.

  17. Towards E-CASE Tools for Software Engineering

    OpenAIRE

    Nabil Arman

    2013-01-01

    CASE tools play an important role in all phases of software systems development and engineering. This is evident in the huge benefits obtained from using these tools, including their cost-effectiveness, rapid software application development, and improving the possibility of software reuse, to name just a few. In this paper, the idea of moving towards E-CASE tools, rather than traditional CASE tools, is advocated, since these E-CASE tools have all the benefits and advantages of traditional...

  18. A Mock-Up Tool for Software Component Reuse Repository

    OpenAIRE

    P. Niranjan; C. V. Guru Rao

    2010-01-01

    Software reuse effectiveness can be improved by reducing cost and investment. Software reuse costs can be reduced when reusable components are easy to locate, adapt and integrate into new, efficient applications. Reuse is the key paradigm for increasing software quality in software development. This paper focuses on the implementation of a software tool with a new integrated classification scheme to support the classification of software components and effective software reuse repositories to faci...

  19. Thermography based prescreening software tool for veterinary clinics

    Science.gov (United States)

    Dahal, Rohini; Umbaugh, Scott E.; Mishra, Deependra; Lama, Norsang; Alvandipour, Mehrdad; Umbaugh, David; Marino, Dominic J.; Sackman, Joseph

    2017-05-01

    Under development is a clinical software tool which can be used in veterinary clinics as a prescreening tool for these pathologies: anterior cruciate ligament (ACL) disease, bone cancer, and feline hyperthyroidism. Currently, veterinary clinical practice uses several imaging techniques, including radiology, computed tomography (CT), and magnetic resonance imaging (MRI). However, the harmful radiation involved during imaging, expensive equipment setup, excessive time consumption, and the need for a cooperative patient during imaging are major drawbacks of these techniques. In veterinary procedures, it is very difficult for animals to remain still for the time periods necessary for standard imaging without resorting to sedation, which creates another set of complexities. Therefore, clinical application software integrated with a thermal imaging system, and algorithms with high sensitivity and specificity for these pathologies, can address the major drawbacks of the existing imaging techniques. A graphical user interface (GUI) has been created to allow ease of use for the clinical technician. The technician inputs an image, enters patient information, and selects the camera view associated with the image and the pathology to be diagnosed. The software will classify the image using an optimized classification algorithm that has been developed through thousands of experiments. Optimal image features are extracted, and the feature vector is then used in conjunction with the stored image database for classification. Classification success rates as high as 88% for bone cancer, 75% for ACL, and 90% for feline hyperthyroidism have been achieved. The software is currently undergoing preliminary clinical testing.
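
    The classification step can be pictured with a toy sketch: extract a feature vector from the thermal image and label it by comparison with the stored image database. The features and nearest-neighbor rule below are illustrative stand-ins for the tool's optimized classifier.

    import numpy as np

    def extract_features(thermal):
        """Toy feature vector: simple temperature statistics of the image."""
        return np.array([thermal.mean(), thermal.std(),
                         thermal.max() - thermal.min()])

    def classify(thermal, db_features, db_labels):
        """Label the image with the class of its nearest database neighbor."""
        f = extract_features(thermal)
        dists = np.linalg.norm(db_features - f, axis=1)
        return db_labels[int(np.argmin(dists))]

    # db_features: N x 3 array built from previously diagnosed cases;
    # db_labels:   their confirmed outcomes, e.g. ["ACL", "normal", ...].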

  1. Sandia software guidelines: Volume 5, Tools, techniques, and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    1989-07-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. This volume describes software tools and methodologies available to Sandia personnel for the development of software, and outlines techniques that have proven useful within the Laboratories and elsewhere. References and evaluations by Sandia personnel are included. 6 figs.

  2. Tools and Behavioral Abstraction: A Direction for Software Engineering

    Science.gov (United States)

    Leino, K. Rustan M.

    As in other engineering professions, software engineers rely on tools. Such tools can analyze program texts and design specifications more automatically and in more detail than ever before. While many tools today are applied to find new defects in old code, I predict that more software-engineering tools of the future will be available to software authors at the time of authoring. If such analysis tools can be made to be fast enough and easy enough to use, they can help software engineers better produce and evolve programs.

  3. Key characteristics relevant for selecting knowledge management software tools

    CSIR Research Space (South Africa)

    Smuts, H

    2011-07-01

    ... phenomenon that makes the use of technology not an option, but a necessity. Software tools in knowledge management are a collection of technologies and are not necessarily acquired as a single software solution. Furthermore, these knowledge management...

  4. Software tools overview : process integration, modelling and optimisation for energy saving and pollution reduction

    OpenAIRE

    Lam, Hon Loong; Klemeš, Jiri; Kravanja, Zdravko; Varbanov, Petar

    2012-01-01

    This paper provides an overview of software tools based on long experience and applications in the area of process integration, modelling and optimisation. The first part reviews the current design practice and the development of supporting software tools. These are categorised as: (1) process integration and retrofit analysis tools, (2) general mathematical modelling suites with optimisation libraries, (3) flowsheeting simulation and (4) graph-based process optimisation tools. The second part...

  5. Educational Software Tool for Protection System Engineers. Distance Relay

    Directory of Open Access Journals (Sweden)

    Trujillo-Guajardo L.A.

    2012-04-01

    In this article, a graphical software tool aimed at the education of protection system engineers is presented. The theoretical fundamentals used for the design of distance relay operating characteristics, and the corresponding algorithms, are presented. The software allows the evaluation and analysis of real-time or simulated events at every stage of distance relay design. Some example cases are presented to illustrate the activities that can be carried out with the graphical software tool developed.

  6. Software Tools to Support the Assessment of System Health

    Science.gov (United States)

    Melcher, Kevin J.

    2013-01-01

    This presentation provides an overview of three software tools that were developed by the NASA Glenn Research Center to support the assessment of system health: the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES), the Systematic Sensor Selection Strategy (S4), and the Extended Testability Analysis (ETA) tool. Originally developed to support specific NASA projects in aeronautics and space, these software tools are currently available to U.S. citizens through the NASA Glenn Software Catalog. The ProDiMES software tool was developed to support a uniform comparison of propulsion gas path diagnostic methods. Methods published in the open literature are typically applied to dissimilar platforms with different levels of complexity. They often address different diagnostic problems and use inconsistent metrics for evaluating performance. As a result, it is difficult to perform a one-to-one comparison of the various diagnostic methods. ProDiMES solves this problem by serving as a theme problem to aid in propulsion gas path diagnostic technology development and evaluation. The overall goal is to provide a tool that will serve as an industry standard, and will truly facilitate the development and evaluation of significant Engine Health Management (EHM) capabilities. ProDiMES has been developed under a collaborative project of The Technical Cooperation Program (TTCP) based on feedback provided by individuals within the aircraft engine health management community. The S4 software tool provides a framework that supports the optimal selection of sensors for health management assessments. S4 is structured to accommodate user-defined applications, diagnostic systems, search techniques, and system requirements/constraints. It identifies one or more sensor suites that maximize diagnostic performance while meeting other user-defined system requirements. S4 provides a systematic approach for evaluating combinations of sensors to determine the set or sets of
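
    As a rough picture of what a sensor-selection framework does, the sketch below scores every sensor combination up to a budget and keeps the best suite. The sensors, scoring metric, and constraint are invented stand-ins for S4's configurable search techniques and performance metrics.

    from itertools import combinations

    # Per-sensor diagnostic value (placeholder numbers).
    SENSORS = {"T1": 3.0, "T2": 2.5, "P1": 2.0, "P2": 1.5, "V1": 1.0}

    def score(suite):
        # Stand-in metric: diminishing-returns sum of per-sensor value.
        values = sorted((SENSORS[s] for s in suite), reverse=True)
        return sum(v / (i + 1) for i, v in enumerate(values))

    def best_suite(budget):
        candidates = ((s, score(s)) for k in range(1, budget + 1)
                      for s in combinations(SENSORS, k))
        return max(candidates, key=lambda c: c[1])

    suite, value = best_suite(budget=3)
    print(suite, round(value, 2))   # -> ('T1', 'T2', 'P1') 4.92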

  7. Herramientas libres para modelar software (Free tools to model software)

    Directory of Open Access Journals (Sweden)

    Mauro Callejas Cuervo; Óscar Yovany Baquero Moreno

    2010-11-01

    An observation on free software and its implication in software development processes with 4G tools by organizations or individuals without astronomical capital and without the market-cornering mentality of dominating the market with costly products that make their vendors multimillionaires while offering no real guarantee, nor even the possibility of knowing the software one has paid for, much less of modifying it if it does not meet our expectations.

  8. Current trends in free software research

    OpenAIRE

    Navarro Bosch, Ramon; Vila Marta, Sebastià

    2009-01-01

    This report analyzes how scientific research is studying free software. We find which research is being done on free software by looking into scientific journal and conference publications. The data thus obtained are analyzed and the most salient trends related to free software are identified. We also reviewed the main works published in each free software research area.

  9. SRGM Analyzers Tool of SDLC for Software Improving Quality

    Directory of Open Access Journals (Sweden)

    Mr. Girish Nille

    2014-11-01

    Software Reliability Growth Models (SRGMs) have been developed to estimate software reliability measures such as the software failure rate, the number of remaining faults, and software reliability itself. In this paper, a software analyzer tool is proposed for deriving several software reliability growth models based on the Enhanced Non-homogeneous Poisson Process (ENHPP) in the presence of imperfect debugging and error generation. The proposed models are initially formulated for the case when there is no differentiation between failure observation and fault removal testing processes, and are then extended to the case when there is a clear differentiation between the two. Many SRGMs have been developed to describe software failures as a random process and can be used to measure development status during testing. With SRGMs, software consultants can easily measure (or evaluate) the software reliability (or quality) and plot software reliability growth charts.
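
    For orientation, the sketch below evaluates the classic Goel-Okumoto NHPP mean value function m(t) = a(1 - e^(-bt)), a common starting point for SRGMs; the paper's ENHPP formulation with imperfect debugging and error generation is more elaborate, and the parameter values here are illustrative, not fitted to real data.

    import numpy as np

    def mean_failures(t, a, b):
        """Expected cumulative failures by test time t: m(t) = a * (1 - e^(-b*t))."""
        return a * (1.0 - np.exp(-b * t))

    a, b = 120.0, 0.05                  # a: total expected faults, b: detection rate
    t = np.arange(0, 101, 10)           # test time (e.g. weeks)
    m = mean_failures(t, a, b)
    remaining = a - m                   # expected faults still in the software
    reliability = np.exp(-(mean_failures(t + 1, a, b) - m))   # NHPP R(1 | t)
    for row in zip(t, m.round(1), remaining.round(1), reliability.round(3)):
        print(row)                      # points on the reliability growth chart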

  10. A Taxonomy of Knowledge Management Software Tools: Origins and Applications.

    Science.gov (United States)

    Tyndale, Peter

    2002-01-01

    Examines, evaluates, and organizes a wide variety of knowledge management software tools by examining the literature related to the selection and evaluation of knowledge management tools. (Author/SLD)

  11. Towards E-CASE Tools for Software Engineering

    Directory of Open Access Journals (Sweden)

    Nabil Arman

    2013-02-01

    CASE tools play an important role in all phases of software systems development and engineering. This is evident in the huge benefits obtained from using these tools, including their cost-effectiveness, rapid software application development, and improving the possibility of software reuse, to name just a few. In this paper, the idea of moving towards E-CASE tools, rather than traditional CASE tools, is advocated, since these E-CASE tools have all the benefits and advantages of traditional CASE tools and add to that all the benefits of web technology. This is presented by focusing on the role of E-CASE tools in facilitating the trend of telecommuting and virtual workplaces among software engineering and information technology professionals. In addition, E-CASE tools integrate smoothly with the trend of E-learning in conducting software engineering courses. Finally, two surveys were conducted for a group of software engineering professionals and students of software engineering courses. The surveys show that E-CASE tools are of great value to both communities of students and professionals of software engineering.

  12. TAUS: A File-Based Software Understanding Tool

    Institute of Scientific and Technical Information of China (English)

    费翔林; 汪承藻; et al.

    1990-01-01

    A program called TAUS, a Tool for Analyzing and Understanding Software, was developed. It is designed to help the programmer analyze and understand the software interactively. Its aim is to reduce the dependence on human intelligence in software understanding and improve the programmer's understanding productivity. The design and implementation of TAUS and its applications are described.

  13. The Value of Open Source Software Tools in Qualitative Research

    Science.gov (United States)

    Greenberg, Gary

    2011-01-01

    In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…

  14. Estimation of toxicity using a Java based software tool

    Science.gov (United States)

    A software tool has been developed that will allow a user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be accessed using a web browser (or alternatively downloaded and run as a stand-alone applic...

  15. EISA 432 Energy Audits Best Practices: Software Tools

    Energy Technology Data Exchange (ETDEWEB)

    Maryl Fisher

    2014-11-01

    Five whole building analysis software tools that can aid an energy manager with fulfilling energy audit and commissioning/retro-commissioning requirements were selected for review in this best practices study. A description of each software tool is provided as well as a discussion of the user interface and level of expertise required for each tool, a review of how to use the tool for analyzing energy conservation opportunities, the format and content of reports generated by the tool, and a discussion on the applicability of the tool for commissioning.

  16. Tools and Patterns for Dependable Concurrent Software

    NARCIS (Netherlands)

    Jovanovic, D.S.; Broenink, Johannes F.

    2005-01-01

    First, we give reasons for choosing a process-oriented approach for building complex concurrent systems. Upon a brief review of dependability attributes of software-supported systems, means for increasing dependability in process-oriented architectures are illustrated

  17. Software Tools for Fault Management Technologies Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Fault Management (FM) is a key requirement for safety, efficient onboard and ground operations, maintenance, and repair. QSI's TEAMS Software suite is a leading...

  18. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Science.gov (United States)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool proved to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.
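
    The detection-sensitivity studies described above can be pictured as reachability analysis on a directed-graph model. The sketch below is a minimal illustration in Python, assuming a hypothetical graph in which an edge means a fault propagates to a downstream node and a fault is detectable if any still-available test can observe it; the node names and model format are illustrative, not the TEAMS Designer model.

      from collections import deque

      # Hypothetical directed-graph diagnostic model: node -> downstream nodes.
      edges = {"pump_fault": ["line1"], "valve_fault": ["line2"],
               "line1": ["sensor_A"], "line2": ["sensor_A", "sensor_B"]}
      tests = {"sensor_A", "sensor_B"}
      faults = ["pump_fault", "valve_fault"]

      def detectable(fault, lost=frozenset()):
          """Breadth-first search: can any usable test observe this fault?"""
          seen, queue = {fault}, deque([fault])
          while queue:
              node = queue.popleft()
              if node in tests and node not in lost:
                  return True
              for nxt in edges.get(node, []):
                  if nxt not in seen:
                      seen.add(nxt)
                      queue.append(nxt)
          return False

      # Sensitivity study: recompute fault coverage with each test removed.
      for lost in sorted(tests):
          coverage = sum(detectable(f, {lost}) for f in faults) / len(faults)
          print(f"without {lost}: {coverage:.0%} of faults still detectable")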

  19. An Evaluation Format for "Open" Software Tools.

    Science.gov (United States)

    Murphy, Cheryl A.

    1995-01-01

    Evaluates six "open" (empty of content and customized by users) software programs using the literature-based characteristics of documentation, learner control, branching capabilities, portability, ease of use, and cost-effectiveness. Interviewed computer-knowledgeable individuals to confirm the legitimacy of the evaluative characteristics. (LRW)

  1. Innovative Software Tools Measure Behavioral Alertness

    Science.gov (United States)

    2014-01-01

    To monitor astronaut behavioral alertness in space, Johnson Space Center awarded Philadelphia-based Pulsar Informatics Inc. SBIR funding to develop software to be used onboard the International Space Station. Now used by the government and private companies, the technology has increased revenues for the firm by an average of 75 percent every year.

  2. DoD Current State for Software Technology Readiness Assessments

    Science.gov (United States)

    2010-04-01

    DoD Current State for Software Technology Readiness Assessments. Presented by Cynthia Dion-Schwarz, Ph.D., at the 22nd Systems and Software Technology Conference (SSTC), 26-29 April 2010. Approved for public release; distribution unlimited.

  3. Evaluation, selection and relevance of software tools in technology monitoring

    Directory of Open Access Journals (Sweden)

    Óscar Fernando Castellanos Domínguez

    2010-07-01

    Full Text Available The current setting for industrial and entrepreneurial development has posed the need for incorporating differentiating elements into the production apparatus, leading to a need to anticipate technological change. Technology monitoring (TM) emerges as a methodology focused on analysing these changes to identify challenges and opportunities, being mainly supported by information technology (IT) through the search for, capture and analysis of data and information. This article proposes criteria for choosing and efficiently using software tools having different characteristics, requirements, capacities and costs which could be used in monitoring. An approach is made to different TM models, emphasising the identification and analysis of different information sources for covering and supporting information access and monitoring. Some evaluation, selection and analysis criteria are given for using these types of tools according to each production system's individual profile and needs. Some of the existing software packages available on the market for carrying out monitoring projects are described, relating them to their complexity, process characteristics and cost.
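
    One way to operationalize such selection criteria is a weighted-scoring matrix. The sketch below is a hedged illustration in Python; the criteria, weights and scores are entirely hypothetical and would be tailored to each production system's profile and needs.

      # Illustrative criteria with weights summing to 1, and 1-5 scores
      # assigned to two hypothetical candidate monitoring tools.
      weights = {"functionality": 0.35, "cost": 0.25,
                 "ease_of_use": 0.20, "fit_to_profile": 0.20}
      candidates = {
          "Tool A": {"functionality": 5, "cost": 2, "ease_of_use": 4, "fit_to_profile": 3},
          "Tool B": {"functionality": 3, "cost": 5, "ease_of_use": 3, "fit_to_profile": 4},
      }
      for name, scores in candidates.items():
          total = sum(weights[c] * scores[c] for c in weights)
          print(f"{name}: weighted score {total:.2f}")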

  4. Computer- Aided Design in Power Engineering Application of Software Tools

    CERN Document Server

    Stojkovic, Zlatan

    2012-01-01

    This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation, guiding through complex problems on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design. Several design example calculations are carried out using engineering software such as MATLAB, EMTP/ATP, Excel & Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools in project management in power systems ...

  5. Software Construction and Analysis Tools for Future Space Missions

    Science.gov (United States)

    Lowry, Michael R.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    NASA and its international partners will increasingly depend on software-based systems to implement advanced functions for future space missions, such as Martian rovers that autonomously navigate long distances exploring geographic features formed by surface water early in the planet's history. The software-based functions for these missions will need to be robust and highly reliable, raising significant challenges in the context of recent Mars mission failures attributed to software faults. After reviewing these challenges, this paper describes tools that have been developed at NASA Ames that could contribute to meeting them: 1) program synthesis tools based on automated inference that generate documentation for manual review and annotations for automated certification; 2) model-checking tools for concurrent object-oriented software that achieve scalability through synergy with program abstraction and static analysis tools.

  6. Use of Software Tools in Teaching Relational Database Design.

    Science.gov (United States)

    McIntyre, D. R.; And Others

    1995-01-01

    Discusses the use of state-of-the-art software tools in teaching a graduate, advanced, relational database design course. Results indicated a positive student response to the prototype of expert systems software and a willingness to utilize this new technology both in their studies and in future work applications. (JKP)

  7. ISWHM: Tools and Techniques for Software and System Health Management

    Science.gov (United States)

    Schumann, Johann; Mengshoel, Ole J.; Darwiche, Adnan

    2010-01-01

    This presentation reports the status and results of research on Software Health Management done within the NRA "ISWHM: Tools and Techniques for Software and System Health Management." Topics include: ingredients of a Guidance, Navigation, and Control (GN&C) system; a selected GN&C testbed example; health management of major ingredients; ISWHM testbed architecture; and conclusions and next steps.

  8. NASA-Enhanced Version Of Automatically Programmed Tool Software (APT)

    Science.gov (United States)

    Purves, L. R.

    1989-01-01

    The APT code is one of the most widely used software tools for complex numerically-controlled machining. It is both a programming language and the software that processes that language. Upgrades include a super pocket capability for concave polygon pockets and an editor to reprocess cutter location coordinates according to user-supplied commands.

  9. iPhone examination with modern forensic software tools

    Science.gov (United States)

    Höne, Thomas; Kröger, Knut; Luttenberger, Silas; Creutzburg, Reiner

    2012-06-01

    The aim of the paper is to show the usefulness of modern forensic software tools for iPhone examination. In particular, we focus on the new version of Elcomsoft iOS Forensic Toolkit and compare it with Oxygen Forensics Suite 2012 regarding functionality, usability and capabilities. It is shown how these software tools work and how capable they are at examining non-jailbroken and jailbroken iPhones.

  10. A Software Tool for Quantitative Seismicity Analysis - ZMAP

    Science.gov (United States)

    Wiemer, S.; Gerstenberger, M.

    2001-12-01

    Earthquake catalogs are probably the most basic product of seismology, and remain arguably the most useful for tectonic studies. Modern seismograph networks can locate up to 100,000 earthquakes annually, providing a continuous and sometimes overwhelming stream of data. ZMAP is a set of tools driven by a graphical user interface (GUI), designed to help seismologists analyze catalog data. ZMAP is primarily a research tool suited to the evaluation of catalog quality and to addressing specific hypotheses; however, it can also be useful in routine network operations. Examples of ZMAP features include catalog quality assessment (artifacts, completeness, explosion contamination), interactive data exploration, mapping transients in seismicity (rate changes, b-values, p-values), fractal dimension analysis and stress tensor inversions. Roughly 100 scientists worldwide have used the software at least occasionally, and about 30 peer-reviewed publications have made use of ZMAP. The ZMAP code is open source, written in Matlab, a commercial language from The MathWorks that is widely used in the natural sciences. ZMAP was first published in 1994 and has continued to grow over the past 7 years. Recently, we released ZMAP v.6. The poster will introduce the features of ZMAP. We will specifically focus on ZMAP features related to time-dependent probabilistic hazard assessment. We are currently implementing a ZMAP-based system that computes probabilistic hazard maps, which combine the stationary background hazard as well as aftershock and foreshock hazard into a comprehensive time-dependent probabilistic hazard map. These maps will be displayed in near real time on the Internet. This poster is also intended as a forum for ZMAP users to provide feedback and discuss the future of ZMAP.
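
    As a concrete example of one catalog statistic ZMAP maps, the Gutenberg-Richter b-value can be estimated with Aki's maximum-likelihood formula, b = log10(e) / (mean(M) - Mc), for magnitudes at or above the completeness magnitude Mc. The sketch below is plain Python/NumPy rather than ZMAP's Matlab code, applied to a synthetic catalog.

      import numpy as np

      def b_value(mags, mc, dm=0.0):
          """Aki (1965) maximum-likelihood b-value for magnitudes >= mc;
          for magnitudes binned to width dm, the usual half-bin correction
          is applied."""
          m = np.asarray(mags, dtype=float)
          m = m[m >= mc]
          return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

      # Synthetic Gutenberg-Richter catalog with b = 1 above Mc = 2.0.
      rng = np.random.default_rng(0)
      mags = rng.exponential(scale=1.0 / np.log(10.0), size=5000) + 2.0
      print(f"estimated b-value: {b_value(mags, mc=2.0):.2f}")  # close to 1.00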

  11. CURRENT AND FUTURE USE OF MANAGEMENT TOOLS

    National Research Council Canada - National Science Library

    Zlatko Nedelko; Vojko Pototcan; Marina Dabic

    2015-01-01

    .... The authors developed and tested a model for predicting the future use of management tools based on the current use of tools by employees in organizations, underlying assumptions of the theory...

  12. Generating DEM from LIDAR data - comparison of available software tools

    Science.gov (United States)

    Korzeniowska, K.; Lacka, M.

    2011-12-01

    In recent years many software tools and applications have appeared that offer procedures, scripts and algorithms to process and visualize ALS data. This variety of software tools and of "point cloud" processing methods contributed to the aim of this study: to assess algorithms available in various software tools that are used to classify LIDAR "point cloud" data, through a careful examination of Digital Elevation Models (DEMs) generated from LIDAR data on the basis of these algorithms. The work focused on the most important available software tools, both commercial and open source. Two sites in a mountain area were selected for the study. The area of each site is 0.645 sq km. DEMs generated with the analysed software tools were compared with a reference dataset, generated using manual methods to eliminate non-ground points. Surfaces were analysed using raster analysis. Minimum, maximum and mean differences between the reference DEM and the DEMs generated with the analysed software tools were calculated, together with the Root Mean Square Error. Differences between DEMs were also examined visually using transects along the grid axes in the test sites.
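
    The error measures used in this comparison reduce to simple raster arithmetic once the grids are co-registered. A minimal NumPy sketch, with synthetic arrays standing in for the reference DEM and a tool-generated DEM (in practice both would be loaded from raster files):

      import numpy as np

      rng = np.random.default_rng(0)
      ref = rng.uniform(900.0, 1400.0, size=(100, 100))    # reference DEM, metres
      test = ref + rng.normal(0.0, 0.35, size=ref.shape)   # DEM from one tool

      diff = test - ref
      print(f"min diff:  {diff.min():+.2f} m")
      print(f"max diff:  {diff.max():+.2f} m")
      print(f"mean diff: {diff.mean():+.2f} m")
      print(f"RMSE:      {np.sqrt(np.mean(diff ** 2)):.2f} m")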

  13. Current practice in software development for computational neuroscience and how to improve it.

    Science.gov (United States)

    Gewaltig, Marc-Oliver; Cannon, Robert

    2014-01-01

    Almost all research work in computational neuroscience involves software. As researchers try to understand ever more complex systems, there is a continual need for software with new capabilities. Because of the wide range of questions being investigated, new software is often developed rapidly by individuals or small groups. In these cases, it can be hard to demonstrate that the software gives the right results. Software developers are often open about the code they produce and willing to share it, but there is little appreciation among potential users of the great diversity of software development practices and end results, and how this affects the suitability of software tools for use in research projects. To help clarify these issues, we have reviewed a range of software tools and asked how the culture and practice of software development affects their validity and trustworthiness. We identified four key questions that can be used to categorize software projects and correlate them with the type of product that results. The first question addresses what is being produced. The other three concern why, how, and by whom the work is done. The answers to these questions show strong correlations with the nature of the software being produced, and its suitability for particular purposes. Based on our findings, we suggest ways in which current software development practice in computational neuroscience can be improved and propose checklists to help developers, reviewers, and scientists to assess the quality of software and whether particular pieces of software are ready for use in research.

  14. Current practice in software development for computational neuroscience and how to improve it.

    Directory of Open Access Journals (Sweden)

    Marc-Oliver Gewaltig

    2014-01-01

    Full Text Available Almost all research work in computational neuroscience involves software. As researchers try to understand ever more complex systems, there is a continual need for software with new capabilities. Because of the wide range of questions being investigated, new software is often developed rapidly by individuals or small groups. In these cases, it can be hard to demonstrate that the software gives the right results. Software developers are often open about the code they produce and willing to share it, but there is little appreciation among potential users of the great diversity of software development practices and end results, and how this affects the suitability of software tools for use in research projects. To help clarify these issues, we have reviewed a range of software tools and asked how the culture and practice of software development affects their validity and trustworthiness. We identified four key questions that can be used to categorize software projects and correlate them with the type of product that results. The first question addresses what is being produced. The other three concern why, how, and by whom the work is done. The answers to these questions show strong correlations with the nature of the software being produced, and its suitability for particular purposes. Based on our findings, we suggest ways in which current software development practice in computational neuroscience can be improved and propose checklists to help developers, reviewers, and scientists to assess the quality of software and whether particular pieces of software are ready for use in research.

  15. The evolution of CACSD tools-a software engineering perspective

    DEFF Research Database (Denmark)

    Ravn, Ole; Szymkat, Maciej

    1992-01-01

    The earlier evolution of computer-aided control system design (CACSD) tools is discussed from a software engineering perspective. A model of the design process is presented as the basis for principles and requirements of future CACSD tools. Combinability, interfacing in memory, and an open... workspace are seen as important concepts in CACSD. Some points are made about the problem of buy or make when new software is required, and the idea of buy and make is put forward. Emphasis is put on the time perspective and the life cycle of the software...

  16. Economic Consequence Analysis of Disasters: The E-CAT Software Tool

    Energy Technology Data Exchange (ETDEWEB)

    Rose, Adam; Prager, Fynn; Chen, Zhenhua; Chatterjee, Samrat; Wei, Dan; Heatwole, Nathaniel; Warren, Eric

    2017-04-15

    This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. This software tool is intended for use by various decision makers and analysts to obtain estimates rapidly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
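
    The "reduced form" step can be pictured as fitting a single regression to a set of synthetic model runs. The sketch below uses ordinary least squares on made-up data; the explanatory variables and coefficients are purely illustrative and are not E-CAT's actual specification.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 200  # number of synthetic simulation runs standing in for CGE output

      # Made-up threat characteristics: event duration (days) and the share
      # of regional capacity affected; "loss" stands in for the CGE response.
      duration = rng.uniform(1.0, 30.0, n)
      share = rng.uniform(0.0, 0.5, n)
      loss = 0.8 * duration + 40.0 * share + rng.normal(0.0, 1.0, n)

      # Single reduced-form equation: loss ~ b0 + b1*duration + b2*share.
      X = np.column_stack([np.ones(n), duration, share])
      coef, *_ = np.linalg.lstsq(X, loss, rcond=None)
      print("fitted reduced form: loss = "
            f"{coef[0]:.2f} + {coef[1]:.2f}*duration + {coef[2]:.2f}*share")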

  17. Software project management tools in global software development: a systematic mapping study.

    Science.gov (United States)

    Chadli, Saad Yasser; Idri, Ali; Ros, Joaquín Nicolás; Fernández-Alemán, José Luis; de Gea, Juan M Carrillo; Toval, Ambrosio

    2016-01-01

    Global software development (GSD) which is a growing trend in the software industry is characterized by a highly distributed environment. Performing software project management (SPM) in such conditions implies the need to overcome new limitations resulting from cultural, temporal and geographic separation. The aim of this research is to discover and classify the various tools mentioned in literature that provide GSD project managers with support and to identify in what way they support group interaction. A systematic mapping study has been performed by means of automatic searches in five sources. We have then synthesized the data extracted and presented the results of this study. A total of 102 tools were identified as being used in SPM activities in GSD. We have classified these tools, according to the software life cycle process on which they focus and how they support the 3C collaboration model (communication, coordination and cooperation). The majority of the tools found are standalone tools (77%). A small number of platforms (8%) also offer a set of interacting tools that cover the software development lifecycle. Results also indicate that SPM areas in GSD are not adequately supported by corresponding tools and deserve more attention from tool builders.

  18. Technical Data Exchange Software Tools Adapted to Distributed Microsatellite Design

    Science.gov (United States)

    Pache, Charly

    2002-01-01

    One critical issue concerning the distributed design of satellites is the collaborative work it requires. In particular, the exchange of data between the groups responsible for each subsystem can be complex and very time-consuming. The goal of this paper is to present a collaborative design tool, the SSETI Design Model (SDM), specifically developed for enabling distributed satellite design. SDM is actually used in the ongoing Student Space Exploration & Technology Initiative (SSETI, www.sseti.net). SSETI is led by the European Space Agency (ESA) outreach office (http://www.estec.esa.nl/outreach), involving student groups from all over Europe in the design, construction and launch of a microsatellite. The first part of this paper presents the current version of the SDM tool, a collection of linked Microsoft Excel worksheets, one for each subsystem. An overview of the project framework/structure is given, explaining the different actors, the flows between them, as well as the different types of data and the links - formulas - between data sets. Unified Modeling Language (UML) diagrams give an overview of the different parts. Then the SDM's functionalities, developed in VBA scripts (Visual Basic for Applications), are introduced, as well as the interactive features, user interfaces and administration tools. The second part discusses the capabilities and limitations of SDM's current version. Taking into account these capabilities and limitations, the third part outlines the next version of SDM, a web-oriented, database-driven evolution of the current version. This new approach will enable real-time data exchange and processing between the different actors of the mission. Comprehensive UML diagrams will guide the audience through the entire modeling process of such a system. Trade-off simulation capabilities, security, reliability, hardware and software issues will also be thoroughly discussed.

  19. A Review on Software Mining: Current Trends and Methodologies

    Directory of Open Access Journals (Sweden)

    Gurtej Singh Ubhi

    2017-04-01

    Full Text Available With the evolution of software mining, data mining has come to play a crucial role in day-to-day software engineering activities. By empowering software with data mining methods, it helps software developers and managers at all levels to use software engineering data (in the form of code, design documents and bug reports) to visualize a project's status, evolution and progress. In addition, mining methods and algorithms help to devise models for identifying fault-prone components of real-time systems before the testing or evolution phases. This review paper highlights the different methodologies for software mining, its extension as a tool for fault tolerance, and a bibliography with special prominence on mining software engineering information.

  20. Runtime software adaptation: approaches and a programming tool

    Directory of Open Access Journals (Sweden)

    Jarosław Rudy

    2012-03-01

    Full Text Available Software systems steadily tend to become bigger and more complex, making them more difficult to change, especially at runtime. Several types of runtime software adaptation approaches have been proposed to increase the adaptation capability of applications and turn them into evolving software. Many of these approaches (using software architectural models, for example) are implemented during the design phase of the software development life cycle, making them ineffective or difficult to use for already existing applications. Moreover, the overhead caused by the use of these approaches has not been determined in many cases. In this paper the author presents a taxonomy of high- and low-level approaches to runtime software adaptation and then introduces a lightweight prototype programming tool used to add runtime code modification capability (via function hotswapping) to existing applications written in C++ and run under Linux. The tool also makes it possible to replace a defective function with an older or corrected version at runtime. Several tests were prepared to compare traditional C++ applications with the same applications developed with the aforementioned programming tool. Applications were compared in terms of execution time, size of executable code and memory usage. Different sizes and numbers of functions were considered. The paper also examines the constant overhead caused by the programming tool regardless of the target application. The paper ends with a summary of the presented approaches and their characteristics, including effects on the targeted systems, capabilities, ease of use, level of abstraction, etc.
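
    The paper's prototype targets C++ under Linux; as a language-neutral sketch of the same idea, the Python snippet below routes calls through one level of indirection (a dispatch table) so that an implementation can be replaced while the program runs, which is the essence of function hotswapping.

      # Calls go through a dispatch table instead of binding directly to a
      # function, so the implementation behind a name can be swapped at runtime.
      dispatch = {}

      def register(name, func):
          """Install or hot-swap the implementation behind a call site."""
          dispatch[name] = func

      def call(name, *args):
          return dispatch[name](*args)

      def price_v1(amount):        # defective version: forgets the tax
          return amount

      def price_v2(amount):        # corrected version, swapped in at runtime
          return round(amount * 1.2, 2)

      register("price", price_v1)
      print(call("price", 100))    # 100   (old behaviour)
      register("price", price_v2)  # replace without restarting the program
      print(call("price", 100))    # 120.0 (new behaviour)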

  1. Current and future trends in marine image annotation software

    Science.gov (United States)

    Gomes-Pereira, Jose Nuno; Auger, Vincent; Beisiegel, Kolja; Benjamin, Robert; Bergmann, Melanie; Bowden, David; Buhl-Mortensen, Pal; De Leo, Fabio C.; Dionísio, Gisela; Durden, Jennifer M.; Edwards, Luke; Friedman, Ariell; Greinert, Jens; Jacobsen-Stout, Nancy; Lerner, Steve; Leslie, Murray; Nattkemper, Tim W.; Sameoto, Jessica A.; Schoening, Timm; Schouten, Ronald; Seager, James; Singh, Hanumant; Soubigou, Olivier; Tojeira, Inês; van den Beld, Inge; Dias, Frederico; Tempera, Fernando; Santos, Ricardo S.

    2016-12-01

    Given the need to describe, analyze and index large quantities of marine imagery data for exploration and monitoring activities, a range of specialized image annotation tools have been developed worldwide. Image annotation - the process of transposing objects or events represented in a video or still image to the semantic level - may involve human interactions and computer-assisted solutions. Marine image annotation software (MIAS) has enabled over 500 publications to date. We review the functioning, application trends and developments by comparing general and advanced features of 23 different tools utilized in underwater image analysis. MIAS requiring human input consist basically of a graphical user interface, with a video player or image browser that recognizes a specific time code or image code, allowing the user to log events in a time-stamped (and/or geo-referenced) manner. MIAS differ from similar software in their capability of integrating data associated with video collection, the simplest being the position coordinates of the video recording platform. MIAS differ in three main characteristics: annotating events in real time, annotating after acquisition, and interacting with a database. They range from simple annotation interfaces to full onboard data management systems with a variety of toolboxes. Advanced packages allow users to input and display data from multiple sensors or multiple annotators via intranet or internet. Posterior human-mediated annotation often includes tools for data display and image analysis, e.g. length, area, image segmentation and point counts, and in a few cases the possibility of browsing and editing previous dive logs or analyzing the annotations. The interaction with a database allows the automatic integration of annotations from different surveys, repeated annotation and collaborative annotation of shared datasets, and browsing and querying of data. Progress in the field of automated annotation is mostly in post processing, for stable platforms or still images
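
    At its core, the human-input workflow described above is a time-stamped, optionally geo-referenced event log. A minimal sketch in Python, with hypothetical field names and values (not the schema of any particular MIAS package):

      import csv, dataclasses

      @dataclasses.dataclass
      class Annotation:
          timecode: str   # video time code the event refers to
          label: str      # object or event, transposed to the semantic level
          lat: float      # platform position when the frame was recorded
          lon: float

      log = [Annotation("00:12:31:05", "coral colony", 38.52, -28.64)]
      with open("dive_log.csv", "w", newline="") as f:
          writer = csv.writer(f)
          writer.writerow(["timecode", "label", "lat", "lon"])
          writer.writerows(dataclasses.astuple(a) for a in log)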

  2. Navigating freely-available software tools for metabolomics analysis.

    Science.gov (United States)

    Spicer, Rachel; Salek, Reza M; Moreno, Pablo; Cañueto, Daniel; Steinbeck, Christoph

    2017-01-01

    The field of metabolomics has expanded greatly over the past two decades, both as an experimental science with applications in many areas and in regard to data standards and bioinformatics software tools. The diversity of experimental designs and instrumental technologies used for metabolomics has led to the need for distinct data analysis methods and the development of many software tools. The aim of this review is to compile a comprehensive list of the most widely used freely available software and tools that are used primarily in metabolomics. The most widely used tools were selected for inclusion in the review by either ≥ 50 citations on Web of Science (as of 08/09/16) or the use of the tool being reported in the recent Metabolomics Society survey. Tools were then categorised by the type of instrumental data (i.e. LC-MS, GC-MS or NMR) and the functionality (i.e. pre- and post-processing, statistical analysis, workflow and other functions) they are designed for. A comprehensive list of the most used tools was compiled. Each tool is discussed within the context of its application domain and in relation to comparable tools of the same domain. An extended list including additional tools is available at https://github.com/RASpicer/MetabolomicsTools, which is classified and searchable via a simple controlled vocabulary. This review presents the most widely used tools for metabolomics analysis, categorised based on their main functionality. As future work, we suggest a direct comparison of the tools' abilities to perform specific data analysis tasks, e.g. peak picking.

  3. Concepts and Tools for the Software Life Cycle

    Science.gov (United States)

    Tausworthe, R. C.

    1985-01-01

    The tools, techniques, and aids needed to engineer, manage, and administer a large software-intensive task are themselves parts of a large software base, and are incurred only at great expense. The needs of the software life cycle in terms of such supporting tools and methodologies are highlighted. The concept of a distributed network for engineering, management, and administrative functions is outlined, and the key characteristics of localized subnets in high-communications-traffic areas of software activity are discussed. A formal, deliberate, structured, systems-engineering approach for the construction of a uniform, coordinated tool set is proposed as a means to reduce development and maintenance costs, foster adaptability, enhance reliability, and promote standardization.

  4. Software tool for data mining and its applications

    Science.gov (United States)

    Yang, Jie; Ye, Chenzhou; Chen, Nianyi

    2002-03-01

    A software tool for data mining is introduced, which integrates pattern recognition (PCA, Fisher analysis, clustering, hyperenvelope, regression), artificial intelligence (knowledge representation, decision trees), statistical learning (rough sets, support vector machines) and computational intelligence (neural networks, genetic algorithms, fuzzy systems). It consists of nine function models: pattern recognition, decision trees, association rules, fuzzy rules, neural networks, genetic algorithms, hyperenvelope, support vector machines and visualization. The principles and knowledge representation of some of the data mining function models are described. The software tool was implemented in Visual C++ under Windows 2000. Nonmonotony in data mining is dealt with by concept hierarchy and layered mining. The tool has been satisfactorily applied to the prediction of regularities in the formation of ternary intermetallic compounds in alloy systems and to the diagnosis of brain glioma.
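
    As a hedged illustration of two of the listed function models (pattern recognition via PCA, followed by a decision tree), the sketch below uses Python and scikit-learn rather than the paper's Visual C++ implementation, with scikit-learn's bundled iris data standing in for alloy-system records:

      from sklearn.datasets import load_iris
      from sklearn.decomposition import PCA
      from sklearn.model_selection import train_test_split
      from sklearn.tree import DecisionTreeClassifier

      X, y = load_iris(return_X_y=True)

      # Pattern recognition: project the data onto two principal components.
      X2 = PCA(n_components=2).fit_transform(X)

      # Decision tree trained on the reduced features.
      Xtr, Xte, ytr, yte = train_test_split(X2, y, random_state=0)
      tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(Xtr, ytr)
      print(f"held-out accuracy: {tree.score(Xte, yte):.2f}")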

  5. Management of Astronomical Software Projects with Open Source Tools

    Science.gov (United States)

    Briegel, F.; Bertram, T.; Berwein, J.; Kittmann, F.

    2010-12-01

    In this paper we offer an innovative approach to managing the software development process with free open source tools: for building and automated testing, a system that automates the compile/test cycle on a variety of platforms to validate code changes, using virtualization to compile in parallel on various operating system platforms; version control and change management; an enhanced wiki and issue tracking system for online documentation and reporting; and groupware tools such as blog, discussion and calendar. Starting with the Linc-Nirvana instrument, a new project and configuration management tool for developing astronomical software was sought. After evaluation of various systems of this kind, we are satisfied with the selection we are using now. Following the lead of Linc-Nirvana, most of the other software projects at the MPIA now use it.

  6. Concepts and tools for the software life cycle

    Science.gov (United States)

    Tausworthe, Robert C.

    1985-10-01

    The life cycle process for large software-intensive systems is an extremely intricate and complex process involving many people performing amid a very large base of evolving computer programs, documentation and data. To be successful, the process must be well conceived, planned and conducted; however, the nature of scientific and other high-technology projects involving large-scale software is such that conceptualization, planning and implementation to the degree of detail required is so labor-intensive and unmotivating as to be counter-productive and seldom cost-effective. The tools, techniques and aids needed to engineer, manage and administrate a large software-intensive task are themselves parts of a large software base, and are incurred only at great expense. This paper focuses on the needs of the software life cycle in terms of supporting tools and methodologies. The concept of a distributed network for engineering, management and administrative functions is outlined, and the key characteristics of localized subnets in high-communications-traffic areas of software activity are discussed. A formal, deliberate, structured, systems-engineered approach toward the construction of uniform, coordinated tools is proposed as a means to reduce development and maintenance costs, foster creativity, enhance reliability, promote standardization and sustain human motivation.

  7. Software engineering and data management for automated payload experiment tool

    Science.gov (United States)

    Maddux, Gary A.; Provancha, Anna; Chattam, David

    1994-01-01

    The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by University of Alabama in Huntsville (UAH) and provide versions of the software in a Macintosh and Windows compatible format. Appendix 1 science requirements document (SRD) Users Manual is attached.

  8. Development of a Software Tool to Automate ADCO Flight Controller Console Planning Tasks

    Science.gov (United States)

    Anderson, Mark G.

    2011-01-01

    This independent study project covers the development of the International Space Station (ISS) Attitude Determination and Control Officer (ADCO) Planning Exchange APEX Tool. The primary goal of the tool is to streamline existing manual and time-intensive planning tools into a more automated, user-friendly application that interfaces with existing products and allows the ADCO to produce accurate products and timelines more effectively. This paper will survey the current ISS attitude planning process and its associated requirements, goals, documentation and software tools and how a software tool could simplify and automate many of the planning actions which occur at the ADCO console. The project will be covered from inception through the initial prototype delivery in November 2011 and will include development of design requirements and software as well as design verification and testing.

  9. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Habib, Salman [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Roser, Robert [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2015-10-28

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  10. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    CERN Document Server

    Habib, Salman; LeCompte, Tom; Marshall, Zach; Borgland, Anders; Viren, Brett; Nugent, Peter; Asai, Makoto; Bauerdick, Lothar; Finkel, Hal; Gottlieb, Steve; Hoeche, Stefan; Sheldon, Paul; Vay, Jean-Luc; Elmer, Peter; Kirby, Michael; Patton, Simon; Potekhin, Maxim; Yanny, Brian; Calafiura, Paolo; Dart, Eli; Gutsche, Oliver; Izubuchi, Taku; Lyon, Adam; Petravick, Don

    2015-01-01

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  11. Automotive Software Engineering. Fundamentals, processes, methods, tools; Automotive Software Engineering. Grundlagen, Prozesse, Methoden und Werkzeuge

    Energy Technology Data Exchange (ETDEWEB)

    Schaeuffele, J.; Zurawka, T.

    2003-07-01

    The book presents fundamentals and practical examples of processes, methods and tools to ensure safe operation of electronic systems and software in motor vehicles. The focus is on the electronic systems of the powertrain, suspension and car body. Contents: the overall system of car, driver and environment; fundamentals; processes for the development of electronic systems and software; methods and tools for the development, production and servicing of electronic systems. The book addresses staff members of motor car producers and suppliers of electronic systems and software, as well as students of computer science, electrical and mechanical engineering specializing in car engineering, control engineering, mechatronics and software engineering.

  12. Evaluation of The Virtual Cells Software: a Teaching Tool

    Directory of Open Access Journals (Sweden)

    C.C.P. da Silva

    2005-07-01

    Full Text Available Studies show that the use of games and interactive materials at schools is a good educational strategy, motivating students to create mental outlines, developing reasoning and facilitating learning. In this context, the Scientific Dissemination Coordination of the Center for Structural Molecular Biotechnology (CBME) developed a series of educational materials destined for elementary and high schools, universities and the general public. Among these, we highlight the Virtual Cells software, which was developed with the aim of helping in the understanding of the basic concepts of cell types, their structures, organelles and specific functions. Characterized by its interactive interface, this software shows eukaryotic and prokaryotic cell images, where organelles are shown as dynamic structures. In addition, it presents exercises in another step that reinforce the comprehension of cytology. A speaker narrates the resources offered by the program and the necessary steps for its use. During the development of the software, students and teachers of public and private high schools from Sao Carlos city, Sao Paulo State, were invited to register their opinions regarding the language and content of the software in order to help us improve it. After this stage, the Scientific Dissemination Coordination of CBME organized a series of workshops, where 120 individuals evaluated the software (students and teachers of high school and other undergraduate students). For this evaluation, a questionnaire was elaborated based on the current international literature in the area of science teaching, and it was applied after the interactive session with the software. The analysis of the results demonstrated that most of the individuals considered the software of easy

  13. Software Tools for High-Performance Computiing: Survey and Recommendations

    Directory of Open Access Journals (Sweden)

    Bill Appelbe

    1996-01-01

    Full Text Available Applications programming for high-performance computing is notoriously difficult. Although parallel programming is intrinsically complex, the principal reason why high-performance computing is difficult is the lack of effective software tools. We believe that the lack of tools in turn is largely due to market forces rather than our inability to design and build such tools. Unfortunately, the poor availability and utilization of parallel tools hurt the entire supercomputing industry and the U.S. high performance computing initiative which is focused on applications. A disproportionate amount of resources is being spent on faster hardware and architectures, while tools are being neglected. This article introduces a taxonomy of tools, analyzes the major factors that contribute to this situation, and suggests ways that the imbalance could be redressed and the likely evolution of tools.

  14. PAnalyzer: A software tool for protein inference in shotgun proteomics

    Directory of Open Access Journals (Sweden)

    Prieto Gorka

    2012-11-01

    Full Text Available Background: Protein inference from peptide identifications in shotgun proteomics must deal with ambiguities that arise due to the presence of peptides shared between different proteins, which is common in higher eukaryotes. Recently, data-independent acquisition (DIA) approaches have emerged as an alternative to the traditional data-dependent acquisition (DDA) in shotgun proteomics experiments. MSE is the term used to name one of the DIA approaches used in QTOF instruments. MSE data require specialized software to process acquired spectra and to perform peptide and protein identifications. However, the software available at the moment does not group the identified proteins in a transparent way by taking into account peptide evidence categories. Furthermore, the inspection, comparison and reporting of the obtained results require tedious manual intervention. Here we report a software tool to address these limitations for MSE data. Results: In this paper we present PAnalyzer, a software tool focused on the protein inference process of shotgun proteomics. Our approach considers all the identified proteins and groups them when necessary, indicating their confidence using different evidence categories. PAnalyzer can read protein identification files in the XML output format of the ProteinLynx Global Server (PLGS) software provided by Waters Corporation for their MSE data, and also in the mzIdentML format recently standardized by HUPO-PSI. Multiple files can also be read simultaneously and are considered as technical replicates. Results are saved to CSV, HTML and mzIdentML (in the case of a single mzIdentML input file) files. An MSE analysis of a real sample is presented to compare the results of PAnalyzer and ProteinLynx Global Server. Conclusions: We present a software tool to deal with the ambiguities that arise in the protein inference process. Key contributions are support for MSE data analysis by ProteinLynx Global Server and technical replicates
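
    A minimal sketch of the grouping idea at the heart of protein inference: proteins supported by exactly the same set of peptides cannot be distinguished and are reported as one group. The peptide and protein names are hypothetical, and this is a simplification, not PAnalyzer's full algorithm or its evidence categories.

      from collections import defaultdict

      # Hypothetical peptide evidence: protein -> set of identified peptides.
      evidence = {
          "P1": {"pepA", "pepB"},
          "P2": {"pepA", "pepB"},   # same evidence as P1 -> indistinguishable
          "P3": {"pepC"},           # distinct evidence -> its own group
      }

      # Proteins sharing an identical peptide set collapse into one group.
      groups = defaultdict(list)
      for protein, peptides in evidence.items():
          groups[frozenset(peptides)].append(protein)

      for peptides, members in groups.items():
          print(sorted(members), "supported by", sorted(peptides))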

  15. Metamodelling Approach and Software Tools for Physical Modelling and Simulation

    Directory of Open Access Journals (Sweden)

    Vitaliy Mezhuyev

    2015-02-01

    Full Text Available In computer science, the metamodelling approach is becoming more and more popular for the purpose of software systems development. In this paper, we discuss the applicability of the metamodelling approach to the development of software tools for physical modelling and simulation. To define a metamodel for physical modelling, an analysis of physical models is carried out. The result of this analysis shows the invariant physical structures that we propose to use as the basic abstractions of the physical metamodel. This is a system of geometrical objects, allowing one to build a spatial structure of physical models and to set a distribution of physical properties. To such a geometry of distributed physical properties, different mathematical methods can be applied. To validate the proposed metamodelling approach, we consider the developed prototypes of software tools.

  16. Software Tools: A One-Semester Secondary School Computer Course.

    Science.gov (United States)

    Bromley, John; Lakatos, John

    1985-01-01

    Provides a course outline, describes equipment and teacher requirements, discusses student evaluation and course outcomes, and details the computer programs used in a high school course. The course is designed to teach students use of the microcomputer as a tool through hands-on experience with a variety of commercial software programs. (MBR)

  17. Metabolic interrelationships software application: Interactive learning tool for intermediary metabolism

    NARCIS (Netherlands)

    A.J.M. Verhoeven (Adrie); M. Doets (Mathijs); J.M.J. Lamers (Jos); J.F. Koster (Johan)

    2005-01-01

    textabstractWe developed and implemented the software application titled Metabolic Interrelationships as a self-learning and -teaching tool for intermediary metabolism. It is used by undergraduate medical students in an integrated organ systems-based and disease-oriented core curriculum, which start

  18. Software tools for quay crane exploitation and training

    OpenAIRE

    Dragomir Cristina; Breazu Alina

    2011-01-01

    One of the objectives of berth crane operations management in ports is the maximization of berth crane productivity, matched with the vessels' requirement of minimizing waiting times. This paper presents several management software tools for berth crane operations in ports that are used to achieve this objective.

  1. Chips: A Tool for Developing Software Interfaces Interactively.

    Science.gov (United States)

    Cunningham, Robert E.; And Others

    This report provides a detailed description of Chips, an interactive tool for developing software employing graphical/computer interfaces on Xerox Lisp machines. It is noted that Chips, which is implemented as a collection of customizable classes, provides the programmer with a rich graphical interface for the creation of rich graphical…

  2. A Software Tool for Integrated Optical Design Analysis

    Science.gov (United States)

    Moore, Jim; Troy, Ed; DePlachett, Charles; Montgomery, Edward (Technical Monitor)

    2001-01-01

    Design of large precision optical systems requires multi-disciplinary analysis, modeling, and design. Thermal, structural and optical characteristics of the hardware must be accurately understood in order to design a system capable of accomplishing the performance requirements. The interactions between the disciplines become stronger as systems are designed to be lighter weight for space applications. This coupling dictates a concurrent engineering design approach. In the past, integrated modeling tools have been developed that attempt to integrate all of the complex analysis within the framework of a single model. This often results in modeling simplifications, and it requires engineering specialists to learn new applications. The software described in this presentation addresses the concurrent engineering task using a different approach. The software tool, Integrated Optical Design Analysis (IODA), uses data fusion technology to enable a cross-discipline team of engineering experts to concurrently design an optical system using their standard validated engineering design tools.

  3. A Brief Review of Software Tools for Pangenomics

    Institute of Scientific and Technical Information of China (English)

    Jingfa Xiao; Zhewen Zhang; Jiayan Wu; Jun Yu

    2015-01-01

    Since the proposal for pangenomic study, there have been a dozen software tools actively in use for pangenomic analysis. By the end of 2014, Panseq and the pan-genomes analysis pipeline (PGAP) ranked as the top two most popular packages according to cumulative citations of peer-reviewed scientific publications. The functions of the software packages and tools, albeit variable among them, include categorizing orthologous genes, calculating pangenomic profiles, integrating gene annotations, and constructing phylogenies. As epigenomic elements are being gradually revealed in prokaryotes, it is expected that pangenomic databases and toolkits have to be extended to handle information of detailed functional annotations for genes and non-protein-coding sequences including non-coding RNAs, insertion elements, and conserved structural elements. To develop better bioinformatic tools, user feedback and integration of novel features are both of essence.

  4. A brief review of software tools for pangenomics.

    Science.gov (United States)

    Xiao, Jingfa; Zhang, Zhewen; Wu, Jiayan; Yu, Jun

    2015-02-01

    Since the proposal for pangenomic study, there have been a dozen software tools actively in use for pangenomic analysis. By the end of 2014, Panseq and the pan-genomes analysis pipeline (PGAP) ranked as the top two most popular packages according to cumulative citations of peer-reviewed scientific publications. The functions of the software packages and tools, albeit variable among them, include categorizing orthologous genes, calculating pangenomic profiles, integrating gene annotations, and constructing phylogenies. As epigenomic elements are being gradually revealed in prokaryotes, it is expected that pangenomic databases and toolkits have to be extended to handle information of detailed functional annotations for genes and non-protein-coding sequences including non-coding RNAs, insertion elements, and conserved structural elements. To develop better bioinformatic tools, user feedback and integration of novel features are both of essence.

  5. A Brief Review of Software Tools for Pangenomics

    Directory of Open Access Journals (Sweden)

    Jingfa Xiao

    2015-02-01

    Full Text Available Since the proposal for pangenomic study, there have been a dozen software tools actively in use for pangenomic analysis. By the end of 2014, Panseq and the pan-genomes analysis pipeline (PGAP) ranked as the top two most popular packages according to cumulative citations of peer-reviewed scientific publications. The functions of the software packages and tools, albeit variable among them, include categorizing orthologous genes, calculating pangenomic profiles, integrating gene annotations, and constructing phylogenies. As epigenomic elements are being gradually revealed in prokaryotes, it is expected that pangenomic databases and toolkits have to be extended to handle information of detailed functional annotations for genes and non-protein-coding sequences including non-coding RNAs, insertion elements, and conserved structural elements. To develop better bioinformatic tools, user feedback and integration of novel features are both of essence.
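
    The basic pangenomic profile reduces to set operations on per-genome gene content: the core genome is the intersection of the gene families across genomes, and the pan-genome is their union. A sketch with illustrative gene families (this is the underlying arithmetic, not the Panseq or PGAP pipelines):

      # Hypothetical orthologous gene families found in three genomes.
      genomes = {
          "strain1": {"gene1", "gene2", "gene3"},
          "strain2": {"gene1", "gene2", "gene4"},
          "strain3": {"gene1", "gene3", "gene4", "gene5"},
      }

      core = set.intersection(*genomes.values())  # present in every genome
      pan = set.union(*genomes.values())          # present in at least one
      print(f"core genome size: {len(core)} ({sorted(core)})")
      print(f"pan-genome size:  {len(pan)}")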

  6. DAE Tools: equation-based object-oriented modelling, simulation and optimisation software

    Directory of Open Access Journals (Sweden)

    Dragan D. Nikolić

    2016-04-01

    Full Text Available In this work, DAE Tools modelling, simulation and optimisation software, its programming paradigms and main features are presented. The current approaches to mathematical modelling, such as the use of modelling languages and general-purpose programming languages, are analysed. The common set of capabilities required by typical simulation software are discussed, and the shortcomings of the current approaches recognised. A new hybrid approach is introduced, and the modelling languages and the hybrid approach are compared in terms of the grammar, compiler, parser and interpreter requirements, maintainability and portability. The most important characteristics of the new approach are discussed, such as: (1) support for runtime model generation; (2) support for runtime simulation set-up; (3) support for complex runtime operating procedures; (4) interoperability with third-party software packages (i.e. NumPy/SciPy); (5) suitability for embedding and use as a web application or software as a service; and (6) code-generation, model-exchange and co-simulation capabilities. The benefits of an equation-based approach to modelling, implemented in a fourth-generation object-oriented general-purpose programming language such as Python, are discussed. The architecture and the software implementation details, as well as the type of problems that can be solved using DAE Tools software, are described. Finally, some applications of the software at different levels of abstraction are presented, and its embedding capabilities and suitability for use as software as a service are demonstrated.
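
    For flavour, a tiny equation-based model (a tank draining through a valve, A*dh/dt = q_in - k*sqrt(h)) written directly in Python with SciPy; this is a generic sketch of equation-based modelling in Python, not the DAE Tools API.

      from scipy.integrate import solve_ivp

      A, q_in, k = 1.5, 0.3, 0.4   # illustrative tank area, inflow, valve constant

      def tank(t, y):
          h = max(y[0], 0.0)       # liquid level must stay non-negative
          return [(q_in - k * h ** 0.5) / A]

      sol = solve_ivp(tank, (0.0, 50.0), [1.0], max_step=0.5)
      print(f"level after 50 s: {sol.y[0][-1]:.3f} m "
            f"(analytic steady state {(q_in / k) ** 2:.3f} m)")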

  7. Development of a software tool for an internal dosimetry using MIRD method

    Science.gov (United States)

    Chaichana, A.; Tocharoenchai, C.

    2016-03-01

    Currently, many software packages for internal radiation dosimetry have been developed, but many of them do not provide sufficient tools to perform all of the necessary steps from nuclear medicine image analysis to dose calculation. For this reason, we developed CALRADDOSE, a software package that performs internal dosimetry using the MIRD method within a single environment. MATLAB version 2015a was used as the development tool. The calculation process of this software proceeds from collecting time-activity data from image data, followed by residence time calculation and absorbed dose calculation using the MIRD method. To evaluate the accuracy of this software, we calculated the residence times and absorbed doses of 5 Ga-67 studies and 5 I-131 MIBG studies, and then compared the results with those obtained from the OLINDA/EXM software. The results showed no statistically significant differences between the residence times and absorbed doses obtained from the two software packages. CALRADDOSE is a user-friendly, graphical user interface-based software package for internal dosimetry. It provides fast and accurate results, which may be useful for routine work.
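
    To make the MIRD arithmetic concrete, here is a minimal sketch (not CALRADDOSE code) of the quantities the abstract mentions: cumulated activity integrated from a time-activity curve with a physical-decay tail, the residence time, and the absorbed dose as cumulated activity times an S-value. All activities, sampling times and the S-value are made up.

        # Hedged sketch of the MIRD formalism: D = A_tilde * S, tau = A_tilde / A0.
        import numpy as np

        t = np.array([1.0, 4.0, 24.0, 48.0])    # hours post injection (hypothetical)
        a = np.array([35.0, 30.0, 18.0, 10.0])  # organ activity, MBq (hypothetical)
        a0 = 40.0                               # administered activity, MBq

        lam = np.log(2) / 78.3                  # Ga-67 physical decay constant, 1/h
        trapezoid = float(np.sum((t[1:] - t[:-1]) * (a[1:] + a[:-1]) / 2.0))
        a_tilde = trapezoid + a[-1] / lam       # measured area + decay-only tail
        tau = a_tilde / a0                      # residence time, h
        s_value = 1.0e-5                        # hypothetical S-value, mGy/(MBq*h)
        print(f"tau = {tau:.1f} h, dose = {a_tilde * s_value:.3f} mGy")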

  8. Software tools at the Rome CMS/ECAL Regional Center

    CERN Document Server

    Organtini, G

    2001-01-01

    The construction of the CMS electromagnetic calorimeter is under way in Rome and at CERN. To this purpose, two Regional Centers were set up in both sites. In Rome, the project was entirely carried out using new software technologies such as object oriented programming, object databases, CORBA programming and Web tools. It can be regarded as a use case for the evaluation of the benefits of new software technologies in high energy physics. Our experience is positive and encouraging for the future. (10 refs).

  9. CLAIRE, an event-driven simulation tool for testing software

    Energy Technology Data Exchange (ETDEWEB)

    Raguideau, J.; Schoen, D.; Henry, J.Y.; Boulc`h, J.

    1994-06-15

    CLAIRE is a software tool created to perform validations on executable codes or on specifications of distributed real-time applications for nuclear safety. CLAIRE can be used both to verify the safety properties by modelling the specifications, and also to validate the final code by simulating the behaviour of its equipment and software interfaces. It can be used to observe and provide dynamic control of the simulation process, and also to record changes to the simulated data for off-line analysis. (R.P.).

  10. Software Tool Integrating Data Flow Diagrams and Petri Nets

    Science.gov (United States)

    Thronesbery, Carroll; Tavana, Madjid

    2010-01-01

    Data Flow Diagram - Petri Net (DFPN) is a software tool for analyzing other software to be developed. The full name of this program reflects its design, which combines the benefit of data-flow diagrams (which are typically favored by software analysts) with the power and precision of Petri-net models, without requiring specialized Petri-net training. (A Petri net is a particular type of directed graph, a description of which would exceed the scope of this article.) DFPN assists a software analyst in drawing and specifying a data-flow diagram, then translates the diagram into a Petri net, then enables graphical tracing of execution paths through the Petri net for verification, by the end user, of the properties of the software to be developed. In comparison with prior means of verifying the properties of software to be developed, DFPN makes verification by the end user more nearly certain, thereby making it easier to identify and correct misconceptions earlier in the development process, when correction is less expensive. After the verification by the end user, DFPN generates a printable system specification in the form of descriptions of processes and data.
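
    As a rough sketch of the Petri-net execution that DFPN traces (independent of DFPN itself, with an invented two-place example), a transition fires only when every one of its input places holds a token:

        # Token-game sketch: fire a transition when all input places are marked.
        marking = {"request": 1, "resource": 1, "done": 0}
        transitions = {"serve": (["request", "resource"], ["done"])}  # (inputs, outputs)

        def enabled(name):
            inputs, _ = transitions[name]
            return all(marking[p] > 0 for p in inputs)

        def fire(name):
            inputs, outputs = transitions[name]
            for p in inputs:
                marking[p] -= 1
            for p in outputs:
                marking[p] += 1

        if enabled("serve"):
            fire("serve")
        print(marking)   # {'request': 0, 'resource': 0, 'done': 1}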

  11. COSTMODL: An automated software development cost estimation tool

    Science.gov (United States)

    Roush, George B.

    1991-01-01

    The cost of developing computer software continues to consume an increasing portion of many organizations' total budgets, in both the public and private sectors. As this trend develops, the capability to produce reliable estimates of the effort and schedule required to develop a candidate software product takes on increasing importance. The COSTMODL program was developed to provide an in-house capability to perform development cost estimates for NASA software projects. COSTMODL is an automated software development cost estimation tool which incorporates five cost estimation algorithms, including the latest models for the Ada language and incrementally developed products. The principal characteristic which sets COSTMODL apart from other software cost estimation programs is its capacity to be completely customized to a particular environment. The estimation equations can be recalibrated to reflect the programmer productivity characteristics demonstrated by the user's organization, and the set of significant factors which affect software development costs can be customized to reflect any unique properties of the user's development environment. Careful use of a capability such as COSTMODL can significantly reduce the risk of cost overruns and failed projects.
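
    COSTMODL's own equations are not reproduced in the abstract; as a hedged stand-in, the sketch below uses the basic COCOMO form effort = a * KLOC^b to show what recalibrating the coefficients to local productivity data looks like (the alternative a and b values are invented).

        # COCOMO-style sketch; a and b are the recalibratable coefficients.
        def effort_person_months(kloc, a=2.4, b=1.05):
            return a * kloc ** b

        print(effort_person_months(50))                  # default calibration
        print(effort_person_months(50, a=3.0, b=1.12))   # locally recalibrated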

  12. Development of the software generation method using model driven software engineering tool

    Energy Technology Data Exchange (ETDEWEB)

    Jang, H. S.; Jeong, J. C.; Kim, J. H.; Han, H. W.; Kim, D. Y.; Jang, Y. W. [KOPEC, Taejon (Korea, Republic of); Moon, W. S. [NEXTech Inc., Seoul (Korea, Republic of)

    2003-10-01

    Methodologies to generate automated software design specifications and source code for nuclear I and C systems software using a model-driven language are developed in this work. For qualitative analysis of the algorithm, the activity diagram is modeled and generated using the Unified Modeling Language (UML), and then the sequence diagram is designed for automated source code generation. For validation of the generated code, code audits and module tests are performed using a test and QA tool. The code coverage and complexity of the example code are examined in this stage. The low pressure pressurizer reactor trip module of the Plant Protection System was programmed as the subject for this task. The test results show that errors in the generated source code were easily detected using the test tool. The accuracy of input/output processing by the execution modules was clearly identified.

  13. Software Tools for Electrical Quality Assurance in the LHC

    CERN Document Server

    Bednarek, Mateusz

    2011-01-01

    There are over 1600 superconducting magnet circuits in the LHC machine. Many of them consist of a large number of components electrically connected in series. This enhances the sensitivity of the whole circuits to electrical faults of individual components. Furthermore, circuits are equipped with a large number of instrumentation wires, which are exposed to accidental damage or swapping. In order to ensure safe operation, an Electrical Quality Assurance (ELQA) campaign is needed after each thermal cycle. Due to the complexity of the circuits, as well as their distant geographical distribution (tunnel of 27km circumference divided in 8 sectors), suitable software and hardware platforms had to be developed. The software combines an Oracle database, LabView data acquisition applications and PHP-based web follow-up tools. This paper describes the software used for the ELQA of the LHC.

  14. The RUMBA software: tools for neuroimaging data analysis.

    Science.gov (United States)

    Bly, Benjamin Martin; Rebbechi, Donovan; Hanson, Stephen Jose; Grasso, Giorgio

    2004-01-01

    The enormous scale and complexity of data sets in functional neuroimaging makes it crucial to have well-designed and flexible software for image processing, modeling, and statistical analysis. At present, researchers must choose between general purpose scientific computing environments (e.g., Splus and Matlab), and specialized human brain mapping packages that implement particular analysis strategies (e.g., AFNI, SPM, VoxBo, FSL or FIASCO). For the vast majority of users in Human Brain Mapping and Cognitive Neuroscience, general purpose computing environments provide an insufficient framework for a complex data-analysis regime. On the other hand, the operational particulars of more specialized neuroimaging analysis packages are difficult or impossible to modify and provide little transparency or flexibility to the user for approaches other than massively multiple comparisons based on inferential statistics derived from linear models. In order to address these problems, we have developed open-source software that allows a wide array of data analysis procedures. The RUMBA software includes programming tools that simplify the development of novel methods, and accommodates data in several standard image formats. A scripting interface, along with programming libraries, defines a number of useful analytic procedures, and provides an interface to data analysis procedures. The software also supports a graphical functional programming environment for implementing data analysis streams based on modular functional components. With these features, the RUMBA software provides researchers programmability, reusability, modular analysis tools, novel data analysis streams, and an analysis environment in which multiple approaches can be contrasted and compared. The RUMBA software retains the flexibility of general scientific computing environments while adding a framework in which both experts and novices can develop and adapt neuroimaging-specific analyses.

  15. Constructing an advanced software tool for planetary atmospheric modeling

    Science.gov (United States)

    Keller, Richard M.; Sims, Michael; Podolak, Ester; Mckay, Christopher

    1990-01-01

    Scientific model building can be an intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot be easily distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. In this paper, we describe a prototype for a scientific modeling software tool that serves as an aid to the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high level domain specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities. Our prototype has been developed in the domain of planetary atmospheric modeling, and is being used to construct models of Titan's atmosphere.

  16. Proposing a Mathematical Software Tool in Physics Secondary Education

    Directory of Open Access Journals (Sweden)

    Konstantinos B. Baltzis

    2009-03-01

    Full Text Available MathCad® is a very popular software tool for mathematical and statistical analysis in science and engineering. Its low cost, ease of use, extensive function library, and worksheet–like user interface distinguish it among other commercial packages. Its features are also well suited to educational process. The use of natural mathematical notation and built–in measurement units are its two major advantages in teaching and learning. In this paper, its complementary use in the upper secondary physics education in Greece is explored. In order to demonstrate its application in the teaching process, a set of representative examples are presented. The main features and advantages of the software are also pointed out. The paper aims to present the benefits of the application of mathematical information technology tools in secondary physics education. In this effort, MathCad® is probably the most promising solution.

  17. Northwestern University Schizophrenia Data and Software Tool (NUSDAST)

    OpenAIRE

    Wang, Lei; Kogan, Alex; Cobia, Derin; Alpert, Kathryn; Kolasny, Anthony; Miller, Michael I.; Marcus, Daniel

    2013-01-01

    The schizophrenia research community has invested substantial resources on collecting, managing and sharing large neuroimaging datasets. As part of this effort, our group has collected high resolution magnetic resonance (MR) datasets from individuals with schizophrenia, their non-psychotic siblings, healthy controls and their siblings. This effort has resulted in a growing resource, the Northwestern University Schizophrenia Data and Software Tool (NUSDAST), an NIH-funded data sharing project ...

  18. Assessment tool for pharmacy drug-drug interaction software.

    Science.gov (United States)

    Warholak, Terri L; Hines, Lisa E; Saverno, Kim R; Grizzle, Amy J; Malone, Daniel C

    2011-01-01

    To assess the performance of pharmacy clinical decision support (CDS) systems for drug-drug interaction (DDI) detection and to identify approaches for improving the ability to recognize important DDIs. Pharmacists rely on CDS systems to assist in the identification of DDIs, and research suggests that these systems perform suboptimally. The software evaluation tool described here may be used in all pharmacy settings that use electronic decision support to detect potential DDIs, including large and small community chain pharmacies, community independent pharmacies, hospital pharmacies, and governmental facility pharmacies. A tool is provided to determine the ability of pharmacy CDS systems to identify established DDIs. It can be adapted to evaluate potential DDIs that reflect local practice patterns and patient safety priorities. Beyond assessing software performance, going through the evaluation processes creates the opportunity to evaluate inadequacies in policies, procedures, workflow, and training of all pharmacy staff relating to pharmacy information systems and DDIs. The DDI evaluation tool can be used to assess pharmacy information systems' ability to recognize relevant DDIs. Suggestions for improvement include determining whether the software allows for customization, creating standard policies for handling specific interactions, and ensuring that drug knowledge database updates occur frequently.

  19. Software Information Base (SIB) and Its Integration with Data Flow Diagram (DFD) Tool

    Institute of Scientific and Technical Information of China (English)

    董士海

    1989-01-01

    The software information base is the main technique for the integration of a software engineering environment. The data flow diagram tool is an important software tool supporting the software requirement analysis phase. This article first introduces the functions and structures of a Software Information Base (SIB) and a Data Flow Diagram tool. The E-R data model of the SIB and its integration with the Data Flow Diagram tool are then described in detail.

  20. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens

    2004-01-01

    This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended to be used as part of an industrial systems design cycle. Structural analysis is a graph-based technique where principal relations between variables express the system...... of the graph. SaTool analyses the structure graph to provide knowledge about fundamental properties of the system in normal and faulty conditions. Salient features of SaTool include rapid analysis of the possibility to diagnose faults and the ability to make autonomous recovery should faults occur.

  1. Regulatory network operations in the Pathway Tools software

    Directory of Open Access Journals (Sweden)

    Paley Suzanne M

    2012-09-01

    Full Text Available Background: Biologists are elucidating complex collections of genetic regulatory data for multiple organisms, and software is needed to manage such regulatory network data. Results: The Pathway Tools software supports storage and manipulation of regulatory information through a variety of strategies. The Pathway Tools regulation ontology captures transcriptional and translational regulation, substrate-level regulation of enzyme activity, post-translational modifications, and regulatory pathways. Regulatory visualizations include a novel diagram that summarizes all regulatory influences on a gene; a transcription-unit diagram; and an interactive visualization of a full transcriptional regulatory network that can be painted with gene expression data to probe correlations between gene expression and regulatory mechanisms. We introduce a novel type of enrichment analysis that asks whether a gene-expression dataset is over-represented for known regulators. We present algorithms for ranking the degree of regulatory influence of genes, and for computing the net positive and negative regulatory influences on a gene. Conclusions: Pathway Tools provides a comprehensive environment for manipulating molecular regulatory interactions that integrates regulatory data with an organism's genome and metabolic network. Curated collections of regulatory data authored using Pathway Tools are available for Escherichia coli, Bacillus subtilis, and Shewanella oneidensis.
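
    The enrichment analysis described above is, in essence, an over-representation test. A minimal sketch using a hypergeometric test follows; all gene counts are invented, and no claim is made that Pathway Tools uses exactly this statistic.

        # Sketch: is a differentially expressed gene set enriched for the
        # known targets of one regulator? (hypothetical counts throughout)
        from scipy.stats import hypergeom

        N = 4000   # genes in the genome
        K = 120    # known targets of the regulator
        n = 200    # differentially expressed genes
        k = 18     # regulator targets among the differentially expressed genes

        p_value = hypergeom.sf(k - 1, N, K, n)   # P(X >= k) without replacement
        print(f"enrichment p-value: {p_value:.2e}")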

  2. C++ Software Quality in the ATLAS Experiment: Tools and Experience

    CERN Document Server

    Kluth, Stefan; The ATLAS collaboration; Obreshkov, Emil; Roe, Shaun; Seuster, Rolf; Snyder, Scott; Stewart, Graeme

    2016-01-01

    The ATLAS experiment at CERN uses about six million lines of code and currently has about 420 developers, whose background is largely in physics. In this paper we explain how the C++ code quality is managed using a range of tools, from compile time through to run-time testing, and reflect on the great progress made in the last year, largely through the use of static analysis tools such as Coverity®, an industry-standard tool which enables quality comparison with general open-source C++ code. Other tools, including cppcheck, Include-What-You-Use and run-time 'sanitizers', are also discussed.

  3. Classroom Live: a software-assisted gamification tool

    Science.gov (United States)

    de Freitas, Adrian A.; de Freitas, Michelle M.

    2013-06-01

    Teachers have come to rely on a variety of approaches in order to elicit and sustain student interest in the classroom. One particular approach, known as gamification, seeks to improve student engagement by transforming the traditional classroom experience into a competitive multiplayer game. Initial attempts at classroom gamification relied on the teacher manually tracking student progress. At the US Air Force Academy, we wanted to experiment with a software gamification tool. Our client/server suite, dubbed Classroom Live, streamlines the gamification process for the teacher by simplifying common tasks. Simultaneously, the tool provides students with an esthetically pleasing user interface that offers in-game rewards in exchange for their participation. Classroom Live is still in development, but our initial experience using the tool has been extremely positive and confirms our belief that students respond positively to gamification, even at the undergraduate level.

  4. A NEO population generation and observation simulation software tool

    Science.gov (United States)

    Müller, Sven; Gelhaus, Johannes; Hahn, Gerhard; Franco, Raffaella

    One of the main targets of ESA's Space Situational Awareness (SSA) program is to build a wide knowledge base about objects that can potentially harm Earth (Near-Earth Objects, NEOs). An important part of this effort is to create the Small Bodies Data Centre (SBDC), which is going to aggregate measurement data from a fully-integrated NEO observation sensor network. Until this network is developed, artificial NEO measurement data is needed in order to validate SBDC algorithms. Moreover, to establish a functioning NEO observation sensor network, it has to be determined where to place sensors, what technical requirements have to be met in order to be able to detect NEOs, and which observation strategies work best. Because of this, a sensor simulation software was needed. This paper presents a software tool which allows users to create and analyse NEO populations and to simulate and analyse population observations. It is a console program written in Fortran and comes with a Graphical User Interface (GUI) written in Java and C. The tool can be divided into the components "Population Generator" and "Observation Simulator". The Population Generator component is responsible for generating and analysing a NEO population. Users can choose between creating fictitious (random) and synthetic populations. The latter are based on one of two models describing the orbital and size distribution of observed NEOs: the existing so-called "Bottke Model" (Bottke et al. 2000, 2002) and the new "Granvik Model" (Granvik et al. 2014, in preparation), which has been developed in parallel to the tool. Generated populations can be analysed by defining 2D, 3D and scatter plots using various NEO attributes. As a result, the tool creates the appropriate files for the plotting tool "gnuplot". The tool's Observation Simulator component provides the Observation Simulation and Observation Analysis functions. Users can define sensor systems using ground- or space-based locations as well as

  5. Tools and Behavior Abstraction: A Future for Software Engineering

    Directory of Open Access Journals (Sweden)

    Wilson Solís

    2012-06-01

    Full Text Available Software engineers rely on tools to automatically analyse code and design specifications in detail. Although many such tools are still used mainly to find new defects in old code, they are expected to play a larger role in software engineering in the future and to be available to developers while they are editing their products. If tools could be built that are fast enough and easy to use, software engineers would apply them to improve design and product development. To solve problems, traditional engineering relies on programming languages; however, the level of abstraction of the most popular languages is not much higher than that of C programs from several decades ago. Moreover, this level is the same throughout the code and leaves no room for behaviour abstraction, in which the design is divided into phases that gradually introduce more detail. This article presents a study of the need for a larger set of analysis tools for creating languages and development environments that provide good support for this kind of abstraction.

  6. Evaluation of computer-aided software engineering tools for data base development

    Energy Technology Data Exchange (ETDEWEB)

    Woyna, M.A.; Carlson, C.R.

    1989-02-01

    More than 80 computer-aided software engineering (CASE) tools were evaluated to determine their usefulness in data base development projects. The goal was to review the current state of the CASE industry and recommend one or more tools for inclusion in the uniform development environment (UDE), a programming environment being designed by Argonne National Laboratory for the US Department of Defense Organization of the Joint Chiefs of Staff, J-8 Directorate. This environment gives a computer programmer a consistent user interface and access to a full suite of tools and utilities for software development. In an effort to identify tools that would be useful in the planning, analysis, design, implementation, and maintenance of Argonne's data base development projects for the J-8 Directorate, we evaluated 83 commercially available CASE products. This report outlines the method used and presents the results of the evaluation.

  7. Software tool for horizontal-axis wind turbine simulation

    Energy Technology Data Exchange (ETDEWEB)

    Vitale, A.J. [Instituto Argentino de Oceanografia, Camino La Carrindanga Km. 7, 5 CC 804, B8000FWB Bahia Blanca (Argentina); Universidad Tecnologica Nacional Facultad Regional Bahia Blanca, GESE, 11 de Abril 461, B8000LMI Bahia Blanca (Argentina); Dpto. de Ing. Electrica y de Computadoras, Universidad Nacional del Sur, Av. Alem 1253, 8000 Bahia Blanca (Argentina); Rossi, A.P. [Universidad Tecnologica Nacional Facultad Regional Bahia Blanca, GESE, 11 de Abril 461, B8000LMI Bahia Blanca (Argentina); Dpto. de Ing. Electrica y de Computadoras, Universidad Nacional del Sur, Av. Alem 1253, 8000 Bahia Blanca (Argentina)

    2008-07-15

    The main problem of a wind turbine generator design project is the design of the right blades capable of satisfying the specific energy requirement of an electric system with optimum performance. Once the blade has been designed for optimum operation at a particular rotor angular speed, it is necessary to determine the overall performance of the rotor under the range of wind speed that it will encounter. A software tool that simulates low-power, horizontal-axis wind turbines was developed for this purpose. With this program, the user can calculate the rotor power output for any combination of wind and rotor speeds, with definite blade shape and airfoil characteristics. The software also provides information about distribution of forces along the blade span, for different operational conditions. (author)
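
    For orientation, the rotor power output such a tool computes reduces, at a given operating point, to P = 0.5 * rho * A * Cp * v^3. The sketch below evaluates it for an invented small rotor and an assumed power coefficient, whereas the actual program derives the power coefficient from the blade geometry and airfoil characteristics.

        # Rotor power sketch: P = 0.5 * rho * A * Cp * v^3 (all numbers invented).
        import math

        rho = 1.225       # air density, kg/m^3
        radius = 2.5      # rotor radius, m
        cp = 0.40         # assumed power coefficient at the chosen tip-speed ratio

        def rotor_power(v_wind):
            area = math.pi * radius ** 2
            return 0.5 * rho * area * cp * v_wind ** 3   # watts

        for v in (4.0, 8.0, 12.0):
            print(f"{v:4.1f} m/s -> {rotor_power(v) / 1000:6.2f} kW")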

  8. AMIDE: A Free Software Tool for Multimodality Medical Image Analysis

    Directory of Open Access Journals (Sweden)

    Andreas Markus Loening

    2003-07-01

    Full Text Available Amide's a Medical Image Data Examiner (AMIDE) has been developed as a user-friendly, open-source software tool for displaying and analyzing multimodality volumetric medical images. Central to the package's ability to simultaneously display multiple data sets (e.g., PET, CT, MRI) and regions of interest is the on-demand data reslicing implemented within the program. Data sets can be freely shifted, rotated, viewed, and analyzed, with the program automatically handling interpolation as needed from the original data. Validation has been performed by comparing the output of AMIDE with that of several existing software packages. AMIDE runs on UNIX, Macintosh OS X, and Microsoft Windows platforms, and it is freely available with source code under the terms of the GNU General Public License.

  9. Northwestern University Schizophrenia Data and Software Tool (NUSDAST

    Directory of Open Access Journals (Sweden)

    Lei eWang

    2013-11-01

    Full Text Available The schizophrenia research community has invested substantial resources on collecting, managing and sharing large neuroimaging datasets. As part of this effort, our group has collected high resolution magnetic resonance (MR) datasets from individuals with schizophrenia, their non-psychotic siblings, healthy controls and their siblings. This effort has resulted in a growing resource, the Northwestern University Schizophrenia Data and Software Tool (NUSDAST), an NIH-funded data sharing project to stimulate new research. This resource resides on XNAT Central, and it contains neuroimaging (MR scans, landmarks and surface maps for deep subcortical structures, and FreeSurfer cortical parcellation and measurement data), cognitive (cognitive domain scores for crystallized intelligence, working memory, episodic memory, and executive function), clinical (demographic, sibling relationship, SAPS and SANS psychopathology), and genetic (20 polymorphisms) data, collected from more than 450 subjects, most with 2-year longitudinal follow-up. A neuroimaging mapping, analysis and visualization software tool, CAWorks, is also part of this resource. Moreover, in making our existing neuroimaging data along with the associated meta-data and computational tools publicly accessible, we have established a web-based information retrieval portal that allows the user to efficiently search the collection. This research-ready dataset meaningfully combines neuroimaging data with other relevant information, and it can be used to help facilitate advancing neuroimaging research. It is our hope that this effort will help to overcome some of the commonly recognized technical barriers in advancing neuroimaging research, such as lack of local organization and standard descriptions.

  10. Northwestern University Schizophrenia Data and Software Tool (NUSDAST).

    Science.gov (United States)

    Wang, Lei; Kogan, Alex; Cobia, Derin; Alpert, Kathryn; Kolasny, Anthony; Miller, Michael I; Marcus, Daniel

    2013-01-01

    The schizophrenia research community has invested substantial resources on collecting, managing and sharing large neuroimaging datasets. As part of this effort, our group has collected high resolution magnetic resonance (MR) datasets from individuals with schizophrenia, their non-psychotic siblings, healthy controls and their siblings. This effort has resulted in a growing resource, the Northwestern University Schizophrenia Data and Software Tool (NUSDAST), an NIH-funded data sharing project to stimulate new research. This resource resides on XNAT Central, and it contains neuroimaging (MR scans, landmarks and surface maps for deep subcortical structures, and FreeSurfer cortical parcellation and measurement data), cognitive (cognitive domain scores for crystallized intelligence, working memory, episodic memory, and executive function), clinical (demographic, sibling relationship, SAPS and SANS psychopathology), and genetic (20 polymorphisms) data, collected from more than 450 subjects, most with 2-year longitudinal follow-up. A neuroimaging mapping, analysis and visualization software tool, CAWorks, is also part of this resource. Moreover, in making our existing neuroimaging data along with the associated meta-data and computational tools publicly accessible, we have established a web-based information retrieval portal that allows the user to efficiently search the collection. This research-ready dataset meaningfully combines neuroimaging data with other relevant information, and it can be used to help facilitate advancing neuroimaging research. It is our hope that this effort will help to overcome some of the commonly recognized technical barriers in advancing neuroimaging research, such as lack of local organization and standard descriptions.

  11. PLAGIARISM DETECTION PROBLEMS AND ANALYSIS OF SOFTWARE TOOLS FOR THEIR SOLUTION

    Directory of Open Access Journals (Sweden)

    V. I. Shynkarenko

    2017-02-01

    Full Text Available Purpose. This study is aimed at: (1) defining plagiarism in texts in formal and natural languages and building a taxonomy of plagiarism; (2) identifying the major problems of plagiarism detection when using automated tools to solve them; (3) analysing and systematising the information obtained during the review, testing and analysis of existing detection systems. Methodology. To identify the requirements for plagiarism-detection software, methods of analysis of normative documentation (the legislative base) and of competitive tools were applied. To check the requirements, testing methods and reviews of GUI interfaces were used. Findings. The paper considers the concept of plagiarism and issues of its proliferation and classification. A review of existing systems for identifying plagiarism is given, covering desktop applications and online resources. Their functional characteristics are highlighted, the formats of input and output data and the constraints on them are determined, along with customisation features and access. A drill-down of system requirements is made. Originality. The authors propose schemes that complement the existing hierarchical taxonomy of plagiarism. The analysis of existing systems is done in terms of functionality and the possibilities for use with large amounts of data. Practical value. The practical significance is determined by the breadth of the problem of plagiarism in various fields. Ukraine is developing the legal framework for the fight against plagiarism, which requires the active development, improvement and delivery of relevant software. This work contributes to the solution of these problems. The review of existing anti-plagiarism programs, together with the study of research experience in the field and an updated concept of plagiarism, allows the functional and performance requirements and the input and output of the developed software to be articulated more fully, as well as identifying the features of such software. The article focuses on the features of solving the

  12. Proceedings of the Workshop on software tools for distributed intelligent control systems

    Energy Technology Data Exchange (ETDEWEB)

    Herget, C.J. (ed.)

    1990-09-01

    The Workshop on Software Tools for Distributed Intelligent Control Systems was organized by Lawrence Livermore National Laboratory for the United States Army Headquarters Training and Doctrine Command and the Defense Advanced Research Projects Agency. The goals of the workshop were to identify the current state of the art in tools which support control systems engineering design and implementation; identify research issues associated with writing software tools which would provide a design environment to assist engineers in multidisciplinary control design and implementation; formulate a potential investment strategy to resolve the research issues and develop public domain code which can form the core of more powerful engineering design tools; and recommend test cases to focus the software development process and test associated performance metrics. Recognizing that the development of software tools for distributed intelligent control systems will require a multidisciplinary effort, experts in systems engineering, control systems engineering, and computer science were invited to participate in the workshop. In particular, experts who could address the following topics were selected: operating systems, engineering data representation and manipulation, emerging standards for manufacturing data, mathematical foundations, coupling of symbolic and numerical computation, user interface, system identification, system representation at different levels of abstraction, system specification, system design, verification and validation, automatic code generation, and integration of modular, reusable code.

  13. QUALITY SERVICES EVALUATION MODEL BASED ON DEDICATED SOFTWARE TOOL

    Directory of Open Access Journals (Sweden)

    ANDREEA CRISTINA IONICĂ

    2012-10-01

    Full Text Available In this paper we introduce a new model, called Service Quality (SQ), which combines the QFD and SERVQUAL methods. This model takes from the SERVQUAL method the five dimensions of requirements and the three dimensions of characteristics, and from the QFD method the application methodology. The originality of the SQ model consists in computing a global index that reflects the level to which the quality characteristics fulfil the customers' requirements. In order to prove the viability of the SQ model, a software tool was developed and applied to the evaluation of a health care services provider.

  14. Object-Oriented Software Tools for the Construction of Preconditioners

    Directory of Open Access Journals (Sweden)

    Eva Mossberg

    1997-01-01

    Full Text Available In recent years, there has been considerable progress concerning preconditioned iterative methods for large and sparse systems of equations arising from the discretization of differential equations. Such methods are particularly attractive in the context of high-performance (parallel computers. However, the implementation of a preconditioner is a nontrivial task. The focus of the present contribution is on a set of object-oriented software tools that support the construction of a family of preconditioners based on fast transforms. By combining objects of different classes, it is possible to conveniently construct any preconditioner within this family.
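
    As a hedged sketch of the idea (not the paper's class library), the fragment below builds a circulant preconditioner for a Toeplitz system and applies its inverse with the FFT inside a Krylov solver; the matrix, the Strang-style circulant approximation and the problem size are all invented for the example.

        # FFT-applied circulant preconditioner for a 1-D Laplacian-type system.
        import numpy as np
        from scipy.linalg import toeplitz
        from scipy.sparse.linalg import LinearOperator, gmres

        n = 256
        col = np.zeros(n); col[0], col[1] = 2.0, -1.0
        A = toeplitz(col)                        # tridiagonal Toeplitz matrix

        c = np.zeros(n); c[0], c[1], c[-1] = 2.0, -1.0, -1.0
        eig = np.fft.fft(c)                      # eigenvalues of the circulant
        eig[np.abs(eig) < 1e-12] = 1.0           # guard the singular mode

        def apply_minv(x):                       # solve the circulant system via FFT
            return np.real(np.fft.ifft(np.fft.fft(x) / eig))

        M = LinearOperator((n, n), matvec=apply_minv)
        b = np.ones(n)
        x, info = gmres(A, b, M=M)
        print(info, np.linalg.norm(A @ x - b))   # info == 0 means converged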

  15. A computer-aided software-tool for sustainable process synthesis-intensification

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan; Babi, Deenesh K.; Bottlaender, Jack

    2017-01-01

    and determine within the design space, the more sustainable processes. In this paper, an integrated computer-aided software-tool that searches the design space for hybrid/intensified more sustainable process options is presented. Embedded within the software architecture are process synthesis...... constraints while also matching the design targets, they are therefore more sustainable than the base case. The application of the software-tool to the production of biodiesel is presented, highlighting the main features of the computer-aided, multi-stage, multi-scale methods that are able to determine more......Currently, the process industry is moving towards the design of innovative, more sustainable processes that show improvements in both economic and environmental factors. The design space of unit operations that can be combined to generate process flowsheet alternatives considering known unit...

  16. Software Tools for In-Situ Documentation of Built Heritage

    Science.gov (United States)

    Smars, P.

    2013-07-01

    The paper presents open source software tools developed by the author to facilitate in-situ documentation of architectural and archæological heritage. The design choices are explained and related to a general issue in conservation and documentation: taking decisions about a valuable object under threat. The question of the level of objectivity is central to the three steps of this process. It is our belief that in-situ documentation has to be favoured in this demanding context, full of potential discoveries. The very powerful surveying techniques in rapid development nowadays enhance our vision but often tend to move a critical part of the documentation process back to the office. The software presented facilitates direct treatment of the data on the site. Emphasis is given to flexibility, interoperability and simplicity. Key features of the software are listed and illustrated with examples (3D model of Gothic vaults, analysis of the shape of a column, deformation of a wall, direct interaction with AutoCAD).

  17. The Software Improvement Process - Tools And Rules To Encourage Quality

    CERN Document Server

    Sigerud, K

    2011-01-01

    The Applications section of the CERN accelerator Controls group has decided to apply a systematic approach to quality assurance (QA), the “Software Improvement Process”, SIP. This process focuses on three areas: the development process itself, suitable QA tools, and how to practically encourage developers to do QA. For each stage of the development process we have agreed on the recommended activities and deliverables, and identified tools to automate and support the task. For example we do more code reviews. As peer reviews are resource-intensive, we only do them for complex parts of a product. As a complement, we are using static code checking tools, like FindBugs and Checkstyle. We also encourage unit testing and have agreed on a minimum level of test coverage recommended for all products, measured using Clover. Each of these tools is well integrated with our IDE (Eclipse) and give instant feedback to the developer about the quality of their code. The major challenges of SIP have been to 1) agree on com...

  18. Software tools to aid Pascal and Ada program design

    Energy Technology Data Exchange (ETDEWEB)

    Jankowitz, H.T.

    1987-01-01

    This thesis describes a software tool which analyses the style and structure of Pascal and Ada programs by ensuring that some minimum design requirements are fulfilled. The tool is used in much the same way as a compiler is used to teach students the syntax of a language, only in this case issues related to the design and structure of the program are of paramount importance. The tool operates by analyzing the design and structure of a syntactically correct program, automatically generating a report detailing changes that need to be made in order to ensure that the program is structurally sound. The author discusses how the model gradually evolved from a plagiarism detection system, which extracted several measurable characteristics of a program, into a model that analyzed the style of Pascal programs. In order to incorporate more sophisticated concepts like data abstraction, information hiding and data protection, this model was then extended to analyze the composition of Ada programs. The Ada model takes full advantage of facilities offered in the language, and by using this tool the standard and quality of written programs is raised whilst the fundamental principles of program design are grasped through a process of self-tuition.

  19. Learning Photogrammetry with Interactive Software Tool PhoX

    Science.gov (United States)

    Luhmann, T.

    2016-06-01

    Photogrammetry is a complex topic in high-level university teaching, especially in the fields of geodesy, geoinformatics and metrology, where high-quality results are demanded. In addition, more and more black-box solutions for 3D image processing and point cloud generation are available that generate nice results easily, e.g. by structure-from-motion approaches. Within this context, the classical approach to teaching photogrammetry (e.g. focusing on aerial stereophotogrammetry) has to be reformed in order to educate students and professionals in new topics and provide them with more information behind the scenes. For around 20 years, photogrammetry courses at the Jade University of Applied Sciences in Oldenburg, Germany, have included the use of digital photogrammetry software that provides individual exercises, deep analysis of calculation results and a wide range of visualization tools for almost all standard tasks in photogrammetry. In recent years the software package PhoX has been developed as part of a new didactic concept in photogrammetry and related subjects. It also serves as an analysis tool in recent research projects. PhoX consists of a project-oriented data structure for images, image data, measured points and features, and 3D objects. It provides almost all basic photogrammetric measurement tools, image processing, calculation methods, graphical analysis functions, simulations and much more. Students use the program to conduct predefined exercises in which they have the opportunity to analyse results in a high level of detail. This includes the analysis of statistical quality parameters but also the meaning of transformation parameters, rotation matrices, calibration and orientation data. As one specific advantage, PhoX allows for the interactive modification of single parameters and a direct view of the resulting effect in image or object space.

  20. Learning Photogrammetry with Interactive Software Tool PhoX

    Directory of Open Access Journals (Sweden)

    T. Luhmann

    2016-06-01

    Full Text Available Photogrammetry is a complex topic in high-level university teaching, especially in the fields of geodesy, geoinformatics and metrology, where high-quality results are demanded. In addition, more and more black-box solutions for 3D image processing and point cloud generation are available that generate nice results easily, e.g. by structure-from-motion approaches. Within this context, the classical approach to teaching photogrammetry (e.g. focusing on aerial stereophotogrammetry) has to be reformed in order to educate students and professionals in new topics and provide them with more information behind the scenes. For around 20 years, photogrammetry courses at the Jade University of Applied Sciences in Oldenburg, Germany, have included the use of digital photogrammetry software that provides individual exercises, deep analysis of calculation results and a wide range of visualization tools for almost all standard tasks in photogrammetry. In recent years the software package PhoX has been developed as part of a new didactic concept in photogrammetry and related subjects. It also serves as an analysis tool in recent research projects. PhoX consists of a project-oriented data structure for images, image data, measured points and features, and 3D objects. It provides almost all basic photogrammetric measurement tools, image processing, calculation methods, graphical analysis functions, simulations and much more. Students use the program to conduct predefined exercises in which they have the opportunity to analyse results in a high level of detail. This includes the analysis of statistical quality parameters but also the meaning of transformation parameters, rotation matrices, calibration and orientation data. As one specific advantage, PhoX allows for the interactive modification of single parameters and a direct view of the resulting effect in image or object space.

  1. Desired characteristics of a generic 'no frills' software engineering tools package

    Energy Technology Data Exchange (ETDEWEB)

    Rhodes, J.J.

    1986-07-29

    Increasing numbers of vendors are developing software engineering tools to meet the demands of increasingly complex software systems, higher reliability goals for software products, higher programming labor costs, and management's desire to more closely associate software lifecycle costs with the estimated development schedule. Some vendors have chosen a dedicated workstation approach to achieve high user interactivity through windowing and mousing. Other vendors are using multi-user mainframes with low-cost terminals to economize on the costs of the hardware and the tools software. For all of the potential customers of software tools, the question remains: what are the minimum functional requirements that a software engineering tools package must have in order to be considered useful throughout the entire software lifecycle? This paper describes the desired characteristics of a non-existent but realistic 'no frills' software engineering tools package. 3 refs., 5 figs.

  2. The ultimate CASE (Computer-Aided Software Engineering) tool

    Energy Technology Data Exchange (ETDEWEB)

    Sharp, J.K.

    1990-01-01

    The theory and practice of information engineering is being actively developed at Sandia National Laboratories. The main output of Sandia is information. Information is created, analyzed and distributed. It is the life blood of our design laboratory. The proper management of information will have a large, positive impact on staff productivity. In order to achieve the potential benefits of shared information a commonly understood approach is needed, and the approach must be implemented in a CASE (Computer-Aided Software Engineering) tool that spans the entire life cycle of information. The commonly understood approach used at Sandia is natural language. More specifically, it is a structured subset of English. Users and system developers communicate requirements and commitments that they both understand. The approach is based upon NIAM (Nijssen's Information Analysis Methodology). In the last three years four NIAM training classes have been given at Sandia. The classes were all at the introductory level, with the latest class last October having an additional seminar highlighting successful projects. The continued growth in applications using NIAM requires an advanced class. The class will develop an information model for the "Ultimate CASE Tool." This paper presents the requirements that have been established for the "Ultimate CASE Tool" and presents initial models. 4 refs., 1 tab.

  3. ACRF Ingest Software Status: New, Current, and Future - March 2008

    Energy Technology Data Exchange (ETDEWEB)

    AS Koontz; S Choudhury; BD Ermold; NN Keck; KL Gaustad; RC Perez

    2008-03-01

    The purpose of this report is to provide status of the ingest software used to process instrument data for the Atmospheric Radiation Measurement (ARM) Climate Research Facility (ACRF). The report is divided into four sections: (1) news about ingests currently under development, (2) current production ingests, (3) future ingest development plans, and (4) information on retired ingests. Please note that datastreams beginning in “xxx” indicate cases where ingests run at multiple ACRF sites, which results in a datastream(s) for each location.

  4. User Guide for the STAYSL PNNL Suite of Software Tools

    Energy Technology Data Exchange (ETDEWEB)

    Greenwood, Lawrence R.; Johnson, Christian D.

    2013-02-27

    The STAYSL PNNL software suite provides a set of tools for working with neutron activation rates measured in a nuclear fission reactor, an accelerator-based neutron source, or any neutron field to determine the neutron flux spectrum through a generalized least-squares approach. This process is referred to as neutron spectral adjustment since the preferred approach is to use measured data to adjust neutron spectra provided by neutron physics calculations. The input data consist of the reaction rates based on measured activities, an initial estimate of the neutron flux spectrum, neutron activation cross sections and their associated uncertainties (covariances), and relevant correction factors. The output consists of the adjusted neutron flux spectrum and associated covariance matrix, which is useful for neutron dosimetry and radiation damage calculations.
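
    A toy version of the generalized least-squares adjustment described above is sketched below; the group structure, cross sections and covariances are invented, and the update shown is the standard GLS form rather than STAYSL PNNL's exact implementation.

        # GLS spectral adjustment sketch: phi' = phi + K (m - S phi),
        # with K = P S^T (S P S^T + V)^-1 (all inputs hypothetical).
        import numpy as np

        phi = np.array([1.0, 2.0, 0.5])        # prior flux, 3 energy groups
        P = np.diag([0.04, 0.16, 0.01])        # prior flux covariance
        S = np.array([[0.3, 0.1, 0.0],         # group cross sections, 2 reactions
                      [0.0, 0.2, 0.5]])
        m = np.array([0.55, 0.70])             # measured reaction rates
        V = np.diag([0.001, 0.001])            # measurement covariance

        K = P @ S.T @ np.linalg.inv(S @ P @ S.T + V)
        phi_adj = phi + K @ (m - S @ phi)      # adjusted spectrum
        P_adj = P - K @ S @ P                  # adjusted covariance matrix
        print(phi_adj)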

  5. Utilizing Spectroscopic Research Tools and Software in the Classroom

    Science.gov (United States)

    Grubbs, G. S., II

    2015-06-01

    Given today's technological age, it has become crucial to be able to reach the student in a more "tech-savvy" way than traditional classroom methods afford. Given this, there is already a vast range of software packages available to the molecular spectroscopist that can easily be introduced to the classroom with success. This talk will highlight taking a few of these tools (Gaussian09, SPFIT/SPCAT, the AABS Package, LabVIEW™, etc.) and implementing them in the classroom to teach subjects such as Quantum Mechanics and Thermodynamics, as well as to aid in the linkage between these subjects. Examples of project implementation with both undergraduate and graduate students will be presented, with a discussion of the successes and failures of such attempts.

  6. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens

    2004-01-01

    This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended to be used as part of an industrial systems design cycle. Structural analysis is a graph-based technique where principal relations between variables express the system......'s properties. Measured and controlled quantities in the system are related to variables through functional relations, which need only be stated as names; their explicit composition need not be described to the tool. The user enters a list of these relations that together describe the entirety of the system.... The list of such variables and functional relations constitutes the system's structure graph. Normal operation means all functional relations are intact. Should faults occur, one or more functional relations cease to be valid. In a structure graph, this is seen as the disappearance of one or more nodes...

  7. Ethernet Networks: Current Trends and Tools

    CERN Document Server

    El-Sayed, Abdulqader M

    2009-01-01

    Ethernet topology discovery has gained increasing interest in recent years. This trend is motivated mostly by the increasing number of carrier Ethernet networks, as well as the size of these networks, and consequently the increasing sales of these networks. To manage these networks efficiently, detailed and accurate knowledge of their topology is needed. Knowledge of a network's entities and the physical connections between them can be useful from various perspectives. Administrators can use topology information for network planning and fault detection. Topology information can also be used during protocol and routing algorithm development, for performance prediction, and as a basis for accurate network simulations. From a network security perspective, threat detection, network monitoring, network access control and forensic investigations can benefit from accurate network topology information. In this paper, we analyze market trends and investigate current tools available for both research and commercial purposes...

  8. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Science.gov (United States)

    Pakarinen, Jyri

    2010-12-01

    Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
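
    To make the kind of analysis concrete, here is a small, hedged sketch (unrelated to the paper's Matlab toolkit) that estimates the total harmonic distortion of a memoryless nonlinearity from an FFT; the tanh stage stands in for a distorting device, and every number is invented.

        # THD sketch: drive a nonlinearity with a sine, compare harmonic energy
        # against the fundamental in the magnitude spectrum.
        import numpy as np

        fs, f0, n = 48000, 1000.0, 48000          # sample rate, test tone, 1 s
        t = np.arange(n) / fs
        x = np.sin(2 * np.pi * f0 * t)
        y = np.tanh(2.0 * x)                      # stand-in distorting device

        spec = np.abs(np.fft.rfft(y)) / n
        bins = [int(k * f0 * n / fs) for k in range(1, 6)]   # 1st..5th harmonics
        fund = spec[bins[0]]
        harm = np.sqrt(sum(spec[b] ** 2 for b in bins[1:]))
        print(f"THD: {100 * harm / fund:.2f} %")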

  9. Software Development Outsourcing Decision Support Tool with Neural Network Learning

    Science.gov (United States)

    2004-03-01

    software domain, enterprise scripting software domain, and outsourcing (maintenance and training) processes found to be included in the new model but not in...accounting and order entry) software domains, and outsourcing (maintenance, configuration management and software engineer support) processes were...found in the original model but not in the new model included: enterprise (scripting and order entry) software domains and outsourcing maintenance process

  10. A software tool for rapid flood inundation mapping

    Science.gov (United States)

    Verdin, James; Verdin, Kristine; Mathis, Melissa L.; Magadzire, Tamuka; Kabuchanga, Eric; Woodbury, Mark; Gadain, Hussein

    2016-06-02

    The GIS Flood Tool (GFT) was developed by the U.S. Geological Survey with support from the U.S. Agency for International Development’s Office of U.S. Foreign Disaster Assistance to provide a means for production of reconnaissance-level flood inundation mapping for data-sparse and resource-limited areas of the world. The GFT has also attracted interest as a tool for rapid assessment flood inundation mapping for the Flood Inundation Mapping Program of the U.S. Geological Survey. The GFT can fill an important gap for communities that lack flood inundation mapping by providing a first-estimate of inundation zones, pending availability of resources to complete an engineering study. The tool can also help identify priority areas for application of scarce flood inundation mapping resources. The technical basis of the GFT is an application of the Manning equation for steady flow in an open channel, operating on specially processed digital elevation data. The GFT is implemented as a software extension in ArcGIS. Output maps from the GFT were validated at 11 sites with inundation maps produced previously by the Flood Inundation Mapping Program using standard one-dimensional hydraulic modeling techniques. In 80 percent of the cases, the GFT inundation patterns matched 75 percent or more of the one-dimensional hydraulic model inundation patterns. Lower rates of pattern agreement were seen at sites with low relief and subtle surface water divides. Although the GFT is simple to use, it should be applied with the oversight or review of a qualified hydraulic engineer who understands the simplifying assumptions of the approach.
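
    The Manning equation at the core of the GFT is easy to state; the sketch below evaluates it for a wide rectangular channel with invented roughness, geometry and slope values, whereas the GFT itself operates on specially processed digital elevation data.

        # Manning equation sketch for a rectangular channel (SI units):
        # Q = (1/n) * A * R**(2/3) * S**0.5, with R = A / wetted perimeter.
        def manning_discharge(n, width, depth, slope):
            area = width * depth
            wetted_perimeter = width + 2 * depth
            r = area / wetted_perimeter          # hydraulic radius
            return (1.0 / n) * area * r ** (2.0 / 3.0) * slope ** 0.5

        print(manning_discharge(n=0.035, width=20.0, depth=2.0, slope=0.001))  # m^3/s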

  11. SNPdetector: a software tool for sensitive and accurate SNP detection.

    Directory of Open Access Journals (Sweden)

    Jinghui Zhang

    2005-10-01

    Full Text Available Identification of single nucleotide polymorphisms (SNPs) and mutations is important for the discovery of genetic predisposition to complex diseases. PCR resequencing is the method of choice for de novo SNP discovery. However, manual curation of putative SNPs has been a major bottleneck in the application of this method to high-throughput screening. Therefore it is critical to develop a more sensitive and accurate computational method for automated SNP detection. We developed a software tool, SNPdetector, for automated identification of SNPs and mutations in fluorescence-based resequencing reads. SNPdetector was designed to model the process of human visual inspection and has a very low false positive and false negative rate. We demonstrate the superior performance of SNPdetector in SNP and mutation analysis by comparing its results with those derived by human inspection, PolyPhred (a popular SNP detection tool), and independent genotype assays in three large-scale investigations. The first study identified and validated inter- and intra-subspecies variations in 4,650 traces of 25 inbred mouse strains that belong to either the Mus musculus species or the M. spretus species. Unexpected heterozygosity in the CAST/Ei strain was observed in two out of 1,167 mouse SNPs. The second study identified 11,241 candidate SNPs in five ENCODE regions of the human genome covering 2.5 Mb of genomic sequence. Approximately 50% of the candidate SNPs were selected for experimental genotyping; the validation rate exceeded 95%. The third study detected ENU-induced mutations (at 0.04% allele frequency) in 64,896 traces of 1,236 zebrafish. Our analysis of three large and diverse test datasets demonstrated that SNPdetector is an effective tool for genome-scale research and for large-sample clinical studies. SNPdetector runs on the Unix/Linux platform and is available publicly (http://lpg.nci.nih.gov).
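
    SNPdetector itself models human inspection of fluorescence traces, which is well beyond a few lines of code. As a much simpler illustration of the general idea, the toy caller below flags positions where a non-reference base appears at sufficient depth and frequency; the thresholds and data are invented.

    from collections import Counter

    def call_snps(reference, pileups, min_depth=8, min_alt_frac=0.25):
        """Toy caller: pileups[i] lists the read bases covering reference position i."""
        calls = []
        for pos, (ref_base, bases) in enumerate(zip(reference, pileups)):
            if len(bases) < min_depth:
                continue                                  # too little coverage to call
            alt_counts = Counter(b for b in bases if b != ref_base)
            if not alt_counts:
                continue                                  # no non-reference bases seen
            alt, n = alt_counts.most_common(1)[0]
            if n / len(bases) >= min_alt_frac:
                calls.append((pos, ref_base, alt, n / len(bases)))
        return calls

    reference = "ACGT"
    pileups = [list("AAAAAAAAAA"), list("CCCCCCCCCC"),
               list("GGGGTTTTTT"), list("TTTTTTTTTT")]
    print(call_snps(reference, pileups))    # calls position 2: G -> T at 60%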

  12. SNPdetector: A Software Tool for Sensitive and Accurate SNP Detection.

    Directory of Open Access Journals (Sweden)

    2005-10-01

    Full Text Available Identification of single nucleotide polymorphisms (SNPs) and mutations is important for the discovery of genetic predisposition to complex diseases. PCR resequencing is the method of choice for de novo SNP discovery. However, manual curation of putative SNPs has been a major bottleneck in the application of this method to high-throughput screening. Therefore it is critical to develop a more sensitive and accurate computational method for automated SNP detection. We developed a software tool, SNPdetector, for automated identification of SNPs and mutations in fluorescence-based resequencing reads. SNPdetector was designed to model the process of human visual inspection and has a very low false positive and false negative rate. We demonstrate the superior performance of SNPdetector in SNP and mutation analysis by comparing its results with those derived by human inspection, PolyPhred (a popular SNP detection tool), and independent genotype assays in three large-scale investigations. The first study identified and validated inter- and intra-subspecies variations in 4,650 traces of 25 inbred mouse strains that belong to either the Mus musculus species or the M. spretus species. Unexpected heterozygosity in the CAST/Ei strain was observed in two out of 1,167 mouse SNPs. The second study identified 11,241 candidate SNPs in five ENCODE regions of the human genome covering 2.5 Mb of genomic sequence. Approximately 50% of the candidate SNPs were selected for experimental genotyping; the validation rate exceeded 95%. The third study detected ENU-induced mutations (at 0.04% allele frequency) in 64,896 traces of 1,236 zebrafish. Our analysis of three large and diverse test datasets demonstrated that SNPdetector is an effective tool for genome-scale research and for large-sample clinical studies. SNPdetector runs on the Unix/Linux platform and is available publicly (http://lpg.nci.nih.gov).

  13. 76 FR 5832 - International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools QA...

    Science.gov (United States)

    2011-02-02

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF LABOR Employment and Training Administration International Business Machines (IBM), Software Group Business Unit... at International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools...

  14. Can agile software tools bring the benefits of a task board to globally distributed teams?

    NARCIS (Netherlands)

    Katsma, Christiaan; Amrit, Chintan; Hillegersberg, van Jos; Sikkel, Klaas; Oshri, Ilan; Kotlarsky, Julia; Willcocks, Leslie P.

    2013-01-01

    Software-based tooling has become an essential part of globally distributed software development. In this study we focus on the usage of such tools, and task boards in particular. We investigate the deployment of these tools through field research in 4 different companies that feature agile and glo

  15. Performance Evaluation of a Software Engineering Tool for Automated Design of Cooling Systems in Injection Moulding

    DEFF Research Database (Denmark)

    Jauregui-Becker, Juan M.; Tosello, Guido; van Houten, Fred J.A.M.

    2013-01-01

    This paper presents a software tool for automating the design of cooling systems for injection moulding and a validation of its performance. Cooling system designs were automatically generated by the proposed software tool and by applying a best practice tool engineering design approach. The two...

  16. Can agile software tools bring the benefits of a task board to globally distributed teams?

    NARCIS (Netherlands)

    Katsma, Christiaan; Amrit, Chintan Amrit; van Hillegersberg, Jos; Sikkel, Nicolaas; Oshri, Ilan; Kotlarsky, Julia; Willcocks, Leslie P.

    Software-based tooling has become an essential part of globally distributed software development. In this study we focus on the usage of such tools, and task boards in particular. We investigate the deployment of these tools through field research in 4 different companies that feature agile and

  17. A software tool for teaching and training how to build and use a TOWS matrix

    Directory of Open Access Journals (Sweden)

    Amparo Mariño Ibáñez

    2010-05-01

    Full Text Available Strategic planning is currently being used by most companies; it analyses current and expected future situations, determines company orientation and develops means or strategies for achieving their stated missions. This article is aimed at reviewing general considerations in strategic planning and presenting a computational tool designed for building a TOWS matrix for matching a company’s opportunities and threats with its weaknesses and, more especially, its strengths. The software development life cycle (SDLC) involved analysis, design, implementation and use. The literature about strategic planning and SWOT analysis was reviewed for making the analysis. The software only automates one aspect of the whole strategic planning process and can be used for improving student and staff training in SWOT analysis. This type of work seeks to motivate interdisciplinary research.
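
    The mechanics such a tool automates can be sketched briefly: a TOWS matrix crosses the external factors (opportunities, threats) with the internal ones (strengths, weaknesses) to yield four strategy quadrants. A minimal Python illustration with made-up factors:

    from itertools import product

    swot = {
        "strengths": ["strong brand"],
        "weaknesses": ["small sales force"],
        "opportunities": ["new export market"],
        "threats": ["price war"],
    }

    quadrants = {
        "SO (maxi-maxi)": ("strengths", "opportunities"),
        "WO (mini-maxi)": ("weaknesses", "opportunities"),
        "ST (maxi-mini)": ("strengths", "threats"),
        "WT (mini-mini)": ("weaknesses", "threats"),
    }

    for name, (internal, external) in quadrants.items():
        for i, e in product(swot[internal], swot[external]):
            print(f"{name}: devise a strategy matching '{i}' with '{e}'")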

  18. Decision graphs: a tool for developing real-time software

    Energy Technology Data Exchange (ETDEWEB)

    Kozubal, A.J.

    1981-01-01

    The use of decision graphs in the preparation of, in particular, real-time software is briefly described. The usefulness of decision graphs in software design, testing, and maintenance is pointed out. 2 figures. (RWR)

  19. Software Reuse in Agile Development Organizations - A Conceptual Management Tool

    NARCIS (Netherlands)

    Spoelstra, Wouter; Iacob, Maria; Sinderen, van Marten

    2011-01-01

    The reuse of knowledge is considered a major factor for increasing productivity and quality. In the software industry knowledge is embodied in software assets such as code components, functional designs and test cases. This kind of knowledge reuse is also referred to as software reuse. Although the

  20. HOW AUTOMATED TESTING TOOLS ARE SHOWING ITS IMPACT IN THE FIELD OF SOFTWARE TESTING

    Directory of Open Access Journals (Sweden)

    Deepti Gaur

    2012-09-01

    Full Text Available As we know, software testing is a very vast field in the software development life cycle. In this paper, we describe how automated testing tools are convenient and easy to use, which also makes testing faster and more effective in less time. The world of technology revolves at a fast pace today, and among all testing tools, automated testing tools make software testing more significant and effective.

  1. Characteristics and possibilities of software tool for metal-oxide surge arresters selection

    Directory of Open Access Journals (Sweden)

    Đorđević Dragan

    2012-01-01

    Full Text Available This paper presents a procedure for the selection of metal-oxide surge arresters based on the instructions given in the Siemens and ABB catalogues, respecting their differences, and the characteristics and possibilities of the software tool. The software tool was developed during the preparation of a Master's thesis titled 'Automation of Metal-Oxide Surge Arresters Selection'. An example is presented of the selection of metal-oxide surge arresters using the developed software tool.

  2. BEASTling: A software tool for linguistic phylogenetics using BEAST 2

    Science.gov (United States)

    Forkel, Robert; Kaiping, Gereon A.; Atkinson, Quentin D.

    2017-01-01

    We present a new open source software tool called BEASTling, designed to simplify the preparation of Bayesian phylogenetic analyses of linguistic data using the BEAST 2 platform. BEASTling transforms comparatively short and human-readable configuration files into the XML files used by BEAST to specify analyses. By taking advantage of Creative Commons-licensed data from the Glottolog language catalog, BEASTling allows the user to conveniently filter datasets using names for recognised language families, to impose monophyly constraints so that inferred language trees are backward compatible with Glottolog classifications, or to assign geographic location data to languages for phylogeographic analyses. Support for the emerging cross-linguistic linked data format (CLDF) permits easy incorporation of data published in cross-linguistic linked databases into analyses. BEASTling is intended to make the power of Bayesian analysis more accessible to historical linguists without strong programming backgrounds, in the hopes of encouraging communication and collaboration between those developing computational models of language evolution (who are typically not linguists) and relevant domain experts. PMID:28796784
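
    BEASTling's central move, turning a short configuration file into BEAST XML, can be imitated in a few lines. The sketch below uses an invented two-key configuration and generic XML elements; it is not BEASTling's actual configuration schema, nor BEAST 2's real input format.

    import configparser
    import xml.etree.ElementTree as ET

    # A tiny INI-style configuration, invented for this illustration.
    config = configparser.ConfigParser()
    config.read_string("[model]\nname = vocabulary\nrate_variation = True\n")

    # Expand each configuration key into a generic XML element.
    root = ET.Element("beast", version="2.0")
    for key, value in config["model"].items():
        ET.SubElement(root, "parameter", id=key).text = value

    print(ET.tostring(root, encoding="unicode"))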

  3. Introduction to format: the software tools test formatting program

    Energy Technology Data Exchange (ETDEWEB)

    Agazzi, C.

    1984-12-01

    Format is the name of the Software Tools formatter. It allows you to format text according to instructions that you place within the text. The text and instructions for each document you wish to create are kept in files. Each instruction, called a request line, makes changes in the way your document is laid out. For example, you can change the margins within your document to visually set off lists of items or topics. You can also boldface or underline words or sentences to highlight them. Throughout this manual are examples of how to use the Format request lines along with illustrations of the effects request lines have on an example letter. The request lines begin with a period in the first column on the screen. Each request line performs a specific function and is placed on the line immediately in front of the text to be formatted. Output lines are automatically filled; that is, their right margins are justified, without regard to the format of the input text lines.
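
    A toy version of such a request-line formatter is easy to sketch. The one below refills paragraphs to the output width and honours a single invented '.ce' (centre the next line) request; the real Format also justifies right margins and supports many more request lines.

    import textwrap

    def format_text(lines, width=60):
        out, words, centre_next = [], [], False
        for line in lines:
            if line.startswith("."):              # request lines begin with a period
                if words:                         # flush the paragraph gathered so far
                    out.extend(textwrap.fill(" ".join(words), width).splitlines())
                    words = []
                if line.strip() == ".ce":         # invented request: centre next line
                    centre_next = True
            elif centre_next:
                out.append(line.strip().center(width).rstrip())
                centre_next = False
            else:
                words.extend(line.split())        # body text is refilled, not justified
        if words:
            out.extend(textwrap.fill(" ".join(words), width).splitlines())
        return "\n".join(out)

    print(format_text([".ce", "An Example Letter",
                       "Output lines are automatically filled, regardless",
                       "of how the input lines were broken."]))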

  4. 76 FR 13663 - Cooper Tools, Currently Known as Apex Tool Group, LLC, Hicksville, OH; Amended Certification...

    Science.gov (United States)

    2011-03-14

    ... Employment and Training Administration Cooper Tools, Currently Known as Apex Tool Group, LLC, Hicksville, OH..., applicable to workers of Cooper Tools, Hicksville, Ohio. The workers are engaged in activities related to the... information shows that in July, 2010, Apex Tool Group, LLC. purchased Cooper Tools and is currently known...

  5. Variation of densitometry on computed tomography in COPD--influence of different software tools.

    Directory of Open Access Journals (Sweden)

    Mark O Wielpütz

    Full Text Available Quantitative multidetector computed tomography (MDCT) as a potential biomarker is increasingly used for severity assessment of emphysema in chronic obstructive pulmonary disease (COPD). The aim of this study was to evaluate the user-independent measurement variability between five different fully automatic densitometry software tools. MDCT and full-body plethysmography, including forced expiratory volume in 1 s and total lung capacity, were available for 49 patients with advanced COPD (age = 64±9 years, forced expiratory volume in 1 s = 31±6% predicted). Measurement variation regarding lung volume, emphysema volume, emphysema index, and mean lung density was evaluated for two scientific and three commercially available lung densitometry software tools designed to analyze MDCT from different scanner types. One scientific tool and one commercial tool failed to process most or all datasets, respectively, and were excluded. One scientific and another commercial tool analyzed 49, the remaining commercial tool 30 datasets. Lung volume, emphysema volume, emphysema index and mean lung density were significantly different amongst these three tools (p<0.001). Limits of agreement for lung volume were [-0.195, -0.052 l], [-0.305, -0.131 l], and [-0.123, -0.052 l] with correlation coefficients of r = 1.00 each. Limits of agreement for emphysema index were [-6.2, 2.9%], [-27.0, 16.9%], and [-25.5, 18.8%], with r = 0.79 to 0.98. Correlation of lung volume with total lung capacity was good to excellent (r = 0.77 to 0.91, p<0.001), but segmented lung volumes (6.7±1.3-6.8±1.3 l) were significantly lower than total lung capacity (7.7±1.7 l, p<0.001). Technical incompatibilities hindered evaluation of two of five tools. The remaining three showed significant measurement variation for emphysema, hampering quantitative MDCT as a biomarker in COPD. Follow-up studies should currently use identical software, and standardization efforts should encompass software as well.
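
    The limits of agreement quoted above are Bland-Altman statistics; a minimal sketch of how they are computed from paired measurements (the numbers below are made up, not the study's data):

    import numpy as np

    def limits_of_agreement(a, b):
        """Bland-Altman 95% limits of agreement for paired measurements."""
        diff = np.asarray(a) - np.asarray(b)
        bias = diff.mean()
        spread = 1.96 * diff.std(ddof=1)
        return bias - spread, bias + spread

    tool_a = [6.1, 6.8, 7.2, 5.9, 6.5]   # lung volume (l) from tool A -- illustrative
    tool_b = [6.3, 6.9, 7.5, 6.1, 6.6]   # same patients measured with tool B
    print(limits_of_agreement(tool_a, tool_b))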

  6. Ranking Vaccines : A Prioritization Software Tool: Phase II: Prototype of a Decision-Support System

    National Research Council Canada - National Science Library

    Committee on Identifying and Prioritizing New Preventive Vaccines for Development, Phase II; Board on Population Health and Public Health Practice; Board on Global Health; Institute of Medicine; Kinpritma Sangha; Charles Phelps; Dennis Fryback; Rino Rappuoli; Rose Marie Martinez; Lonnie King

    2013-01-01

    SMART Vaccines--Strategic Multi-Attribute Ranking Tool for Vaccines--is a prioritization software tool developed by the Institute of Medicine that utilizes decision science and modeling to help inform...

  7. Towards a Reference Architecture to Provision Tools as a Service for Global Software Development

    DEFF Research Database (Denmark)

    Chauhan, Aufeef; Babar, Muhammad Ali

    2014-01-01

    Organizations involved in Global Software Development (GSD) face challenges in terms of having access to an appropriate set of tools for performing distributed engineering and development activities, integration between heterogeneous desktop and web-based tools, and management of artifacts developed and ...

  8. Systems and software variability management concepts, tools and experiences

    CERN Document Server

    Capilla, Rafael; Kang, Kyo-Chul

    2013-01-01

    The success of product line engineering techniques in the last 15 years has popularized the use of software variability as a key modeling approach for describing the commonality and variability of systems at all stages of the software lifecycle. Software product lines enable a family of products to share a common core platform, while allowing for product specific functionality being built on top of the platform. Many companies have exploited the concept of software product lines to increase the resources that focus on highly differentiating functionality and thus improve their competitiveness

  9. Software for predictive microbiology and risk assessment: a description and comparison of tools presented at the ICPMF8 Software Fair.

    Science.gov (United States)

    Tenenhaus-Aziza, Fanny; Ellouze, Mariem

    2015-02-01

    The 8th International Conference on Predictive Modelling in Food was held in Paris, France in September 2013. One of the major topics of this conference was the transfer of knowledge and tools between academics and stakeholders of the food sector. During the conference, a "Software Fair" was held to provide information and demonstrations of predictive microbiology and risk assessment software. This article presents an overall description of the 16 software tools demonstrated at the session and provides a comparison based on several criteria such as the modeling approach, the different modules available (e.g. databases, predictors, fitting tools, risk assessment tools), the studied environmental factors (temperature, pH, aw, etc.), the type of media (broth or food) and the number and type of the provided micro-organisms (pathogens and spoilers). The present study is a guide to help users select the software tools which are most suitable to their specific needs, before they test and explore the tool(s) in more depth. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. EVALUATION METRICS FOR WIRELESS SENSOR NETWORK SECURITY: ALGORITHMS REVIEW AND SOFTWARE TOOL

    Directory of Open Access Journals (Sweden)

    Qasem Abu Al-Haija

    2013-01-01

    Full Text Available Wireless Sensor Networks (WSNs) are currently receiving significant attention due to their potential impact on several real-life applications such as military and home automation technology. The work in this study is a complementary part of what has been discussed previously. In this study, we propose a software tool to simulate and evaluate the six evaluation metrics presented for non-deterministic wireless sensor networks, which are: scalability, key connectivity, memory complexity, communication complexity, power consumption and confidentiality. The evaluation metrics were simulated and evaluated to help the network designer choose the best probabilistic security key management algorithm for a given randomly distributed sensor network.
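
    Of the six metrics, key connectivity has a well-known closed form in the classic Eschenauer-Gligor random key predistribution scheme: the probability that two nodes share at least one key, given rings of k keys drawn from a pool of P keys. The sketch below evaluates that textbook formula; the paper's own algorithms may differ.

    from math import comb

    def share_probability(pool_size, ring_size):
        """P(two nodes share at least one key) under random key predistribution."""
        disjoint = comb(pool_size - ring_size, ring_size) / comb(pool_size, ring_size)
        return 1 - disjoint

    for k in (50, 75, 100):
        print(f"ring size {k}: connectivity {share_probability(10000, k):.3f}")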

  11. Nik Software Captured The Complete Guide to Using Nik Software's Photographic Tools

    CERN Document Server

    Corbell, Tony L

    2011-01-01

    Learn all the features and functionality of the complete Nik family of products Styled in such a way as to resemble the way photographers think, Nik Software Captured aims to help you learn to apply all the features and functionality of the Nik software products. With Nik Software Captured, authors and Nik Software, Inc. insiders Tony Corbell and Josh Haftel help you use after-capture software products easier and more creatively. Their sole aim is to ensure that you can apply the techniques discussed in the book while gaining a thorough understanding of the capabilities of programs such as Dfi

  12. Software development tool for PicoBlaze multi-processor implementation

    OpenAIRE

    Claudiu Lung; Buchman Attila

    2012-01-01

    This paper presents a useful software tool for projects with multiple PicoBlaze microprocessors implemented in FPGA circuits. The application presented in this paper, which uses the PicoBlaze SDK tool for software development, is an Automatic Packet Reporting System (APRS) with three PicoBlaze microprocessors implemented in an FPGA circuit.

  13. Kid Tools: Self-Management, Problem- Solving, Organizational, and Planning Software for Children and Teachers

    Science.gov (United States)

    Miller, Kevin J.; Fitzgerald, Gail E.; Koury, Kevin A.; Mitchem, Herine J.; Hollingsead, Candice

    2007-01-01

    This article provides an overview of KidTools, an electronic performance software system designed for elementary and middle school children to use independently on classroom or home computers. The software system contains 30 computerized research-based strategy tools that can be implemented in a classroom or home environment. Through the…

  14. Development of Software for Analyzing Breakage Cutting Tools Based on Image Processing

    Institute of Scientific and Technical Information of China (English)

    赵彦玲; 刘献礼; 王鹏; 王波; 王红运

    2004-01-01

    As present-day digital microsystems do not provide specialized microscopes that can detect cutting-tool breakage, analysis software has been developed using VC++. A module for edge testing and image segmentation is designed specifically for cutting tools. Known calibration relations and given postulates are used in scale measurements. Practical operation shows that the software can perform accurate detection.

  15. Possibilities for using software tools in the process of secuirty design

    Directory of Open Access Journals (Sweden)

    Ladislav Mariš

    2013-07-01

    Full Text Available The authors deal with the use of software to support the process of security design. The article proposes a theoretical basis for the implementation of software tools in design activities, based on selected design standards for electrical safety systems and their application in design solutions, especially in drawing documentation. The article should serve the needs of project team members in using selected software tools, with a subsequent increase in the degree of automation of design activities.

  16. Software Engineering Practices and Tool Support: an exploratory study in New Zealand

    Directory of Open Access Journals (Sweden)

    Chris Phillips

    2003-11-01

    Full Text Available This study was designed as a preliminary investigation of the practices of software engineers within New Zealand, including their use of development tools. The project involved a review of relevant literature on software engineering and CASE tools, the development and testing of an interview protocol, and structured interviews with five software engineers. This paper describes the project, presents the findings, examines the results in the context of the literature and outlines on-going funded work involving a larger survey.

  17. Software Support of Modelling using Ergonomic Tools in Engineering

    Directory of Open Access Journals (Sweden)

    Darina Dupláková

    2017-08-01

    Full Text Available One of the preconditions for the correct development of industrial production is the continuous interconnection of virtual reality and the real world through computer software. Computer software is used for product modelling, creation of technical documentation, scheduling, management and optimization of manufacturing processes, and increasing the efficiency of human work in manufacturing plants. This article describes frequently used ergonomic software that helps improve human work by reducing error rates, risk factors in the working environment and workplace injuries, and by eliminating emerging occupational diseases. These tools fall within the field of micro-ergonomics and are applicable at the manufacturing level, with a flexible approach to solving the problems encountered.

  18. Software tool for analysing the family shopping basket without candidate generation

    Directory of Open Access Journals (Sweden)

    Roberto Carlos Naranjo Cuervo

    2010-05-01

    Full Text Available Tools for obtaining useful knowledge to support marketing decisions are currently needed in the e-commerce environment. This requires a process that uses a series of data-processing techniques; data mining is one such technique, enabling automatic information discovery. This work presents association rules as a suitable technique for discovering how customers buy from a company offering business-to-consumer (B2C) e-business, aimed at supporting decision-making in supplying its customers or capturing new ones. Many algorithms such as Apriori, DHP, Partition, FP-Growth and Eclat are available for implementing association rules; the following criteria were defined for selecting the appropriate algorithm: database insert, computational cost, performance and execution time. The development of a software tool is also presented, which followed the CRISP-DM approach; this software tool comprised the following four sub-modules: data pre-processing, data mining, results analysis and results application. The application design used three-layer architecture: presentation logic, business logic and service logic. Data warehouse design and algorithm design were included in developing this data-mining software tool. It was tested using a FoodMart company database; the tests included performance, functionality and results' validity, thereby allowing association rules to be found. The results led to concluding that using association rules as a data-mining technique facilitates analysing volumes of information for B2C e-business services, which represents a competitive advantage for those companies using the Internet as their sales medium.
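
    The support/confidence machinery shared by all the algorithms named above is compact. A brute-force sketch over toy baskets follows; Apriori's candidate pruning, the actual subject of the comparison, is omitted for brevity.

    from itertools import combinations

    baskets = [{"bread", "milk"}, {"bread", "beer"},
               {"bread", "milk", "beer"}, {"milk"}]

    def support(itemset):
        """Fraction of baskets containing every item of the itemset."""
        return sum(itemset <= basket for basket in baskets) / len(baskets)

    items = sorted(set().union(*baskets))
    for a, b in combinations(items, 2):
        s = support({a, b})
        if s >= 0.5:                         # minimum support threshold
            confidence = s / support({a})    # confidence of the rule a -> b
            print(f"{a} -> {b}: support {s:.2f}, confidence {confidence:.2f}")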

  19. Creating a strategic plan for configuration management using computer aided software engineering (CASE) tools

    Energy Technology Data Exchange (ETDEWEB)

    Smith, P.R.; Sarfaty, R.

    1993-05-01

    This paper provides guidance in the definition, documentation, measurement, enhancement of processes, and validation of a strategic plan for configuration management (CM). The approach and methodology used in establishing a strategic plan are the same for any enterprise, including the Department of Energy (DOE), commercial nuclear plants, the Department of Defense (DOD), or large industrial complexes. The principles and techniques presented are used worldwide by some of the largest corporations. The authors used industry knowledge and the areas of their current employment to illustrate and provide examples. Developing a strategic configuration and information management plan for DOE Idaho Field Office (DOE-ID) facilities is discussed in this paper. A good knowledge of CM principles is the key to successful strategic planning. This paper will describe and define CM elements, and discuss how CM integrates the facility's physical configuration, design basis, and documentation. The strategic plan does not need the support of a computer aided software engineering (CASE) tool. However, the use of a CASE tool provides a methodology for consistency in approach, graphics, and database capability, combined to form an encyclopedia and a method of presentation that is easily understood and aids the process of reengineering. CASE tools have much more capability than those stated above. Some examples are supporting a joint application development group (JAD) to prepare a software functional specification document and, if necessary, providing the capability to automatically generate software application code. This paper briefly discusses characteristics and capabilities of two CASE tools that use different methodologies to generate similar deliverables.

  20. Tuning COCOMO-II for Software Process Improvement: A Tool Based Approach

    Directory of Open Access Journals (Sweden)

    SYEDA UMEMA HANI

    2016-10-01

    Full Text Available In order to compete in the international software development market, software organizations have to adopt internationally accepted software practices, i.e. standards like ISO (International Standard Organization) or CMMI (Capability Maturity Model Integration), in spite of having scarce resources and tools. The aim of this study is to develop a tool which can be used to present an actual picture of the benefits of Software Process Improvement to software development companies. Although a few tools are available to assist in making predictions, they are too expensive and do not cover datasets that reflect the cultural behavior of software development organizations in developing countries. In extension to our previously reported research on Pakistani software development organizations, which quantified the benefits of SDPI (Software Development Process Improvement), this research used sixty-two datasets from three different software development organizations against the set of metrics used in COCOMO-II (Constructive Cost Model 2000). It derived a verifiable equation for calculating ISF (Ideal Scale Factor) and tuned the COCOMO-II model to bring prediction capability for SDPI benefit-measurement classes such as ESCP (Effort, Schedule, Cost, and Productivity). This research contributes to the software industry by giving a reliable and low-cost mechanism for generating prediction models with high prediction accuracy. Hopefully, this study will help software organizations use this tool not only to predict ESCP but also to predict the exact impact of SDPI.
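
    For reference, the post-architecture COCOMO II effort equation that such tuning adjusts is PM = A * Size^E * (EM1 * ... * EMn), with E = B + 0.01 * (SF1 + ... + SF5). The sketch below uses the published COCOMO II.2000 constants A = 2.94 and B = 0.91; the paper derives its own calibration, and all inputs here are placeholders.

    def cocomo2_effort(ksloc, scale_factors, effort_multipliers, a=2.94, b=0.91):
        """COCOMO II post-architecture effort estimate in person-months."""
        e = b + 0.01 * sum(scale_factors)    # scale exponent from the five scale factors
        product = 1.0
        for em in effort_multipliers:        # cost drivers multiply the nominal effort
            product *= em
        return a * ksloc ** e * product

    # Placeholder inputs: 50 KSLOC, mid-range scale factors, three cost drivers.
    print(f"{cocomo2_effort(50, [3.72, 3.04, 4.24, 3.29, 4.68], [1.1, 0.9, 1.0]):.1f} PM")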

  1. Using Software Development Tools and Practices in Acquisition

    Science.gov (United States)

    2013-12-01

    provement like CMMI® and software development processes like the Rational Unified Process (RUP). Somewhat orthogonal to these structured methods are software...

  2. Virtual Power Electronics: Novel Software Tools for Design, Modeling and Education

    Science.gov (United States)

    Hamar, Janos; Nagy, István; Funato, Hirohito; Ogasawara, Satoshi; Dranga, Octavian; Nishida, Yasuyuki

    The current paper presents browser-based, multimedia-rich software tools and an e-learning curriculum to support the design and modeling of power electronics circuits and to explain sometimes rather sophisticated phenomena. Two projects are discussed. The so-called Inetele project is financed by the Leonardo da Vinci program of the European Union (EU); it is a collaborative project between numerous EU universities and institutes to develop a state-of-the-art curriculum in Electrical Engineering. Another cooperative project, with participation of Japanese, European and Australian institutes, focuses especially on developing an e-learning curriculum and interactive design and modeling tools, and furthermore on the development of a virtual laboratory. Snapshots from these two projects are presented.

  3. A Web-based Tool for Automatizing the Software Process Improvement Initiatives in Small Software Enterprises

    NARCIS (Netherlands)

    Garcia, I.; Pacheco, C.

    2010-01-01

    Top-down process improvement approaches provide a high-level model of what the process of a software development organization should be. Such models are based on the consensus of a designated working group on how software should be developed or maintained. They are very useful in that they provide g

  4. Current tools and techniques in library science

    CERN Document Server

    Ralhan, Punit

    2009-01-01

    The book undertakes a comprehensive look at the trend of information explosion in library science throughout the ages with the aid of technology, and at the contemporary tools in the field. The trend of library policy is clearly towards the ideal of making all information available without delay to all people. Because of technological progress, however, the difficulties of accomplishing this goal are formidable and growing.

  5. New tools for digital medical image processing implemented in DIP software

    Energy Technology Data Exchange (ETDEWEB)

    Araujo, Erica A.C.; Santana, Ivan E. [Instituto Federal de Educacao, Ciencia e Tecnologia de Pernambuco, Recife, PE (Brazil); Lima, Fernando R.A., E-mail: falima@cnen.gov.b [Centro Regional de Ciencias Nucleares, (CRCN/NE-CNEN-PE), Recife, PE (Brazil); Viera, Jose W. [Escola Politecnica de Pernambuco, Recife, PE (Brazil)

    2011-07-01

    The anthropomorphic models used in computational dosimetry, also called phantoms, are mostly built from stacks of CT (Computed Tomography) or MRI (Magnetic Resonance Imaging) images obtained from scans of patients or volunteers. The construction of voxel phantoms requires computational processing to transform image formats, compress two-dimensional (2D) images to form three-dimensional (3D) arrays, and perform quantization, resampling, enhancement, restoration and image segmentation, among other tasks. Computational dosimetry researchers rarely find all these capabilities in a single software package, which often slows the development of their research or leads to inadequate use of alternative tools. The need to integrate the various tasks of digital image processing to obtain an image that can be used in a computational model of exposure led to the development of the DIP (Digital Image Processing) software. This software reads, writes and edits binary files containing the 3D matrix corresponding to a stack of cross-sectional images of a given geometry, which can be a human body or another volume of interest. It can also read any type of computer image and perform conversions. When a task involves only one output image, it is saved in the standard Windows JPEG format. When it involves a stack of images, the binary output file is called SGI (Interactive Graphic Simulations, a symbol already used in other publications of the Research Group in Numerical Dosimetry). The following paper presents the third version of the DIP software and emphasizes the new tools it implements. Currently it has the menus Basics, Views, Spatial Domain, Frequency Domain, Segmentations and Study. Each menu contains items and subitems with features that generally require an image as input and produce an image or an attribute as output. (author)

  6. Software engineering capability for Ada (GRASP/Ada Tool)

    Science.gov (United States)

    Cross, James H., II

    1995-01-01

    The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) has successfully created and prototyped a new algorithmic-level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis has been on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada source code. A new Motif-compliant graphical user interface has been developed for the GRASP/Ada prototype.

  7. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    Directory of Open Access Journals (Sweden)

    Marilyn Wilhelmina Leonora Monster

    2015-12-01

    Full Text Available The multispecimen protocol (MSP) is a method to estimate the Earth’s magnetic field’s past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.

  8. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    Science.gov (United States)

    Monster, Marilyn; de Groot, Lennart; Dekkers, Mark

    2015-12-01

    The multispecimen protocol (MSP) is a method to estimate the Earth's magnetic field's past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.

  9. C++ Software Quality in the ATLAS experiment: Tools and Experience

    CERN Document Server

    Martin-Haugh, Stewart; The ATLAS collaboration

    2017-01-01

    In this paper we explain how C++ code quality is managed in ATLAS using a range of tools, from compile-time through to run-time testing, and reflect on the substantial progress made in the last two years, largely through the use of static analysis tools such as Coverity®, an industry-standard tool which enables quality comparison with general open-source C++ code. Other available code analysis tools are also discussed, as is the role of unit testing, with an example of how the GoogleTest framework can be applied to our codebase.

  10. Simulation and visualization tool design for robot software

    NARCIS (Netherlands)

    Lu, Zhou; Ran, Tjalling; Broenink, Jan F.; Chalmers, K.; Pedersen, J.B.

    2016-01-01

    Modern embedded systems are designed for multiple and increasingly demanding tasks. Complex concurrent software is required by multi-task automated service robotics for implementing their challenging (control) algorithms. TERRA is a Communicating Sequential Processes (CSP) algebra-based Eclipse grap

  11. A Survey of Reliability, Maintainability, Supportability, and Testability Software Tools

    Science.gov (United States)

    1991-04-01

    Industrial Reliability Program. ABSTRACT: Program is utilized for analysis of industrial equipment for which military requirements are not applicable... Can communicate with other TECNASA software and has British or Portuguese menus. MACHINES: IBM PC. POC: TECNASA, Attn: Jose L. Barletta, Electronica... termed "performability". Models both repairable and nonrepairable systems. MACHINES: No Data. POC: Industrial Technology Institute. 20. NAME: METFAC

  12. A SOFTWARE TOOL FOR EXPERIMENTAL STUDY LEAP MOTION

    Directory of Open Access Journals (Sweden)

    Georgi Krastev

    2015-12-01

    Full Text Available The paper aims to present computer application that illustrates Leap Motion controller’s abilities. It is a peripheral and software for PC, which enables control by natural user interface based on gestures. The publication also describes how the controller works and its main advantages/disadvantages. Some apps using leap motion controller are discussed.

  13. Towards an Interoperability Ontology for Software Development Tools

    Science.gov (United States)

    2003-03-01

    The description of feature models was tied to the introduction of the Feature-Oriented Domain Analysis (FODA) [KANG90] approach in the late eighties... Feature-oriented domain analysis (FODA) is a domain analysis method developed at the Software... was introduced in FODA (features are typically arranged in a hierarchical structure that spans a tree) by adding some additional information, such

  14. Vulnerability management tools for COTS software - A comparison

    NARCIS (Netherlands)

    Welberg, S.M.

    2008-01-01

    In this paper, we compare vulnerability management tools in two stages. In the first stage, we perform a global comparison involving thirty tools available in the market. A framework composed of several criteria based on scope and analysis is used for this comparison. From this global view of the to

  15. Use of software tools for calculating flow accelerated corrosion of nuclear power plant equipment and pipelines

    Science.gov (United States)

    Naftal', M. M.; Baranenko, V. I.; Gulina, O. M.

    2014-06-01

    The results obtained from calculations of flow accelerated corrosion of equipment and pipelines operating at nuclear power plants constructed on the basis of PWR, VVER, and RBMK reactors carried out using the EKI-02 and EKI-03 software tools are presented. It is shown that the calculation error does not exceed its value indicated in the qualification certificates for these software tools. It is pointed out that calculations aimed at predicting the service life of pipelines and efficient surveillance of flow accelerated corrosion wear are hardly possible without using the above-mentioned software tools.

  16. Software architecture and engineering for patient records: current and future.

    Science.gov (United States)

    Weng, Chunhua; Levine, Betty A; Mun, Seong K

    2009-05-01

    During the "The National Forum on the Future of the Defense Health Information System," a track focusing on "Systems Architecture and Software Engineering" included eight presenters. These presenters identified three key areas of interest in this field, which include the need for open enterprise architecture and a federated database design, net centrality based on service-oriented architecture, and the need for focus on software usability and reusability. The eight panelists provided recommendations related to the suitability of service-oriented architecture and the enabling technologies of grid computing and Web 2.0 for building health services research centers and federated data warehouses to facilitate large-scale collaborative health care and research. Finally, they discussed the need to leverage industry best practices for software engineering to facilitate rapid software development, testing, and deployment.

  17. Vienna VLBI Software - Current release and plans for the future

    Science.gov (United States)

    Madzak, M.; Böhm, J.; Böhm, S.; Krásná, H.; Nilsson, T.; Plank, L.; Tierno Ros, C.; Schuh, H.; Soja, B.; Sun, J.; Teke, K.

    2013-08-01

    The Vienna VLBI Software (VieVS) is a geodetic Very Long Baseline Interferometry (VLBI) data analysis software package that has been developed at the Vienna University of Technology since 2008. This paper gives an overview of its capabilities, including scheduling and simulation of VLBI observations. The latest release, version 2.1, includes a graphical user interface. A few results and planned future developments are presented as well.

  18. Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model.

    Science.gov (United States)

    Damij, Nadja; Boškoski, Pavle; Bohanec, Marko; Mileva Boshkoska, Biljana

    2016-01-01

    The omnipresent need for optimisation requires constant improvement of companies' business processes (BPs). Minimising the risk of an inappropriate BP being implemented is usually done by simulating the newly developed BP under various initial conditions and "what-if" scenarios. Effectual business process simulation software (BPSS) is a prerequisite for accurate analysis of a BP. Characterisation of a BPSS tool is a challenging task due to complex selection criteria that include quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for the ranking of BPSS tools based on their technical characteristics, employing DEX and qualitative-to-quantitative (QQ) methodology. Consequently, the decision expert feeds in the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible for adding new criteria to the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with respect to currently available results.
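
    DEX aggregates qualitative attributes bottom-up through a criteria hierarchy, and the QQ step refines ties numerically. The sketch below conveys only the flavour of such bottom-up aggregation: the two-level hierarchy and the linear weights are invented stand-ins for DEX's utility rule tables.

    # Invented criteria hierarchy with weights -- for illustration only.
    hierarchy = {
        "overall": {"simulation": 0.5, "usability": 0.5},
        "simulation": {"statistics": 0.6, "what_if": 0.4},
        "usability": {"visual": 0.7, "reporting": 0.3},
    }

    def score(node, leaf_scores):
        """Aggregate leaf ratings up the hierarchy as weighted sums."""
        if node not in hierarchy:            # a leaf criterion: use the rating directly
            return leaf_scores[node]
        return sum(weight * score(child, leaf_scores)
                   for child, weight in hierarchy[node].items())

    bpss_tool = {"statistics": 4, "what_if": 3, "visual": 5, "reporting": 2}
    print(score("overall", bpss_tool))       # about 3.85 on the leaves' 1-5 scale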

  19. Survivability as a Tool for Evaluating Open Source Software

    Science.gov (United States)

    2015-06-01

    technology has expanded rapidly in recent years, largely due to the advancing complexity of military systems. The F-22 Raptor, which demonstrated initial... manner in order to feed OSS survivability analyses? 4. How could an adversary destroy a system or force a mission failure by manipulating software embedded... Malicious code initiates programmed flight path. What is the intent of the malicious code? What impact does initiation have on the UAV’s behavior? 3. Code

  1. Software tools for quantification of X-ray microtomography at the UGCT

    Energy Technology Data Exchange (ETDEWEB)

    Vlassenbroeck, J. [Department of Subatomic and Radiation Physics, Ghent University, Proeftuinstraat 86, B-9000 Gent (Belgium)], E-mail: jelle.vlassenbroeck@ugent.be; Dierick, M.; Masschaele, B. [Department of Subatomic and Radiation Physics, Ghent University, Proeftuinstraat 86, B-9000 Gent (Belgium); Cnudde, V. [Department of Geology and Soil Science, Ghent University, Krijgslaan 281/S8, B-9000 Gent (Belgium); Van Hoorebeke, L. [Department of Subatomic and Radiation Physics, Ghent University, Proeftuinstraat 86, B-9000 Gent (Belgium); Jacobs, P. [Department of Geology and Soil Science, Ghent University, Krijgslaan 281/S8, B-9000 Gent (Belgium)

    2007-09-21

    The technique of X-ray microtomography using X-ray tube radiation offers an interesting tool for the non-destructive investigation of a wide range of materials. A major challenge lies in the analysis and quantification of the resulting data, allowing for a full characterization of the sample under investigation. In this paper, we discuss the software tools for reconstruction and analysis of tomographic data that are being developed at the UGCT. The tomographic reconstruction is performed using Octopus, a high-performance and user-friendly software package. The reconstruction process transforms the raw acquisition data into a stack of 2D cross-sections through the sample, resulting in a 3D data set. A number of artifact and noise reduction algorithms are integrated to reduce ring artifacts, beam hardening artifacts, COR misalignment, detector or stage tilt, pixel non-linearities, etc. These corrections are very important to facilitate the analysis of the 3D data. The analysis of the 3D data focuses primarily on the characterization of pore structures, but will be extended to other applications. A first package for the analysis of pore structures in three dimensions was developed under Matlab. A new package, called Morpho+, is being developed in a C++ environment, with optimizations and extensions of the previously used algorithms. The current status of this project will be discussed. Examples of pore analysis can be found in pharmaceuticals, material science, geology and numerous other fields.

  2. Toolkit of Available EPA Green Infrastructure Modeling Software: Watershed Management Optimization Support Tool (WMOST)

    Science.gov (United States)

    Watershed Management Optimization Support Tool (WMOST) is a software application designed to facilitate integrated water resources management across wet and dry climate regions. It allows water resources managers and planners to screen a wide range of practices across their watersh...

  3. PROBEmer: A web-based software tool for selecting optimal DNA oligos

    National Research Council Canada - National Science Library

    Emrich, Scott J; Lowe, Mary; Delcher, Arthur L

    2003-01-01

    PROBEmer (http://probemer.cs.loyola.edu) is a web-based software tool that enables a researcher to select optimal oligos for PCR applications and multiplex detection platforms including oligonucleotide microarrays and bead-based arrays...

  4. Contingency Contractor Optimization Phase 3 Sustainment Third-Party Software List - Contingency Contractor Optimization Tool - Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa

    2016-05-01

    The Contingency Contractor Optimization Tool - Prototype (CCOT-P) requires several third-party software packages. These are documented below for each of the CCOT-P elements: client, web server, database server, solver, web application and polling application.

  5. An Overview of Public Access Computer Software Management Tools for Libraries

    Science.gov (United States)

    Wayne, Richard

    2004-01-01

    An IT decision maker gives an overview of public access PC software that's useful in controlling session length and scheduling, Internet access, print output, security, and the latest headaches: spyware and adware. In this article, the author describes a representative sample of software tools in several important categories such as setup…

  6. Accuracy Test of Software Architecture Compliance Checking Tools – Test Instruction

    NARCIS (Netherlands)

    Pruijt, Leo; van der Werf, J.M.E.M.; Brinkkemper., Sjaak

    2015-01-01

    Software Architecture Compliance Checking (SACC) is an approach to verify conformance of implemented program code to high-level models of architectural design. Static SACC focuses on the modular software architecture and on the existence of rule-violating dependencies between modules. Accurate tool

  7. Tools of creation of urban geographic information systems based on software and open source

    Directory of Open Access Journals (Sweden)

    Володимир Пилипович Ткаченко

    2015-06-01

    Full Text Available Conceptual principles of the creation of Urban Geographic Information System (UGIS) tools based on open-source software products are considered. The organizational and technical structure of UGIS, the basic provisions of its conceptual architecture, and the functions of tool software modules and their means of implementation are provided. A rational organization of municipal information resources for the creation of functional UGIS subsystems is proposed, and examples of its implementation are provided.

  8. Managing clinical research data: software tools for hypothesis exploration.

    Science.gov (United States)

    Starmer, C F; Dietz, M A

    1990-07-01

    Data representation, data file specification, and the communication of data between software systems are playing increasingly important roles in clinical data management. This paper describes the concept of a self-documenting file that contains annotations or comments that aid visual inspection of the data file. We describe access of data from annotated files and illustrate data analysis with a few examples derived from the UNIX operating environment. Use of annotated files provides the investigator with both a useful representation of the primary data and a repository of comments that describe some of the context surrounding data capture.
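
    A reader for such a self-documenting file is short. The sketch below assumes '#' marks an annotation, which is a convention chosen here for illustration, not necessarily the authors'; comments are simply filtered out before analysis.

    import io

    # Stand-in for an annotated data file on disk.
    raw = io.StringIO(
        "# Patient 007, exercise ECG, 1990-03-12\n"
        "# Columns: time (s), heart rate (bpm)\n"
        "0.0 71\n"
        "30.0 94\n"
        "60.0 118\n"
    )

    records = []
    for line in raw:
        line = line.strip()
        if not line or line.startswith("#"):   # keep annotations out of the analysis
            continue
        t, hr = line.split()
        records.append((float(t), float(hr)))

    print(records)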

  9. A Process Framework for Designing Software Reference Architectures for Providing Tools as a Service

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali; Probst, Christian W.

    2016-01-01

    Software Reference Architecture (SRA), which is a generic architecture solution for a specific type of software systems, provides the foundation for the design of concrete architectures in terms of architecture design guidelines and architecture elements. The complexity and size of certain types of software systems need customized and systematic SRA design and evaluation methods. In this paper, we present a software Reference Architecture Design process Framework (RADeF) that can be used for analysis, design and evaluation of the SRA for provisioning of Tools as a Service as part of a cloud-enabled workSPACE (TSPACE). The framework is based on the state-of-the-art results from literature and our experiences with designing software architectures for cloud-based systems. We have applied RADeF to the SRA design of two types of TSPACE: software architecting TSPACE and software implementation TSPACE...

  10. Software Tools | Office of Cancer Clinical Proteomics Research

    Science.gov (United States)

    The CPTAC program develops new approaches to elucidate aspects of the molecular complexity of cancer from large-scale proteogenomic datasets, and to advance them toward precision medicine. Part of the CPTAC mission is to make data and tools available and accessible to the greater research community to accelerate the discovery process.

  11. PhosphoHunter: An Efficient Software Tool for Phosphopeptide Identification

    Directory of Open Access Journals (Sweden)

    Alessandra Tiengo

    2015-01-01

    Full Text Available Phosphorylation is a protein posttranslational modification. It is responsible for the activation/inactivation of disease-related pathways, thanks to its role as a "molecular switch." The study of phosphorylated proteins has become a key point for proteomic analyses focused on the identification of diagnostic/therapeutic targets. Liquid chromatography coupled to tandem mass spectrometry (LC-MS/MS) is the most widely used analytical approach. Although unmodified peptides are automatically identified by consolidated algorithms, phosphopeptides still require automated tools to avoid time-consuming manual interpretation. To improve phosphopeptide identification efficiency, a novel procedure was developed and implemented in a Perl/C tool called PhosphoHunter, here proposed and evaluated. It includes a preliminary heuristic step for filtering out the MS/MS spectra produced by nonphosphorylated peptides before sequence identification. A method to assess the statistical significance of identified phosphopeptides was also formulated. PhosphoHunter's performance was tested on a dataset of 1,500 MS/MS spectra and compared with two other tools: Mascot and Inspect. Comparisons demonstrated that a strong point of PhosphoHunter is sensitivity, suggesting that it is able to identify real phosphopeptides with superior performance. Performance indexes depend on a single parameter (intensity threshold) that users can tune according to the study aim. All three tools localized >90% of phosphosites.

  13. lipID--a software tool for automated assignment of lipids in mass spectra.

    Science.gov (United States)

    Hübner, Göran; Crone, Catharina; Lindner, Buko

    2009-12-01

    A new software tool called lipID is reported, which supports the identification of glycerophospholipids, glycosphingolipids, fatty acids and small oligosaccharides in mass spectra. The user-extendable software is a Microsoft (MS) Excel Add-In developed using Visual Basic for Applications and is compatible with all versions of MS Excel since MS Excel 97. It processes single mass-to-charge values as well as mass lists, considering a number of user-defined options. The software's mode of operation, usage and options are explained, and the benefits and limitations of the tool are illustrated by means of three typical examples of lipid analyses.
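
    The core assignment step that the Add-In automates can be pictured as a tolerance match of observed mass-to-charge values against candidate lipid masses; the two candidates and the ppm window below are invented for illustration and are not taken from lipID.

        # Toy mass-list assignment: report every candidate whose [M+H]+ mass
        # lies within a ppm tolerance of an observed m/z value.
        CANDIDATES = {
            "PC(34:1) [M+H]+": 760.5851,
            "PE(34:1) [M+H]+": 718.5381,
        }

        def assign(mz_list, tol_ppm=10.0):
            return [(mz, name)
                    for mz in mz_list
                    for name, mass in CANDIDATES.items()
                    if abs(mz - mass) / mass * 1e6 <= tol_ppm]

        print(assign([760.585, 500.300]))  # -> [(760.585, 'PC(34:1) [M+H]+')]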

  14. The Web Interface Template System (WITS), a software developer's tool

    Energy Technology Data Exchange (ETDEWEB)

    Lauer, L.J.; Lynam, M.; Muniz, T. [Sandia National Labs., Albuquerque, NM (United States). Financial Systems Dept.

    1995-11-01

    The Web Interface Template System (WITS) is a tool for software developers. WITS is a three-tiered, object-oriented system operating in a Client/Server environment. This tool can be used to create software applications that have a Web browser as the user interface and access a Sybase database. Development, modification, and implementation are greatly simplified because the developer can change and test definitions immediately, without writing or compiling any code. This document explains WITS functionality, the system structure and components of WITS, and how to obtain, install, and use the software system.

  15. Software tool for the prosthetic foot modeling and stiffness optimization.

    Science.gov (United States)

    Strbac, Matija; Popović, Dejan B

    2012-01-01

    We present the procedure for the optimization of the stiffness of the prosthetic foot. The procedure allows the selection of the elements of the foot and the materials used for the design. The procedure is based on an optimization where the cost function is the minimization of the difference between the knee joint torques of healthy walking and walking with a transfemoral prosthesis. We present a simulation environment that allows the user to interactively vary the foot geometry and track the changes in the knee torque that arise from these adjustments. The software allows the estimation of the optimal prosthetic foot elasticity and geometry. We show that altering model attributes such as the length of the elastic foot segment or its elasticity leads to significant changes in the estimated knee torque required for a given trajectory.
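
    Stripped to its essentials, the optimization is a one-dimensional search for the stiffness that makes the prosthetic knee torque match the healthy reference; the toy torque model below is a stand-in for the paper's full gait simulation, and all numbers are invented.

        import numpy as np
        from scipy.optimize import minimize_scalar

        t = np.linspace(0.0, 1.0, 50)              # normalized stance phase
        healthy = 30.0 * np.sin(np.pi * t)         # reference knee torque, N*m

        def torque_model(k):                       # crude linear-stiffness model
            return 0.4 * k * np.sin(np.pi * t)

        cost = lambda k: np.sum((torque_model(k) - healthy) ** 2)
        res = minimize_scalar(cost, bounds=(10.0, 200.0), method="bounded")
        print(f"optimal stiffness ~ {res.x:.1f}")  # -> about 75.0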

  16. Improving Fund Risk Management by Using New Software Tools Technology

    Directory of Open Access Journals (Sweden)

    Stephanos Papadamou

    2004-01-01

    Full Text Available This paper introduces a new MATLAB-based toolbox for computer-aided mutual fund risk evaluation. In the age of computerized trading, financial services companies and independent investors must quickly investigate fund investment style and market risk. The Fund Risk toolbox is financial software that includes a set of functions based on value at risk (VaR) and expected tail loss (ETL) methodology for graphical presentation of risk forecasts, evaluation of different risk models and identification of fund investment style. The sample of historical data can be divided into an estimation rolling window and the back-testing period. MATLAB's vast built-in mathematical and financial functionality, along with the fact that it is both an interpreted and compiled programming language, makes this toolbox easily extendable by adding new complicated risk models with minimum programming effort.
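
    At their core, the toolbox's two risk measures reduce to order statistics of a returns window; a Python rendering of historical-simulation VaR and ETL (the toolbox itself is MATLAB) could look like this, with synthetic returns standing in for fund data.

        import numpy as np

        def var_etl(returns, alpha=0.95):
            losses = -np.asarray(returns)        # losses are negated returns
            var = np.quantile(losses, alpha)     # loss not exceeded w.p. alpha
            etl = losses[losses >= var].mean()   # mean loss beyond the VaR
            return var, etl

        rng = np.random.default_rng(0)
        daily = rng.normal(0.0005, 0.01, 500)    # synthetic daily fund returns
        v, e = var_etl(daily)
        print(f"95% VaR {v:.4f}, ETL {e:.4f}")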

  17. Software Tool for the Prosthetic Foot Modeling and Stiffness Optimization

    Directory of Open Access Journals (Sweden)

    Matija Štrbac

    2012-01-01

    Full Text Available We present the procedure for the optimization of the stiffness of the prosthetic foot. The procedure allows the selection of the elements of the foot and the materials used for the design. The procedure is based on an optimization where the cost function is the minimization of the difference between the knee joint torques of healthy walking and walking with a transfemoral prosthesis. We present a simulation environment that allows the user to interactively vary the foot geometry and track the changes in the knee torque that arise from these adjustments. The software allows the estimation of the optimal prosthetic foot elasticity and geometry. We show that altering model attributes such as the length of the elastic foot segment or its elasticity leads to significant changes in the estimated knee torque required for a given trajectory.

  18. Collaborative Software Development in Support of Fast Adaptive AeroSpace Tools (FAAST)

    Science.gov (United States)

    Kleb, William L.; Nielsen, Eric J.; Gnoffo, Peter A.; Park, Michael A.; Wood, William A.

    2003-01-01

    A collaborative software development approach is described. The software product is an adaptation of proven computational capabilities combined with new capabilities to form the Agency's next generation aerothermodynamic and aerodynamic analysis and design tools. To efficiently produce a cohesive, robust, and extensible software suite, the approach uses agile software development techniques; specifically, project retrospectives, the Scrum status meeting format, and a subset of Extreme Programming's coding practices are employed. Examples are provided which demonstrate the substantial benefits derived from employing these practices. Also included is a discussion of issues encountered when porting legacy Fortran 77 code to Fortran 95 and a Fortran 95 coding standard.

  19. Atrioventricular junction (AVJ) motion tracking: a software tool with ITK/VTK/Qt.

    Science.gov (United States)

    Pengdong Xiao; Shuang Leng; Xiaodan Zhao; Hua Zou; Ru San Tan; Wong, Philip; Liang Zhong

    2016-08-01

    The quantitative measurement of atrioventricular junction (AVJ) motion is an important index of ventricular function over one cardiac cycle, including systole and diastole. In this paper, a software tool that conducts AVJ motion tracking from cardiovascular magnetic resonance (CMR) images is presented, built using the Insight Segmentation and Registration Toolkit (ITK), the Visualization Toolkit (VTK) and Qt. The software tool is written in C++ using the Visual Studio Community 2013 integrated development environment (IDE), which contains both an editor and a Microsoft compiler. The software package has been successfully implemented. From the software engineering practice, it is concluded that ITK, VTK, and Qt are very handy software systems for implementing automatic image analysis functions for CMR images, such as quantitative measurement of motion by visual tracking.

  20. PTaaS: Platform for Providing Software Developing Applications and Tools as a Service

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali

    2014-01-01

    Global Software Development (GSD) consists of a large number of complex activities that include not only technological but also social aspects. A large number of applications and tools have been devised to provide solutions to the challenges of GSD that emerge as a result of distributed development teams. However, the technological solutions that have been proposed so far are limited in their ability to meet specific GSD challenges and emerging trends of GSD, in which software development is not only global but also involves multiple organizations. Involvement of multiple organizations in GSD increases the complexity… and demands technological support for it that is not limited to one specific tool and a particular phase of the software development life cycle. In this thesis, we have explored the possibility of offering software development applications and tools as services that can be acquired on demand according to the software…

  1. Computer-aided design in power engineering. Application of software tools

    Energy Technology Data Exchange (ETDEWEB)

    Stojkovic, Zlatan

    2012-07-01

    This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems, presenting many applications of design in the field of power systems; it is useful for educational purposes and practical work. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation, guiding through complex problems on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design; several design example calculations are carried out using standard engineering software such as MATLAB, EMTP/ATP, Excel and Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation, using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools in project management in power systems is discussed, with emphasis on the standard software MS Excel and MS Project.

  2. PT-SAFE: a software tool for development and annunciation of medical audible alarms.

    Science.gov (United States)

    Bennett, Christopher L; McNeer, Richard R

    2012-03-01

    Recent reports by The Joint Commission as well as the Anesthesia Patient Safety Foundation have indicated that medical audible alarm effectiveness needs to be improved. Several recent studies have explored various approaches to improving the audible alarms, motivating the authors to develop real-time software capable of comparing such alarms. We sought to devise software that would allow for the development of a variety of audible alarm designs that could also integrate into existing operating room equipment configurations. The software is meant to be used as a tool for alarm researchers to quickly evaluate novel alarm designs. A software tool was developed for the purpose of creating and annunciating audible alarms. The alarms consisted of annunciators that were mapped to vital sign data received from a patient monitor. An object-oriented approach to software design was used to create a tool that is flexible and modular at run-time, can annunciate wave-files from disk, and can be programmed with MATLAB by the user to create custom alarm algorithms. The software was tested in a simulated operating room to measure technical performance and to validate the time-to-annunciation against existing equipment alarms. The software tool showed efficacy in a simulated operating room environment by providing alarm annunciation in response to physiologic and ventilator signals generated by a human patient simulator, on average 6.2 seconds faster than existing equipment alarms. Performance analysis showed that the software was capable of supporting up to 15 audible alarms on a mid-grade laptop computer before audio dropouts occurred. These results suggest that this software tool provides a foundation for rapidly staging multiple audible alarm sets from the laboratory to a simulation environment for the purpose of evaluating novel alarm designs, thus producing valuable findings for medical audible alarm standardization.
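
    The annunciator-to-vital-sign mapping at the heart of the tool can be reduced to a tiny sketch: a stream of values is checked against limits and a callback annunciates on a breach. PT-SAFE itself plays wave files and is programmed through MATLAB; the names and limits below are purely illustrative.

        # Hypothetical limit alarm: 'annunciate' would trigger a sound in a
        # real system; here it just prints.
        def monitor(stream, low=90, high=160, annunciate=print):
            for t, systolic in stream:
                if not low <= systolic <= high:
                    annunciate(f"t={t}s: value {systolic} outside [{low}, {high}]")

        monitor([(0, 120), (1, 85), (2, 130)])  # fires once, at t=1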

  3. Tools to Support the Reuse of Software Assets for the NASA Earth Science Decadal Survey Missions

    Science.gov (United States)

    Mattmann, Chris A.; Downs, Robert R.; Marshall, James J.; Most, Neal F.; Samadi, Shahin

    2011-01-01

    The NASA Earth Science Data Systems (ESDS) Software Reuse Working Group (SRWG) is chartered with the investigation, production, and dissemination of information related to the reuse of NASA Earth science software assets. One major current objective is to engage the NASA decadal missions in areas relevant to software reuse. In this paper we report on the current status of these activities. First, we provide some background on the SRWG in general and then discuss the group's flagship recommendation, the NASA Reuse Readiness Levels (RRLs). We continue by describing areas in which mission software may be reused in the context of NASA decadal missions. We conclude the paper with pointers to future directions.

  4. Development of tools for safety analysis of control software in advanced reactors

    Energy Technology Data Exchange (ETDEWEB)

    Guarro, S.; Yau, M.; Motamed, M. [Advanced Systems Concepts Associates, El Segundo, CA (United States)

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described.

  5. D-VASim: A Software Tool to Simulate and Analyze Genetic Logic Circuits

    DEFF Research Database (Denmark)

    Baig, Hasan; Madsen, Jan

    2016-01-01

    …early-stage researchers with limited experience in the field of biology. The Solution: using LabVIEW to develop a user-friendly simulation tool named Dynamic Virtual Analyzer and Simulator (D-VASim), which is the first software tool in the domain of synthetic biology that provides a virtual laboratory environment…

  6. A free software tool for the development of decision support systems

    Directory of Open Access Journals (Sweden)

    COLONESE, G

    2008-06-01

    Full Text Available This article describes PostGeoOlap, a free, open-source software tool for decision support that integrates OLAP (On-Line Analytical Processing) and GIS (Geographical Information Systems). Besides describing the tool, we show how it can be used to achieve effective and low-cost decision support that is adequate for small and medium companies and for small public offices.

  7. Data Visualization: An Exploratory Study into the Software Tools Used by Businesses

    Science.gov (United States)

    Diamond, Michael; Mattia, Angela

    2015-01-01

    Data visualization is a key component to business and data analytics, allowing analysts in businesses to create tools such as dashboards for business executives. Various software packages allow businesses to create these tools in order to manipulate data for making informed business decisions. The focus is to examine what skills employers are…

  9. Nmag micromagnetic simulation tool - software engineering lessons learned

    CERN Document Server

    Fangohr, Hans; Franchin, Matteo

    2016-01-01

    We review design decisions and their impact for the open source code Nmag from a software engineering in computational science point of view. Key lessons include that the approach of encapsulating the simulation functionality in a library of a general-purpose language, here Python, eliminates the need for configuration files, provides the greatest flexibility in using the simulation, allows mixing of multiple simulations and pre- and post-processing in the same (Python) file, and allows users to benefit from the rich Python ecosystem of scientific packages. The choice of programming language (OCaml) for the computational core did not resonate with the users of the package (who are not computer scientists) and was suboptimal. The choice of Python for the top-level user interface was very well received by users from the science and engineering community. The from-source installation, in which key requirements were compiled from a tarball, was remarkably robust. In places, the code is a lot more ambitious than necessary…

  10. What's New in Software? Current Sources of Information Boost Effectiveness of Computer-Assisted Instruction.

    Science.gov (United States)

    Ellsworth, Nancy J.

    1990-01-01

    This article reviews current resources on computer-assisted instruction. Included are sources of software and hardware evaluations, advances in current technology, research, an information hotline, and inventories of available technological assistance. (DB)

  11. Software Tools for Robust Analysis of High-Dimensional Data

    Directory of Open Access Journals (Sweden)

    Valentin Todorov

    2014-06-01

    Full Text Available The present work discusses robust multivariate methods specifically designed for high dimensions. Their implementation in R is presented and their application is illustrated on examples. The first group are algorithms for outlier detection, already introduced elsewhere and implemented in other packages. The value added of the new package is that all methods follow the same design pattern and thus can use the same graphical and diagnostic tools. The next topic covered is sparse principal components, including an object-oriented interface to the standard method proposed by Zou, Hastie, and Tibshirani (2006) and the robust one proposed by Croux, Filzmoser, and Fritz (2013). Robust partial least squares (see Hubert and Vanden Branden 2003) as well as partial least squares for discriminant analysis conclude the scope of the new package.
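
    The package's outlier detectors rest on robust Mahalanobis distances compared against a chi-square cutoff; for readers working outside R, the same idea looks roughly like this in Python, with scikit-learn's MinCovDet standing in for the package's robust covariance estimators.

        import numpy as np
        from scipy.stats import chi2
        from sklearn.covariance import MinCovDet

        rng = np.random.default_rng(0)
        X = rng.normal(size=(100, 5))
        X[:3] += 6                                  # plant three clear outliers

        d2 = MinCovDet(random_state=0).fit(X).mahalanobis(X)  # squared distances
        cutoff = chi2.ppf(0.975, df=X.shape[1])     # 97.5% chi-square quantile
        print("flagged:", np.flatnonzero(d2 > cutoff))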

  12. Towards a Reference Architecture to Provision Tools as a Service for Global Software Development

    DEFF Research Database (Denmark)

    Chauhan, Aufeef; Babar, Muhammad Ali

    2014-01-01

    Organizations involved in Global Software Development (GSD) face challenges in terms of having access to an appropriate set of tools for performing distributed engineering and development activities, integration between heterogeneous desktop and web-based tools, management of artifacts developed and maintained over distant locations using different kinds of tools, traceability among artifacts, and access to artifacts and data of a sensitive nature. These challenges pose additional constraints on specific projects and reduce the possibility to carry out their engineering and development in globally…

  14. SOLAR: A software tool for meteorological data processing

    OpenAIRE

    Dudić, Dragana; Zlatanović, Ivan; Gligorević, Kosta; Urošević, Tijana

    2014-01-01

    The current standards for solar components and systems testing (SRB, EN, ISO, DIN, etc.) imply the determination of the system-relevant parameters at the monitored locality. In this regard, it is necessary to prepare all the input data for the selected location in order to determine the standard values, such as thermal collector performance, the factor of changes in the incident angle of radiation, heat capacity, pressure values and various quality tests. The parameters collected with appropriate measu…

  15. Teaching structure: student use of software tools for understanding macromolecular structure in an undergraduate biochemistry course.

    Science.gov (United States)

    Jaswal, Sheila S; O'Hara, Patricia B; Williamson, Patrick L; Springer, Amy L

    2013-01-01

    Because understanding the structure of biological macromolecules is critical to understanding their function, students of biochemistry should become familiar not only with viewing, but also with generating and manipulating structural representations. We report a strategy from a one-semester undergraduate biochemistry course to integrate use of structural representation tools into both laboratory and homework activities. First, early in the course we introduce the use of readily available open-source software for visualizing protein structure, coincident with modules on amino acid and peptide bond properties. Second, we use these same software tools in lectures and incorporate images and other structure representations in homework tasks. Third, we require a capstone project in which teams of students examine a protein-nucleic acid complex and then use the software tools to illustrate for their classmates the salient features of the structure, relating how the structure helps explain biological function. To ensure engagement with a range of software and database features, we generated a detailed template file that can be used to explore any structure, and that guides students through specific applications of many of the software tools. In presentations, students demonstrate that they are successfully interpreting structural information, and using representations to illustrate particular points relevant to function. Thus, over the semester students integrate information about structural features of biological macromolecules into the larger discussion of the chemical basis of function. Together these assignments provide an accessible introduction to structural representation tools, allowing students to add these methods to their biochemical toolboxes early in their scientific development.

  16. Data Center IT Equipment Energy Assessment Tools: Current State of Commercial Tools, Proposal for a Future Set of Assessment Tools

    Energy Technology Data Exchange (ETDEWEB)

    Radhakrishnan, Ben D. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); National Univ., San Diego, CA (United States). School of Engineering

    2012-06-30

    This research project, which was conducted during the Summer and Fall of 2011, investigated some commercially available assessment tools with a focus on IT equipment, to see if such tools could round out the DC Pro tool suite. In this research, the assessment capabilities of the various tools were compiled to help make “non-biased” information available to the public. This research should not be considered exhaustive of all existing vendor tools, although a number of vendors were contacted. Large IT equipment OEMs like IBM and Dell provide proprietary internal automated software which does not work on any other vendor's IT equipment. However, the project found two companies with products that showed promise in performing automated assessments for IT equipment from different OEM vendors. This report documents the research and provides a list of software products reviewed, contacts and websites, product details, discussions with specific companies, a set of recommendations, and next steps. As a result of this research, a simple 3-level approach to an IT assessment tool is proposed, along with an example of an assessment using a simple IT equipment data collection tool (Level 1, spreadsheet). The tool has been reviewed with the Green Grid and LBNL staff. The initial feedback has been positive, although further refinement of the tool will be necessary. Proposed next steps include a field trial of at least two vendors' software in two different data centers, with an objective to prove the concept and ascertain the extent of energy and computational assessment, ease of installation, and opportunities for continuous improvement. Based on the discussions, field trials (or case studies) are proposed with two vendors – JouleX (expected to be completed in 2012) and Sentilla.
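
    The proposed Level 1 (spreadsheet) assessment amounts to turning an equipment inventory into annual energy and cost figures; a sketch with placeholder numbers, not data from the report:

        # Hypothetical Level-1 inventory: nameplate or measured power per item.
        inventory = [
            {"item": "1U server", "count": 40, "watts": 350},
            {"item": "storage array", "count": 2, "watts": 1200},
        ]
        HOURS_PER_YEAR, USD_PER_KWH = 8760, 0.10

        kw = sum(d["count"] * d["watts"] for d in inventory) / 1000
        kwh = kw * HOURS_PER_YEAR
        print(f"IT load: {kw:.1f} kW, {kwh:,.0f} kWh/yr, ~${kwh * USD_PER_KWH:,.0f}/yr")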

  17. Sustaining an Online, Shared Community Resource for Models, Robust Open source Software Tools and Data for Volcanology - the Vhub Experience

    Science.gov (United States)

    Patra, A. K.; Valentine, G. A.; Bursik, M. I.; Connor, C.; Connor, L.; Jones, M.; Simakov, N.; Aghakhani, H.; Jones-Ivey, R.; Kosar, T.; Zhang, B.

    2015-12-01

    Over the last 5 years we have created a community collaboratory, Vhub.org [Palma et al., J. App. Volc. 3:2 doi:10.1186/2191-5040-3-2], as a place to find volcanology-related resources, a venue for users to disseminate tools, teaching resources and data, and an online platform to support collaborative efforts. As the community (current active users > 6000, from an estimated community of comparable size) embeds the tools in the collaboratory into educational and research workflows, it became imperative to: a) redesign tools into robust, open-source, reusable software for online and offline usage/enhancement; b) share large datasets with remote collaborators and other users seamlessly and securely; c) support complex workflows for uncertainty analysis, validation and verification, and data assimilation with large data. The focus on tool development/redevelopment has been twofold: firstly, to use best practices in software engineering and new hardware like multi-core and graphics processing units; secondly, to enhance capabilities to support inverse modeling, uncertainty quantification using large ensembles and design of experiments, calibration, and validation. Among the software engineering practices we follow are open-source development (facilitating community contributions), modularity and reusability. Our initial targets are four popular tools on Vhub: TITAN2D, TEPHRA2, PUFF and LAVA. Use of tools like these requires many observation-driven datasets, e.g. digital elevation models of topography, satellite imagery, field observations on deposits, etc. These data are often maintained in private repositories that are privately shared by “sneaker-net”. As a partial solution to this we tested mechanisms using iRODS software for online sharing of private data with public metadata and access limits. Finally, we adapted the use of workflow engines (e.g. Pegasus) to support the complex data and computing workflows needed for usage like uncertainty quantification for hazard analysis using physical…

  18. Cloud-Based SimJavaWeb Software Tool to Learn Simulation

    Directory of Open Access Journals (Sweden)

    A. Yu. Bykov

    2017-01-01

    Full Text Available Currently there is a trend in simulation towards distributed software tools, particularly ones using cloud technologies and the Internet. This article considers a simulation educational tool, implemented as a web application in the Java language with a special Java class library developed for simulation. It is focused on a discrete-event approach to modeling, similar to the GPSS language, and intended for queuing system simulation. The structure of the models obtained using this class library is similar to that of GPSS models. An interpreter for a simulation language similar to GPSS is also created using this class library, with some differences in individual statements. Simulation experiments are performed on the server side; on the client side, a browser with standard functions is used to enter the source code into an HTML form, so mobile devices can be used as clients. The source code of a model can be represented either in the Java language using the class library or in the language similar to GPSS. The simulation system implements functions specifically for the educational process. For example, a student can upload learning materials to the server, send developed software and reports of test control to the teacher via the Internet, and receive a detailed assessment of the results from the teacher. Detailed results of tests passed in the learning modules are also recorded, and some other functions are implemented in the system. As examples, the article considers models of the M/M/n/0 queuing system in Java with the class library and in the language similar to GPSS, shows simulation results, and presents the analytical model and calculations for this system. The analytical calculations proved that the modeling system is useful, as the simulation results agree with the analytical ones within acceptable error. Some approaches to interaction with students through the Internet, used in the modeling environment, can…
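
    The analytical check mentioned above has a closed form: for an M/M/n/0 loss system the blocking probability is given by the Erlang B formula, computed here (in Python rather than the article's Java) with the standard stable recursion over the offered load a = lambda/mu.

        def erlang_b(a, n):
            """Blocking probability of an M/M/n/0 system with offered load a."""
            b = 1.0
            for k in range(1, n + 1):
                b = a * b / (k + a * b)   # B(k) = a*B(k-1) / (k + a*B(k-1))
            return b

        print(f"a=2 Erlang, n=3 servers: blocking = {erlang_b(2.0, 3):.4f}")  # 0.2105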

  19. A Process Framework for Designing Software Reference Architectures for Providing Tools as a Service

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali; Probst, Christian W.

    2016-01-01

    In this paper, we present a software Reference Architecture Design process Framework (RADeF) for the SRA for provisioning of Tools as a Service as part of a cloud-enabled workSPACE (TSPACE). The framework is based on state-of-the-art results from the literature and our experiences with designing software architectures for cloud-based systems. We have applied RADeF to the SRA design of two types of TSPACE: software architecting TSPACE and software implementation TSPACE. The presented framework emphasizes keeping the conceptual meta-model of the domain under investigation at the core of the SRA design strategy and using it as a guiding tool for design, evaluation, implementation and evolution of the SRA. The framework also emphasizes considering the nature of the tools to be provisioned and the underlying cloud platforms to be used while designing the SRA. The framework recommends adoption of a multi-faceted approach for evaluation of the SRA and a quantifiable measurement scheme to evaluate its quality. We foresee that RADeF can facilitate software architects and researchers during…

  20. New Methods, Current Trends and Software Infrastructure for NLP

    CERN Document Server

    Cunningham, H; Gaizauskas, R J; Cunningham, Hamish; Wilks, Yorick; Gaizauskas, Robert J.

    1996-01-01

    The increasing use of `new methods' in NLP, which the NeMLaP conference series exemplifies, occurs in the context of a wider shift in the nature and concerns of the discipline. This paper begins with a short review of this context and significant trends in the field. The review motivates and leads to a set of requirements for support software of general utility for NLP research and development workers. A freely-available system designed to meet these requirements is described (called GATE - a General Architecture for Text Engineering). Information Extraction (IE), in the sense defined by the Message Understanding Conferences (ARPA \cite{Arp95}), is an NLP application in which many of the new methods have found a home (Hobbs \cite{Hob93}; Jacobs ed. …). GATE has been used for IE purposes, and this is described. Lastly we review related work.

  1. A Reference Architecture for Providing Tools as a Service to Support Global Software Development

    DEFF Research Database (Denmark)

    Chauhan, Aufeef

    2014-01-01

    Global Software Development (GSD) teams encounter challenges that are associated with the distribution of software development activities across multiple geographic regions. The limited support for performing collaborative development and engineering activities and the lack of sufficient support for maintaining and resolving dependencies and traceability across heterogeneous tools are major challenges for GSD teams. The lack of sufficient support for cross-platform tool integration also makes it hard to address the stated challenges using existing paradigms that are based upon desktop and web-based solutions. The restricted ability of the organizations to have the desired alignment of tools with software engineering and development processes results in administrative and managerial overhead that incurs increased development cost and poor product quality. Moreover, stakeholders involved in the projects have specific constraints regarding availability and deployments of the tools. This thesis leverages the cloud-computing paradigm for addressing the above-mentioned issues by providing a framework to select appropriate tools as well as associated services, and a reference architecture of the cloud-enabled middleware platform that allows on-demand provisioning of software engineering Tools as a Service (TaaS) with focus…

  2. SaTool - a Software Tool for Structural Analysis of Complex Automation Systems

    DEFF Research Database (Denmark)

    Blanke, Mogens; Lorentzen, Torsten

    2006-01-01

    The paper introduces SaTool, a tool for structural analysis; the use of its Matlab®-based implementation is presented and special features are introduced, which were motivated by industrial users. Salient features of the tool are presented, including the ability to specify the behavior of a complex…

  3. MyETL: A Java Software Tool to Extract, Transform, and Load Your Business

    OpenAIRE

    Michele Nuovo

    2015-01-01

    The project follows the development of a Java software tool that extracts data from Flat File (Fixed Length Record Type), CSV (Comma Separated Values), and XLS (Microsoft Excel 97-2003 Worksheet file) sources, applies transformations to those sources, and finally loads the data into the end-target RDBMS. The software implements a process known as ETL (Extract, Transform and Load); such systems are called ETL systems.

  4. MyETL: A Java Software Tool to Extract, Transform, and Load Your Business

    Directory of Open Access Journals (Sweden)

    Michele Nuovo

    2015-12-01

    Full Text Available The project follows the development of a Java software tool that extracts data from Flat File (Fixed Length Record Type), CSV (Comma Separated Values), and XLS (Microsoft Excel 97-2003 Worksheet file) sources, applies transformations to those sources, and finally loads the data into the end-target RDBMS. The software implements a process known as ETL (Extract, Transform and Load); such systems are called ETL systems.
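
    MyETL itself is Java; the extract-transform-load pattern it implements is sketched below in Python, with an in-memory CSV source and SQLite as the end-target RDBMS, purely for illustration.

        import csv, io, sqlite3

        SOURCE = io.StringIO("id,amount\n1,10.5\n2,3.2\n")   # stand-in CSV source

        db = sqlite3.connect(":memory:")
        db.execute("CREATE TABLE sales (id INTEGER, amount_cents INTEGER)")

        for row in csv.DictReader(SOURCE):                   # extract
            cents = round(float(row["amount"]) * 100)        # transform
            db.execute("INSERT INTO sales VALUES (?, ?)",
                       (int(row["id"]), cents))              # load

        db.commit()
        print(db.execute("SELECT * FROM sales").fetchall())  # [(1, 1050), (2, 320)]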

  5. Reviews of Selected System and Software Tools for Strategic Defense Applications

    Science.gov (United States)

    1990-02-01

    Products and vendors covered include Software Through Pictures (Interactive Development Environments, Inc.) and Teamwork (Cadre Technologies, Inc.). Software Through Pictures is a dataflow-oriented software engineering tool, developed by Interactive Development Environments, San Francisco, CA [IDE 88]. This review concerns the…

  6. A Software Tool for Optimal Sizing of PV Systems in Malaysia

    OpenAIRE

    Tamer Khatib; Azah Mohamed; K. Sopian

    2012-01-01

    This paper presents a MATLAB-based, user-friendly software tool called PV.MY for optimal sizing of photovoltaic (PV) systems. The software has the capability of predicting meteorological variables such as solar energy, ambient temperature and wind speed using an artificial neural network (ANN); it optimizes the PV module/array tilt angle, optimizes the inverter size, and calculates optimal capacities of the PV array, battery, wind turbine and diesel generator in hybrid PV systems. The ANN-based mo…

  7. A System Identification Software Tool for General MISO ARX-Type of Model Structures

    OpenAIRE

    Lindskog, Peter

    1996-01-01

    The typical system identification procedure requires powerful and versatile software means. In this paper we describe and exemplify the use of a prototype identification software tool, applicable to the rather broad class of multi-input single-output model structures with regressors that are formed by delayed in- and outputs. Interesting special instances of this model structure category include, e.g., linear ARX and many semi-physical structures, feed-forward neural networks, radial basis fu…

  8. Data Mining for Secure Software Engineering – Source Code Management Tool Case Study

    Directory of Open Access Journals (Sweden)

    A.V.Krishna Prasad,

    2010-07-01

    Full Text Available As data mining for secure software engineering improves software productivity and quality, software engineers are increasingly applying data mining algorithms to various software engineering tasks. However, mining software engineering data poses several challenges, requiring various algorithms to effectively mine sequences, graphs and text from such data. Software engineering data include code bases, execution traces, historical code changes, mailing lists and bug databases. They contain a wealth of information about a project's status, progress and evolution. Using well-established data mining techniques, practitioners and researchers can explore the potential of this valuable data in order to better manage their projects and produce higher-quality software systems that are delivered on time and within budget. Data mining can be used in gathering and extracting latent security requirements, extracting algorithms and business rules from code, mining legacy applications for requirements and business rules for new projects, etc. Mining algorithms for software engineering fall into four main categories: frequent pattern mining – finding commonly occurring patterns; pattern matching – finding data instances for given patterns; clustering – grouping data into clusters; and classification – predicting labels of data based on already labeled data. In this paper, we discuss an overview of strategies for data mining for secure software engineering, with the implementation of a case study of text mining for a source code management tool.

  9. MilQuant: a free, generic software tool for isobaric tagging-based quantitation.

    Science.gov (United States)

    Zou, Xiao; Zhao, Minzhi; Shen, Hongyan; Zhao, Xuyang; Tong, Yuanpeng; Wang, Qingsong; Wei, Shicheng; Ji, Jianguo

    2012-09-18

    Isobaric tagging techniques such as iTRAQ and TMT are widely used in quantitative proteomics and are especially useful for samples that demand in vitro labeling. Due to diversity in choices of MS acquisition approaches, identification algorithms, and relative abundance deduction strategies, researchers are faced with a plethora of possibilities when it comes to data analysis. However, the lack of a generic and flexible software tool often makes it cumbersome for researchers to perform the analysis entirely as desired. In this paper, we present MilQuant, an mzXML-based isobaric labeling quantitator, a pipeline of freely available programs that supports native acquisition files produced by all mass spectrometer types and collection approaches currently used in isobaric tagging based MS data collection. Moreover, aside from effective normalization and abundance ratio deduction algorithms, MilQuant exports various intermediate results along each step of the pipeline, making it easy for researchers to customize the analysis. The functionality of MilQuant was demonstrated by four distinct datasets from different laboratories. The compatibility and extendibility of MilQuant make it a generic and flexible tool that can serve as a full solution to data analysis of isobaric tagging-based quantitation.
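
    The last two stages of such a pipeline, normalization and abundance-ratio deduction, can be sketched in a few lines; the median normalization and three-channel layout below are illustrative choices, not MilQuant's fixed algorithm.

        import numpy as np

        reporters = np.array([[1000.,  980., 2100.],   # peptide 1, 3 channels
                              [ 500.,  510., 1040.],   # peptide 2
                              [ 800.,  790., 1650.]])  # peptide 3

        norm = reporters / np.median(reporters, axis=0)  # remove loading bias
        ratios = norm / norm[:, [0]]                     # ratios vs. channel 1
        print(np.round(ratios, 2))   # channel 3's ~2x loading bias is gone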

  10. Streamlining the Design-to-Build Transition with Build-Optimization Software Tools.

    Science.gov (United States)

    Oberortner, Ernst; Cheng, Jan-Fang; Hillson, Nathan J; Deutsch, Samuel

    2017-03-17

    Scaling up capabilities for the design, build, and test of synthetic biology constructs holds great promise for the development of new applications in fuels, chemical production, or cellular-behavior engineering. Construct design is an essential component in this process; however, not every designed DNA sequence can be readily manufactured, even using state-of-the-art DNA synthesis methods. Current biological computer-aided design and manufacture tools (bioCAD/CAM) do not adequately consider the limitations of DNA synthesis technologies when generating their outputs. Designed sequences that violate DNA synthesis constraints may require substantial sequence redesign or lead to price premiums and temporal delays, which adversely impact the efficiency of the DNA manufacturing process. We have developed a suite of build-optimization software tools (BOOST) to streamline the design-to-build transition in synthetic biology engineering workflows. BOOST incorporates knowledge of DNA synthesis success determinants into the design process to output ready-to-build sequences, preempting the need for sequence redesign. The BOOST web application is available at https://boost.jgi.doe.gov and its Application Program Interfaces (API) enable integration into automated, customized DNA design processes. The results presented herein highlight the effectiveness of BOOST in reducing DNA synthesis costs and timelines.
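
    Checks of the kind BOOST automates can be illustrated simply; the GC window and homopolymer limit below are assumed placeholders, since actual synthesis constraints vary by vendor and are encoded inside BOOST itself.

        import re

        def synthesis_issues(seq, gc_lo=0.25, gc_hi=0.65, max_run=8):
            issues = []
            gc = (seq.count("G") + seq.count("C")) / len(seq)
            if not gc_lo <= gc <= gc_hi:
                issues.append(f"GC content {gc:.2f} outside [{gc_lo}, {gc_hi}]")
            run = re.search("A{%d,}|C{%d,}|G{%d,}|T{%d,}" % ((max_run,) * 4), seq)
            if run:
                issues.append(f"homopolymer run of {len(run.group(0))} bases")
            return issues

        print(synthesis_issues("ATGC" * 10 + "A" * 12))  # flags the poly-A run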

  11. [Software CMAP TOOLS™ to build concept maps: an evaluation by nursing students].

    Science.gov (United States)

    Ferreira, Paula Barreto; Cohrs, Cibelli Rizzo; De Domenico, Edvane Birelo Lopes

    2012-08-01

    Concept mapping (CM) is a teaching strategy that can be used to solve clinical cases, but the maps are difficult to write. The objective of this study was to describe the challenges and contributions of the Cmap Tools® software in building concept maps to solve clinical cases. To do this, a descriptive and qualitative method was used with junior nursing students from the Federal University of São Paulo. The teaching strategy was applied and the data were collected using the focus group technique. The results showed that the software facilitates and guarantees the organization, visualization, and correlation of the data, but that there are initial difficulties related to handling its tools. In conclusion, the formatting and auto-formatting resources of Cmap Tools® facilitated the construction of concept maps; however, orientation strategies should be implemented for the initial stage of software utilization.

  12. A Reference Architecture for Providing Tools as a Service to Support Global Software Development

    DEFF Research Database (Denmark)

    Chauhan, Aufeef

    2014-01-01

    Global Software Development (GSD) teams encounter challenges that are associated with the distribution of software development activities across multiple geographic regions. The limited support for performing collaborative development and engineering activities and the lack of sufficient support for maintaining and resolving dependencies and traceability across heterogeneous tools are major challenges for GSD teams. The lack of sufficient support for cross-platform tool integration also makes it hard to address the stated challenges using existing paradigms that are based upon desktop and web-based solutions. The restricted ability of the organizations to have the desired alignment of tools with software engineering and development processes results in administrative and managerial overhead that incurs increased development cost and poor product quality. Moreover, stakeholders involved in the projects have specific constraints regarding availability and deployments of the tools…

  13. A software tool for design of process monitoring and analysis systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    A well-designed process monitoring and analysis system is necessary to consistently achieve any predefined end product quality. Systematic computer-aided methods and tools provide the means to design the necessary process monitoring and analysis systems and/or to validate any existing monitoring and analysis system. A software tool to achieve this has been developed. Two supporting tools developed for the design, a knowledge base (consisting of the process knowledge as well as the knowledge on measurement methods and tools) and a model library (consisting of the process operational models), have been extended rigorously and integrated with the user interface, which made the software more generic and applicable to a wide range of problems. The software for the design of a process monitoring and analysis system is presented and illustrated with a tablet manufacturing process example.

  14. The Use of ACOP Tools in Writing Control System Software

    CERN Document Server

    Deloose, I; Wu, H

    1997-01-01

    Several institutes are making increasing use of PCs in accelerator controls. In particular, Windows NT and/or Windows 95 is already, or is becoming, a supported platform at the client end in a variety of control systems. Notably, control systems at CERN/ISOLDE, DESY/HERA, KEK/PF-LINAC, Daresbury, ISA (Denmark), MSI (Sweden), and ESRF currently make use of Windows NT as a control system client. As all of these control systems are either object-oriented or object-based, there is considerable overlap in their functionality and required features. This point was realized at the PCaPAC '96 workshop, and gave rise to the ACOP work group, which stands for Accelerator Component Oriented Programming. The first fruit born of this group is the ACOP.OCX (OLE Control eXtension) ActiveX control. Microsoft™ ActiveX controls are the updated version of the former OLE (Object Linking and Embedding) control specification. The ACOP control has been designed in order to support the common functionality requirements of object-oriented…

  15. The social construction of educational technology through the use of authentic software tools

    Directory of Open Access Journals (Sweden)

    Allan Jones

    2011-12-01

    Full Text Available A major strand of science and technology studies in recent decades has related to the social construction of technology (SCOT) movement, whose adherents maintain that technological systems are determined just as much by social forces as by technological ones. Taking this SCOT notion as a starting point, and putting a focus on the user, this paper looks at some examples of the educational use of software tools that exploit the functionality of the software in ways far removed from the original design. Examples include the use of spreadsheets, graphics editors and audio editors, and online translation software. Connections are made between the social construction of technology and constructivist pedagogy, particularly in relation to authentic learning.

  16. Managing the Testing Process Practical Tools and Techniques for Managing Hardware and Software Testing

    CERN Document Server

    Black, Rex

    2011-01-01

    New edition of one of the most influential books on managing software and hardware testing. In this new edition of his top-selling book, Rex Black walks you through the steps necessary to manage rigorous testing programs of hardware and software. The preeminent expert in his field, Mr. Black draws upon years of experience as president of both the International and American Software Testing Qualifications Boards to offer this extensive resource of all the standards, methods, and tools you'll need. The book covers core testing concepts and thoroughly examines the best test management practices.

  17. Software tools for assisting the multisource imagery analyst

    Science.gov (United States)

    Privett, Grant J.; Harvey, Peter R. W.; Booth, David M.; Kent, Philip J.; Redding, Nick J.; Evans, Dean; Jones, K. L.

    2003-11-01

    Increasingly demanding military requirements and rapid technological advances are producing reconnaissance sensors with greater spatial, spectral and temporal resolution. This, with the benefits to be gained from deploying multiple sensors co-operatively, is resulting in a so-called data deluge, where recording systems, data links, and exploitation systems struggle to cope with the required imagery throughput. This paper focuses on the exploitation stage and, in particular, the provision of cueing aids for Imagery Analysts (IAs), who need to integrate a variety of sources in order to gain situational awareness. These sources may include multi-source imagery and intelligence feeds and various types of mapping and collateral data, to which the IAs must add their own expertise in military doctrine etc. This integration task is becoming increasingly difficult as the volume and diversity of the input increases. The first stage in many exploitation tasks is image registration, which facilitates change detection and many avenues of multi-source exploitation. Progress is reported on automating this task, on its current performance characteristics, its integration into a potentially operational system, and hence on its expected utility. We also report on the development of an evolutionary architecture, 'ICARUS', in which feature detectors (or cuers) are constructed incrementally using a genetic algorithm that evolves simple sub-structures before combining, and further evolving, them to form more comprehensive and robust detectors. This approach is shown to help overcome the complexity limit that prevents many machine-learning algorithms from scaling up to the real world.

  18. A Reference Architecture for Providing Tools as a Service to Support Global Software Development

    DEFF Research Database (Denmark)

    Chauhan, Aufeef

    2014-01-01

    Global Software Development (GSD) teams encounter challenges that are associated with the distribution of software development activities across multiple geographic regions. The limited support for performing collaborative development and engineering activities and the lack of sufficient support for maintaining and resolving dependencies and traceability across heterogeneous tools are major challenges for GSD teams. Stakeholders involved in the projects have specific constraints regarding availability and deployments of the tools, and the artifacts and data produced or consumed by the tools need to be governed according to the constraints and corresponding quality of service (QoS) parameters. In this paper, we present a research agenda to leverage the cloud-computing paradigm for addressing the above-mentioned issues by providing a framework to select appropriate tools as well as associated services, and a reference architecture of the cloud-enabled middleware platform that allows on-demand provisioning of software engineering Tools as a Service (TaaS) with focus…

  19. Towards early software reliability prediction for computer forensic tools (case study).

    Science.gov (United States)

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component-based system. It is used, for instance, to analyze the reliability of the state machines of real-time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete-time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
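
    The architecture-based model referred to here treats components as states of an absorbing Markov chain, so the tool's reliability becomes the probability of absorbing into the success state rather than the failure state; a numerical sketch with invented transition probabilities:

        import numpy as np

        # Transient states: components C1, C2; absorbing: success, failure.
        Q = np.array([[0.0, 0.95],       # C1 -> C2 with prob 0.95 (C1 reliability)
                      [0.0, 0.0 ]])
        R = np.array([[0.00, 0.05],      # C1 -> (success, failure)
                      [0.98, 0.02]])     # C2 -> (success, failure)

        N = np.linalg.inv(np.eye(2) - Q)             # fundamental matrix
        absorb = N @ R                               # absorption probabilities
        print(f"system reliability = {absorb[0, 0]:.4f}")  # start in C1 -> 0.9310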

  20. Activity Theory applied to Global Software Engineering: Theoretical Foundations and Implications for Tool Builders

    DEFF Research Database (Denmark)

    Tell, Paolo; Ali Babar, Muhammad

    2012-01-01

    Although a plethora of tools are available for Global Software Engineering (GSE) teams, it is increasingly being realized that the desktop metaphor underpinning the majority of these tools has several inherent limitations. We have proposed that Activity-Based Computing (ABC) can… for applying activity theory to GSE. We analyze and explain the fundamental concepts of activity theory, and how they can be applied by using examples of software architecture design and evaluation processes. We describe the kind of data model and architectural support required for applying activity theory…

  1. "Blogs" and "wikis" are valuable software tools for communication within research groups.

    Science.gov (United States)

    Sauer, Igor M; Bialek, Dominik; Efimova, Ekaterina; Schwartlander, Ruth; Pless, Gesine; Neuhaus, Peter

    2005-01-01

    Appropriate software tools may improve communication and ease access to knowledge for research groups. A weblog is a website which contains periodic, chronologically ordered posts on a common webpage, whereas a wiki is hypertext-based collaborative software that enables documents to be authored collectively using a web browser. Although not primarily intended for use as an intranet-based collaborative knowledge warehouse, both blogs and wikis have the potential to offer all the features of complex and expensive IT solutions. These tools enable team members to share knowledge simply and quickly: the collective knowledge base of the group can be efficiently managed and navigated.

  2. Review of free software tools for image analysis of fluorescence cell micrographs.

    Science.gov (United States)

    Wiesmann, V; Franz, D; Held, C; Münzenmayer, C; Palmisano, R; Wittenberg, T

    2015-01-01

    An increasing number of free software tools have been made available for the evaluation of fluorescence cell micrographs. The main users are biologists and related life scientists with no or little knowledge of image processing. In this review, we give an overview of available tools and guidelines about which tools the users should use to segment fluorescence micrographs. We selected 15 free tools and divided them into stand-alone, Matlab-based, ImageJ-based, free demo versions of commercial tools and data sharing tools. The review consists of two parts: First, we developed a criteria catalogue and rated the tools regarding structural requirements, functionality (flexibility, segmentation and image processing filters) and usability (documentation, data management, usability and visualization). Second, we performed an image processing case study with four representative fluorescence micrograph segmentation tasks with figure-ground and cell separation. The tools display a wide range of functionality and usability. In the image processing case study, we were able to perform figure-ground separation in all micrographs using mainly thresholding. Cell separation was not possible with most of the tools, because cell separation methods are provided only by a subset of the tools and are difficult to parametrize and to use. Most important is that the usability matches the functionality of a tool. To be usable, specialized tools with less functionality need to fulfill less usability criteria, whereas multipurpose tools need a well-structured menu and intuitive graphical user interface.
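
    Since figure-ground separation in the case study was achieved mainly by thresholding, a numpy-only Otsu threshold on a synthetic micrograph gives the flavor of what most of the reviewed tools automate (real tools add filtering and, where supported, cell separation):

        import numpy as np

        def otsu_threshold(img):
            hist, _ = np.histogram(img, bins=256, range=(0, 256))
            p = hist / hist.sum()
            omega = np.cumsum(p)                   # class probabilities
            mu = np.cumsum(p * np.arange(256))     # cumulative means
            with np.errstate(divide="ignore", invalid="ignore"):
                sigma_b = (mu[-1] * omega - mu) ** 2 / (omega * (1 - omega))
            return int(np.nanargmax(sigma_b))      # max between-class variance

        rng = np.random.default_rng(1)
        bg = rng.normal(50, 10, (64, 64))          # dark background
        fg = rng.normal(180, 15, (64, 64))         # bright "cells"
        img = np.clip(np.where(rng.random((64, 64)) < 0.2, fg, bg), 0, 255)
        t = otsu_threshold(img)
        print("threshold:", t, "fg fraction:", round(float((img > t).mean()), 3))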

  3. Contingency Contractor Optimization Phase 3 Sustainment Software Design Document - Contingency Contractor Optimization Tool - Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa; Jones, Katherine A

    2016-05-01

    This document describes the final software design of the Contingency Contractor Optimization Tool - Prototype. Its purpose is to provide the overall architecture of the software and the logic behind this architecture. Documentation for the individual classes is provided in the application Javadoc. The Contingency Contractor Optimization project is intended to address Department of Defense mandates by delivering a centralized strategic planning tool that allows senior decision makers to quickly and accurately assess the impacts, risks, and mitigation strategies associated with utilizing contract support. The Contingency Contractor Optimization Tool - Prototype was developed in Phase 3 of the OSD ATL Contingency Contractor Optimization project to support strategic planning for contingency contractors. The planning tool uses a model to optimize the Total Force mix by minimizing the combined total costs for selected mission scenarios. The model optimizes the match of personnel types (military, DoD civilian, and contractors) and capabilities to meet mission requirements as effectively as possible, based on risk, cost, and other requirements.
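
    The tool's actual model and data are not public in this abstract; as a hedged sketch of the kind of Total Force mix optimization it describes, the following Python fragment solves a minimal, entirely hypothetical linear program (all costs, requirements, and the 40% contractor cap are invented for illustration).

        from scipy.optimize import linprog

        # Decision variables: headcount of military, DoD civilian, contractor personnel.
        costs = [5.0, 4.0, 3.5]  # hypothetical cost per person (arbitrary units)

        # Hypothetical constraints: cover 100 mission units in total, and let
        # contractors supply at most 40% of the mix (a stand-in risk constraint).
        A_ub = [[-1.0, -1.0, -1.0],   # m + c + k >= 100
                [-0.4, -0.4, 0.6]]    # k <= 0.4 * (m + c + k)
        b_ub = [-100.0, 0.0]

        res = linprog(costs, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(0, None)] * 3, method="highs")
        military, civilian, contractor = res.x
        print(f"optimal mix: military={military:.1f}, civilian={civilian:.1f}, "
              f"contractor={contractor:.1f}, total cost={res.fun:.1f}")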

  4. A case study on the comparison of different software tools for automated quantification of peptides.

    Science.gov (United States)

    Colaert, Niklaas; Vandekerckhove, Joël; Martens, Lennart; Gevaert, Kris

    2011-01-01

    MS-driven proteomics has evolved over the past two decades into a high-tech, high-impact research field. Two distinct factors clearly influenced its expansion: the rapid growth of an arsenal of instruments and proteomic techniques, which led to an explosion of high-quality data, and the development of software tools to analyze and interpret these data, which boosted the number of scientific discoveries. In analogy with the benchmarking of new instruments and proteomic techniques, such software tools must be thoroughly tested and analyzed. Recently, new tools were developed for automatic peptide quantification in quantitative proteomic experiments. Here we present a case study in which the most recent and frequently used tools are analyzed and compared.

  5. An Open-Source Tool Set Enabling Analog-Digital-Software Co-Design

    Directory of Open Access Journals (Sweden)

    Michelle Collins

    2016-02-01

    Full Text Available This paper presents an analog-digital hardware-software co-design environment for simulating and programming reconfigurable systems. The tool simulates and designs, as well as enables experimental measurements after compiling to configurable systems, in the same integrated design tool framework. High-level software in Scilab/Xcos (open-source programs similar to MATLAB/Simulink) converts the user's high-level block description to blif format (sci2blif), which acts as an input to the modified VPR tool, including the code vpr2swcs, encoding the specific platform through specific architecture files and resulting in a targetable switch list on the resulting configurable analog-digital system. The resulting tool uses an analog and mixed-signal library of components, enabling users and future researchers access to the basic analog operations/computations that are possible.

  6. Review of Ground Systems Development and Operations (GSDO) Tools for Verifying Command and Control Software

    Science.gov (United States)

    Aguilar, Michael L.; Bonanne, Kevin H.; Favretto, Jeffrey A.; Jackson, Maddalena M.; Jones, Stephanie L.; Mackey, Ryan M.; Sarrel, Marc A.; Simpson, Kimberly A.

    2014-01-01

    The Exploration Systems Development (ESD) Standing Review Board (SRB) requested that the NASA Engineering and Safety Center (NESC) conduct an independent review of the plan developed by Ground Systems Development and Operations (GSDO) for identifying models and emulators to create a tool(s) to verify their command and control software. The NESC was requested to identify any issues or weaknesses in the GSDO plan. This document contains the outcome of the NESC review.

  7. A multicenter study benchmarks software tools for label-free proteome quantification.

    Science.gov (United States)

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.

  8. Development and assessment of a digital X-ray software tool to determine vertebral rotation in adolescent idiopathic scoliosis.

    Science.gov (United States)

    Eijgenraam, Susanne M; Boselie, Toon F M; Sieben, Judith M; Bastiaenen, Caroline H G; Willems, Paul C; Arts, Jacobus J; Lataster, Arno

    2017-02-01

    The amount of vertebral rotation in the axial plane is of key importance in the prognosis and treatment of adolescent idiopathic scoliosis (AIS). Current methods to determine vertebral rotation are either designed for use with analogue plain radiographs, and thus not useful for digital images, or lack measurement precision and are therefore less suitable for the follow-up of rotation in AIS patients. This study aimed to develop a digital X-ray software tool with high measurement precision to determine vertebral rotation in AIS, and to assess its (concurrent) validity and reliability. In this study a combination of basic science and reliability methodology applied in both laboratory and clinical settings was used. Software was developed using the algorithm of the Perdriolle torsion meter for analogue AP plain radiographs of the spine. The software was then assessed for (1) concurrent validity and (2) intra- and interobserver reliability. Plain radiographs of both human cadaver vertebrae and outpatient AIS patients were used. Concurrent validity was measured by two independent observers, both experienced in the assessment of plain radiographs. Reliability measurements were performed by three independent spine surgeons. Pearson correlation of the software compared with the analogue Perdriolle torsion meter was 0.98 for mid-thoracic vertebrae, 0.97 for low-thoracic vertebrae and 0.97 for lumbar vertebrae. Measurement exactness of the software was within 5° in 62% of cases and within 10° in 97% of cases. The intraclass correlation coefficient (ICC) for inter-observer reliability was 0.92 (0.91-0.95); the ICC for intra-observer reliability was 0.96 (0.94-0.97). We developed a digital X-ray software tool to determine vertebral rotation in AIS with substantial concurrent validity and reliability, which may be useful for the follow-up of vertebral rotation in AIS patients.

  9. Programming heterogeneous MPSoCs tool flows to close the software productivity gap

    CERN Document Server

    Castrillón Mazo, Jerónimo

    2014-01-01

    This book provides embedded software developers with techniques for programming heterogeneous Multi-Processor Systems-on-Chip (MPSoCs), capable of executing multiple applications simultaneously. It describes a set of algorithms and methodologies to narrow the software productivity gap, as well as an in-depth description of the underlying problems and challenges of today’s programming practices. The authors present four different tool flows: a parallelism extraction flow for applications written using the C programming language, a mapping and scheduling flow for parallel applications, a special mapping flow for baseband applications in the context of Software Defined Radio (SDR) and a final flow for analyzing multiple applications at design time. The tool flows are evaluated on Virtual Platforms (VPs), which mimic different characteristics of state-of-the-art heterogeneous MPSoCs.   • Provides a novel set of algorithms and methodologies for programming heterogeneous Multi-Processor Systems-on-Chip (MPSoCs)...

  10. SDMdata: A Web-Based Software Tool for Collecting Species Occurrence Records.

    Directory of Open Access Journals (Sweden)

    Xiaoquan Kong

    Full Text Available It is important to easily and efficiently obtain high-quality species distribution data for predicting the potential distribution of species using species distribution models (SDMs). There is a need for a powerful software tool to automatically or semi-automatically assist in identifying and correcting errors. Here, we use Python to develop a web-based software tool (SDMdata) to easily collect occurrence data from the Global Biodiversity Information Facility (GBIF) and check species names and the accuracy of coordinates (latitude and longitude). It is open-source software (GNU Affero General Public License/AGPL licensed), allowing anyone to access and manipulate the source code. SDMdata is available online free of charge.
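
    SDMdata's own source is not reproduced in this record; as a hedged sketch of the workflow it automates, the following Python fragment fetches occurrence records from GBIF with the pygbif package and applies two simple coordinate sanity checks (the species name and the checks themselves are assumptions for illustration).

        from pygbif import occurrences as occ

        # Fetch up to 100 georeferenced records for an example species.
        resp = occ.search(scientificName="Puma concolor", hasCoordinate=True, limit=100)

        for rec in resp.get("results", []):
            lat = rec.get("decimalLatitude")
            lon = rec.get("decimalLongitude")
            name = rec.get("scientificName", "?")
            # Minimal checks: coordinate range and the common (0, 0) artifact.
            ok = (lat is not None and lon is not None
                  and -90 <= lat <= 90 and -180 <= lon <= 180
                  and not (lat == 0 and lon == 0))
            print(f"{name}: ({lat}, {lon}) {'ok' if ok else 'SUSPECT'}")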

  11. Astrophysics datamining in the classroom: Exploring real data with new software tools and robotic telescopes

    CERN Document Server

    Doran, Rosa; Boudier, Thomas; Delva, Pacôme; Ferlet, Roger; Almeida, Maria L T; Barbosa, Domingos; Gomez, Edward; Pennypacker, Carl; Roche, Paul; Roberts, Sarah

    2012-01-01

    Within the efforts to bring frontline interactive astrophysics and astronomy to the classroom, the Hands-On Universe (HOU) project developed a set of exercises and a platform using real data obtained by some of the most advanced ground and space observatories. The backbone of this endeavour is a new free software Web tool, Such a Lovely Software for Astronomy based on ImageJ (SalsaJ). It is student-friendly, developed specifically for the HOU project, and targets middle and high schools. It allows students to display, analyze, and explore professionally obtained astronomical images while learning concepts of gravitational dynamics, kinematics, nuclear fusion and electromagnetism. The continuously evolving set of exercises is being completed with real (professionally obtained) data to download and detailed tutorials. The flexibility of the SalsaJ platform enables students and teachers to extend the exercises with their own observations. The software developed for the HOU program has been designed to...

  12. Using a Self-Administered Visual Basic Software Tool To Teach Psychological Concepts.

    Science.gov (United States)

    Strang, Harold R.; Sullivan, Amie K.; Schoeny, Zahrl G.

    2002-01-01

    Introduces LearningLinks, a Visual Basic software tool that allows teachers to create individualized learning modules that use constructivist and behavioral learning principles. Describes field testing of undergraduates at the University of Virginia that tested a module designed to improve understanding of the psychological concepts of…

  13. Wiki as a Corporate Learning Tool: Case Study for Software Development Company

    Science.gov (United States)

    Milovanovic, Milos; Minovic, Miroslav; Stavljanin, Velimir; Savkovic, Marko; Starcevic, Dusan

    2012-01-01

    In our study, we attempted to further investigate how Web 2.0 technologies influence workplace learning. Our particular interest was on using Wiki as a tool for corporate exchange of knowledge with the focus on informal learning. In this study, we collaborated with a multinational software development company that uses Wiki as a corporate tool…

  14. The use of software tools and autonomous bots against vandalism : eroding Wikipedia's moral order?

    NARCIS (Netherlands)

    de Laat, Paul B.

    English-language Wikipedia is constantly being plagued by vandalistic contributions on a massive scale. In order to fight them its volunteer contributors deploy an array of software tools and autonomous bots. After an analysis of their functioning and the 'coactivity' in use between humans and bots,

  15. Experience with case tools in the design of process-oriented software

    CERN Document Server

    Novakov, O

    1993-01-01

    In accelerator systems such as the CERN PS complex, process equipment has a lifetime which may exceed the typical life cycle of its related software. Taking into account the variety of such equipment, it is important to keep the analysis and design of the software in a system-independent form. This paper discusses the experience gathered in using commercial CASE tools for analysis, design and reverse engineering of different process-oriented software modules, with a principal emphasis on maintaining the initial analysis in a standardized form. Such tools have been in existence for several years, but this paper shows that they are not entirely adapted to our needs. In particular, the paper stresses the problems of integrating such a tool in an existing database-dependent development chain, the lack of real-time simulation tools and of object-oriented concepts in existing commercial packages. Finally, the paper attempts to give a broader view of software engineering needs in our particular context.

  16. Software Construction and Composition Tools for Petascale Computing SCW0837 Progress Report

    Energy Technology Data Exchange (ETDEWEB)

    Epperly, T W; Hochstein, L

    2011-09-12

    The majority of scientific software is distributed as source code. As the number of library dependencies and supported platforms increases, so does the complexity of describing the rules for configuring and building software. In this project, we have performed an empirical study of the magnitude of the build problem by examining the development history of two DOE-funded scientific software projects. We have developed MixDown, a meta-build tool, to simplify the task of building applications that depend on multiple third-party libraries. The results of this research indicate that the effort scientific programmers spend on configuring and building software accounts for a significant fraction of the total development effort, and that the use of MixDown can significantly simplify the task of building software with multiple dependencies.

  17. Merits and Pitfalls of Currently Used Diagnostic Tools in Mycetoma

    NARCIS (Netherlands)

    W.W.J. van de Sande (Wendy); A.H. Fahal (Ahmed); H. Goodfellow (Henry); E.S. Mahgoub (El Sheikh); O. Welsh (Oliverio); E. Zijlstra (Ed)

    2014-01-01

    Treatment of mycetoma depends on the causative organism and since many organisms, both actinomycetes (actinomycetoma) and fungi (eumycetoma), are capable of producing mycetoma, an accurate diagnosis is crucial. Currently, multiple diagnostic tools are used to determine the extent of infections and to identify the causative agents of mycetoma.

  18. Assess/Mitigate Risk through the Use of Computer-Aided Software Engineering (CASE) Tools

    Science.gov (United States)

    Aguilar, Michael L.

    2013-01-01

    The NASA Engineering and Safety Center (NESC) was requested to perform an independent assessment of the mitigation of the Constellation Program (CxP) Risk 4421 through the use of computer-aided software engineering (CASE) tools. With the cancellation of the CxP, the assessment goals were modified to capture lessons learned and best practices in the use of CASE tools. The assessment goal was to prepare the next program for the use of these CASE tools. The outcome of the assessment is contained in this document.

  19. Life Cycle Assessment Studies of Chemical and Biochemical Processes through the new LCSoft Software-tool

    DEFF Research Database (Denmark)

    Supawanich, Perapong; Malakul, Pomthong; Gani, Rafiqul

    2015-01-01

    requirements have to be evaluated together with environmental and economic aspects. The LCSoft software-tool has been developed to perform LCA as a stand-alone tool as well as integrated with other process design tools such as process simulation, economic analysis (ECON), and sustainable process design...... (SustainPro). An extended version of LCSoft is presented in this paper. The development work consists of four main tasks. The first task consists of the Life Cycle Inventory (LCI) calculation function. The second task deals with the extension of the Life Cycle Inventory database and improvement of the Life...

  1. New software tools for enhanced precision in robot-assisted laser phonomicrosurgery.

    Science.gov (United States)

    Dagnino, Giulio; Mattos, Leonardo S; Caldwell, Darwin G

    2012-01-01

    This paper describes a new software package created to enhance precision during robot-assisted laser phonomicrosurgery procedures. The new software is composed of three tools for camera calibration, automatic tumor segmentation, and laser tracking. These were designed and developed to improve the outcome of this demanding microsurgical technique, and were tested herein to produce quantitative performance data. The experimental setup was based on the motorized laser micromanipulator created by Istituto Italiano di Tecnologia and the experimental protocols followed are fully described in this paper. The results show the new tools are robust and effective: The camera calibration tool reduced residual errors (RMSE) to 0.009 ± 0.002 mm under 40× microscope magnification; the automatic tumor segmentation tool resulted in deep lesion segmentations comparable to manual segmentations (RMSE = 0.160 ± 0.028 mm under 40× magnification); and the laser tracker tool proved to be reliable even during cutting procedures (RMSE = 0.073 ± 0.023 mm under 40× magnification). These results demonstrate the new software package can provide excellent improvements to the previous microsurgical system, leading to important enhancements in surgical outcome.

  2. S2O - A software tool for integrating research data from general purpose statistic software into electronic data capture systems.

    Science.gov (United States)

    Bruland, Philipp; Dugas, Martin

    2017-01-07

    Data capture for clinical registries or pilot studies is often performed in spreadsheet-based applications like Microsoft Excel or IBM SPSS. Usually, data is transferred into statistics software, such as SAS, R or IBM SPSS Statistics, for analysis afterwards. Spreadsheet-based solutions suffer from several drawbacks: it is generally not possible to ensure sufficient rights and role management, and it is not traced who has changed data, when, and why. Therefore, such systems are not able to comply with regulatory requirements for electronic data capture in clinical trials. In contrast, Electronic Data Capture (EDC) software enables a reliable, secure and auditable collection of data. In this regard, most EDC vendors support the CDISC ODM standard to define, communicate and archive clinical trial meta- and patient data. Advantages of EDC systems are support for multi-user and multicenter clinical trials as well as auditable data. Migration from spreadsheet-based data collection to EDC systems is labor-intensive and time-consuming at present. Hence, the objectives of this research work are to develop a mapping model and implement a converter between the IBM SPSS and CDISC ODM standards and to evaluate this approach regarding syntactic and semantic correctness. A mapping model between IBM SPSS and CDISC ODM data structures was developed. SPSS variables and patient values can be mapped and converted into ODM. Statistical and display attributes from SPSS do not correspond to any ODM elements; study-related ODM elements are not available in SPSS. The S2O converting tool was implemented as a command-line tool using the SPSS-internal Java plugin. Syntactic and semantic correctness was validated with different ODM tools and by reverse transformation from ODM into SPSS format. Clinical data values were also successfully transformed into the ODM structure. Transformation between the spreadsheet format IBM SPSS and the ODM standard for definition and exchange of trial data is feasible.
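
    S2O itself is implemented as a command-line tool using the SPSS-internal Java plugin, so the following stand-alone Python sketch is only a hedged illustration of the underlying mapping idea: read SPSS variable metadata (here with the pyreadstat package) and emit a drastically simplified, ODM-like XML skeleton. The file names, OIDs and the reduced ODM structure are assumptions.

        import xml.etree.ElementTree as ET
        import pyreadstat

        # Read an SPSS file; 'meta' carries variable names and labels.
        df, meta = pyreadstat.read_sav("study_data.sav")  # hypothetical input file

        odm = ET.Element("ODM", FileOID="S2O.Example", ODMVersion="1.3.2")
        study = ET.SubElement(odm, "Study", OID="ST.1")
        mdv = ET.SubElement(study, "MetaDataVersion", OID="MDV.1",
                            Name="Converted from SPSS")

        # Map each SPSS variable to an ODM ItemDef (greatly simplified).
        for name, lbl in zip(meta.column_names, meta.column_labels):
            ET.SubElement(mdv, "ItemDef", OID=f"I.{name}", Name=lbl or name,
                          DataType="text")

        ET.ElementTree(odm).write("study_metadata.xml", encoding="utf-8",
                                  xml_declaration=True)
        print(f"wrote {len(meta.column_names)} ItemDefs")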

  3. IPAT: a freely accessible software tool for analyzing multiple patent documents with inbuilt landscape visualizer.

    Science.gov (United States)

    Ajay, Dara; Gangwal, Rahul P; Sangamwar, Abhay T

    2015-01-01

    Intelligent Patent Analysis Tool (IPAT) is an online data retrieval tool, operated on a text mining algorithm, that extracts specific patent information in a predetermined pattern into an Excel sheet. The software is designed and developed to retrieve and analyze technology information from multiple patent documents and generate various patent landscape graphs and charts. The software is coded in C# in Visual Studio 2010; it extracts publicly available patent information from web pages like Google Patents and simultaneously studies various technology trends based on user-defined parameters. In other words, IPAT combined with manual categorization will act as an excellent technology assessment tool in competitive intelligence and due diligence for predicting the future R&D forecast.

  4. A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.

    Science.gov (United States)

    Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy

    2016-12-01

    Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements were identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.

  5. Pathway Tools version 19.0 update: software for pathway/genome informatics and systems biology.

    Science.gov (United States)

    Karp, Peter D; Latendresse, Mario; Paley, Suzanne M; Krummenacker, Markus; Ong, Quang D; Billington, Richard; Kothari, Anamika; Weaver, Daniel; Lee, Thomas; Subhraveti, Pallavi; Spaulding, Aaron; Fulcher, Carol; Keseler, Ingrid M; Caspi, Ron

    2016-09-01

    Pathway Tools is a bioinformatics software environment with a broad set of capabilities. The software provides genome-informatics tools such as a genome browser, sequence alignments, a genome-variant analyzer and comparative-genomics operations. It offers metabolic-informatics tools, such as metabolic reconstruction, quantitative metabolic modeling, prediction of reaction atom mappings and metabolic route search. Pathway Tools also provides regulatory-informatics tools, such as the ability to represent and visualize a wide range of regulatory interactions. This article outlines the advances in Pathway Tools in the past 5 years. Major additions include components for metabolic modeling, metabolic route search, computation of atom mappings and estimation of compound Gibbs free energies of formation; addition of editors for signaling pathways, for genome sequences and for cellular architecture; storage of gene essentiality data and phenotype data; display of multiple alignments, and of signaling and electron-transport pathways; and development of Python and web-services application programming interfaces. Scientists around the world have created more than 9800 Pathway/Genome Databases by using Pathway Tools, many of which are curated databases for important model organisms.

  6. UNBizPlanner: a software tool for preparing a business plan

    Directory of Open Access Journals (Sweden)

    Oscar Ávila Cifuentes

    2010-04-01

    Full Text Available Universities are currently expected to play a new role in society (in addition to research and teaching) by engaging in a third mission concerning socio-economic development. Universities also play an important role in encouraging entrepreneurs through training them in business planning. A business plan is a document summarising how an entrepreneur will create an organisation to exploit a business opportunity. Preparing a business plan draws on a wide range of knowledge from many business disciplines (e.g. finance, human resource management, intellectual property management, supply chain management, operations management and marketing). This article presents a computational tool for drawing up a business plan from a Colombian viewpoint by identifying the most relevant stages borne in mind by the national entities having the most experience in creating and consolidating companies. Special emphasis was placed on analysing, designing and implementing a systems development life cycle for developing the software. Reviewing the literature concerning business plans formed an important part of the analysis stage (bearing a Colombian viewpoint in mind).

  7. CancellationTools: All-in-one software for administration and analysis of cancellation tasks.

    Science.gov (United States)

    Dalmaijer, Edwin S; Van der Stigchel, Stefan; Nijboer, Tanja C W; Cornelissen, Tim H W; Husain, Masud

    2015-12-01

    In a cancellation task, a participant is required to search for and cross out ("cancel") targets, which are usually embedded among distractor stimuli. The number of cancelled targets and their location can be used to diagnose the neglect syndrome after stroke. In addition, the organization of search provides a potentially useful way to measure executive control over multitarget search. Although many useful cancellation measures have been introduced, most fail to make their way into research studies and clinical practice due to the practical difficulty of acquiring such parameters from traditional pen-and-paper measures. Here we present new, open-source software that is freely available to all. It allows researchers and clinicians to flexibly administer computerized cancellation tasks using stimuli of their choice, and to directly analyze the data in a convenient manner. The automated analysis suite provides output that includes almost all of the currently existing measures, as well as several new ones introduced here. All tasks can be performed using either a computer mouse or a touchscreen as an input device, and an online version of the task runtime is available for tablet devices. A summary of the results is produced in a single A4-sized PDF document, including high quality data visualizations. For research purposes, batch analysis of large datasets is possible. In sum, CancellationTools allows users to employ a flexible, computerized cancellation task, which provides extensive benefits and ease of use.

  8. SU-E-T-27: A Tool for Routine Quality Assurance of Radiotherapy Dose Calculation Software

    Energy Technology Data Exchange (ETDEWEB)

    Popple, R; Cardan, R; Duan, J; Wu, X; Shen, S; Brezovich, I [The University of Alabama at Birmingham, Birmingham, AL (United States)

    2014-06-01

    Purpose: Dose calculation software is thoroughly evaluated when it is commissioned; however, evaluation of periodic software updates is typically limited in scope due to staffing constraints and the need to quickly return the treatment planning system to clinical service. We developed a tool for quickly and comprehensively testing and documenting dose calculation software against measured data. Methods: A tool was developed using MatLab (The MathWorks, Natick, MA) for evaluation of dose calculation algorithms against measured data. Inputs to the tool are measured data, reference DICOM RT PLAN files describing the measurements, and dose calculations in DICOM format. The tool consists of a collection of extensible modules that can perform analysis of point dose, depth dose curves, and profiles using dose difference, distance-to-agreement, and the gamma-index. Each module generates a report subsection that is incorporated into a master template, which is converted to final form in portable document format (PDF). Results: After each change to the treatment planning system, a report can be generated in approximately 90 minutes. The tool has been in use for more than 5 years, spanning 5 versions of the eMC and 4 versions of the AAA. We have detected changes to the algorithms that affected clinical practice once during this period. Conclusion: Our tool provides an efficient method for quality assurance of dose calculation software, providing a complete set of tests for an update. Future work includes the addition of plan level tests, allowing incorporation of, for example, the TG-119 test suite for IMRT, and integration with the treatment planning system via an application programming interface. Integration with the planning system will permit fully-automated testing and reporting at scheduled intervals.
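
    The tool itself is MatLab-based and not shown here; as a hedged illustration of one of the comparison metrics it applies, the following Python sketch computes a simple one-dimensional gamma index between a measured and a calculated depth-dose curve (the criteria, grids and curves are invented for the example).

        import numpy as np

        def gamma_1d(x_ref, d_ref, x_eval, d_eval, dta_mm=3.0, dd_frac=0.03):
            """Global 1D gamma index of each reference point vs. an evaluated curve."""
            d_norm = dd_frac * d_ref.max()  # dose criterion, global normalization
            gammas = np.empty_like(d_ref)
            for i, (x, d) in enumerate(zip(x_ref, d_ref)):
                dist2 = ((x_eval - x) / dta_mm) ** 2
                dose2 = ((d_eval - d) / d_norm) ** 2
                gammas[i] = np.sqrt(np.min(dist2 + dose2))
            return gammas

        # Invented example: measured vs. slightly shifted and scaled calculation.
        x = np.linspace(0, 100, 201)  # depth in mm
        measured = np.exp(-x / 80.0)
        calculated = 1.01 * np.exp(-(x - 0.5) / 80.0)

        g = gamma_1d(x, measured, x, calculated)
        print(f"gamma pass rate (gamma <= 1): {np.mean(g <= 1.0):.1%}")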

  9. Development and evaluation of an open source software tool for deidentification of pathology reports

    Directory of Open Access Journals (Sweden)

    Mahaadevan Rajeshwarri

    2006-03-01

    Full Text Available Abstract Background Electronic medical records, including pathology reports, are often used for research purposes. Currently, there are few programs freely available to remove identifiers while leaving the remainder of the pathology report text intact. Our goal was to produce an open source, Health Insurance Portability and Accountability Act (HIPAA) compliant deidentification tool tailored for pathology reports. We designed a three-step process for removing potential identifiers. The first step is to look for identifiers known to be associated with the patient, such as name, medical record number, pathology accession number, etc. Next, a series of pattern matches looks for predictable patterns likely to represent identifying data, such as dates, accession numbers and addresses, as well as patient, institution and physician names. Finally, individual words are compared with a database of proper names and geographic locations. Pathology reports from three institutions were used to design and test the algorithms. The software was improved iteratively on training sets until it exhibited good performance. 1800 new pathology reports were then processed. Each report was reviewed manually before and after deidentification to catalog all identifiers and note those that were not removed. Results 1254 (69.7%) of 1800 pathology reports contained identifiers in the body of the report. 3439 (98.3%) of 3499 unique identifiers in the test set were removed. Only 19 HIPAA-specified identifiers (mainly consult accession numbers and misspelled names) were missed. Of 41 non-HIPAA identifiers missed, the majority were partial institutional addresses and ages. Outside consultation case reports typically contain numerous identifiers and were the most challenging to deidentify comprehensively. There was variation in performance among reports from the three institutions, highlighting the need for site-specific customization, which is easily accomplished with our tool.
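
    As a hedged sketch of the second step described above (pattern matching; this is not the authors' code), the following Python fragment replaces date-like and accession-number-like patterns with placeholder tags. The regular expressions are simplistic assumptions and would miss many real-world variants.

        import re

        # Simplistic illustrative patterns; a production tool needs far more coverage.
        PATTERNS = [
            (re.compile(r"\b\d{1,2}[/-]\d{1,2}[/-]\d{2,4}\b"), "[DATE]"),
            (re.compile(r"\b[A-Z]{1,3}\d{2}-\d{3,6}\b"), "[ACCESSION]"),
            (re.compile(r"\bMRN[:\s]*\d{5,10}\b", re.IGNORECASE), "[MRN]"),
        ]

        def deidentify(text: str) -> str:
            for pattern, tag in PATTERNS:
                text = pattern.sub(tag, text)
            return text

        report = "Specimen S14-12345 received 03/12/2014 for MRN: 0042317."
        print(deidentify(report))
        # -> "Specimen [ACCESSION] received [DATE] for [MRN]."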

  10. Cerec Smile Design--a software tool for the enhancement of restorations in the esthetic zone.

    Science.gov (United States)

    Kurbad, Andreas; Kurbad, Susanne

    2013-01-01

    Restorations in the esthetic zone can now be enhanced using software tools. In addition to the design of the restoration itself, a part or all of the patient's face can be displayed on the monitor to increase the predictability of treatment results. Using the Smile Design components of the Cerec and inLab software, a digital photograph of the patient can be projected onto a three-dimensional dummy head. In addition to its use for the enhancement of the CAD process, this technology can also be utilized for marketing purposes.

  11. A Tool for Testing of Inheritance Related Bugs in Object Oriented Software

    Directory of Open Access Journals (Sweden)

    B. G. Geetha

    2008-01-01

    Full Text Available Object-oriented software development differs from traditional software development. In object-oriented software, polymorphism, inheritance and dynamic binding are the important features, with inheritance being the central one. Compilers usually detect only syntax-oriented errors, so property-related errors may remain in the product. Data-flow testing is an appropriate method for testing program features: it analyzes the structure of the software and traces the flow of a property. This study is designed to detect hidden errors related to the inheritance property. Inputs to the tool are sets of classes and packages; outputs are the hierarchies of classes, methods and attributes, together with a set of inheritance-related bugs, such as naked access and spaghetti inheritance, which are detected automatically by the tool. The tool is developed as three major modules: code analysis, knowledge base preparation and bug analysis. The code analysis module parses and extracts details from the code, the knowledge base preparation module builds a knowledge base of the program details, and the bug analysis module extracts bug-related information from that database. The approach is a form of static testing, and the study focuses on Java programs.

  12. Pathway Tools version 13.0: integrated software for pathway/genome informatics and systems biology.

    Science.gov (United States)

    Karp, Peter D; Paley, Suzanne M; Krummenacker, Markus; Latendresse, Mario; Dale, Joseph M; Lee, Thomas J; Kaipa, Pallavi; Gilham, Fred; Spaulding, Aaron; Popescu, Liviu; Altman, Tomer; Paulsen, Ian; Keseler, Ingrid M; Caspi, Ron

    2010-01-01

    Pathway Tools is a production-quality software environment for creating a type of model-organism database called a Pathway/Genome Database (PGDB). A PGDB such as EcoCyc integrates the evolving understanding of the genes, proteins, metabolic network and regulatory network of an organism. This article provides an overview of Pathway Tools capabilities. The software performs multiple computational inferences including prediction of metabolic pathways, prediction of metabolic pathway hole fillers and prediction of operons. It enables interactive editing of PGDBs by DB curators. It supports web publishing of PGDBs, and provides a large number of query and visualization tools. The software also supports comparative analyses of PGDBs, and provides several systems biology analyses of PGDBs including reachability analysis of metabolic networks, and interactive tracing of metabolites through a metabolic network. More than 800 PGDBs have been created using Pathway Tools by scientists around the world, many of which are curated DBs for important model organisms. Those PGDBs can be exchanged using a peer-to-peer DB sharing system called the PGDB Registry.

  13. Merits and pitfalls of currently used diagnostic tools in mycetoma.

    Directory of Open Access Journals (Sweden)

    Wendy W J van de Sande

    2014-07-01

    Full Text Available Treatment of mycetoma depends on the causative organism and since many organisms, both actinomycetes (actinomycetoma) and fungi (eumycetoma), are capable of producing mycetoma, an accurate diagnosis is crucial. Currently, multiple diagnostic tools are used to determine the extent of infections and to identify the causative agents of mycetoma. These include various imaging, cytological, histopathological, serological, and culture techniques; phenotypic characterisation; and molecular diagnostics. In this review, we summarize these techniques and identify their merits and pitfalls in the identification of the causative agents of mycetoma and the extent of the disease. We also emphasize the fact that there is no ideal diagnostic tool available to identify the causative agents and that future research should focus on the development of new and reliable diagnostic tools.

  14. Merits and pitfalls of currently used diagnostic tools in mycetoma.

    Science.gov (United States)

    van de Sande, Wendy W J; Fahal, Ahmed H; Goodfellow, Michael; Mahgoub, El Sheikh; Welsh, Oliverio; Zijlstra, Ed E

    2014-07-01

    Treatment of mycetoma depends on the causative organism and since many organisms, both actinomycetes (actinomycetoma) and fungi (eumycetoma), are capable of producing mycetoma, an accurate diagnosis is crucial. Currently, multiple diagnostic tools are used to determine the extent of infections and to identify the causative agents of mycetoma. These include various imaging, cytological, histopathological, serological, and culture techniques; phenotypic characterisation; and molecular diagnostics. In this review, we summarize these techniques and identify their merits and pitfalls in the identification of the causative agents of mycetoma and the extent of the disease. We also emphasize the fact that there is no ideal diagnostic tool available to identify the causative agents and that future research should focus on the development of new and reliable diagnostic tools.

  15. In-depth evaluation of software tools for data-independent acquisition based label-free quantification.

    Science.gov (United States)

    Kuharev, Jörg; Navarro, Pedro; Distler, Ute; Jahn, Olaf; Tenzer, Stefan

    2015-09-01

    Label-free quantification (LFQ) based on data-independent acquisition workflows currently experiences increasing popularity. Several software tools have been recently published or are commercially available. The present study focuses on the evaluation of three different software packages (Progenesis, synapter, and ISOQuant) supporting ion mobility enhanced data-independent acquisition data. In order to benchmark the LFQ performance of the different tools, we generated two hybrid proteome samples of defined quantitative composition containing tryptically digested proteomes of three different species (mouse, yeast, Escherichia coli). This model dataset simulates complex biological samples containing large numbers of both unregulated (background) proteins as well as up- and downregulated proteins with exactly known ratios between samples. We determined the number and dynamic range of quantifiable proteins and analyzed the influence of applied algorithms (retention time alignment, clustering, normalization, etc.) on quantification results. Analysis of technical reproducibility revealed median coefficients of variation of reported protein abundances below 5% for MS(E) data for Progenesis and ISOQuant. Regarding accuracy of LFQ, evaluation with synapter and ISOQuant yielded superior results compared to Progenesis. In addition, we discuss reporting formats and user friendliness of the software packages. The data generated in this study have been deposited to the ProteomeXchange Consortium with identifier PXD001240 (http://proteomecentral.proteomexchange.org/dataset/PXD001240).

  16. Eddy current NDE performance demonstrations using simulation tools

    Energy Technology Data Exchange (ETDEWEB)

    Maurice, L. [EDF - CEIDRE, 2 rue Ampere, 93206 Saint-Denis Cedex 1 (France); Costan, V.; Guillot, E.; Thomas, P. [EDF - R and D, THEMIS, 1, avenue du General de Gaulle, 92141 Clamart (France)

    2013-01-25

    To carry out performance demonstrations of the eddy-current NDE processes applied in French nuclear power plants, EDF is studying the possibility of using simulation tools as an alternative to measurements on steam generator tube mock-ups. This paper focuses on the strategy adopted by EDF to assess and use code_Carmel3D and Civa in the case of eddy-current NDE of wear problems which may appear in the U-shape region of steam generator tubes due to the rubbing of anti-vibration bars.

  17. The Formalism and Language Tools for Semantics Specification of Software Libraries

    Directory of Open Access Journals (Sweden)

    V. M. Itsykson

    2016-01-01

    Full Text Available The paper is dedicated to the specification of the structure and the behaviour of software libraries. It describes the existing problems of library specification. A brief overview of the research field concerned with formalizing the specification of libraries and library functions is presented. The requirements imposed on the designed formalism are established; the formalism based on these requirements allows specifying all the properties of libraries needed for the automation of several classes of problems: defect detection in software, migration of applications into a new environment, and generation of software documentation. The requirements on the language tools based on the developed formalism are proposed. The conclusion defines potential directions for further research.

  18. PyElph - a software tool for gel images analysis and phylogenetics

    Directory of Open Access Journals (Sweden)

    Pavel Ana Brânduşa

    2012-01-01

    Full Text Available Abstract Background This paper presents PyElph, a software tool which automatically extracts data from gel images, computes the molecular weights of the analyzed molecules or fragments, compares DNA patterns which result from experiments with molecular genetic markers and also generates phylogenetic trees computed by five clustering methods, using the information extracted from the analyzed gel image. The software can be successfully used for population genetics, phylogenetics, taxonomic studies and other applications which require gel image analysis. Researchers and students working in molecular biology and genetics would benefit greatly from the proposed software because it is free, open source, easy to use, has a friendly Graphical User Interface and does not depend on specific image acquisition devices like other commercial programs with similar functionalities do. Results The PyElph software tool is entirely implemented in Python, which is a very popular programming language in the bioinformatics community. It provides a very friendly Graphical User Interface which was designed in six steps that gradually lead to the results. The user is guided through the following steps: image loading and preparation, lane detection, band detection, molecular weight computation based on a molecular weight marker, band matching and, finally, the computation and visualization of phylogenetic trees. A strong point of the software is the visualization component for the processed data. The Graphical User Interface provides operations for image manipulation and highlights lanes, bands and band matching in the analyzed gel image. All the data and images generated in each step can be saved. The software has been tested on several DNA patterns obtained from experiments with different genetic markers. Examples of genetic markers which can be analyzed using PyElph are RFLP (Restriction Fragment Length Polymorphism), AFLP (Amplified Fragment Length Polymorphism) and RAPD (Random Amplified Polymorphic DNA).
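
    PyElph's implementation is not shown in this record; the following Python sketch is a hedged illustration of the molecular-weight computation step it describes, calibrating log10(fragment size) against migration distance for a marker lane and interpolating unknown bands (all distances and sizes are invented).

        import numpy as np

        # Invented marker lane: migration distance (px) vs. known fragment size (bp).
        marker_dist = np.array([50, 90, 140, 200, 260], dtype=float)
        marker_bp = np.array([10000, 5000, 2000, 1000, 500], dtype=float)

        # DNA migration is roughly linear in log10(size), so fit a line to log-sizes.
        slope, intercept = np.polyfit(marker_dist, np.log10(marker_bp), 1)

        def estimate_bp(distance_px: float) -> float:
            """Interpolate a band's size from its migration distance."""
            return 10 ** (slope * distance_px + intercept)

        for d in (120.0, 230.0):
            print(f"band at {d:.0f} px -> ~{estimate_bp(d):.0f} bp")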

  19. SOFTWARE TOOL FOR LASER CUTTING PROCESS CONTROL – SOLVING REAL INDUSTRIAL CASE STUDIES

    Directory of Open Access Journals (Sweden)

    Miloš Madić

    2016-08-01

    Full Text Available Laser cutting is one of the leading non-conventional machining technologies with a wide spectrum of application in modern industry. In order to exploit the advantages that this technology offers for contour cutting of materials, it is necessary to carefully select laser cutting conditions for each given workpiece material, thickness and desired cut quality. In other words, there is a need for process control of laser cutting. After a comprehensive analysis of the main laser cutting parameters and process performance characteristics, the application of the developed software tool “BRUTOMIZER” for off-line control of the CO2 laser cutting process for three different workpiece materials (mild steel, stainless steel and aluminum) is illustrated. Advantages and abilities of the developed software tool are also illustrated.

  20. RAVEN AS A TOOL FOR DYNAMIC PROBABILISTIC RISK ASSESSMENT: SOFTWARE OVERVIEW

    Energy Technology Data Exchange (ETDEWEB)

    Alfonsi Andrea; Mandelli Diego; Rabiti Cristian; Joshua Cogliati; Robert Kinoshita

    2013-05-01

    RAVEN is a software tool under development at the Idaho National Laboratory (INL) that acts as the control logic driver and post-processing tool for the newly developed thermal-hydraulic code RELAP-7. The scope of this paper is to show the software structure of RAVEN and its utilization in connection with RELAP-7. A short overview of the mathematical framework behind the code is presented along with its main capabilities, such as on-line controlling/monitoring and Monte Carlo sampling. A demo of a station blackout PRA analysis of a simplified Pressurized Water Reactor (PWR) model is shown in order to demonstrate the Monte Carlo and clustering capabilities.
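
    RAVEN's actual control logic and the RELAP-7 model are not shown in this abstract; the following Python fragment is only a hedged, generic illustration of the Monte Carlo sampling idea behind such a station blackout demo, estimating a failure probability from two invented uncertain times (battery depletion vs. AC power recovery).

        import numpy as np

        rng = np.random.default_rng(42)
        n_samples = 100_000

        # Invented distributions for a station-blackout-style question:
        # does offsite AC power return before the station batteries are exhausted?
        battery_life_h = rng.normal(loc=8.0, scale=1.0, size=n_samples)
        ac_recovery_h = rng.exponential(scale=6.0, size=n_samples)

        failure = ac_recovery_h > battery_life_h
        p_fail = failure.mean()
        std_err = failure.std(ddof=1) / np.sqrt(n_samples)
        print(f"estimated failure probability: {p_fail:.4f} +/- {std_err:.4f}")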

  1. PC Software graphics tool for conceptual design of space/planetary electrical power systems

    Science.gov (United States)

    Truong, Long V.

    1995-01-01

    This paper describes the Decision Support System (DSS), a personal computer software graphics tool for designing conceptual space and/or planetary electrical power systems. By using the DSS, users can obtain desirable system design and operating parameters, such as system weight, electrical distribution efficiency, and bus power. With this tool, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. The DSS is a user-friendly, menu-driven tool with online help and a custom graphical user interface. An example design and results are illustrated for a typical space power system with multiple types of power sources, frequencies, energy storage systems, and loads.

  2. A Novel Software Tool to Generate Customer Needs for Effective Design of Online Shopping Websites

    Directory of Open Access Journals (Sweden)

    Ashish K. Sharma

    2016-03-01

    Full Text Available Effective design of online shopping websites is the need of the hour, as design plays a crucial role in the success of online shopping businesses. Recently, the use of Quality Function Deployment (QFD) has been reported for the design of online shopping websites. QFD is a customer-driven process that encompasses voluminous data gathered from customers through several techniques like personal interviews, focus groups, surveys etc. This massive, unsorted and unstructured data must be transformed into a limited amount of structured information to represent the actual Customer Needs (CNs), which are then utilized in subsequent stages of the QFD process. This can be achieved through brainstorming using techniques like the Affinity Process. However, integrating the Affinity Process within QFD is tedious and time consuming and cannot be dealt with manually. This generates a pressing need for a software tool to serve the purpose. Moreover, the research carried out so far has focused on QFD application post the generation of CNs, and the available QFD software packages lack the option to generate CNs from collected data. Thus, the paper aims to develop a novel software tool that integrates the Affinity Process with QFD to generate customers' needs for effective design of online shopping websites. The software system is developed using Visual Basic .NET (VB.Net) and integrates an MS-Access database.

  3. Software Tools for Emittance Measurement and Matching for 12 GeV CEBAF

    Energy Technology Data Exchange (ETDEWEB)

    Turner, Dennis L. [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States)

    2016-05-01

    This paper discusses model-driven setup of the Continuous Electron Beam Accelerator Facility (CEBAF) for the 12 GeV era, focusing on qsUtility. qsUtility is a set of software tools created to perform emittance measurements, analyze those measurements, and compute optics corrections based upon the measurements. qsUtility was developed as a toolset to reduce machine configuration time and improve reproducibility by way of an accurate accelerator model, and to provide Operations staff with tools to measure and correct machine optics with little or no assistance from optics experts.

  4. Development of computer-aided software engineering tool for sequential control of JT-60U

    Energy Technology Data Exchange (ETDEWEB)

    Shimono, M.; Akasaka, H.; Kurihara, K.; Kimura, T. [Japan Atomic Energy Research Inst., Naka, Ibaraki (Japan). Naka Fusion Research Establishment

    1995-12-31

    Discharge sequential control (DSC) is an essential control function for the intermittent and pulsed discharge operation of a tokamak device, so that many subsystems may work with each other in correct order and/or synchronously. In the development of the DSC program, block diagrams of the logical operations for sequential control are illustrated first in its design. Then, the logical operators and I/Os which are involved in the block diagrams are compiled and converted to a certain particular form. Since the block diagrams of the sequential control amount to about 50 sheets in the case of the JT-60 upgrade tokamak (JT-60U) high power discharge, and since the above development steps have so far been performed manually, a great effort has been required for the program development. In order to remove inefficiency in such development processes, a computer-aided software engineering (CASE) tool has been developed on a UNIX workstation. This paper reports how the authors designed it for the development of the sequential control programs. The tool is composed of the following three tools: (1) an automatic drawing tool, (2) an editing tool, and (3) a trace tool. This CASE tool, an object-oriented programming tool having graphical formalism, can powerfully accelerate the development cycle for the sequential control function commonly associated with pulse discharge in a tokamak fusion device.

  5. A SOFTWARE TOOL TO COMPARE MEASURED AND SIMULATED BUILDING ENERGY PERFORMANCE DATA

    Energy Technology Data Exchange (ETDEWEB)

    Maile, Tobias; Bazjanac, Vladimir; O'Donnell, James; Garr, Matthew

    2011-11-01

    Building energy performance is often inadequate when compared to design goals. To link design goals to actual operation one can compare measured with simulated energy performance data. Our previously developed comparison approach is the Energy Performance Comparison Methodology (EPCM), which enables the identification of performance problems based on a comparison of measured and simulated performance data. In context of this method, we developed a software tool that provides graphing and data processing capabilities of the two performance data sets. The software tool called SEE IT (Stanford Energy Efficiency Information Tool) eliminates the need for manual generation of data plots and data reformatting. SEE IT makes the generation of time series, scatter and carpet plots independent of the source of data (measured or simulated) and provides a valuable tool for comparing measurements with simulation results. SEE IT also allows assigning data points on a predefined building object hierarchy and supports different versions of simulated performance data. This paper briefly introduces the EPCM, describes the SEE IT tool and illustrates its use in the context of a building case study.
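
    SEE IT's interface is not public in this abstract; as a hedged sketch of the core comparison idea (not the tool's API), the following Python fragment aligns measured and simulated hourly energy series with pandas and reports a common discrepancy metric. The file and column names are assumptions.

        import matplotlib.pyplot as plt
        import numpy as np
        import pandas as pd

        # Hypothetical CSVs, each with a 'time' column and an 'energy_kwh' column.
        measured = pd.read_csv("measured.csv", parse_dates=["time"], index_col="time")
        simulated = pd.read_csv("simulated.csv", parse_dates=["time"], index_col="time")

        # Align the two series on their common timestamps before comparing.
        df = measured.join(simulated, lsuffix="_meas", rsuffix="_sim", how="inner")
        diff = df["energy_kwh_meas"] - df["energy_kwh_sim"]

        # CV(RMSE), a common calibration metric for building energy models.
        cv_rmse = np.sqrt((diff ** 2).mean()) / df["energy_kwh_meas"].mean()
        print(f"CV(RMSE): {cv_rmse:.1%}")

        # One of the plot types the paper mentions: a time-series comparison.
        df.plot(title="Measured vs. simulated hourly energy")
        plt.show()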

  6. phenoVein - A software tool for leaf vein segmentation and analysis

    OpenAIRE

    Bühler, Jonas; Rishmawi, Louai; Pflugfelder, Daniel; Huber, Gregor; Scharr, Hanno; Hülskamp, Martin; Koornneef, Maarten; Schurr, Ulrich; Jahnke, Siegfried

    2015-01-01

    phenoVein is a software tool dedicated to automated segmenting and analyzing images of leaf veins. It includes comfortable manual correction features. Advanced image filtering automatically emphasizes veins from background and compensates for local brightness inhomogeneities. Phenotypical leaf vein traits being calculated are total vein density, vein lengths and widths and skeleton graph statistics. For determination of vein widths, a model based vein edge estimation approach has been impleme...

  7. statnet: Software Tools for the Representation, Visualization, Analysis and Simulation of Network Data

    Directory of Open Access Journals (Sweden)

    Mark S. Handcock

    2007-12-01

    Full Text Available statnet is a suite of software packages for statistical network analysis. The packages implement recent advances in network modeling based on exponential-family random graph models (ERGM. The components of the package provide a comprehensive framework for ERGM-based network modeling, including tools for model estimation, model evaluation, model-based network simulation, and network visualization. This broad functionality is powered by a central Markov chain Monte Carlo (MCMC algorithm. The coding is optimized for speed and robustness.

  8. Users' manual for the Hydroecological Integrity Assessment Process software (including the New Jersey Assessment Tools)

    Science.gov (United States)

    Henriksen, James A.; Heasley, John; Kennen, Jonathan G.; Nieswand, Steven

    2006-01-01

    This manual is a user’s guide to four computer software tools that have been developed for the Hydroecological Integrity Assessment Process. The Hydroecological Integrity Assessment Process recognizes that streamflow is strongly related to many critical physiochemical components of rivers, such as dissolved oxygen, channel geomorphology, and water temperature, and can be considered a “master variable” that limits the disturbance, abundance, and diversity of many aquatic plant and animal species.

  9. A software tool for determination of breast cancer treatment methods using data mining approach.

    Science.gov (United States)

    Cakır, Abdülkadir; Demirel, Burçin

    2011-12-01

    In this work, breast cancer treatment methods are determined using data mining. For this purpose, software was developed to help the oncology doctor with suggestions on the application of treatment methods for breast cancer patients. Data from 462 breast cancer patients, obtained from Ankara Oncology Hospital, are used to determine treatment methods for new patients. This dataset is processed with the Weka data mining tool. Classification algorithms are applied one by one to this dataset and the results are compared to find the proper treatment method. The developed software program, called "Treatment Assistant", uses different algorithms (IB1, Multilayer Perceptron and Decision Table) to find out which one gives the better result for each attribute to predict, using a Java NetBeans interface. Treatment methods are determined for the post-surgical care of breast cancer patients using this developed software tool. At the modeling step of the data mining process, different Weka algorithms are used for the output attributes: for the hormonotherapy output, IB1; for the tamoxifen and radiotherapy outputs, the Multilayer Perceptron; and for the chemotherapy output, the Decision Table algorithm shows the best accuracy performance. In conclusion, this work shows that the data mining approach can be a useful tool for medical applications, particularly at the treatment decision step. Data mining helps the doctor to decide in a short time.
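
    The study used Weka; as a hedged, analogous illustration of the per-output classifier comparison it describes, the following Python sketch compares three scikit-learn classifiers by cross-validation on synthetic data standing in for the (non-public) patient records. IB1 is approximated here by 1-nearest-neighbor and Decision Table by a shallow decision tree.

        from sklearn.datasets import make_classification
        from sklearn.model_selection import cross_val_score
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.neural_network import MLPClassifier
        from sklearn.tree import DecisionTreeClassifier

        # Synthetic stand-in for the patient dataset (462 records, 20 attributes).
        X, y = make_classification(n_samples=462, n_features=20, n_informative=8,
                                   random_state=0)

        classifiers = {
            "1-NN (cf. Weka IB1)": KNeighborsClassifier(n_neighbors=1),
            "Multilayer Perceptron": MLPClassifier(max_iter=2000, random_state=0),
            "Decision tree (cf. Decision Table)":
                DecisionTreeClassifier(max_depth=5, random_state=0),
        }

        # 10-fold cross-validated accuracy, one score per candidate algorithm.
        for name, clf in classifiers.items():
            scores = cross_val_score(clf, X, y, cv=10)
            print(f"{name}: mean accuracy {scores.mean():.3f} (+/- {scores.std():.3f})")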

  10. Ignominy: a Tool for Software Dependency and Metric Analysis with Examples from Large HEP Packages

    Institute of Scientific and Technical Information of China (English)

    Lassi A. Tuura; Lucas Taylor

    2001-01-01

    Ignominy is a tool developed in the CMS IGUANA project to analyse the structure of software systems. Its primary component is a dependency scanner that distills information into human-usable forms. It also includes several tools to visualise the collected data in the form of graphical views and numerical metrics. Ignominy was designed to adapt to almost any reasonable structure, and it has been used to analyse several large projects. The original purpose of Ignominy was to help us better ensure the quality of our own software, and in particular to warn us about possible structural problems early on. As part of this activity it is now used as a standard part of our release procedure; we also use it to evaluate and study the quality of external packages we plan to make use of. We describe what Ignominy can find out, and how it can be used to visualise and assess a software structure. We also discuss the inherent problems of the analysis as well as the different approaches to modularity the tool makes quite evident. The focus is the illustration of these issues through the analysis results for several sizable HEP software projects.
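
    The abstract does not specify Ignominy's metrics; as a hedged sketch of the kind of numbers a dependency scanner can distill, here is fan-in/fan-out counting over a package dependency map (the package names are hypothetical):

        from collections import defaultdict

        def coupling_metrics(deps: dict) -> dict:
            """Fan-out (packages used) and fan-in (packages using) per package."""
            fan_in = defaultdict(int)
            for pkg, uses in deps.items():
                for target in uses:
                    fan_in[target] += 1
            return {pkg: {"fan_out": len(uses), "fan_in": fan_in[pkg]}
                    for pkg, uses in deps.items()}

        deps = {"gui": {"core", "utils"}, "core": {"utils"}, "utils": set()}
        print(coupling_metrics(deps))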

  11. User Driven Development of Software Tools for Open Data Discovery and Exploration

    Science.gov (United States)

    Schlobinski, Sascha; Keppel, Frank; Dihe, Pascal; Boot, Gerben; Falkenroth, Esa

    2016-04-01

    The use of open data in research faces challenges not restricted to inherent properties such as the quality and resolution of open data sets. Open data is often catalogued insufficiently or in a fragmented way. Software tools that support effective discovery, including assessment of the data's appropriateness for research, have shortcomings such as the lack of essential functionality like support for data provenance. We believe that one of the reasons is the neglect of real end-user requirements in the development process of the aforementioned software tools. In the context of the FP7 Switch-On project we have pro-actively engaged the relevant user community to collaboratively develop a means to publish, find and bind open data relevant for hydrologic research. Implementing key concepts of data discovery and exploration, we have used state-of-the-art web technologies to provide an interactive software tool that is easy to use yet powerful enough to satisfy the data discovery and access requirements of the hydrological research community.

  12. Mid-water Software Tools and the Application to Processing and Analysis of the Latest Generation Multibeam Sonars

    Science.gov (United States)

    Gee, L.; Doucet, M.

    2010-12-01

    The latest generation of multibeam sonars now has the ability to map the water column along with the seafloor. Currently, the users of these sonars have a limited view of the mid-water data in real-time, and if they do store the data, they are restricted to replaying it only, with no ability for further analysis. The water-column data has the potential to address a number of research areas including detection of small targets (wrecks, etc.) above the seabed, mapping of fish and marine mammals and a wide range of physical oceanographic processes. However, researchers have been required to develop their own in-house software tools before they can even begin their study of the water-column data. This paper describes the development of more general software tools for the full processing of raw sonar data (bathymetry, backscatter and water column) to yield output products suitable for visualization in a 4D time-synchronized environment. The huge water-column data volumes generated by the new sonars, combined with the variety of data formats from the different sonar manufacturers, provide a significant challenge in the design and development of tools that can be applied to the wide variety of applications. The development of the mid-water tools on this project addressed this problem by using a unified way of storing the water-column data in a generic water column format (GWC). The sonar data are converted into the GWC by re-integrating the water-column packets with time-based navigation and attitude, such that downstream in the workflow the tools will have access to all relevant data of any particular ping. Depending on the application and the resolution requirements, the conversion process also allows simple sub-sampling. Additionally, each file is indexed to enable fast non-linear lookup and extraction of any packet type or packet type collection in the sonar file. These tools also fully exploit multi-core and hyper-threading technologies to maximize the throughput.

  13. A GIS-based Computational Tool for Multidimensional Flow Velocity by Acoustic Doppler Current Profilers

    Science.gov (United States)

    Kim, D.; Winkler, M.; Muste, M.

    2015-06-01

    Acoustic Doppler Current Profilers (ADCPs) provide efficient and reliable flow measurements compared to other tools for characterizing riverine environments. In addition to the originally targeted discharge measurements, ADCPs are increasingly utilized to assess river flow characteristics. The newly developed VMS (Velocity Mapping Software) aims at providing an efficient process for quality assurance, mapping velocity vectors for visualization and facilitating comparison with physical and numerical model results. VMS was designed to provide efficient and smooth workflows for processing groups of transects. The software allows the user to select a group of files and subsequently to conduct statistical and graphical quality assurance on the files as a group or individually as appropriate. VMS also enables spatial averaging in the horizontal and vertical planes for ADCP data in single or multiple transects over the same or consecutive cross sections. The analysis results are displayed in numerical and graphical formats.

  14. GOAL: A software tool for assessing biological significance of gene groups

    Directory of Open Access Journals (Sweden)

    Famili Fazel

    2010-05-01

    Full Text Available Abstract Background Modern high throughput experimental techniques such as DNA microarrays often result in large lists of genes. Computational biology tools such as clustering are then used to group together genes based on their similarity in expression profiles. Genes in each group are probably functionally related. The functional relevance among the genes in each group is usually characterized by utilizing available biological knowledge in public databases such as Gene Ontology (GO), KEGG pathways, associations between a transcription factor (TF) and its target genes, and/or gene networks. Results We developed GOAL: Gene Ontology AnaLyzer, a software tool specifically designed for the functional evaluation of gene groups. GOAL implements and supports efficient and statistically rigorous functional interpretations of gene groups through its integration with available GO, TF-gene association data, and association with KEGG pathways. In order to facilitate more specific functional characterization of a gene group, we implement three GO-tree search strategies rather than one as in most existing GO analysis tools. Furthermore, GOAL offers flexibility in deployment. It can be used as a standalone tool, a plug-in to other computational biology tools, or a web server application. Conclusion We developed a functional evaluation software tool, GOAL, to perform functional characterization of a gene group. GOAL offers three GO-tree search strategies and combines its strength in function integration, portability and visualization, and its flexibility in deployment. Furthermore, GOAL can be used to evaluate and compare gene groups as the output from computational biology tools such as clustering algorithms.
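
    GOAL's statistics are not spelled out in this abstract; as a hedged illustration of the standard test behind such tools, here is a hypergeometric enrichment check for one GO term (scipy's hypergeometric survival function gives the p-value):

        from scipy.stats import hypergeom

        def go_enrichment_p(n_genome: int, n_term: int, n_group: int, n_hit: int) -> float:
            """P(X >= n_hit) where X ~ Hypergeom(M=n_genome, n=n_term, N=n_group).

            n_genome -- annotated genes in the background
            n_term   -- background genes annotated with the GO term
            n_group  -- genes in the analysed group
            n_hit    -- group genes annotated with the GO term
            """
            return hypergeom.sf(n_hit - 1, n_genome, n_term, n_group)

        # Example: 40 of 200 group genes hit a term covering 500 of 20000 genes.
        print(go_enrichment_p(20000, 500, 200, 40))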

  15. The anatomy of E-Learning tools: Does software usability influence learning outcomes?

    Science.gov (United States)

    Van Nuland, Sonya E; Rogers, Kem A

    2016-07-08

    Reductions in laboratory hours have increased the popularity of commercial anatomy e-learning tools. It is critical to understand how the functionality of such tools can influence the mental effort required during the learning process, also known as cognitive load. Using dual-task methodology, two anatomical e-learning tools were examined to determine the effect of their design on cognitive load during two joint learning exercises. A.D.A.M. Interactive Anatomy is a simplistic, two-dimensional tool that presents like a textbook, whereas Netter's 3D Interactive Anatomy offers more complex three-dimensional functionality that allows structures to be rotated. It was hypothesized that longer reaction times on an observation task would be associated with the more complex anatomical software (Netter's 3D Interactive Anatomy), indicating a higher cognitive load imposed by the anatomy software, which would result in lower post-test scores. Undergraduate anatomy students from Western University, Canada (n = 70) were assessed using a baseline knowledge test, Stroop observation task response times (a measure of cognitive load), mental rotation test scores, and an anatomy post-test. Results showed that reaction times and post-test outcomes were similar for both tools, whereas mental rotation test scores were positively correlated with post-test values when students used Netter's 3D Interactive Anatomy (P = 0.007), but not when they used A.D.A.M. Interactive Anatomy. This suggests that a simple e-learning tool, such as A.D.A.M. Interactive Anatomy, is as effective as more complicated tools, such as Netter's 3D Interactive Anatomy, and does not academically disadvantage those with poor spatial ability. Anat Sci Educ 9: 378-390. © 2015 American Association of Anatomists.

  16. Development of a User Interface for a Regression Analysis Software Tool

    Science.gov (United States)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  17. BYMUR software: a free and open source tool for quantifying and visualizing multi-risk analyses

    Science.gov (United States)

    Tonini, Roberto; Selva, Jacopo

    2013-04-01

    The BYMUR software aims to provide an easy-to-use open source tool for both computing multi-risk and managing/visualizing/comparing all the inputs (e.g. hazard, fragilities and exposure) as well as the corresponding results (e.g. risk curves, risk indexes). For all inputs, a complete management of inter-model epistemic uncertainty is considered. The BYMUR software will be one of the final products provided by the homonymous ByMuR project (http://bymur.bo.ingv.it/) funded by the Italian Ministry of Education, Universities and Research (MIUR), which focuses on (i) providing a quantitative and objective general method for a comprehensive long-term multi-risk analysis in a given area, accounting for inter-model epistemic uncertainty through Bayesian methodologies, and (ii) applying the methodology to seismic, volcanic and tsunami risks in Naples (Italy). More specifically, the BYMUR software will be able to separately account for the probabilistic hazard assessment of different kinds of hazardous phenomena, the relative (time-dependent/independent) vulnerabilities and exposure data, and their possible (predefined) interactions: the software will analyze these inputs and will use them to estimate both the single- and multi-risk associated with a specific target area. In addition, it will be possible to connect the software to further tools (e.g., a full hazard analysis), allowing a dynamic I/O of results. The use of the Python programming language guarantees that the final software will be open source and platform independent. Moreover, thanks to the integration of some of the most popular and rich-featured Python scientific modules (Numpy, Matplotlib, Scipy) with the wxPython graphical user toolkit, the final tool will be equipped with a comprehensive Graphical User Interface (GUI) able to control and visualize (in the form of tables, maps and/or plots) any stage of the multi-risk analysis. The additional features of importing/exporting data in MySQL databases and/or standard XML formats (for

  18. Robust optimal design of experiments for model discrimination using an interactive software tool.

    Directory of Open Access Journals (Sweden)

    Johannes Stegmaier

    Full Text Available Mathematical modeling of biochemical processes significantly contributes to a better understanding of biological functionality and underlying dynamic mechanisms. To support time-consuming and costly lab experiments, kinetic reaction equations can be formulated as a set of ordinary differential equations, which in turn allows one to simulate and compare hypothetical models in silico. To identify new experimental designs that are able to discriminate between investigated models, the approach used in this work solves a semi-infinite constrained nonlinear optimization problem using derivative-based numerical algorithms. The method takes parameter variabilities into account such that new experimental designs are robust against parameter changes while maintaining the optimal potential to discriminate between hypothetical models. In this contribution we present a newly developed software tool that offers a convenient graphical user interface for model discrimination. We demonstrate the beneficial operation of the discrimination approach and the usefulness of the software tool by analyzing a realistic benchmark experiment from the literature. New robust optimal designs that allow discrimination between the investigated model hypotheses of the benchmark experiment are successfully calculated and yield promising results. The involved robustification approach provides maximally discriminating experiments for the worst parameter configurations, which can be used to estimate the meaningfulness of upcoming experiments. A major benefit of the graphical user interface is the ability to interactively investigate the model behavior and the clear arrangement of numerous variables. In addition to a brief theoretical overview of the discrimination method and the functionality of the software tool, the importance of robustness of experimental designs against parameter variability is demonstrated on a biochemical benchmark problem. The software is licensed under the GNU

  19. Software tools of the Computis European project to process mass spectrometry images.

    Science.gov (United States)

    Robbe, Marie-France; Both, Jean-Pierre; Prideaux, Brendan; Klinkert, Ivo; Picaud, Vincent; Schramm, Thorsten; Hester, Atfons; Guevara, Victor; Stoeckli, Markus; Roempp, Andreas; Heeren, Ron M A; Spengler, Bernhard; Gala, Olivier; Haan, Serge

    2014-01-01

    Among the needs usually expressed by teams using mass spectrometry imaging, one that often arises is that for user-friendly software able to manage huge data volumes quickly and to provide efficient assistance for the interpretation of data. To answer this need, the Computis European project developed several complementary software tools to process mass spectrometry imaging data. Data Cube Explorer provides simple spatial and spectral exploration for matrix-assisted laser desorption/ionisation-time of flight (MALDI-ToF) and time of flight-secondary-ion mass spectrometry (ToF-SIMS) data. SpectViewer offers visualisation functions, assistance for the interpretation of data, classification functionalities, peak-list extraction to interrogate biological databases, and image overlay, and it can process data issued from MALDI-ToF, ToF-SIMS and desorption electrospray ionisation (DESI) equipment. EasyReg2D is able to register two images, in American Standard Code for Information Interchange (ASCII) format, issued from different technologies. The collaboration between the teams was hampered by the multiplicity of equipment and data formats, so the project also developed a common data format (imzML) to facilitate the exchange of experimental data and their interpretation by the different software tools. The BioMap platform for visualisation and exploration of MALDI-ToF and DESI images was adapted to parse imzML files, enabling its access to all project partners and, more globally, to a larger community of users. Considering the huge advantages brought by the imzML standard format, a specific editor (vBrowser) for imzML files and converters from proprietary formats to imzML were developed to enable the use of the imzML format by a broad scientific community. This initiative paves the way toward the development of a large panel of software tools able to process mass spectrometry imaging datasets in the future.

  20. Lightweight Methods for Effective Verification of Software Product Lines with Off-the-Shelf Tools

    DEFF Research Database (Denmark)

    Iosif-Lazar, Alexandru Florin

    2017-01-01

    Certification is the process of assessing the quality of a product and whether it meets a set of requirements and adheres to functional and safety standards. It is often legally required to provide guarantees for human safety and to make the product available on the market. The certification process relies on objective evidence of quality, which is produced by using qualified, state-of-the-art tools and verification and validation techniques. Software product line (SPL) engineering distributes costs among similar products that are developed simultaneously. However, SPL certification faces major challenges arising from variability. In this thesis, I identify three key challenges to SPL certification. I address them by proposing effective, lightweight methods that employ off-the-shelf tools for supporting and verifying SPL development. All the proposed methods are generalizable to a large part of existing tools...

  1. SOFI Simulation Tool: A Software Package for Simulating and Testing Super-Resolution Optical Fluctuation Imaging.

    Science.gov (United States)

    Girsault, Arik; Lukes, Tomas; Sharipov, Azat; Geissbuehler, Stefan; Leutenegger, Marcel; Vandenberg, Wim; Dedecker, Peter; Hofkens, Johan; Lasser, Theo

    2016-01-01

    Super-resolution optical fluctuation imaging (SOFI) allows one to perform sub-diffraction fluorescence microscopy of living cells. By analyzing the acquired image sequence with an advanced correlation method, i.e. a high-order cross-cumulant analysis, super-resolution in all three spatial dimensions can be achieved. Here we introduce a software tool for a simple qualitative comparison of SOFI images under simulated conditions considering parameters of the microscope setup and essential properties of the biological sample. This tool incorporates SOFI and STORM algorithms, displays and describes the SOFI image processing steps in a tutorial-like fashion. Fast testing of various parameters simplifies the parameter optimization prior to experimental work. The performance of the simulation tool is demonstrated by comparing simulated results with experimentally acquired data.
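
    As a hedged illustration of the core idea (not the tool's full high-order cross-cumulant analysis), the second-order SOFI autocumulant of an image sequence is simply the per-pixel temporal variance of the intensity fluctuations:

        import numpy as np

        def sofi2(stack: np.ndarray) -> np.ndarray:
            """Second-order SOFI autocumulant of a (time, y, x) image stack.

            The second-order cumulant equals the temporal variance of the
            mean-subtracted intensity, which suppresses non-fluctuating
            background and sharpens fluctuating emitters.
            """
            fluct = stack - stack.mean(axis=0)      # remove the static component
            return (fluct ** 2).mean(axis=0)        # per-pixel second cumulant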

  2. Utilization of Software Tools for Uncertainty Calculation in Measurement Science Education

    Science.gov (United States)

    Zangl, Hubert; Zine-Zine, Mariam; Hoermaier, Klaus

    2015-02-01

    Despite its importance, uncertainty is often neglected by practitioners in the design of systems, even in safety-critical applications. Thus, problems arising from uncertainty may only be identified late in the design process and lead to additional costs. Although numerous tools to support uncertainty calculation exist, reasons for their limited usage in early design phases may be low awareness of the existence of the tools and insufficient training in their practical application. We present a teaching philosophy that addresses uncertainty from the very beginning of teaching measurement science, in particular with respect to the utilization of software tools. The developed teaching material is based on the GUM method and makes use of uncertainty toolboxes in the simulation environment. Based on examples in measurement science education, we discuss advantages and disadvantages of the proposed teaching philosophy and include feedback from students.
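
    The GUM method referenced here combines standard uncertainties through first-order sensitivity coefficients; a minimal numeric sketch, assuming uncorrelated inputs and a hypothetical measurement model:

        import math

        def combined_uncertainty(f, x, u, h=1e-6):
            """GUM-style combined standard uncertainty u_c(y) for y = f(x).

            Assumes uncorrelated inputs: u_c^2 = sum_i (df/dx_i)^2 * u_i^2,
            with sensitivities df/dx_i estimated by central differences.
            """
            uc2 = 0.0
            for i, ui in enumerate(u):
                xp, xm = list(x), list(x)
                xp[i] += h
                xm[i] -= h
                ci = (f(xp) - f(xm)) / (2 * h)      # sensitivity coefficient
                uc2 += (ci * ui) ** 2
            return math.sqrt(uc2)

        # Hypothetical example: power P = V^2 / R with u(V) = 0.1 V, u(R) = 0.5 ohm.
        print(combined_uncertainty(lambda v: v[0] ** 2 / v[1], [230.0, 100.0], [0.1, 0.5]))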

  3. Integration of life cycle assessment software with tools for economic and sustainability analyses and process simulation for sustainable process design

    DEFF Research Database (Denmark)

    Kalakul, Sawitree; Malakul, Pomthong; Siemanond, Kitipat

    2014-01-01

    The sustainable future of the world challenges engineers to develop chemical process designs that are not only technically and economically feasible but also environmentally friendly. Life cycle assessment (LCA) is a tool for identifying and quantifying the environmental impacts of a chemical product. Although there are several commercial LCA tools, there is still a need for a simple LCA software tool that can be integrated with process design tools. In this paper, a new LCA software, LCSoft, is developed for the evaluation of chemical, petrochemical, and biochemical processes with options for integration with other process design tools such as sustainable design (SustainPro), economic analysis (ECON) and process simulation. The software framework contains four main tools: Tool-I is for life cycle inventory (LCI) knowledge management that enables easy maintenance and future expansion of the LCI database; Tool...

  4. DVS-SOFTWARE: An Effective Tool for Applying Highly Parallelized Hardware To Computational Geophysics

    Science.gov (United States)

    Herrera, I.; Herrera, G. S.

    2015-12-01

    Most geophysical systems are macroscopic physical systems. The behavior prediction of such systems is carried out by means of computational models whose basis is partial differential equations (PDEs) [1]. Due to the enormous size of the discretized version of such PDEs it is necessary to apply highly parallelized supercomputers. For them, at present, the most efficient software is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of the present state-of-the-art techniques is due to the kind of discretizations used in them. Recently, I. Herrera and co-workers, using 'non-overlapping discretizations', have produced the DVS-Software which overcomes this limitation [2]. The DVS-Software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (90%, or so [3]). It is therefore very suitable for effectively applying the most advanced parallel supercomputers available at present. In a parallel talk in this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MODFLOW. Key Words: Parallel Software for Geophysics, High Performance Computing, HPC, Parallel Computing, Domain Decomposition Methods (DDM). REFERENCES [1]. Herrera Ismael and George F. Pinder, "Mathematical Modelling in Science and Engineering: An axiomatic approach", John Wiley, 243p., 2012. [2]. Herrera, I., de la Cruz L.M. and Rosas-Medina A. "Non-Overlapping Discretization Methods for Partial Differential Equations". NUMER METH PART D E, 30: 1427-1454, 2014, DOI 10.1002/num.21852. (Open source) [3]. Herrera, I., & Contreras Iván "An Innovative Tool for Effectively Applying Highly Parallelized Software To Problems of Elasticity". Geofísica Internacional, 2015 (In press)

  5. Emerging role of bioinformatics tools and software in evolution of clinical research

    Directory of Open Access Journals (Sweden)

    Supreet Kaur Gill

    2016-01-01

    Full Text Available Clinical research is making strenuous efforts for the promotion and wellbeing of the health status of the people. There is a rapid increase in the number and severity of diseases like cancer, hepatitis, HIV etc., resulting in high morbidity and mortality. Clinical research involves drug discovery and development, whereas clinical trials are performed to establish the safety and efficacy of drugs. Drug discovery is a long process starting with target identification, validation and lead optimization. This is followed by the preclinical trials, intensive clinical trials and eventually post-marketing vigilance for drug safety. Software and bioinformatics tools play a great role not only in drug discovery but also in drug development. This involves the use of informatics in the development of new knowledge pertaining to health and disease, data management during clinical trials and the use of clinical data for secondary research. In addition, new technologies like molecular docking, molecular dynamics simulation, proteomics and quantitative structure-activity relationships in clinical research result in a faster and easier drug discovery process. During the preclinical trials, software is used for randomization to remove bias and to plan the study design. In clinical trials, software like electronic data capture, remote data capture and electronic case report forms (eCRF) is used to store the data. eClinical and Oracle Clinical are software used for clinical data management and for statistical analysis of the data. After the drug is marketed, the safety of a drug can be monitored by drug safety software like Oracle Argus or ARISg. Therefore, software is used from the very early stages of drug design, through drug development, clinical trials and pharmacovigilance. This review describes different aspects related to the application of computers and bioinformatics in drug design, discovery and development, formulation design and clinical research.

  6. Data Assimilation Tools for CO2 Reservoir Model Development – A Review of Key Data Types, Analyses, and Selected Software

    Energy Technology Data Exchange (ETDEWEB)

    Rockhold, Mark L.; Sullivan, E. C.; Murray, Christopher J.; Last, George V.; Black, Gary D.

    2009-09-30

    Pacific Northwest National Laboratory (PNNL) has embarked on an initiative to develop world-class capabilities for performing experimental and computational analyses associated with geologic sequestration of carbon dioxide. The ultimate goal of this initiative is to provide science-based solutions for helping to mitigate the adverse effects of greenhouse gas emissions. This Laboratory-Directed Research and Development (LDRD) initiative currently has two primary focus areas—advanced experimental methods and computational analysis. The experimental methods focus area involves the development of new experimental capabilities, supported in part by the U.S. Department of Energy’s (DOE) Environmental Molecular Science Laboratory (EMSL) housed at PNNL, for quantifying mineral reaction kinetics with CO2 under high temperature and pressure (supercritical) conditions. The computational analysis focus area involves numerical simulation of coupled, multi-scale processes associated with CO2 sequestration in geologic media, and the development of software to facilitate building and parameterizing conceptual and numerical models of subsurface reservoirs that represent geologic repositories for injected CO2. This report describes work in support of the computational analysis focus area. The computational analysis focus area currently consists of several collaborative research projects. These are all geared towards the development and application of conceptual and numerical models for geologic sequestration of CO2. The software being developed for this focus area is referred to as the Geologic Sequestration Software Suite or GS3. A wiki-based software framework is being developed to support GS3. This report summarizes work performed in FY09 on one of the LDRD projects in the computational analysis focus area. The title of this project is Data Assimilation Tools for CO2 Reservoir Model Development. Some key objectives of this project in FY09 were to assess the current state

  7. A software tool of digital tomosynthesis application for patient positioning in radiotherapy.

    Science.gov (United States)

    Yan, Hui; Dai, Jian-Rong

    2016-03-08

    Digital tomosynthesis (DTS) is an imaging modality that reconstructs tomographic images from two-dimensional kV projections covering a narrow scan angle. Compared with conventional cone-beam CT (CBCT), it requires less time and radiation dose in data acquisition. It is feasible to apply this technique to patient positioning in radiotherapy. To facilitate its clinical application, a software tool was developed and the reconstruction processes were accelerated by a graphics processing unit (GPU). Two reconstruction and two registration processes are required for the DTS application, which differs from the conventional CBCT application requiring one image reconstruction process and one image registration process. The reconstruction stage consists of the production of two types of DTS. One type of DTS is reconstructed from cone-beam (CB) projections covering a narrow scan angle and is named onboard DTS (ODTS), which represents the real patient position in the treatment room. The other type of DTS is reconstructed from digitally reconstructed radiographs (DRRs) and is named reference DTS (RDTS), which represents the ideal patient position in the treatment room. Prior to the reconstruction of RDTS, the DRRs are reconstructed from the planning CT using the same acquisition settings as the CB projections. The registration stage consists of two matching processes between ODTS and RDTS. The target shifts in the lateral and longitudinal axes are obtained from the matching between ODTS and RDTS in the coronal view, while the target shifts in the longitudinal and vertical axes are obtained from the matching between ODTS and RDTS in the sagittal view. In this software, both the DRR and DTS reconstruction algorithms were implemented on GPU environments for acceleration purposes. A comprehensive evaluation of this software tool was performed including geometric accuracy, image quality, registration accuracy, and reconstruction efficiency. The average correlation coefficient between DRR/DTS generated by GPU-based algorithm
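
    The abstract does not spell out the matching algorithm; as a hedged illustration of one standard way to recover such 2D target shifts, here is a phase-correlation sketch between a reference view and an onboard view:

        import numpy as np

        def shift_2d(ref: np.ndarray, mov: np.ndarray):
            """Estimate the integer (row, col) shift taking ref into mov
            by phase correlation of the two views."""
            cross = np.fft.fft2(mov) * np.conj(np.fft.fft2(ref))
            corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
            peak = np.unravel_index(np.argmax(corr), corr.shape)
            # Peaks past the midpoint wrap around to negative shifts.
            return tuple(int(p - n) if p > n // 2 else int(p)
                         for p, n in zip(peak, corr.shape))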

  8. Internet-Based Software Tools for Analysis and Processing of LIDAR Point Cloud Data via the OpenTopography Portal

    Science.gov (United States)

    Nandigam, V.; Crosby, C. J.; Baru, C.; Arrowsmith, R.

    2009-12-01

    LIDAR is an excellent example of the new generation of powerful remote sensing data now available to Earth science researchers. Capable of producing digital elevation models (DEMs) with more than an order of magnitude higher resolution than those currently available, LIDAR data allow earth scientists to study the processes that contribute to landscape evolution at resolutions not previously possible, yet essential for their appropriate representation. Along with these high-resolution datasets comes an increase in the volume and complexity of data that the user must efficiently manage and process in order for it to be scientifically useful. Although there are expensive commercial LIDAR software applications available, processing and analysis of these datasets are typically computationally inefficient on the conventional hardware and software that is currently available to most of the Earth science community. We have designed and implemented an Internet-based system, the OpenTopography Portal, that provides integrated access to high-resolution LIDAR data as well as web-based tools for processing of these datasets. By using remote data storage and high performance compute resources, the OpenTopography Portal attempts to simplify data access and standard LIDAR processing tasks for the Earth science community. The OpenTopography Portal allows users to access massive amounts of raw point cloud LIDAR data as well as a suite of DEM generation tools to enable users to generate custom digital elevation models that best fit their science applications. The Cyberinfrastructure software tools for processing the data are freely available via the portal and conveniently integrated with the data selection in a single user-friendly interface. The ability to run these tools on powerful Cyberinfrastructure resources instead of their own labs provides a huge advantage in terms of performance and compute power. The system also encourages users to explore data processing methods and the
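
    As a hedged sketch of the simplest kind of DEM generation such a portal can offer, binning LIDAR points onto a regular grid and averaging elevations per cell (production tools use more careful interpolation):

        import numpy as np

        def grid_dem(x, y, z, cell: float) -> np.ndarray:
            """Average-elevation DEM from point-cloud coordinates by local binning."""
            x, y, z = map(np.asarray, (x, y, z))
            ix = ((x - x.min()) / cell).astype(int)
            iy = ((y - y.min()) / cell).astype(int)
            dem = np.full((iy.max() + 1, ix.max() + 1), np.nan)
            sums = np.zeros_like(dem)
            counts = np.zeros_like(dem)
            np.add.at(sums, (iy, ix), z)            # accumulate z per cell
            np.add.at(counts, (iy, ix), 1)
            mask = counts > 0
            dem[mask] = sums[mask] / counts[mask]   # empty cells stay NaN
            return dem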

  9. GMFilter and SXTestPlate: software tools for improving the SNPlex™ genotyping system

    Directory of Open Access Journals (Sweden)

    Schreiber Stefan

    2009-03-01

    Full Text Available Abstract Background Genotyping of single-nucleotide polymorphisms (SNPs) is a fundamental technology in modern genetics. The SNPlex™ mid-throughput genotyping system (Applied Biosystems, Foster City, CA, USA) enables the multiplexed genotyping of up to 48 SNPs simultaneously in a single DNA sample. The high level of automation and the large amount of data produced in a high-throughput laboratory require advanced software tools for quality control and workflow management. Results We have developed two programs, which address two main aspects of quality control in a SNPlex™ genotyping environment: GMFilter improves the analysis of SNPlex™ plates by removing wells with a low overall signal intensity. It enables scientists to automatically process the raw data in a standardized way before analyzing a plate with the proprietary GeneMapper software from Applied Biosystems. SXTestPlate examines the genotype concordance of a SNPlex™ test plate, which was typed with a control SNP set. This program allows for regular quality control checks of a SNPlex™ genotyping platform. It is compatible with other genotyping methods as well. Conclusion GMFilter and SXTestPlate provide a valuable tool set for laboratories engaged in genotyping based on the SNPlex™ system. The programs enhance the analysis of SNPlex™ plates with the GeneMapper software and enable scientists to evaluate the performance of their genotyping platform.
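
    GMFilter's exact filtering criteria are not given in this abstract; a hedged sketch of the basic idea, dropping plate wells whose overall signal intensity falls below a threshold before genotype calling (well IDs and the threshold are hypothetical placeholders):

        def filter_wells(plate: dict, min_intensity: float) -> dict:
            """Keep only wells whose summed signal passes a minimum intensity.

            plate -- maps well IDs (e.g. 'A01') to lists of per-SNP intensities.
            """
            return {well: signals
                    for well, signals in plate.items()
                    if sum(signals) >= min_intensity}

        plate = {"A01": [812.0, 790.5], "A02": [12.3, 9.8]}   # toy data
        print(filter_wells(plate, min_intensity=100.0))        # drops the weak well A02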

  10. TESPI (Tool for Environmental Sound Product Innovation): a simplified software tool to support environmentally conscious design in SMEs

    Science.gov (United States)

    Misceo, Monica; Buonamici, Roberto; Buttol, Patrizia; Naldesi, Luciano; Grimaldi, Filomena; Rinaldi, Caterina

    2004-12-01

    TESPI (Tool for Environmental Sound Product Innovation) is the prototype of a software tool developed within the framework of the "eLCA" project. The project (www.elca.enea.it), financed by the European Commission, is realising "On line green tools and services for Small and Medium sized Enterprises (SMEs)". The implementation by SMEs of environmental product innovation (as fostered by the European Integrated Product Policy, IPP) needs specific adaptation to their economic model, their knowledge of production and management processes and their relationships with innovation and the environment. In particular, quality and costs are the main driving forces of innovation in European SMEs, and well known barriers exist to the adoption of an environmental approach in product design. Starting from these considerations, the TESPI tool has been developed to support the first steps of product design taking into account both quality and the environment. Two main issues have been considered: (i) classic Quality Function Deployment (QFD) can hardly be proposed to SMEs; (ii) the environmental aspects of the product life cycle need to be integrated with the quality approach. TESPI is a user-friendly web-based tool, has a training approach and applies to modular products. Users are guided through the investigation of the quality aspects of their product (customer's needs and requirements fulfilment) and the identification of the key environmental aspects in the product's life cycle. A simplified checklist allows analyzing the environmental performance of the product. Help is available for a better understanding of the analysis criteria. As a result, the significant aspects for the redesign of the product are identified.

  11. Interactive software tool to comprehend the calculation of optimal sequence alignments with dynamic programming.

    Science.gov (United States)

    Ibarra, Ignacio L; Melo, Francisco

    2010-07-01

    Dynamic programming (DP) is a general optimization strategy that is successfully used across various disciplines of science. In bioinformatics, it is widely applied in calculating the optimal alignment between pairs of protein or DNA sequences. These alignments form the basis of new, verifiable biological hypotheses. Despite its importance, there are no interactive tools available for training and education on understanding the DP algorithm. Here, we introduce an interactive computer application with a graphical interface, for the purpose of educating students about DP. The program displays the DP scoring matrix and the resulting optimal alignment(s), while allowing the user to modify key parameters such as the values in the similarity matrix, the sequence alignment algorithm version and the gap opening/extension penalties. We hope that this software will be useful to teachers and students of bioinformatics courses, as well as researchers who implement the DP algorithm for diverse applications. The software is freely available at: http://melolab.org/sat. The software is written in the Java computer language, thus it runs on all major platforms and operating systems including Windows, Mac OS X and LINUX. All inquiries or comments about this software should be directed to Francisco Melo at fmelo@bio.puc.cl.
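
    The tool itself is written in Java; as a hedged, minimal Python sketch of the global-alignment DP it teaches, here is Needleman-Wunsch scoring with a linear gap penalty (the tool additionally exposes separate gap opening/extension penalties):

        def needleman_wunsch(a: str, b: str, match=1, mismatch=-1, gap=-2) -> int:
            """Fill the DP scoring matrix and return the optimal alignment score."""
            rows, cols = len(a) + 1, len(b) + 1
            F = [[0] * cols for _ in range(rows)]
            for i in range(rows):
                F[i][0] = i * gap                    # leading gaps in b
            for j in range(cols):
                F[0][j] = j * gap                    # leading gaps in a
            for i in range(1, rows):
                for j in range(1, cols):
                    s = match if a[i - 1] == b[j - 1] else mismatch
                    F[i][j] = max(F[i - 1][j - 1] + s,   # (mis)match
                                  F[i - 1][j] + gap,     # gap in b
                                  F[i][j - 1] + gap)     # gap in a
            return F[-1][-1]

        print(needleman_wunsch("GATTACA", "GCATGCU"))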

  12. Web-based software tool for constraint-based design specification of synthetic biological systems.

    Science.gov (United States)

    Oberortner, Ernst; Densmore, Douglas

    2015-06-19

    miniEugene provides computational support for solving combinatorial design problems, enabling users to specify and enumerate designs for novel biological systems based on sets of biological constraints. This technical note presents a brief tutorial for biologists and software engineers in the field of synthetic biology on how to use miniEugene. After reading this technical note, users should know which biological constraints are available in miniEugene, understand the syntax and semantics of these constraints, and be able to follow a step-by-step guide to specify the design of a classical synthetic biological system, the genetic toggle switch [1]. We also provide links and references to more information on the miniEugene web application and the integration of the miniEugene software library into sophisticated Computer-Aided Design (CAD) tools for synthetic biology (www.eugenecad.org).

  13. Techniques and tools for measuring energy efficiency of scientific software applications

    CERN Document Server

    Abdurachmanov, David; Eulisse, Giulio; Knight, Robert; Niemi, Tapio; Nurminen, Jukka K.; Nyback, Filip; Pestana, Goncalo; Ou, Zhonghong; Khan, Kashif

    2014-01-01

    The scale of scientific High Performance Computing (HPC) and High Throughput Computing (HTC) has increased significantly in recent years, and is becoming sensitive to total energy use and cost. Energy-efficiency has thus become an important concern in scientific fields such as High Energy Physics (HEP). There has been a growing interest in utilizing alternate architectures, such as low power ARM processors, to replace traditional Intel x86 architectures. Nevertheless, even though such solutions have been successfully used in mobile applications with low I/O and memory demands, it is unclear if they are suitable and more energy-efficient in the scientific computing environment. Furthermore, there is a lack of tools and experience to derive and compare power consumption between the architectures for various workloads, and eventually to support software optimizations for energy efficiency. To that end, we have performed several physical and software-based measurements of workloads from HEP applications running o...

  14. Fuzzy cognitive map software tool for treatment management of uncomplicated urinary tract infection.

    Science.gov (United States)

    Papageorgiou, Elpiniki I

    2012-03-01

    Uncomplicated urinary tract infection (uUTI) is a bacterial infection that affects individuals with normal urinary tracts from both structural and functional perspectives. Suggesting appropriate antibiotics and treatment to individuals suffering from uUTI is an important and complex task that demands special attention. How to decrease the unsafe use of antibiotics and their consumption is an important issue in medical treatment. Aiming to model medical decision making for uUTI treatment, an innovative and flexible approach called fuzzy cognitive maps (FCMs) is proposed to handle uncertainty and missing information. The FCM is a promising technique for modeling knowledge and/or medical guidelines/treatment suggestions and reasoning with them. A software tool, namely FCM-uUTI DSS, is investigated in this work to produce a decision support module for uUTI treatment management. The software tool was tested (evaluated) on 38 patient cases, showing its functionality and demonstrating that the use of FCMs as dynamic models is reliable. The results have shown that the suggested FCM-uUTI tool gives a front-end decision on antibiotic suggestions for uUTI treatment, and its outputs are considered helpful references for physicians and patients. Due to its easy graphical representation and simulation process, the proposed FCM formalization could be used to make medical knowledge widely available through computer consultation systems.
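
    A hedged sketch of the standard FCM inference rule such tools build on: concepts hold activation values in [0, 1], updated through a sigmoid of the weighted influence of the other concepts (the weight matrix here is a hypothetical toy):

        import numpy as np

        def fcm_step(state: np.ndarray, W: np.ndarray, lam: float = 1.0) -> np.ndarray:
            """One FCM update: A_i <- sigmoid(A_i + sum_j W_ji * A_j)."""
            return 1.0 / (1.0 + np.exp(-lam * (state + W.T @ state)))

        def fcm_run(state, W, steps=20):
            for _ in range(steps):                  # iterate toward a fixed point
                state = fcm_step(state, W)
            return state

        W = np.array([[0.0, 0.6], [-0.4, 0.0]])    # toy 2-concept influence matrix
        print(fcm_run(np.array([0.8, 0.2]), W))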

  15. TScratch: a novel and simple software tool for automated analysis of monolayer wound healing assays.

    Science.gov (United States)

    Gebäck, Tobias; Schulz, Martin Michael Peter; Koumoutsakos, Petros; Detmar, Michael

    2009-04-01

    Cell migration plays a major role in development, physiology, and disease, and is frequently evaluated in vitro by the monolayer wound healing assay. The assay analysis, however, is a time-consuming task that is often performed manually. In order to accelerate this analysis, we have developed TScratch, a new, freely available image analysis technique and associated software tool that uses the fast discrete curvelet transform to automate the measurement of the area occupied by cells in the images. This tool helps to significantly reduce the time needed for analysis and enables objective and reproducible quantification of assays. The software also offers a graphical user interface which allows easy inspection of analysis results and, if desired, manual modification of analysis parameters. The automated analysis was validated by comparing its results with manual-analysis results for a range of different cell lines. The comparisons demonstrate a close agreement for the vast majority of images that were examined and indicate that the present computational tool can reproduce statistically significant results in experiments with well-known cell migration inhibitors and enhancers.
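
    TScratch itself uses the fast discrete curvelet transform; as a hedged and deliberately simpler illustration of the quantity it measures, the cell-free wound fraction can be approximated with a local texture criterion (the threshold is a hypothetical, assay-specific constant):

        import numpy as np

        def open_area_fraction(img: np.ndarray, thresh: float) -> float:
            """Fraction of the image not occupied by cells (toy gradient criterion).

            Cell-covered regions are textured (high gradient magnitude); the
            cell-free wound is comparatively smooth.
            """
            gy, gx = np.gradient(img.astype(float))
            texture = np.hypot(gx, gy)
            return float((texture < thresh).mean())   # smooth pixels ~ open wound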

  16. Analyst Tools and Quality Control Software for the ARM Data System

    Energy Technology Data Exchange (ETDEWEB)

    Moore, Sean; Hughes, Gary

    2008-07-31

    Mission Research develops analyst tools and automated quality control software in order to assist the Atmospheric Radiation Measurement (ARM) Data Quality Office with their data inspection tasks. We have developed web-based data analysis and visualization tools such as the interactive plotting program NCVweb, various diagnostic plot browsers, and a datastream processing status application. These tools allow even novice ARM researchers to be productive with ARM data with only minimal effort. We also contribute to the ARM Data Quality Office by analyzing ARM data streams, developing new quality control metrics, new diagnostic plots, and integrating this information into DQ HandS - the Data Quality Health and Status web-based explorer. We have developed several ways to detect outliers in ARM data streams and have written software to run in an automated fashion to flag these outliers. We have also embarked on a system to comprehensively generate long time-series plots, frequency distributions, and other relevant statistics for scientific and engineering data in most high-level, publicly available ARM data streams. Furthermore, frequency distributions categorized by month or by season are made available to help define valid data ranges specific to those time domains. These statistics can be used to set limits that when checked, will improve upon the reporting of suspicious data and the early detection of instrument malfunction. The statistics and proposed limits are stored in a database for easy reporting, refining, and for use by other processes. Web-based applications to view the results are also available.
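
    A hedged sketch of the statistics-to-limits idea described above: derive per-month valid ranges from historical percentiles, then flag new samples that fall outside them (the variable layout is a hypothetical simplification of the datastream handling):

        import numpy as np

        def monthly_limits(values: np.ndarray, months: np.ndarray, lo=0.5, hi=99.5) -> dict:
            """Valid-range limits per calendar month from historical percentiles."""
            return {m: np.percentile(values[months == m], [lo, hi])
                    for m in np.unique(months)}

        def flag(value: float, month: int, limits: dict) -> bool:
            low, high = limits[month]
            return not (low <= value <= high)       # True marks a suspicious sample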

  17. Software tools for manipulating FE mesh, virtual surgery and post-processing

    Directory of Open Access Journals (Sweden)

    Milašinović Danko Z.

    2009-01-01

    Full Text Available This paper describes a set of software tools which we developed for the calculation of fluid flow through cardiovascular organs. Our tools work with medical data from a CT scanner, but could be used with any other 3D input data. For meshing we used the Tetgen tetrahedral mesh generator, as well as a mesh re-generator that we have developed for the conversion of tetrahedral elements into bricks. After adequate meshing we used our PAKF solver for the calculation of fluid flow. For human-friendly presentation of results we developed a set of post-processing software tools. With modification of the 2D mesh (the boundary of the cardiovascular organ) it is possible to do virtual surgery, so in the case of an aorta with an aneurysm, which we had received from the University Clinical Center in Heidelberg from a multi-slice 64-CT scanner, we removed the aneurysm and afterwards ran calculations on both geometric models. The main idea of this methodology is creating a system that could be used in clinics.

  18. A software tool for advanced MRgFUS prostate therapy planning and follow up

    Science.gov (United States)

    van Straaten, Dörte; Hoogenboom, Martijn; van Amerongen, Martinus J.; Weiler, Florian; Issawi, Jumana Al; Günther, Matthias; Fütterer, Jurgen; Jenne, Jürgen W.

    2017-03-01

    US-guided HIFU/FUS ablation for the therapy of prostate cancer is a clinically established method, while MR-guided HIFU/FUS applications for the prostate have recently started clinical evaluation. Even though MRI examination is an excellent diagnostic tool for prostate cancer, it is a time-consuming procedure and not practicable within an MRgFUS therapy session. The aim of our ongoing work is to develop software to support therapy planning and post-therapy follow-up for MRgFUS on localized prostate cancer, based on multi-parametric MR protocols. The clinical workflow of diagnosis, therapy and follow-up of MR-guided FUS on prostate cancer was analyzed in depth. Based on this, the image processing workflow was designed and all necessary components, e.g. GUI, viewer, registration tools etc., were defined and implemented. The software is based on MeVisLab with several C++ modules implemented for the image processing tasks. The developed software, called LTC (Local Therapy Control), will automatically register and visualize all images (T1w, T2w, DWI etc.) and ADC or perfusion maps gained from the diagnostic MRI session. This maximum of diagnostic information helps to segment all necessary ROIs, e.g. the tumor, for therapy planning. Final therapy planning will be performed based on these segmentation data in the following MRgFUS therapy session. In addition, the developed software should help to evaluate the therapy success by synchronization and display of pre-therapeutic, therapy and follow-up image data, including the therapy plan and thermal dose information. In this ongoing project, the first stand-alone prototype was completed and will be clinically evaluated.

  19. DSC: software tool for simulation-based design of control strategies applied to wastewater treatment plants.

    Science.gov (United States)

    Ruano, M V; Ribes, J; Seco, A; Ferrer, J

    2011-01-01

    This paper presents a computer tool called DSC (Simulation based Controllers Design) that enables an easy design of control systems and strategies applied to wastewater treatment plants. Although the control systems are developed and evaluated by simulation, this tool aims to facilitate the direct implementation of the designed control system on the PC of the full-scale WWTP (wastewater treatment plant). The designed control system can be programmed in a dedicated control application and can be connected to either the simulation software or the SCADA of the plant. To this end, the developed DSC incorporates an OPC server (OLE for Process Control), which provides an open-standard communication protocol for different industrial process applications. The potential capabilities of the DSC tool are illustrated through the example of a full-scale application. An aeration control system applied to a nutrient-removing WWTP was designed, tuned and evaluated with the DSC tool before its implementation in the full-scale plant. The control parameters obtained by simulation were suitable for the full-scale plant with only a few modifications to improve the control performance. With the DSC tool, the performance of control systems can be easily evaluated by simulation. Once developed and tuned by simulation, the control systems can be directly applied to the full-scale WWTP.

  20. Software Tool for Automated Failure Modes and Effects Analysis (FMEA) of Hydraulic Systems

    DEFF Research Database (Denmark)

    Stecki, J. S.; Conrad, Finn; Oh, B.

    2002-01-01

    Offshore, marine, aircraft and other complex engineering systems operate in harsh environmental and operational conditions and must meet stringent requirements of reliability, safety and maintainability. To reduce the high costs of development of new systems in these fields, improved design management techniques and a vast array of computer-aided techniques are applied during the design and testing stages. The paper presents and discusses the research and development of a software tool for automated failure mode and effects analysis - FMEA - of hydraulic systems. The paper explains the underlying...

  1. Computation of Internal Fluid Flows in Channels Using the CFD Software Tool FlowVision

    CERN Document Server

    Kochevsky, A N

    2004-01-01

    The article describes the CFD software tool FlowVision (OOO "Tesis", Moscow). The model equations used for this research are the set of Reynolds and continuity equations and the equations of the standard k-ε turbulence model. The aim of the paper was to test FlowVision by comparing the computational results for a number of simple internal channel fluid flows with known experimental data. The test cases are non-swirling and swirling flows in pipes and diffusers, and flows in stationary and rotating bends. Satisfactory correspondence of results was obtained both for flow patterns and the respective quantitative values.

  2. A Tale of Two Cultures: Cross Cultural Comparison in Learning the Prezi Presentation Software Tool in the US and Norway

    Science.gov (United States)

    Brock, Sabra; Brodahl, Cornelia

    2013-01-01

    Presentation software is an important tool for both student and professorial communicators. PowerPoint has been the standard since it was introduced in 1990. However, new "improved" software platforms are emerging. Prezi is one of these, claiming to remedy the linear thinking that underlies PowerPoint by creating one canvas and…

  3. NEuronMOrphological analysis tool: open-source software for quantitative morphometrics

    Directory of Open Access Journals (Sweden)

    Lucia Billeci

    2013-02-01

    Full Text Available Morphometric analysis of neurons and brain tissue is relevant to the study of neuron circuitry development during the first phases of brain growth or for probing the link between microstructural morphology and degenerative diseases. As neural imaging techniques become ever more sophisticated, so does the amount and complexity of data generated. The NEuronMOrphological analysis tool NEMO was purposely developed to handle and process large numbers of optical microscopy image files of neurons in culture or slices in order to automatically run batch routines, store data and apply multivariate classification and feature extraction using 3-way principal component analysis. Here we describe the software's main features, underlining the differences between NEMO and other commercial and non-commercial image processing tools, and show an example of how NEMO can be used to classify neurons from wild-type mice and from animal models of autism.

  4. Verification of visual odometry algorithms with an OpenGL-based software tool

    Science.gov (United States)

    Skulimowski, Piotr; Strumillo, Pawel

    2015-05-01

    We present a software tool called a stereovision egomotion sequence generator that was developed for testing visual odometry (VO) algorithms. Various approaches to single- and multi-camera VO algorithms are reviewed first, and then a reference VO algorithm that has served to demonstrate the program's features is described. The program offers simple tools for defining virtual static three-dimensional scenes and arbitrary six-degrees-of-freedom motion paths within such scenes, and outputs sequences of stereovision images, disparity ground-truth maps, and segmented scene images. A simple script language is proposed that simplifies tests of VO algorithms for user-defined scenarios. The program's capabilities are demonstrated by testing a reference VO technique that employs stereoscopy and feature tracking.

  5. Dynamic Software Updating with Gosh! Current Status and the Road Ahead

    DEFF Research Database (Denmark)

    Gregersen, Allan Raundahl; Rasmussen, Michael; Jørgensen, Bo Nørregaard

    2013-01-01

    Any non-trivial software system has to be upgraded regularly to incorporate bug fixes and security patches or simply to keep up with the inevitable evolution in end-user requirements. Software upgrading is challenging, especially when it comes to online upgrading of running systems. In this paper, we present the current status of Gosh!, a dynamic-software-updating system for Java, which provides comprehensive support for changing class definitions of live objects, including adding, removing and moving fields, methods, classes and interfaces anywhere in the inheritance hierarchy. Prior to the acquisition by zeroturnaround.com, Gosh! was known as Javeleon. In this paper we demonstrate the capabilities of Gosh! by performing a dynamic updating experiment on five consecutive revisions of the classical arcade game Breakout. Based on the result of this experiment we show that dynamic updating of class...

  7. Homology Modeling a Fast Tool for Drug Discovery: Current Perspectives

    Science.gov (United States)

    Vyas, V. K.; Ukawala, R. D.; Ghate, M.; Chintha, C.

    2012-01-01

    A major goal of structural biology involves the formation of protein-ligand complexes, in which the protein molecules act energetically in the course of binding. An understanding of protein-ligand interaction is therefore very important for structure-based drug design. Lack of knowledge of 3D structures has hindered efforts to understand the binding specificities of ligands with proteins. With improvements in modeling software and the growing number of known protein structures, homology modeling is rapidly becoming the method of choice for obtaining 3D coordinates of proteins. Homology modeling exploits the similarity of environmental residues at topologically corresponding positions in reference proteins. In the absence of experimental data, model building on the basis of a known 3D structure of a homologous protein is at present the only reliable method to obtain structural information. Knowledge of the 3D structures of proteins provides invaluable insights into the molecular basis of their functions. Recent advances in homology modeling, particularly in detecting and aligning sequences with template structures, modeling distant homologues, modeling loops and side chains, and detecting errors in a model, have contributed to consistent prediction of protein structure, which was not possible even several years ago. This review focuses on the features and role of homology modeling in predicting protein structure and describes current developments in this field, with successful applications at the different stages of drug design and discovery. PMID:23204616

  8. Software Tool Support to Specify and Verify Scientific Sensor Data Properties to Improve Anomaly Detection

    Science.gov (United States)

    Gallegos, I.; Gates, A. Q.; Tweedie, C.; Cybershare

    2010-12-01

    Advancements in scientific sensor data acquisition technologies, such as wireless sensor networks and robotic trams equipped with sensors, are increasing the amount of data being collected at field sites. This elevates the challenges of verifying the quality of streamed data and monitoring the correct operation of the instrumentation. Without the ability to evaluate the data collection process in near real time, scientists can lose valuable time and data. In addition, scientists have to rely on their knowledge and experience in the field to evaluate data quality. Such knowledge is rarely shared or reused by other scientists, mostly because of the lack of a well-defined methodology and tool support. Numerous scientific projects address anomaly detection, mostly as part of the verification system's source code; however, anomaly-detection properties, which often are embedded or hard-coded in the source code, are difficult to refine. In addition, a software developer is required to modify the source code every time a new anomaly-detection property or a modification to an existing property is needed. This poster describes the tool support that has been developed, based on software engineering techniques, to address these challenges. The overall tool support allows scientists to specify and reuse anomaly-detection properties generated using the specification tool and to use the specified properties to conduct automated anomaly detection in near real time. The anomaly-detection mechanism is independent of the system used to collect the sensor data. With guidance provided by a classification and categorization of anomaly-detection properties, the user specifies properties on scientific sensor data. The properties, which can be associated with particular field sites or instrumentation, document knowledge about data anomalies that otherwise would have limited availability to the scientific community.
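    The poster does not give the property syntax itself, so the following is only a rough Python sketch of the general pattern it describes: declaring reusable anomaly-detection properties and checking a sensor stream against them in near real time. All property names, thresholds, and readings below are invented for illustration.

```python
from dataclasses import dataclass
from typing import Callable, Iterable, List, Optional

@dataclass
class Property:
    """A reusable anomaly-detection property on scalar sensor readings."""
    name: str
    check: Callable[[Optional[float], float], bool]  # (previous, current) -> ok?

def make_range_property(name: str, lo: float, hi: float) -> Property:
    # Flags values outside a physically plausible range [lo, hi].
    return Property(name, lambda prev, cur: lo <= cur <= hi)

def make_spike_property(name: str, max_step: float) -> Property:
    # Flags jumps between consecutive samples larger than max_step.
    return Property(name, lambda prev, cur: prev is None or abs(cur - prev) <= max_step)

def scan(stream: Iterable[float], props: List[Property]) -> None:
    prev = None
    for i, value in enumerate(stream):
        for p in props:
            if not p.check(prev, value):
                print(f"sample {i}: value {value} violates '{p.name}'")
        prev = value

# Hypothetical air-temperature stream with one spike and one out-of-range value.
readings = [21.3, 21.4, 35.9, 21.6, -80.0, 21.5]
scan(readings, [make_range_property("plausible range -40..50 degC", -40, 50),
                make_spike_property("max step 5 degC", 5.0)])
```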

  9. Easyverifier 1.0: a software tool for revising scientific articles’ bibliographical citations

    Directory of Open Access Journals (Sweden)

    Freddy Alberto Correa Riveros

    2010-05-01

    Full Text Available The first academic revolution, which occurred in developed countries during the late 19th century, made research a university function in addition to the traditional task of teaching. A second academic revolution has tried to transform the university into a teaching, research and socio-economic development enterprise. The scientific article has become an excellent practical means for the movement of new knowledge between the university and the socio-economic environment. This work had two purposes. One was to present some general considerations regarding research and the scientific article. The second was to describe a computational tool that supports the revision of scientific articles' citations, a step that is usually done manually and requires some experience. The software reads two text files, one containing the scientific article's content and the other its bibliography. A report is then generated that identifies authors cited in the text but not indexed in the bibliography, and authors listed in the bibliography but never cited in the text of the article. The software allows researchers and journal coordinators to detect reference errors between the citations in the text and the bibliographical references. The software was developed in four steps: analysis, design, implementation and use. The analysis step drew on a review of the literature on the elaboration of citations in scientific documents.
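    The record does not reproduce Easyverifier's algorithm; the following minimal Python sketch merely illustrates the kind of cross-check it describes, under the simplifying assumptions that citations are author-year pairs such as "(Smith, 2008)" and that each bibliography line begins with the author's surname. The example texts are invented.

```python
import re

def cited_authors(article_text: str) -> set:
    """Collect surnames from author-year citations like '(Smith, 2008)'.
    The pattern is a simplification; real citation styles vary widely."""
    return set(re.findall(r"\(([A-Z][a-z]+),\s*\d{4}\)", article_text))

def listed_authors(bibliography: str) -> set:
    """Take the leading surname of each bibliography line, e.g. 'Smith, J. (2008) ...'."""
    return {m.group(1) for line in bibliography.splitlines()
            if (m := re.match(r"\s*([A-Z][a-z]+),", line))}

article = "Prior work (Smith, 2008) and (Jones, 2010) differs from (Brown, 2005)."
bibliography = "Jones, A. (2010) ...\nSmith, J. (2008) ...\nTaylor, P. (1999) ..."

cited, listed = cited_authors(article), listed_authors(bibliography)
print("cited but not listed:", cited - listed)   # {'Brown'}
print("listed but not cited:", listed - cited)   # {'Taylor'}
```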

  10. MyView2, a new visualization software tool for analysis of LHD data

    Energy Technology Data Exchange (ETDEWEB)

    Moon, Chanho, E-mail: moon@nifs.ac.jp; Yoshinuma, Mikirou; Emoto, Masahiko; Ida, Katsumi

    2016-03-15

    The Large Helical Device (LHD) at the National Institute for Fusion Science (NIFS) is the world's largest superconducting helical fusion device, providing a scientific research center for elucidating important physics topics such as plasma transport and turbulence dynamics. Furthermore, many types of advanced diagnostic devices are used to measure the confinement plasma characteristics, and these valuable physical data are registered for the more than 131,000 discharges in the LHD database. However, investigating this wealth of experimental data efficiently has remained difficult. In order to improve the efficiency of investigating plasma physics in LHD, we have developed a new data visualization software, MyView2, which consists of Python-based modules that can be easily set up and updated. MyView2 provides immediate access to experimental results, cross-shot analysis, and a collaboration point for scientific research. In particular, MyView2 is portable, making LHD experimental data viewable on both on-site and off-site web servers, a capability not previously available in any general-use tool. We also discuss the benefits of using the MyView2 software for in-depth analysis of LHD experimental data.

  11. ConfocalCheck - A Software Tool for the Automated Monitoring of Confocal Microscope Performance

    Science.gov (United States)

    Hng, Keng Imm; Dormann, Dirk

    2013-01-01

    Laser scanning confocal microscopy has become an invaluable tool in biomedical research but regular quality testing is vital to maintain the system’s performance for diagnostic and research purposes. Although many methods have been devised over the years to characterise specific aspects of a confocal microscope like measuring the optical point spread function or the field illumination, only very few analysis tools are available. Our aim was to develop a comprehensive quality assurance framework ranging from image acquisition to automated analysis and documentation. We created standardised test data to assess the performance of the lasers, the objective lenses and other key components required for optimum confocal operation. The ConfocalCheck software presented here analyses the data fully automatically. It creates numerous visual outputs indicating potential issues requiring further investigation. By storing results in a web browser compatible file format the software greatly simplifies record keeping allowing the operator to quickly compare old and new data and to spot developing trends. We demonstrate that the systematic monitoring of confocal performance is essential in a core facility environment and how the quantitative measurements obtained can be used for the detailed characterisation of system components as well as for comparisons across multiple instruments. PMID:24224017

  12. ConfocalCheck--a software tool for the automated monitoring of confocal microscope performance.

    Directory of Open Access Journals (Sweden)

    Keng Imm Hng

    Full Text Available Laser scanning confocal microscopy has become an invaluable tool in biomedical research but regular quality testing is vital to maintain the system's performance for diagnostic and research purposes. Although many methods have been devised over the years to characterise specific aspects of a confocal microscope like measuring the optical point spread function or the field illumination, only very few analysis tools are available. Our aim was to develop a comprehensive quality assurance framework ranging from image acquisition to automated analysis and documentation. We created standardised test data to assess the performance of the lasers, the objective lenses and other key components required for optimum confocal operation. The ConfocalCheck software presented here analyses the data fully automatically. It creates numerous visual outputs indicating potential issues requiring further investigation. By storing results in a web browser compatible file format the software greatly simplifies record keeping allowing the operator to quickly compare old and new data and to spot developing trends. We demonstrate that the systematic monitoring of confocal performance is essential in a core facility environment and how the quantitative measurements obtained can be used for the detailed characterisation of system components as well as for comparisons across multiple instruments.

  13. Event visualisation in ATLAS: current software technologies, future prospects and trends

    CERN Document Server

    Bianchi, Riccardo-Maria; The ATLAS collaboration; Moyse, Edward

    2016-01-01

    In the early days, HEP experiments made use of photographic images both to record and store experimental data and to illustrate their findings. As the experiments evolved, they needed to find new ways to visualize their data. With the availability of computer graphics, software packages to display event data and the detector geometry started to be developed. Here a brief history of event displays is presented, with an overview of the different event display tools used today in HEP experiments in general, and in the LHC experiments in particular. Then the case of the ATLAS experiment is considered in more detail and two widely used event display packages are presented, Atlantis and VP1, focusing on the software technologies they employ, as well as their strengths, differences and their usage in the experiment: from physics analysis to detector development, and from online monitoring to outreach and communication. Future development plans and improvements in the ATLAS event display packages will also be discussed,...

  14. Single-tone and Polyharmonic Eddy Current Metal Detection and Non-Destructive Testing Education Software

    Science.gov (United States)

    Svatoš, J.

    2016-11-01

    This paper describes the design of a measuring chain for polyharmonic metal detectors used for education in laboratory exercises at the Czech Technical University in Prague, Faculty of Electrical Engineering, Department of Measurement. The measuring chain is composed of a DDS signal generator, a digitiser, and a PC running software programmed in LabVIEW. Eddy current principles, and more specifically eddy current metal detectors, are an important part of nondestructive testing, instrumentation, and measurement. A short introduction to the background and principles of eddy current metal detectors is presented. The next part of the article briefly describes the most common methods, as well as non-traditional polyharmonic methods, for eddy current metal detection. The following part contains an implementation of the proposed algorithms in the LabVIEW graphical programming language. Finally, the program created for teaching eddy current metal detection and the results obtained with the ATMID metal detector are discussed.
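    The article's LabVIEW code is not reproduced in the record; as a rough Python sketch of the polyharmonic idea, several excitation tones are emitted simultaneously and the per-tone response amplitudes are recovered from an FFT. The sample rate, tone frequencies, and attenuation values below are invented for illustration.

```python
import numpy as np

fs = 100_000                   # sample rate in Hz (arbitrary choice)
t = np.arange(0, 0.05, 1 / fs)
tones = [1_000, 3_000, 7_000]  # excitation frequencies in Hz (illustrative)

# Polyharmonic excitation: several tones emitted at once.
excitation = sum(np.sin(2 * np.pi * f * t) for f in tones)

# Stand-in for the coil response: each tone attenuated and phase-shifted
# differently, as a nearby metal object would do; values are invented.
response = sum(a * np.sin(2 * np.pi * f * t + p)
               for f, a, p in zip(tones, [0.9, 0.6, 0.3], [0.1, 0.4, 0.8]))

# Recover the per-tone amplitude of the response from its FFT.
spectrum = np.fft.rfft(response) / (len(t) / 2)   # scale bins to amplitude
freqs = np.fft.rfftfreq(len(t), 1 / fs)
for f in tones:
    k = np.argmin(np.abs(freqs - f))              # nearest FFT bin
    print(f"{f} Hz: amplitude ~ {np.abs(spectrum[k]):.2f}")
```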

  15. Proofreading Using an Assistive Software Homophone Tool: Compensatory and Remedial Effects on the Literacy Skills of Students with Reading Difficulties

    Science.gov (United States)

    Lange, Alissa A.; Mulhern, Gerry; Wylie, Judith

    2009-01-01

    The present study investigated the effects of using an assistive software homophone tool on the assisted proofreading performance and unassisted basic skills of secondary-level students with reading difficulties. Students aged 13 to 15 years proofread passages for homophonic errors under three conditions: with the homophone tool, with homophones…

  16. Quality-driven multi-objective optimization of software architecture design : method, tool, and application

    NARCIS (Netherlands)

    Etemadi Idgahi (Etemaadi), Ramin

    2014-01-01

    Software architecting is a non-trivial and demanding task for software engineers to perform. The architecture is a key enabler for software systems. Besides being crucial for user functionality, the software architecture has a deep impact on software qualities such as performance, safety, and cost.

  17. Genoviz Software Development Kit: Java tool kit for building genomics visualization applications

    Directory of Open Access Journals (Sweden)

    Chervitz Stephen A

    2009-08-01

    Full Text Available Abstract Background Visualization software can expose previously undiscovered patterns in genomic data and advance biological science. Results The Genoviz Software Development Kit (SDK) is an open source, Java-based framework designed for rapid assembly of visualization software applications for genomics. The Genoviz SDK framework provides a mechanism for incorporating adaptive, dynamic zooming into applications, a desirable feature of genome viewers. Visualization capabilities of the Genoviz SDK include automated layout of features along genetic or genomic axes; support for user interactions with graphical elements (Glyphs) in a map; a variety of Glyph sub-classes that promote experimentation with new ways of representing data in graphical formats; and support for adaptive, semantic zooming, whereby objects change their appearance depending on zoom level and zooming rate adapts to the current scale. Freely available demonstration and production quality applications, including the Integrated Genome Browser, illustrate Genoviz SDK capabilities. Conclusion Separation between graphics components and genomic data models makes it easy for developers to add visualization capability to pre-existing applications or build new applications using third-party data models. Source code, documentation, sample applications, and tutorials are available at http://genoviz.sourceforge.net/.

  18. Biogem: an effective tool based approach for scaling up open source software development in bioinformatics

    NARCIS (Netherlands)

    Bonnal, R.J.P.; Smant, G.; Prins, J.C.P.

    2012-01-01

    Biogem provides a software development environment for the Ruby programming language, which encourages community-based software development for bioinformatics while lowering the barrier to entry and encouraging best practices. Biogem, with its targeted modular and decentralized approach, software ge

  19. JCoast - a biologist-centric software tool for data mining and comparison of prokaryotic (meta)genomes.

    Science.gov (United States)

    Richter, Michael; Lombardot, Thierry; Kostadinov, Ivaylo; Kottmann, Renzo; Duhaime, Melissa Beth; Peplies, Jörg; Glöckner, Frank Oliver

    2008-04-01

    Current sequencing technologies give access to sequence information for genomes and metagenomes at a tremendous speed. Subsequent data processing is mainly performed by automatic pipelines provided by the sequencing centers. Although standardised workflows are desirable and useful in many respects, rational data mining, comparative genomics, and especially the interpretation of the sequence information in the biological context demand intuitive, flexible, and extendable solutions. The JCoast software tool was primarily designed to analyse and compare (meta)genome sequences of prokaryotes. Based on a pre-computed GenDB database project, JCoast offers a flexible graphical user interface (GUI), as well as an application programming interface (API) that facilitates back-end data access. JCoast offers individual, cross-genome, and metagenome analysis, and assists the biologist in the exploration of large and complex datasets. JCoast combines all functions required for the mining, annotation, and interpretation of (meta)genomic data. The lightweight software solution allows the user to easily take advantage of advanced back-end database structures by providing a programming and graphical user interface to answer biological questions. JCoast is available at the project homepage.

  1. Proceedings of the Ninth Annual Software Engineering Workshop

    Science.gov (United States)

    1984-01-01

    Experiences in measurement, utilization, and evaluation of software methodologies, models, and tools are discussed. NASA's involvement in ever larger and more complex systems, like the space station project, provides a motive for the support of software engineering research and the exchange of ideas in such forums. The topics of current SEL research are software error studies, experiments with software development, and software tools.

  2. A software tool to evaluate crystal types and morphological developments of accessory zircon

    Science.gov (United States)

    Sturm, Robert

    2014-08-01

    Computer programs for the appropriate visualization of crystal types and morphological developments of accessory zircon have hitherto been unavailable. Usually, typological computations are conducted using simple calculation tools or spreadsheet programs. In practice, however, high numbers of data sets including information on numerous zircon populations have to be processed and stored. The paper describes the software ZIRCTYP, a macro-driven program within the Microsoft Access database management system. It allows the computation of zircon morphologies occurring in specific rock samples and their presentation in typology diagrams. In addition, morphological developments within a given zircon population are presented (1) statistically and (2) graphically as crystal sequences showing initial, intermediate, and final growth stages.

  3. BioBrick assembly standards and techniques and associated software tools.

    Science.gov (United States)

    Røkke, Gunvor; Korvald, Eirin; Pahr, Jarle; Oyås, Ove; Lale, Rahmi

    2014-01-01

    The BioBrick idea was developed to introduce the engineering principles of abstraction and standardization into synthetic biology. BioBricks are DNA sequences that serve a defined biological function and can be readily assembled with any other BioBrick parts to create new BioBricks with novel properties. In order to achieve this, several assembly standards can be used. Which assembly standards a BioBrick is compatible with depends on the prefix and suffix sequences surrounding the part. In this chapter, five of the most common assembly standards are described, as well as some of the most used assembly techniques and cloning procedures, together with a presentation of the available software tools that can be used for deciding on the best method for assembling different BioBricks and for searching for BioBrick parts in the Registry of Standard Biological Parts database.

  4. Development and validation of evolutionary algorithm software as an optimization tool for biological and environmental applications.

    Science.gov (United States)

    Sys, K; Boon, N; Verstraete, W

    2004-06-01

    A flexible, extendable tool for the optimization of (micro)biological processes and protocols using evolutionary algorithms was developed. It has been tested on three different theoretical optimization problems: two two-dimensional problems, one with three maxima and one with five maxima, and a river autopurification optimization problem with boundary conditions. For each problem, different evolutionary parameter settings were used for the optimization. For each combination of evolutionary parameters, 15 generations were run 20 times. In all cases, the evolutionary algorithm gave rise to valuable results. Generally, the algorithms were able to detect the more stable sub-maximum even when less stable maxima existed; from a practical point of view, the former is generally more desirable. The most important factors influencing the convergence process were the parameter value randomization rate and distribution. The software described in this work is available for free.
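    The paper's own implementation is not shown in the record; the following is a minimal Python sketch of a generational evolutionary algorithm on an invented two-dimensional landscape with two maxima, with the mutation ("randomization") rate exposed as the tunable parameter the study found most influential.

```python
import math
import random

def fitness(x: float, y: float) -> float:
    # Illustrative two-dimensional landscape: a global maximum near (2, 2)
    # and a lower local maximum near (-1, -1).
    return (math.exp(-((x - 2) ** 2 + (y - 2) ** 2))
            + 0.8 * math.exp(-((x + 1) ** 2 + (y + 1) ** 2)))

def evolve(pop_size=30, generations=15, mutation_rate=0.3, sigma=0.5):
    pop = [(random.uniform(-4, 4), random.uniform(-4, 4)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: fitness(*ind), reverse=True)
        parents = pop[: pop_size // 2]              # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            x, y = random.choice(parents)
            if random.random() < mutation_rate:     # the randomization rate
                x += random.gauss(0, sigma)
                y += random.gauss(0, sigma)
            children.append((x, y))
        pop = parents + children
    best = max(pop, key=lambda ind: fitness(*ind))
    return best, fitness(*best)

random.seed(1)
print(evolve())   # converges near the global maximum at (2, 2)
```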

  5. A TAXONOMY FOR TOOLS, PROCESSES AND LANGUAGES IN AUTOMOTIVE SOFTWARE ENGINEERING

    Directory of Open Access Journals (Sweden)

    Florian Bock

    2016-01-01

    Full Text Available Within the growing domain of software engineering in the automotive sector, the number of tools, processes, methods and languages in use has increased distinctly in the past years. To be able to choose proper methods for particular development use cases, factors like the intended use, key features and possible limitations have to be evaluated. This requires a taxonomy that aids the decision making. An analysis of the main existing taxonomies revealed two major deficiencies: the lack of an automotive focus and the limitation to particular engineering method types. To address this, a graphical taxonomy is proposed, based on two well-established engineering approaches and enriched with additional classification information. It provides a self-evident and self-explanatory overview and comparison technique for engineering methods in the automotive domain. The taxonomy is applied to common automotive engineering methods. The resulting diagram classifies each method and enables the reader to select appropriate solutions for given project requirements.

  6. A browsing tool for the Internet Logical Library of the HPCC Software Exchange

    Science.gov (United States)

    Biro, Ross

    1993-01-01

    As the quantity of information available on the Internet grows, locating a particular piece of information becomes more difficult. One possible solution is for a database of pointers to all available information to be maintained at a central site. Subject classifications for all the information could also be maintained in order to make searching possible. This paper describes one possible method of searching such an index. In particular a prototype browsing tool has been created using TCL/TK to demonstrate several possible features: rapidly scanning at any rank of the index, narrowing the index to any scope, regular-expression searching, and creation of a list of pointers answering to any set of index terms. The prototype browser is an easy-to-use independent X application designed for use in the Catalog of Repositories of the HPCC (High Performance Computing and Communications) Software Exchange.

  7. Evaluating a digital ship design tool prototype: Designers' perceptions of novel ergonomics software.

    Science.gov (United States)

    Mallam, Steven C; Lundh, Monica; MacKinnon, Scott N

    2017-03-01

    Computer-aided solutions are essential for naval architects to manage and optimize technical complexities when developing a ship's design. Although there is an array of software solutions aimed at optimizing the human element in design, practical ergonomics methodologies and technological solutions have struggled to gain widespread application in ship design processes. This paper explores how a new ergonomics technology is perceived by naval architecture students using a mixed-methods framework. Thirteen Naval Architecture and Ocean Engineering Masters students participated in the study. Overall, participants perceived the software and its embedded ergonomics tools to benefit their design work, increasing their empathy and ability to understand the work environment and work demands end-users face. However, participants questioned whether ergonomics could be practically and efficiently implemented under real-world project constraints. This revealed underlying social biases and a fundamental lack of understanding among engineering postgraduate students regarding applied ergonomics in naval architecture. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Introduction of software tools for epidemiological surveillance in infection control in Colombia

    Science.gov (United States)

    Motoa, Gabriel; Vallejo, Marta; Blanco, Víctor M; Correa, Adriana; de la Cadena, Elsa; Villegas, María Virginia

    2015-01-01

    Introduction: Healthcare-Associated Infections (HAI) are a challenge for patient safety in hospitals. Infection control committees (ICC) should follow CDC definitions when monitoring HAI. Manual methods of epidemiological surveillance (ES) may affect the sensitivity and specificity of the monitoring system, while electronic surveillance can improve the performance, quality and traceability of recorded information. Objective: To assess the implementation of a strategy for electronic surveillance of HAI, bacterial resistance and antimicrobial consumption by the ICC of 23 high-complexity clinics and hospitals in Colombia, during the period 2012-2013. Methods: An observational study evaluating the introduction of electronic tools in the ICC was performed; we evaluated the structure and operation of the ICC, the degree of incorporation of the software HAI Solutions and the adherence to recording the required information. Results: Thirty-eight percent of hospitals (8/23) had active surveillance strategies with standard CDC criteria, and 87% of institutions adhered to the case-identification module of the HAI Solutions software. In contrast, compliance with recording the risk factors for device-associated HAIs was 33%. Conclusions: The introduction of ES could achieve greater adherence to an active, standardized and prospective surveillance model, helping to improve the validity and quality of the recorded information. PMID:26309340

  9. PREDICTION OF SMARTPHONES’ PERCEIVED IMAGE QUALITY USING SOFTWARE EVALUATION TOOL VIQET

    Directory of Open Access Journals (Sweden)

    Pinchas ZOREA

    2016-12-01

    Full Text Available A great deal of resources and effort has been invested in recent years in assessing how smartphone users perceive image quality. Unfortunately, only limited success has been achieved, and image quality assessment is still based on many human visual tests. The paper describes a new model of perceived quality based on human visual tests compared with image analysis by the software application tool. The values of the parameters of perceived image quality (brightness, contrast, color saturation and sharpness) were calibrated based on results from human visual experiments.

  10. A Software Tool for Optimal Sizing of PV Systems in Malaysia

    Directory of Open Access Journals (Sweden)

    Tamer Khatib

    2012-01-01

    Full Text Available This paper presents a MATLAB-based, user-friendly software tool called PV.MY for the optimal sizing of photovoltaic (PV) systems. The software has the capabilities of predicting meteorological variables such as solar energy, ambient temperature and wind speed using an artificial neural network (ANN); it optimizes the PV module/array tilt angle, optimizes the inverter size, and calculates optimal capacities of the PV array, battery, wind turbine and diesel generator in hybrid PV systems. The ANN-based model for meteorological prediction uses four variables, namely sunshine ratio, day number and location coordinates. As for PV system sizing, iterative methods are used for determining the optimal sizing of three types of PV systems: a standalone PV system, a hybrid PV/wind system and a hybrid PV/diesel generator system. The loss of load probability (LLP) technique is used for the optimization, in which the energy source capacities are the variables to be optimized subject to a very low LLP. As for determining the optimal PV panel tilt angle and inverter size, the Liu and Jordan model for solar energy incident on a tilted surface is used in optimizing the monthly tilt angle, while a model of the inverter efficiency curve is used in the optimization of inverter size.
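    PV.MY itself is MATLAB-based and its source is not given in the record; the iterative LLP sizing idea can nevertheless be sketched generically in Python: enlarge the array until the simulated loss of load probability falls below a target. The hourly profiles, battery size, and thresholds below are invented for illustration.

```python
def loss_of_load_probability(pv_kw, battery_kwh, solar_kwh_per_kw, load_kwh):
    """Simulate an hourly energy balance over a year; LLP = unmet / total demand."""
    soc, unmet = battery_kwh, 0.0                # start with a full battery
    for sun, load in zip(solar_kwh_per_kw, load_kwh):
        soc += pv_kw * sun - load
        soc = min(soc, battery_kwh)              # battery cannot overcharge
        if soc < 0:                              # demand not covered this hour
            unmet += -soc
            soc = 0.0
    return unmet / sum(load_kwh)

# Invented daily profiles (24 hourly values each), repeated over a year.
sun_day = [0]*6 + [.1, .3, .5, .6, .7, .7, .6, .5, .3, .1] + [0]*8
load_day = [0.4] * 24
sun_year, load_year = sun_day * 365, load_day * 365

pv_kw, target_llp = 1.0, 0.01
while loss_of_load_probability(pv_kw, 10.0, sun_year, load_year) > target_llp:
    pv_kw += 0.1                                 # enlarge the array and retry
print(f"smallest array meeting LLP <= {target_llp}: {pv_kw:.1f} kW")
```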

  11. RadNotes: a novel software development tool for radiology education.

    Science.gov (United States)

    Baxter, A B; Klein, J S; Oesterle, E V

    1997-01-01

    RadNotes is a novel software development tool that enables physicians to develop teaching materials incorporating text and images in an intelligent, highly usable format. Projects undertaken in the RadNotes environment require neither programming expertise nor the assistance of a software engineer. The first of these projects, Thoracic Imaging, integrates image teaching files, concise disease and topic summaries, references, and flash card quizzes into a single program designed to provide an overview of chest radiology. RadNotes is intended to support the academic goals of teaching radiologists by enabling authors to create, edit, and electronically distribute image-oriented presentations. RadNotes also supports the educational goals of physicians who wish to quickly review selected imaging topics, as well as to develop a visual vocabulary of corresponding radiologic anatomy and pathologic conditions. Although Thoracic Imaging was developed with the aim of introducing chest radiology to residents, RadNotes can be used to develop tutorials and image-based tests for all levels; create corresponding World Wide Web sites; and organize notes, images, and references for individual use.

  12. Neurophysiological analytics for all! Free open-source software tools for documenting, analyzing, visualizing, and sharing using electronic notebooks.

    Science.gov (United States)

    Rosenberg, David M; Horn, Charles C

    2016-08-01

    Neurophysiology requires an extensive workflow of information analysis routines, which often includes incompatible proprietary software, introducing limitations based on financial costs, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format, such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to be displayed locally or remotely in an internet browser, which can be saved as an HTML, PDF, or other file format for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus, a model system to investigate gut-brain communication, for example in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software, an implementation of reproducible research. This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase efficiency, transparency, and the collaborative potential of neurophysiological research.
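    As a flavor of the analysis cells such a notebook might contain, here is a minimal threshold-based spike detector in NumPy; the synthetic trace and the 4.5-sigma MAD threshold rule are illustrative choices, not the paper's exact pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 20_000                                  # sampling rate in Hz (illustrative)
trace = rng.normal(0, 1, fs)                 # 1 s of synthetic baseline noise
spike_times = [2000, 7500, 13000]
for s in spike_times:                        # add stereotyped negative spikes
    trace[s:s + 20] -= np.hanning(20) * 12

# Common rule of thumb: threshold at k times a robust (MAD-based) noise estimate.
sigma = np.median(np.abs(trace)) / 0.6745
threshold = -4.5 * sigma

below = trace < threshold
crossings = np.flatnonzero(below[1:] & ~below[:-1]) + 1   # downward crossings
print("detected spike onsets (samples):", crossings)
print("true spike insertions:          ", spike_times)
```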

  13. Software Tools for the Analysis of the Photocathode Response of Photomultiplier Vacuum Tubes

    CERN Document Server

    Fabbri, R

    2013-01-01

    The Central Institute of Electronics (ZEA-2) at the Forschungszentrum Juelich (FZJ) has developed a system to scan the response of the photocathode of photomultiplier tubes (PMT). The PMT sits on a supporting structure, while a blue light-emitting diode is moved along its surface by two stepper motors, spanning both the x and y coordinates. The whole system is housed in a light-tight wooden box. Graphical software was developed in situ to perform the scan operations under different configurations (e.g., the step size of the scan and the number of measurements per point). During each point measurement the current output generated in the vacuum photomultiplier is processed in sequence by a pre-amplifier (mainly to convert the current signal into a voltage signal), an amplifier, and an ADC module (typically a CAEN N957). The measurement information is saved to files at the end of the scan. Recently, software based on CERN ROOT and on the Qt libraries was developed to help the user anal...

  14. Techniques and software tools for estimating ultrasonic signal-to-noise ratios

    Science.gov (United States)

    Chiou, Chien-Ping; Margetan, Frank J.; McKillip, Matthew; Engle, Brady J.; Roberts, Ronald A.

    2016-02-01

    At Iowa State University's Center for Nondestructive Evaluation (ISU CNDE), the use of models to simulate ultrasonic inspections has played a key role in R&D efforts for over 30 years. To this end a series of wave propagation models, flaw response models, and microstructural backscatter models have been developed to address inspection problems of interest. One use of the combined models is the estimation of signal-to-noise ratios (S/N) in circumstances where backscatter from the microstructure (grain noise) acts to mask sonic echoes from internal defects. Such S/N models have been used in the past to address questions of inspection optimization and reliability. Under the sponsorship of the National Science Foundation's Industry/University Cooperative Research Center at ISU, an effort was recently initiated to improve existing research-grade software by adding a graphical user interface (GUI), turning it into a user-friendly tool for the rapid estimation of S/N for ultrasonic inspections of metals. The software combines: (1) a Python-based GUI for specifying an inspection scenario and displaying results; and (2) a Fortran-based engine for computing defect signal and backscattered grain noise characteristics. The latter makes use of several models including: the Multi-Gaussian Beam Model for computing sonic fields radiated by commercial transducers; the Thompson-Gray Model for the response from an internal defect; the Independent Scatterer Model for backscattered grain noise; and the Stanke-Kino Unified Model for attenuation. The initial emphasis was on reformulating the research-grade code into a suitable modular form, adding the graphical user interface and performing computations rapidly and robustly. Thus the initial inspection problem being addressed is relatively simple. A normal-incidence pulse/echo immersion inspection is simulated for a curved metal component having a non-uniform microstructure, specifically an equiaxed, untextured microstructure in which the average
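    The CNDE models themselves are not public in the record, but the headline quantity is easy to illustrate. A common gated definition of S/N, the peak rectified defect-signal amplitude divided by the RMS grain noise in a defect-free gate, can be computed on synthetic waveforms as below; all signal parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(42)
fs = 100e6                                   # 100 MHz digitizer (illustrative)
t = np.arange(0, 20e-6, 1 / fs)              # 20 us inspection record

# Synthetic grain noise plus a 5 MHz tone-burst echo from a defect at 8 us.
noise = rng.normal(0, 0.05, t.size)
echo = np.exp(-((t - 8e-6) / 0.4e-6) ** 2) * np.sin(2 * np.pi * 5e6 * t)
signal = noise + 0.4 * echo

# Peak rectified amplitude in the defect gate over RMS noise in a clean gate.
defect_gate = (t > 7e-6) & (t < 9e-6)
noise_gate = (t > 12e-6) & (t < 18e-6)
snr = np.max(np.abs(signal[defect_gate])) / np.sqrt(np.mean(signal[noise_gate] ** 2))
print(f"estimated S/N ~ {snr:.1f}")
```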

  15. Software development for dynamic positron emission tomography: Dynamic image analysis (DIA) tool

    Energy Technology Data Exchange (ETDEWEB)

    Pyeon, Do Yeong; Jung, Young Jin [Dongseo University, Busan (Korea, Republic of); Kim, Jung Su [Dept. of Radiological Science, Dongnam Health University, Suwon (Korea, Republic of)

    2016-09-15

    Positron emission tomography (PET) is a nuclear medicine test in which compounds labeled with a radioactive isotope are injected into the body to quantitatively measure metabolic rates. In particular, the increased glucose metabolism of cancer tissue, measured using 18F-FDG (fluorodeoxyglucose), is widely exploited in cancer diagnosis. Numerous studies have also reported its high utility in the modern diagnosis of brain diseases such as dementia and Parkinson's disease. Using dynamic PET images, which add time information to the static information provided for diagnosis, can increase diagnostic accuracy. For this reason, dynamic PET (dPET) has attracted great attention from clinical researchers, but tools to conduct such research are lacking, and the complex mathematical algorithms and programming skills required have hindered wider adoption. In this study, in order to make dPET research easy and accessible, we developed GUI-based software. In the future, the use of the DIA-Tool by many clinical researchers is expected to be of great help to dPET research.

  16. A software tool to assess uncertainty in transient-storage model parameters using Monte Carlo simulations

    Science.gov (United States)

    Ward, Adam S.; Kelleher, Christa A.; Mason, Seth J. K.; Wagener, Thorsten; McIntyre, Neil; McGlynn, Brian L.; Runkel, Robert L.; Payn, Robert A.

    2017-01-01

    Researchers and practitioners alike often need to understand and characterize how water and solutes move through a stream in terms of the relative importance of in-stream and near-stream storage and transport processes. In-channel and subsurface storage processes are highly variable in space and time and difficult to measure. Storage estimates are commonly obtained using transient-storage models (TSMs) of the experimentally obtained solute-tracer test data. The TSM equations represent key transport and storage processes with a suite of numerical parameters. Parameter values are estimated via inverse modeling, in which parameter values are iteratively changed until model simulations closely match observed solute-tracer data. Several investigators have shown that TSM parameter estimates can be highly uncertain. When this is the case, parameter values cannot be used reliably to interpret stream-reach functioning. However, authors of most TSM studies do not evaluate or report parameter certainty. Here, we present a software tool linked to the One-dimensional Transport with Inflow and Storage (OTIS) model that enables researchers to conduct uncertainty analyses via Monte Carlo parameter sampling and to visualize uncertainty and sensitivity results. We demonstrate application of our tool to two case studies and compare our results to output obtained from a more traditional implementation of the OTIS model. We conclude by suggesting best practices for transient-storage modeling and recommend that future applications of TSMs include assessments of parameter certainty to support comparisons and more reliable interpretations of transport processes.
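    OTIS is a Fortran model and the authors' tool is not reproduced here; the Monte Carlo idea it wraps around the model can be sketched with a stand-in one-parameter forward model: sample parameters from a prior range, score each run against the data, and inspect the spread of the best-fitting fraction. The decay model, ranges, and noise level below are invented.

```python
import numpy as np

rng = np.random.default_rng(7)

def forward_model(k, t):
    """Stand-in for the transport model: first-order tracer decay."""
    return np.exp(-k * t)

# Synthetic 'observed' tracer data generated with k = 0.5 plus noise.
t_obs = np.linspace(0, 10, 25)
obs = forward_model(0.5, t_obs) + rng.normal(0, 0.02, t_obs.size)

# Monte Carlo sampling: draw parameters from a broad prior range,
# score each with RMSE, and keep the best 5% of parameter sets.
k_samples = rng.uniform(0.01, 2.0, 10_000)
rmse = np.array([np.sqrt(np.mean((forward_model(k, t_obs) - obs) ** 2))
                 for k in k_samples])
best = k_samples[rmse <= np.quantile(rmse, 0.05)]
print(f"best 5% of samples: k in [{best.min():.3f}, {best.max():.3f}]")
# A narrow interval means the data constrain k well; a wide one flags the
# kind of parameter uncertainty the authors urge modelers to report.
```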

  17. SHAPA: An interactive software tool for protocol analysis applied to aircrew communications and workload

    Science.gov (United States)

    James, Jeffrey M.; Sanderson, Penelope M.; Seidler, Karen S.

    1990-01-01

    As modern transport environments become increasingly complex, issues such as crew communication, interaction with automation, and workload management have become crucial. Much research is focused on holistic aspects of social and cognitive behavior, such as the strategies used to handle workload, the flow of information, the scheduling of tasks, and the verbal and non-verbal interactions between crew members. Traditional laboratory performance measures no longer sufficiently meet the needs of researchers addressing these issues. However, observational techniques are better equipped to capture the type of data needed and to build models of the requisite level of sophistication. Presented here is SHAPA, an interactive software tool for performing both verbal and non-verbal protocol analysis. It has been developed with the idea of affording researchers the closest possible degree of engagement with protocol data. The researcher can configure SHAPA to encode protocols using any theoretical framework or encoding vocabulary that is desired. SHAPA allows protocol analysis to be performed at any level of analysis, and it supplies a wide variety of tools for data aggregation and manipulation. The output generated by SHAPA can be used alone or in combination with other performance variables to build a rich picture of the influences on sequences of verbal or non-verbal behavior.

  18. Effectiveness of Crown Preparation Assessment Software As an Educational Tool in Simulation Clinic: A Pilot Study.

    Science.gov (United States)

    Tiu, Janine; Cheng, Enxin; Hung, Tzu-Chiao; Yu, Chuan-Chia; Lin, Tony; Schwass, Don; Al-Amleh, Basil

    2016-08-01

    The aim of this pilot study was to evaluate the feasibility of a new tooth-preparation assessment software tool, Preppr, as an educational aid for dental students in achieving optimal parameters for a crown preparation. In February 2015, 30 dental students in their fourth year of a five-year undergraduate dental curriculum in New Zealand were randomly selected from a pool of volunteers (N=40) out of the total class of 85. The participants were placed into one of three groups of ten students each: Group A, the control group, received only written and pictorial instructions; Group B received tutor evaluation and feedback; and Group C performed self-directed learning with the aid of Preppr. Each student was asked to prepare an all-ceramic crown on the lower first molar typodont within three hours and to repeat the exercise three times over the next four weeks. The exercise stipulated a 1 mm finish line dimension and total occlusal convergence (TOC) angles between 10 and 20 degrees. Fulfillment of these parameters was taken as an acceptable preparation. The results showed that Group C had the highest percentage of students who achieved minimum finish line dimensions and acceptable TOC angles. Those students also achieved the stipulated requirements earlier than the other groups. This study's findings provide promising data on the feasibility of using Preppr as a self-directed educational tool for students training to prepare dental crowns.

  19. PentaPlot: A software tool for the illustration of genome mosaicism

    Directory of Open Access Journals (Sweden)

    Zhaxybayeva Olga

    2005-06-01

    Full Text Available Abstract Background Dekapentagonal maps depict the phylogenetic relationships of five genomes in a visually appealing diagram and can be viewed as an alternative to a single evolutionary consensus tree. In particular, the generated maps focus attention on those gene families that significantly deviate from the consensus or plurality phylogeny. PentaPlot is a software tool that computes such dekapentagonal maps given an appropriate probability support matrix. Results The visualization with dekapentagonal maps critically depends on the optimal layout of unrooted tree topologies representing different evolutionary relationships among five organisms along the vertices of the dekapentagon. This is a difficult optimization problem given the large number of possible layouts. At its core our tool utilizes a genetic algorithm with demes and a local search strategy to search for the optimal layout. The hybrid genetic algorithm performs satisfactorily even in those cases where the chosen genomes are so divergent that little phylogenetic information has survived in the individual gene families. Conclusion PentaPlot is being made publicly available as an open source project at http://pentaplot.sourceforge.net.

  20. GraphCrunch 2: Software tool for network modeling, alignment and clustering

    Directory of Open Access Journals (Sweden)

    Hayes Wayne

    2011-01-01

    Full Text Available Abstract Background Recent advancements in experimental biotechnology have produced large amounts of protein-protein interaction (PPI) data. The topology of PPI networks is believed to have a strong link to their function. Hence, the abundance of PPI data for many organisms stimulates the development of computational techniques for the modeling, comparison, alignment, and clustering of networks. In addition, finding representative models for PPI networks will improve our understanding of the cell just as a model of gravity has helped us understand planetary motion. To decide if a model is representative, we need quantitative comparisons of model networks to real ones. However, exact network comparison is computationally intractable and therefore several heuristics have been used instead. Some of these heuristics are easily computable "network properties," such as the degree distribution, or the clustering coefficient. An important special case of network comparison is the network alignment problem. Analogous to sequence alignment, this problem asks to find the "best" mapping between regions in two networks. It is expected that network alignment might have as strong an impact on our understanding of biology as sequence alignment has had. Topology-based clustering of nodes in PPI networks is another example of an important network analysis problem that can uncover relationships between interaction patterns and phenotype. Results We introduce the GraphCrunch 2 software tool, which addresses these problems. It is a significant extension of GraphCrunch which implements the most popular random network models and compares them with the data networks with respect to many network properties. Also, GraphCrunch 2 implements the GRAph ALigner algorithm ("GRAAL") for purely topological network alignment. GRAAL can align any pair of networks and exposes large, dense, contiguous regions of topological and functional similarities far larger than any other
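    Of the "easily computable network properties" the record mentions, the clustering coefficient is compact enough to show directly; a dependency-free Python sketch on a toy graph follows (the graph is invented for illustration).

```python
from itertools import combinations

def clustering_coefficient(adj: dict) -> dict:
    """Local clustering coefficient per node for an undirected graph
    given as {node: set(neighbours)}."""
    cc = {}
    for v, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            cc[v] = 0.0          # fewer than 2 neighbours: no triangles possible
            continue
        # Count edges among the neighbours of v.
        links = sum(1 for a, b in combinations(nbrs, 2) if b in adj[a])
        cc[v] = 2 * links / (k * (k - 1))
    return cc

# Toy PPI-like network: a triangle (A, B, C) with a pendant node D.
adj = {"A": {"B", "C"}, "B": {"A", "C"}, "C": {"A", "B", "D"}, "D": {"C"}}
print(clustering_coefficient(adj))  # A and B: 1.0, C: 1/3, D: 0.0
```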

  1. Upgrading the Interface and Developer Tools of the Trigger Supervisor Software Framework of the CMS experiment at CERN

    CERN Document Server

    AUTHOR|(CDS)2097518; Karsmakers, Peter

    The Compact Muon Solenoid (CMS) Trigger Supervisor (TS) is a software framework that has been designed to handle the CMS Level-1 trigger setup, configuration and monitoring during data taking, as well as all communications with the main run control of CMS. The interface consists of a web-based GUI rendered by a back-end C++ framework (AjaXell) and a front-end JavaScript framework (Dojo). These provide developers with the tools they need to write their own custom control panels. However, there is currently much frustration with this framework, given the age of the Dojo library and the various hacks needed to implement modern use cases. The task at hand is to renew this library and its developer tools, updating it to use the newest standards and technologies, while maintaining full compatibility with legacy code. This document describes the requirements, development process, and changes to this framework that were included in the upgrade from v2.x to v3.x. Keywords: CERN, CMS, L1 Trigger, C++, Polymer, Web Com...

  2. ImaSim, a software tool for basic education of medical x-ray imaging in radiotherapy and radiology

    Directory of Open Access Journals (Sweden)

    Guillaume Landry

    2013-11-01

    Full Text Available Introduction: X-ray imaging is an important part of medicine and plays a crucial role in radiotherapy. Education in this field is mostly limited to textbook teaching due to equipment restrictions. A novel simulation tool, ImaSim, for teaching the fundamentals of the x-ray imaging process based on ray-tracing is presented in this work. ImaSim is used interactively via a graphical user interface (GUI). Materials and methods: The software package covers the main x-ray based medical modalities: planar kilo voltage (kV), planar (portal) mega voltage (MV), fan beam computed tomography (CT) and cone beam CT (CBCT) imaging. The user can modify the photon source, object to be imaged and imaging setup with three-dimensional editors. Objects are currently obtained by combining blocks with variable shapes. The imaging of three-dimensional voxelized geometries is currently not implemented, but can be added in a later release. The program follows a ray-tracing approach, ignoring photon scatter in its current implementation. Simulations of a phantom CT scan were generated in ImaSim and were compared to measured data in terms of CT number accuracy. Spatial variations in the photon fluence and mean energy from an x-ray tube caused by the heel effect were estimated from ImaSim and Monte Carlo simulations and compared. Results: In this paper we describe ImaSim and provide two examples of its capabilities. CT numbers were found to agree within 36 Hounsfield Units (HU) for bone, which corresponds to a 2% attenuation coefficient difference. ImaSim reproduced the heel effect reasonably well when compared to Monte Carlo simulations. Discussion: An x-ray imaging simulation tool is made available for teaching and research purposes. ImaSim provides a means to facilitate the teaching of medical x-ray imaging.

  3. ImaSim, a software tool for basic education of medical x-ray imaging in radiotherapy and radiology

    Science.gov (United States)

    Landry, Guillaume; deBlois, François; Verhaegen, Frank

    2013-11-01

    Introduction: X-ray imaging is an important part of medicine and plays a crucial role in radiotherapy. Education in this field is mostly limited to textbook teaching due to equipment restrictions. A novel simulation tool, ImaSim, for teaching the fundamentals of the x-ray imaging process based on ray-tracing is presented in this work. ImaSim is used interactively via a graphical user interface (GUI). Materials and methods: The software package covers the main x-ray based medical modalities: planar kilo voltage (kV), planar (portal) mega voltage (MV), fan beam computed tomography (CT) and cone beam CT (CBCT) imaging. The user can modify the photon source, object to be imaged and imaging setup with three-dimensional editors. Objects are currently obtained by combining blocks with variable shapes. The imaging of three-dimensional voxelized geometries is currently not implemented, but can be added in a later release. The program follows a ray-tracing approach, ignoring photon scatter in its current implementation. Simulations of a phantom CT scan were generated in ImaSim and were compared to measured data in terms of CT number accuracy. Spatial variations in the photon fluence and mean energy from an x-ray tube caused by the heel effect were estimated from ImaSim and Monte Carlo simulations and compared. Results: In this paper we describe ImaSim and provide two examples of its capabilities. CT numbers were found to agree within 36 Hounsfield Units (HU) for bone, which corresponds to a 2% attenuation coefficient difference. ImaSim reproduced the heel effect reasonably well when compared to Monte Carlo simulations. Discussion: An x-ray imaging simulation tool is made available for teaching and research purposes. ImaSim provides a means to facilitate the teaching of medical x-ray imaging.
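    Along each ray, scatter-free ray tracing of the kind ImaSim performs reduces to the Beer-Lambert law, and CT numbers follow from the reconstructed attenuation coefficients. The sketch below illustrates both relations; the material coefficients and energy are approximate illustrative values, not taken from ImaSim.

```python
import math

# Approximate linear attenuation coefficients at ~60 keV, in 1/cm (illustrative).
MU = {"water": 0.206, "bone": 0.573, "lung": 0.05}

def transmitted_fraction(path):
    """Beer-Lambert along one scatter-free ray: I/I0 = exp(-sum(mu_i * d_i)).
    `path` is a list of (material, thickness_cm) segments the ray crosses."""
    return math.exp(-sum(MU[m] * d for m, d in path))

def hounsfield(mu):
    """CT number from a linear attenuation coefficient:
    HU = 1000 * (mu - mu_water) / mu_water."""
    return 1000.0 * (mu - MU["water"]) / MU["water"]

ray = [("water", 5.0), ("bone", 1.0), ("water", 4.0)]
print(f"I/I0 = {transmitted_fraction(ray):.4f}")
print(f"HU(bone) = {hounsfield(MU['bone']):.0f}")   # roughly +1800 at this energy
```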

  4. System Software and Tools for High Performance Computing Environments: A report on the findings of the Pasadena Workshop, April 14--16, 1992

    Energy Technology Data Exchange (ETDEWEB)

    Sterling, T. [Universities Space Research Association, Washington, DC (United States); Messina, P. [Jet Propulsion Lab., Pasadena, CA (United States); Chen, M. [Yale Univ., New Haven, CT (United States)] [and others

    1993-04-01

    The Pasadena Workshop on System Software and Tools for High Performance Computing Environments was held at the Jet Propulsion Laboratory from April 14 through April 16, 1992. The workshop was sponsored by a number of Federal agencies committed to the advancement of high performance computing (HPC) both as a means to advance their respective missions and as a national resource to enhance American productivity and competitiveness. Over a hundred experts in related fields from industry, academia, and government were invited to participate in this effort to assess the current status of software technology in support of HPC systems. The overall objectives of the workshop were to understand the requirements and current limitations of HPC software technology and to contribute to a basis for establishing new directions in research and development for software technology in HPC environments. This report includes reports written by the participants of the workshop's seven working groups. Materials presented at the workshop are reproduced in appendices. Additional chapters summarize the findings and analyze their implications for future directions in HPC software technology development.

  5. TIDE TOOL: Open-Source Sea-Level Monitoring Software for Tsunami Warning Systems

    Science.gov (United States)

    Weinstein, S. A.; Kong, L. S.; Becker, N. C.; Wang, D.

    2012-12-01

    A tsunami warning center (TWC) typically decides to issue a tsunami warning bulletin when initial estimates of earthquake source parameters suggest it may be capable of generating a tsunami. A TWC, however, relies on sea-level data to provide prima facie evidence for the existence or non-existence of destructive tsunami waves and to constrain tsunami wave height forecast models. In the aftermath of the 2004 Sumatra disaster, the International Tsunami Information Center asked the Pacific Tsunami Warning Center (PTWC) to develop a platform-independent, easy-to-use software package to give nascent TWCs the ability to process WMO Global Telecommunications System (GTS) sea-level messages and to analyze the resulting sea-level curves (marigrams). In response PTWC developed TIDE TOOL, which has since steadily grown in sophistication to become PTWC's operational sea-level processing system. TIDE TOOL has two main parts: a decoder that reads GTS sea-level message logs, and a graphical user interface (GUI) written in the open-source platform-independent graphical toolkit scripting language Tcl/Tk. This GUI consists of dynamic map-based clients that allow the user to select and analyze a single station or groups of stations by displaying their marigrams in strip-chart or screen-tiled forms. TIDE TOOL also includes detail maps of each station to show each station's geographical context and reverse tsunami travel time contours to each station. TIDE TOOL can also be coupled to the GEOWARE™ TTT program to plot tsunami travel times and to indicate the expected tsunami arrival time on the marigrams. Because sea-level messages are structured in a rich variety of formats, TIDE TOOL includes a metadata file, COMP_META, that contains all of the information needed by TIDE TOOL to decode sea-level data as well as basic information such as the geographical coordinates of each station. TIDE TOOL can therefore continuously decode these sea-level messages in real time and display the time
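    Real GTS sea-level messages come in many formats and TIDE TOOL itself is written in Tcl/Tk, so the following Python sketch is purely illustrative: it decodes one hypothetical message layout using a metadata entry loosely in the spirit of COMP_META. The station entry, message format, and scale factor are all invented.

```python
from datetime import datetime, timedelta

# Hypothetical metadata entry, loosely in the spirit of COMP_META:
# station id, coordinates, sample interval, and scale to convert counts to metres.
META = {"HILO": {"lat": 19.73, "lon": -155.06,
                 "interval_s": 60, "scale_m": 0.001}}

def decode(message: str):
    """Decode a hypothetical message 'STATION YYYYMMDDHHMM v1 v2 v3 ...'
    into (timestamp, sea level in metres) pairs. Real GTS formats vary widely."""
    station, stamp, *counts = message.split()
    meta = META[station]
    t0 = datetime.strptime(stamp, "%Y%m%d%H%M")
    step = timedelta(seconds=meta["interval_s"])
    return [(t0 + i * step, int(c) * meta["scale_m"])
            for i, c in enumerate(counts)]

for ts, level in decode("HILO 201210280400 1503 1507 1627 1841"):
    print(ts.isoformat(), f"{level:.3f} m")
```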

  6. Should we have blind faith in bioinformatics software? Illustrations from the SNAP web-based tool.

    Directory of Open Access Journals (Sweden)

    Sébastien Robiou-du-Pont

    Full Text Available Bioinformatics tools have gained popularity in biology but little is known about their validity. We aimed to assess the early contribution of 415 single nucleotide polymorphisms (SNPs) associated with eight cardio-metabolic traits at the genome-wide significance level in adults in the Family Atherosclerosis Monitoring In earLY Life (FAMILY) birth cohort. We used the popular web-based tool SNAP to assess the availability of the 415 SNPs in the Illumina Cardio-Metabochip genotyped in the FAMILY study participants. We then compared the SNAP output with the Cardio-Metabochip file provided by Illumina, using the chromosomes and chromosomal positions of the SNPs from the NCBI Human Genome Browser (Genome Reference Consortium Human Build 37). With the HapMap 3 release 2 reference, 201 out of 415 SNPs were reported as missing from the Cardio-Metabochip by the SNAP output. However, the Cardio-Metabochip file revealed that 152 of these 201 SNPs were in fact present on the Cardio-Metabochip array (a false-negative rate of 36.6%). With the more recent 1000 Genomes Project reference, we found a false-negative rate of 17.6% by comparing the outputs of SNAP and the Illumina product file. We did not find any 'false positive' SNPs (SNPs specified as available on the Cardio-Metabochip by SNAP, but not by the Cardio-Metabochip Illumina file). Cohen's Kappa coefficient, which quantifies the agreement between both methods beyond chance, indicated that the validity of SNAP was fair to moderate depending on the reference used (HapMap 3 or 1000 Genomes). In conclusion, we demonstrate that the SNAP outputs for the Cardio-Metabochip are invalid. This study illustrates the importance of systematically assessing the validity of bioinformatics tools in an independent manner. We propose a series of guidelines to improve practices in the fast-moving field of bioinformatics software implementation.
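
    To illustrate the agreement statistic used above, a minimal Python sketch (not from the paper) computes Cohen's kappa for two sets of binary availability calls; the example data are hypothetical:

        # Chance-corrected agreement between two binary availability calls,
        # e.g. SNAP output vs. the Illumina product file.
        def cohens_kappa(calls_a, calls_b):
            n = len(calls_a)
            observed = sum(a == b for a, b in zip(calls_a, calls_b)) / n
            # Expected agreement if the two methods answered independently.
            p_a = sum(calls_a) / n
            p_b = sum(calls_b) / n
            expected = p_a * p_b + (1 - p_a) * (1 - p_b)
            return (observed - expected) / (1 - expected)

        # Hypothetical calls: 1 = "SNP present on the chip", 0 = "missing".
        snap_calls     = [1, 0, 0, 1, 0, 1, 1, 0]
        illumina_calls = [1, 1, 0, 1, 1, 1, 1, 0]
        print(round(cohens_kappa(snap_calls, illumina_calls), 3))  # 0.5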

  7. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    Science.gov (United States)

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

    A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data, which helps in optimization, and displays the model-predicted performance visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model developed for a systematically integrated forward osmosis-nanofiltration process for removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values agree well with extensive experimental investigations and remain consistent under varying operating conditions such as operating pressure, flow rate, and draw solute concentration. A low relative error (RE = 0.09) and a high Willmott d-index (d = 0.981) reflect a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater.
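
    For reference, the two goodness-of-fit measures quoted above can be computed as follows (standard formulas for Willmott's index of agreement and relative error; a sketch, not the PWWT.VB code, and the sample values are invented):

        import numpy as np

        def willmott_d(pred, obs):
            """Willmott's index of agreement, d in [0, 1]; 1 = perfect fit."""
            p, o = np.asarray(pred, float), np.asarray(obs, float)
            ob = o.mean()
            return 1.0 - ((p - o) ** 2).sum() / ((np.abs(p - ob) + np.abs(o - ob)) ** 2).sum()

        def relative_error(pred, obs):
            """Mean absolute error relative to the observed values."""
            p, o = np.asarray(pred, float), np.asarray(obs, float)
            return float(np.mean(np.abs(p - o) / np.abs(o)))

        obs  = [10.2, 11.0, 12.5, 13.1]   # e.g. observed permeate flux (invented)
        pred = [10.0, 11.4, 12.2, 13.5]   # corresponding model predictions
        print(willmott_d(pred, obs), relative_error(pred, obs))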

  8. Diva software, a tool for European regional seas and Ocean climatologies production

    Science.gov (United States)

    Ouberdous, M.; Troupin, C.; Barth, A.; Alvera-Azcàrate, A.; Beckers, J.-M.

    2012-04-01

    Diva (Data-Interpolating Variational Analysis) is a software package based on a method designed to perform data-gridding (or analysis) tasks, with the asset of taking into account the intrinsic nature of oceanographic data, i.e., the uncertainty of in situ measurements and the anisotropy due to advection, irregular coastlines and topography. The Variational Inverse Method (VIM, Brasseur et al., 1996) implemented in Diva consists of minimizing a variational principle that penalizes the misfit between the observations and the reconstructed field as well as the gradients and variability of the reconstructed field. The numerical problem is solved with a finite-element method, which provides high numerical efficiency and handles complicated coastline contours. Along with the analysis, Diva also provides error fields (Brankart and Brasseur, 1998; Rixen et al., 2000) based on the data coverage and noise. Diva is used for the production of climatologies in the pan-European network SeaDataNet. SeaDataNet connects the existing marine data centres of more than 30 countries and has set up a data management infrastructure consisting of a standardized distributed system. The consortium has elaborated integrated products, using common procedures and methods, and uses the Diva software as the reference tool for computing climatologies for various European regional seas, the Atlantic and the global ocean. During the first phase of the SeaDataNet project, a number of additional tools were developed to make climatology production easier for users. Among these tools: the advection constraint during the field reconstruction through the specification of a velocity field on a regular grid, forcing the analysis to align with the velocity vectors; the Generalized Cross Validation for the determination of analysis parameters (signal-to-noise ratio); the creation of contours at selected depths; the detection of possible outliers; the
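
    For orientation, the variational principle minimized by VIM is commonly written as follows (a standard textbook rendering after Brasseur et al., 1996; the exact notation is not quoted from this record):

        J[\varphi] = \sum_{j=1}^{N_d} \mu_j \, [\, d_j - \varphi(x_j, y_j) \,]^2
                   + \int_D \left( \alpha_2 \, \nabla\nabla\varphi : \nabla\nabla\varphi
                   + \alpha_1 \, \nabla\varphi \cdot \nabla\varphi
                   + \alpha_0 \, \varphi^2 \right) dD

    where the d_j are observations with weights \mu_j reflecting their noise level, \varphi is the reconstructed field, and \alpha_0, \alpha_1, \alpha_2 penalize the field's magnitude, gradients and curvature respectively.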

  9. Development and implementation of a 'Mental Health Finder' software tool within an electronic medical record system.

    Science.gov (United States)

    Swan, D; Hannigan, A; Higgins, S; McDonnell, R; Meagher, D; Cullen, W

    2017-02-01

    In Ireland, as in many other healthcare systems, mental health service provision is being reconfigured with a move toward more care in the community, and particularly primary care. Recording and surveillance systems for mental health information and activities in primary care are needed for service planning and quality improvement. We describe the development and initial implementation of a software tool ('mental health finder') within a widely used primary care electronic medical record system (EMR) in Ireland to enable large-scale data collection on the epidemiology and management of mental health and substance use problems among patients attending general practice. In collaboration with the Irish Primary Care Research Network (IPCRN), we developed the 'Mental Health Finder' as a software plug-in to a commonly used primary care EMR system to facilitate data collection on mental health diagnoses and pharmacological treatments among patients. The finder searches for and identifies patients based on diagnostic coding and/or prescribed medicines. It was initially implemented among a convenience sample of six GP practices. Prevalence of mental health and substance use problems across the six practices, as identified by the finder, was 9.4% (range 6.9-12.7%). 61.9% of identified patients were female; 25.8% were private patients. One-third (33.4%) of identified patients were prescribed more than one class of psychotropic medication. Of the patients identified by the finder, 89.9% were identifiable via prescribing data, 23.7% via diagnostic coding. The finder is a feasible and promising methodology for large-scale data collection on mental health problems in primary care.
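
    As a rough illustration of the finder's search logic (a minimal Python sketch under assumed data structures; the plug-in's actual EMR schema and code lists are not described in this record), patients can be flagged when either a diagnostic code or a prescribed medicine matches a target list:

        # Hypothetical record layout: one dict per patient with coded diagnoses
        # and prescribed drugs (ICD-style and ATC-style codes are assumptions).
        TARGET_DX = {"F32", "F41", "F10"}               # e.g. depression, anxiety, alcohol use
        TARGET_RX_PREFIXES = ("N05A", "N05B", "N06A")   # e.g. antipsychotics, anxiolytics, antidepressants

        def find_patients(patients):
            hits = []
            for p in patients:
                by_dx = any(code.split(".")[0] in TARGET_DX for code in p["diagnoses"])
                by_rx = any(rx.startswith(TARGET_RX_PREFIXES) for rx in p["medications"])
                if by_dx or by_rx:
                    hits.append((p["id"], by_dx, by_rx))
            return hits

        patients = [
            {"id": 1, "diagnoses": ["F32.1"], "medications": ["N06AB06"]},
            {"id": 2, "diagnoses": ["J45"],   "medications": ["R03AC02"]},
        ]
        print(find_patients(patients))  # patient 1 matches via both routes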

  10. Would Consolidation of Army Software Engineering Organizations Help to Control Software Costs for Current and Future Systems

    Science.gov (United States)

    2015-04-16

    ...Development and Engineering Center; Armament Research, Development and Engineering Center (ARDEC); and Tank and Automotive Research, Development and... [The centers] specialize in specific Army domains and programs—aviation and missile, communications and electronics, armaments, and tank and automotive. In fiscal year... [Support activities include] sustaining engineering, product fielding and user support, and regression testing. Efficiencies are needed to cope with workload. ...AMC Software Support

  11. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    Directory of Open Access Journals (Sweden)

    Kühne Titus

    2010-07-01

    Full Text Available Background: In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there has been no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. Results: After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different types of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. Conclusions: MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet.
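
    A minimal sketch of what pixel-wise relaxation-time mapping involves is shown below (a generic mono-exponential T2/T2* fit in Python with SciPy; MRmap itself was written in a high-level graphics language, so this is not its implementation):

        import numpy as np
        from scipy.optimize import curve_fit

        def decay(te, s0, t2):
            """Mono-exponential signal decay S(TE) = S0 * exp(-TE / T2)."""
            return s0 * np.exp(-te / t2)

        def t2_map(images, echo_times):
            """images: stack of shape (n_echoes, ny, nx); returns a T2 map in ms."""
            n, ny, nx = images.shape
            out = np.zeros((ny, nx))
            for y in range(ny):
                for x in range(nx):
                    signal = images[:, y, x]
                    try:
                        (s0, t2), _ = curve_fit(decay, echo_times, signal,
                                                p0=(signal[0], 50.0), maxfev=200)
                        out[y, x] = t2
                    except RuntimeError:
                        out[y, x] = 0.0  # fit failed; leave pixel empty
            return out

        # Synthetic 2x2 example: TE = 10..40 ms, true T2 = 60 ms everywhere.
        te = np.array([10.0, 20.0, 30.0, 40.0])
        stack = np.stack([1000.0 * np.exp(-t / 60.0) * np.ones((2, 2)) for t in te])
        print(t2_map(stack, te))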

  12. Application of the PredictAD Software Tool to Predict Progression in Patients with Mild Cognitive Impairment

    DEFF Research Database (Denmark)

    Simonsen, Anja H; Mattila, Jussi; Hejl, Anne-Mette

    2012-01-01

    Background: The PredictAD tool integrates heterogeneous data such as imaging, cerebrospinal fluid biomarkers and results from neuropsychological tests for compact visualization in an interactive user interface. This study investigated whether the software tool could assist physicians in the early... diagnosis of Alzheimer's disease. Methods: Baseline data from 140 patients with mild cognitive impairment were selected from the Alzheimer's Disease Neuroimaging Initiative. Three clinical raters classified patients into 6 categories of confidence in the prediction of early Alzheimer's disease, in 4 phases...

  13. Data-Driven Decision Making as a Tool to Improve Software Development Productivity

    Science.gov (United States)

    Brown, Mary Erin

    2013-01-01

    The worldwide software project failure rate, based on a survey of information technology software managers' views of user satisfaction, product quality, and staff productivity, is estimated to be between 24% and 36%, and software project success has not kept pace with the advances in hardware. The problem addressed by this study was the limited…

  15. GTest: a software tool for graphical assessment of empirical distributions' Gaussianity.

    Science.gov (United States)

    Barca, E; Bruno, E; Bruno, D E; Passarella, G

    2016-03-01

    their request for an effective tool for addressing such difficulties motivated us to adopt the inference-by-eye paradigm and implement an easy-to-use, quick and reliable statistical tool. GTest visualizes its outcomes as a modified version of the Q-Q plot. The application has been developed in Visual Basic for Applications (VBA) within MS Excel 2010, which proved to have all the robustness and reliability characteristics needed. GTest provides true graphical normality tests which are as reliable as any quantitative statistical approach but much easier to understand. The Q-Q plots are complemented by an acceptance region outlined around the representation of the theoretical distribution, defined in accordance with the alpha level of significance and the data sample size. The test decision rule is the following: if the empirical scatterplot falls completely within the acceptance region, then it can be concluded that the empirical distribution fits the theoretical one at the given alpha level. A comprehensive case study has been carried out with simulated and real-world data in order to check the robustness and reliability of the software.
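
    The following sketch reproduces the general idea in Python rather than the tool's VBA (the acceptance band here is a Kolmogorov-Smirnov-style pointwise approximation, an assumption rather than GTest's exact construction):

        import numpy as np
        from scipy import stats
        import matplotlib.pyplot as plt

        def qq_with_band(sample, alpha=0.05):
            """Q-Q plot against a fitted normal, with an acceptance band."""
            x = np.sort(np.asarray(sample, float))
            n = len(x)
            probs = (np.arange(1, n + 1) - 0.5) / n
            mu, sd = x.mean(), x.std(ddof=1)
            theo = stats.norm.ppf(probs, mu, sd)
            # Pointwise band from KS-style quantile uncertainty (approximate).
            half = stats.ksone.ppf(1 - alpha / 2, n)
            lo = stats.norm.ppf(np.clip(probs - half, 1e-6, 1 - 1e-6), mu, sd)
            hi = stats.norm.ppf(np.clip(probs + half, 1e-6, 1 - 1e-6), mu, sd)
            plt.plot(theo, x, "k.", label="empirical quantiles")
            plt.fill_between(theo, lo, hi, alpha=0.2, label="acceptance region")
            plt.plot(theo, theo, "r-", lw=1)
            plt.xlabel("theoretical quantiles")
            plt.ylabel("sample quantiles")
            plt.legend()
            plt.show()

        qq_with_band(np.random.default_rng(0).normal(10, 2, 60))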

  16. TRANSIT--A Software Tool for Himar1 TnSeq Analysis.

    Directory of Open Access Journals (Sweden)

    Michael A DeJesus

    2015-10-01

    Full Text Available TnSeq has become a popular technique for determining the essentiality of genomic regions in bacterial organisms. Several methods have been developed to analyze the wealth of data that has been obtained through TnSeq experiments. We developed a tool for analyzing Himar1 TnSeq data called TRANSIT. TRANSIT provides a graphical interface to three different statistical methods for analyzing TnSeq data. These methods cover a variety of approaches capable of identifying essential genes in individual datasets as well as performing comparative analysis between conditions. We demonstrate the utility of this software by analyzing TnSeq datasets of M. tuberculosis grown on glycerol and cholesterol. We show that TRANSIT can be used to discover genes which have previously been implicated in growth on these carbon sources. TRANSIT is written in Python, and thus can be run on Windows, OSX and Linux platforms. The source code is distributed under the GNU GPL v3 license and can be obtained from the following GitHub repository: https://github.com/mad-lab/transit.

  17. Prognostic 2.0: software tool for heart rate variability analysis and QT interval dispersion

    Science.gov (United States)

    Mendoza, Alfonso; Rueda, Oscar L.; Bautista, Lola X.; Martinez, Víctor E.; Lopez, Eddie R.; Gomez, Mario F.; Alvarez, Alexander

    2007-09-01

    Cardiovascular diseases, in particular Acute Myocardial Infarction (AMI), are the first cause of death in industrialized countries. Measurements of indicators of the behavior of the autonomic nervous system, such as Heart Rate Variability (HRV) and QT Interval Dispersion (QTD), in the acute phase of AMI (the first 48 hours after the event) give a good estimation of the subsequent cardiac events that a person who has suffered an AMI could present. This paper describes the implementation of the second version of Prognostic-AMI, a software tool that automates the calculation of such indicators. It uses the Discrete Wavelet Transform (DWT) to de-noise the signals and to detect the QRS complex and the T-wave from a conventional 12-lead electrocardiogram. Indicators are measured in both the time and frequency domains. A pilot trial performed on a sample population of 76 patients shows that people who had cardiac complications in the acute phase of AMI have low values in the indicators of HRV and QTD.
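
    As an illustration of the DWT pre-processing step described above, a short Python sketch denoises an ECG-like trace before QRS/T-wave detection (the wavelet choice and thresholding rule are assumptions, not necessarily those of Prognostic 2.0):

        import numpy as np
        import pywt

        def denoise_ecg(signal, wavelet="db4", level=4):
            """Soft-threshold wavelet denoising of a 1-D signal."""
            coeffs = pywt.wavedec(signal, wavelet, level=level)
            # Universal threshold estimated from the finest detail coefficients.
            sigma = np.median(np.abs(coeffs[-1])) / 0.6745
            thresh = sigma * np.sqrt(2 * np.log(len(signal)))
            coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
            return pywt.waverec(coeffs, wavelet)[: len(signal)]

        # Synthetic noisy trace for illustration only.
        t = np.linspace(0, 1, 512)
        trace = np.sin(2 * np.pi * 5 * t) + 0.2 * np.random.default_rng(1).normal(size=t.size)
        clean = denoise_ecg(trace)
        print(clean.shape)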

  18. Software Tool for Analysis of Breathing-Related Errors in Transthoracic Electrical Bioimpedance Spectroscopy Measurements

    Science.gov (United States)

    Abtahi, F.; Gyllensten, I. C.; Lindecrantz, K.; Seoane, F.

    2012-12-01

    During the last decades, Electrical Bioimpedance Spectroscopy (EBIS) has been applied in a range of different applications, mainly using the frequency-sweep technique. Traditionally the tissue under study is considered to be time-invariant, and dynamic changes of tissue activity are ignored and instead treated as a noise source. This assumption has not been adequately tested and could have a negative impact on, and limit the accuracy of, impedance monitoring systems. In order to successfully use frequency-sweep EBIS for monitoring time-variant systems, it is paramount to study the effect of the frequency-sweep delay on Cole-model-based analysis. In this work, we present a software tool that can be used to simulate the influence of respiration activity in frequency-sweep EBIS measurements of the human thorax and to analyse the effects of the different error sources. Preliminary results indicate that the deviation in the EBIS measurement might be significant at any frequency, and especially in the impedance plane. Therefore the impact on Cole-model analysis might differ depending on the method applied for Cole parameter estimation.
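
    For reference, the single-dispersion Cole impedance model underlying such analysis has the standard form Z(w) = R_inf + (R_0 - R_inf) / (1 + (j*w*tau)^alpha); a short Python sketch evaluates it over a sweep (the parameter values are illustrative only, not from this work):

        import numpy as np

        def cole_impedance(freq_hz, r_inf, r_zero, tau, alpha):
            """Single-dispersion Cole model (standard textbook form)."""
            w = 2 * np.pi * np.asarray(freq_hz, float)
            return r_inf + (r_zero - r_inf) / (1 + (1j * w * tau) ** alpha)

        freqs = np.logspace(3, 6, 5)   # 1 kHz .. 1 MHz sweep
        z = cole_impedance(freqs, r_inf=25.0, r_zero=60.0, tau=1e-5, alpha=0.8)
        for f, zi in zip(freqs, z):
            print(f"{f:10.0f} Hz  R={zi.real:6.2f}  X={zi.imag:7.2f}")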

  19. Development of a Kinect Software Tool to Classify Movements during Active Video Gaming.

    Science.gov (United States)

    Rosenberg, Michael; Thornton, Ashleigh L; Lay, Brendan S; Ward, Brodie; Nathan, David; Hunt, Daniel; Braham, Rebecca

    2016-01-01

    While it has been established that using full body motion to play active video games results in increased levels of energy expenditure, there is little information on the classification of human movement during active video game play in relationship to fundamental movement skills. The aim of this study was to validate software utilising Kinect sensor motion capture technology to recognise fundamental movement skills (FMS) during active video game play. Two human assessors rated jumping and side-stepping and these assessments were compared to the Kinect Action Recognition Tool (KART) to establish a level of agreement and determine the number of movements completed during five minutes of active video game play, for 43 children (m = 12 years 7 months ± 1 year 6 months). During five minutes of active video game play, inter-rater reliability between the two human raters was found to be higher for the jump (r = 0.94, p < .01) than the sidestep (r = 0.87, p < .01), although both were excellent. Excellent reliability was also found between the human raters and the KART system for the jump (r = 0.84, p < .01), and moderate reliability for the sidestep (r = 0.6983, p < .01) during game play. The results provide confidence that the Kinect sensor can be used to count the number of jumps and sidesteps during five minutes of active video game play with a similar level of accuracy as human raters. However, in contrast to humans, the KART system required a fraction of the time to analyse and tabulate the results.

  20. A software tool for quality assurance of computed/digital radiography (CR/DR) systems

    Science.gov (United States)

    Desai, Nikunj; Valentino, Daniel J.

    2011-03-01

    The recommended methods to test the performance of computed radiography (CR) systems have been established by the American Association of Physicists in Medicine, Report No. 93, "Acceptance Testing and Quality Control of Photostimulable Storage Phosphor Imaging Systems". The quality assurance tests are categorized by how frequently they need to be performed. Quality assurance of CR systems is the responsibility of the facility that performs the exam and is governed by the state in which the facility is located. For example, the New York State Department of Health has established a guide which lists the tests that a CR facility must perform for quality assurance. This study aims at educating the reader about the new quality assurance requirements defined by the state. It further demonstrates an easy-to-use software tool, henceforth referred to as the Digital Physicist, developed to aid a radiologic facility in conforming with state guidelines and monitoring quality assurance of CR/DR imaging systems. The Digital Physicist provides a vendor-independent procedure for quality assurance of CR/DR systems. Further, it generates a PDF report with a brief description of these tests and the obtained results.

  1. EPA's science blog: "It All Starts with Science"; Article title: "EPA's Solvent Substitution Software Tool, PARIS III"

    Science.gov (United States)

    EPA's solvent substitution software tool, PARIS III, is provided by the EPA for free and can be used effectively and efficiently to help environmentally-conscious individuals find better and greener solvent mixtures for many different common industrial processes. People can downlo...

  2. Productivity, part 2: cloud storage, remote meeting tools, screencasting, speech recognition software, password managers, and online data backup.

    Science.gov (United States)

    Lackey, Amanda E; Pandey, Tarun; Moshiri, Mariam; Lalwani, Neeraj; Lall, Chandana; Bhargava, Puneet

    2014-06-01

    It is an opportune time for radiologists to focus on personal productivity. The ever increasing reliance on computers and the Internet has significantly changed the way we work. Myriad software applications are available to help us improve our personal efficiency. In this article, the authors discuss some tools that help improve collaboration and personal productivity, maximize e-learning, and protect valuable digital data.

  3. The Design and Development of a Computerized Tool Support for Conducting Senior Projects in Software Engineering Education

    Science.gov (United States)

    Chen, Chung-Yang; Teng, Kao-Chiuan

    2011-01-01

    This paper presents a computerized tool support, the Meetings-Flow Project Collaboration System (MFS), for designing, directing and sustaining the collaborative teamwork required in senior projects in software engineering (SE) education. Among many schools' SE curricula, senior projects serve as a capstone course that provides comprehensive…

  4. State transition storyboards: A tool for designing the Goldstone solar system radar data acquisition system user interface software

    Science.gov (United States)

    Howard, S. D.

    1987-01-01

    Effective user interface design in software systems is a complex task that takes place without adequate modeling tools. By combining state transition diagrams and the storyboard technique of filmmakers, State Transition Storyboards were developed to provide a detailed modeling technique for the Goldstone Solar System Radar Data Acquisition System human-machine interface. Illustrations are included with a description of the modeling technique.

  5. Plagiarism Detection: A Comparison of Teaching Assistants and a Software Tool in Identifying Cheating in a Psychology Course

    Science.gov (United States)

    Seifried, Eva; Lenhard, Wolfgang; Spinath, Birgit

    2015-01-01

    Essays that are assigned as homework in large classes are prone to cheating via unauthorized collaboration. In this study, we compared the ability of a software tool based on Latent Semantic Analysis (LSA) and student teaching assistants to detect plagiarism in a large group of students. To do so, we took two approaches: the first approach was…

  7. A fluoroscopy-based planning and guidance software tool for minimally invasive hip refixation by cement injection

    NARCIS (Netherlands)

    Malan, D.F.; Van der Walt, S.J.; Raidou, R.G.; Van den Berg, B.; Stoel, B.C.; Botha, C.P.; Nelissen, R.G.H.H.; Valstar, E.R.

    2015-01-01

    Purpose: In orthopaedics, minimally invasive injection of bone cement is an established technique. We present HipRFX, a software tool for planning and guiding a cement injection procedure for stabilizing a loosening hip prosthesis. HipRFX works by analysing a pre-operative CT and intraoperative C-arm

  9. A Software Tool for Atmospheric Correction and Surface Temperature Estimation of Landsat Infrared Thermal Data

    Directory of Open Access Journals (Sweden)

    Benjamin Tardy

    2016-08-01

    Full Text Available Land surface temperature (LST) is an important variable involved in the Earth's surface energy and water budgets and a key component in many aspects of environmental research. The Landsat program, jointly carried out by NASA and the USGS, has been recording thermal infrared data for the past 40 years. Nevertheless, LST data products for Landsat remain unavailable. The atmospheric correction (AC) method commonly used for mono-window Landsat thermal data requires detailed information concerning the vertical structure (temperature, pressure) and the composition (water vapor, ozone) of the atmosphere. For a given coordinate, this information is generally obtained through either radio-sounding or atmospheric model simulations and is passed to the radiative transfer model (RTM) to estimate the local atmospheric correction parameters. Although this approach yields accurate LST data, results are relevant only near this given coordinate. To meet the scientific community's demand for high-resolution LST maps, we developed a new software tool dedicated to processing Landsat thermal data. The proposed tool improves on the commonly-used AC algorithm by incorporating spatial variations occurring in the Earth's atmosphere composition. The ERA-Interim dataset (from the ECMWF meteorological organization) was used to retrieve vertical atmospheric conditions, which are available at a global scale with a resolution of 0.125 degrees and a temporal resolution of 6 h. A temporal and spatial linear interpolation of meteorological variables was performed to match the acquisition dates and coordinates of the Landsat images. The atmospheric correction parameters were then estimated on the basis of this reconstructed atmospheric grid using the commercial RTM software MODTRAN. The needed surface emissivity was derived from the common vegetation index NDVI, obtained from the red and near-infrared (NIR) bands of the same Landsat image. This permitted an estimation of LST for the entire
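
    The emissivity step mentioned above can be illustrated with the common NDVI-thresholds approach (a Python sketch; the threshold and emissivity constants are typical literature values, not necessarily those used by the tool):

        import numpy as np

        def ndvi(red, nir):
            """Normalized difference vegetation index from red/NIR reflectance."""
            return (nir - red) / (nir + red + 1e-12)

        def emissivity_from_ndvi(ndvi_map, ndvi_soil=0.2, ndvi_veg=0.5,
                                 eps_soil=0.97, eps_veg=0.99):
            # Fractional vegetation cover from scaled NDVI, then a linear
            # mix of soil and vegetation emissivities.
            fvc = np.clip((ndvi_map - ndvi_soil) / (ndvi_veg - ndvi_soil), 0, 1) ** 2
            return eps_veg * fvc + eps_soil * (1 - fvc)

        red = np.array([[0.10, 0.08], [0.06, 0.04]])
        nir = np.array([[0.20, 0.30], [0.40, 0.50]])
        print(emissivity_from_ndvi(ndvi(red, nir)))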

  10. Mars, accessing the third dimension: a software tool to exploit Mars ground penetrating radars data.

    Science.gov (United States)

    Cantini, Federico; Ivanov, Anton B.

    2016-04-01

    The Mars Advanced Radar for Subsurface and Ionosphere Sounding (MARSIS), on board ESA's Mars Express, and the SHAllow RADar (SHARAD), on board NASA's Mars Reconnaissance Orbiter, are two ground penetrating radars (GPRs) designed to probe the crust of Mars and explore the subsurface structure of the planet. They have now been collecting data for about 10 years, covering a large fraction of the Martian surface. As on Earth, these GPRs collect data by sending electromagnetic (EM) pulses toward the surface and listening for the return echoes that occur at dielectric discontinuities on the planet's surface and subsurface. The wavelengths used allow MARSIS EM pulses to penetrate the crust for several kilometers. The data products (radargrams) are matrices where the x-axis spans different sampling points on the planet's surface and the y-axis is the power of the echoes over time in the listening window. No standard way to manage this kind of data is established in the planetary science community, and data analysis and interpretation very often require some knowledge of radar signal processing. Our software tool aims to ease access to these data, in particular for scientists without a specific background in signal processing. MARSIS and SHARAD geometrical data, such as probing-point latitude and longitude and spacecraft altitude, are stored, together with relevant acquisition metadata, in a geo-enabled relational database implemented using PostgreSQL and PostGIS. Data are extracted from officially released ESA and NASA products using self-developed Python classes and scripts and inserted into the database using OGR utilities. This software is also intended to be the core of a collection of classes and scripts implementing more complex GPR data analysis. Geometrical data and metadata are exposed as WFS layers using a QGIS server, which can be further integrated with other data, such as imaging, spectroscopy and topography. Radar geometry data will be available as a part of the iMars Web

  11. Software for CCTV systems design

    Directory of Open Access Journals (Sweden)

    Adamek Milan

    2016-01-01

    Full Text Available The article focuses on software used in the design of CCTV systems. It covers tools available online, tools for PCs, and mobile applications. It describes the basic components of camera systems, their characteristics, and current trends in CCTV systems. Moreover, it compares two selected software tools, their features and supported functions. In the practical part, these tools are used for the design of a CCTV system and the whole process is described in detail.

  12. Open Software Tools Applied to Jordan's National Multi-Agent Water Management Model

    Science.gov (United States)

    Knox, Stephen; Meier, Philipp; Harou, Julien; Yoon, Jim; Selby, Philip; Lachaut, Thibaut; Klassert, Christian; Avisse, Nicolas; Khadem, Majed; Tilmant, Amaury; Gorelick, Steven

    2016-04-01

    Jordan is the fourth most water-scarce country in the world, where demand exceeds supply in a politically and demographically unstable context. The Jordan Water Project (JWP) aims to perform policy evaluation by modelling the hydrology, economics, and governance of Jordan's water resource system. The multidisciplinary nature of the project requires a modelling software system capable of integrating submodels from multiple disciplines into a single decision-making process and communicating results to stakeholders. This requires a tool for building an integrated model and a system where diverse data sets can be managed and visualised. The integrated Jordan model is built using Pynsim, an open-source multi-agent simulation framework implemented in Python. Pynsim operates on network structures of nodes and links and supports institutional hierarchies, where an institution represents a grouping of nodes, links or other institutions. At each time step, code within each node, link and institution can be executed independently, allowing for their fully autonomous behaviour. Additionally, engines (sub-models) perform actions over the entire network or on a subset of the network, such as taking a decision on a set of nodes. Pynsim is modular in design, allowing distinct modules to be modified easily without affecting others. Data management and visualisation are performed using Hydra (www.hydraplatform.org), an open software platform allowing users to manage network structure and data. The Hydra data manager connects to Pynsim, providing the necessary input parameters for the integrated model. By providing a high-level portal to the model, Hydra removes a barrier between the users of the model (researchers, stakeholders, planners, etc.) and the model itself, allowing them to manage data, run the model and visualise results all through a single user interface. Pynsim's ability to represent institutional hierarchies, inter-network communication and the separation of node, link and

  13. SAGES: a suite of freely-available software tools for electronic disease surveillance in resource-limited settings.

    Directory of Open Access Journals (Sweden)

    Sheri L Lewis

    Full Text Available Public health surveillance is undergoing a revolution driven by advances in the field of information technology. Many countries have experienced vast improvements in the collection, ingestion, analysis, visualization, and dissemination of public health data. Resource-limited countries have lagged behind due to challenges in information technology infrastructure, public health resources, and the costs of proprietary software. The Suite for Automated Global Electronic bioSurveillance (SAGES) is a collection of modular, flexible, freely-available software tools for electronic disease surveillance in resource-limited settings. One or more SAGES tools may be used in concert with existing surveillance applications, or the SAGES tools may be used en masse for an end-to-end biosurveillance capability. This flexibility allows for the development of an inexpensive, customized, and sustainable disease surveillance system. The ability to rapidly assess anomalous disease activity may lead to more efficient use of limited resources and better compliance with World Health Organization International Health Regulations.

  14. Software sensors as a tool for optimization of animal-cell cultures.

    OpenAIRE

    Dorresteijn, P.C.

    1997-01-01

    In this thesis software sensors are introduced that predict the biomass activity and the concentrations of glucose, glutamine, lactic acid, and ammonium on line. The software sensors for biomass activity, glucose and lactic acid can be applied to any type of animal cell that is grown in a bioreactor system. The glutamine and ammonium software sensors are determined experimentally by correlating them to the acid-production rate; therefore, they can only be used for Vero cells. In the developme...

  15. Linking meteoroid streams to their parent bodies by means of orbital association software tools

    Science.gov (United States)

    Madiedo, Jose Maria; Trigo-Rodriguez, Josep Maria

    2013-01-01

    A Microsoft-Windows-compatible software package, called ORAS (ORbital Association Software), has been developed for verifying possible associations between orbits using different existing criteria. Applying this software revealed a likely association between the orbit of a Northern Chi-Orionid (ORN) fireball and the potentially hazardous asteroid (PHA) 2008XM1. A numerical integration of the orbital parameters 4000 years backwards in time shows that this asteroid is a better match for this Northern Chi-Orionid than the near-Earth object (NEO) 2002XM35.
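
    A widely used orbital-association measure of the kind such software implements is the Southworth-Hawkins D-criterion; the Python sketch below is the standard formulation, not the ORAS source code, and the orbital elements are invented:

        import numpy as np

        def d_sh(o1, o2):
            """Southworth & Hawkins (1963) dissimilarity between two orbits.
            o = (q [AU], e, i, node, peri), angles in degrees."""
            q1, e1, i1, n1, w1 = o1
            q2, e2, i2, n2, w2 = o2
            i1, i2, dn = map(np.radians, (i1, i2, n2 - n1))
            # Mutual inclination of the two orbital planes.
            I = np.arccos(np.cos(i1) * np.cos(i2)
                          + np.sin(i1) * np.sin(i2) * np.cos(dn))
            # Difference of longitudes of perihelion, measured from the
            # intersection line of the two planes.
            pi_term = np.radians(w2 - w1) + 2 * np.arcsin(
                np.cos((i1 + i2) / 2) * np.sin(dn / 2) / np.cos(I / 2))
            d2 = ((e2 - e1) ** 2 + (q2 - q1) ** 2 + (2 * np.sin(I / 2)) ** 2
                  + (((e1 + e2) / 2) * 2 * np.sin(pi_term / 2)) ** 2)
            return float(np.sqrt(d2))

        # Hypothetical elements; orbits are often considered associated
        # when D_SH falls below roughly 0.1-0.2.
        print(d_sh((0.98, 0.65, 28.0, 82.0, 95.0),
                   (1.01, 0.62, 27.0, 80.0, 97.0)))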

  16. Biomedical Mutation Analysis (BMA): A software tool for analyzing mutations associated with antiviral resistance.

    Science.gov (United States)

    Salvatierra, Karina; Florez, Hector

    2016-01-01

    Hepatitis C virus (HCV) is considered a major public health problem, with 200 million people infected worldwide. The treatment for chronic HCV infection with pegylated interferon alpha plus ribavirin is unspecific; consequently, the treatment is effective in only 50% of infected patients. This has prompted the development of direct-acting antivirals (DAA) that target virus proteins. These DAA have demonstrated a potent effect in vitro and in vivo; however, virus mutations associated with the development of resistance have been described. Our objective was to design and develop an online information system for detecting mutations in amino acids known to be implicated in resistance to DAA. We have used computer applications, technological tools, standard languages, infrastructure systems and algorithms to analyze positions associated with resistance to DAA in the NS3, NS5A, and NS5B genes of HCV. We have designed and developed an online information system named Biomedical Mutation Analysis (BMA), which allows users to calculate changes in nucleotide and amino acid sequences for each selected sequence from conventional Sanger and cloning sequencing using a graphical interface. BMA quickly, easily and effectively analyzes mutations, including complete documentation and examples. Furthermore, its different visualization techniques allow proper interpretation and understanding of the results. The data obtained using BMA will be useful for the assessment and surveillance of HCV resistance to new antivirals, and for treatment regimens by selecting those DAA to which the virus is not resistant, avoiding unnecessary treatment failures. The software is available at: http://bma.itiud.org.

  17. Development of a Kinect Software Tool to Classify Movements during Active Video Gaming.

    Directory of Open Access Journals (Sweden)

    Michael Rosenberg

    Full Text Available While it has been established that using full body motion to play active video games results in increased levels of energy expenditure, there is little information on the classification of human movement during active video game play in relationship to fundamental movement skills. The aim of this study was to validate software utilising Kinect sensor motion capture technology to recognise fundamental movement skills (FMS) during active video game play. Two human assessors rated jumping and side-stepping and these assessments were compared to the Kinect Action Recognition Tool (KART) to establish a level of agreement and determine the number of movements completed during five minutes of active video game play, for 43 children (m = 12 years 7 months ± 1 year 6 months). During five minutes of active video game play, inter-rater reliability, when examining the two human raters, was found to be higher for the jump (r = 0.94, p < .01) than the sidestep (r = 0.87, p < .01), although both were excellent. Excellent reliability was also found between human raters and the KART system for the jump (r = 0.84, p < .01) and moderate reliability for the sidestep (r = 0.6983, p < .01) during game play, demonstrating that both humans and KART had higher agreement for jumps than sidesteps in the game play condition. The results of the study provide confidence that the Kinect sensor can be used to count the number of jumps and sidesteps during five minutes of active video game play with a similar level of accuracy as human raters. However, in contrast to humans, the KART system required a fraction of the time to analyse and tabulate the results.

  18. A flexible, interactive software tool for fitting the parameters of neuronal models

    Directory of Open Access Journals (Sweden)

    Péter eFriedrich

    2014-07-01

    Full Text Available The construction of biologically relevant neuronal models as well as model-based analysis of experimental data often requires the simultaneous fitting of multiple model parameters, so that the behavior of the model in a certain paradigm matches (as closely as possible) the corresponding output of a real neuron according to some predefined criterion. Although the task of model optimization is often computationally hard, and the quality of the results depends heavily on technical issues such as the appropriate choice (and implementation) of cost functions and optimization algorithms, no existing program provides access to the best available methods while also guiding the user through the process effectively. Our software, called Optimizer, implements a modular and extensible framework for the optimization of neuronal models, and also features a graphical interface which makes it easy for even non-expert users to handle many commonly occurring scenarios. Meanwhile, educated users can extend the capabilities of the program and customize it according to their needs with relatively little effort. Optimizer has been developed in Python, takes advantage of open-source Python modules for nonlinear optimization, and interfaces directly with the NEURON simulator to run the models. Other simulators are supported through an external interface. We have tested the program on several different types of problems of varying complexity, using different model classes. As targets, we used simulated traces from the same or a more complex model class, as well as experimental data. We successfully used Optimizer to determine passive parameters and conductance densities in compartmental models, and to fit simple (adaptive exponential) integrate-and-fire neuronal models to complex biological data. Our detailed comparisons show that Optimizer can handle a wider range of problems, and delivers equally good or better performance than any other existing neuronal model fitting
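
    The generic fitting loop that such tools automate can be sketched as follows (a toy model optimized with SciPy's differential evolution; Optimizer's actual NEURON-based workflow is far richer, so this only illustrates the idea):

        import numpy as np
        from scipy.optimize import differential_evolution

        def simulate(params, t):
            """Toy membrane step response standing in for a neuron model."""
            tau, amp = params
            return amp * (1 - np.exp(-t / tau))

        t = np.linspace(0, 100, 200)  # ms
        target = simulate((12.0, 8.0), t) \
                 + 0.05 * np.random.default_rng(2).normal(size=t.size)

        def cost(params):
            # Mean squared error between simulated and target traces.
            return np.mean((simulate(params, t) - target) ** 2)

        result = differential_evolution(cost, bounds=[(1, 50), (0.1, 20)], seed=0)
        print(result.x)  # recovered (tau, amp), close to (12, 8)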

  19. Autism Spectrum Disorder: Examining Current Diagnosis Strategies and Assessment Tools

    Science.gov (United States)

    Wormald, Amy Marie

    2011-01-01

    Recent literature notes a significant increase in students being diagnosed with Autism Spectrum Disorder (ASD). Yet, no uniform testing protocol exists. It is vital for students and educators that a unified testing process be created and established. This study investigated which ASD testing instruments were currently used in Southern California…

  20. Determination of Flux linkage Characteristics and Inductance of a Submersible Switched Reluctance Motor using Software Tools

    Directory of Open Access Journals (Sweden)

    Sundaram Maruthachalam

    2011-01-01

    Full Text Available Problem statement: The Switched Reluctance Motor (SRM) is an old member of the electric machines family. Its simple structure, ruggedness and inexpensive manufacturability make it attractive for industrial applications. The application of switched reluctance motors in various industrial fields is now being explored by many engineers. However, switched reluctance motors have not so far been used in submersible underwater motors for agricultural purposes. The torque developed by an SRM depends on the change of flux linkage and rotor position. The flux linkage characteristic of the motor is required to design the control circuit. Since the SRM is non-linear in nature, estimation and calculation of the flux linkage characteristics are very difficult. Approach: Using the flux-tube method concept, a simple algorithm was developed in MATLAB. ANSYS software was used to determine the flux distribution at various rotor positions. Results: The aligned and unaligned flux linkage values from the theoretical calculation at a current of 7 A are 72.7 mWb and 13.79 mWb respectively. With the FEA simulation the obtained values are 92.73 mWb and 19.175 mWb. Conclusion: In this study, a simplified method for the determination of the flux linkage characteristics of a submersible SRM using MATLAB has been presented, and the obtained values have been validated against the ANSYS FEM method. The calculated unaligned and aligned inductance values of a 4-phase, 3 hp, 220 V submersible SRM obtained using the simplified MATLAB method closely match those of the ANSYS FEM method.
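
    For context, the standard relations behind these calculations (textbook SRM theory, stated here for the reader rather than quoted from the record) link flux linkage, inductance and torque:

        L(\theta, i) = \frac{\lambda(\theta, i)}{i}, \qquad
        T(\theta, i) = \frac{\partial W'(\theta, i)}{\partial \theta}
                     \approx \frac{1}{2}\, i^2\, \frac{dL(\theta)}{d\theta}

    where \lambda is the flux linkage, W' the magnetic co-energy, and the approximation holds in the magnetically linear regime; this is why the aligned and unaligned flux-linkage curves suffice to characterize the motor for control design.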

  1. Turnitoff: Identifying and Fixing a Hole in Current Plagiarism Detection Software

    Science.gov (United States)

    Heather, James

    2010-01-01

    In recent times, "plagiarism detection software" has become popular in universities and colleges, in an attempt to stem the tide of plagiarised student coursework. Such software attempts to detect any copied material and identify its source. The most popular such software is Turnitin, a commercial system used by thousands of institutions…

  3. Software tools and preliminary design of a control system for the 40m OAN radiotelescope

    Science.gov (United States)

    de Vicente, P.; Bolaño, R.

    2004-07-01

    The Observatorio Astronómico Nacional (OAN) is building a 40m radiotelescope in its facilities in Yebes (Spain), to be delivered by April 2004. The servosystem will be controlled by an ACU (Antenna Control Unit), a real-time computer running VxWorks, which will be commanded from a remote control computer (RCC) or from a local computer (LCC) acting as a console. We present the tools we have chosen to develop and run the control system on the RCC, and the criteria behind those choices. We also present a preliminary design of the control system on which we are currently working. The RCC will run a server which communicates with the ACU using sockets, and with the clients, receivers and backends using omniORB, a free implementation of CORBA. Clients running Python will allow users to control the antenna from any host connected to a LAN or a secure Internet connection.

  4. MOS: A critical tool for current and future radio surveys

    CERN Document Server

    Smith, Daniel J B

    2015-01-01

    Since radio continuum observations are not affected by dust obscuration, they are of immense potential diagnostic power as cosmological probes and for studying galaxy formation and evolution out to high redshifts. However, the power-law nature of radio frequency spectra ensures that ancillary spectroscopic information remains critical for studying the properties of the faint radio sources being detected in rapidly-increasing numbers on the pathway to the Square Kilometre Array. In this contribution, I present some of the key scientific motivations for exploiting the immense synergies between radio continuum observations and multi-object spectroscopic surveys. I review some of the ongoing efforts to obtain the spectra necessary to harness the huge numbers of star-forming galaxies and AGN that current and future radio surveys will detect. I also touch on the WEAVE-LOFAR survey, which will use the WEAVE spectrograph currently being built for the William Herschel Telescope to target hundreds of thousands of low f...

  5. A new online software tool for pressure ulcer monitoring as an educational instrument for unified nursing assessment in clinical settings

    Directory of Open Access Journals (Sweden)

    Andrea Pokorná

    2016-07-01

    Full Text Available Data collection and evaluation of the collected data are crucial for effective quality management, and naturally also for the prevention and treatment of pressure ulcers. Data collected in a uniform manner by nurses in clinical practice could be used for further analyses. Data about pressure ulcers are collected to differing degrees of quality, depending on the local policy of the given health care facility and on the nurse's actual level of knowledge concerning pressure ulcer identification and the use of objective scales (i.e., categorization of pressure ulcers). Therefore, we have developed software suitable for data collection which includes educational tools to promote unified reporting of data by nurses. A description of this software and some of its educational and learning components is presented herein. The planned process of clinical application of the newly developed software is also briefly mentioned. The discussion focuses on the usability of the online reporting tool and possible further development of the tool.

  6. Development of the software tool for generation and visualization of the finite element head model with bone conduction sounds

    Science.gov (United States)

    Nikolić, Dalibor; Milošević, Žarko; Saveljić, Igor; Filipović, Nenad

    2015-12-01

    Vibration of the skull causes a hearing sensation; we call it Bone Conduction (BC) sound. There have been several investigations of the transmission properties of bone-conducted sound. The aim of this study was to develop a software tool for easy generation of a finite element (FE) model of the human head with different materials based on human head anatomy, and to calculate sound conduction through the head. The developed software tool generates a model in a few steps. The first step is to segment CT medical images (DICOM) and generate surface mesh files (STL). Each STL file represents a different layer of the human head with different material properties (brain, CSF, different layers of the skull bone, skin, etc.). The next steps are to create a tetrahedral mesh from the obtained STL files, to define the FE model boundary conditions, and to solve the FE equations. This tool uses the PAK solver, the open-source software implemented in the SIFEM FP7 project, for calculations of the head vibration. The purpose of this tool is to show the impact of bone-conducted sound on the hearing system and to estimate how well the obtained results match experimental measurements.

  9. A tool framework for deriving the application architecture for global software development projects

    NARCIS (Netherlands)

    Yildiz, Bugra Mehmet; Tekinerdogan, B.; Cetin, Semih

    In order to meet the communication, coordination and control requirements of distributed Global Software Development (GSD) teams, it is necessary to define a proper software architecture. Designing a GSD architecture, however, involves a multitude of design decisions that are related in different

  10. Metabolomics: current state and evolving methodologies and tools.

    Science.gov (United States)

    Oldiges, Marco; Lütz, Stephan; Pflug, Simon; Schroer, Kirsten; Stein, Nadine; Wiendahl, Christiane

    2007-09-01

    In recent years, metabolomics has developed into an accepted and valuable tool in the life sciences. Substantial improvements in analytical hardware now allow metabolomics to be run routinely. Data are successfully used to investigate genotype-phenotype relations of strains and mutants. Metabolomics facilitates metabolic engineering to optimise microorganisms for white biotechnology and is spreading to the investigation of biotransformations and cell culture. Metabolomics serves as a source of not only qualitative but also quantitative data on intra-cellular metabolites, essential for the model-based description of the metabolic network operating under in vivo conditions. To collect reliable metabolome data sets, culture and sampling conditions, as well as the cells' metabolic state, are crucial. Hence, the application of biochemical engineering principles and method standardisation efforts become important. Together with the other, more established omics technologies, metabolomics will strengthen its claim to contribute to the detailed understanding of the in vivo function of gene products, biochemical and regulatory networks and, even more ambitiously, the mathematical description and simulation of the whole cell in the systems biology approach. This knowledge will allow the construction of designer organisms for process application using biotransformation and fermentative approaches making effective use of single enzymes, whole microbial and even higher cells.

  11. Numerical arc segmentation algorithm for a radio conference - A software tool for communication satellite systems planning

    Science.gov (United States)

    Whyte, W. A.; Heyward, A. O.; Ponchak, D. S.; Spence, R. L.; Zuzek, J. E.

    1988-01-01

    A detailed description of the Numerical Arc Segmentation Algorithm for a Radio Conference (NASARC) software package for communication satellite systems planning is presented. This software provides a method of generating predetermined arc segments for use in the development of an allotment planning procedure to be carried out at the 1988 World Administrative Radio Conference (WARC-88) on the use of the geostationary orbit (GEO) and the planning of space services utilizing GEO. The features of the NASARC software package are described, and detailed information is given about the function of each of the four NASARC program modules. The results of a sample world scenario are presented and discussed.

  12. Using McIDAS-V data analysis and visualization software as an educational tool for understanding the atmosphere

    Science.gov (United States)

    Achtor, T. H.; Rink, T.

    2010-12-01

    The University of Wisconsin's Space Science and Engineering Center (SSEC) has been at the forefront of developing data analysis and visualization tools for environmental satellites and other geophysical data. The fifth generation of the Man-computer Interactive Data Access System (McIDAS-V) is Java-based, open-source, freely available software that operates on Linux, Macintosh and Windows systems. The software tools provide powerful new data manipulation and visualization capabilities that work with geophysical data in research, operational and educational environments. McIDAS-V provides unique capabilities to support innovative techniques for evaluating research results, teaching and training. McIDAS-V is based on three powerful software elements. VisAD is a Java library for building interactive, collaborative, 4-dimensional visualization and analysis tools. The Integrated Data Viewer (IDV) is a reference application based on the VisAD system and developed by the Unidata program; it demonstrates the flexibility needed in this evolving environment, using a modern, object-oriented software design approach. The third element, HYDRA, allows users to build, display and interrogate multi- and hyperspectral environmental satellite data in powerful ways. The McIDAS-V software is being used for training and education in several settings. The McIDAS User Group provides training workshops at its annual meeting. Numerous online tutorials with training data sets have been developed to aid users in learning simple and more complex operations in McIDAS-V; all are available online. In a University of Wisconsin-Madison undergraduate course in Radar and Satellite Meteorology, McIDAS-V is used to create and deliver laboratory exercises using case-study and real-time data. At the high school level, McIDAS-V is used in several exercises in our annual Summer Workshop in Earth and Atmospheric Sciences to provide young scientists the opportunity to examine data with friendly and

  13. Evaluation of software tools for automated identification of neuroanatomical structures in quantitative β-amyloid PET imaging to diagnose Alzheimer's disease

    Energy Technology Data Exchange (ETDEWEB)

    Tuszynski, Tobias; Luthardt, Julia; Butzke, Daniel; Tiepolt, Solveig; Seese, Anita; Barthel, Henryk [Leipzig University Medical Centre, Department of Nuclear Medicine, Leipzig (Germany); Rullmann, Michael; Hesse, Swen; Sabri, Osama [Leipzig University Medical Centre, Department of Nuclear Medicine, Leipzig (Germany); Leipzig University Medical Centre, Integrated Treatment and Research Centre (IFB) Adiposity Diseases, Leipzig (Germany); Gertz, Hermann-Josef [Leipzig University Medical Centre, Department of Psychiatry, Leipzig (Germany); Lobsien, Donald [Leipzig University Medical Centre, Department of Neuroradiology, Leipzig (Germany)

    2016-06-15

    For regional quantification of nuclear brain imaging data, defining volumes of interest (VOIs) by hand is still the gold standard. As this procedure is time-consuming and operator-dependent, a variety of software tools for automated identification of neuroanatomical structures have been developed. As the quality and performance of those tools in analyzing amyloid PET data have so far been poorly investigated, we compared in this project four algorithms for automated VOI definition (HERMES Brass, two PMOD approaches, and FreeSurfer) against the conventional method. We systematically analyzed florbetaben brain PET and MRI data of ten patients with probable Alzheimer's dementia (AD) and ten age-matched healthy controls (HCs) collected in a previous clinical study. VOIs were manually defined on the data as well as through the four automated workflows. Standardized uptake value ratios (SUVRs) with the cerebellar cortex as a reference region were obtained for each VOI. SUVR comparisons between ADs and HCs were carried out using Mann-Whitney U tests, and effect sizes (Cohen's d) were calculated. SUVRs of automatically generated VOIs were correlated with SUVRs of conventionally derived VOIs (Pearson's tests). The composite neocortex SUVRs obtained by manually defined VOIs were significantly higher for ADs vs. HCs (p=0.010, d=1.53). This was also the case for the four tested automated approaches, which achieved effect sizes of d=1.38 to d=1.62. SUVRs of automatically generated VOIs correlated significantly with those of the hand-drawn VOIs in a number of brain regions, with regional differences in the degree of these correlations. The best overall correlation was observed in the lateral temporal VOI for all tested software tools (r=0.82 to r=0.95, p<0.001). Automated VOI definition by the software tools tested has great potential to substitute for the current standard procedure of manually defining VOIs in β-amyloid PET data analysis. (orig.)
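
    For reference, the quantities compared in this study are defined as follows (standard definitions, stated here for the reader rather than quoted from the record):

        \mathrm{SUVR}_{\mathrm{VOI}} = \frac{\bar{C}_{\mathrm{VOI}}}{\bar{C}_{\mathrm{cerebellar\ cortex}}}, \qquad
        d = \frac{\bar{x}_{\mathrm{AD}} - \bar{x}_{\mathrm{HC}}}{s_{\mathrm{pooled}}}

    where \bar{C} denotes the mean tracer uptake in a region and s_pooled the pooled standard deviation of the two groups.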

  14. Algorithms and software tools for ordering clone libraries: application to the mapping of the genome of Schizosaccharomyces pombe.

    Science.gov (United States)

    Mott, R; Grigoriev, A; Maier, E; Hoheisel, J; Lehrach, H

    1993-04-25

    A complete set of software tools to aid the physical mapping of a genome has been developed and successfully applied to the genomic mapping of the fission yeast Schizosaccharomyces pombe. Two approaches were used for ordering single-copy hybridisation probes: one was based on the simulated annealing algorithm to order all probes, and another on inferring the minimum-spanning subset of the probes using a heuristic filtering procedure. Both algorithms produced almost identical maps, with minor differences in the order of repetitive probes and those having identical hybridisation patterns. A separate algorithm fitted the clones to the established probe order. Approaches for handling experimental noise and repetitive elements are discussed. In addition to these programs and the database management software, tools for visualizing and editing the data are described. The issues of combining the information from different libraries are addressed. Also, ways of handling multiple-copy probes and non-hybridisation data are discussed.
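
    An illustrative Python reimplementation of the first approach (ordering probes by simulated annealing so that probes with similar clone hybridisation signatures end up adjacent) is sketched below; the cost function, move set and cooling schedule are assumptions, not the authors' published choices:

        import math
        import random

        def order_probes(signatures, n_steps=20000, t0=1.0, cooling=0.9995):
            """signatures: list of equal-length 0/1 tuples (probe-by-clone hybridisation hits)."""
            def dist(a, b):
                return sum(x != y for x, y in zip(signatures[a], signatures[b]))
            def cost(order):
                return sum(dist(i, j) for i, j in zip(order, order[1:]))
            order = list(range(len(signatures)))
            cur = cost(order)
            best, best_cost, temp = order[:], cur, t0
            for _ in range(n_steps):
                i, j = sorted(random.sample(range(len(order)), 2))
                cand = order[:i] + order[i:j + 1][::-1] + order[j + 1:]  # 2-opt style segment reversal
                delta = cost(cand) - cur
                if delta < 0 or random.random() < math.exp(-delta / temp):
                    order, cur = cand, cur + delta
                    if cur < best_cost:
                        best, best_cost = order[:], cur
                temp *= cooling
            return best, best_cost

        probes = [(1, 0, 1, 0), (1, 1, 1, 0), (0, 1, 0, 1), (1, 0, 0, 0)]
        print(order_probes(probes, n_steps=2000))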

  15. Development and verification of a software tool for the acoustic location of partial discharge in a power transformer

    Directory of Open Access Journals (Sweden)

    Polužanski Vladimir

    2014-01-01

    Full Text Available This paper discusses the development and verification of a software tool for determining the location of partial discharge in a power transformer with the acoustic method. A classification and systematization of the physical principles, detection methods and tests of partial discharge in power transformers is given at the beginning of the paper. The most important mathematical models, features, algorithms, and practical problems that affect measurement accuracy are highlighted. The paper then describes the development and implementation of a software tool for determining the location of partial discharge in a power transformer based on a non-iterative mathematical algorithm. The verification and measurement accuracy of the tool are demonstrated both by computer simulation and by experimental results available in the literature.
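
    The abstract does not reproduce the paper's exact non-iterative algorithm; as a hedged sketch of one standard non-iterative scheme, the snippet below solves the linearized time-difference-of-arrival (TDOA) equations for the discharge location in a single least-squares step, with hypothetical sensor positions, delays and oil sound speed:

        import numpy as np

        def locate_pd(sensors, dt, c=1400.0):
            """sensors: (n, 3) sensor coordinates in m; dt: (n-1,) arrival-time
            differences (s) relative to sensors[0]; c: sound speed in oil (m/s)."""
            r0, ri = sensors[0], sensors[1:]
            d = c * np.asarray(dt)                        # range differences (m)
            # Unknowns: source position x (3) and its range R0 to the reference sensor.
            A = np.hstack([2.0 * (r0 - ri), -2.0 * d[:, None]])
            b = np.sum(r0**2) - np.sum(ri**2, axis=1) + d**2
            sol, *_ = np.linalg.lstsq(A, b, rcond=None)
            return sol[:3]                                # estimated PD location

        sensors = np.array([[0.0, 0, 0], [2.0, 0, 0], [0, 2.0, 0], [0, 0, 2.0], [2.0, 2.0, 0]])
        print(locate_pd(sensors, [0.1e-3, 0.2e-3, 0.15e-3, 0.25e-3]))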

  16. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    Science.gov (United States)

    Skandamis, Panagiotis N.; Andritsos, Nikolaos; Psomas, Antonios; Paramythiotis, Spyridon

    2015-01-01

    The current status of the food safety supply worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To that end, a series of new Risk Metrics has been established, as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, is defined by quantitative risk assessments and used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is rather a metric of the tolerable public health burden (it addresses the total `failure' that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products have been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve an FSO, tools quantifying the effect of processes and intrinsic properties of foods on survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and of the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., COMBASE and the Microbial Responses Viewer), or introduced into user-friendly software
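
    As a worked illustration of how these metrics fit together: a widely cited formulation (ICMSF) requires H0 - ΣR + ΣI ≤ FSO, with all terms in log10 cfu/g, where H0 is the initial hazard level, R the cumulative reductions (e.g., cooking) and I the cumulative increases (growth, recontamination). A minimal Python check with hypothetical numbers:

        # Hedged illustration of the ICMSF-style FSO compliance check; all
        # values are hypothetical and in log10 cfu/g.
        def meets_fso(h0, reductions, increases, fso):
            level_at_consumption = h0 - sum(reductions) + sum(increases)
            return level_at_consumption <= fso, level_at_consumption

        ok, level = meets_fso(h0=3.0, reductions=[6.0], increases=[1.5], fso=-1.0)
        print(f"level at consumption = {level:+.1f} log10 cfu/g, FSO met: {ok}")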

  17. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    Energy Technology Data Exchange (ETDEWEB)

    Skandamis, Panagiotis N., E-mail: pskan@aua.gr; Andritsos, Nikolaos, E-mail: pskan@aua.gr; Psomas, Antonios, E-mail: pskan@aua.gr; Paramythiotis, Spyridon, E-mail: pskan@aua.gr [Laboratory of Food Quality Control and Hygiene, Department of Food Science and Technology, Agricultural University of Athens, Iera Odos 75, 118 55, Athens (Greece)

    2015-01-22

    The current status of the food safety supply worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To that end, a series of new Risk Metrics has been established, as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, is defined by quantitative risk assessments and used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is rather a metric of the tolerable public health burden (it addresses the total ‘failure’ that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products have been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve an FSO, tools quantifying the effect of processes and intrinsic properties of foods on survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and of the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., COMBASE and the Microbial Responses Viewer), or introduced into user

  18. Analysis of metabolomic data: tools, current strategies and future challenges for omics data integration.

    Science.gov (United States)

    Cambiaghi, Alice; Ferrario, Manuela; Masseroli, Marco

    2017-05-01

    Metabolomics is a rapidly growing field consisting of the analysis of a large number of metabolites at a system scale. The two major goals of metabolomics are the identification of the metabolites characterizing each organism state and the measurement of their dynamics under different situations (e.g. pathological conditions, environmental factors). Knowledge about metabolites is crucial for the understanding of most cellular phenomena, but this information alone is not sufficient to gain a comprehensive view of all the biological processes involved. Integrated approaches combining metabolomics with transcriptomics and proteomics are thus required to obtain much deeper insights than any of these techniques alone. Although this information is available, multilevel integration of different 'omics' data is still a challenge. The handling, processing, analysis and integration of these data require specialized mathematical, statistical and bioinformatics tools, and several technical problems hampering rapid progress in the field exist. Here, we review four of the several tools available for metabolomic data analysis and integration with other 'omics' data (MetaCore™, MetaboAnalyst, InCroMAP and 3Omics), selected on the basis of their number of users or provided features, highlighting their strong and weak aspects; a number of related issues affecting data analysis and integration are also identified and discussed. Overall, we provide an objective description of how some of the main currently available software packages work, which may help the experimental practitioner in the choice of a robust pipeline for metabolomic data analysis and integration. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  19. The Seismic Tool-Kit (STK): an open source software for seismology and signal processing.

    Science.gov (United States)

    Reymond, Dominique

    2016-04-01

    We present an open source software project (GNU public license), named STK: Seismic Tool-Kit, that is dedicated mainly to seismology and signal processing. The STK project, started in 2007, is hosted by SourceForge.net and counts more than 19,500 downloads at the date of writing. The STK project is composed of two main branches. First, a graphical interface dedicated to signal processing of data in the SAC format (SAC_ASCII and SAC_BIN), where the signal can be plotted, zoomed, filtered, integrated, differentiated, etc. (a large variety of IIR and FIR filters is provided). The spectral density of the signal is estimated via the Fourier transform, with visualization of the Power Spectral Density (PSD) on a linear or log scale, as well as an evolutive time-frequency representation (or sonagram). Three-component signals can also be processed to estimate their polarization properties, either for a given window or for evolving windows along the trace. This polarization analysis is useful for extracting polarized noise and for differentiating P waves, Rayleigh waves, Love waves, etc. Second, a panel of utility programs is provided for working in terminal mode, with basic programs for computing azimuth and distance in spherical geometry, inter/auto-correlation, spectral density, time-frequency representations for an entire directory of signals, focal planes and principal component axes, radiation patterns of P waves, polarization analysis of different waves (including noise), under/over-sampling of signals, cubic-spline smoothing, and linear/non-linear regression analysis of data sets. A MINimum library of Linear AlGebra (MIN-LINAG) is also provided for the main matrix operations: QR/QL decomposition, Cholesky solution of linear systems, eigenvalue/eigenvector computation, QR-solve/eigen-solve of linear equation systems, etc. STK is developed in C/C++, mainly under Linux, and has also been partially implemented under MS-Windows. Useful links: http
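
    For orientation, the kind of spectral-density estimate STK provides can be sketched in a few lines of Python using SciPy's Welch estimator on a synthetic trace (this is an illustration, not STK code):

        import numpy as np
        from scipy.signal import welch

        fs = 100.0                                   # sampling rate (Hz)
        t = np.arange(0, 60, 1 / fs)
        # Synthetic trace: a 1.5 Hz oscillation buried in noise
        trace = np.sin(2 * np.pi * 1.5 * t) + 0.5 * np.random.default_rng(1).normal(size=t.size)

        freqs, psd = welch(trace, fs=fs, nperseg=1024)
        print(f"dominant frequency: {freqs[np.argmax(psd)]:.2f} Hz")  # ~1.5 Hz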

  20. A user-friendly software tool for the solution of the time-dependent Schroedinger and Gross-Pitaevskii equations

    Energy Technology Data Exchange (ETDEWEB)

    Serafini, Thomas; Bertoni, Andrea, E-mail: andrea.bertoni@unimore.i [S3 National Research Center, INFM-CNR, 41125 Modena (Italy)

    2009-11-15

    In this work we present TDStool, a general-purpose, easy-to-use software tool for the solution of the time-dependent Schroedinger equation in 2D and 3D domains with arbitrary time-dependent potentials. The numerical algorithms adopted in the code, namely the Fourier split-step and box-integration methods, are sketched and the main characteristics of the tool are illustrated. As an example, the dynamics of a single electron in systems of two and three coupled quantum dots is obtained. The code is released as an open-source project and has a built-in graphical interface for the visualization of the results.
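
    A minimal one-dimensional sketch of the Fourier split-step scheme named above (TDStool itself handles 2D and 3D domains and arbitrary time-dependent potentials; units with hbar = m = 1 and a static harmonic potential are assumptions made here for brevity):

        import numpy as np

        n, L, dt, steps = 512, 40.0, 0.005, 2000
        x = np.linspace(-L / 2, L / 2, n, endpoint=False)
        k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)
        V = 0.5 * x**2                                  # harmonic potential (static here)
        psi = np.exp(-(x - 2.0) ** 2)                   # displaced Gaussian wave packet
        psi = psi / np.sqrt(np.trapz(np.abs(psi) ** 2, x))

        half_v = np.exp(-0.5j * V * dt)                 # half-step potential propagator
        kin = np.exp(-0.5j * k**2 * dt)                 # full-step kinetic propagator
        for _ in range(steps):                          # Strang splitting
            psi = half_v * np.fft.ifft(kin * np.fft.fft(half_v * psi))

        print("norm after propagation:", np.trapz(np.abs(psi) ** 2, x))  # ~1.0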

  1. From field to cloud: a collaborative software tool to manage hydrological observatories

    Science.gov (United States)

    Kraft, Philipp; Weber, Chris P.; Windhorst, David; Breuer, Lutz

    2017-04-01

    marked as an error. Transformation functions convert direct observations to derived data, such as discharge, on the fly; improved stage-discharge relations therefore apply directly to older measurements. The management system consists of a web portal, plotting and mapping facilities, import and export functions, an image database, and a management tool to assign tasks. A transparent link to the CUAHSI Hydrological Information System (HIS), a data sharing portal, is currently under development using the standardized WaterML interface. The system is freely available and built upon open source tools. It is in operational use for three observatories located in Germany, Ecuador and Kenya, holding 10 to 50 million records.
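
    A hedged sketch of such an on-the-fly transformation function: a power-law stage-discharge rating applied at read-out, so that improved rating parameters automatically propagate to older stage records (the parameter values are hypothetical):

        def stage_to_discharge(stage_m, a=4.2, h0=0.15, b=1.6):
            """Convert water stage (m) to discharge (m^3/s) via Q = a * (h - h0)**b.
            Returns None for stages at or below the rating's zero-flow stage h0."""
            head = stage_m - h0
            return a * head**b if head > 0 else None

        # Because conversion happens when data are read, re-fitting (a, h0, b)
        # re-rates all historical stage measurements without touching raw data.
        print(stage_to_discharge(0.85))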

  2. A Survey Identifying Trends on Use of Software Development Tools in Different Indian SMEs

    National Research Council Canada - National Science Library

    Nomi Baruah; Ashima

    2012-01-01

    .... The SMEs are using software process improvement models but they are not able to follow all the processes, due to lack of resources and cost to improve their productivity and the quality of their product...

  3. 1st International Workshop on Tools for Managing Globally Distributed Software Development (TOMAG 2007)

    NARCIS (Netherlands)

    Harmsen, Frank; van Hillegersberg, Jos; Harmsen, Frank; Amrit, Chintan Amrit; Geisberger, Eva; Keil, Patrick; Kuhrmann, Marco

    2007-01-01

    The advent of global distribution of software development has made managing collaboration and coordination among developers more difficult for various reasons, including physical distance, time differences, cultural differences, etc. A nearly total absence of informal communication among

  4. METHODS AND TOOLS FOR DEVELOPING COMPUTER LEARNING SOFTWARE ON ELECTRICAL ENGINEERING BASED ON WIKI TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    Roman V. Alekhin

    2014-01-01

    Full Text Available This work is devoted to issues related to the development of learning software and, in particular, digital libraries and knowledge bases in different fields and disciplines. The main attention is paid to the development of computer learning software for electrical engineering on the basis of the rapidly growing and popular wiki technology. This work was supported by RFBR (projects 12-08-00358, 14-01-00427, 12-07-00508).

  5. GUM2DFT—a software tool for uncertainty evaluation of transient signals in the frequency domain

    Science.gov (United States)

    Eichstädt, S.; Wilkens, V.

    2016-05-01

    The Fourier transform and its counterpart for discrete time signals, the discrete Fourier transform (DFT), are common tools in measurement science and application. Although almost every scientific software package offers ready-to-use implementations of the DFT, the propagation of uncertainties in line with the guide to the expression of uncertainty in measurement (GUM) is typically neglected. This is of particular importance in dynamic metrology, when input estimation is carried out by deconvolution in the frequency domain. To this end, we present the new open-source software tool GUM2DFT, which utilizes closed formulas for the efficient propagation of uncertainties for the application of the DFT, inverse DFT and input estimation in the frequency domain. It handles different frequency domain representations, accounts for autocorrelation and takes advantage of the symmetry inherent in the DFT result for real-valued time domain signals. All tools are presented in terms of examples which form part of the software package. GUM2DFT will foster GUM-compliant evaluation of uncertainty in a DFT-based analysis and enable metrologists to include uncertainty evaluations in their routine work.
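
    GUM2DFT uses closed formulas for the propagation; as an illustrative cross-check of what propagating uncertainty through the DFT means, the sketch below does it by Monte Carlo sampling in the spirit of GUM Supplement 1, with a made-up transient signal and per-sample uncertainty:

        import numpy as np

        rng = np.random.default_rng(42)
        n, runs = 256, 5000
        t = np.arange(n) / n
        signal = np.exp(-((t - 0.3) ** 2) / 0.002)        # transient pulse estimate
        u_signal = 0.01                                   # std. uncertainty per sample

        # Draw noisy realizations, transform each, and collect statistics.
        spectra = np.fft.rfft(signal + u_signal * rng.normal(size=(runs, n)), axis=1)
        amp_mean = np.abs(spectra).mean(axis=0)
        amp_unc = np.abs(spectra).std(axis=0, ddof=1)     # uncertainty of |X(f)|
        print(amp_unc[:5])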

  6. Particle Loss Calculator – a new software tool for the assessment of the performance of aerosol inlet systems

    Directory of Open Access Journals (Sweden)

    S.-L. von der Weiden

    2009-09-01

    Full Text Available Most aerosol measurements require an inlet system to transport aerosols from a select sampling location to a suitable measurement device through some length of tubing. Such inlet systems must be optimized to minimize aerosol sampling artifacts and maximize sampling efficiency. In this study we introduce a new multifunctional software tool (Particle Loss Calculator, PLC) that can be used to quickly determine aerosol sampling efficiency and particle transport losses due to passage through arbitrary tubing systems. The software employs relevant empirical and theoretical relationships found in established literature and accounts for the most important sampling and transport effects that might be encountered during deployment of typical, ground-based ambient aerosol measurements through a constant-diameter sampling probe. The software treats non-isoaxial and non-isokinetic aerosol sampling, aerosol diffusion and sedimentation as well as turbulent inertial deposition and inertial deposition in bends and contractions of tubing. This software was validated through comparison with experimentally determined particle losses for several tubing systems bent to create various diffusion, sedimentation and inertial deposition properties. As long as the tube geometries are not "too extreme", agreement is satisfactory. We discuss the conclusions of these experiments, the limitations of the software and present three examples of the use of the Particle Loss Calculator in the field.
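
    As a crude, hedged illustration of one loss mechanism the tool covers, the following Python sketch estimates gravitational settling losses in a horizontal laminar sampling line from the Stokes settling velocity and a simple plug-flow approximation (slip correction neglected; all parameter values hypothetical). The PLC itself uses the full empirical and theoretical relations from the literature:

        import math

        def settling_penetration(dp_um, tube_len_m, tube_d_m, flow_lpm,
                                 rho_p=1000.0, mu=1.81e-5, g=9.81):
            dp = dp_um * 1e-6
            v_ts = rho_p * dp**2 * g / (18 * mu)                     # Stokes settling velocity (m/s)
            u = (flow_lpm / 60000.0) / (math.pi * tube_d_m**2 / 4)   # mean flow speed (m/s)
            loss = v_ts * (tube_len_m / u) / tube_d_m                # settled fraction (valid for loss << 1)
            return max(0.0, 1.0 - loss)

        for dp in (0.1, 1.0, 5.0, 10.0):
            print(f"{dp:5.1f} um: penetration = {settling_penetration(dp, 2.0, 0.006, 5.0):.3f}")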

  7. Particle Loss Calculator – a new software tool for the assessment of the performance of aerosol inlet systems

    Directory of Open Access Journals (Sweden)

    S.-L. von der Weiden

    2009-04-01

    Full Text Available Most aerosol measurements require an inlet system to transport aerosols from a select sampling location to a suitable measurement device through some length of tubing. Such inlet systems must be optimized to minimize aerosol sampling artifacts and maximize sampling efficiency. In this study we introduce a new multifunctional software tool (Particle Loss Calculator, PLC) that can be used to quickly determine aerosol sampling efficiency and particle transport losses due to passage through arbitrary tubing systems. The software employs relevant empirical and theoretical relationships found in established literature and accounts for the most important sampling and transport effects that might be encountered during deployment of typical, ground-based ambient aerosol measurements. The software treats non-isoaxial and non-isokinetic aerosol sampling, aerosol diffusion and sedimentation as well as turbulent inertial deposition and inertial deposition in bends and contractions of tubing. This software was validated through comparison with experimentally determined particle losses for several tubing systems bent to create various diffusion, sedimentation and inertial deposition properties. As long as the tube geometries are not "too extreme", agreement is satisfactory. We discuss the conclusions of these experiments, the limitations of the software and present three examples of the use of the Particle Loss Calculator in the field.

  8. TU-C-17A-03: An Integrated Contour Evaluation Software Tool Using Supervised Pattern Recognition for Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Chen, H; Tan, J; Kavanaugh, J; Dolly, S; Gay, H; Thorstad, W; Anastasio, M; Altman, M; Mutic, S; Li, H [Washington University School of Medicine, Saint Louis, MO (United States)

    2014-06-15

    Purpose: Radiotherapy (RT) contours delineated either manually or semiautomatically require verification before clinical usage. Manual evaluation is very time consuming. A new integrated software tool using supervised contour pattern recognition was thus developed to facilitate this process. Methods: The contouring tool was developed using an object-oriented programming language, C#, and application programming interfaces, e.g. the Visualization Toolkit (VTK). The C# language served as the tool design basis. The Accord.Net scientific computing libraries were utilized for the required statistical data processing and pattern recognition, while VTK was used to build and render 3-D mesh models from critical RT structures in real time with 360° visualization. Principal component analysis (PCA) was used for self-updating the system's model of the geometric variation of normal structures, based on physician-approved RT contours as a training dataset. An in-house supervised PCA-based contour recognition method was used to automatically evaluate contour normality/abnormality. The function for reporting the contour evaluation results was implemented using C# and the Windows Form Designer. Results: The software input was RT simulation images and RT structures from commercial clinical treatment planning systems. Several abilities were demonstrated: automatic assessment of RT contours, file loading/saving of various modality medical images and RT contours, and generation/visualization of 3-D images and anatomical models. Moreover, it supported 360° rendering of the RT structures in a multi-slice view, which allows physicians to visually check and edit abnormally contoured structures. Conclusion: This new software integrates the supervised learning framework with image processing and graphical visualization modules for RT contour verification. This tool has great potential for facilitating treatment planning with the assistance of an automatic contour evaluation module in avoiding
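
    A hedged sketch of the supervised-PCA idea: learn the normal variation of contour shape features from physician-approved contours, then flag contours whose PCA reconstruction error exceeds a tolerance derived from the training set. Feature extraction from DICOM-RT contours is out of scope here; the fixed-length descriptors below are hypothetical random placeholders:

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(7)
        approved = rng.normal(0, 1, (200, 30))            # training set: "normal" contour features
        pca = PCA(n_components=10).fit(approved)

        def abnormality_score(features):
            # Reconstruction error after projecting onto the normal subspace
            recon = pca.inverse_transform(pca.transform(features[None, :]))
            return float(np.linalg.norm(features - recon))

        threshold = np.percentile(
            [abnormality_score(f) for f in approved], 95)  # tolerance from training data
        new_contour = rng.normal(0, 3, 30)                 # exaggerated outlier
        print(abnormality_score(new_contour) > threshold)  # -> likely True (flag for review)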

  9. Lilith: A software framework for the rapid development of scalable tools for distributed computing

    Energy Technology Data Exchange (ETDEWEB)

    Gentile, A.C.; Evensky, D.A.; Armstrong, R.C.

    1998-03-01

    Lilith is a general purpose framework, written in Java, that provides a highly scalable distribution of user code across a heterogeneous computing platform. By creation of suitable user code, the Lilith framework can be used for tool development. The scalable performance provided by Lilith is crucial to the development of effective tools for large distributed systems. Furthermore, since Lilith handles the details of code distribution and communication, the user code need focus primarily on the tool functionality, thus, greatly decreasing the time required for tool development. In this paper, the authors concentrate on the use of the Lilith framework to develop scalable tools. The authors review the functionality of Lilith and introduce a typical tool capitalizing on the features of the framework. They present new Objects directly involved with tool creation. They explain details of development and illustrate with an example. They present timing results demonstrating scalability.

  10. Development of a case tool to support decision based software development

    Science.gov (United States)

    Wild, Christian J.

    1993-01-01

    A summary of the accomplishments of the research over the past year is presented. Achievements include: gave demonstrations of DHC, a prototype supporting the decision-based software development (DBSD) methodology, for Paramax personnel at ODU; met with Paramax personnel to discuss DBSD issues, the process of integrating DBSD and Refinery, and the porting process model; completed and submitted a paper describing the DBSD paradigm to IFIP '92; completed and presented a paper describing the approach for software reuse at the Software Reuse Workshop in April 1993; continued to extend DHC with a project agenda, a facility necessary for better project management; completed a primary draft of the re-engineering process model for porting; created a logging form to trace all the activities involved in the process of solving the re-engineering problem; and developed a primary chart of the problems involved in the re-engineering process.

  11. Software Tools for Software Maintenance

    Science.gov (United States)

    1988-10-01

    [The original record contains a heavily garbled, OCR-damaged table of mainframe software maintenance tools; recoverable entries include Quikjob, Cobol Debug, QUODS (Quick Online Debugging System), Superbug and Tracer, listed against their platforms (IBM mainframe, DOS/OS) and languages (Cobol, Assembler, Fortran).]

  12. Lilith: A software framework for the rapid development of scalable tools for distributed computing

    Energy Technology Data Exchange (ETDEWEB)

    Gentile, A.C.; Evensky, D.A.; Armstrong, R.C.

    1997-12-31

    Lilith is a general purpose tool that provides a highly scalable, easy distribution of user code across a heterogeneous computing platform. By handling the details of code distribution and communication, such a framework allows for the rapid development of tools for the use and management of large distributed systems. This speed-up in development not only enables the easy creation of tools as needed but also facilitates the ultimate development of more refined, hard-coded tools as well. Lilith is written in Java, providing platform independence and further facilitating rapid tool development through Object reuse and ease of development. The authors present the user-involved objects in the Lilith Distributed Object System and the Lilith User API. They present an example of tool development, illustrating the user calls, and present results demonstrating Lilith's scalability.

  13. ARCHER, a New Monte Carlo Software Tool for Emerging Heterogeneous Computing Environments

    Science.gov (United States)

    Xu, X. George; Liu, Tianyu; Su, Lin; Du, Xining; Riblett, Matthew; Ji, Wei; Gu, Deyang; Carothers, Christopher D.; Shephard, Mark S.; Brown, Forrest B.; Kalra, Mannudeep K.; Liu, Bob

    2014-06-01

    The Monte Carlo radiation transport community faces a number of challenges associated with peta- and exa-scale computing systems that rely increasingly on heterogeneous architectures involving hardware accelerators such as GPUs. Existing Monte Carlo codes and methods must be strategically upgraded to meet emerging hardware and software needs. In this paper, we describe the development of a software tool called ARCHER (Accelerated Radiation-transport Computations in Heterogeneous EnviRonments), which is designed as a versatile testbed for future Monte Carlo codes. Preliminary results from five projects in nuclear engineering and medical physics are presented.
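
    For orientation only, a toy Monte Carlo kernel of the kind such codes accelerate (analog photon transmission through a homogeneous slab with exponentially sampled free paths, and no scattering physics) can be vectorized in a few lines of Python; ARCHER itself targets GPUs and coprocessors, and the attenuation coefficient below is made up:

        import numpy as np

        mu = 0.2            # attenuation coefficient (1/cm), hypothetical
        thickness = 10.0    # slab thickness (cm)
        n = 1_000_000       # number of histories

        rng = np.random.default_rng(0)
        free_paths = rng.exponential(scale=1.0 / mu, size=n)   # sampled path lengths
        transmitted = np.count_nonzero(free_paths > thickness) / n
        print(f"MC: {transmitted:.4f}  analytic: {np.exp(-mu * thickness):.4f}")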

  14. New EPA 'PLUS' software: A useful tool for local emergency planners

    Energy Technology Data Exchange (ETDEWEB)

    Anastas, P.T.; Tobin, P.S.

    1993-09-01

    EPA's Office of Pollution Prevention and Toxics has produced a new bibliographic database designed for local emergency planners. The "PLUS" (Planner's Library in User-friendly Software) system is a resource database containing abstracts of hundreds of references useful in planning for hazardous substance emergencies. The software's structure allows planners to construct customized search strategies for generating lists of emergency planning-related references on subjects ranging from chemical profiles to personal protective gear.

  15. Validation of a Low-Thrust Mission Design Tool Using Operational Navigation Software

    Science.gov (United States)

    Englander, Jacob A.; Knittel, Jeremy M.; Williams, Ken; Stanbridge, Dale; Ellison, Donald H.

    2017-01-01

    Design of flight trajectories for missions employing solar electric propulsion requires a suitably high-fidelity design tool. In this work, the Evolutionary Mission Trajectory Generator (EMTG) is presented as a medium-high fidelity design tool that is suitable for mission proposals. EMTG is validated against the high-heritage deep-space navigation tool MIRAGE, demonstrating both the accuracy of EMTG's model and an operational mission design and navigation procedure using both tools. The validation is performed using a benchmark mission to the Jupiter Trojans.

  16. SU-E-T-203: Development of a QA Software Tool for Automatic Verification of Plan Data Transfer and Delivery.

    Science.gov (United States)

    Chen, G; Li, X

    2012-06-01

    Consistency verification between the data from a treatment planning system (TPS), a record and verification (R&V) system, and the delivery records by visual inspection is time consuming and subject to human error. The purpose of this work is to develop a software tool to automatically perform such verifications. Using Microsoft Visual C++, a quality assurance (QA) tool was developed to (1) read plan data including gantry/collimator/couch parameters, multi-leaf-collimator leaf positions, and monitor unit (MU) numbers from a TPS (Xio, CMS/Elekta, or RealART, Prowess) via RTP link or DICOM transfer, (2) retrieve imported (prior to delivery) and recorded (after delivery) data from an R&V system (Mosaiq, Elekta) with open database connectivity, (3) calculate MU independently based on the DICOM plan data using a modified Clarkson integration algorithm, and (4) compare all the extracted data to identify possible discrepancies between TPS and R&V, and between R&V and delivery. The tool was tested for 20 patients with 3DCRT and IMRT plans from regular and online adaptive radiotherapy treatments. It was capable of automatically detecting, within a few seconds, any inconsistency between the beam data from the TPS and the data stored in the R&V system, with an independent MU check, as well as any significant treatment delivery deviation from the plan. With this tool being used prior to and after delivery as an essential QA step, our clinical online adaptive re-planning process can be sped up, saving a few minutes by eliminating the tedious visual inspection. A QA software tool has been developed to automatically verify treatment data consistency from delivery back to plan and to identify discrepancies in MU calculations between the TPS and the secondary MU check. This tool speeds up the clinical QA process and eliminates human errors from visual inspection, thus improving safety. © 2012 American Association of Physicists in Medicine.
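
    A hedged sketch of the transfer-check idea using pydicom: read the per-beam monitor units from the DICOM RT Plan and compare them with the values stored in the R&V system (represented here by a plain dictionary; the actual tool queries Mosaiq via ODBC, and the tolerance value is an assumption):

        import pydicom

        def beam_mus(rtplan_path):
            """Extract {beam number: monitor units} from a DICOM RT Plan file."""
            ds = pydicom.dcmread(rtplan_path)
            refs = ds.FractionGroupSequence[0].ReferencedBeamSequence
            return {int(rb.ReferencedBeamNumber): float(rb.BeamMeterset) for rb in refs}

        def check_transfer(rtplan_path, rv_mus, tol=0.1):
            """rv_mus: {beam number: MU} as imported into the R&V system."""
            plan = beam_mus(rtplan_path)
            return {n: abs(plan[n] - rv_mus.get(n, float("nan"))) <= tol for n in plan}

        # Example (paths and values hypothetical):
        # print(check_transfer("rtplan.dcm", {1: 124.3, 2: 98.7}))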

  17. Cyber-physical systems software development: way of working and tool suite

    NARCIS (Netherlands)

    Bezemer, M.M.

    2013-01-01

    Designing embedded control software for modern cyber-physical systems becomes more and more difficult, because of the increasing amount and complexity of their requirements. The regular requirements are extended with modern requirements, for example, to get a general purpose cyber-physical system

  18. Teaching Locus with a Conserved Property by Integrating Mathematical Tools and Dynamic Geometric Software

    Science.gov (United States)

    Stupel, Moshe; Segal, Ruti; Oxman, Victor

    2016-01-01

    In this article, we present investigative tasks that concern loci, which integrate the use of dynamic geometry software (DGS) with mathematics for proving the obtained figures. Additional conditions were added to the loci: ellipse, parabola and circle, which result in the emergence of new loci, similar in form to the original loci. The…

  19. Featureous: A Tool for Feature-Centric Analysis of Java Software

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    Feature-centric comprehension of source code is necessary for incorporating user-requested modifications during software evolution and maintenance. However, such comprehension is difficult to achieve in case of large object-oriented programs due to the size, complexity, and implicit character...

  20. Software tools for data modelling and processing of human body temperature circadian dynamics.

    Science.gov (United States)

    Petrova, Elena S; Afanasova, Anastasia I

    2015-01-01

    This paper presents a software development for simulating and processing thermometry data. The motivation of this research is the miniaturization of actuators attached to the human body, which allows frequent temperature measurements and improves medical diagnosis procedures related to circadian dynamics.

  1. Cyber-physical systems software development: way of working and tool suite

    NARCIS (Netherlands)

    Bezemer, Maarten Matthijs

    2013-01-01

    Designing embedded control software for modern cyber-physical systems becomes more and more difficult, because of the increasing amount and complexity of their requirements. The regular requirements are extended with modern requirements, for example, to get a general purpose cyber-physical system ca

  2. APASVO: A free software tool for automatic P-phase picking and event detection in seismic traces

    Science.gov (United States)

    Romero, José Emilio; Titos, Manuel; Bueno, Ángel; Álvarez, Isaac; García, Luz; Torre, Ángel de la; Benítez, Mª Carmen

    2016-05-01

    The accurate estimation of the arrival time of seismic waves, or picking, is a problem of major interest in seismic research given its relevance in many seismological applications, such as earthquake source location and active seismic tomography. In recent decades, several automatic picking methods have been proposed with the ultimate goal of implementing picking algorithms whose results are comparable to those obtained by manual picking. In order to facilitate the use of these automated methods in the analysis of seismic traces, this paper presents a new free, open source, graphical software tool, named APASVO, which allows picking tasks to be performed in an easy and user-friendly way. The tool also provides event detection functionality, for cases where a relatively imprecise estimation of the onset time is sufficient. The application implements the STA-LTA detection algorithm and the AMPA picking algorithm. An autoregressive AIC-based picking method can also be applied. In addition, this graphical tool is complemented with two command line tools: an event picking tool and a synthetic earthquake generator. APASVO is a multiplatform tool that works on Windows, Linux and OS X. The application can process data in a large variety of file formats. It is implemented in Python and relies on well-known scientific computing packages such as ObsPy, NumPy, SciPy and Matplotlib.
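
    As an illustration of the STA-LTA detector the tool implements, here is a plain NumPy sketch on a synthetic trace; ObsPy users can call obspy.signal.trigger.classic_sta_lta instead, and APASVO's own implementation may differ in detail:

        import numpy as np

        def sta_lta(trace, nsta, nlta):
            """Ratio of short-term to long-term average signal energy."""
            energy = trace.astype(float) ** 2
            csum = np.concatenate(([0.0], np.cumsum(energy)))
            sta = (csum[nsta:] - csum[:-nsta]) / nsta          # short-term average
            lta = (csum[nlta:] - csum[:-nlta]) / nlta          # long-term average
            ratio = np.zeros_like(energy)
            ratio[nlta - 1:] = sta[nlta - nsta:] / np.maximum(lta, 1e-12)
            return ratio

        fs = 100
        trace = np.random.default_rng(3).normal(0, 1, 60 * fs)
        trace[3000:3200] += 8 * np.sin(np.linspace(0, 40 * np.pi, 200))  # synthetic "event"
        trig = sta_lta(trace, nsta=fs, nlta=10 * fs)
        print("onset sample ~", int(np.argmax(trig > 4.0)))              # near sample 3000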

  3. Milling tool wear diagnosis by feed motor current signal using an artificial neural network

    Energy Technology Data Exchange (ETDEWEB)

    Khajavi, Mehrdad Nouri; Nasernia, Ebrahim; Rostaghi, Mostafa [Dept. of Mechanical Engineering, Shahid Rajaee Teacher Training University, Tehran (Iran, Islamic Republic of)

    2016-11-15

    In this paper, a multi-layer perceptron (MLP) neural network was used to predict tool wear in face milling. For this purpose, a series of experiments was conducted using a milling machine on a CK45 workpiece. Tool wear was measured by an optical microscope. To improve the accuracy and reliability of the monitoring system, the tool wear state was classified into five groups, namely, no wear, slight wear, normal wear, severe wear and broken tool. Experiments were conducted with the aforementioned tool wear states under different machining conditions, and data were extracted. An increase in current amplitude was observed as the tool wear increased. Furthermore, the effects of parameters such as tool wear, feed, and cut depth on motor current consumption were analyzed. Considering the complexity of the wear state classification, a multi-layer neural network was used. The root mean square of the motor current, feed, cut depth, and tool rpm were chosen as the inputs, and the amount of flank wear as the output, of the MLP. Results showed good performance of the designed tool wear monitoring system.
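
    A hedged sketch of such a monitoring model with scikit-learn, mapping (RMS motor current, feed, cut depth, spindle rpm) to one of the five wear classes; the training data below are random placeholders, not the authors' measurements:

        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 4))          # columns: [rms_current, feed, depth, rpm]
        y = rng.integers(0, 5, size=500)       # 0=no wear ... 4=broken tool

        clf = make_pipeline(
            StandardScaler(),
            MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0),
        )
        clf.fit(X, y)
        print(clf.predict(X[:3]))              # predicted wear classes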

  4. Investigation of the current requirements engineering practices among software developers at the Universiti Utara Malaysia Information Technology (UUMIT) centre

    Science.gov (United States)

    Hussain, Azham; Mkpojiogu, Emmanuel O. C.; Abdullah, Inam

    2016-08-01

    Requirements Engineering (RE) is a systematic and integrated process of eliciting, elaborating, negotiating, validating and managing the requirements of a system in a software development project. UUM has been supported by various systems developed and maintained by the UUM Information Technology (UUMIT) Centre. The aim of this study was to assess the current requirements engineering practices at UUMIT. The main problem that prompted this research is the lack of studies that support software development activities at the UUMIT. The study is geared toward helping UUMIT produce quality but time- and cost-saving software products by implementing cutting-edge, state-of-the-art requirements engineering practices. The study also contributes to UUM by identifying the activities needed for software development, so that management will be able to allocate budget to provide adequate and precise training for the software developers. Three variables were investigated: Requirements Description, Requirements Development (comprising Requirements Elicitation, Requirements Analysis and Negotiation, and Requirements Validation), and Requirements Management. The results of the study showed that the current practice of requirements engineering at UUMIT is encouraging, but still needs further development and improvement, because a few RE practices were seldom practiced.

  5. The Application of Intentional Subjective Properties and Mediated Communication Tools to Software Agents in Online Disputes Resolution Environments

    Directory of Open Access Journals (Sweden)

    Renzo Gobbin

    2004-11-01

    Full Text Available This paper examines the use of subjective properties in modeling an architecture for cooperative agents using an Agent Communication Language (ACL), which is used as a mediating tool for cooperative communication activities between and within software agents. The role that subjective and objective properties have in explaining and modeling agent internalization and externalization of ACL messages is investigated and related to Vygotsky's developmental learning theories, such as Mediated Activity Theory. A novel agent architecture, ALMA (Agent Language Mediated Activity), based on the integration of agents' subjective and objective properties within an agent communication activity framework, is presented. The relevance of software agents' subjective properties in modeling applications such as e-Law Online Dispute Resolution for e-business contractual arrangements, using natural-language subject/object relations in their communication patterns, is discussed.

  6. APPLICATION OF THE PSP MODEL, MANUAL AND SUPPORTED BY A CASE TOOL, IN A CASE STUDY OF A BRAZILIAN SOFTWARE FACTORY

    Directory of Open Access Journals (Sweden)

    Denis Ávila Montini

    2006-06-01

    Full Text Available In a context of continuous quality improvement in software development projects, the PSP experimental process was applied to discipline some of the processes suggested by CMMI level 2, using two different strategies. The first consisted of observing the behavior of a software factory when the data required by the PSP model were collected manually; in the second, the collection was supported by a CASE tool. The results show the impacts on performance and on quality standards of the two strategies, together with their advantages and vulnerabilities. In both cases, deadlines were met from the moment the specification and the course of the activities were controlled by the two suggested PSP strategies. Key words: CMMI, PSP, CASE, Factory, Improvement of processes.

  7. The Image-Guided Surgery ToolKit IGSTK: an open source C++ software toolkit

    Science.gov (United States)

    Cheng, Peng; Ibanez, Luis; Gobbi, David; Gary, Kevin; Aylward, Stephen; Jomier, Julien; Enquobahrie, Andinet; Zhang, Hui; Kim, Hee-su; Blake, M. Brian; Cleary, Kevin

    2007-03-01

    The Image-Guided Surgery Toolkit (IGSTK) is an open source C++ software library that provides the basic components needed to develop image-guided surgery applications. The focus of the toolkit is on robustness using a state machine architecture. This paper presents an overview of the project based on a recent book which can be downloaded from igstk.org. The paper includes an introduction to open source projects, a discussion of our software development process and the best practices that were developed, and an overview of requirements. The paper also presents the architecture framework and main components. This presentation is followed by a discussion of the state machine model that was incorporated and the associated rationale. The paper concludes with an example application.

  8. STEM_CELL: a software tool for electron microscopy: part 2--analysis of crystalline materials.

    Science.gov (United States)

    Grillo, Vincenzo; Rossi, Francesca

    2013-02-01

    A new graphical software package (STEM_CELL) for the analysis of HRTEM and STEM-HAADF images is introduced here in detail. The advantage of the software, beyond its graphical interface, is that it brings together different analysis algorithms and simulation (described in an associated article) to produce novel analysis methodologies. Several implementations of, and improvements to, state-of-the-art approaches are reported for image analysis, filtering, normalization and background subtraction. In particular, two important methodological results are highlighted: (i) the definition of a procedure for atomic-scale quantitative analysis of HAADF images; (ii) the extension of geometric phase analysis to large regions, up to potentially 1 μm, through the use of undersampled images with aliasing effects.

  9. A System Level Tool for Translating Software to Reconfigurable Hardware Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In this research we will develop a system level tool to translate binary code of a general-purpose processor into Register Transfer Level VHDL code to be mapped onto...

  10. SafetyBarrierManager, a software tool to perform risk analysis using ARAMIS's principles

    DEFF Research Database (Denmark)

    Duijm, Nijs Jan

    The ARAMIS project resulted in a number of methodologies, dealing with, among others: the development of standard fault trees and "bowties"; the identification and classification of safety barriers; and the inclusion of the quality of safety management into the quantified risk assessment. After conclusion of the ARAMIS project, Risø National Laboratory started developing a tool that could implement these methodologies, leading to SafetyBarrierManager. The tool is based on the principles of "safety-barrier diagrams", which are very similar to "bowties", with the possibility of performing quantitative analysis. The tool allows constructing comprehensive fault trees, event trees and safety-barrier diagrams. The tool implements the ARAMIS idea of a set of safety barrier types, to which a number of safety management issues can be linked. By rating the quality of these management issues, the operational probability

  11. Providing a Connection between a Bayesian Inverse Modeling Tool and a Coupled Hydrogeological Processes Modeling Software

    Science.gov (United States)

    Frystacky, H.; Osorio-Murillo, C. A.; Over, M. W.; Kalbacher, T.; Gunnell, D.; Kolditz, O.; Ames, D.; Rubin, Y.

    2013-12-01

    The Method of Anchored Distributions (MAD) is a Bayesian technique for characterizing the uncertainty in geostatistical model parameters. Open-source software has been developed in a modular framework such that this technique can be applied to any forward model software via a driver. This presentation is about the driver that has been developed for OpenGeoSys (OGS), open-source software that can simulate many hydrogeological processes, including coupled processes. MAD allows the use of multiple data types for conditioning the spatially random fields and assessing model parameter likelihood. For example, if simulating flow and mass transport, the inversion target variable could be hydraulic conductivity and the inversion data types could be head, concentration, or both. The driver detects from the OGS files which processes and variables are being used in a given project and allows MAD to prompt the user to choose those that are to be modeled or to be treated deterministically. In this way, any combination of processes allowed by OGS can have MAD applied. As for the software, there are two versions, each with its own OGS driver. A Windows desktop version is available as a graphical user interface and is ideal for the learning and teaching environment. High-throughput computing can even be achieved with this version via HTCondor if large projects are to be pursued in a computer lab. In addition to this desktop application, a Linux version is available, equipped with MPI, such that it can be run in parallel on a computer cluster. All releases can be downloaded from the MAD Codeplex site given below.

  12. Techniques and Tools for Trustworthy Composition of Pre-Designed Embedded Software Components

    Science.gov (United States)

    2012-07-01

    ... that was developed as a part of the Arduino open-source electronics prototyping platform. The Ardupilot system consists of the hardware which is placed ... hand of the user is used to communicate the roll, pitch and throttle information to the UAV. The firmware for the system is written in the Arduino ... [Inertial] Measurement Unit (IMU) sensors it can be used to develop an Unmanned Aerial Vehicle. Software for the Ardupilot can be programmed using the Arduino ...

  13. Establishing a Web-Based DICOM Teaching File Authoring Tool Using Open-Source Public Software

    OpenAIRE

    Lee, Wen-Jeng; Yang, Chung-Yi; Liu, Kao-Lang; Liu, Hon-Man; Ching, Yu-Tai; Chen, Shyh-Jye

    2005-01-01

    Online teaching files are an important source of educational and referential materials in the radiology community. The commonly used Digital Imaging and Communications in Medicine (DICOM) file format of the radiology community is not natively supported by common Web browsers. The ability of the Web server to convert and parse DICOM is important when the DICOM-converting tools are not available. In this paper, we describe our approach to develop a Web-based teaching file authoring tool. Our se...

  14. Instrument-independent software tools for the analysis of MS-MS and LC-MS lipidomics data.

    Science.gov (United States)

    Haimi, Perttu; Chaithanya, Krishna; Kainu, Ville; Hermansson, Martin; Somerharju, Pentti

    2009-01-01

    Mass spectrometry (MS), particularly electrospray MS, is the key tool in modern lipidomics. However, as even a modest-scale experiment produces a great amount of data, data processing often becomes limiting. Notably, the software provided with MS instruments is not well suited for quantitative analysis of lipidomes because of the great variety of species present and the complexities of response calibration. Here we describe the use of two recently introduced software tools, Lipid Mass Spectrum Analysis (LIMSA) and Spectrum Extraction from Chromatographic Data (SECD), which significantly increase the speed and reliability of mass spectrometric analysis of complex lipidomes. LIMSA is a Microsoft Excel add-on that (1) finds and integrates the peaks in an imported spectrum, (2) identifies the peaks, (3) corrects the peak areas for overlap by isotopic peaks of other species and (4) quantifies the identified species using included internal standards. LIMSA is instrument-independent because it processes text-format MS spectra. Typically, the analysis of one spectrum takes only a few seconds. The SECD software allows one to display MS chromatograms as two-dimensional maps, which is useful for visual inspection of the data. More importantly, however, SECD allows one to extract mass spectra from user-defined regions of the map for further analysis with, e.g., LIMSA. The use of select regions rather than simple time-range averaging significantly improves the signal-to-noise ratio, as signals outside the region of interest are more efficiently excluded. LIMSA and SECD have proven to be robust and convenient tools and are available free of charge from the authors.
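
    As a hedged illustration of step (3), isotopic overlap correction can be posed as a small linear system: measured peak areas are modeled as a mixture of the true species abundances plus the isotope contributions of lighter species two mass units away. The contribution matrix below is hypothetical, not the one LIMSA computes:

        import numpy as np

        # Rows/columns: three lipid species 2 Da apart; entry [i, j] is the fraction
        # of species j's isotope envelope landing on peak i (hypothetical values).
        T = np.array([[1.00, 0.00, 0.00],
                      [0.09, 1.00, 0.00],     # M+2 of species 0 overlaps species 1
                      [0.01, 0.09, 1.00]])    # M+4 / M+2 overlaps on species 2
        measured = np.array([120.0, 85.0, 60.0])

        true_areas = np.linalg.solve(T, measured)
        print(true_areas)                      # overlap-corrected peak areas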

  15. Current trends in hardware and software for brain-computer interfaces (BCIs).

    Science.gov (United States)

    Brunner, P; Bianchi, L; Guger, C; Cincotti, F; Schalk, G

    2011-04-01

    A brain-computer interface (BCI) provides a non-muscular communication channel to people with and without disabilities. BCI devices consist of hardware and software. BCI hardware records signals from the brain, either invasively or non-invasively, using a series of device components. BCI software then translates these signals into device output commands and provides feedback. One may categorize different types of BCI applications into the following four categories: basic research, clinical/translational research, consumer products, and emerging applications. These four categories use BCI hardware and software, but have different sets of requirements. For example, while basic research needs to explore a wide range of system configurations, and thus requires a wide range of hardware and software capabilities, applications in the other three categories may be designed for relatively narrow purposes and thus may only need a very limited subset of capabilities. This paper summarizes technical aspects for each of these four categories of BCI applications. The results indicate that BCI technology is in transition from isolated demonstrations to systematic research and commercial development. This process requires several multidisciplinary efforts, including the development of better integrated and more robust BCI hardware and software, the definition of standardized interfaces, and the development of certification, dissemination and reimbursement procedures.

  16. The evaluation of Computed Tomography hard- and software tools for micropaleontologic studies on foraminifera

    Science.gov (United States)

    van Loo, D.; Speijer, R.; Masschaele, B.; Dierick, M.; Cnudde, V.; Boone, M.; de Witte, Y.; Dewanckele, J.; van Hoorebeke, L.; Jacobs, P.

    2009-04-01

    Foraminifera (forams) are single-celled amoeba-like organisms in the sea which build a tiny calcareous multi-chambered shell for protection. Their enormous abundance, great variation of shape through time and presence in all marine deposits made these tiny microfossils the oil companies' best friend by facilitating the detection of new oil wells. Besides their success in the oil and gas industry, forams are also a most powerful tool for reconstructing past climate change. The shell of a foraminifer is a tiny gold mine of information, both geometrical and chemical. Until recently, however, the best information on this architecture was obtained by imaging the outside of a shell with Scanning Electron Microscopy (SEM), giving no clues to internal structures other than single snapshots obtained by breaking a specimen apart. With X-ray computed tomography (CT) it is possible to overcome this problem and uncover a huge amount of geometrical information without destroying the samples. Using the latest generation of micro-CTs, called nano-CTs because of their sub-micron resolution, it is now possible to perform adequate imaging even of these tiny samples without needing large facilities. In this research, a comparison is made between different X-ray sources and X-ray detectors and the resulting image resolution. Sharpness, noise and contrast are all very important parameters that have significant effects on the accuracy of the results and on the speed of data processing. Combining this tomography technique with specific image processing software for segmentation, it is possible to obtain a 3D virtual representation of the entire foram shell. This 3D virtual object can then be used for many purposes, of which automatic measurement of chamber size is one of the most important. The segmentation process is a combination of several algorithms that are often used in CT evaluation; in this work an evaluation of those algorithms is

  17. Corganiser: a web-based software tool for planning time-sensitive sampling of whole rounds during scientific drilling

    DEFF Research Database (Denmark)

    Marshall, Ian

    2014-01-01

    Corganiser is a software tool developed to simplify the process of preparing whole-round sampling plans for time-sensitive microbiology and geochemistry sampling during scientific drilling. It was developed during the Integrated Ocean Drilling Program (IODP) Expedition 347, but is designed to work...... with a wide range of core and section configurations and can thus be used in future drilling projects. Corganiser is written in the Python programming language and is implemented both as a graphical web interface and command-line interface. It can be accessed online at http://130.226.247.137/....

  18. ATLAS software packaging

    Science.gov (United States)

    Rybkin, Grigory

    2012-12-01

    Software packaging is an indispensable part of the build process and a prerequisite for deployment. The full ATLAS software stack consists of TDAQ, HLT, and Offline software. These software groups depend on some 80 external software packages. We present the tools, in the package PackDist, developed and used to package all this software except for the TDAQ project. PackDist is based on and driven by CMT, the ATLAS software configuration and build tool, and consists of shell and Python scripts. The packaging unit used is the CMT project. Each CMT project is packaged as several packages—platform dependent (one per platform available), source code excluding header files, other platform-independent files, documentation, and debug information packages (the last two being built optionally). Packaging can be done recursively to package all the dependencies. The whole set of packages for one software release, the distribution kit, also includes configuration packages and contains some 120 packages for one platform. Also packaged are physics analysis projects (currently 6) used by particular physics groups on top of the full release. The tools provide an installation test for the full distribution kit. Packaging is done in two formats, for use with the Pacman and RPM package managers. The tools are functional on the platforms supported by ATLAS—GNU/Linux and Mac OS X. The packaged software is used for software deployment on all ATLAS computing resources, from the detector and trigger computing farms, collaboration laboratories' computing centres, and grid sites, to physicists' laptops and CERN VMFS, and covers the use cases of running all applications as well as of software development.

  19. Plots, Calculations and Graphics Tools (PCG2). Software Transfer Request Presentation

    Science.gov (United States)

    Richardson, Marilou R.

    2010-01-01

    This slide presentation reviews the development of the Plots, Calculations and Graphics Tools (PCG2) system. PCG2 is an easy to use tool that provides a single user interface to view data in a pictorial, tabular or graphical format. It allows the user to view the same display and data in the Control Room, engineering office area, or remote sites. PCG2 supports extensive and regular engineering needs that are both planned and unplanned and it supports the ability to compare, contrast and perform ad hoc data mining over the entire domain of a program's test data.

  20. A software tool for integrated risk assessment of spent fuel transportation and storage

    Energy Technology Data Exchange (ETDEWEB)

    Yun, Mi Rae; Almomani, Belal; Ham, Jae Hyun; Kang, Hyun Gook [Dept. of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Christian, Robby [Dept. of Mechanical, Aerospace, and Nuclear Engineering, Rensselaer Polytechnic Institute, Troy (Korea, Republic of); Kim, Bo Gyung [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of); Lee, Sang Hoon [Dept. of Mechanical and Automotive Engineering, Keimyung University, Daegu (Korea, Republic of)

    2017-06-15

    When temporary spent fuel storage pools at nuclear power plants reach their capacity limit, the spent fuel must be moved to an alternative storage facility. However, radioactive materials must be handled and stored carefully to avoid severe consequences to the environment. In this study, the risks of three potential accident scenarios (i.e., maritime transportation, an aircraft crashing into an interim storage facility, and on-site transportation) associated with the spent fuel transportation process were analyzed using a probabilistic approach. For each scenario, the probabilities and the consequences were calculated separately to assess the risks: the probabilities were calculated using existing data and statistical models, and the consequences were calculated using computation models. Risk assessment software was developed to conveniently integrate the three scenarios. The risks were analyzed using the developed software according to the shipment route, building characteristics, and spent fuel handling environment. As a result of the risk analysis with varying accident conditions, transportation and storage strategies with relatively low risk were developed for regulators and licensees. The focus of this study was the risk assessment methodology; however, the applied model and input data have some uncertainties. Further research to reduce these uncertainties will improve the accuracy of this model.

  1. A software tool for integrated risk assessment of spent fuel transportation and storage

    Directory of Open Access Journals (Sweden)

    Mirae Yun

    2017-06-01

    Full Text Available When temporary spent fuel storage pools at nuclear power plants reach their capacity limit, the spent fuel must be moved to an alternative storage facility. However, radioactive materials must be handled and stored carefully to avoid severe consequences to the environment. In this study, the risks of three potential accident scenarios (i.e., maritime transportation, an aircraft crashing into an interim storage facility, and on-site transportation) associated with the spent fuel transportation process were analyzed using a probabilistic approach. For each scenario, the probabilities and the consequences were calculated separately to assess the risks: the probabilities were calculated using existing data and statistical models, and the consequences were calculated using computation models. Risk assessment software was developed to conveniently integrate the three scenarios. The risks were analyzed using the developed software according to the shipment route, building characteristics, and spent fuel handling environment. As a result of the risk analysis with varying accident conditions, transportation and storage strategies with relatively low risk were developed for regulators and licensees. The focus of this study was the risk assessment methodology; however, the applied model and input data have some uncertainties. Further research to reduce these uncertainties will improve the accuracy of this model.
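
    The probabilistic approach described in both records reduces, at its core, to combining each scenario's probability with its consequence and summing over scenarios. The Python sketch below shows only that integration step; all numbers are placeholders, not values from the study.

        # Placeholder figures, not values from the study.
        scenarios = {
            # scenario: (occurrence probability, consequence)
            "maritime transportation": (2.0e-6, 150.0),
            "aircraft crash on storage facility": (4.0e-7, 900.0),
            "on-site transportation": (1.0e-5, 20.0),
        }

        def scenario_risk(probability: float, consequence: float) -> float:
            # Point-estimate risk: probability times consequence.
            return probability * consequence

        for name, (p, c) in scenarios.items():
            print(f"{name}: risk = {scenario_risk(p, c):.3e}")

        total = sum(scenario_risk(p, c) for p, c in scenarios.values())
        print(f"integrated risk over all scenarios: {total:.3e}")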

  2. The Development of a Computer Simulation Software as a Tool in Pharmacokinetics Analysis

    Directory of Open Access Journals (Sweden)

    Joshita Djajadisastra

    2006-04-01

    Full Text Available This research uses mammillary models (one compartment and two compartments) as compartment models in which a single drug administration is given via the oral or intravenous route. A system of differential equations can be derived from the pharmacokinetic model. Its solutions, the pharmacokinetic variables and parameters, can be obtained using mathematical and numerical methods such as the Laplace transform, the method of residuals, the superposition principle, and the trapezoidal rule. To replace manual calculation and to visualize a drug's dynamics in graphical form, a computer simulation software package based on Visual Basic has been built. The simulation results show that any given plasma sample data can be checked for whether the drug was administered orally or by injection, and for whether it tends to be compatible with a one-compartment or two-compartment assumption. For urine data, the software is still limited to one compartment; however, it can check whether the corresponding drug was given orally or by injection. The simulations thus show the individual effects of the pharmacokinetic variables and parameters.
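
    For readers unfamiliar with the one-compartment oral model mentioned above, the sketch below evaluates its standard closed-form (Bateman) solution and computes the area under the curve with the trapezoidal rule cited in the abstract. Parameter values are illustrative only, and the code is not derived from the Visual Basic software itself.

        import math

        def plasma_conc(t, dose=500.0, f=0.9, vd=40.0, ka=1.2, ke=0.2):
            """One-compartment oral model (Bateman equation):
            C(t) = f*dose*ka / (vd*(ka - ke)) * (exp(-ke*t) - exp(-ka*t))"""
            coeff = f * dose * ka / (vd * (ka - ke))
            return coeff * (math.exp(-ke * t) - math.exp(-ka * t))

        # AUC by the trapezoidal rule over 0-24 h, sampled every 30 min.
        times = [0.5 * i for i in range(49)]
        concs = [plasma_conc(t) for t in times]
        auc = sum((c0 + c1) / 2.0 * (t1 - t0)
                  for t0, t1, c0, c1 in zip(times, times[1:], concs, concs[1:]))
        print(f"Cmax = {max(concs):.2f}, AUC(0-24) = {auc:.1f}")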

  3. Numerical arc segmentation algorithm for a radio conference: A software tool for communication satellite systems planning

    Science.gov (United States)

    Whyte, W. A.; Heyward, A. O.; Ponchak, D. S.; Spence, R. L.; Zuzek, J. E.

    1988-01-01

    The Numerical Arc Segmentation Algorithm for a Radio Conference (NASARC) provides a method of generating predetermined arc segments for use in the development of an allotment planning procedure to be carried out at the 1988 World Administrative Radio Conference (WARC) on the Use of the Geostationary Satellite Orbit and the Planning of Space Services Utilizing It. Through careful selection of the predetermined arc (PDA) for each administration, flexibility can be increased in terms of choice of system technical characteristics and specific orbit location while reducing the need for coordination among administrations. The NASARC software determines pairwise compatibility between all possible service areas at discrete arc locations. NASARC then exhaustively enumerates groups of administrations whose satellites can be closely located in orbit, and finds the arc segment over which each such compatible group exists. From the set of all possible compatible groupings, groups and their associated arc segments are selected using a heuristic procedure such that a PDA is identified for each administration. Various aspects of the NASARC concept and how the software accomplishes specific features of allotment planning are discussed.
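
    The core data structures of the grouping step, a pairwise compatibility relation and an enumeration of mutually compatible groups, can be shown with a toy example. The sketch below is a drastically simplified, hypothetical analogue of what NASARC does; the real software also tracks the arc segment over which each group exists and applies a heuristic selection.

        from itertools import combinations

        areas = ["A", "B", "C", "D"]
        # Hypothetical pairwise compatibility at discrete arc locations.
        compatible = {("A", "B"), ("A", "C"), ("B", "C"), ("C", "D")}

        def is_compatible(x, y):
            return (x, y) in compatible or (y, x) in compatible

        def mutually_compatible(group):
            return all(is_compatible(x, y) for x, y in combinations(group, 2))

        # Exhaustively enumerate compatible groups, largest first.
        groups = [g for r in range(len(areas), 1, -1)
                  for g in combinations(areas, r) if mutually_compatible(g)]
        print(groups)  # [('A', 'B', 'C'), ('A', 'B'), ('A', 'C'), ('B', 'C'), ('C', 'D')]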

  4. A Tool for Optimizing the Build Performance of Large Software Code Bases

    NARCIS (Netherlands)

    Telea, Alexandru; Voinea, Lucian; Kontogiannis, K; Tjortjis, C; Winter, A

    2008-01-01

    We present Build Analyzer, a tool that helps developers optimize the build performance of huge systems written in C. Due to complex C header dependencies, even small code changes can cause extremely long rebuilds, which are problematic when code is shared and modified by teams of hundreds of individuals.
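
    The build-performance problem the tool targets can be made concrete with a small dependency scan: headers with high transitive fan-in are the ones whose modification forces the longest rebuilds. The sketch below is not Build Analyzer code, only an assumed-simple illustration of the underlying idea (quoted includes resolved relative to the project root).

        import re
        from collections import Counter
        from pathlib import Path

        INCLUDE = re.compile(r'#include\s+"([^"]+)"')

        def direct_includes(path: Path):
            return INCLUDE.findall(path.read_text(errors="ignore"))

        def header_fan_in(root: Path) -> Counter:
            """Count, per header, how many sources (transitively) include it."""
            counts = Counter()
            for src in root.rglob("*.c*"):
                seen, stack = set(), direct_includes(src)
                while stack:
                    hdr = stack.pop()
                    if hdr in seen:
                        continue
                    seen.add(hdr)
                    hdr_path = root / hdr
                    if hdr_path.exists():
                        stack.extend(direct_includes(hdr_path))
                counts.update(seen)
            return counts

        # print(header_fan_in(Path("src")).most_common(10))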

  5. The Viability of a Software Tool to Assist Students in the Review of Literature

    Science.gov (United States)

    Anderson, Timothy R.

    2013-01-01

    Most doctoral students are novice researchers and may not possess the skills to effectively conduct a comprehensive review of the literature and frame a problem designed to conduct original research. Students need proper training and tools necessary to critically evaluate, synthesize and organize literature. The purpose of this concurrent mixed…

  7. MS Data Miner: a web-based software tool to analyze, compare, and share mass spectrometry protein identifications.

    Science.gov (United States)

    Dyrlund, Thomas F; Poulsen, Ebbe T; Scavenius, Carsten; Sanggaard, Kristian W; Enghild, Jan J

    2012-09-01

    Data processing and analysis of proteomics data are challenging and time consuming. In this paper, we present MS Data Miner (MDM) (http://sourceforge.net/p/msdataminer), a freely available web-based software solution aimed at minimizing the time required for the analysis, validation, data comparison, and presentation of data files generated in MS software, including Mascot (Matrix Science), Mascot Distiller (Matrix Science), and ProteinPilot (AB Sciex). The program was developed to significantly decrease the time required to process large proteomic data sets for publication. This open-source system includes a spectra validation system and an automatic screenshot generation tool for Mascot-assigned spectra. In addition, a Gene Ontology term analysis function and a tool for generating comparative Excel data reports are included. We illustrate the benefits of MDM during a proteomics study comprising more than 200 LC-MS/MS analyses recorded on an AB Sciex TripleTOF 5600, identifying more than 3000 unique proteins and 3.5 million peptides. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. A software tool to estimate the dynamic behaviour of the IP2C samples as sensors for didactic purposes

    Science.gov (United States)

    Graziani, S.; Pagano, F.; Pitrone, N.; Umana, E.

    2010-07-01

    Ionic Polymer Polymer Composites (IP2Cs) are emerging materials used to realize motion actuators and sensors. In the former case, a voltage input causes the membrane to bend, while in the latter, bending an IP2C membrane produces a voltage output. In this paper the authors introduce a software tool able to estimate the dynamic behaviour of sensors based on IP2Cs working in air. In the proposed tool, the geometrical quantities that rule the sensing properties of IP2C-based transducers are taken into account together with their dynamic characteristics. A graphical user interface (GUI) has been developed in order to provide a useful tool that allows the user to understand the behaviour and the role of the parameters involved in the transduction phenomena. The tool is based on the idea that a graphical user interface will allow persons not skilled in IP2C materials to observe their behaviour and to analyze their characteristics. This could greatly increase the interest of researchers in this new class of transducers; moreover, it can support the educational activity of students involved in advanced academic courses.
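
    The tool itself is a GUI, but the notion of estimating a sensor's dynamic behaviour can be illustrated by simulating a simple linear model. The sketch below assumes, purely for illustration, that the IP2C voltage response to a step bending input behaves like an underdamped second-order system; the model form and parameter values are not taken from the paper.

        import math

        def step_response(t, gain=1.0, wn=25.0, zeta=0.4):
            """Voltage response of an assumed underdamped second-order model
            to a unit step in bending; wn in rad/s, 0 < zeta < 1."""
            wd = wn * math.sqrt(1.0 - zeta ** 2)
            phi = math.acos(zeta)
            envelope = math.exp(-zeta * wn * t) / math.sqrt(1.0 - zeta ** 2)
            return gain * (1.0 - envelope * math.sin(wd * t + phi))

        # Sample the first half second of the response and find the overshoot peak.
        samples = [(i / 100.0, step_response(i / 100.0)) for i in range(51)]
        print(max(samples, key=lambda s: s[1]))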

  9. Testing the reliability of software tools in sex and ancestry estimation in a multi-ancestral Brazilian sample.

    Science.gov (United States)

    Urbanová, Petra; Ross, Ann H; Jurda, Mikoláš; Nogueira, Maria-Ines

    2014-09-01

    In the framework of forensic anthropology, osteometric techniques are generally preferred over visual examinations due to a higher level of reproducibility and repeatability, qualities that are crucial within a legal context. The use of osteometric methods has been further reinforced by incorporating statistically based algorithms and large reference samples into a variety of user-friendly software applications. However, the continued increase in admixture of human populations has made the use of osteometric methods for estimation of ancestry much more complex, which confounds one of the major requirements of ancestry assessment: intra-population homogeneity. The present paper tests the accuracy of ancestry and sex assessment using four identification software tools, specifically FORDISC 2.0, FORDISC 3.1.293, COLIPR 1.5.2, and 3D-ID 1.0. Software accuracy was tested in a sample of 174 documented human crania of Brazilian origin composed of different ancestral groups (i.e., European Brazilians, Afro-Brazilians, Japanese Brazilians, and individuals of admixed ancestry). The results show that regardless of the software algorithm employed and the composition of the reference database, all methods were able to allocate approximately 50% of the Brazilian specimens to an appropriate major reference group. Of the three ancestral groups, Afro-Brazilians were especially prone to misclassification. Japanese Brazilians, by contrast, were shown to be relatively easily recognizable as being of Asian descent but at the same time showed a strong affinity towards Hispanic crania, particularly when the classification based on the FDB was carried out in FORDISC. For crania of admixed origin, all of the algorithms showed a considerably higher rate of inconsistency, with a tendency for misclassification into Asian and American Hispanic groups. Sex assessments revealed an overall modest to poor reliability (60-71% of correctly classified specimens) using the tested software programs with unbalanced individual

  10. ATAQS: A computational software tool for high throughput transition optimization and validation for selected reaction monitoring mass spectrometry

    Directory of Open Access Journals (Sweden)

    Ramos Hector

    2011-03-01

    Proteomics via SRM is a powerful new technique that enables the reproducible and accurate identification and quantification of sets of proteins of interest. ATAQS is the first open-source software that supports all steps of the targeted proteomics workflow. ATAQS also provides software API (Application Program Interface) documentation that enables the addition of new algorithms to each of the workflow steps. The software, installation guide and sample dataset can be found at http://tools.proteomecenter.org/ATAQS/ATAQS.html

  11. The GLEaMviz computational tool, a publicly available software to explore realistic epidemic spreading scenarios at the global scale

    Directory of Open Access Journals (Sweden)

    Quaggiotto Marco

    2011-02-01

    Full Text Available Abstract Background Computational models play an increasingly important role in the assessment and control of public health crises, as demonstrated during the 2009 H1N1 influenza pandemic. Much research has been done in recent years on the development of sophisticated data-driven models for realistic computer-based simulations of infectious disease spreading. However, only a few computational tools are presently available for assessing scenarios, predicting epidemic evolution, and managing health emergencies that can benefit a broad audience of users including policy makers and health institutions. Results We present "GLEaMviz", a publicly available software system that simulates the spread of emerging human-to-human infectious diseases across the world. The GLEaMviz tool comprises three components: the client application, the proxy middleware, and the simulation engine. The latter two components constitute the GLEaMviz server. The simulation engine leverages the Global Epidemic and Mobility (GLEaM) framework, a stochastic computational scheme that integrates worldwide high-resolution demographic and mobility data to simulate disease spread on the global scale. The GLEaMviz design aims at maximizing flexibility in defining the disease compartmental model and configuring the simulation scenario; it allows the user to set a variety of parameters including compartment-specific features, transition values, and environmental effects. The output is a dynamic map and a corresponding set of charts that quantitatively describe the geo-temporal evolution of the disease. The software is designed as a client-server system. The multi-platform client, which can be installed on the user's local machine, is used to set up simulations that will be executed on the server, thus avoiding specific requirements for large computational capabilities on the user side. Conclusions The user-friendly graphical interface of the GLEaMviz tool, along with its high level
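
    As background for the compartmental models GLEaM builds on, the sketch below integrates a plain deterministic SIR model with forward Euler steps. GLEaM itself is stochastic and couples such dynamics with demographic and mobility data, so this is only a minimal illustration; the rates are arbitrary, not GLEaMviz defaults.

        def sir(beta=0.3, gamma=0.1, s0=0.999, i0=0.001, days=160, dt=1.0):
            """Deterministic SIR with forward Euler steps; returns (t, S, I, R)."""
            s, i, r, t = s0, i0, 0.0, 0.0
            history = [(t, s, i, r)]
            while t < days:
                new_inf = beta * s * i * dt   # transition S -> I
                new_rec = gamma * i * dt      # transition I -> R
                s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
                t += dt
                history.append((t, s, i, r))
            return history

        peak = max(sir(), key=lambda row: row[2])
        print(f"epidemic peak: day {peak[0]:.0f}, infectious fraction {peak[2]:.3f}")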

  12. The pyPHaz software, an interactive tool to analyze and visualize results from probabilistic hazard assessments

    Science.gov (United States)

    Tonini, Roberto; Selva, Jacopo; Costa, Antonio; Sandri, Laura

    2014-05-01

    Probabilistic Hazard Assessment (PHA) is becoming an essential tool for risk mitigation policies, since it allows the hazard due to hazardous phenomena to be quantified and, unlike the deterministic approach, it accounts for both aleatory and epistemic uncertainties. On the other hand, one of the main disadvantages of PHA methods is that their results are not easy to understand and interpret by people who are not specialists in probabilistic tools. For scientists, this raises the issue of providing tools that can be easily used and understood by decision makers (i.e., risk managers or local authorities). The work presented here addresses the problem of simplifying the transfer between scientific knowledge and land protection policies by providing an interface between scientists, who produce PHA results, and decision makers, who use PHA results for risk analyses. In this framework we present pyPHaz, an open tool developed and designed to visualize and analyze PHA results due to one or more phenomena affecting a specific area of interest. The software has been fully developed with the free and open-source Python programming language and several Python-based libraries and modules. The pyPHaz tool allows the Hazard Curves (HC) calculated in a selected target area to be visualized together with different levels of uncertainty (mean and percentiles) on maps that can be interactively created and modified by the user, thanks to a dedicated Graphical User Interface (GUI). Moreover, the tool can be used to compare the results of different PHA models and to merge them by creating ensemble models. The pyPHaz software has been designed to store and access all data through a MySQL database and to read as input the XML-based standard file formats defined in the frame of GEM (Global Earthquake Model). This format model is easy to extend to any other kind of hazard, as will be shown in the applications
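
    The ensemble-model idea mentioned above can be sketched as follows: given hazard curves (exceedance probability at shared intensity levels) from several PHA models, compute the ensemble mean and percentile bands. Equal model weights, the toy curves, and the nearest-rank percentile are all assumptions for illustration; pyPHaz's MySQL storage and GUI are not reproduced here.

        from statistics import mean

        intensities = [0.1, 0.2, 0.4, 0.8]        # shared intensity levels
        model_curves = [                           # one exceedance curve per model
            [0.30, 0.12, 0.040, 0.008],
            [0.25, 0.10, 0.030, 0.005],
            [0.35, 0.15, 0.060, 0.012],
        ]

        def percentile(values, q):
            """Nearest-rank percentile for a small sample (0 < q <= 100)."""
            ordered = sorted(values)
            k = max(0, round(q / 100.0 * len(ordered)) - 1)
            return ordered[k]

        ensemble_mean = [mean(v) for v in zip(*model_curves)]
        ensemble_p16 = [percentile(v, 16) for v in zip(*model_curves)]
        ensemble_p84 = [percentile(v, 84) for v in zip(*model_curves)]
        for row in zip(intensities, ensemble_p16, ensemble_mean, ensemble_p84):
            print("intensity %.1f: p16=%.3f mean=%.3f p84=%.3f" % row)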

  13. SU-E-J-199: A Software Tool for Quality Assurance of Online Replanning with MR-Linac

    Energy Technology Data Exchange (ETDEWEB)

    Chen, G; Ahunbay, E; Li, X [Medical College of Wisconsin, Milwaukee, WI (United States)

    2015-06-15

    Purpose: To develop a quality assurance software tool, ArtQA, capable of automatically checking radiation treatment plan parameters, verifying plan data transfer from the treatment planning system (TPS) to the record-and-verify (R&V) system, performing a secondary MU calculation that considers the effect of the magnetic field from the MR-Linac, and verifying delivery and plan consistency, for online replanning. Methods: ArtQA was developed by creating interfaces to the TPS (e.g., Monaco, Elekta), the R&V system (Mosaiq, Elekta), and a secondary MU calculation system. The tool obtains plan parameters from the TPS via direct file reading, and retrieves plan data, both transferred from the TPS and recorded during the actual delivery, from the R&V system database via open database connectivity and structured query language. By comparing beam/plan datasets in the different systems, ArtQA detects and reports discrepancies between the TPS, the R&V system, the secondary MU calculation system, and the delivery. To account for the effect of the 1.5 T transverse magnetic field of the MR-Linac in the secondary MU calculation, a method based on a modified Clarkson integration algorithm was developed and tested for a series of clinical situations. Results: ArtQA is capable of automatically checking plan integrity and logic consistency, detecting plan data transfer errors, performing secondary MU calculations with or without a transverse magnetic field, and verifying treatment delivery. The tool is efficient and effective for pre- and post-treatment QA checks of all available treatment parameters that may be impractical with the commonly used visual inspection. Conclusion: The software tool ArtQA can be used for quick and automatic pre- and post-treatment QA checks, eliminating the human error associated with visual inspection. While this tool was developed for online replanning on the MR-Linac, where QA must be performed rapidly while the patient lies on the table waiting for treatment, ArtQA can be used as a general QA tool.
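
    The plan-consistency check ArtQA automates can be pictured as a field-by-field comparison with tolerances. The sketch below is a hypothetical miniature of that idea; the parameter names and tolerances are invented, and the real tool reads TPS files and queries the R&V database via ODBC/SQL rather than comparing in-memory dictionaries.

        # Hypothetical tolerances per beam parameter.
        TOLERANCE = {"mu": 0.5, "gantry_deg": 0.1, "collimator_deg": 0.1}

        def compare_beams(tps_beam: dict, rv_beam: dict):
            """Return (parameter, TPS value, R&V value) for each discrepancy."""
            issues = []
            for key, tol in TOLERANCE.items():
                a, b = tps_beam.get(key), rv_beam.get(key)
                if a is None or b is None or abs(a - b) > tol:
                    issues.append((key, a, b))
            return issues

        tps = {"mu": 187.3, "gantry_deg": 180.0, "collimator_deg": 15.0}
        rv = {"mu": 187.3, "gantry_deg": 180.0, "collimator_deg": 14.0}
        print(compare_beams(tps, rv))  # [('collimator_deg', 15.0, 14.0)]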

  14. The GlycanBuilder: a fast, intuitive and flexible software tool for building and displaying glycan structures

    Directory of Open Access Journals (Sweden)

    Dell Anne

    2007-08-01

    Full Text Available Abstract Background Carbohydrates play a critical role in human diseases, and their potential utility as biomarkers for pathological conditions is a major driver for characterization of the glycome. However, the additional complexity of glycans compared to proteins and nucleic acids has slowed the advancement of glycomics in comparison to genomics and proteomics. The branched nature of carbohydrates, the great diversity of their constituents, and the numerous alternative symbolic notations make the input and display of glycans not as straightforward as, for example, the amino-acid sequence of a protein. Every glycoinformatic tool providing a user interface would benefit from a fast, intuitive, appealing mechanism for input and output of glycan structures in a computer-readable format. Results A software tool for building and displaying glycan structures using a chosen symbolic notation is described here. The "GlycanBuilder" uses an automatic rendering algorithm to draw the saccharide symbols and to place them on the drawing board. The information about the symbolic notation is derived from a configurable graphical model as a set of rules governing the aspect and placement of residues and linkages. The algorithm is able to represent a structure using only a few traversals of the tree and is inherently fast. The tool uses an XML format for import and export of encoded structures. Conclusion The rendering algorithm described here is able to produce high-quality representations of glycan structures in a chosen symbolic notation. The automated rendering process enables the "GlycanBuilder" to be used both as a user-independent component for displaying glycans and as an easy-to-use drawing tool. The "GlycanBuilder" can be integrated in web pages as a Java applet for the visual editing of glycans. The same component is available as a web service to render an encoded structure into a graphical format. Finally, the "GlycanBuilder" can be integrated into other
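
    A single-pass tree layout of the kind the rendering algorithm performs can be sketched in a few lines: walk the branched structure once, give each leaf residue a successive row, and centre each parent on its children. The residue names and layout rule below are simplified assumptions, not GlycanBuilder's actual graphical model.

        from dataclasses import dataclass, field

        @dataclass
        class Residue:
            name: str
            children: list = field(default_factory=list)

        def layout(root):
            """Assign (depth, row) drawing coordinates in a single traversal."""
            positions, next_row = {}, [0]
            def visit(node, depth):
                if not node.children:             # leaves get successive rows
                    y = next_row[0]
                    next_row[0] += 1
                else:                             # parents centred on children
                    y = sum(visit(c, depth + 1) for c in node.children) / len(node.children)
                positions[node.name] = (depth, y)
                return y
            visit(root, 0)
            return positions

        core = Residue("GlcNAc", [Residue("Man", [Residue("Man-a3"), Residue("Man-a6")])])
        print(layout(core))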

  15. On a Formal Tool for Reasoning About Flight Software Cost Analysis

    Science.gov (United States)

    Spagnuolo, John N., Jr.; Stukes, Sherry A.

    2013-01-01

    A report focuses on the development of flight software (FSW) cost estimates for 16 Discovery-class missions at JPL. The techniques and procedures developed enabled streamlining of the FSW analysis process, and provided instantaneous confirmation that the data and processes used for these estimates were consistent across all missions. The research provides direction as to how to build a prototype rule-based system for FSW cost estimation that would provide (1) FSW cost estimates, (2) explanation of how the estimates were arrived at, (3) mapping of costs, (4) mathematical trend charts with explanations of why the trends are what they are, (5) tables with ancillary FSW data of interest to analysts, (6) a facility for expert modification/enhancement of the rules, and (7) a basis for conceptually convenient expansion into more complex, useful, and general rule-based systems.

  16. A Systematic Mapping Study of Tools for Distributed Software Development Teams

    DEFF Research Database (Denmark)

    Tell, Paolo; Ali Babar, Muhammad

    Context: A wide variety of technologies have been developed to support Global Software Development (GSD). However, the information about the dozens of available solutions is quite diverse and scattered, making it difficult to form an overview that identifies common trends and unveils research gaps. Objective: The objective of this research is to systematically identify and classify a comprehensive list of the technologies that have been developed and/or used for supporting GSD teams. Method: This study has been undertaken as a Systematic Mapping Study (SMS). Our searches identified 1958 papers, out of which 182 were found suitable for inclusion in this study. Results: We have identified 412 technologies reported to be developed and/or used for supporting GSD teams from 182 papers. The identified technologies have been analyzed in order to categorize them using four main classification...

  17. Use of slide presentation software as a tool to measure hip arthroplasty wear.

    Science.gov (United States)

    Yun, Ho Hyun; Jajodia, Nirmal K; Myung, Jae Sung; Oh, Jong Keon; Park, Sang Won; Shon, Won Yong

    2009-12-01

    The authors propose a manual measurement method for wear in total hip arthroplasty (PowerPoint method) based on the well-known Microsoft PowerPoint software (Microsoft Corporation, Redmond, Wash). In addition, the accuracy and reproducibility of the devised method were quantified and compared with two methods previously described by Livermore and Dorr, and accuracies were determined at different degrees of wear. The 57 hips recruited were allocated to: class 1 (retrieval series), class 2 (clinical series), and class 3 (a repeat film analysis series). The PowerPoint method was found to have good reproducibility and to better detect wear differences between classes. The devised method can be easily used for recording wear at follow-up visits and could be used as a supplementary method when computerized methods cannot be employed.

  18. SIGSAC Software: A tool for the Management of Chronic Disease and Telecare.

    Science.gov (United States)

    Bustamante, Claudia; Alcayaga, Claudia; Lange, Ilta; Meza, Iñigo

    2012-01-01

    Chronic disease management is highly complex because multiple interventions are required to improve clinical outcomes. From the patient's perspective, the main problems are dealing with self-management without support and feeling isolated between clinical visits. One strategy for providing continuous self-management support is the use of communication technologies, such as the telephone. However, to be efficient and effective, an information system is required for telecare planning and follow-up. The use of electronic clinical records facilitates the implementation of telecare, but those systems often do not allow usual care (visits to the health clinic) to be combined with telecare. This paper presents the experience of developing an application called SIGSAC (Software de Información, Gestión y Seguimiento para el Autocuidado Crónico) for chronic disease management and telecare follow-up.

  19. Combining On-Line Characterization Tools with Modern Software Environments for Optimal Operation of Polymerization Processes

    Directory of Open Access Journals (Sweden)

    Navid Ghadipasha

    2016-02-01

    Full Text Available This paper discusses the initial steps towards the formulation and implementation of a generic and flexible model-centric framework for integrated simulation, estimation, optimization, and feedback control of polymerization processes. For the first time it combines the powerful capabilities of the automatic continuous on-line monitoring of polymerization (ACOMP) system with a modern simulation, estimation, and optimization software environment towards an integrated scheme for the optimal operation of polymeric processes. An initial validation of the framework was performed for modelling and optimization using literature data, illustrating the flexibility of the method under different systems and conditions. Subsequently, the off-line capabilities of the system were fully tested experimentally for model validation, parameter estimation, and process optimization using ACOMP data. Experimental results are provided for the free radical solution polymerization of methyl methacrylate.

  20. Software engineering

    CERN Document Server

    Sommerville, Ian

    2010-01-01

    The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever.