WorldWideScience

Sample records for sophisticated software tool

  1. xSyn: A Software Tool for Identifying Sophisticated 3-Way Interactions From Cancer Expression Data

    Directory of Open Access Journals (Sweden)

    Baishali Bandyopadhyay

    2017-08-01

    Full Text Available Background: Constructing gene co-expression networks from cancer expression data is important for investigating the genetic mechanisms underlying cancer. However, correlation coefficients or linear regression models are not able to model sophisticated relationships among gene expression profiles. Here, we address the 3-way interaction in which 2 genes’ expression levels cluster in different spatial locations under the control of a third gene’s expression levels. Results: We present xSyn, a software tool for identifying such 3-way interactions from cancer gene expression data based on an optimization procedure involving the usage of UPGMA (Unweighted Pair Group Method with Arithmetic Mean) and synergy. The effectiveness is demonstrated by application to 2 real gene expression data sets. Conclusions: xSyn is a useful tool for decoding the complex relationships among gene expression profiles. xSyn is available at http://www.bdxconsult.com/xSyn.html.
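
    As a hedged illustration of the clustering step named above (a sketch, not xSyn's actual implementation), the following Python fragment applies UPGMA, i.e. average-linkage hierarchical clustering, to the 2-D points formed by two genes' expression levels across samples; all expression values are invented.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        # Hypothetical expression levels of two genes across 8 samples.
        gene_a = np.array([0.20, 0.30, 0.25, 0.90, 1.10, 1.00, 0.95, 0.28])
        gene_b = np.array([1.00, 1.10, 0.95, 0.10, 0.20, 0.15, 0.25, 1.05])
        points = np.column_stack([gene_a, gene_b])  # one 2-D point per sample

        # UPGMA is hierarchical clustering with average linkage.
        tree = linkage(points, method="average", metric="euclidean")
        labels = fcluster(tree, t=2, criterion="maxclust")  # cut into 2 clusters
        print(labels)  # two spatial groups of samples, e.g. [1 1 1 2 2 2 2 1]

    xSyn's synergy-based optimization then asks whether membership in such spatial clusters is governed by a third gene's expression levels; that scoring step is not reproduced here.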

  2. Automatically Assessing Lexical Sophistication: Indices, Tools, Findings, and Application

    Science.gov (United States)

    Kyle, Kristopher; Crossley, Scott A.

    2015-01-01

    This study explores the construct of lexical sophistication and its applications for measuring second language lexical and speaking proficiency. In doing so, the study introduces the Tool for the Automatic Analysis of LExical Sophistication (TAALES), which calculates text scores for 135 classic and newly developed lexical indices related to word…

  3. The Systems Biology Research Tool: evolvable open-source software

    Directory of Open Access Journals (Sweden)

    Wright Jeremiah

    2008-06-01

    Full Text Available Abstract Background: Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system level. Results: We introduce a free, easy-to-use, open-source, integrated software platform called the Systems Biology Research Tool (SBRT) to facilitate the computational aspects of systems biology. The SBRT currently performs 35 methods for analyzing stoichiometric networks and 16 methods from fields such as graph theory, geometry, algebra, and combinatorics. New computational techniques can be added to the SBRT via process plug-ins, providing a high degree of evolvability and a unifying framework for software development in systems biology. Conclusion: The Systems Biology Research Tool represents a technological advance for systems biology. This software can be used to make sophisticated computational techniques accessible to everyone (including those with no programming ability), to facilitate cooperation among researchers, and to expedite progress in the field of systems biology.
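
    As a small, hedged example of the kind of stoichiometric-network computation such a platform automates (this is not the SBRT's own API), the Python sketch below finds a basis of steady-state flux vectors, i.e. the null space of a stoichiometric matrix, for a toy three-reaction pathway.

        import numpy as np
        from scipy.linalg import null_space

        # Toy pathway with metabolites A, B (rows) and reactions
        # R1: -> A, R2: A -> B, R3: B -> (columns).
        S = np.array([
            [1, -1,  0],   # A
            [0,  1, -1],   # B
        ])

        # Steady state requires S @ v = 0; null_space returns an orthonormal basis.
        V = null_space(S)
        print(V / V[0])  # normalized: all three fluxes must be equal at steady state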

  4. RSYST: From nuclear reactor calculations towards a highly sophisticated scientific software integration environment

    International Nuclear Information System (INIS)

    Noack, M.; Seybold, J.; Ruehle, R.

    1996-01-01

    The software environment RSYST was originally used to solve problems of reactor physics. The consideration of advanced scientific simulation requirements and the strict application of modern software design principles led to a system that is well suited to solving problems in a variety of complex scientific domains. Starting with a review of the early days of RSYST, we describe its evolution, driven by the need for a software environment that combines the advantages of a high-performance database system with the capability to integrate sophisticated scientific and technical applications. The RSYST architecture is presented and the data modelling capabilities are described. To demonstrate the powerful possibilities and flexibility of the RSYST environment, we describe a wide range of RSYST applications, e.g., mechanical simulations of multibody systems, which are used in biomechanical research, civil engineering and robotics. In addition, a hypermedia system which is used for scientific and technical training and documentation is presented. (orig.)

  5. Machine Tool Software

    Science.gov (United States)

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using APT (Automatically Programmed Tool) software since 1969 in his CAD/CAM (Computer Aided Design and Manufacturing) curriculum. Professor Hack teaches the use of APT programming languages for the control of metal cutting machines. Machine tool instructions are geometry definitions written in the APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  6. Tools for Embedded Computing Systems Software

    Science.gov (United States)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis and the key figures of each workshop presentation, together with the chairmen's summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  7. Software architecture analysis tool : software architecture metrics collection

    NARCIS (Netherlands)

    Muskens, J.; Chaudron, M.R.V.; Westgeest, R.

    2002-01-01

    The Software Engineering discipline lacks the ability to evaluate software architectures. Here we describe a tool for software architecture analysis that is based on metrics. Metrics can be used to detect possible problems and bottlenecks in software architectures. Even though metrics do not give a

  8. Improvement of Computer Software Quality through Software Automated Tools.

    Science.gov (United States)

    1986-08-30

    Only fragmentary scanned-report excerpts survive for this record. The legible portions describe an Automated Software Tool Monitoring System and its accompanying program documentation, noting that the tool's output features provide links from the tool to both the human user and the target machine (where applicable), and that they describe the types of information returned from the tools to the human user and the forms in which these outputs are presented.

  9. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, and others. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed, and the track record has not been good. Software development projects often miss schedules, run over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of software development has increased only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies, supported by software tools that address the problems of software development, are now emerging. This paper addresses some of the current software engineering methodologies as a backdrop for a general evaluation of computer-assisted software engineering (CASE) tools, based on actual installation of and experimentation with some specific tools.

  10. Software tools to aid Pascal and Ada program design

    Energy Technology Data Exchange (ETDEWEB)

    Jankowitz, H.T.

    1987-01-01

    This thesis describes a software tool which analyses the style and structure of Pascal and Ada programs by ensuring that some minimum design requirements are fulfilled. The tool is used in much the same way as a compiler is used to teach students the syntax of a language, only in this case issues related to the design and structure of the program are of paramount importance. The tool operates by analyzing the design and structure of a syntactically correct program, automatically generating a report detailing changes that need to be made in order to ensure that the program is structurally sound. The author discusses how the model gradually evolved from a plagiarism detection system, which extracted several measurable characteristics of a program, to a model that analyzed the style of Pascal programs. In order to incorporate more sophisticated concepts like data abstraction, information hiding and data protection, this model was then extended to analyze the composition of Ada programs. The Ada model takes full advantage of facilities offered in the language, and by using this tool the standard and quality of written programs are raised whilst the fundamental principles of program design are grasped through a process of self-tuition.

  11. Benchmarking therapeutic drug monitoring software: a review of available computer tools.

    Science.gov (United States)

    Fuchs, Aline; Csajka, Chantal; Thoma, Yann; Buclin, Thierry; Widmer, Nicolas

    2013-01-01

    Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculations currently represent the gold-standard TDM approach but require computation assistance. In recent decades computer programs have been developed to assist clinicians in this task. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. The number of drugs handled by the software varies widely (from 2 to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentration (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user friendly. Programs vary in complexity and might not fit all healthcare
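
    To make the a posteriori step concrete, here is a hedged Python sketch of a maximum a posteriori (MAP) Bayesian adjustment under a deliberately simple one-compartment intravenous-bolus model; the drug, the population values and the error model are all hypothetical, whereas real TDM programs rely on validated population pharmacokinetic models.

        import numpy as np
        from scipy.optimize import minimize_scalar

        dose, vol = 500.0, 40.0        # dose (mg) and hypothetical volume (L)
        cl_pop, omega = 5.0, 0.3       # population clearance (L/h), log-normal SD
        sigma = 0.5                    # residual error SD (mg/L)
        t_obs = np.array([2.0, 8.0])   # sampling times (h)
        c_obs = np.array([9.5, 4.8])   # measured concentrations (mg/L)

        def conc(cl, t):
            # One-compartment IV bolus: C(t) = dose/V * exp(-CL/V * t)
            return dose / vol * np.exp(-cl / vol * t)

        def neg_log_posterior(log_cl):
            cl = np.exp(log_cl)
            fit = np.sum((c_obs - conc(cl, t_obs)) ** 2) / (2 * sigma ** 2)
            prior = (log_cl - np.log(cl_pop)) ** 2 / (2 * omega ** 2)
            return fit + prior

        res = minimize_scalar(neg_log_posterior, method="bounded",
                              bounds=(np.log(0.5), np.log(50.0)))
        print(f"MAP clearance: {np.exp(res.x):.2f} L/h")  # basis for the next dose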

  12. Software Tools for Battery Design | Transportation Research | NREL

    Science.gov (United States)

    Under the Computer-Aided Engineering for Electric Drive Vehicle Batteries (CAEBAT) project, NREL has developed software tools to help with battery design. Knowledge of the interplay of multi-physics at varied scales is imperative when using the CAEBAT software tools.

  13. Software tool for portal dosimetry research.

    Science.gov (United States)

    Vial, P; Hunt, P; Greer, P B; Oliver, L; Baldock, C

    2008-09-01

    This paper describes a software tool developed for research into the use of an electronic portal imaging device (EPID) to verify dose for intensity modulated radiation therapy (IMRT) beams. A portal dose image prediction (PDIP) model that predicts the EPID response to IMRT beams has been implemented into a commercially available treatment planning system (TPS). The software tool described in this work was developed to modify the TPS PDIP model by incorporating correction factors into the predicted EPID image to account for the difference in EPID response to open-beam radiation and multileaf collimator (MLC) transmitted radiation. The processes performed by the software tool include: (i) reading the MLC file and the PDIP from the TPS; (ii) calculating the fraction of beam-on time that each point in the IMRT beam is shielded by MLC leaves; (iii) interpolating correction factors from look-up tables; (iv) creating a corrected PDIP image as the product of the original PDIP and the correction factors and writing the corrected image to file; and (v) displaying, analysing, and exporting various image datasets. The software tool was developed using the Microsoft Visual Studio .NET framework with the C# compiler. The operation of the software tool was validated. This software provided useful tools for EPID dosimetry research, and it is being utilised and further developed in ongoing EPID dosimetry and IMRT dosimetry projects.
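
    A hedged Python sketch of steps (ii) to (iv) follows; the image shapes, the look-up table and the correction values are invented for illustration and are not taken from the paper or from any TPS.

        import numpy as np

        pdip = np.random.rand(64, 64) * 100.0  # stand-in for the TPS-predicted image
        # Step ii (assumed precomputed here): fraction of beam-on time each
        # pixel is shielded by MLC leaves, in [0, 1].
        shielded_fraction = np.random.rand(64, 64)

        # Hypothetical look-up table: correction factor vs. shielded fraction.
        lut_fraction = np.array([0.00, 0.25, 0.50, 0.75, 1.00])
        lut_factor = np.array([1.00, 0.98, 0.95, 0.91, 0.86])

        # Step iii: interpolate a per-pixel correction factor from the table.
        correction = np.interp(shielded_fraction, lut_fraction, lut_factor)

        # Step iv: corrected image = original prediction x correction factors.
        pdip_corrected = pdip * correction
        np.save("pdip_corrected.npy", pdip_corrected)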

  14. Towards E-CASE Tools for Software Engineering

    OpenAIRE

    Nabil Arman

    2013-01-01

    CASE tools are having an important role in all phases of software systems development and engineering. This is evident in the huge benefits obtained from using these tools including their cost-effectiveness, rapid software application development, and improving the possibility of software reuse to name just a few. In this paper, the idea of moving towards E-CASE tools, rather than traditional CASE tools, is advocated since these E-CASE tools have all the benefits and advantages of traditional...

  15. Software project management tools in global software development: a systematic mapping study.

    Science.gov (United States)

    Chadli, Saad Yasser; Idri, Ali; Ros, Joaquín Nicolás; Fernández-Alemán, José Luis; de Gea, Juan M Carrillo; Toval, Ambrosio

    2016-01-01

    Global software development (GSD), a growing trend in the software industry, is characterized by a highly distributed environment. Performing software project management (SPM) in such conditions implies the need to overcome new limitations resulting from cultural, temporal and geographic separation. The aim of this research is to discover and classify the various tools mentioned in the literature that provide GSD project managers with support, and to identify in what way they support group interaction. A systematic mapping study was performed by means of automatic searches in five sources. We then synthesized the extracted data and present the results of this study. A total of 102 tools were identified as being used in SPM activities in GSD. We classified these tools according to the software life cycle process on which they focus and how they support the 3C collaboration model (communication, coordination and cooperation). The majority of the tools found are standalone tools (77%). A small number of platforms (8%) also offer a set of interacting tools that cover the software development lifecycle. Results also indicate that SPM areas in GSD are not adequately supported by corresponding tools and deserve more attention from tool builders.

  16. Software Tools for Software Maintenance

    Science.gov (United States)

    1988-10-01

    Only heavily garbled scanned-report fragments survive for this record: an AIRMICS report on software tools for software maintenance (ASQBG-1-89-001), October 1988. The legible fragments list maintenance tools and the languages they target, including a program analyser, C-Tractr (C), a COBOL structuring facility (VS COBOL II), F-Scan (Fortran), and a static code analyzer (Fortran).

  17. Sophisticated Players and Sophisticated Agents

    NARCIS (Netherlands)

    Rustichini, A.

    1998-01-01

    A sophisticated player is an individual who takes the actions of the opponents in a strategic situation as determined by the decisions of rational opponents, and acts accordingly. A sophisticated agent is rational in the choice of his action, but ignores the fact that he is part of a strategic situation.

  18. Techniques and tools for software qualification in KNICS

    International Nuclear Information System (INIS)

    Cha, Kyung H.; Lee, Yeong J.; Cheon, Se W.; Kim, Jang Y.; Lee, Jang S.; Kwon, Kee C.

    2004-01-01

    This paper describes techniques and tools for qualifying safety software in the Korea Nuclear Instrumentation and Control System (KNICS). Safety software is developed and applied for a Reactor Protection System (RPS), an Engineered Safety Features and Component Control System (ESF-CCS), and a safety Programmable Logic Controller (PLC) in the KNICS. Requirements and design specifications of safety software are written in both natural language and formal specification languages. Statechart is used for the formal specification of the software of the ESF-CCS and the safety PLC, while NuSCR is used for the formal specification of the RPS software. pSET (POSCON Software Engineering Tool) has been developed and utilized as a software development tool for IEC 61131-3 based PLC programming. The qualification of the safety software consists of software verification and validation (V and V) through the software life cycle, software safety analysis, software configuration management, software quality assurance, and COTS (Commercial-Off-The-Shelf) dedication. The criteria and requirements for qualifying the safety software have been established in accordance with Software Review Plan (SRP)/Branch Technical Positions (BTP)-14, IEEE Std. 7-4.3.2-1998, NUREG/CR-6463, IEEE Std. 1012-1998, and so on. Figure 1 summarizes the qualification techniques and tools for the safety software

  19. A Software Tool for Legal Drafting

    Directory of Open Access Journals (Sweden)

    Daniel Gorín

    2011-09-01

    Full Text Available Although many attempts at automated aids for legal drafting have been made, they were based on the construction of a new tool, completely from scratch. This is at least curious, considering that a strong parallelism can be established between a normative document and a software specification: both describe what an entity should or should not do, can or cannot do. In this article we compare normative documents and software specifications to find out their similarities and differences. The comparison shows that there are distinctive particularities, but they are restricted to a very specific subclass of normative propositions. The rest, we postulate, can be dealt with by software tools. For such an enterprise the FormaLex tool set was devised: an LTL-based language and companion tools that use model checking to find normative incoherences in regulations, contracts and other legal documents. A feature-rich case study is analyzed with the presented tools.

  20. Workshop on Software Development Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Vetter, Jeffrey [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Georgia Inst. of Technology, Atlanta, GA (United States)

    2007-08-01

    Petascale computing systems will soon be available to the DOE science community. Recent studies in the productivity of HPC platforms point to better software environments as a key enabler to science on these systems. To prepare for the deployment and productive use of these petascale platforms, the DOE science and general HPC community must have the software development tools, such as performance analyzers and debuggers that meet application requirements for scalability, functionality, reliability, and ease of use. In this report, we identify and prioritize the research opportunities in the area of software development tools for high performance computing. To facilitate this effort, DOE hosted a group of 55 leading international experts in this area at the Software Development Tools for PetaScale Computing (SDTPC) Workshop, which was held in Washington, D.C. on August 1 and 2, 2007. Software development tools serve as an important interface between the application teams and the target HPC architectures. Broadly speaking, these roles can be decomposed into three categories: performance tools, correctness tools, and development environments. Accordingly, this SDTPC report has four technical thrusts: performance tools, correctness tools, development environment infrastructures, and scalable tool infrastructures. The last thrust primarily targets tool developers per se, rather than end users. Finally, this report identifies non-technical strategic challenges that impact most tool development. The organizing committee emphasizes that many critical areas are outside the scope of this charter; these important areas include system software, compilers, and I/O.

  1. Software tool for physics chart checks.

    Science.gov (United States)

    Li, H Harold; Wu, Yu; Yang, Deshan; Mutic, Sasa

    2014-01-01

    Physics chart check has long been a central quality assurance (QA) measure in radiation oncology. The purpose of this work is to describe a software tool that aims to accomplish simplification, standardization, automation, and forced functions in the process. Nationally recognized guidelines, including American College of Radiology and American Society for Radiation Oncology guidelines and technical standards, and the American Association of Physicists in Medicine Task Group reports were identified, studied, and summarized. Meanwhile, the reported events related to the physics chart check service were analyzed using an event reporting and learning system. A number of shortfalls in the chart check process were identified. To address these problems, a software tool was designed and developed under Microsoft .NET in C# to hardwire as many components as possible at each stage of the process. The software consists of the following 4 independent modules: (1) chart check management; (2) pretreatment and during-treatment chart check assistant; (3) posttreatment chart check assistant; and (4) quarterly peer-review management. The users were a large group of physicists in the author's radiation oncology clinic. During more than a year of use, the tool has proven very helpful in chart check management, communication, documentation, and maintaining consistency. The software tool presented in this work aims to assist physicists at each stage of the physics chart check process. The software tool is potentially useful for any radiation oncology clinic that is either in the process of pursuing or maintaining American College of Radiology accreditation.

  2. New software tool for dynamic radiological characterisation and monitoring in nuclear sites

    International Nuclear Information System (INIS)

    Szoeke, Istvan; Louka, Michael N.; Mark, Niels K.; Bryntesen, Tom R.; Bratteli, Joachim; Edvardsen, Svein T.; Gustavsen, Morten A.; Toppe, Aleksander L.; Johnsen, Terje; Rindahl, Grete

    2012-01-01

    The Halden Reactor Project (HRP) is a jointly sponsored international cooperation under the aegis of the Organisation for Economic Co-operation and Development - Nuclear Energy Agency. Extensive and valuable guidance and tools connected to the safe and reliable operation of nuclear facilities have been elaborated over the years within the frame of this programme. The HRP has achieved particularly strong results in virtual-reality based tools for real-time area and personal monitoring. The techniques developed earlier are now being supplemented to enhance the planning and monitoring capabilities and to support general radiological characterisation of nuclear sites and facilities. Due to the complexity and abundance of the input information required, software tools dedicated to the radiological characterization of contaminated materials, buildings, land and groundwater are applied to review, evaluate and visualize the data. Characterisation of the radiation situation in a realistic environment can be very complex, and efficient visualisation of the data to the user is not straightforward. The monitoring and planning tools elaborated in the frame of the HRP feature very sophisticated three-dimensional (3D) high-definition visualisation and user interfaces to promote easy interpretation of the input data. The visualisation tools permit dynamic visualisation of radiation fields in virtual or augmented reality by various techniques, and real-time personal monitoring of humanoid models. In addition, new techniques are being elaborated to visualise the 3D distribution of activities in structures and materials. The dosimetric algorithms feeding information to the visualisation and user interface of these planning tools include deterministic radiation transport techniques suitable for fast photon dose estimates, in case the physical, radiometric and spectrometric characteristics of the gamma sources are known. The basic deterministic model, implemented in earlier
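
    As a hedged and greatly simplified illustration of a deterministic fast photon dose estimate of the kind such planning tools embed (this is not the HRP's actual dosimetric algorithm), the Python sketch below applies a point-kernel model: inverse-square geometry with exponential attenuation through a shield. The numerical constants are rough, for demonstration only.

        import math

        def dose_rate_uSv_per_h(activity_MBq, gamma_const, distance_m,
                                mu_per_cm=0.0, shield_cm=0.0):
            """Point-kernel estimate: D = A * Gamma / r**2 * exp(-mu * t)."""
            geometry = activity_MBq * gamma_const / distance_m ** 2
            return geometry * math.exp(-mu_per_cm * shield_cm)

        # Rough Co-60-like values: ~0.35 uSv/h per MBq at 1 m; lead attenuation
        # coefficient ~0.6 per cm at ~1.25 MeV.
        print(dose_rate_uSv_per_h(100.0, 0.35, 2.0))            # unshielded, 2 m
        print(dose_rate_uSv_per_h(100.0, 0.35, 2.0, 0.6, 5.0))  # behind 5 cm lead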

  3. Tools & training for more secure software

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    As if by force of nature, software today is shipped out as “beta”, coming with vulnerabilities and weaknesses that should already have been fixed at the programming stage. This presentation will show the consequences of suboptimal software, why good programming, thorough software design, and a proper software development process are imperative for the overall security of the Organization, and how a few simple tools and some training are supposed to make CERN software more secure.

  4. Software Development Methods and Tools: a New Zealand study

    Directory of Open Access Journals (Sweden)

    Chris Phillips

    2005-05-01

    Full Text Available This study is a more detailed follow-up to a preliminary investigation of the practices of software engineers in New Zealand. The focus of this study is on the methods and tools used by software developers in their current organisation. The project involved detailed questionnaires being piloted and sent out to several hundred software developers. A central part of the research involved the identification of factors affecting the use and take-up of existing software development tools in the workplace. The full spectrum of tools, from fully integrated I-CASE tools to individual software applications such as drawing tools, was investigated. This paper describes the project and presents the findings.

  5. Criteria and tools for scientific software quality measurements

    Energy Technology Data Exchange (ETDEWEB)

    Tseng, M Y [Previse Inc., Willowdale ON (Canada)

    1995-12-01

    Not all software used in the nuclear industry needs the rigorous formal verification, reliability testing and quality assessment that are being applied to safety-critical software. Recently, however, there is increasing recognition that systematic and objective quality assessment of the scientific software used in design and safety analyses of nuclear facilities is necessary to support safety and licensing decisions. Because of the complexity and large size of these programs and the resource constraints faced by the AECB reviewer, it is desirable that appropriate automated tools are used wherever practical. To objectively assess the quality of software, a set of attributes of a software product by which its quality is described and evaluated must be established. These attributes must be relevant to the application domain of the software under evaluation. To effectively assess the quality of software, metrics defining a quantitative scale and a method appropriate for determining the value of each attribute need to be applied. To perform the evaluation cost-effectively, the use of suitable automated tools is desirable. In this project, criteria for evaluating the quality of scientific software are presented; metrics by which those criteria can be evaluated are identified; a survey of automated tools to measure those metrics was conducted and the most appropriate tool (QA Fortran) was acquired; and the tool usage was demonstrated on three sample programs. (author) 5 refs.
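
    As an illustration of the kind of metric such automated tools compute (QA Fortran's own metrics are not reproduced here), this hedged Python sketch approximates the McCabe cyclomatic complexity of Python functions by counting branch points in the syntax tree.

        import ast

        BRANCH_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.BoolOp)

        def cyclomatic_complexity(source):
            """Approximate McCabe complexity: 1 + number of branch points."""
            scores = {}
            for node in ast.walk(ast.parse(source)):
                if isinstance(node, ast.FunctionDef):
                    branches = sum(isinstance(n, BRANCH_NODES)
                                   for n in ast.walk(node))
                    scores[node.name] = 1 + branches
            return scores

        print(cyclomatic_complexity(
            "def f(x):\n"
            "    if x > 0:\n"
            "        return x\n"
            "    for i in range(3):\n"
            "        x += i\n"
            "    return x\n"))  # {'f': 3}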

  6. Criteria and tools for scientific software quality measurements

    International Nuclear Information System (INIS)

    Tseng, M.Y.

    1995-12-01

    Not all software used in the nuclear industry needs the rigorous formal verification, reliability testing and quality assessment that are being applied to safety-critical software. Recently, however, there is increasing recognition that systematic and objective quality assessment of the scientific software used in design and safety analyses of nuclear facilities is necessary to support safety and licensing decisions. Because of the complexity and large size of these programs and the resource constraints faced by the AECB reviewer, it is desirable that appropriate automated tools are used wherever practical. To objectively assess the quality of software, a set of attributes of a software product by which its quality is described and evaluated must be established. These attributes must be relevant to the application domain of the software under evaluation. To effectively assess the quality of software, metrics defining a quantitative scale and a method appropriate for determining the value of each attribute need to be applied. To perform the evaluation cost-effectively, the use of suitable automated tools is desirable. In this project, criteria for evaluating the quality of scientific software are presented; metrics by which those criteria can be evaluated are identified; a survey of automated tools to measure those metrics was conducted and the most appropriate tool (QA Fortran) was acquired; and the tool usage was demonstrated on three sample programs. (author) 5 refs

  7. Applying CASE Tools for On-Board Software Development

    Science.gov (United States)

    Brammer, U.; Hönle, A.

    For many space projects the software development is facing great pressure with respect to quality, costs and schedule. One way to cope with these challenges is the application of CASE tools for automatic generation of code and documentation. This paper describes two CASE tools: Rhapsody (I-Logix) featuring UML and ISG (BSSE) that provides modeling of finite state machines. Both tools have been used at Kayser-Threde in different space projects for the development of on-board software. The tools are discussed with regard to the full software development cycle.

  8. The tool for the automatic analysis of lexical sophistication (TAALES): version 2.0.

    Science.gov (United States)

    Kyle, Kristopher; Crossley, Scott; Berger, Cynthia

    2017-07-11

    This study introduces the second release of the Tool for the Automatic Analysis of Lexical Sophistication (TAALES 2.0), a freely available and easy-to-use text analysis tool. TAALES 2.0 is housed on a user's hard drive (allowing for secure data processing) and is available on most operating systems (Windows, Mac, and Linux). TAALES 2.0 adds 316 indices to the original tool. These indices are related to word frequency, word range, n-gram frequency, n-gram range, n-gram strength of association, contextual distinctiveness, word recognition norms, semantic network, and word neighbors. In this study, we validated TAALES 2.0 by investigating whether its indices could be used to model both holistic scores of lexical proficiency in free writes and word choice scores in narrative essays. The results indicated that the TAALES 2.0 indices could be used to explain 58% of the variance in lexical proficiency scores and 32% of the variance in word-choice scores. Newly added TAALES 2.0 indices, including those related to n-gram association strength, word neighborhood, and word recognition norms, featured heavily in these predictor models, suggesting that TAALES 2.0 represents a substantial upgrade.
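
    A hedged sketch of one classic index of the kind TAALES reports, mean log10 word frequency, is given below; the tiny frequency table is invented, whereas TAALES draws its norms from large reference corpora.

        import math
        import re

        # Hypothetical frequency-per-million values; TAALES uses corpus norms.
        FREQ_PER_MILLION = {"the": 50000.0, "cat": 120.0, "sat": 45.0,
                            "ubiquitous": 3.2, "perfunctory": 0.4}

        def mean_log_frequency(text):
            """Lower scores = rarer words = greater lexical sophistication."""
            words = re.findall(r"[a-z]+", text.lower())
            logs = [math.log10(FREQ_PER_MILLION[w])
                    for w in words if w in FREQ_PER_MILLION]
            return sum(logs) / len(logs) if logs else float("nan")

        print(mean_log_frequency("The cat sat"))                 # common: higher
        print(mean_log_frequency("Ubiquitous perfunctory cat"))  # rarer: lower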

  9. Tool Use Within NASA Software Quality Assurance

    Science.gov (United States)

    Shigeta, Denise; Port, Dan; Nikora, Allen P.; Wilf, Joel

    2013-01-01

    As space mission software systems become larger and more complex, it is increasingly important for the software assurance effort to have the ability to effectively assess both the artifacts produced during software system development and the development process itself. Conceptually, assurance is a straightforward idea - it is the result of activities carried out by an organization independent of the software developers to better inform project management of potential technical and programmatic risks, and thus increase management's confidence in the decisions they ultimately make. In practice, effective assurance for large, complex systems often entails assessing large, complex software artifacts (e.g., requirements specifications, architectural descriptions) as well as substantial amounts of unstructured information (e.g., anomaly reports resulting from testing activities during development). In such an environment, assurance engineers can benefit greatly from appropriate tool support. In order to do so, an assurance organization will need accurate and timely information on the tool support available for various types of assurance activities. In this paper, we investigate the current use of tool support for assurance organizations within NASA, and describe on-going work at JPL for providing assurance organizations with the information about tools they need to use them effectively.

  10. The evolution of CMS software performance studies

    CERN Document Server

    Kortelainen, Matti J

    2010-01-01

    CMS has had an ongoing and dedicated effort to optimize software performance for several years. Initially this effort focused primarily on the cleanup of many issues coming from basic C++ errors, namely reducing dynamic memory churn, unnecessary copies/temporaries and tools to routinely monitor these things. Over the past 1.5 years, however, the transition to 64bit, newer versions of the gcc compiler, newer tools and the enabling of techniques like vectorization have made possible more sophisticated improvements to the software performance. This presentation will cover this evolution and describe the current avenues being pursued for software performance, as well as the corresponding gains.

  11. The evolution of CMS software performance studies

    Science.gov (United States)

    Kortelainen, M. J.; Elmer, P.; Eulisse, G.; Innocente, V.; Jones, C. D.; Tuura, L.

    2011-12-01

    CMS has had an ongoing and dedicated effort to optimize software performance for several years. Initially this effort focused primarily on the cleanup of many issues coming from basic C++ errors, namely reducing dynamic memory churn, unnecessary copies/temporaries and tools to routinely monitor these things. Over the past 1.5 years, however, the transition to 64bit, newer versions of the gcc compiler, newer tools and the enabling of techniques like vectorization have made possible more sophisticated improvements to the software performance. This presentation will cover this evolution and describe the current avenues being pursued for software performance, as well as the corresponding gains.

  12. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Science.gov (United States)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool proved to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.
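
    A hedged sketch of the sensor-loss sensitivity study described above follows; the detection matrix is invented, and the real ETA Tool operates on TEAMS Designer models rather than on plain arrays.

        import numpy as np

        # Rows = failure modes, columns = tests/sensors; True = test detects mode.
        D = np.array([
            [1, 0, 1, 0],
            [0, 1, 0, 0],
            [0, 1, 1, 0],
            [0, 0, 0, 1],
        ], dtype=bool)

        def coverage(detect):
            """Fraction of failure modes detected by at least one remaining test."""
            return detect.any(axis=1).mean()

        print(f"baseline coverage: {coverage(D):.0%}")
        for lost in range(D.shape[1]):
            remaining = np.delete(D, lost, axis=1)  # simulate losing one test
            print(f"without test {lost}: {coverage(remaining):.0%}")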

  13. Software Tools for Development on the Peregrine System | High-Performance

    Science.gov (United States)

    Software tools for development on the Peregrine system help users build and manage software at the source code level. The "Cross-Platform Make" (CMake) package is from Kitware, and SCons is a modern software build tool based on Python

  14. PISCES: A Tool for Predicting Software Testability

    Science.gov (United States)

    Voas, Jeffrey M.; Miller, Keith W.; Payne, Jeffery E.

    1991-01-01

    Before a program can fail, a software fault must be executed, that execution must alter the data state, and the incorrect data state must propagate to a state that results directly in an incorrect output. This paper describes a tool called PISCES (developed by Reliable Software Technologies Corporation) for predicting the probability that faults in a particular program location will accomplish all three of these steps causing program failure. PISCES is a tool that is used during software verification and validation to predict a program's testability.
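
    The three steps above (execution, infection, propagation) correspond to the PIE model of testability. The Python sketch below, which is not PISCES itself, estimates the propagation step for one program location by repeatedly perturbing the data state there and counting how often the output changes; the toy program and perturbation ranges are invented.

        import random

        def program(state):
            # Toy computation downstream of the location under study; the
            # modulo operation can mask an infected data state.
            return (state * 2) % 10

        def propagation_estimate(trials=10_000):
            """Fraction of random data-state perturbations that reach the output."""
            changed = 0
            for _ in range(trials):
                state = random.randint(1, 100)             # nominal data state
                perturbed = state + random.randint(1, 50)  # simulated infection
                if program(perturbed) != program(state):
                    changed += 1
            return changed / trials

        print(f"propagation estimate: {propagation_estimate():.2f}")  # ~0.80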

  15. EISA 432 Energy Audits Best Practices: Software Tools

    Energy Technology Data Exchange (ETDEWEB)

    Maryl Fisher

    2014-11-01

    Five whole building analysis software tools that can aid an energy manager with fulfilling energy audit and commissioning/retro-commissioning requirements were selected for review in this best practices study. A description of each software tool is provided as well as a discussion of the user interface and level of expertise required for each tool, a review of how to use the tool for analyzing energy conservation opportunities, the format and content of reports generated by the tool, and a discussion on the applicability of the tool for commissioning.

  16. WLS software for the Los Alamos geophysical instrumentation truck

    International Nuclear Information System (INIS)

    Ideker, C.D.; LaDelfe, C.M.

    1985-01-01

    Los Alamos National Laboratory's capabilities for special downhole geophysical well logging have increased steadily over the past few years. Software was originally developed for each individual tool as it became operational. With little or no standardization of tool software modules, software development became redundant, time consuming, and cost ineffective. With long-term use and the rapid evolution of well logging capacity in mind, Los Alamos and EG and G personnel decided to purchase a software system. The system was designed to offer: wide-range use and programming flexibility; standardized subroutines for tool module development; user-friendly operation to reduce training time; operator error checking and alarm activation; maximum growth capacity for new tools as they are added to the inventory; and the ability to incorporate changes made to the computer operating system and hardware. The end result is a sophisticated and flexible software tool for transferring downhole geophysical measurement data to computer disk files. This paper outlines the need, design, development, and implementation of the WLS software for geophysical data acquisition. A demonstration and working examples are included in the presentation

  17. Testing tool for software concerning nuclear power plant safety

    International Nuclear Information System (INIS)

    Boulc'h, J.; Le Meur, M.; Collart, J.M.; Segalard, J.; Uberschlag, J.

    1984-11-01

    In the present case, the software to be analyzed is written entirely in assembly language. This paper presents the study and realization of a tool for analyzing software that plays an important role in nuclear reactor protection and safeguard: the principles of the tool's design, its working principle, and the realization and evolution of the dynamic analysis tool.

  18. Structure and software tools of AIDA.

    Science.gov (United States)

    Duisterhout, J S; Franken, B; Witte, F

    1987-01-01

    AIDA consists of a set of software tools to allow for fast development and easy-to-maintain Medical Information Systems. AIDA supports all aspects of such a system both during development and operation. It contains tools to build and maintain forms for interactive data entry and on-line input validation, a database management system including a data dictionary and a set of run-time routines for database access, and routines for querying the database and output formatting. Unlike an application generator, the user of AIDA may select parts of the tools to fulfill his needs and program other subsystems not developed with AIDA. The AIDA software uses as host language the ANSI-standard programming language MUMPS, an interpreted language embedded in an integrated database and programming environment. This greatly facilitates the portability of AIDA applications. The database facilities supported by AIDA are based on a relational data model. This data model is built on top of the MUMPS database, the so-called global structure. This relational model overcomes the restrictions of the global structure regarding string length. The global structure is especially powerful for sorting purposes. Using MUMPS as a host language allows the user an easy interface between user-defined data validation checks or other user-defined code and the AIDA tools. AIDA has been designed primarily for prototyping and for the construction of Medical Information Systems in a research environment which requires a flexible approach. The prototyping facility of AIDA is terminal-independent and even, to a great extent, multilingual. Most of these features are table-driven; this allows on-line changes in the use of terminal type and language, but also causes overhead. AIDA has a set of optimizing tools by which it is possible to build faster, but (of course) less flexible code from these table definitions. By separating the AIDA software into a source and a run-time version, one is able to write

  19. Caesy: A software tool for computer-aided engineering

    Science.gov (United States)

    Wette, Matt

    1993-01-01

    A new software tool, Caesy, is described. This tool provides a strongly typed programming environment for research in the development of algorithms and software for computer-aided control system design. A description of the user language and its implementation as they currently stand are presented along with a description of work in progress and areas of future work.

  20. Software Development Methods and Tools: a New Zealand study

    OpenAIRE

    Chris Phillips; Elizabeth Kemp; Duncan Hedderley

    2005-01-01

    This study is a more detailed follow-up to a preliminary investigation of the practices of software engineers in New Zealand. The focus of this study is on the methods and tools used by software developers in their current organisation. The project involved detailed questionnaires being piloted and sent out to several hundred software developers. A central part of the research involved the identification of factors affecting the use and take-up of existing software development tools in the wo...

  1. Toxicity Estimation Software Tool (TEST)

    Science.gov (United States)

    The Toxicity Estimation Software Tool (TEST) was developed to allow users to easily estimate the toxicity of chemicals using Quantitative Structure Activity Relationships (QSARs) methodologies. QSARs are mathematical models used to predict measures of toxicity from the physical c...
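
    As a hedged illustration of the QSAR idea (not the validated models shipped with TEST), the Python sketch below fits a linear QSAR relating invented molecular descriptors to a toxicity endpoint and uses it to predict a new chemical.

        import numpy as np

        # Invented training data: rows = chemicals, columns = descriptors
        # (say, logP and molecular weight / 100); y = -log10(LC50).
        X = np.array([[1.2, 0.9], [2.5, 1.4], [3.1, 2.0], [0.8, 0.7], [2.0, 1.1]])
        y = np.array([3.1, 4.0, 4.8, 2.7, 3.7])

        # Ordinary least squares with an intercept column.
        A = np.column_stack([np.ones(len(X)), X])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)

        new_chemical = np.array([1.0, 2.7, 1.5])  # [intercept, logP, MW/100]
        print(f"predicted toxicity: {new_chemical @ coef:.2f}")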

  2. Software development tools using GPGPU potentialities

    International Nuclear Information System (INIS)

    Dudnik, V.A.; Kudryavtsev, V.I.; Sereda, T.M.; Us, S.A.; Shestakov, M.V.

    2011-01-01

    The paper deals with the potentialities of various up-to-date software development tools for making use of graphics processor (GPU) parallel computing resources. Examples are given to illustrate the use of present-day software tools for the development of applications and the realization of algorithms for scientific-technical calculations performed by GPGPU. The paper presents some classes of hard mathematical problems of scientific-technical calculations for which the GPGPU can be efficiently used. To reduce the development time of calculation programs that make use of GPGPU capabilities, various dedicated programming systems and problem-oriented subroutine libraries are recommended. Performance parameters when solving the problems with and without the use of GPGPU potentialities are compared.

  3. Runtime Performance Monitoring Tool for RTEMS System Software

    Science.gov (United States)

    Cho, B.; Kim, S.; Park, H.; Kim, H.; Choi, J.; Chae, D.; Lee, J.

    2007-08-01

    RTEMS is a commercial-grade real-time operating system that supports multi-processor computers. However, there are not many development tools for RTEMS. In this paper, we report a new RTEMS-based runtime performance monitoring tool. We have implemented a lightweight runtime monitoring task with an extension to the RTEMS APIs. Using our tool, software developers can verify various performance-related parameters during runtime. Our tool can be used during the software development phase and during in-orbit operation as well. Our implemented target agent is lightweight and has small overhead, using the SpaceWire interface. Efforts to reduce overhead and to add other monitoring parameters are currently under research.

  4. FFI: A software tool for ecological monitoring

    Science.gov (United States)

    Duncan C. Lutes; Nathan C. Benson; MaryBeth Keifer; John F. Caratti; S. Austin Streetman

    2009-01-01

    A new monitoring tool called FFI (FEAT/FIREMON Integrated) has been developed to assist managers with collection, storage and analysis of ecological information. The tool was developed through the complementary integration of two fire effects monitoring systems commonly used in the United States: FIREMON and the Fire Ecology Assessment Tool. FFI provides software...

  5. Software tool for data mining and its applications

    Science.gov (United States)

    Yang, Jie; Ye, Chenzhou; Chen, Nianyi

    2002-03-01

    A software tool for data mining is introduced, which integrates pattern recognition (PCA, Fisher, clustering, hyperenvelop, regression), artificial intelligence (knowledge representation, decision trees), statistical learning (rough set, support vector machine), and computational intelligence (neural network, genetic algorithm, fuzzy systems). It consists of nine function models: pattern recognition, decision trees, association rule, fuzzy rule, neural network, genetic algorithm, Hyper Envelop, support vector machine, and visualization. The principles and knowledge representation of some function models of data mining are described. The software tool is implemented in Visual C++ under Windows 2000. Nonmonotony in data mining is dealt with by concept hierarchy and layered mining. The software tool has been satisfactorily applied to the prediction of regularities in the formation of ternary intermetallic compounds in alloy systems and to the diagnosis of brain glioma.
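
    As a hedged sketch of the kind of pipeline such a tool integrates (using scikit-learn as a stand-in, not the paper's Visual C++ implementation), the fragment below chains a pattern-recognition module (PCA) into a decision tree and cross-validates the result on a standard dataset.

        from sklearn.datasets import load_iris
        from sklearn.decomposition import PCA
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.tree import DecisionTreeClassifier

        # Stand-in data; the paper applied its tool to alloy-system and
        # brain-glioma datasets.
        X, y = load_iris(return_X_y=True)

        # PCA feeding a decision tree: two of the tool's nine function models.
        model = make_pipeline(PCA(n_components=2),
                              DecisionTreeClassifier(max_depth=3, random_state=0))
        print(cross_val_score(model, X, y, cv=5).mean())  # mean accuracy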

  6. Claire, a simulation and testing tool for critical softwares

    International Nuclear Information System (INIS)

    Gassino, J.; Henry, J.Y.

    1996-01-01

    The needs of the CEA and IPSN (Institute of Nuclear Protection and Safety) concerning the testing of critical software have led to the development of the CLAIRE tool, which is able to test software without modification. This tool allows one to graphically model the system and its environment and to include components in the model which observe but do not modify the behaviour of the system to be tested. The executable codes are integrated in the model. The tool uses target machine simulators (microprocessors). The technique used (event simulation) allows actions to be associated with events such as the execution of an instruction, access to a variable, etc. The simulation results are exploited using graphic, state-search and test coverage measurement tools. In particular, this tool can help in the evaluation of critical software with pre-existing components. (J.S.)

  7. Software-based acoustical measurements

    CERN Document Server

    Miyara, Federico

    2017-01-01

    This textbook provides a detailed introduction to the use of software in combination with simple and economical hardware (a sound level meter with calibrated AC output and a digital recording system) to obtain sophisticated measurements usually requiring expensive equipment. It emphasizes the use of free, open source, and multiplatform software. Many commercial acoustical measurement systems use software algorithms as an integral component; however the methods are not disclosed. This book enables the reader to develop useful algorithms and provides insight into the use of digital audio editing tools to document features in the signal. Topics covered include acoustical measurement principles, in-depth critical study of uncertainty applied to acoustical measurements, digital signal processing from the basics, and metrologically-oriented spectral and statistical analysis of signals. The student will gain a deep understanding of the use of software for measurement purposes; the ability to implement software-based...
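
    A hedged Python sketch of the book's core idea, recovering an equivalent sound pressure level from a calibrated digital recording, is shown below; the calibration constant is hypothetical and would in practice be derived from a recording of the sound level meter's calibration tone.

        import numpy as np

        P_REF = 20e-6  # reference pressure, 20 uPa

        def spl_db(samples, pa_per_unit):
            """Equivalent sound pressure level of a calibrated sample block."""
            pressure = samples * pa_per_unit  # digital units -> pascals
            rms = np.sqrt(np.mean(pressure ** 2))
            return 20.0 * np.log10(rms / P_REF)

        # Synthetic 1 kHz tone; suppose calibration gave 0.5 Pa per full-scale unit.
        t = np.arange(48000) / 48000.0
        tone = 0.2 * np.sin(2 * np.pi * 1000.0 * t)  # peak 0.2 of full scale
        print(f"{spl_db(tone, pa_per_unit=0.5):.1f} dB SPL")  # about 71 dB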

  8. Application software profiles 2010

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2010-04-15

    This article presented information on new software applications designed to facilitate petroleum exploration, drilling and production activities. Computer modelling and analysis enables oil and gas producers to characterize reservoirs, estimate reserves forecast production, plan operations and manage assets. Seven Calgary-based organizations were highlighted along with their sophisticated software tools, the applications and the new features available in each product. The geoSCOUT version 7.7 by GeoLOGIC Systems Ltd. integrates public and proprietary data on wells, well logs, reserves, pipelines, production, ownership and seismic location data. The Value Navigator and AFE Navigator by Energy Navigator provides control over reserves, production and cash flow forecasting. FAST Harmony, FAST Evolution, FAST CBM, FAST FieldNotes, Fast Piper, FAST RTA, FAST VirtuWell and FAST WellTest by Fekete Associates Inc. provide reserve evaluations for reservoir engineering projects and production data analysis. The esi.manage software program by 3esi improves business results for upstream oil and gas companies through enhanced decision making and workforce effectiveness. WELLFLO, PIPEFLO, FORGAS, OLGA, Drillbench, and MEPO wellbore solutions by Neotec provide unique platforms for flow simulation to optimize oil and gas production systems. Petrel, ECLIPSE, Avocet, PipeSim and Merak software tools by Schlumberger Information Solutions are petroleum systems modelling tools for geologic mapping, visualization modelling and reservoir engineering. StudioSL by Streamsim Technologies Inc. is a modelling tool for optimizing flood management. figs.

  9. Installing and Setting Up Git Software Tool on Windows | High-Performance

    Science.gov (United States)

    Learn how to set up the Git software tool on Windows for use with the Peregrine system. In this doc, we'll show you how to get Git installed on Windows 7, and how to get things set up on NREL's

  10. Software tools for microprocessor based systems

    International Nuclear Information System (INIS)

    Halatsis, C.

    1981-01-01

    After a short review of the hardware and/or software tools for the development of single-chip, fixed instruction set microprocessor-based systems, we focus on the software tools for designing systems based on microprogrammed bit-sliced microprocessors. Emphasis is placed on meta-microassemblers and simulation facilities at the register-transfer level and architecture level. We review available meta-microassemblers, giving their most important features, advantages and disadvantages. We also cover extensions to higher-level microprogramming languages and associated systems specifically developed for bit-slices. In the area of simulation facilities we first discuss the simulation objectives and the criteria for choosing the right simulation language. We concentrate on simulation facilities already used in bit-slice projects and discuss the experience gained. We conclude by describing the way the Signetics meta-microassembler and the ISPS simulation tool have been employed in the design of a fast microprogrammed machine, called MICE, made out of ECL bit-slices. (orig.)

  11. Estimation of toxicity using a Java based software tool

    Science.gov (United States)

    A software tool has been developed that will allow a user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be accessed using a web browser (or alternatively downloaded and ran as a stand alone applic...

  12. The classification and evaluation of Computer-Aided Software Engineering tools

    OpenAIRE

    Manley, Gary W.

    1990-01-01

    Approved for public release; distribution unlimited. The use of Computer-Aided Software Engineering (CASE) tools has been viewed as a remedy for the software development crisis by achieving improved productivity and system quality via the automation of all or part of the software engineering process. The proliferation and tremendous variety of tools available have stretched the understanding of experienced practitioners and have had a profound impact on the software engineering process itse...

  13. A coherent environment of software improvement tools for CMS

    International Nuclear Information System (INIS)

    Eulisse, G.; Muzaffar, S.; Osborne, I.; Taylor, L.; Tuura, L.A.

    2004-01-01

    CMS has developed approximately one million lines of C++ code and uses many more from HEP, Grid and public domain projects. We describe a suite of tools which help to manage this complexity by measuring software dependencies, quality metrics, and CPU and memory performance. This coherent environment integrates and extends existing open-source tools where possible and provides new in-house components where a suitable solution does not already exist. This is a freely available environment with a graphical user interface which can be run on any software without the need to recompile or instrument it. We have developed ignominy, which performs software dependency analysis of source code, binary products and external software. CPU profiling is provided based on oprofile, with added features such as profile snapshots, distributed profiling and aggregate profiles for farm systems, including server-side tools for collecting profile data. Finally, we have developed a low-overhead performance and memory profiling tool, MemProf, which can perform (gprof-style) hierarchical performance profiling in a way that works with multiple threads and dynamically loaded libraries (unlike gprof). It also gathers exact memory allocation profiles, including which code allocates most, in what sizes of chunks, for how long, where the memory is getting freed and where it is getting leaked. We describe this tool suite and how it has been used to enhance the quality of CMS software
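
    As a hedged miniature of the dependency analysis ignominy performs (shown here for Python imports rather than C++ headers and libraries), the sketch below maps each source file in a hypothetical src/ tree to the top-level modules it imports.

        import ast
        from collections import defaultdict
        from pathlib import Path

        def import_graph(root):
            """Map each Python file to the set of top-level modules it imports."""
            graph = defaultdict(set)
            for path in Path(root).rglob("*.py"):
                tree = ast.parse(path.read_text(encoding="utf-8"))
                for node in ast.walk(tree):
                    if isinstance(node, ast.Import):
                        graph[str(path)].update(a.name.split(".")[0]
                                                for a in node.names)
                    elif isinstance(node, ast.ImportFrom) and node.module:
                        graph[str(path)].add(node.module.split(".")[0])
            return dict(graph)

        for module, deps in sorted(import_graph("src").items()):
            print(module, "->", sorted(deps))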

  14. Technology Transfer Challenges for High-Assurance Software Engineering Tools

    Science.gov (United States)

    Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.

    2003-01-01

    In this paper, we describe our experience with the challenges that we are currently facing in our effort to develop advanced software verification and validation tools. We categorize these challenges into several areas: cost benefits modeling, tool usability, customer application domain, and organizational issues. We provide examples of challenges in each area and identify open research issues in areas which limit our ability to transfer high-assurance software engineering tools into practice.

  15. Computer-aided design in power engineering. Application of software tools

    International Nuclear Information System (INIS)

    Stojkovic, Zlatan

    2012-01-01

    Demonstrates the use of software tools in the practice of design in the field of power systems. Presents many applications of design in the field of power systems. Useful for educative purposes and practical work. This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation. It guides the reader through complex problems, presented on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design. Several design example calculations are carried out using standard engineering software such as MATLAB, EMTP/ATP, Excel and Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools to project management in power systems is discussed. Here, the emphasis is put on the standard software MS Excel and MS Project.

  16. Computer-aided design in power engineering. Application of software tools

    Energy Technology Data Exchange (ETDEWEB)

    Stojkovic, Zlatan

    2012-07-01

    Demonstrates the use of software tools in the practice of design in the field of power systems. Presents many applications of design in the field of power systems. Useful for educative purposes and practical work. This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation. It guides the reader through complex problems, presented on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design. Several design example calculations are carried out using standard engineering software such as MATLAB, EMTP/ATP, Excel and Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools to project management in power systems is discussed. Here, the emphasis is put on the standard software MS Excel and MS Project.

  17. User Studies: Developing Learning Strategy Tool Software for Children.

    Science.gov (United States)

    Fitzgerald, Gail E.; Koury, Kevin A.; Peng, Hsinyi

    This paper is a report of user studies for developing learning strategy tool software for children. The prototype software demonstrated is designed for children with learning and behavioral disabilities. The tools consist of easy-to-use templates for creating organizational, memory, and learning approach guides for use in classrooms and at home.…

  18. A communication protocol for interactively controlling software tools

    NARCIS (Netherlands)

    Wulp, van der J.

    2008-01-01

    We present a protocol for interactively using software tools in a loosely coupled tool environment. Such an environment can assist the user in doing tasks that require the use of multiple tools. For example, it can invoke tools on certain input, set processing parameters, await task completion and

  19. The evolution of CACSD tools-a software engineering perspective

    DEFF Research Database (Denmark)

    Ravn, Ole; Szymkat, Maciej

    1992-01-01

    The earlier evolution of computer-aided control system design (CACSD) tools is discussed from a software engineering perspective. A model of the design process is presented as the basis for principles and requirements of future CACSD tools. Combinability, interfacing in memory, and an open...... workspace are seen as important concepts in CACSD. Some points are made about the problem of buy or make when new software is required, and the idea of buy and make is put forward. Emphasis is put on the time perspective and the life cycle of the software...

  20. A software communication tool for the tele-ICU.

    Science.gov (United States)

    Pimintel, Denise M; Wei, Shang Heng; Odor, Alberto

    2013-01-01

    The Tele Intensive Care Unit (tele-ICU) supports a high-volume, high-acuity population of patients. There is a high volume of incoming and outgoing calls through the tele-ICU hubs, especially during the evening and night hours. The tele-ICU clinicians must be able to communicate effectively with team members in order to support the care of complex and critically ill patients while supporting and maintaining a standard to improve time to intervention. This study describes a software communication tool that will improve the time to intervention compared with the paper-driven communication format presently used in the tele-ICU. The software provides a multi-relational database of message instances to mine information for evaluation and quality improvement for all entities that touch the tele-ICU. The software design incorporates years of critical care and software design experience combined with new skills acquired in an applied Health Informatics program. This software tool will function in the tele-ICU environment and perform as a front-end application that gathers, routes, and displays internal communication messages for intervention by priority and provider.

  1. Tuning COCOMO-II for Software Process Improvement: A Tool Based Approach

    Directory of Open Access Journals (Sweden)

    SYEDA UMEMA HANI

    2016-10-01

    Full Text Available In order to compete in the international software development market, software organizations have to adopt internationally accepted software practices, i.e. standards like ISO (International Organization for Standardization) or CMMI (Capability Maturity Model Integration), in spite of having scarce resources and tools. The aim of this study is to develop a tool which can be used to present an actual picture of the benefits of Software Process Improvement to software development companies. The few tools available to assist in making such predictions are too expensive and do not cover datasets that reflect the cultural behavior of software development organizations in developing countries. Extending our previously reported research, which quantified the benefits of SDPI (Software Development Process Improvement) for Pakistani software development organizations, this study used sixty-two datasets from three different software development organizations against the set of metrics used in COCOMO-II (Constructive Cost Model 2000). It derived a verifiable equation for calculating ISF (Ideal Scale Factor) and tuned the COCOMO-II model to provide prediction capability for SDPI benefit measurement classes such as ESCP (Effort, Schedule, Cost, and Productivity). This research contributes to the software industry by giving a reliable and low-cost mechanism for generating prediction models with high prediction accuracy. Hopefully, this study will help software organizations use this tool not only to predict ESCP but also to predict the exact impact of SDPI.
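
    The post-architecture COCOMO-II effort equation that such tuning targets is well documented. A minimal sketch in Python, using the published COCOMO II.2000 calibration constants (A = 2.94, B = 0.91) and hypothetical scale-factor and effort-multiplier values, illustrates how scale factors (the quantities an ISF calibration adjusts) feed the prediction:

```python
# Minimal COCOMO-II post-architecture effort sketch (COCOMO II.2000 calibration).
# The scale-factor ratings (sf) and effort multipliers (em) below are
# hypothetical placeholders; a tuned model would substitute calibrated values.

def cocomo2_effort(ksloc: float, scale_factors, effort_multipliers,
                   a: float = 2.94, b: float = 0.91) -> float:
    """Return estimated effort in person-months for `ksloc` thousand lines of code."""
    e = b + 0.01 * sum(scale_factors)          # exponent from the five scale factors
    eaf = 1.0
    for em in effort_multipliers:              # effort adjustment factor
        eaf *= em
    return a * (ksloc ** e) * eaf

# Example: a 50 KSLOC project with nominal-ish drivers.
sf = [3.72, 3.04, 4.24, 3.29, 4.68]            # hypothetical PREC, FLEX, RESL, TEAM, PMAT
em = [1.0, 1.1, 0.9]                           # hypothetical subset of the 17 multipliers
print(f"Estimated effort: {cocomo2_effort(50, sf, em):.1f} person-months")
```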

  2. Computer- Aided Design in Power Engineering Application of Software Tools

    CERN Document Server

    Stojkovic, Zlatan

    2012-01-01

    This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation. It guides the reader through complex problems, presented on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design. Several design example calculations are carried out using standard engineering software such as MATLAB, EMTP/ATP, Excel & Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools to project management in power systems ...

  3. Sandia software guidelines: Volume 5, Tools, techniques, and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    1989-07-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. This volume describes software tools and methodologies available to Sandia personnel for the development of software, and outlines techniques that have proven useful within the Laboratories and elsewhere. References and evaluations by Sandia personnel are included. 6 figs.

  4. Software Quality Control at Belle II

    Science.gov (United States)

    Ritter, M.; Kuhr, T.; Hauth, T.; Gebard, T.; Kristof, M.; Pulvermacher, C.; Belle Software Group, II

    2017-10-01

    Over the last seven years the software stack of the next generation B factory experiment Belle II has grown to over one million lines of C++ and Python code, counting only the part included in offline software releases. There are several thousand commits to the central repository by about 100 individual developers per year. Keeping the software stack coherent and of high quality, so that it can be sustained and used efficiently for data acquisition, simulation, reconstruction, and analysis over the lifetime of the Belle II experiment, is a challenge. A set of tools is employed to monitor the quality of the software and provide fast feedback to the developers. They are integrated in a machinery that is controlled by a buildbot master and automates the quality checks. The tools include different compilers, cppcheck, the clang static analyzer, valgrind memcheck, doxygen, a geometry overlap checker, a check for missing or extra library links, unit tests, steering file level tests, a sophisticated high-level validation suite, and an issue tracker. The technological development infrastructure is complemented by organizational means to coordinate the development.
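
    The Belle II machinery itself is not detailed in this record, but the pattern — a master process invoking a battery of independent checkers and aggregating their exit status — is easy to illustrate. A minimal sketch in Python follows; the checker commands are hypothetical stand-ins, not the Belle II configuration:

```python
# Minimal sketch of an automated quality-check runner in the spirit of the
# buildbot-driven machinery described above. The checker commands are
# hypothetical; a real setup would configure cppcheck, the clang analyzer, etc.
import shutil
import subprocess

CHECKS = [
    ("static analysis", ["cppcheck", "--error-exitcode=1", "src/"]),
    ("unit tests", ["python", "-m", "pytest", "tests/"]),
]

def run_checks(checks) -> bool:
    ok = True
    for name, cmd in checks:
        if shutil.which(cmd[0]) is None:       # skip tools not installed locally
            print(f"[skip] {name}: {cmd[0]} not found")
            continue
        result = subprocess.run(cmd, capture_output=True, text=True)
        passed = result.returncode == 0
        print(f"[{'pass' if passed else 'FAIL'}] {name}")
        ok = ok and passed
    return ok

if __name__ == "__main__":
    raise SystemExit(0 if run_checks(CHECKS) else 1)
```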

  5. OST: analysis tool for real time software by simulation of material and software environments

    International Nuclear Information System (INIS)

    Boulc'h; Le Meur; Lapassat; Salichon; Segalard

    1988-07-01

    The use of microprocessor systems in nuclear installation control demands a high degree of operational safety, both for installation operation and for protection of the environment. For the safety analysis of these installations, the Institute of Protection and Nuclear Safety (IPSN) requires tools which permit checks to be made throughout the life of the software. The simulation and test tool (OST) which has been created is implemented entirely in software. It is used on VAX computers and can easily be ported to other machines [fr

  6. A software tool for ecosystem services assessments

    Science.gov (United States)

    Riegels, Niels; Klinting, Anders; Butts, Michael; Middelboe, Anne Lise; Mark, Ole

    2017-04-01

    The EU FP7 DESSIN project is developing methods and tools for assessment of ecosystem services (ESS) and associated economic values, with a focus on freshwater ESS in urban settings. Although the ESS approach has gained considerable visibility over the past ten years, operationalizing the approach remains a challenge. Therefore, DESSIN is also supporting the development of a free software tool to support users implementing the DESSIN ESS evaluation framework. The DESSIN ESS evaluation framework is a structured approach to measuring changes in ecosystem services. The main purpose of the framework is to facilitate the application of the ESS approach in the appraisal of projects that have impacts on freshwater ecosystems and their services. The DESSIN framework helps users evaluate changes in ESS by linking biophysical, economic, and sustainability assessments sequentially. It was developed using the Common International Classification of Ecosystem Services (CICES) and the DPSIR (Drivers, Pressures, States, Impacts, Responses) adaptive management cycle. The former is a standardized system for the classification of ESS developed by the European Union to enhance the consistency and comparability of ESS assessments. The latter is a well-known concept used to disentangle the biophysical and social aspects of a system under study. As part of its analytical component, the DESSIN framework also integrates elements of the Final Ecosystem Goods and Services-Classification System (FEGS-CS) of the US Environmental Protection Agency (USEPA). As implemented in the software tool, the DESSIN framework consists of five parts: • In part I of the evaluation, the ecosystem is defined and described and the local stakeholders are identified. In addition, administrative details and objectives of the assessment are defined. • In part II, drivers and pressures are identified. Once these first two elements of the DPSIR scheme have been characterized, the claimed/expected capabilities of a

  7. Improving design processes through structured reflection : a prototype software tool

    OpenAIRE

    Reymen, I.M.M.J.; Melby, E.

    2001-01-01

    A prototype software tool facilitating the use of a design method supporting structured reflection on design processes is presented. The prototype, called Echo, has been developed to explore the benefits of using a software system to facilitate the use of the design method. Both the prototype software tool and the design method are developed as part of the Ph.D. project of Isabelle Reymen. The goal of the design method is supporting designers with reflection on design processes in a systemati...

  8. Software Tools to Support the Assessment of System Health

    Science.gov (United States)

    Melcher, Kevin J.

    2013-01-01

    This presentation provides an overview of three software tools that were developed by the NASA Glenn Research Center to support the assessment of system health: the Propulsion Diagnostic Method Evaluation Strategy (ProDIMES), the Systematic Sensor Selection Strategy (S4), and the Extended Testability Analysis (ETA) tool. Originally developed to support specific NASA projects in aeronautics and space, these software tools are currently available to U.S. citizens through the NASA Glenn Software Catalog. The ProDIMES software tool was developed to support a uniform comparison of propulsion gas path diagnostic methods. Methods published in the open literature are typically applied to dissimilar platforms with different levels of complexity. They often address different diagnostic problems and use inconsistent metrics for evaluating performance. As a result, it is difficult to perform a one-to-one comparison of the various diagnostic methods. ProDIMES solves this problem by serving as a theme problem to aid in propulsion gas path diagnostic technology development and evaluation. The overall goal is to provide a tool that will serve as an industry standard, and will truly facilitate the development and evaluation of significant Engine Health Management (EHM) capabilities. ProDIMES has been developed under a collaborative project of The Technical Cooperation Program (TTCP) based on feedback provided by individuals within the aircraft engine health management community. The S4 software tool provides a framework that supports the optimal selection of sensors for health management assessments. S4 is structured to accommodate user-defined applications, diagnostic systems, search techniques, and system requirements/constraints. It identifies one or more sensor suites that maximize diagnostic performance while meeting other user-defined system requirements. S4 provides a systematic approach for evaluating combinations of sensors to determine the set or sets of
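
    The record does not disclose S4's internals, but the core idea — searching over candidate sensor combinations for the suite that maximizes a user-defined merit function under constraints — can be sketched. A minimal exhaustive-search version in Python follows; the sensor names, merit scores, costs, and the additive merit model are all hypothetical simplifications (real diagnostic merit is generally not additive):

```python
# Minimal sketch of sensor-suite selection by exhaustive search, in the spirit
# of a systematic sensor selection strategy. Sensor names, merit scores, and
# the cost budget are hypothetical placeholders.
from itertools import combinations

SENSORS = {"T25": (0.30, 1.0), "P25": (0.25, 1.2),
           "T49": (0.40, 0.8), "N1": (0.20, 0.5)}  # name -> (merit, cost)

def best_suite(sensors: dict, max_cost: float, suite_size: int):
    """Return the suite of `suite_size` sensors with highest total merit within budget."""
    best, best_merit = None, -1.0
    for suite in combinations(sensors, suite_size):
        merit = sum(sensors[s][0] for s in suite)
        cost = sum(sensors[s][1] for s in suite)
        if cost <= max_cost and merit > best_merit:
            best, best_merit = suite, merit
    return best, best_merit

suite, merit = best_suite(SENSORS, max_cost=2.0, suite_size=2)
print(f"Selected suite: {suite}, total merit {merit:.2f}")
```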

  9. REVEAL - A tool for rule driven analysis of safety critical software

    International Nuclear Information System (INIS)

    Miedl, H.; Kersken, M.

    1998-01-01

    As the determination of ultrahigh reliability figures for safety critical software is hardly possible, national and international guidelines and standards mainly give requirements for the qualitative evaluation of software. An analysis of whether all these requirements are fulfilled is time and effort consuming and prone to errors if performed manually by analysts, and should instead be delegated to tools as far as possible. There are many ''general-purpose'' software analysis tools, both static and dynamic, which help in analyzing the source code. However, they are not designed to assess the adherence to specific requirements of guidelines and standards in the nuclear field. Against the background of the development of I and C systems in the nuclear field which are based on digital techniques and implemented in a high-level language, it is essential that the assessor or licenser has a tool with which he can automatically and uniformly qualify as many aspects as possible of the high-level language software. For this purpose the software analysis tool REVEAL has been developed at ISTec and the Halden Reactor Project. (author)

  10. Lessons learned applying CASE methods/tools to Ada software development projects

    Science.gov (United States)

    Blumberg, Maurice H.; Randall, Richard L.

    1993-01-01

    This paper describes the lessons learned from introducing CASE methods/tools into organizations and applying them to actual Ada software development projects. This paper will be useful to any organization planning to introduce a software engineering environment (SEE) or evolving an existing one. It contains management level lessons learned, as well as lessons learned in using specific SEE tools/methods. The experiences presented are from Alpha Test projects established under the STARS (Software Technology for Adaptable and Reliable Systems) project. They reflect the front end efforts by those projects to understand the tools/methods, initial experiences in their introduction and use, and later experiences in the use of specific tools/methods and the introduction of new ones.

  11. Holistic Framework For Establishing Interoperability of Heterogeneous Software Development Tools

    National Research Council Canada - National Science Library

    Puett, Joseph

    2003-01-01

    This dissertation presents a Holistic Framework for Software Engineering (HFSE) that establishes collaborative mechanisms by which existing heterogeneous software development tools and models will interoperate...

  12. Software Engineering Tools for Scientific Models

    Science.gov (United States)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere-ocean model called ModelE, written in fixed-format Fortran.

  13. Module Testing Techniques for Nuclear Safety Critical Software Using LDRA Testing Tool

    International Nuclear Information System (INIS)

    Moon, Kwon-Ki; Kim, Do-Yeon; Chang, Hoon-Seon; Chang, Young-Woo; Yun, Jae-Hee; Park, Jee-Duck; Kim, Jae-Hack

    2006-01-01

    The safety critical software in the I and C systems of nuclear power plants requires high functional integrity and reliability. To achieve those goals, the safety critical software should be verified and tested according to related codes and standards through verification and validation (V and V) activities. Safety critical software testing is performed at various stages during the development of the software, and is generally classified into three major activities: module testing, system integration testing, and system validation testing. Module testing involves the evaluation of module-level functions of hardware and software. System integration testing investigates the characteristics of a collection of modules and aims at establishing their correct interactions. System validation testing demonstrates that the complete system satisfies its functional requirements. In order to generate reliable software and reduce high maintenance cost, it is important that software testing is carried out at module level. Module testing for nuclear safety critical software has rarely been performed with formal and proven testing tools because of its various constraints. The LDRA testing tool is a widely used and proven tool set that provides powerful source code testing and analysis facilities for the V and V of general purpose software and safety critical software. Use of the tool set is indispensable where software is required to be reliable and as error-free as possible, and its use brings substantial time and cost savings and efficiency.

  14. Characteristics and possibilities of software tool for metal-oxide surge arresters selection

    Directory of Open Access Journals (Sweden)

    Đorđević Dragan

    2012-01-01

    Full Text Available This paper presents a procedure for the selection of metal-oxide surge arresters based on the instructions given in the Siemens and ABB catalogues, respecting their differences and the characteristics and possibilities of the software tool. The software tool was developed during the preparation of a Master's thesis titled 'Automation of Metal-Oxide Surge Arresters Selection'. An example is presented of the selection of metal-oxide surge arresters using the developed software tool.

  15. Experience with case tools in the design of process-oriented software

    Science.gov (United States)

    Novakov, Ognian; Sicard, Claude-Henri

    1994-12-01

    In accelerator systems such as the CERN PS complex, process equipment has a lifetime which may exceed the typical life cycle of its related software. Taking into account the variety of such equipment, it is important to keep the analysis and design of the software in a system-independent form. This paper discusses the experience gathered in using commercial CASE tools for analysis, design and reverse engineering of different process-oriented software modules, with a principal emphasis on maintaining the initial analysis in a standardized form. Such tools have been in existence for several years, but this paper shows that they are not fully adapted to our needs. In particular, the paper stresses the problems of integrating such a tool into an existing database-dependent development chain, the lack of real-time simulation tools and of object-oriented concepts in existing commercial packages. Finally, the paper gives a broader view of software engineering needs in our particular context.

  16. Software Tools Used for Continuous Assessment

    Directory of Open Access Journals (Sweden)

    Corina SBUGHEA

    2016-04-01

    Full Text Available The present paper addresses the subject of continuous evaluation and of the IT tools that support it. The approach starts from the main concepts and methods used in the teaching process, according to the assessment methodology, and then focuses on their implementation in the Wondershare QuizCreator software.

  17. A multicenter study benchmarks software tools for label-free proteome quantification.

    Science.gov (United States)

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
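
    LFQbench itself is an R package; purely to illustrate the kind of metric it reports for hybrid proteome samples of defined composition, a sketch in Python with hypothetical intensity data follows. Accuracy is taken as the bias of observed per-species log-ratios against the expected ratio, and precision as their spread:

```python
# Illustrative sketch of LFQbench-style accuracy/precision metrics (the real
# tool is an R package). Intensities and expected ratios are hypothetical.
import math
import statistics

# observed protein intensities in samples A and B, grouped by species
observed = {
    "human": [(1000, 990), (520, 500), (2100, 2040)],   # expected A:B = 1:1
    "yeast": [(800, 410), (300, 145), (1200, 620)],     # expected A:B = 2:1
}
expected_log2 = {"human": 0.0, "yeast": 1.0}

for species, pairs in observed.items():
    ratios = [math.log2(a / b) for a, b in pairs]
    bias = statistics.median(ratios) - expected_log2[species]  # accuracy
    spread = statistics.stdev(ratios)                          # precision
    print(f"{species}: median log2-ratio bias {bias:+.3f}, stdev {spread:.3f}")
```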

  18. A multi-professional software tool for radiation therapy treatment verification

    International Nuclear Information System (INIS)

    Fox, Tim; Brooks, Ken; Davis, Larry

    1996-01-01

    Purpose: Verification of patient setup is important in conformal therapy because it provides a means of quality assurance for treatment delivery. Electronic portal imaging systems have led to software tools for performing digital comparison and verification of patient setup. However, these software tools are typically designed from a radiation oncologist's perspective even though treatment verification is a team effort involving oncologists, physicists, and therapists. A new software tool, Treatment Verification Tool (TVT), has been developed as an interactive, multi-professional application for reviewing and verifying treatment plan setup using conventional personal computers. This study describes our approach to electronic treatment verification and demonstrates the features of TVT. Methods and Materials: TVT is an object-oriented software tool written in C++ using the PC-based Windows NT environment. The software allows the selection of a patient's images from a database. The software is also developed as a single-window interface to reduce the number of windows presented to the user. However, the user can select from four different possible views of the patient data. One of the views is a side-by-side comparison of portal images (on-line portal images or digitized port film) with a prescription image (digitized simulator film or digitally reconstructed radiograph), and another view is a textual summary of the grades of each portal image. The grades of a portal image are assigned by a radiation oncologist using an evaluation method, and the physicists and therapists may only review these results. All users of TVT can perform image enhancement, measure distances, and perform semi-automated registration. An electronic dialogue can be established through a set of annotations and notes among the radiation oncologists and the technical staff. Results: Features of TVT include: 1) side-by-side comparison of portal images and a prescription image; 2

  19. Possibilities for using software tools in the process of security design

    Directory of Open Access Journals (Sweden)

    Ladislav Mariš

    2013-07-01

    Full Text Available The authors deal with the use of software to support the process of security design. The article proposes a theoretical basis for the implementation of software tools in design activities. Based on selected design standards for electrical security systems, it applies design solutions, especially in drawing documentation. The article should serve the needs of project team members wishing to use selected software tools and subsequently increase the degree of automation of design activities.

  20. Leveraging Existing Mission Tools in a Re-Usable, Component-Based Software Environment

    Science.gov (United States)

    Greene, Kevin; Grenander, Sven; Kurien, James; O'Reilly, Taifun

    2006-01-01

    Emerging methods in component-based software development offer significant advantages but may seem incompatible with existing mission operations applications. In this paper we relate our positive experiences integrating existing mission applications into component-based tools we are delivering to three missions. In most operations environments, a number of software applications have been integrated together to form the mission operations software. In contrast, with component-based software development, chunks of related functionality and data structures, referred to as components, can be individually delivered, integrated and re-used. With the advent of powerful tools for managing component-based development, complex software systems can potentially see significant benefits in ease of integration, testability and reusability from these techniques. These benefits motivate us to ask how component-based development techniques can be relevant in a mission operations environment, where there is significant investment in software tools that are not component-based and may not be written in languages for which component-based tools even exist. Trusted and complex software tools for sequencing, validation, navigation, and other vital functions cannot simply be re-written or abandoned in order to gain the advantages offered by emerging component-based software techniques. Thus some middle ground must be found. We have faced exactly this issue, and have found several solutions. Ensemble is an open platform for development, integration, and deployment of mission operations software that we are developing. Ensemble itself is an extension of an open source, component-based software development platform called Eclipse. Due to the advantages of component-based development, we have been able to very rapidly develop mission operations tools for three surface missions by mixing and matching from a common set of mission operation components. We have also had to determine how to

  1. A computer-aided software-tool for sustainable process synthesis-intensification

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan; Babi, Deenesh K.; Bottlaender, Jack

    2017-01-01

    and determine within the design space, the more sustainable processes. In this paper, an integrated computer-aided software-tool that searches the design space for hybrid/intensified more sustainable process options is presented. Embedded within the software architecture are process synthesis...... operations as well as reported hybrid/intensified unit operations is large and can be difficult to manually navigate in order to determine the best process flowsheet for the production of a desired chemical product. Therefore, it is beneficial to utilize computer-aided methods and tools to enumerate, analyze...... constraints while also matching the design targets, they are therefore more sustainable than the base case. The application of the software-tool to the production of biodiesel is presented, highlighting the main features of the computer-aided, multi-stage, multi-scale methods that are able to determine more...

  2. Westinghouse waste simulation and optimization software tool

    International Nuclear Information System (INIS)

    Mennicken, Kim; Aign, Jorg

    2013-01-01

    Applications for dynamic simulation can be found in virtually all areas of process engineering. The tangible benefits of using dynamic simulation can be seen in tighter design, smoother start-ups and optimized operation. Thus, proper implementation of dynamic simulation can deliver substantial benefits. These benefits are typically derived from improved process understanding. Simulation gives confidence in evidence-based decisions and enables users to try out many 'what if' scenarios until one is sure that a decision is the right one. In radioactive waste treatment, different kinds of waste with different volumes and properties have to be treated, e.g. from NPP operation or D and D activities. Finding a commercially and technically optimized waste treatment concept is a time-consuming and difficult task. The Westinghouse Waste Simulation and Optimization Software Tool enables the user to quickly generate reliable simulation models of various process applications based on equipment modules. These modules can be built with ease and be integrated into the simulation model. This capability ensures that the tool is applicable to typical waste treatment tasks. The identified waste streams and the selected treatment methods are the basis of the simulation and optimization software. After implementing suitable equipment data into the model, process requirements and waste treatment data are fed into the simulation to finally generate primary simulation results. A sensitivity analysis using the automated optimization features of the software generates the lowest possible lifecycle cost for the simulated waste stream. In combination with proven waste management equipment and integrated waste management solutions, this tool provides reliable qualitative results that lead to effective planning and minimizes the total project planning risk of any waste management activity. It is thus the ideal tool for designing a waste treatment facility in an optimum manner.

  3. Westinghouse waste simulation and optimization software tool

    Energy Technology Data Exchange (ETDEWEB)

    Mennicken, Kim; Aign, Jorg [Westinghouse Electric Germany GmbH, Hamburg (Germany)

    2013-07-01

    Applications for dynamic simulation can be found in virtually all areas of process engineering. The tangible benefits of using dynamic simulation can be seen in tighter design, smoother start-ups and optimized operation. Thus, proper implementation of dynamic simulation can deliver substantial benefits. These benefits are typically derived from improved process understanding. Simulation gives confidence in evidence-based decisions and enables users to try out many 'what if' scenarios until one is sure that a decision is the right one. In radioactive waste treatment, different kinds of waste with different volumes and properties have to be treated, e.g. from NPP operation or D and D activities. Finding a commercially and technically optimized waste treatment concept is a time-consuming and difficult task. The Westinghouse Waste Simulation and Optimization Software Tool enables the user to quickly generate reliable simulation models of various process applications based on equipment modules. These modules can be built with ease and be integrated into the simulation model. This capability ensures that the tool is applicable to typical waste treatment tasks. The identified waste streams and the selected treatment methods are the basis of the simulation and optimization software. After implementing suitable equipment data into the model, process requirements and waste treatment data are fed into the simulation to finally generate primary simulation results. A sensitivity analysis using the automated optimization features of the software generates the lowest possible lifecycle cost for the simulated waste stream. In combination with proven waste management equipment and integrated waste management solutions, this tool provides reliable qualitative results that lead to effective planning and minimizes the total project planning risk of any waste management activity. It is thus the ideal tool for designing a waste treatment facility in an optimum manner.

  4. Three novel software tools for ASDEX Upgrade

    International Nuclear Information System (INIS)

    Martinov, S.; Löbhard, T.; Lunt, T.; Behler, K.; Drube, R.; Eixenberger, H.; Herrmann, A.; Lohs, A.; Lüddecke, K.; Merkel, R.; Neu, G.; ASDEX Upgrade Team; MPCDF Garching

    2016-01-01

    Highlights: • Key features of innovative software tools for data visualization and inspection are presented to the nuclear fusion research community. • 3D animation of experiment geometry together with diagnostic data and images allows better understanding of measurements and of the influence of machine construction details behind them. • A multi-video viewer with fusion-relevant image manipulation abilities and event database features allows faster and better decision making from video streams coming from various plasma and machine diagnostics. • Platform-independent Web technologies enable the inspection of diagnostic raw signals with virtually any kind of display device. - Abstract: Visualization of measurements together with experimental settings is a general subject in experiment analysis. The complex engineering design, 3D geometry, and manifold of diagnostics in larger fusion research experiments justify the development of special analysis and visualization programs. Novel ASDEX Upgrade (AUG) software tools bring together virtual navigation through 3D device models and advanced play-back and interpretation of video streams from plasma discharges. A third, small tool allows the web-based, platform-independent observation of real-time diagnostic signals. While all three tools stem from spontaneous development ideas and are not considered mission critical for the operation of a fusion device, with time and growing completeness they have shaped up as valuable helpers to visualize acquired data in fusion research. A short overview of the goals, the features, and the design as well as the operation of these tools is given in this paper.

  5. Three novel software tools for ASDEX Upgrade

    Energy Technology Data Exchange (ETDEWEB)

    Martinov, S. [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, D-85748 Garching bei München (Germany); Löbhard, T. [Conovum GmbH & Co. KG, Nymphenburger Straße 13, D-80335 München (Germany); Lunt, T. [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, D-85748 Garching bei München (Germany); Behler, K., E-mail: karl.behler@ipp.mpg.de [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, D-85748 Garching bei München (Germany); Drube, R.; Eixenberger, H.; Herrmann, A.; Lohs, A. [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, D-85748 Garching bei München (Germany); Lüddecke, K. [Unlimited Computer Systems GmbH, Seeshaupterstr. 15, D-82393 Iffeldorf (Germany); Merkel, R.; Neu, G.; ASDEX Upgrade Team [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, D-85748 Garching bei München (Germany); MPCDF Garching [Max Planck Computing and Data Facility, Boltzmannstr. 2, D-85748 Garching (Germany)

    2016-11-15

    Highlights: • Key features of innovative software tools for data visualization and inspection are presented to the nuclear fusion research community. • 3D animation of experiment geometry together with diagnostic data and images allows better understanding of measurements and of the influence of machine construction details behind them. • A multi-video viewer with fusion-relevant image manipulation abilities and event database features allows faster and better decision making from video streams coming from various plasma and machine diagnostics. • Platform-independent Web technologies enable the inspection of diagnostic raw signals with virtually any kind of display device. - Abstract: Visualization of measurements together with experimental settings is a general subject in experiment analysis. The complex engineering design, 3D geometry, and manifold of diagnostics in larger fusion research experiments justify the development of special analysis and visualization programs. Novel ASDEX Upgrade (AUG) software tools bring together virtual navigation through 3D device models and advanced play-back and interpretation of video streams from plasma discharges. A third, small tool allows the web-based, platform-independent observation of real-time diagnostic signals. While all three tools stem from spontaneous development ideas and are not considered mission critical for the operation of a fusion device, with time and growing completeness they have shaped up as valuable helpers to visualize acquired data in fusion research. A short overview of the goals, the features, and the design as well as the operation of these tools is given in this paper.

  6. Use of Software Tools in Teaching Relational Database Design.

    Science.gov (United States)

    McIntyre, D. R.; And Others

    1995-01-01

    Discusses the use of state-of-the-art software tools in teaching a graduate-level advanced relational database design course. Results indicated a positive student response to the prototype expert systems software and a willingness to utilize this new technology both in their studies and in future work applications. (JKP)

  7. PTaaS: Platform for Providing Software Developing Applications and Tools as a Service

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali

    2014-01-01

    technological support for it that is not limited to one specific tool and a particular phase of the software development life cycle. In this thesis, we have explored the possibility of offering software development applications and tools as services that can be acquired on demand according to the software...... with process. Information gained from the review of literature on GSD tools and processes is used to extract functional requirements for the middleware platform for provisioning of software development applications and tools as services. Findings from the review of literature on architecture solutions for cloud......Cloud computing has become an established paradigm for enabling organizations to build scalable software systems and to meet challenges of rapid demand of computing and storage resources. There has been a significant success in building cloud-enabled applications for many disciplines ranging from

  8. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify the relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), which is an area of constant intraplate seismicity and non-orogenic active tectonics and exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
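
    The record describes the method only at a high level; a toy illustration of the underlying idea — flagging abrupt slope breaks along a downstream elevation profile — can be sketched in Python. The profile data and break threshold below are hypothetical, and the real tool operates on DEM-derived drainage networks within ArcGIS:

```python
# Toy sketch of knickpoint detection along a single stream profile: flag
# points where local channel slope steepens abruptly. Profile data and the
# break threshold are hypothetical; Knickpoint Finder itself works on
# DEM-derived drainage networks within ArcGIS.

def knickpoints(distance, elevation, threshold=0.02):
    """Return indices where the downstream slope steepens by more than `threshold`."""
    slopes = [
        (elevation[i] - elevation[i + 1]) / (distance[i + 1] - distance[i])
        for i in range(len(elevation) - 1)
    ]
    return [
        i + 1 for i in range(len(slopes) - 1)
        if slopes[i + 1] - slopes[i] > threshold   # abrupt steepening downstream
    ]

dist = [0, 100, 200, 300, 400, 500]          # metres along channel
elev = [250, 248, 246, 240, 228, 226]        # metres a.s.l., one steep reach
print("Knickpoint indices:", knickpoints(dist, elev))   # -> [2, 3]
```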

  9. The Value of Open Source Software Tools in Qualitative Research

    Science.gov (United States)

    Greenberg, Gary

    2011-01-01

    In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…

  10. Software simulation: a tool for enhancing control system design

    International Nuclear Information System (INIS)

    Sze, B.; Ridgway, G.H.

    2008-01-01

    The creation, implementation and management of engineering design tools are important to the quality and efficiency of any large engineering project. Some of the most complicated tools to develop are system simulators. The development and implementation of system simulators to support replacement fuel handling control systems is of particular interest to the Canadian nuclear industry given the current age of installations and the risk of obsolescence to many utilities. The use of such simulator tools has been known to significantly improve successful deployment of new software packages and maintenance-related software changes while reducing the time required for their overall development. Moreover, these simulation systems can also serve as operator training stations and provide a virtual environment for site engineers to test operational changes before they are uploaded to the actual system. (author)

  11. Biology Needs Evolutionary Software Tools: Let’s Build Them Right

    Science.gov (United States)

    Team, Galaxy; Goecks, Jeremy; Taylor, James

    2018-01-01

    Abstract Research in population genetics and evolutionary biology has always provided a computational backbone for the life sciences as a whole. Today evolutionary and population biology reasoning are essential for the interpretation of large complex datasets that are characteristic of all domains of today's life sciences, ranging from cancer biology to microbial ecology. This situation makes the algorithms and software tools developed by our community more important than ever before. This means that we, developers of software tools for molecular evolutionary analyses, now have a shared responsibility to make these tools accessible using modern technological developments, as well as to provide adequate documentation and training. PMID:29688462

  12. PAnalyzer: A software tool for protein inference in shotgun proteomics

    Directory of Open Access Journals (Sweden)

    Prieto Gorka

    2012-11-01

    Full Text Available Abstract Background Protein inference from peptide identifications in shotgun proteomics must deal with ambiguities that arise due to the presence of peptides shared between different proteins, which is common in higher eukaryotes. Recently, data-independent acquisition (DIA) approaches have emerged as an alternative to the traditional data-dependent acquisition (DDA) in shotgun proteomics experiments. MSE is the term used to name one of the DIA approaches used in QTOF instruments. MSE data require specialized software to process acquired spectra and to perform peptide and protein identifications. However, the software available at the moment does not group the identified proteins in a transparent way by taking into account peptide evidence categories. Furthermore, the inspection, comparison and reporting of the obtained results require tedious manual intervention. Here we report a software tool to address these limitations for MSE data. Results In this paper we present PAnalyzer, a software tool focused on the protein inference process in shotgun proteomics. Our approach considers all the identified proteins and groups them when necessary, indicating their confidence using different evidence categories. PAnalyzer can read protein identification files in the XML output format of the ProteinLynx Global Server (PLGS) software provided by Waters Corporation for their MSE data, and also in the mzIdentML format recently standardized by HUPO-PSI. Multiple files can also be read simultaneously and are considered as technical replicates. Results are saved to CSV, HTML and mzIdentML (in the case of a single mzIdentML input file) files. An MSE analysis of a real sample is presented to compare the results of PAnalyzer and ProteinLynx Global Server. Conclusions We present a software tool to deal with the ambiguities that arise in the protein inference process. Key contributions are support for MSE data analysis by ProteinLynx Global Server and technical replicates
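
    The grouping logic the record alludes to can be illustrated with a toy example: proteins identified by exactly the same peptide set are indistinguishable, proteins whose peptides are a strict subset of another's carry no independent evidence, and proteins with unique peptides are conclusive. A minimal sketch in Python follows; the peptide-to-protein map is hypothetical, and PAnalyzer's actual evidence categories are richer than this:

```python
# Toy sketch of protein-inference evidence grouping: classify each protein by
# whether its peptide set is unique, shared, or subsumed. The data are
# hypothetical; PAnalyzer's actual categories are more refined.

proteins = {
    "P1": {"pepA", "pepB"},
    "P2": {"pepA", "pepB"},      # indistinguishable from P1
    "P3": {"pepB"},              # subsumed by P1/P2: no independent evidence
    "P4": {"pepC", "pepD"},      # has peptides seen nowhere else: conclusive
}

def classify(prots: dict) -> dict:
    labels = {}
    for name, peps in prots.items():
        others = {n: p for n, p in prots.items() if n != name}
        if any(peps == p for p in others.values()):
            labels[name] = "indistinguishable"
        elif any(peps < p for p in others.values()):   # strict subset
            labels[name] = "non-conclusive (subsumed)"
        elif peps - set().union(*others.values()):     # has unique peptides
            labels[name] = "conclusive"
        else:
            labels[name] = "ambiguous"
    return labels

for prot, label in classify(proteins).items():
    print(prot, "->", label)
```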

  13. Software engineering techniques and CASE tools in RD13

    Science.gov (United States)

    Buono, S.; Gaponenko, I.; Jones, R.; Khodabandeh, A.; Mapelli, L.; Mornacchi, G.; Prigent, D.; Sanchez-Corral, E.; Skiadelli, M.; Toppers, A.; Duval, P. Y.; Ferrato, D.; Le Van Suu, A.; Qian, Z.; Rondot, C.; Ambrosini, G.; Fumagalli, G.; Polesello, G.; Aguer, M.; Huet, M.

    1994-12-01

    The RD13 project was approved in April 1991 for the development of a scalable data-taking system suitable for hosting various LHC studies. One of its goals is the exploitation of software engineering techniques, in order to indicate their overall suitability for data acquisition (DAQ), software design and implementation. This paper describes how such techniques have been applied to the development of components of the RD13 DAQ used in test-beam runs at CERN. We describe our experience with the Artifex CASE tool and its associated methodology. The issues raised when code generated by a CASE tool has to be integrated into an existing environment are also discussed.

  14. A software tool for design of process monitoring and analysis systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    A well-designed process monitoring and analysis system is necessary to consistently achieve any predefined end product quality. Systematic computer-aided methods and tools provide the means to design the necessary process monitoring and analysis systems and/or to validate any existing monitoring...... and analysis system. A software tool to achieve this has been developed. Two developed supporting tools for the design, a knowledge base (consisting of the process knowledge as well as the knowledge on measurement methods & tools) and a model library (consisting of the process operational models), have been extended...... rigorously and integrated with the user interface, which made the software more generic and applicable to a wide range of problems. The software for the design of a process monitoring and analysis system is presented and illustrated with a tablet manufacturing process example....

  15. Design of parametric software tools

    DEFF Research Database (Denmark)

    Sabra, Jakob Borrits; Mullins, Michael

    2011-01-01

    The studies investigate the field of evidence-based design used in architectural design practice and propose a method using 2D/3D CAD applications to: 1) enhance integration of evidence-based design knowledge in architectural design phases with a focus on lighting and interior design and 2) assess...... fulfilment of evidence-based design criteria regarding light distribution and location in relation to patient safety in architectural health care design proposals. The study uses the 2D/3D CAD modelling software Rhinoceros 3D with the plug-in Grasshopper to create parametric tool prototypes to exemplify...... the operations and functions of the design method. To evaluate the prototype potentials, surveys with architectural and healthcare design companies are conducted. Evaluation is done by the administration of questionnaires being part of the development of the tools. The results show that architects, designers...

  16. Innovative Software Algorithms and Tools parallel sessions summary

    International Nuclear Information System (INIS)

    Gaines, Irwin

    2001-01-01

    A variety of results were presented in the poster session and the five parallel sessions on Innovative Software, Algorithms and Tools (ISAT). I will briefly summarize these presentations and attempt to identify some unifying trends.

  17. Software Tools Streamline Project Management

    Science.gov (United States)

    2009-01-01

    Three innovative software inventions from Ames Research Center (NETMARK, Program Management Tool, and Query-Based Document Management) are finding their way into NASA missions as well as industry applications. The first, NETMARK, is a program that enables integrated searching of data stored in a variety of databases and documents, meaning that users no longer have to look in several places for related information. NETMARK allows users to search and query information across all of these sources in one step. This cross-cutting capability in information analysis has exponentially reduced the amount of time needed to mine data from days or weeks to mere seconds. NETMARK has been used widely throughout NASA, enabling this automatic integration of information across many documents and databases. NASA projects that use NETMARK include the internal reporting system and project performance dashboard; Erasmus, NASA's enterprise management tool, which enhances organizational collaboration and information sharing through document routing and review; the Integrated Financial Management Program; International Space Station Knowledge Management; the Mishap and Anomaly Information Reporting System; and management of the Mars Exploration Rovers. Approximately $1 billion worth of NASA's projects are currently managed using Program Management Tool (PMT), which is based on NETMARK. PMT is a comprehensive, Web-enabled application tool used to assist program and project managers within NASA enterprises in monitoring, disseminating, and tracking the progress of program and project milestones and other relevant resources. The PMT consists of an integrated knowledge repository built upon advanced enterprise-wide database integration techniques and the latest Web-enabled technologies. The current system is in a pilot operational mode allowing users to automatically manage, track, define, update, and view customizable milestone objectives and goals. The third software invention, Query

  18. New technologies for supporting real-time on-board software development

    Science.gov (United States)

    Kerridge, D.

    1995-03-01

    The next generation of on-board data management systems will be significantly more complex than current designs, and will be required to perform more complex and demanding tasks in software. Improved hardware technology, in the form of the MA31750 radiation-hard processor, is one key component in addressing the needs of future embedded systems. However, to complement these hardware advances, improved support for the design and implementation of real-time data management software is now needed. This will help to control the cost and risk associated with developing data management software as it becomes an increasingly significant element within embedded systems. One particular problem with developing embedded software is managing the non-functional requirements in a systematic way. This paper identifies how Logica has exploited recent developments in hard real-time theory to address this problem through the use of new hard real-time analysis and design methods which can be supported by specialized tools. The first stage in transferring this technology from the research domain to industrial application has already been completed. The MA31750 Hard Real-Time Embedded Software Support Environment (HESSE) is a loosely integrated set of hardware and software tools which directly support the process of hard real-time analysis for software targeting the MA31750 processor. With further development, this HESSE promises to provide embedded system developers with software tools which can reduce the risks associated with developing complex hard real-time software. Supported in this way by more sophisticated software methods and tools, it is foreseen that MA31750-based embedded systems can meet the processing needs of the next generation of on-board data management systems.
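
    A representative technique from the body of hard real-time theory the paper refers to is fixed-priority response-time analysis, which iterates the recurrence R_i = C_i + Σ_{j∈hp(i)} ⌈R_i/T_j⌉·C_j to a fixed point. A minimal sketch in Python follows; the task set is hypothetical, and this is not necessarily the exact analysis the HESSE implements:

```python
# Minimal sketch of fixed-priority response-time analysis, a representative
# hard real-time schedulability test. Task set (C = WCET, T = period) is
# hypothetical; tasks are listed highest priority first, deadlines = periods.
import math

tasks = [(1, 4), (2, 6), (3, 12)]   # (C_i, T_i)

def response_time(i: int, tasks) -> float:
    c_i, t_i = tasks[i]
    r = c_i
    while True:   # iterate R = C_i + interference from higher-priority tasks
        r_next = c_i + sum(math.ceil(r / t_j) * c_j for c_j, t_j in tasks[:i])
        if r_next == r:
            return r            # fixed point reached
        if r_next > t_i:
            return float("inf") # unschedulable: misses its deadline
        r = r_next

for i, (c, t) in enumerate(tasks):
    r = response_time(i, tasks)
    print(f"task {i}: C={c} T={t} R={r} -> {'OK' if r <= t else 'MISS'}")
```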

  19. Improving design processes through structured reflection : a prototype software tool

    NARCIS (Netherlands)

    Reymen, I.M.M.J.; Melby, E.

    2001-01-01

    A prototype software tool facilitating the use of a design method supporting structured reflection on design processes is presented. The prototype, called Echo, has been developed to explore the benefits of using a software system to facilitate the use of the design method. Both the prototype

  20. Atrioventricular junction (AVJ) motion tracking: a software tool with ITK/VTK/Qt.

    Science.gov (United States)

    Pengdong Xiao; Shuang Leng; Xiaodan Zhao; Hua Zou; Ru San Tan; Wong, Philip; Liang Zhong

    2016-08-01

    The quantitative measurement of Atrioventricular Junction (AVJ) motion is an important index of ventricular function over the cardiac cycle, including both systole and diastole. In this paper, a software tool that conducts AVJ motion tracking from cardiovascular magnetic resonance (CMR) images is presented, built with the Insight Segmentation and Registration Toolkit (ITK), the Visualization Toolkit (VTK) and Qt. The software tool is written in C++ using the Visual Studio Community 2013 integrated development environment (IDE), which contains both an editor and a Microsoft compiler. The software package has been successfully implemented. From this software engineering practice, it is concluded that ITK, VTK, and Qt are very handy software systems for implementing automatic image analysis functions for CMR images, such as the quantitative measurement of motion by visual tracking.
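
    As a flavor of the visual tracking the authors describe, the sketch below follows a bright landmark between two frames by sum-of-squared-differences template matching over synthetic NumPy arrays; it illustrates the technique only and is not the published C++/ITK/VTK implementation.

```python
import numpy as np

def track_landmark(prev_frame, next_frame, pos, patch=7, search=5):
    """Track a landmark (e.g. an AVJ point) from prev_frame to next_frame by
    template matching within a small search window around its last position."""
    r, c = pos
    p = patch // 2
    template = prev_frame[r - p:r + p + 1, c - p:c + p + 1]
    best, best_pos = np.inf, pos
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            rr, cc = r + dr, c + dc
            candidate = next_frame[rr - p:rr + p + 1, cc - p:cc + p + 1]
            score = np.sum((candidate - template) ** 2)
            if score < best:
                best, best_pos = score, (rr, cc)
    return best_pos

# toy demo: a bright blob shifts by (2, 1) between two synthetic frames
f0 = np.zeros((64, 64)); f0[30:34, 30:34] = 1.0
f1 = np.zeros((64, 64)); f1[32:36, 31:35] = 1.0
print(track_landmark(f0, f1, (31, 31)))   # -> (33, 32)
```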

  1. A Web-based modeling tool for the SEMAT Essence theory of software engineering

    Directory of Open Access Journals (Sweden)

    Daniel Graziotin

    2013-09-01

    Full Text Available As opposed to more mature subjects, software engineering lacks general theories that establish its foundations as a discipline. The Essence Theory of software engineering (Essence) has been proposed by the Software Engineering Methods and Theory (SEMAT) initiative. The goal of Essence is to develop a theoretically sound basis for software engineering practice and its wide adoption. However, Essence is far from reaching academic- and industry-wide adoption. The reasons for this include a struggle to foresee its utilization potential and a lack of tools for implementation. SEMAT Accelerator (SematAcc) is a Web-positioning tool for a software engineering endeavor, which implements SEMAT's Essence kernel. SematAcc permits the use of Essence, thus helping to understand it. The tool enables the teaching, adoption, and research of Essence in controlled experiments and case studies.

  2. PT-SAFE: a software tool for development and annunciation of medical audible alarms.

    Science.gov (United States)

    Bennett, Christopher L; McNeer, Richard R

    2012-03-01

    Recent reports by The Joint Commission as well as the Anesthesia Patient Safety Foundation have indicated that medical audible alarm effectiveness needs to be improved. Several recent studies have explored various approaches to improving the audible alarms, motivating the authors to develop real-time software capable of comparing such alarms. We sought to devise software that would allow for the development of a variety of audible alarm designs that could also integrate into existing operating room equipment configurations. The software is meant to be used as a tool for alarm researchers to quickly evaluate novel alarm designs. A software tool was developed for the purpose of creating and annunciating audible alarms. The alarms consisted of annunciators that were mapped to vital sign data received from a patient monitor. An object-oriented approach to software design was used to create a tool that is flexible and modular at run-time, can annunciate wave-files from disk, and can be programmed with MATLAB by the user to create custom alarm algorithms. The software was tested in a simulated operating room to measure technical performance and to validate the time-to-annunciation against existing equipment alarms. The software tool showed efficacy in a simulated operating room environment by providing alarm annunciation in response to physiologic and ventilator signals generated by a human patient simulator, on average 6.2 seconds faster than existing equipment alarms. Performance analysis showed that the software was capable of supporting up to 15 audible alarms on a mid-grade laptop computer before audio dropouts occurred. These results suggest that this software tool provides a foundation for rapidly staging multiple audible alarm sets from the laboratory to a simulation environment for the purpose of evaluating novel alarm designs, thus producing valuable findings for medical audible alarm standardization.
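
    The alarm-to-vital-sign mapping the abstract describes can be sketched compactly; the thresholds, field names, and wave-file names below are hypothetical, and the real tool is MATLAB-programmable rather than Python.

```python
# Minimal sketch of annunciators mapped to vital-sign data, in the spirit of
# the paper's design. Hypothetical thresholds and file names.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Annunciator:
    name: str
    wav_file: str                        # sound played when triggered
    predicate: Callable[[dict], bool]    # maps a vitals sample to True/False

    def check(self, vitals):
        if self.predicate(vitals):
            print(f"ALARM {self.name}: would annunciate {self.wav_file}")

alarms = [
    Annunciator("SpO2 low", "spo2_low.wav", lambda v: v["spo2"] < 90),
    Annunciator("Tachycardia", "hr_high.wav", lambda v: v["hr"] > 120),
]

sample = {"spo2": 88, "hr": 118}         # one frame of simulated monitor data
for alarm in alarms:
    alarm.check(sample)                  # -> only the SpO2 alarm fires
```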

  3. Pension fund sophistication and investment policy

    NARCIS (Netherlands)

    de Dreu, J.|info:eu-repo/dai/nl/364537906; Bikker, J.A.|info:eu-repo/dai/nl/06912261X

    This paper assesses the sophistication of pension funds’ investment policies using data on 748 Dutch pension funds during the 1999–2006 period. We develop three indicators of sophistication: gross rounding of investment choices, investments in alternative sophisticated asset classes and ‘home bias’.

  4. New software tools for enhanced precision in robot-assisted laser phonomicrosurgery.

    Science.gov (United States)

    Dagnino, Giulio; Mattos, Leonardo S; Caldwell, Darwin G

    2012-01-01

    This paper describes a new software package created to enhance precision during robot-assisted laser phonomicrosurgery procedures. The new software is composed of three tools for camera calibration, automatic tumor segmentation, and laser tracking. These were designed and developed to improve the outcome of this demanding microsurgical technique, and were tested herein to produce quantitative performance data. The experimental setup was based on the motorized laser micromanipulator created by Istituto Italiano di Tecnologia and the experimental protocols followed are fully described in this paper. The results show the new tools are robust and effective: The camera calibration tool reduced residual errors (RMSE) to 0.009 ± 0.002 mm under 40× microscope magnification; the automatic tumor segmentation tool resulted in deep lesion segmentations comparable to manual segmentations (RMSE= 0.160 ± 0.028 mm under 40× magnification); and the laser tracker tool proved to be reliable even during cutting procedures (RMSE= 0.073 ± 0.023 mm under 40× magnification). These results demonstrate the new software package can provide excellent improvements to the previous microsurgical system, leading to important enhancements in surgical outcome.
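
    The authors' calibration code is not reproduced here, but the camera-calibration step they benchmark is conventionally done with a checkerboard and OpenCV; the sketch below shows that standard routine under hypothetical file names and board size, not the paper's implementation.

```python
import glob
import cv2
import numpy as np

pattern = (9, 6)   # inner chessboard corners per row/column (hypothetical)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for fname in glob.glob("calib_*.png"):            # hypothetical image set
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# rms is the reprojection error, analogous to the RMSE figures reported above
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("reprojection RMS error:", rms)
```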

  5. A multi-center study benchmarks software tools for label-free proteome quantification

    Science.gov (United States)

    Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan

    2016-01-01

    The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation window setups. For consistent evaluation we developed LFQbench, an R package to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404
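
    LFQbench itself is an R package; as a hedged Python rendering of its core idea, the sketch below scores simulated protein log-ratios from a hybrid-proteome design against their known expected values, separating systematic error (accuracy) from spread (precision).

```python
import numpy as np

# Expected log2(A/B) ratios per species in a hypothetical hybrid proteome design
expected_log2 = {"HUMAN": 0.0, "YEAST": 1.0, "ECOLI": -2.0}

rng = np.random.default_rng(0)
observed = {sp: mu + rng.normal(0.0, 0.3, 500)    # simulated observed ratios
            for sp, mu in expected_log2.items()}

for sp, ratios in observed.items():
    accuracy = np.median(ratios) - expected_log2[sp]   # systematic deviation
    precision = np.std(ratios)                         # random dispersion
    print(f"{sp}: accuracy={accuracy:+.3f}, precision={precision:.3f} (log2)")
```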

  6. Software Tools for Measuring and Calculating Electromagnetic Shielding Effectiveness

    National Research Council Canada - National Science Library

    Tesny, Neal

    2005-01-01

    The evaluation and the analysis of high-altitude electromagnetic pulse response of shielded enclosures require the availability of software tools able to acquire data and calculate shielding effectiveness...
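
    The record is brief, but the central quantity such tools compute reduces to a simple formula: shielding effectiveness in decibels is the logarithmic ratio of the unshielded to the shielded field. A one-function sketch (illustrative, not the report's software):

```python
import math

def shielding_effectiveness_db(field_unshielded, field_shielded):
    """SE (dB) = 20 * log10(|E_unshielded| / |E_shielded|) for field quantities."""
    return 20.0 * math.log10(field_unshielded / field_shielded)

print(shielding_effectiveness_db(10.0, 0.01))   # -> 60.0 dB
```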

  7. ProteoWizard: open source software for rapid proteomics tools development.

    Science.gov (United States)

    Kessner, Darren; Chambers, Matt; Burke, Robert; Agus, David; Mallick, Parag

    2008-11-01

    The ProteoWizard software project provides a modular and extensible set of open-source, cross-platform tools and libraries. The tools perform proteomics data analyses; the libraries enable rapid tool creation by providing a robust, pluggable development framework that simplifies and unifies data file access, and performs standard proteomics and LCMS dataset computations. The library contains readers and writers for the mzML data format, has been written using modern C++ techniques and design principles, and supports a variety of platforms with native compilers. The software has been specifically released under the Apache v2 license to ensure it can be used in both academic and commercial projects. In addition to the library, we also introduce a rapidly growing set of companion tools whose implementation helps to illustrate the simplicity of developing applications on top of the ProteoWizard library. Cross-platform software that compiles using native compilers (i.e. GCC on Linux, MSVC on Windows and XCode on OSX) is available for download free of charge at http://proteowizard.sourceforge.net. This website also provides code examples and documentation. It is our hope that the ProteoWizard project will become a standard platform for proteomics development; consequently, code use, contribution and further development are strongly encouraged.

  8. Proceedings of the Workshop on software tools for distributed intelligent control systems

    Energy Technology Data Exchange (ETDEWEB)

    Herget, C.J. (ed.)

    1990-09-01

    The Workshop on Software Tools for Distributed Intelligent Control Systems was organized by Lawrence Livermore National Laboratory for the United States Army Headquarters Training and Doctrine Command and the Defense Advanced Research Projects Agency. The goals of the workshop were to identify the current state of the art in tools which support control systems engineering design and implementation, identify research issues associated with writing software tools which would provide a design environment to assist engineers in multidisciplinary control design and implementation, formulate a potential investment strategy to resolve the research issues and develop public domain code which can form the core of more powerful engineering design tools, and recommend test cases to focus the software development process and test associated performance metrics. Recognizing that the development of software tools for distributed intelligent control systems will require a multidisciplinary effort, experts in systems engineering, control systems engineering, and computer science were invited to participate in the workshop. In particular, experts who could address the following topics were selected: operating systems, engineering data representation and manipulation, emerging standards for manufacturing data, mathematical foundations, coupling of symbolic and numerical computation, user interface, system identification, system representation at different levels of abstraction, system specification, system design, verification and validation, automatic code generation, and integration of modular, reusable code.

  9. COSTMODL - AN AUTOMATED SOFTWARE DEVELOPMENT COST ESTIMATION TOOL

    Science.gov (United States)

    Roush, G. B.

    1994-01-01

    The cost of developing computer software consumes an increasing portion of many organizations' budgets. As this trend continues, the capability to estimate the effort and schedule required to develop a candidate software product becomes increasingly important. COSTMODL is an automated software development estimation tool which fulfills this need. Adapting COSTMODL to any organization's particular environment can yield a significant reduction in the risk of cost overruns and failed projects. This user-customization capability is unmatched by any other available estimation tool. COSTMODL accepts a description of a software product to be developed and computes estimates of the effort required to produce it, the calendar schedule required, and the distribution of effort and staffing as a function of the defined set of development life-cycle phases. This is accomplished by the five cost estimation algorithms incorporated into COSTMODL: the NASA-developed KISS model; the Basic, Intermediate, and Ada COCOMO models; and the Incremental Development model. This choice affords the user the ability to handle project complexities ranging from small, relatively simple projects to very large projects. Unique to COSTMODL is the ability to redefine the life-cycle phases of development and the capability to display a graphic representation of the optimum organizational structure required to develop the subject project, along with required staffing levels and skills. The program is menu-driven and mouse sensitive with an extensive context-sensitive help system that makes it possible for a new user to easily install and operate the program and to learn the fundamentals of cost estimation without having prior training or separate documentation. The implementation of these functions, along with the customization feature, into one program makes COSTMODL unique within the industry. COSTMODL was written for IBM PC compatibles, and it requires Turbo Pascal 5.0 or later and Turbo
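
    Of the five algorithms listed, Basic COCOMO is the simplest to sketch; the coefficients below are Boehm's published values for the three project modes, not COSTMODL's user-calibrated ones.

```python
# Basic COCOMO: effort = a * KLOC^b (person-months), schedule = c * effort^d
# (calendar months). Standard published coefficients per development mode.
COEFFS = {  # mode: (a, b, c, d)
    "organic":       (2.4, 1.05, 2.5, 0.38),
    "semi-detached": (3.0, 1.12, 2.5, 0.35),
    "embedded":      (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kloc, mode="organic"):
    a, b, c, d = COEFFS[mode]
    effort = a * kloc ** b
    schedule = c * effort ** d
    return effort, schedule

effort, months = basic_cocomo(32, "embedded")
print(f"{effort:.1f} person-months over {months:.1f} calendar months")
```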

  10. Software tools for the particle accelerator designs

    International Nuclear Information System (INIS)

    Sugimoto, Masayoshi

    1988-01-01

    The software tools used for the design of particle accelerators are being implemented on small computer systems, such as personal computers or workstations. They are called from an interactive environment, like a windowed application program. The environment contains a small expert system that makes it easy to select the design parameters. (author)

  11. In Praise of the Sophists.

    Science.gov (United States)

    Gibson, Walker

    1993-01-01

    Discusses the thinking of the Greek Sophist philosophers, particularly Gorgias and Protagoras, and their importance and relevance for contemporary English instructors. Considers the problem of language as signs of reality in the context of Sophist philosophy. (HB)

  12. Benchmarking State-of-the-Art Deep Learning Software Tools

    OpenAIRE

    Shi, Shaohuai; Wang, Qiang; Xu, Pengfei; Chu, Xiaowen

    2016-01-01

    Deep learning has been shown as a successful machine learning method for a variety of tasks, and its popularity results in numerous open-source deep learning software tools. Training a deep network is usually a very time-consuming process. To address the computational challenge in deep learning, many tools exploit hardware features such as multi-core CPUs and many-core GPUs to shorten the training time. However, different tools exhibit different features and running performance when training ...

  13. Exoskeletons, Robots and System Software: Tools for the Warfighter

    Science.gov (United States)

    2012-04-24

    Exoskeletons, Robots and System Software: Tools for the Warfighter? Paul Flanagan, Tuesday, April 24, 2012, 11:15 am - 12:00 pm. Emerging technologies such as exoskeletons, robots, drones, and the underlying software are and will change the face of the battlefield. What is an exoskeleton? An exoskeleton is a wearable robot suit that

  14. A software tool for analyzing multichannel cochlear implant signals.

    Science.gov (United States)

    Lai, Wai Kong; Bögli, Hans; Dillier, Norbert

    2003-10-01

    A useful and convenient means to analyze the radio frequency (RF) signals being sent by a speech processor to a cochlear implant would be to actually capture and display them with appropriate software. This is particularly useful for development or diagnostic purposes. sCILab (Swiss Cochlear Implant Laboratory) is such a PC-based software tool intended for the Nucleus family of Multichannel Cochlear Implants. Its graphical user interface provides a convenient and intuitive means for visualizing and analyzing the signals encoding speech information. Both numerical and graphic displays are available for detailed examination of the captured CI signals, as well as an acoustic simulation of these CI signals. sCILab has been used in the design and verification of new speech coding strategies, and has also been applied as an analytical tool in studies of how different parameter settings of existing speech coding strategies affect speech perception. As a diagnostic tool, it is also useful for troubleshooting problems with the external equipment of the cochlear implant systems.

  15. SDMdata: A Web-Based Software Tool for Collecting Species Occurrence Records.

    Directory of Open Access Journals (Sweden)

    Xiaoquan Kong

    Full Text Available It is important to easily and efficiently obtain high quality species distribution data for predicting the potential distribution of species using species distribution models (SDMs. There is a need for a powerful software tool to automatically or semi-automatically assist in identifying and correcting errors. Here, we use Python to develop a web-based software tool (SDMdata to easily collect occurrence data from the Global Biodiversity Information Facility (GBIF and check species names and the accuracy of coordinates (latitude and longitude. It is an open source software (GNU Affero General Public License/AGPL licensed allowing anyone to access and manipulate the source code. SDMdata is available online free of charge from .

  16. Classifying Desirable Features of Software Visualization Tools for Corrective Maintenance

    NARCIS (Netherlands)

    Sensalire, Mariam; Ogao, Patrick; Telea, Alexandru

    2008-01-01

    We provide an evaluation of 15 software visualization tools applicable to corrective maintenance. The tasks supported as well as the techniques used are presented and graded based on the support level. By analyzing user acceptance of current tools, we aim to help developers to select what to

  17. An online database for plant image analysis software tools

    OpenAIRE

    Lobet, Guillaume; Draye, Xavier; Périlleux, Claire

    2013-01-01

    Background: Recent years have seen an increase in methods for plant phenotyping using image analyses. These methods require new software solutions for data extraction and treatment. These solutions are instrumental in supporting various research pipelines, ranging from the localisation of cellular compounds to the quantification of tree canopies. However, due to the variety of existing tools and the lack of a central repository, it is challenging for researchers to identify the software that is...

  18. ISWHM: Tools and Techniques for Software and System Health Management

    Science.gov (United States)

    Schumann, Johann; Mengshoel, Ole J.; Darwiche, Adnan

    2010-01-01

    This presentation reports the status and results of research on Software Health Management done within the NRA "ISWHM: Tools and Techniques for Software and System Health Management." Topics include: Ingredients of a Guidance, Navigation, and Control System (GN and C); Selected GN and C Testbed example; Health Management of major ingredients; ISWHM testbed architecture; and Conclusions and next Steps.

  19. Accuracy Test of Software Architecture Compliance Checking Tools – Test Instruction

    NARCIS (Netherlands)

    Pruijt, Leo; van der Werf, J.M.E.M.|info:eu-repo/dai/nl/36950674X; Brinkkemper., Sjaak|info:eu-repo/dai/nl/07500707X

    2015-01-01

    Software Architecture Compliance Checking (SACC) is an approach to verify conformance of implemented program code to high-level models of architectural design. Static SACC focuses on the modular software architecture and on the existence of rule violating dependencies between modules. Accurate tool

  20. Choosing your weapons : on sentiment analysis tools for software engineering research

    NARCIS (Netherlands)

    Jongeling, R.M.; Datta, S.; Serebrenik, A.; Koschke, R.; Krinke, J.; Robillard, M.

    2015-01-01

    Recent years have seen increasing attention to social aspects of software engineering, including studies of emotions and sentiments experienced and expressed by software developers. Most of these studies reuse existing sentiment analysis tools such as SentiStrength and NLTK. However, these
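
    As a concrete example of the reuse the study examines, NLTK ships a ready-made sentiment model (VADER) that can score developer messages out of the box; the messages below are invented for illustration.

```python
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)   # one-time lexicon download
sia = SentimentIntensityAnalyzer()

for msg in ["This build is hopelessly broken again.",
            "Great catch, the fix works perfectly!"]:
    print(sia.polarity_scores(msg), "-", msg)
```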

  1. Development to requirements for a procedures software tool

    International Nuclear Information System (INIS)

    Yasutake, J.Y.; Hachiro Isoda

    1993-01-01

    In 1989, the Electric Power Research Institute (EPRI) and the Central Research Institute of the Electric Power Industry (CRIEPI) in Japan initiated a joint research program to investigate various interventions to reduce personnel errors and inefficiencies in the maintenance of nuclear power plants. This program, consisting of several interrelated projects, was initiated because of the mutual recognition of the importance of the human element in the efficient and safe operation of utilities and the continuing need to enhance personnel performance to sustain plant safety and availability. This paper summarizes one of the projects, jointly funded by EPRI and CRIEPI, to analyze the requirements for, and prepare a functional description of, a procedures software tool (PST). The primary objective of this project was to develop a description of the features and functions of a software tool that would help procedure writers to improve the quality of maintenance and testing procedures, thereby enhancing the performance of both procedure writers and maintenance personnel

  2. An evaluation of software tools for the design and development of cockpit displays

    Science.gov (United States)

    Ellis, Thomas D., Jr.

    1993-01-01

    The use of all-glass cockpits at the NASA Langley Research Center (LaRC) simulation facility has changed the means of design, development, and maintenance of instrument displays. The human-machine interface has evolved from a physical hardware device to a software-generated electronic display system. This has subsequently caused an increased workload at the facility. As computer processing power increases and the glass cockpit becomes predominant in facilities, software tools used in the design and development of cockpit displays are becoming both feasible and necessary for a more productive simulation environment. This paper defines LaRC requirements of a display software development tool and compares two available applications against these requirements. As a part of the software engineering process, these tools reduce development time, provide a common platform for display development, and produce exceptional real-time results.

  3. TEST (Toxicity Estimation Software Tool) Ver 4.1

    Science.gov (United States)

    The Toxicity Estimation Software Tool (T.E.S.T.) has been developed to allow users to easily estimate toxicity and physical properties using a variety of QSAR methodologies. T.E.S.T. allows a user to estimate toxicity without requiring any external programs. Users can input a chem...

  4. Development of tools for safety analysis of control software in advanced reactors

    International Nuclear Information System (INIS)

    Guarro, S.; Yau, M.; Motamed, M.

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described

  5. Development of tools for safety analysis of control software in advanced reactors

    Energy Technology Data Exchange (ETDEWEB)

    Guarro, S.; Yau, M.; Motamed, M. [Advanced Systems Concepts Associates, El Segundo, CA (United States)

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described.

  6. Architecture independent environment for developing engineering software on MIMD computers

    Science.gov (United States)

    Valimohamed, Karim A.; Lopez, L. A.

    1990-01-01

    Engineers are constantly faced with solving problems of increasing complexity and detail. Multiple Instruction stream Multiple Data stream (MIMD) computers have been developed to overcome the performance limitations of serial computers. The hardware architectures of MIMD computers vary considerably and are much more sophisticated than serial computers. Developing large scale software for a variety of MIMD computers is difficult and expensive. There is a need to provide tools that facilitate programming these machines. First, the issues that must be considered to develop those tools are examined. The two main areas of concern were architecture independence and data management. Architecture independent software facilitates software portability and improves the longevity and utility of the software product. It provides some form of insurance for the investment of time and effort that goes into developing the software. The management of data is a crucial aspect of solving large engineering problems. It must be considered in light of the new hardware organizations that are available. Second, the functional design and implementation of a software environment that facilitates developing architecture independent software for large engineering applications are described. The topics of discussion include: a description of the model that supports the development of architecture independent software; identifying and exploiting concurrency within the application program; data coherence; engineering data base and memory management.

  7. An Open-Source Tool Set Enabling Analog-Digital-Software Co-Design

    Directory of Open Access Journals (Sweden)

    Michelle Collins

    2016-02-01

    Full Text Available This paper presents an analog-digital hardware-software co-design environment for simulating and programming reconfigurable systems. The tool simulates and designs, as well as enables experimental measurements after compiling to configurable systems, in the same integrated design tool framework. High-level software in Scilab/Xcos (open-source programs similar to MATLAB/Simulink) converts the user's high-level block description to blif format (sci2blif), which acts as an input to the modified VPR tool, including the code vpr2swcs, encoding the specific platform through specific architecture files and resulting in a targetable switch list on the resulting configurable analog-digital system. The resulting tool uses an analog and mixed-signal library of components, enabling users and future researchers to access the basic analog operations/computations that are possible.

  8. A Reference Architecture for Providing Tools as a Service to Support Global Software Development

    DEFF Research Database (Denmark)

    Chauhan, Aufeef

    2014-01-01

    Global Software Development (GSD) teams encounter challenges that are associated with distribution of software development activities across multiple geographic regions. The limited support for performing collaborative development and engineering activities and lack of sufficient support ... -based solutions. The restricted ability of the organizations to have desired alignment of tools with software engineering and development processes results in administrative and managerial overhead that incurs increased development cost and poor product quality. Moreover, stakeholders involved in the projects have ... -computing paradigm for addressing the above-mentioned issues by providing a framework to select appropriate tools as well as associated services, and a reference architecture of the cloud-enabled middleware platform that allows on-demand provisioning of software engineering Tools as a Service (TaaS) with focus ...

  9. Classroom Live: a software-assisted gamification tool

    Science.gov (United States)

    de Freitas, Adrian A.; de Freitas, Michelle M.

    2013-06-01

    Teachers have come to rely on a variety of approaches in order to elicit and sustain student interest in the classroom. One particular approach, known as gamification, seeks to improve student engagement by transforming the traditional classroom experience into a competitive multiplayer game. Initial attempts at classroom gamification relied on the teacher manually tracking student progress. At the US Air Force Academy, we wanted to experiment with a software gamification tool. Our client/server suite, dubbed Classroom Live, streamlines the gamification process for the teacher by simplifying common tasks. Simultaneously, the tool provides students with an esthetically pleasing user interface that offers in game rewards in exchange for their participation. Classroom Live is still in development, but our initial experience using the tool has been extremely positive and confirms our belief that students respond positively to gamification, even at the undergraduate level.

  10. DAE Tools: equation-based object-oriented modelling, simulation and optimisation software

    Directory of Open Access Journals (Sweden)

    Dragan D. Nikolić

    2016-04-01

    Full Text Available In this work, DAE Tools modelling, simulation and optimisation software, its programming paradigms and main features are presented. The current approaches to mathematical modelling, such as the use of modelling languages and general-purpose programming languages, are analysed. The common set of capabilities required by typical simulation software is discussed, and the shortcomings of the current approaches are recognised. A new hybrid approach is introduced, and the modelling languages and the hybrid approach are compared in terms of the grammar, compiler, parser and interpreter requirements, maintainability and portability. The most important characteristics of the new approach are discussed, such as: (1) support for runtime model generation; (2) support for runtime simulation set-up; (3) support for complex runtime operating procedures; (4) interoperability with third-party software packages (i.e. NumPy/SciPy); (5) suitability for embedding and use as a web application or software as a service; and (6) code-generation, model exchange and co-simulation capabilities. The benefits of an equation-based approach to modelling, implemented in a fourth-generation object-oriented general-purpose programming language such as Python, are discussed. The architecture and the software implementation details, as well as the type of problems that can be solved using DAE Tools software, are described. Finally, some applications of the software at different levels of abstraction are presented, and its embedding capabilities and suitability for use as software as a service are demonstrated.
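
    DAE Tools' own Python API is not reproduced here, but the flavor of equation-based modelling in Python can be shown with a generic model solved by SciPy: the modeller states the equations and hands them to a solver rather than coding an integration loop.

```python
import numpy as np
from scipy.integrate import solve_ivp

def tank_model(t, y, k_in, k_out):
    """Equation-based model of a draining tank: dV/dt = k_in - k_out * sqrt(V)."""
    V = y[0]
    return [k_in - k_out * np.sqrt(max(V, 0.0))]

sol = solve_ivp(tank_model, (0.0, 200.0), [1.0], args=(0.5, 0.2), max_step=0.5)
print(f"steady-state volume ~ {sol.y[0, -1]:.2f}")  # analytic: (k_in/k_out)^2 = 6.25
```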

  11. Can agile software tools bring the benefits of a task board to globally distributed teams?

    NARCIS (Netherlands)

    Katsma, Christiaan; Amrit, Chintan Amrit; van Hillegersberg, Jos; Sikkel, Nicolaas; Oshri, Ilan; Kotlarsky, Julia; Willcocks, Leslie P.

    Software-based tooling has become an essential part of globally distributed software development. In this study we focus on the usage of such tools, and task boards in particular. We investigate the deployment of these tools through field research in 4 different companies that feature agile and

  12. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens

    2004-01-01

    This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended to be used as part of an industrial systems design cycle. Structural analysis is a graph-based technique where principal relations between variables express the system's ... The list of such variables and functional relations constitute the system's structure graph. Normal operation means all functional relations are intact. Should faults occur, one or more functional relations cease to be valid. In a structure graph, this is seen as the disappearance of one or more nodes ... of the graph. SaTool makes analysis of the structure graph to provide knowledge about fundamental properties of the system in normal and faulty conditions. Salient features of SaTool include rapid analysis of the possibility to diagnose faults and the ability to make autonomous recovery should faults occur.

  13. CLAIRE, an event-driven simulation tool for testing software

    International Nuclear Information System (INIS)

    Raguideau, J.; Schoen, D.; Henry, J.Y.; Boulc'h, J.

    1994-06-01

    CLAIRE is a software tool created to perform validations on executable codes or on specifications of distributed real-time applications for nuclear safety. CLAIRE can be used both to verify the safety properties by modelling the specifications, and also to validate the final code by simulating the behaviour of its equipment and software interfaces. It can be used to observe and provide dynamic control of the simulation process, and also to record changes to the simulated data for off-line analysis. (R.P.)
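
    The event-driven core such a simulator needs can be sketched with a priority queue of timestamped events, each of which may schedule further events; the events below are invented, and CLAIRE itself is not reproduced.

```python
import heapq

def run(initial_events, horizon):
    """initial_events: (time, name, action) tuples; an action may return new events."""
    queue = list(initial_events)
    heapq.heapify(queue)
    while queue:
        t, name, action = heapq.heappop(queue)
        if t > horizon:
            break
        print(f"t={t:5.2f}  {name}")
        for event in action(t) or []:
            heapq.heappush(queue, event)

def sensor_scan(t):
    # a periodic equipment-interface scan that re-schedules itself
    return [(t + 2.0, "sensor_scan", sensor_scan)]

run([(0.0, "sensor_scan", sensor_scan),
     (3.0, "valve_command", lambda t: None)], horizon=7.0)
```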

  14. Free Software and Free Textbooks

    Science.gov (United States)

    Takhteyev, Yuri

    2012-01-01

    Some of the world's best and most sophisticated software is distributed today under "free" or "open source" licenses, which allow the recipients of such software to use, modify, and share it without paying royalties or asking for permissions. If this works for software, could it also work for educational resources, such as books? The economics of…

  15. Towards early software reliability prediction for computer forensic tools (case study).

    Science.gov (United States)

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component based system. It is used, for instance, to analyze the reliability of the state machines of real time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
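
    The Markov-chain machinery the paper builds on can be sketched for a tiny architecture: components are chain states, control transfers with known probabilities, and each component succeeds with its own reliability (a Cheung-style model). The numbers are hypothetical, not from the Forensic Toolkit Imager case study.

```python
import numpy as np

P = np.array([[0.0, 0.7, 0.3],        # control-flow probabilities between
              [0.0, 0.0, 1.0],        # three components; component 2 is the
              [0.0, 0.0, 0.0]])       # terminal (exit) component
R = np.array([0.999, 0.995, 0.990])   # per-component reliabilities

Q = R[:, None] * P                    # a transfer happens only if the source works
N = np.linalg.inv(np.eye(3) - Q)      # expected visit counts (fundamental matrix)
system_reliability = N[0, -1] * R[-1] # reach the exit component, and it works too
print(f"system reliability ~ {system_reliability:.4f}")
```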

  16. The GLEaMviz computational tool, a publicly available software to explore realistic epidemic spreading scenarios at the global scale

    Directory of Open Access Journals (Sweden)

    Quaggiotto Marco

    2011-02-01

    Full Text Available Abstract Background Computational models play an increasingly important role in the assessment and control of public health crises, as demonstrated during the 2009 H1N1 influenza pandemic. Much research has been done in recent years in the development of sophisticated data-driven models for realistic computer-based simulations of infectious disease spreading. However, only a few computational tools are presently available for assessing scenarios, predicting epidemic evolutions, and managing health emergencies that can benefit a broad audience of users including policy makers and health institutions. Results We present "GLEaMviz", a publicly available software system that simulates the spread of emerging human-to-human infectious diseases across the world. The GLEaMviz tool comprises three components: the client application, the proxy middleware, and the simulation engine. The latter two components constitute the GLEaMviz server. The simulation engine leverages on the Global Epidemic and Mobility (GLEaM) framework, a stochastic computational scheme that integrates worldwide high-resolution demographic and mobility data to simulate disease spread on the global scale. The GLEaMviz design aims at maximizing flexibility in defining the disease compartmental model and configuring the simulation scenario; it allows the user to set a variety of parameters including: compartment-specific features, transition values, and environmental effects. The output is a dynamic map and a corresponding set of charts that quantitatively describe the geo-temporal evolution of the disease. The software is designed as a client-server system. The multi-platform client, which can be installed on the user's local machine, is used to set up simulations that will be executed on the server, thus avoiding specific requirements for large computational capabilities on the user side. Conclusions The user-friendly graphical interface of the GLEaMviz tool, along with its high level
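
    Underneath the mobility coupling, GLEaM's disease dynamics are compartmental; a single-population sketch of such a compartmental (here SEIR) model shows the flavor, with illustrative parameters rather than GLEaMviz defaults.

```python
from scipy.integrate import solve_ivp

def seir(t, y, beta, sigma, gamma):
    """SEIR compartments: returns dS, dE, dI, dR per unit time."""
    S, E, I, R = y
    N = S + E + I + R
    return [-beta * S * I / N,
            beta * S * I / N - sigma * E,
            sigma * E - gamma * I,
            gamma * I]

y0 = [1e6 - 10, 0.0, 10.0, 0.0]        # nearly fully susceptible population
sol = solve_ivp(seir, (0, 300), y0, args=(0.3, 1/3, 1/5), max_step=1.0)
print(f"final attack rate: {sol.y[3, -1] / 1e6:.1%}")
```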

  17. Advantages and Disadvantages in Image Processing with Free Software in Radiology.

    Science.gov (United States)

    Mujika, Katrin Muradas; Méndez, Juan Antonio Juanes; de Miguel, Andrés Framiñan

    2018-01-15

    Currently, there are sophisticated applications that make it possible to visualize medical images and even to manipulate them. These software applications are of great interest, both from a teaching and a radiological perspective. In addition, some of these applications are known as Free Open Source Software because they are free and their source code is freely available, so they can be easily obtained even on personal computers. Two examples of free open source software are Osirix Lite® and 3D Slicer®. However, this last group of free applications has limitations in its use. For the radiological field, manipulating and post-processing images is increasingly important. Consequently, sophisticated computing tools that combine software and hardware to process medical images are needed. In radiology, graphic workstations allow their users to process, review, analyse, communicate and exchange multidimensional digital images acquired with different image-capturing radiological devices. These radiological devices are basically CT (Computerised Tomography), MRI (Magnetic Resonance Imaging), PET (Positron Emission Tomography), etc. Nevertheless, the programs included in these workstations have a high cost which always depends on the software provider and is always subject to its norms and requirements. With this study, we aim to present the advantages and disadvantages of these radiological image visualization systems in the advanced management of radiological studies. We will compare the features of the VITREA2® and AW VolumeShare 5® radiology workstations with free open source software applications like OsiriX® and 3D Slicer®, with examples from specific studies.

  18. User Driven Development of Software Tools for Open Data Discovery and Exploration

    Science.gov (United States)

    Schlobinski, Sascha; Keppel, Frank; Dihe, Pascal; Boot, Gerben; Falkenroth, Esa

    2016-04-01

    The use of open data in research faces challenges not restricted to inherent properties such as data quality and the resolution of open data sets. Often open data is catalogued insufficiently or is fragmented. Software tools that support effective discovery, including assessment of the data's appropriateness for research, have shortcomings such as the lack of essential functionalities like support for data provenance. We believe that one of the reasons is the neglect of real end users' requirements in the development process of the aforementioned software tools. In the context of the FP7 Switch-On project we have pro-actively engaged the relevant user community to collaboratively develop a means to publish, find and bind open data relevant for hydrologic research. Implementing key concepts of data discovery and exploration, we have used state-of-the-art web technologies to provide an interactive software tool that is easy to use yet powerful enough to satisfy the data discovery and access requirements of the hydrological research community.

  19. Performance Evaluation of a Software Engineering Tool for Automated Design of Cooling Systems in Injection Moulding

    DEFF Research Database (Denmark)

    Jauregui-Becker, Juan M.; Tosello, Guido; van Houten, Fred J.A.M.

    2013-01-01

    This paper presents a software tool for automating the design of cooling systems for injection moulding and a validation of its performance. Cooling system designs were automatically generated by the proposed software tool and by applying a best practice tool engineering design approach. The two...

  20. Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages

    Science.gov (United States)

    Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro

    2017-01-01

    Computer Assisted Qualitative Data Analysis Software (CAQDAS) are tools that help researchers to develop qualitative research projects. These software packages help the users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…

  1. RAVEN AS A TOOL FOR DYNAMIC PROBABILISTIC RISK ASSESSMENT: SOFTWARE OVERVIEW

    Energy Technology Data Exchange (ETDEWEB)

    Alfonsi Andrea; Mandelli Diego; Rabiti Cristian; Joshua Cogliati; Robert Kinoshita

    2013-05-01

    RAVEN is a software tool under development at the Idaho National Laboratory (INL) that acts as the control logic driver and post-processing tool for the newly developed thermal-hydraulic code RELAP-7. The scope of this paper is to show the software structure of RAVEN and its utilization in connection with RELAP-7. A short overview of the mathematical framework behind the code is presented along with its main capabilities such as on-line controlling/monitoring and Monte-Carlo sampling. A demo of a Station Blackout PRA analysis of a simplified Pressurized Water Reactor (PWR) model is shown in order to demonstrate the Monte-Carlo and clustering capabilities.
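
    The Monte-Carlo capability can be illustrated with a toy stand-in for the dispatched physics runs: sample the uncertain parameters, evaluate a success criterion per sample, and estimate a failure probability. The distributions below are invented; RAVEN actually drives RELAP-7 simulations.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
battery_life = rng.normal(4.0, 0.5, n)     # hours of DC power after blackout
recovery_time = rng.exponential(3.0, n)    # hours to restore AC power

failure = recovery_time > battery_life     # toy stand-in success criterion
p = failure.mean()
print(f"P(failure) ~ {p:.3f} +/- {np.sqrt(p * (1 - p) / n):.3f}")
```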

  2. WannierTools: An open-source software package for novel topological materials

    Science.gov (United States)

    Wu, QuanSheng; Zhang, ShengNan; Song, Hai-Feng; Troyer, Matthias; Soluyanov, Alexey A.

    2018-03-01

    We present an open-source software package, WannierTools, a tool for the investigation of novel topological materials. This code works in the tight-binding framework, which can be generated by another software package, Wannier90 (Mostofi et al., 2008). It can help to classify the topological phase of a given material by calculating the Wilson loop, and can obtain the surface-state spectrum, which is detected in angle-resolved photoemission (ARPES) and scanning tunneling microscopy (STM) experiments. It also identifies positions of Weyl/Dirac points and nodal-line structures, and calculates the Berry phase around a closed momentum loop and the Berry curvature in a part of the Brillouin zone (BZ).
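
    The Berry phase computation mentioned here has a compact discretized form, gamma = -Im ln prod_j <u_j|u_{j+1}> around the closed loop; the sketch below evaluates it for a toy two-band model, not via WannierTools itself (which works from Wannier90 tight-binding input).

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])

def berry_phase(hamiltonian, loop):
    """Discretized Berry phase of the lowest band around a closed k-loop."""
    vecs = [np.linalg.eigh(hamiltonian(k))[1][:, 0] for k in loop]
    vecs.append(vecs[0])                      # close the loop (gauge-invariant)
    overlaps = [np.vdot(a, b) for a, b in zip(vecs[:-1], vecs[1:])]
    return -np.imag(np.log(np.prod(overlaps)))

# toy two-band model winding once around a band degeneracy
H = lambda k: np.cos(k) * sx + np.sin(k) * sy
loop = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
print(berry_phase(H, loop))                   # -> close to +/- pi
```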

  3. Evaluation of static analysis tools used to assess software important to nuclear power plant safety

    Energy Technology Data Exchange (ETDEWEB)

    Ourghanlian, Alain [EDF Lab CHATOU, Simulation and Information Technologies for Power Generation Systems Department, EDF R and D, Cedex (France)

    2015-03-15

    We describe a comparative analysis of different tools used to assess safety-critical software used in nuclear power plants. To enhance the credibility of safety assessments and to optimize safety justification costs, Electricité de France (EDF) investigates the use of methods and tools for source code semantic analysis, to obtain indisputable evidence and help assessors focus on the most critical issues. EDF has been using the PolySpace tool for more than 10 years. Currently, new industrial tools based on the same formal approach, Abstract Interpretation, are available. Practical experimentation with these new tools shows that the precision obtained on one of our shutdown systems software packages is substantially improved. In the first part of this article, we present the analysis principles of the tools used in our experimentation. In the second part, we present the main characteristics of protection-system software, and why these characteristics are well adapted for the new analysis tools.
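
    The Abstract Interpretation idea behind such tools can be shown in miniature: execute a code fragment over intervals instead of concrete values, so a single analysis covers every possible run. This toy is an illustration only, not any vendor's product.

```python
class Interval:
    """Toy interval domain for abstract interpretation of + and *."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __mul__(self, other):
        corners = [a * b for a in (self.lo, self.hi) for b in (other.lo, other.hi)]
        return Interval(min(corners), max(corners))
    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

# x in [-2, 3], y in [1, 4]: can x*y + x ever overflow a signed 8-bit range?
x, y = Interval(-2, 3), Interval(1, 4)
result = x * y + x
print(result, "- no int8 overflow possible"
      if -128 <= result.lo and result.hi <= 127 else "- overflow possible")
```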

  4. Integrated management tool for controls software problems, requests and project tasking at SLAC

    International Nuclear Information System (INIS)

    Rogind, D.; Allen, W.; Colocho, W.; DeContreras, G.; Gordon, J.; Pandey, P.; Shoaee, H.

    2012-01-01

    The Accelerator Directorate (AD) Instrumentation and Controls (ICD) Software (SW) Department at SLAC, with its service center model, continuously receives engineering requests to design, build and support controls for accelerator systems lab-wide. Each customer request can vary in complexity from a small software engineering change to a major enhancement. SLAC's Accelerator Improvement Projects (AIPs), along with DOE Construction projects, also contribute heavily to the work load. The various customer requests and projects, paired with the ongoing operational maintenance and problem reports, place a demand on the department that consistently exceeds the capacity of available resources. A centralized repository - comprised of all requests, project tasks, and problems - available to physicists, operators, managers, and engineers alike, is essential to capture, communicate, prioritize, assign, schedule, track, and finally, commission all work components. The Software Department has recently integrated request / project tasking into SLAC's custom online problem tracking 'Comprehensive Accelerator Tool for Enhancing Reliability' (CATER) tool. This paper discusses the newly implemented software request management tool - the workload it helps to track, its structure, features, reports, work-flow and its many usages. (authors)

  5. Software engineering and data management for automated payload experiment tool

    Science.gov (United States)

    Maddux, Gary A.; Provancha, Anna; Chattam, David

    1994-01-01

    The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by University of Alabama in Huntsville (UAH) and provide versions of the software in a Macintosh and Windows compatible format. Appendix 1 science requirements document (SRD) Users Manual is attached.

  6. A portable software tool for computing digitally reconstructed radiographs

    International Nuclear Information System (INIS)

    Chaney, Edward L.; Thorn, Jesse S.; Tracton, Gregg; Cullip, Timothy; Rosenman, Julian G.; Tepper, Joel E.

    1995-01-01

    Purpose: To develop a portable software tool for fast computation of digitally reconstructed radiographs (DRR) with a friendly user interface and versatile image format and display options. To provide a means for interfacing with commercial and custom three-dimensional (3D) treatment planning systems. To make the tool freely available to the Radiation Oncology community. Methods and Materials: A computer program for computing DRRs was enhanced with new features and rewritten to increase computational efficiency. A graphical user interface was added to improve ease of data input and DRR display. Installer, programmer, and user manuals were written, and installation test data sets were developed. The code conforms to the specifications of the Cooperative Working Group (CWG) of the National Cancer Institute (NCI) Contract on Radiotherapy Treatment Planning Tools. Results: The interface allows the user to select DRR input data and image formats primarily by point-and-click mouse operations. Digitally reconstructed radiograph formats are predefined by configuration files that specify 19 calculation parameters. Enhancements include improved contrast resolution for visualizing surgical clips, an extended source model to simulate the penumbra region in a computed port film, and the ability to easily modify the CT numbers of objects contoured on the planning computed tomography (CT) scans. Conclusions: The DRR tool can be used with 3D planning systems that lack this functionality, or perhaps improve the quality and functionality of existing DRR software. The tool can be interfaced to 3D planning systems that run on most modern graphics workstations, and can also function as a stand-alone program
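
    For a parallel-beam geometry the core DRR computation is just line integrals of attenuation through the CT volume, I = I0 * exp(-sum(mu * dz)); the sketch below shows that kernel on a synthetic volume, while the described tool additionally models divergent beams and an extended source.

```python
import numpy as np

def drr_parallel(ct_hu, voxel_mm, mu_water=0.02):
    """Parallel-beam DRR: transmitted fraction per pixel, rays along the z axis."""
    mu = mu_water * (1.0 + ct_hu / 1000.0)   # HU -> linear attenuation (1/mm)
    mu = np.clip(mu, 0.0, None)
    path = mu.sum(axis=2) * voxel_mm         # line integral per (x, y) ray
    return np.exp(-path)

ct = np.full((64, 64, 64), -1000.0)          # synthetic volume: air everywhere
ct[24:40, 24:40, 16:48] = 0.0                # a 32 mm water block in the middle
image = drr_parallel(ct, voxel_mm=1.0)
print(image.min())                           # ~0.53: darkest ray, through water
```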

  7. Automated software development tools in the MIS (Management Information Systems) environment

    Energy Technology Data Exchange (ETDEWEB)

    Arrowood, L.F.; Emrich, M.L.

    1987-09-11

    Quantitative and qualitative benefits can be obtained through the use of automated software development tools. Such tools are best utilized when they complement existing procedures and standards. They can assist systems analysts and programmers with project specification, design, implementation, testing, and documentation. Commercial products have been evaluated to determine their efficacy. User comments have been included to illustrate actual benefits derived from introducing these tools into MIS organizations.

  8. An expert system based software sizing tool, phase 2

    Science.gov (United States)

    Friedlander, David

    1990-01-01

    A software tool was developed for predicting the size of a future computer program at an early stage in its development. The system is intended to enable a user who is not expert in Software Engineering to estimate software size in lines of source code with an accuracy similar to that of an expert, based on the program's functional specifications. The project was planned as a knowledge based system with a field prototype as the goal of Phase 2 and a commercial system planned for Phase 3. The researchers used techniques from Artificial Intelligence and knowledge from human experts and existing software from NASA's COSMIC database. They devised a classification scheme for the software specifications, and a small set of generic software components that represent complexity and apply to large classes of programs. The specifications are converted to generic components by a set of rules and the generic components are input to a nonlinear sizing function which makes the final prediction. The system developed for this project predicted code sizes from the database with a bias factor of 1.06 and a fluctuation factor of 1.77, an accuracy similar to that of human experts but without their significant optimistic bias.

  9. Northwestern University Schizophrenia Data and Software Tool (NUSDAST)

    Directory of Open Access Journals (Sweden)

    Lei Wang

    2013-11-01

    Full Text Available The schizophrenia research community has invested substantial resources on collecting, managing and sharing large neuroimaging datasets. As part of this effort, our group has collected high resolution magnetic resonance (MR) datasets from individuals with schizophrenia, their non-psychotic siblings, healthy controls and their siblings. This effort has resulted in a growing resource, the Northwestern University Schizophrenia Data and Software Tool (NUSDAST), an NIH-funded data sharing project to stimulate new research. This resource resides on XNAT Central, and it contains neuroimaging (MR scans, landmarks and surface maps for deep subcortical structures, and FreeSurfer cortical parcellation and measurement data), cognitive (cognitive domain scores for crystallized intelligence, working memory, episodic memory, and executive function), clinical (demographic, sibling relationship, SAPS and SANS psychopathology), and genetic (20 polymorphisms) data, collected from more than 450 subjects, most with 2-year longitudinal follow-up. A neuroimaging mapping, analysis and visualization software tool, CAWorks, is also part of this resource. Moreover, in making our existing neuroimaging data along with the associated meta-data and computational tools publicly accessible, we have established a web-based information retrieval portal that allows the user to efficiently search the collection. This research-ready dataset meaningfully combines neuroimaging data with other relevant information, and it can be used to help facilitate advancing neuroimaging research. It is our hope that this effort will help to overcome some of the commonly recognized technical barriers in advancing neuroimaging research, such as the lack of local organization and standard descriptions.

  10. Software tool for horizontal-axis wind turbine simulation

    Energy Technology Data Exchange (ETDEWEB)

    Vitale, A.J. [Instituto Argentino de Oceanografia, Camino La Carrindanga Km. 7, 5 CC 804, B8000FWB Bahia Blanca (Argentina); Universidad Tecnologica Nacional Facultad Regional Bahia Blanca, GESE, 11 de Abril 461, B8000LMI Bahia Blanca (Argentina); Dpto. de Ing. Electrica y de Computadoras, Universidad Nacional del Sur, Av. Alem 1253, 8000 Bahia Blanca (Argentina); Rossi, A.P. [Universidad Tecnologica Nacional Facultad Regional Bahia Blanca, GESE, 11 de Abril 461, B8000LMI Bahia Blanca (Argentina); Dpto. de Ing. Electrica y de Computadoras, Universidad Nacional del Sur, Av. Alem 1253, 8000 Bahia Blanca (Argentina)

    2008-07-15

    The main problem of a wind turbine generator design project is the design of the right blades capable of satisfying the specific energy requirement of an electric system with optimum performance. Once the blade has been designed for optimum operation at a particular rotor angular speed, it is necessary to determine the overall performance of the rotor under the range of wind speed that it will encounter. A software tool that simulates low-power, horizontal-axis wind turbines was developed for this purpose. With this program, the user can calculate the rotor power output for any combination of wind and rotor speeds, with definite blade shape and airfoil characteristics. The software also provides information about distribution of forces along the blade span, for different operational conditions. (author)
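
    The core calculation such a simulator performs can be illustrated with standard rotor aerodynamics: power follows P = ½ρAC_p(λ)v³, with the power coefficient depending on the tip-speed ratio λ = ωR/v. A minimal sketch in Python, with a purely illustrative C_p curve (the paper's blade-element model is more detailed):

```python
import math

RHO = 1.225  # air density at sea level, kg/m^3

def rotor_power(v_wind, omega, radius, cp_curve):
    """Rotor power output for one combination of wind and rotor speeds."""
    tsr = omega * radius / v_wind        # tip-speed ratio, lambda
    area = math.pi * radius ** 2         # swept rotor area
    return 0.5 * RHO * area * cp_curve(tsr) * v_wind ** 3

# Crude illustrative Cp(lambda) curve peaking near lambda = 7.
cp_demo = lambda tsr: max(0.0, 0.45 - 0.012 * (tsr - 7.0) ** 2)

# 2.5 m blades at 20 rad/s in an 8 m/s wind.
print(f"{rotor_power(8.0, 20.0, 2.5, cp_demo) / 1e3:.1f} kW")
```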

  11. About new software and hardware tools in the education of 'Semiconductor Devices'

    International Nuclear Information System (INIS)

    Taneva, Ljudmila; Basheva, Bistra

    2009-01-01

    This paper describes the new tools used in the education of “Semiconductor Devices”, developed at the Technological School “Electronic Systems”, Department of the Technical University, Sofia. The software and hardware tools give the opportunity to achieve the right balance between theory and practice, and the students are given the chance to accumulate valuable “hands-on” skills. The main purpose of the developed lab exercises is to demonstrate the use of some electronic components and practice with them. Keywords: semiconductors, media software tool, hardware, education

  12. Sighten Final Technical Report DEEE0006690 Deploying an integrated and comprehensive solar financing software platform

    Energy Technology Data Exchange (ETDEWEB)

    O' Leary, Conlan [Sighten, Inc., San Francisco, CA (United States)

    2017-10-15

    Over the project, Sighten built a comprehensive software-as-a-service (SaaS) platform to automate and streamline the residential solar financing workflow. Before the project period, significant time and money were spent by companies on front-end tools related to system design and proposal creation, but comparatively few resources were available to support the many back-end calculations and data management processes that underpin third-party financing. Without a tool like Sighten, the solar financing process involved passing information from the homeowner prospect into separate tools for system design and financing, and later to reporting tools including Microsoft Excel, CRM software, in-house software, outside software, and offline, manual processes. Passing data between tools and attempting to connect disparate systems results in inefficiency and inaccuracy for the industry. Sighten was built to consolidate all financial and solar-related calculations in a single software platform. It significantly improves upon the accuracy of these calculations and exposes sophisticated new analysis tools, resulting in a rigorous, efficient and cost-effective toolset for scaling residential solar. Widely deploying a platform like Sighten’s significantly and immediately impacts the residential solar space in several important ways: 1) standardizing and improving the quality of all quantitative calculations involved in the residential financing process, most notably project finance, system production and reporting calculations; 2) representing a true step change in terms of reporting and analysis capabilities by maintaining more accurate data and exposing sophisticated tools around simulation, tranching, and financial reporting, among others, to all stakeholders in the space; 3) allowing a broader group of developers/installers/finance companies to access the capital markets by providing an out-of-the-box toolset that handles the execution of running investor capital through a

  13. COMPUTER TOOLS OF DYNAMIC MATHEMATIC SOFTWARE AND METHODICAL PROBLEMS OF THEIR USE

    OpenAIRE

    Olena V. Semenikhina; Maryna H. Drushliak

    2014-01-01

    The article presents the results of an analysis of the standard computer tools of dynamic mathematics software which are used in solving tasks, and of the tools on which a teacher can rely in the teaching of mathematics. The possibility of organizing experimental investigation of mathematical objects on the basis of these tools, of formulating new tasks with a limited number of tools, and of fast automated checking is discussed. Some methodological comments on the application of computer tools and ...

  14. Techniques and tools for measuring energy efficiency of scientific software applications

    International Nuclear Information System (INIS)

    Abdurachmanov, David; Elmer, Peter; Eulisse, Giulio; Knight, Robert; Niemi, Tapio; Pestana, Gonçalo; Khan, Kashif; Nurminen, Jukka K; Nyback, Filip; Ou, Zhonghong

    2015-01-01

    The scale of scientific High Performance Computing (HPC) and High Throughput Computing (HTC) has increased significantly in recent years, and is becoming sensitive to total energy use and cost. Energy-efficiency has thus become an important concern in scientific fields such as High Energy Physics (HEP). There has been a growing interest in utilizing alternate architectures, such as low power ARM processors, to replace traditional Intel x86 architectures. Nevertheless, even though such solutions have been successfully used in mobile applications with low I/O and memory demands, it is unclear if they are suitable and more energy-efficient in the scientific computing environment. Furthermore, there is a lack of tools and experience to derive and compare power consumption between the architectures for various workloads, and eventually to support software optimizations for energy efficiency. To that end, we have performed several physical and software-based measurements of workloads from HEP applications running on ARM and Intel architectures, and compare their power consumption and performance. We leverage several profiling tools (both in hardware and software) to extract different characteristics of the power use. We report the results of these measurements and the experience gained in developing a set of measurement techniques and profiling tools to accurately assess the power consumption for scientific workloads. (paper)
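
    A basic building block of such measurements is turning sampled power readings into energy and normalising by work done. A minimal sketch, assuming power samples from an external meter or software counter (the profiling tools used in the paper are not reproduced here):

```python
def energy_joules(power_samples, dt):
    """Integrate power readings (watts) sampled every dt seconds (trapezoidal)."""
    return sum((a + b) / 2.0 * dt
               for a, b in zip(power_samples, power_samples[1:]))

def joules_per_event(power_samples, dt, n_events):
    """Energy per processed event, a natural efficiency metric for HEP workloads."""
    return energy_joules(power_samples, dt) / n_events

readings = [95.0, 110.0, 120.0, 118.0, 101.0]   # watts, one reading per second
print(f"{energy_joules(readings, dt=1.0):.0f} J total, "
      f"{joules_per_event(readings, 1.0, n_events=2000):.3f} J/event")
```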

  15. Tool Support for Parametric Analysis of Large Software Simulation Systems

    Science.gov (United States)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
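
    To illustrate the n-factor combinatorial idea, the sketch below enumerates test cases that cover every 2-way parameter combination while holding the other parameters at nominal values. Parameter names are hypothetical; real covering-array generators (and the tool's Monte Carlo layer) pack combinations far more tightly:

```python
from itertools import combinations, product

def n_factor_cases(param_values, n=2):
    """One test case per n-way value combination, others held at nominal.

    A real covering array packs many combinations into each case; this
    brute-force version only illustrates the coverage criterion.
    """
    names = list(param_values)
    cases = []
    for group in combinations(names, n):
        for values in product(*(param_values[g] for g in group)):
            case = {name: param_values[name][0] for name in names}  # nominal
            case.update(dict(zip(group, values)))
            cases.append(case)
    return cases

# Hypothetical simulation parameters.
params = {"mass_kg": [100, 120], "thrust": [0.9, 1.0, 1.1], "drag": [0.02, 0.03]}
print(len(n_factor_cases(params, n=2)), "cases cover all 2-way interactions")
```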

  16. Software Tools for Electrical Quality Assurance in the LHC

    CERN Document Server

    Bednarek, Mateusz

    2011-01-01

    There are over 1600 superconducting magnet circuits in the LHC machine. Many of them consist of a large number of components electrically connected in series. This enhances the sensitivity of the whole circuits to electrical faults of individual components. Furthermore, circuits are equipped with a large number of instrumentation wires, which are exposed to accidental damage or swapping. In order to ensure safe operation, an Electrical Quality Assurance (ELQA) campaign is needed after each thermal cycle. Due to the complexity of the circuits, as well as their distant geographical distribution (tunnel of 27km circumference divided in 8 sectors), suitable software and hardware platforms had to be developed. The software combines an Oracle database, LabView data acquisition applications and PHP-based web follow-up tools. This paper describes the software used for the ELQA of the LHC.

  17. Software tools for electrical quality assurance in the LHC

    International Nuclear Information System (INIS)

    Bednarek, M.; Ludwin, J.

    2012-01-01

    There are over 1600 superconducting magnet circuits in the LHC machine. Many of them consist of a large number of components electrically connected in series. This enhances the sensitivity of the whole circuits to electrical faults of individual components. Furthermore, circuits are equipped with a large number of instrumentation wires, which are exposed to accidental damage or swapping. In order to ensure safe operation, an Electrical Quality Assurance (ELQA) campaign is needed after each thermal cycle. Due to the complexity of the circuits, as well as their distant geographical distribution (tunnel of 27 km circumference divided in 8 sectors), suitable software and hardware platforms had to be developed. The software combines an Oracle database, LabView data acquisition applications and PHP-based web follow-up tools. This paper describes the software used for the ELQA of the LHC. (authors)

  18. NuFTA: A CASE Tool for Automatic Software Fault Tree Analysis

    International Nuclear Information System (INIS)

    Yun, Sang Hyun; Lee, Dong Ah; Yoo, Jun Beom

    2010-01-01

    Software fault tree analysis (SFTA) is widely used for analyzing software requiring high reliability. In SFTA, experts predict failures of a system through HAZOP (Hazard and Operability study) or FMEA (Failure Mode and Effects Analysis) and draw software fault trees for those failures. The quality and cost of a software fault tree therefore depend on the knowledge and experience of the experts. This paper proposes a CASE tool, NuFTA, to assist experts in safety analysis. NuFTA automatically generates software fault trees from NuSCR formal requirements specifications. NuSCR is a formal specification language used for specifying the software requirements of the KNICS RPS (Reactor Protection System) in Korea. We used previously proposed SFTA templates to generate software fault trees automatically. NuFTA also generates logical formulae summarizing each failure's cause, and we plan to make use of these formulae through formal verification techniques.
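
    A software fault tree bottoms out in AND/OR gates over basic failure events, which is also what the logical formulae summarising a failure's cause express. A minimal sketch with hypothetical event names (NuFTA's NuSCR-derived templates are more elaborate):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Gate:
    kind: str                      # "AND" or "OR"
    children: List[object] = field(default_factory=list)  # Gates or event names

    def occurs(self, failed):
        hits = (c.occurs(failed) if isinstance(c, Gate) else c in failed
                for c in self.children)
        return all(hits) if self.kind == "AND" else any(hits)

# Hypothetical top event: trip signal not generated.
top = Gate("OR", ["sensor_stuck",
                  Gate("AND", ["trip_logic_error", "watchdog_miss"])])
print(top.occurs({"trip_logic_error", "watchdog_miss"}))  # True
```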

  19. Software Tools: A One-Semester Secondary School Computer Course.

    Science.gov (United States)

    Bromley, John; Lakatos, John

    1985-01-01

    Provides a course outline, describes equipment and teacher requirements, discusses student evaluation and course outcomes, and details the computer programs used in a high school course. The course is designed to teach students use of the microcomputer as a tool through hands-on experience with a variety of commercial software programs. (MBR)

  20. PyElph - a software tool for gel images analysis and phylogenetics

    Directory of Open Access Journals (Sweden)

    Pavel Ana Brânduşa

    2012-01-01

    Full Text Available Abstract Background This paper presents PyElph, a software tool which automatically extracts data from gel images, computes the molecular weights of the analyzed molecules or fragments, compares DNA patterns which result from experiments with molecular genetic markers and also generates phylogenetic trees computed by five clustering methods, using the information extracted from the analyzed gel image. The software can be successfully used for population genetics, phylogenetics, taxonomic studies and other applications which require gel image analysis. Researchers and students working in molecular biology and genetics would benefit greatly from the proposed software because it is free, open source, easy to use, has a friendly Graphical User Interface and does not depend on specific image acquisition devices like other commercial programs with similar functionalities do. Results PyElph is entirely implemented in Python, which is a very popular programming language among the bioinformatics community. It provides a very friendly Graphical User Interface which was designed in six steps that gradually lead to the results. The user is guided through the following steps: image loading and preparation, lane detection, band detection, molecular weights computation based on a molecular weight marker, band matching and, finally, the computation and visualization of phylogenetic trees. A strong point of the software is the visualization component for the processed data. The Graphical User Interface provides operations for image manipulation and highlights lanes, bands and band matching in the analyzed gel image. All the data and images generated in each step can be saved. The software has been tested on several DNA patterns obtained from experiments with different genetic markers. Examples of genetic markers which can be analyzed using PyElph are RFLP (Restriction Fragment Length Polymorphism), AFLP (Amplified Fragment Length Polymorphism) and RAPD
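
    The molecular-weight step such tools perform is typically a log-linear calibration against the marker lane: band migration is roughly linear in log(size) over the marker's range. A small sketch with a hypothetical ladder (PyElph's exact fitting method is not reproduced here):

```python
import numpy as np

def fit_marker(marker_px, marker_bp):
    """Calibrate log10(size) as a linear function of band migration (pixels)."""
    slope, intercept = np.polyfit(marker_px, np.log10(marker_bp), 1)
    return lambda px: 10 ** (slope * np.asarray(px) + intercept)

# Hypothetical ladder: band positions in pixels and fragment sizes in bp.
ladder_px = [120, 180, 260, 360, 480]
ladder_bp = [10000, 5000, 2000, 1000, 500]
size_of = fit_marker(ladder_px, ladder_bp)
print(f"band at 300 px ~ {size_of(300):.0f} bp")
```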

  1. Knowledge Architect : A Tool Suite for Managing Software Architecture Knowledge

    NARCIS (Netherlands)

    Liang, Peng; Jansen, Anton; Avgeriou, Paris

    2009-01-01

    Management of software architecture knowledge (AK) is vital for improving an organization’s architectural capabilities. To support the architecting process within our industrial partner: Astron, the Dutch radio astronomy institute, we implemented the Knowledge Architect (KA): a tool suite for

  2. Metabolic interrelationships software application: Interactive learning tool for intermediary metabolism

    NARCIS (Netherlands)

    A.J.M. Verhoeven (Adrie); M. Doets (Mathijs); J.M.J. Lamers (Jos); J.F. Koster (Johan)

    2005-01-01

    We developed and implemented the software application titled Metabolic Interrelationships as a self-learning and -teaching tool for intermediary metabolism. It is used by undergraduate medical students in an integrated organ systems-based and disease-oriented core curriculum, which

  3. Variation of densitometry on computed tomography in COPD--influence of different software tools.

    Directory of Open Access Journals (Sweden)

    Mark O Wielpütz

    Full Text Available Quantitative multidetector computed tomography (MDCT) as a potential biomarker is increasingly used for severity assessment of emphysema in chronic obstructive pulmonary disease (COPD). The aim of this study was to evaluate the user-independent measurement variability between five different fully-automatic densitometry software tools. MDCT and full-body plethysmography (including forced expiratory volume in 1 s and total lung capacity) were available for 49 patients with advanced COPD (age = 64±9 years, forced expiratory volume in 1 s = 31±6% predicted). Measurement variation regarding lung volume, emphysema volume, emphysema index, and mean lung density was evaluated for two scientific and three commercially available lung densitometry software tools designed to analyze MDCT from different scanner types. One scientific tool and one commercial tool failed to process most or all datasets, respectively, and were excluded. One scientific and another commercial tool analyzed 49 datasets, the remaining commercial tool 30. Lung volume, emphysema volume, emphysema index and mean lung density were significantly different amongst these three tools (p<0.001). Limits of agreement for lung volume were [-0.195, -0.052 l], [-0.305, -0.131 l], and [-0.123, -0.052 l] with correlation coefficients of r = 1.00 each. Limits of agreement for emphysema index were [-6.2, 2.9%], [-27.0, 16.9%], and [-25.5, 18.8%], with r = 0.79 to 0.98. Correlation of lung volume with total lung capacity was good to excellent (r = 0.77 to 0.91, p<0.001), but segmented lung volumes (6.7±1.3-6.8±1.3 l) were significantly lower than total lung capacity (7.7±1.7 l, p<0.001). Technical incompatibilities hindered evaluation of two of five tools. The remaining three showed significant measurement variation for emphysema, hampering quantitative MDCT as a biomarker in COPD. Follow-up studies should currently use identical software, and standardization efforts should encompass software as
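
    The quantities compared in the study are standard densitometry outputs. A minimal sketch of how they are computed from a segmented lung volume in Hounsfield units, assuming the commonly used -950 HU emphysema threshold (the five tools' exact settings may differ):

```python
import numpy as np

def densitometry(hu_volume, voxel_ml, threshold=-950):
    """Standard densitometry outputs from a segmented lung volume (HU values)."""
    lung = np.asarray(hu_volume, dtype=float)
    emph = lung < threshold                     # emphysematous voxels
    return {
        "lung_volume_l": lung.size * voxel_ml / 1000.0,
        "emphysema_volume_l": emph.sum() * voxel_ml / 1000.0,
        "emphysema_index_pct": 100.0 * emph.mean(),
        "mean_lung_density_hu": lung.mean(),
    }

# Toy volume: mostly normal lung tissue with some very low attenuation voxels.
toy = np.concatenate([np.full(9000, -870.0), np.full(1000, -975.0)])
print(densitometry(toy, voxel_ml=0.65))
```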

  4. Oxygen octahedra picker: A software tool to extract quantitative information from STEM images

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Yi, E-mail: y.wang@fkf.mpg.de; Salzberger, Ute; Sigle, Wilfried; Eren Suyolcu, Y.; Aken, Peter A. van

    2016-09-15

    In perovskite oxide based materials and hetero-structures there are often strong correlations between oxygen octahedral distortions and functionality. Thus, atomistic understanding of the octahedral distortion, which requires accurate measurements of atomic column positions, will greatly help to engineer their properties. Here, we report the development of a software tool to extract quantitative information of the lattice and of BO₆ octahedral distortions from STEM images. Center-of-mass and 2D Gaussian fitting methods are implemented to locate positions of individual atom columns. The precision of atomic column distance measurements is evaluated on both simulated and experimental images. The application of the software tool is demonstrated using practical examples. - Highlights: • We report a software tool for mapping atomic positions from HAADF and ABF images. • It enables quantification of both crystal lattice and oxygen octahedral distortions. • We test the measurement accuracy and precision on simulated and experimental images. • It works well for different orientations of perovskite structures and interfaces.
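
    Of the two localisation methods mentioned, intensity-weighted center of mass is the simpler; the sketch below refines an initial column guess inside a small window. Window size and background handling are illustrative, not the tool's actual implementation:

```python
import numpy as np

def refine_column(image, x0, y0, half_width=5):
    """Refine an atom-column position by intensity-weighted center of mass."""
    y0, x0 = int(round(y0)), int(round(x0))
    win = image[y0 - half_width:y0 + half_width + 1,
                x0 - half_width:x0 + half_width + 1].astype(float)
    win -= win.min()                     # crude local background removal
    ys, xs = np.mgrid[-half_width:half_width + 1, -half_width:half_width + 1]
    return x0 + (xs * win).sum() / win.sum(), y0 + (ys * win).sum() / win.sum()

# Synthetic test: a Gaussian peak placed off the integer grid at (40.3, 25.7).
yy, xx = np.mgrid[0:64, 0:64]
peak = np.exp(-((xx - 40.3) ** 2 + (yy - 25.7) ** 2) / 8.0)
print(refine_column(peak, 40, 26))       # close to (40.3, 25.7)
```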

  5. An Accurate FFPA-PSR Estimator Algorithm and Tool for Software Effort Estimation

    Directory of Open Access Journals (Sweden)

    Senthil Kumar Murugesan

    2015-01-01

    Full Text Available Software companies are now keen to provide secure software with respect to accuracy and reliability of their products, especially related to software effort estimation. Therefore, there is a need to develop a hybrid tool which provides all the necessary features. This paper attempts to propose a hybrid estimator algorithm and model which incorporates quality metrics, a reliability factor, and a security factor with a fuzzy-based function point analysis. Initially, this method utilizes a fuzzy-based estimate to control the uncertainty in the software size with the help of a triangular fuzzy set at the early development stage. Secondly, the function point analysis is extended by the security and reliability factors in the calculation. Finally, the performance metrics are added to the effort estimation for accuracy. The experimentation is done with different project data sets on the hybrid tool, and the results are compared with the existing models. It shows that the proposed method not only improves the accuracy but also increases the reliability, as well as the security, of the product.
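
    The triangular-fuzzy step can be illustrated by defuzzifying a (pessimistic, likely, optimistic) function-point triple per component and summing. The component triples below are illustrative only, not the paper's calibration:

```python
def defuzzify(low, mode, high):
    """Centroid of a triangular fuzzy number."""
    return (low + mode + high) / 3.0

def fuzzy_function_points(components):
    """Sum per-component (pessimistic, likely, optimistic) FP triples."""
    return sum(defuzzify(*triple) for triple in components)

# Illustrative unadjusted function-point triples for three components.
ufp = fuzzy_function_points([(3, 4, 6), (5, 7, 10), (2, 3, 5)])
print(f"crisp size estimate: {ufp:.1f} function points")
```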

  6. A SOFTWARE TOOL TO COMPARE MEASURED AND SIMULATED BUILDING ENERGY PERFORMANCE DATA

    Energy Technology Data Exchange (ETDEWEB)

    Maile, Tobias; Bazjanac, Vladimir; O' Donnell, James; Garr, Matthew

    2011-11-01

    Building energy performance is often inadequate when compared to design goals. To link design goals to actual operation one can compare measured with simulated energy performance data. Our previously developed comparison approach is the Energy Performance Comparison Methodology (EPCM), which enables the identification of performance problems based on a comparison of measured and simulated performance data. In the context of this method, we developed a software tool that provides graphing and data processing capabilities for the two performance data sets. The software tool, called SEE IT (Stanford Energy Efficiency Information Tool), eliminates the need for manual generation of data plots and data reformatting. SEE IT makes the generation of time series, scatter and carpet plots independent of the source of data (measured or simulated) and provides a valuable tool for comparing measurements with simulation results. SEE IT also allows assigning data points to a predefined building object hierarchy and supports different versions of simulated performance data. This paper briefly introduces the EPCM, describes the SEE IT tool and illustrates its use in the context of a building case study.

  7. A free software tool for the development of decision support systems

    Directory of Open Access Journals (Sweden)

    COLONESE, G

    2008-06-01

    Full Text Available This article describes PostGeoOlap, a free, open-source software tool for decision support that integrates OLAP (On-Line Analytical Processing) and GIS (Geographical Information Systems). Besides describing the tool, we show how it can be used to achieve effective and low-cost decision support that is adequate for small and medium companies and for small public offices.

  8. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Directory of Open Access Journals (Sweden)

    Jyri Pakarinen

    2010-01-01

    Full Text Available Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
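
    One of the basic measurements such an analysis tool automates is total harmonic distortion. A compact sketch, using FFT magnitudes at harmonic bins of a known fundamental (the toolkit's own analyses are broader):

```python
import numpy as np

def thd(signal, fs, f0, n_harmonics=10):
    """Total harmonic distortion relative to the fundamental at f0 (Hz)."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    bin_of = lambda f: int(round(f * len(signal) / fs))
    harmonics = [spectrum[bin_of(k * f0)] for k in range(2, n_harmonics + 1)
                 if bin_of(k * f0) < len(spectrum)]
    return np.sqrt(np.sum(np.square(harmonics))) / spectrum[bin_of(f0)]

fs, f0 = 48000, 1000.0
t = np.arange(fs) / fs                                      # one second
clipped = np.clip(np.sin(2 * np.pi * f0 * t), -0.7, 0.7)    # hard clipping
print(f"THD = {100 * thd(clipped, fs, f0):.1f} %")
```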

  9. Three Software Tools for Viewing Sectional Planes, Volume Models, and Surface Models of a Cadaver Hand.

    Science.gov (United States)

    Chung, Beom Sun; Chung, Min Suk; Shin, Byeong Seok; Kwon, Koojoo

    2018-02-19

    The hand anatomy, including the complicated hand muscles, can be grasped by using computer-assisted learning tools with high quality two-dimensional images and three-dimensional models. The purpose of this study was to present up-to-date software tools that promote learning of stereoscopic morphology of the hand. On the basis of horizontal sectioned images and outlined images of a male cadaver, vertical planes, volume models, and surface models were elaborated. Software to browse pairs of the sectioned and outlined images in orthogonal planes and software to peel and rotate the volume models, as well as a portable document format (PDF) file to select and rotate the surface models, were produced. All of the software tools were downloadable free of charge and usable off-line. The three types of tools for viewing multiple aspects of the hand could be adequately employed according to individual needs. These new tools involving the realistic images of a cadaver and the diverse functions are expected to improve comprehensive knowledge of the hand shape. © 2018 The Korean Academy of Medical Sciences.

  10. D-VASim: A Software Tool to Simulate and Analyze Genetic Logic Circuits

    DEFF Research Database (Denmark)

    Baig, Hasan; Madsen, Jan

    2016-01-01

    ... early-stage researchers with limited experience in the field of biology. The Solution: Using LabVIEW to develop a user-friendly simulation tool named Dynamic Virtual Analyzer and Simulator (D-VASim), which is the first software tool in the domain of synthetic biology that provides a virtual laboratory environment...

  11. Review of software tools for design and analysis of large scale MRM proteomic datasets.

    Science.gov (United States)

    Colangelo, Christopher M; Chung, Lisa; Bruce, Can; Cheung, Kei-Hoi

    2013-06-15

    Selective or Multiple Reaction Monitoring (SRM/MRM) is a liquid-chromatography (LC)/tandem-mass spectrometry (MS/MS) method that enables the quantitation of specific proteins in a sample by analyzing precursor ions and the fragment ions of their selected tryptic peptides. Instrumentation software has advanced to the point that thousands of transitions (pairs of primary and secondary m/z values) can be measured in a triple quadrupole instrument coupled to an LC, by a well-designed scheduling and selection of m/z windows. The design of a good MRM assay relies on the availability of peptide spectra from previous discovery-phase LC-MS/MS studies. The tedious aspect of manually developing and processing MRM assays involving thousands of transitions has spurred the development of software tools to automate this process. Software packages have been developed for project management, assay development, assay validation, data export, peak integration, quality assessment, and biostatistical analysis. No single tool provides a complete end-to-end solution, thus this article reviews the current state and discusses future directions of these software tools in order to enable researchers to combine these tools for a comprehensive targeted proteomics workflow. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.

  12. Programming heterogeneous MPSoCs tool flows to close the software productivity gap

    CERN Document Server

    Castrillón Mazo, Jerónimo

    2014-01-01

    This book provides embedded software developers with techniques for programming heterogeneous Multi-Processor Systems-on-Chip (MPSoCs), capable of executing multiple applications simultaneously. It describes a set of algorithms and methodologies to narrow the software productivity gap, as well as an in-depth description of the underlying problems and challenges of today’s programming practices. The authors present four different tool flows: a parallelism extraction flow for applications written using the C programming language, a mapping and scheduling flow for parallel applications, a special mapping flow for baseband applications in the context of Software Defined Radio (SDR), and a final flow for analyzing multiple applications at design time. The tool flows are evaluated on Virtual Platforms (VPs), which mimic different characteristics of state-of-the-art heterogeneous MPSoCs.   • Provides a novel set of algorithms and methodologies for programming heterogeneous Multi-Processor Systems-on-Chip (MPSoCs)...

  13. Integrated structural analysis tool using the linear matching method part 1 – Software development

    International Nuclear Information System (INIS)

    Ure, James; Chen, Haofeng; Tipping, David

    2014-01-01

    A number of direct methods based upon the Linear Matching Method (LMM) framework have been developed to address structural integrity issues for components subjected to cyclic thermal and mechanical load conditions. This paper presents a new integrated structural analysis tool using the LMM framework for the assessment of load carrying capacity, shakedown limit, ratchet limit and steady state cyclic response of structures. First, the development of the LMM for the evaluation of design limits in plasticity is introduced. Second, preliminary considerations for the development of the LMM into a tool which can be used on a regular basis by engineers are discussed. After the re-structuring of the LMM subroutines for multiple central processing unit (CPU) solution, the LMM software tool for the assessment of design limits in plasticity is implemented by developing an Abaqus CAE plug-in with graphical user interfaces. Further demonstration of this new LMM analysis tool including practical application and verification is presented in an accompanying paper. - Highlights: • A new structural analysis tool using the Linear Matching Method (LMM) is developed. • The software tool is able to evaluate the design limits in plasticity. • Able to assess limit load, shakedown, ratchet limit and steady state cyclic response. • Re-structuring of the LMM subroutines for multiple CPU solution is conducted. • The software tool is implemented by developing an Abaqus CAE plug-in with GUI

  14. Commissioning software tools at the Advanced Photon Source

    International Nuclear Information System (INIS)

    Emery, L.

    1995-01-01

    A software tool-oriented approach has been adopted in the commissioning of the Advanced Photon Source (APS) at Argonne National Laboratory, particularly in the commissioning of the Positron Accumulator Ring (PAR). The general philosophy is to decompose a complicated procedure involving measurement, data processing, and control into a series of simpler steps, each accomplished by a generic toolkit program. The implementation is greatly facilitated by adopting the SDDS (self-describing data set) protocol, which comes with its own toolkit. The combined toolkit has made accelerator physics measurements easier. For instance, the measurement of the optical functions of the PAR and the beamlines connected to it has been largely automated. Complicated measurements are feasible with a combination of tools running independently.

  15. Software Tools for In-Situ Documentation of Built Heritage

    Science.gov (United States)

    Smars, P.

    2013-07-01

    The paper presents open source software tools developed by the author to facilitate in-situ documentation of architectural and archæological heritage. The design choices are exposed and related to a general issue in conservation and documentation: taking decisions about a valuable object under threat. The question of the level of objectivity is central to the three steps of this process. It is our belief that in-situ documentation has to be favoured in this demanding context, full of potential discoveries. The very powerful surveying techniques in rapid development nowadays enhance our vision but often tend to bring a critical part of the documentation process back to the office. The software presented facilitates a direct treatment of the data on the site. Emphasis is given to flexibility, interoperability and simplicity. Key features of the software are listed and illustrated with examples (3D model of Gothic vaults, analysis of the shape of a column, deformation of a wall, direct interaction with AutoCAD).

  16. COMSY - A Software Tool for Aging and Plant Life Management

    International Nuclear Information System (INIS)

    Zander, Andre; Nopper, Helmut

    2012-01-01

    A plant-wide and systematic aging and plant life management is essential for the safe operation and/or availability of nuclear power plants. Aging Management (AM) has the objective to monitor and control degradation effects for safety relevant Systems, Structures and Components (SSCs) which may compromise safety functions of the plant. The Plant Life Management (PLM) methodology also includes aging surveillance for availability relevant SSCs. AM and PLM cover mechanical components, electrical and I and C systems, and civil structures. All aging and plant life management rules call for a comprehensive approach, requiring the systematic collection of various aging and safety relevant data on a plant-wide basis. This data needs to be serviced and periodically evaluated. Due to the complexity of the process, this activity needs to be supported by a qualified software tool for the management of aging relevant data and associated documents (approx. 30 000 SSCs). In order to support the power plant operators, AREVA NP has developed the software tool COMSY. The COMSY software with its integrated AM modules enables the design and setup of a knowledge-based power plant model compatible with the requirements of international and national rules (e.g. IAEA Safety Guide NS-G-2.12, KTA 1403). In this process, a key task is to identify and monitor degradation mechanisms. For this purpose the COMSY tool provides prognosis and trending functions, which are based on more than 30 years of experience in the evaluation of degradation effects and numerous experimental studies. Since 1998 COMSY has been applied successfully in more than fifty reactor units in this field. The current version 3.0 was revised completely and offers additional AM functions. All aging-relevant component data are compiled and allocated via an integrated power plant model. Owing to existing interfaces to other software solutions and flexible import functions, COMSY is highly compatible with already existing data

  17. HITCal: a software tool for analysis of video head impulse test responses.

    Science.gov (United States)

    Rey-Martinez, Jorge; Batuecas-Caletrio, Angel; Matiño, Eusebi; Perez Fernandez, Nicolás

    2015-09-01

    The developed software (HITCal) may be a useful tool in the analysis and measurement of the saccadic video head impulse test (vHIT) responses, and with the experience obtained during its use the authors suggest that HITCal is an excellent method for enhanced exploration of vHIT outputs. The objective was to develop a (software) method to analyze and explore the vHIT responses, mainly saccades. HITCal was written using a computational development program; the function to access a vHIT file was programmed; extended head impulse exploration and measurement tools were created; and an automated saccade analysis was developed using an experimental algorithm. For pre-release HITCal laboratory tests, a database of head impulse tests (HITs) was created with data collected retrospectively in three reference centers. This HITs database was evaluated by humans and was also computed with HITCal. The authors successfully built HITCal and it has been released as open source software; the developed software was fully operative and all the proposed characteristics were incorporated in the released version. The automated saccade algorithm implemented in HITCal has good concordance with the assessment by human observers (Cohen's kappa coefficient = 0.7).
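
    At its core, vHIT analysis compares eye and head velocity traces; a common summary is the velocity gain. A minimal sketch computing gain as a ratio of areas under the traces (HITCal's algorithm, including its saccade detection, is more elaborate):

```python
import numpy as np

def vor_gain(head_velocity, eye_velocity):
    """Velocity gain over one impulse: area of eye trace over area of head trace."""
    head = np.abs(np.asarray(head_velocity, dtype=float))
    eye = np.abs(np.asarray(eye_velocity, dtype=float))
    return np.trapz(eye) / np.trapz(head)

# Hypothetical velocity samples (deg/s) over a single head impulse.
head = [0, 50, 150, 250, 180, 60, 10]
eye = [0, 45, 140, 230, 170, 55, 8]
print(f"gain = {vor_gain(head, eye):.2f}")   # ~0.93
```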

  18. DAISY: a new software tool to test global identifiability of biological and physiological systems.

    Science.gov (United States)

    Bellu, Giuseppina; Saccomani, Maria Pia; Audoly, Stefania; D'Angiò, Leontina

    2007-10-01

    A priori global identifiability is a structural property of biological and physiological models. It is considered a prerequisite for well-posed estimation, since it concerns the possibility of recovering uniquely the unknown model parameters from measured input-output data, under ideal conditions (noise-free observations and error-free model structure). Of course, determining if the parameters can be uniquely recovered from observed data is essential before investing resources, time and effort in performing actual biomedical experiments. Many interesting biological models are nonlinear but identifiability analysis for nonlinear system turns out to be a difficult mathematical problem. Different methods have been proposed in the literature to test identifiability of nonlinear models but, to the best of our knowledge, so far no software tools have been proposed for automatically checking identifiability of nonlinear models. In this paper, we describe a software tool implementing a differential algebra algorithm to perform parameter identifiability analysis for (linear and) nonlinear dynamic models described by polynomial or rational equations. Our goal is to provide the biological investigator a completely automatized software, requiring minimum prior knowledge of mathematical modelling and no in-depth understanding of the mathematical tools. The DAISY (Differential Algebra for Identifiability of SYstems) software will potentially be useful in biological modelling studies, especially in physiology and clinical medicine, where research experiments are particularly expensive and/or difficult to perform. Practical examples of use of the software tool DAISY are presented. DAISY is available at the web site http://www.dei.unipd.it/~pia/.

  19. The value of multivariate model sophistication

    DEFF Research Database (Denmark)

    Rombouts, Jeroen; Stentoft, Lars; Violante, Francesco

    2014-01-01

    We assess the predictive accuracies of a large number of multivariate volatility models in terms of pricing options on the Dow Jones Industrial Average. We measure the value of model sophistication in terms of dollar losses by considering a set of 444 multivariate models that differ in their specification. In addition to investigating the value of model sophistication in terms of dollar losses directly, we also use the model confidence set approach to statistically infer the set of models that delivers the best pricing performances.

  20. Transformation as a Design Process and Runtime Architecture for High Integrity Software

    Energy Technology Data Exchange (ETDEWEB)

    Bespalko, S.J.; Winter, V.L.

    1999-04-05

    We have discussed two aspects of creating high integrity software that greatly benefit from the availability of transformation technology, which in this case is manifest in the requirement for a sophisticated backtracking parser. First, because of the potential for correctly manipulating programs via small changes, an automated non-procedural transformation system can be a valuable tool for constructing high assurance software. Second, modeling the process of translating data into information as a (possibly context-dependent) grammar leads to an efficient, compact implementation. From a practical perspective, the transformation process should begin in the domain language in which a problem is initially expressed. Thus, in order for a transformation system to be practical, it must be flexible with respect to domain-specific languages. We have argued that transformation applied to specification results in a highly reliable system. We also attempted to briefly demonstrate that transformation technology applied to the runtime environment will result in a safe and secure system. We thus believe that sophisticated multi-lookahead backtracking parsing technology is central to demonstrating the existence of high integrity software (HIS).
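
    The backtracking requirement can be illustrated with a grammar where a single-token-lookahead parser commits too early. The toy recursive-descent sketch below explores both productions of S -> "a" S "a" | "a" and accepts exactly the odd-length strings; it stands in for, and is far simpler than, the multi-lookahead technology discussed:

```python
# Grammar: S -> "a" S "a" | "a"   (accepts odd-length strings of "a")
def parse_s(s, i):
    """Return every index reachable after matching S starting at position i."""
    results = []
    if i < len(s) and s[i] == "a":
        results.append(i + 1)                   # S -> "a"
        for j in parse_s(s, i + 1):             # S -> "a" S "a"
            if j < len(s) and s[j] == "a":
                results.append(j + 1)
    return results

def accepts(s):
    return len(s) in parse_s(s, 0)

print([accepts("a" * n) for n in range(1, 6)])  # [True, False, True, False, True]
```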

  1. Software tools for manipulating fe mesh, virtual surgery and post-processing

    Directory of Open Access Journals (Sweden)

    Milašinović Danko Z.

    2009-01-01

    Full Text Available This paper describes a set of software tools which we developed for the calculation of fluid flow through cardiovascular organs. Our tools work with medical data from a CT scanner, but could be used with any other 3D input data. For meshing we used the Tetgen tetrahedral mesh generator, as well as a mesh re-generator that we have developed for conversion of tetrahedral elements into bricks. After adequate meshing we used our PAKF solver for calculation of fluid flow. For human-friendly presentation of results we developed a set of post-processing software tools. By modifying the 2D mesh (the boundary of the cardiovascular organ) it is possible to perform virtual surgery; in the case of an aorta with an aneurysm, which we had received from the University Clinical Center in Heidelberg from a multi-slice 64-CT scanner, we removed the aneurysm and ran calculations on both geometrical models afterwards. The main idea of this methodology is to create a system that could be used in clinics.

  2. Hardware and software and machine-tool simulation with parallel structures mechanisms

    Directory of Open Access Journals (Sweden)

    Keba P.V.

    2016-12-01

    Full Text Available The range of applications of mechanisms with parallel structure is constantly expanding. The mechanisms of machine tools and manipulators are becoming more complicated, and it is necessary to improve their program-controlled modules. Closed-circuit mechanisms are most widespread in robotic complexes, where a manipulator performs complicated spatial movements along a given trajectory; the range of applications is very wide, the most popular being sorting, welding, assembling and others. However, the problem of designing the operating programs is still present, because the available post-processors are created for the equipment that exists today, while new machine tool constructions appear every day and there is a need to control them. The problems associated with using hardware and software of mechanisms with parallel structure in computer-aided simulation are considered. A program for solving the inverse kinematics problem was designed, and a new method of designing the control programs was found. The kinematic analysis method options and the calculated data obtained by computer mathematics systems are shown, with the «Tools Glide» software taken as an example.

  3. A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.

    Science.gov (United States)

    Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy

    2016-12-01

    Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements were identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.

  4. Featureous: A Tool for Feature-Centric Analysis of Java Software

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    Feature-centric comprehension of source code is necessary for incorporating user-requested modifications during software evolution and maintenance. However, such comprehension is difficult to achieve in the case of large object-oriented programs due to the size, complexity, and implicit character of mappings between features and source code. To support programmers in overcoming these difficulties, we present a feature-centric analysis tool, Featureous. Our tool extends the NetBeans IDE with mechanisms for efficient location of feature implementations in legacy source code, and an extensive analysis...

  5. ATLAS software configuration and build tool optimisation

    Science.gov (United States)

    Rybkin, Grigory; Atlas Collaboration

    2014-06-01

    The ATLAS software code base is over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories on 6 continents. To meet the challenge of configuring and building this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, and dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance, which was optimised within several approaches: reduction of the number of reads of requirements files, which are now read once per package by a CMT build command that generates cached requirements files for subsequent CMT build commands; introduction of more fine-grained build parallelism at package task level, i.e., dependent applications and libraries are compiled in parallel; code optimisation of the CMT commands used for the build; and introduction of package level build parallelism, i.e., parallelising the build of independent packages. By default, CMT launches NUMBER-OF-PROCESSORS build commands in parallel. The other focus was on optimisation of CMT commands in general, which made them approximately 2 times faster. CMT can generate a cached requirements file for the environment setup command, which is especially useful for deployment on distributed file systems like AFS or CERN VMFS. The use of parallelism, caching and code optimisation significantly (by several times) reduced software build time and environment setup time, and increased the efficiency of

  6. A software tool for simulation of surfaces generated by ball nose end milling

    DEFF Research Database (Denmark)

    Bissacco, Giuliano

    2004-01-01

    A software tool for prediction of surface topography of ball nose end milled surfaces was developed. The tool is based on a simplified model of the ideal tool motion and neglects the effects due to run-out, static and dynamic deflections and error motions, but has the merit of generating as output a file in a format readable by a surface processor software (SPIP [2]) for calculation of a number of surface roughness parameters. In the next paragraph a description of the basic features of ball nose end milled surfaces is given, while in paragraph 3 the model is described.
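
    The ideal-motion geometry underlying such a simulation is illustrated by the classic scallop-height relation between adjacent passes, h = R - sqrt(R² - (a/2)²) for ball radius R and stepover a. A small sketch (the tool itself generates full topography files instead):

```python
import math

def scallop_height(ball_radius, stepover):
    """Ideal scallop height between adjacent passes: h = R - sqrt(R^2 - (a/2)^2)."""
    return ball_radius - math.sqrt(ball_radius ** 2 - (stepover / 2.0) ** 2)

# 3 mm ball radius, 0.5 mm stepover -> roughly a 10 micrometre scallop.
print(f"{scallop_height(3.0, 0.5) * 1000:.1f} um")
```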

  7. EINSTEIN - Expert system for an Intelligent Supply of Thermal Energy in Industry. Audit methodology and software tool

    Energy Technology Data Exchange (ETDEWEB)

    Schweiger, Hans; Danov, Stoyan (energyXperts.NET (Spain)); Vannoni, Claudia; Facci, Enrico (Sapienza Univ. of Rome, Dept. of Mechanics and Aeronautics, Rome (Italy)); Brunner, Christoph; Slawitsch, Bettina (Joanneum Research, Inst. of Sustainable Techniques and Systems - JOINTS, Graz (Austria))

    2009-07-01

    For optimising thermal energy supply in industry, a holistic integral approach is required that includes possibilities of demand reduction by heat recovery and process integration, and by an intelligent combination of efficient heat and cold supply technologies. EINSTEIN is a tool-kit for fast and high quality thermal energy audits in industry, composed by an audit guide describing the methodology and by a software tool that guides the auditor through all the audit steps. The main features of EINSTEIN are: (1) a basic questionnaire helps for systematic collection of the necessary information with the possibility to acquire data by distance; (2) special tools allow for fast consistency checking and estimation of missing data, so that already with very few data some first predictions can be made; (3) the data processing is based on standardised models for industrial processes and industrial heat supply systems; (4) semi-automatization: the software tool gives support to decision making for the generation of alternative heat and cold supply proposals, carries out automatically all the necessary calculations, including dynamic simulation of the heat supply system, and creates a standard audit report. The software tool includes modules for benchmarking, automatic design of heat exchanger networks, and design assistants for the heat and cold supply system. The core of the expert system software tool is available for free, as an open source software project. This type of software development has shown to be very efficient for dissemination of knowledge and for the continuous maintenance and improvement thanks to user contributions.

  8. Static and Dynamic Verification of Critical Software for Space Applications

    Science.gov (United States)

    Moreira, F.; Maia, R.; Costa, D.; Duro, N.; Rodríguez-Dapena, P.; Hjortnaes, K.

    Space technology is no longer used only for highly specialised research activities or for sophisticated manned space missions. Modern society relies more and more on space technology and applications for everyday activities. Worldwide telecommunications, Earth observation, navigation and remote sensing are only a few examples of space applications on which we rely daily. The European driven global navigation system Galileo and its associated applications, e.g. air traffic management, vessel and car navigation, will significantly expand the already stringent safety requirements for space based applications. Apart from their usefulness and practical applications, every single piece of onboard software deployed into space represents an enormous investment. With a long lifetime of operation, and being extremely difficult to maintain and upgrade, at least when compared with "mainstream" software development, the importance of ensuring their correctness before deployment is immense. Verification & Validation techniques and technologies have a key role in ensuring that the onboard software is correct and error free, or at least free from errors that can potentially lead to catastrophic failures. Many RAMS techniques, including both static criticality analysis and dynamic verification techniques, have been used as a means to verify and validate critical software and to ensure its correctness. But, traditionally, these have been applied in isolation. One of the main reasons is the immaturity of this field as regards its application to the growing number of software products within space systems. This paper presents an innovative way of combining both static and dynamic techniques, exploiting their synergy and complementarity for software fault removal. The methodology proposed is based on the combination of Software FMEA and FTA with fault-injection techniques. The case study herein described is implemented with support from two tools: the SoftCare tool for the SFMEA and SFTA.

  9. Continuous integration and quality control for scientific software

    Science.gov (United States)

    Neidhardt, A.; Ettl, M.; Brisken, W.; Dassing, R.

    2013-08-01

    Modern software has to be stable, portable, fast and reliable. This is becoming more and more important for scientific software as well. But it requires a sophisticated way to inspect, check and evaluate the quality of source code with a suitable, automated infrastructure. A centralized server with a software repository and a version control system is one essential part, used to manage the code base and to control the different development versions. While each project can be compiled separately, the whole code base can also be compiled with one central “Makefile”. This is used to create automated, nightly builds. Additionally, all sources are inspected automatically with static code analysis and inspection tools, which check for well-known error situations, memory and resource leaks, performance issues, or style issues. In combination with an automatic documentation generator it is possible to create the developer documentation directly from the code and the inline comments. All reports and generated information are presented as HTML pages on a web server. Because this environment increased the stability and quality of the software of the Geodetic Observatory Wettzell tremendously, it is now also available for scientific communities. One regular user is already the developer group of the DiFX software correlator project.

  10. Using Case Study Videos as an Effective Active Learning Tool to Teach Software Development Best Practices (Invited Paper

    Directory of Open Access Journals (Sweden)

    Sushil Acharya

    2017-06-01

    Full Text Available The fundamental challenge in improving software quality lies in the people and processes that develop software products. Imparting real-world experience in software development best practices to undergraduate students is often a challenge due to the lack of effective learning tools. This pedagogical requirement is important because graduates are expected to develop software that meets rigorous quality standards. Certain best practices are difficult to comprehend from course lectures alone and are better grasped with supplemental learning tools. Realizing the necessity of such teaching tools, we designed and developed six (6) delivery hours of case study videos for use in courses that impart knowledge on Software Verification & Validation (SV&V) topics, viz. requirements engineering and software reviews. We see case study videos as an effective active learning tool in our flipped classroom approach. We present our design of the case study video in its generic components, envisioning how it may be used in general. To evaluate our active learning tools we mapped the learning objectives of the case study videos to the expected learning outcomes for ABET accreditation of an undergraduate engineering program. Our implementation has been disseminated to partner institutions. Results of delivery in a faculty workshop and in two different university courses are shared.

  11. SafetyAnalyst : software tools for safety management of specific highway sites

    Science.gov (United States)

    2010-07-01

    SafetyAnalyst provides a set of software tools for use by state and local highway agencies for highway safety management. SafetyAnalyst can be used by highway agencies to improve their programming of site-specific highway safety improvements. SafetyA...

  12. Exploiting Software Tool Towards Easier Use And Higher Efficiency

    Science.gov (United States)

    Lin, G. H.; Su, J. T.; Deng, Y. Y.

    2006-08-01

    In developing countries, it is important to make maximum use of data from instruments built in-house. This is not only a matter of maximizing the science return on earlier investment -- deep accumulation in every aspect -- but also of science output. Based on this idea, we are developing a software package (called THDP: Tool of Huairou Data Processing). It handles a series of tasks that necessarily arise in data processing. This paper discusses its design purpose, functions, methods and specialities. The primary vehicle for general data interpretation is through various techniques of data visualization and interaction. In the software, we employed an Object Oriented approach, which is appropriate to this vehicle; it is imperative that the approach provide not only function, but do so in as convenient a fashion as possible. As a result, the software not only makes data processing easier to learn for beginners and further improvement more convenient for experienced users, but also greatly increases efficiency in every phase, including analysis, parameter adjustment and result display. Under the framework of the virtual observatory, developing countries should study more of the new related technologies, which can advance the ability and efficiency of scientific research, like the software we are developing.

  13. Learning Photogrammetry with Interactive Software Tool PhoX

    Directory of Open Access Journals (Sweden)

    T. Luhmann

    2016-06-01

    Full Text Available Photogrammetry is a complex topic in high-level university teaching, especially in the fields of geodesy, geoinformatics and metrology where high quality results are demanded. In addition, more and more black-box solutions for 3D image processing and point cloud generation are available that generate nice results easily, e.g. by structure-from-motion approaches. Within this context, the classical approach of teaching photogrammetry (e.g. focusing on aerial stereophotogrammetry) has to be reformed in order to educate students and professionals with new topics and provide them with more information behind the scene. For around 20 years, photogrammetry courses at the Jade University of Applied Sciences in Oldenburg, Germany, have included the use of digital photogrammetry software that provides individual exercises, deep analysis of calculation results and a wide range of visualization tools for almost all standard tasks in photogrammetry. During recent years the software package PhoX has been developed; it is part of a new didactic concept in photogrammetry and related subjects, and it also serves as an analysis tool in recent research projects. PhoX consists of a project-oriented data structure for images, image data, measured points and features and 3D objects. It provides almost all basic photogrammetric measurement tools, image processing, calculation methods, graphical analysis functions, simulations and much more. Students use the program to conduct predefined exercises where they have the opportunity to analyse results in a high level of detail. This includes the analysis of statistical quality parameters but also the meaning of transformation parameters, rotation matrices, calibration and orientation data. As one specific advantage, PhoX allows for the interactive modification of single parameters and the direct view of the resulting effect in image or object space.

  14. Learning Photogrammetry with Interactive Software Tool PhoX

    Science.gov (United States)

    Luhmann, T.

    2016-06-01

    Photogrammetry is a complex topic in high-level university teaching, especially in the fields of geodesy, geoinformatics and metrology where high quality results are demanded. In addition, more and more black-box solutions for 3D image processing and point cloud generation are available that generate nice results easily, e.g. by structure-from-motion approaches. Within this context, the classical approach of teaching photogrammetry (e.g. focusing on aerial stereophotogrammetry) has to be reformed in order to educate students and professionals with new topics and provide them with more information behind the scene. For around 20 years, photogrammetry courses at the Jade University of Applied Sciences in Oldenburg, Germany, have included the use of digital photogrammetry software that provides individual exercises, deep analysis of calculation results and a wide range of visualization tools for almost all standard tasks in photogrammetry. During recent years the software package PhoX has been developed; it is part of a new didactic concept in photogrammetry and related subjects, and it also serves as an analysis tool in recent research projects. PhoX consists of a project-oriented data structure for images, image data, measured points and features and 3D objects. It provides almost all basic photogrammetric measurement tools, image processing, calculation methods, graphical analysis functions, simulations and much more. Students use the program to conduct predefined exercises where they have the opportunity to analyse results in a high level of detail. This includes the analysis of statistical quality parameters but also the meaning of transformation parameters, rotation matrices, calibration and orientation data. As one specific advantage, PhoX allows for the interactive modification of single parameters and the direct view of the resulting effect in image or object space.

  15. Software Construction and Composition Tools for Petascale Computing SCW0837 Progress Report

    Energy Technology Data Exchange (ETDEWEB)

    Epperly, T W; Hochstein, L

    2011-09-12

    The majority of scientific software is distributed as source code. As the number of library dependencies and supported platforms increases, so does the complexity of describing the rules for configuring and building software. In this project, we have performed an empirical study of the magnitude of the build problem by examining the development history of two DOE-funded scientific software projects. We have developed MixDown, a meta-build tool, to simplify the task of building applications that depend on multiple third-party libraries. The results of this research indicate that the effort scientific programmers spend on building their software accounts for a significant fraction of the total development effort, and that the use of MixDown can significantly simplify the task of building software with multiple dependencies.

  16. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.

  17. Analyst Tools and Quality Control Software for the ARM Data System

    Energy Technology Data Exchange (ETDEWEB)

    Moore, S.T.

    2004-12-14

    ATK Mission Research develops analyst tools and automated quality control software in order to assist the Atmospheric Radiation Measurement (ARM) Data Quality Office with their data inspection tasks. We have developed a web-based data analysis and visualization tool, called NCVweb, that allows for easy viewing of ARM NetCDF files. NCVweb, along with our library of sharable Interactive Data Language procedures and functions, allows even novice ARM researchers to be productive with ARM data with only minimal effort. We also contribute to the ARM Data Quality Office by analyzing ARM data streams, developing new quality control metrics, new diagnostic plots, and integrating this information into DQ HandS - the Data Quality Health and Status web-based explorer. We have developed several ways to detect outliers in ARM data streams and have written software to run in an automated fashion to flag these outliers.
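
    As an illustration of the automated outlier flagging the record mentions, the sketch below marks points that deviate from a running median by more than a few robust standard deviations. It is a generic scheme with an assumed window size and threshold, not the Data Quality Office's actual algorithm:

        import numpy as np

        def flag_outliers(values, window=60, nsigma=5.0):
            """Mark points more than nsigma robust standard deviations away
            from the running median of the surrounding `window` samples."""
            v = np.asarray(values, dtype=float)
            flags = np.zeros(v.size, dtype=bool)
            half = window // 2
            for i in range(v.size):
                seg = v[max(0, i - half): i + half + 1]
                med = np.median(seg)
                # 1.4826 * MAD estimates the standard deviation for Gaussian noise
                sigma = 1.4826 * np.median(np.abs(seg - med)) or 1e-12
                flags[i] = abs(v[i] - med) > nsigma * sigma
            return flags

        t = np.linspace(0.0, 10.0, 500)
        signal = np.sin(t)
        signal[100] += 8.0          # inject two artificial spikes
        signal[400] -= 6.0
        print(np.nonzero(flag_outliers(signal))[0])   # expected: indices 100 and 400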

  18. Recent development in software and automation tools for high-throughput discovery bioanalysis.

    Science.gov (United States)

    Shou, Wilson Z; Zhang, Jun

    2012-05-01

    Bioanalysis with LC-MS/MS has been established as the method of choice for quantitative determination of drug candidates in biological matrices in drug discovery and development. The LC-MS/MS bioanalytical support for drug discovery, especially for early discovery, often requires high-throughput (HT) analysis of large numbers of samples (hundreds to thousands per day) generated from many structurally diverse compounds (tens to hundreds per day) with a very quick turnaround time, in order to provide important activity and liability data to move discovery projects forward. Another important consideration for discovery bioanalysis is its fit-for-purpose quality requirement depending on the particular experiments being conducted at this stage, and it is usually not as stringent as those required in bioanalysis supporting drug development. These aforementioned attributes of HT discovery bioanalysis made it an ideal candidate for using software and automation tools to eliminate manual steps, remove bottlenecks, improve efficiency and reduce turnaround time while maintaining adequate quality. In this article we will review various recent developments that facilitate automation of individual bioanalytical procedures, such as sample preparation, MS/MS method development, sample analysis and data review, as well as fully integrated software tools that manage the entire bioanalytical workflow in HT discovery bioanalysis. In addition, software tools supporting the emerging high-resolution accurate MS bioanalytical approach are also discussed.

  19. Application of image editing software for forensic detection of image ...

    African Journals Online (AJOL)

    Application of image editing software for forensic detection of image. ... The image editing software available today is apt for creating visually compelling and sophisticated fake images, ...

  20. Design and implementation of a software tool intended for simulation and test of real time codes

    International Nuclear Information System (INIS)

    Le Louarn, C.

    1986-09-01

    The objective of real time software testing is to reveal processing errors and unfulfilled functional requirements or timing constraints in a code. In the perspective of the safety analysis of nuclear power plant equipment, testing should be carried out independently of the physical process (which is generally not available), and accidental hardware failures must be considered. We propose here a simulation and test tool, implemented entirely in software, with extensive interactive possibilities for testing assembly code running on a microprocessor. The OST (outil d'aide a la simulation et au Test de logiciels temps reel) simulates code execution and the behaviour of the hardware or software environment. Test execution is closely monitored and much useful information is automatically saved. After reviewing the methods and tools dedicated to real time software, the present thesis work details the OST system. We show the internal mechanisms and objects of the system, particularly ''events'' (which describe the evolution of the system under test) and mnemonics (which describe the variables). Then, we detail the interactive means available to the user for constructing the test data and the environment of the tested software. Finally, a prototype implementation is presented along with the results of the tests carried out. This demonstrates the many advantages of using an automatic tool over a manual investigation. As a conclusion, further developments necessary to complete the final tool are reviewed. [fr]

  1. 2006 XSD Scientific Software Workshop report.

    Energy Technology Data Exchange (ETDEWEB)

    Evans, K., Jr.; De Carlo, F.; Jemian, P.; Lang, J.; Lienert, U.; Maclean, J.; Newville, M.; Tieman, B.; Toby, B.; van Veenendaal, B.; Univ. of Chicago

    2006-01-22

    In May of 2006, a committee was formed to assess the fundamental needs and opportunities in scientific software for x-ray data reduction, analysis, modeling, and simulation. This committee held a series of discussions throughout the summer, conducted a poll of the members of the x-ray community, and held a workshop. This report details the findings and recommendations of the committee. Each experiment performed at the APS requires three crucial ingredients: the powerful x-ray source, an optimized instrument to perform measurements, and computer software to acquire, visualize, and analyze the experimental observations. While the APS has invested significant resources in the accelerator, investment in other areas such as scientific software for data analysis and visualization has lagged behind. This has led to the adoption of a wide variety of software with variable levels of usability. In order to maximize the scientific output of the APS, it is essential to support the broad development of real-time analysis and data visualization software. As scientists attack problems of increasing sophistication and deal with larger and more complex data sets, software is playing an ever more important role. Furthermore, our need for excellent and flexible scientific software can only be expected to increase, as the upgrade of the APS facility and the implementation of advanced detectors create a host of new measurement capabilities. New software analysis tools must be developed to take full advantage of these capabilities. It is critical that the APS take the lead in software development and the implementation of theory to software to ensure the continued success of this facility. The topics described in this report are relevant to the APS today and critical for the APS upgrade plan. Implementing these recommendations will have a positive impact on the scientific productivity of the APS today and will be even more critical in the future.

  2. Sophistication and Performance of Italian Agri‐food Exports

    Directory of Open Access Journals (Sweden)

    Anna Carbone

    2012-06-01

    Full Text Available Nonprice competition is increasingly important in world food markets. Recently, the expression ‘export sophistication’ has been introduced in the economic literature to refer to a wide set of attributes that increase product value. An index has been proposed to measure sophistication in an indirect way through the per capita GDP of exporting countries (Lall et al., 2006; Hausmann et al., 2007). The paper applies the sophistication measure to the Italian food export sector, starting from an analysis of trends and performance of Italian food exports. An original way to disentangle different components in the temporal variation of the sophistication index is also proposed. Results show that the sophistication index offers original insights on recent trends in world food exports and with respect to Italian core food exports.

  3. A tool to include gamma analysis software into a quality assurance program.

    Science.gov (United States)

    Agnew, Christina E; McGarry, Conor K

    2016-03-01

    To provide a tool to enable gamma analysis software algorithms to be included in a quality assurance (QA) program. Four image sets were created, comprising two geometric images to independently test the distance to agreement (DTA) and dose difference (DD) elements of the gamma algorithm, a clinical step and shoot IMRT field, and a clinical VMAT arc. The images were analysed using global and local gamma analysis with 2 in-house and 8 commercially available software packages, encompassing 15 software versions. The effect of image resolution on gamma pass rates was also investigated. All but one software package accurately calculated the gamma passing rate for the geometric images. Variation in global gamma passing rates of 1% at 3%/3mm and over 2% at 1%/1mm was measured between software packages and software versions with analysis of appropriately sampled images. This study provides a suite of test images and the gamma pass rates achieved for a selection of commercially available software. This image suite will enable validation of gamma analysis software within a QA program and provide a frame of reference by which to compare results reported in the literature from various manufacturers and software versions. Copyright © 2015. Published by Elsevier Ireland Ltd.
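
    For readers unfamiliar with the gamma test, a minimal one-dimensional sketch is given below: each evaluated point passes if some reference point lies within the combined dose-difference/distance-to-agreement tolerance. Global normalization and the Gaussian test profiles are assumptions for illustration; clinical tools work on interpolated 2D/3D grids:

        import numpy as np

        def gamma_pass_rate(x_ref, d_ref, x_test, d_test, dd=0.03, dta=3.0):
            """Fraction of test points with gamma <= 1 (global normalization).
            dd is the fractional dose tolerance (3% -> 0.03); dta is in mm."""
            dose_tol = dd * d_ref.max()
            passed = []
            for xi, di in zip(x_test, d_test):
                # gamma is the minimum combined distance in (space, dose) units
                gamma_sq = ((x_ref - xi) / dta) ** 2 + ((d_ref - di) / dose_tol) ** 2
                passed.append(np.sqrt(gamma_sq.min()) <= 1.0)
            return float(np.mean(passed))

        x = np.linspace(0.0, 100.0, 1001)              # positions in mm
        ref = np.exp(-((x - 50.0) / 15.0) ** 2)        # reference dose profile
        test = np.exp(-((x - 51.0) / 15.0) ** 2)       # same profile shifted by 1 mm
        print(gamma_pass_rate(x, ref, x, test))        # 1.0 at 3%/3 mm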

  4. A review of electronic engineering design free software tools

    OpenAIRE

    Medrano Sánchez, Carlos; Plaza García, Inmaculada; Castro Gil, Manuel Alonso; García Sevilla, Francisco; Martínez Calero, J.D.; Pou Félix, Josep; Corbalán Fuertes, Montserrat

    2010-01-01

    In this paper, we review electronic design free software tools. We have searched open source programs that help with several tasks of the electronic design flow: analog and digital simulation, schematic capture, printed circuit board design and hardware description language compilation and simulation. Using some rapid criteria for verifying their availability, we have selected some of them which are worth working with. This work intends to perform a deeper analysis of fre...

  5. Ignominy: a tool for software dependency and metric analysis with examples from large HEP packages

    International Nuclear Information System (INIS)

    Tuura, L.A.; Taylor, L.

    2001-01-01

    Ignominy is a tool developed in the CMS IGUANA project to analyse the structure of software systems. Its primary component is a dependency scanner that distills information into human-usable forms. It also includes several tools to visualise the collected data in the form of graphical views and numerical metrics. Ignominy was designed to adapt to almost any reasonable structure, and it has been used to analyse several large projects. The original purpose of Ignominy was to help us better ensure the quality of our own software, and in particular warn us about possible structural problems early on. As a part of this activity it is now used as a standard part of our release procedure. The authors also use it to evaluate and study the quality of external packages they plan to make use of. The authors describe what Ignominy can find out, and how it can be used to visualise and assess a software structure. The authors also discuss the inherent problems of the analysis as well as the different approaches to modularity the tool makes quite evident. The focus is the illustration of these issues through the analysis results for several sizable HEP software projects
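
    The kind of numerical metrics such a dependency scanner reports can be illustrated with a toy package graph; the packages and edges below are invented for the example, not CMS IGUANA's real structure:

        from collections import defaultdict

        deps = {                      # package -> packages it depends on (hypothetical)
            "IguanaGUI": {"IguanaCore", "Qt"},
            "IguanaCore": {"CLHEP"},
            "RecoTools": {"IguanaCore", "CLHEP"},
            "Qt": set(), "CLHEP": set(),
        }

        fan_out = {p: len(d) for p, d in deps.items()}
        fan_in = defaultdict(int)
        for p, d in deps.items():
            for q in d:
                fan_in[q] += 1

        for p in deps:
            # instability = out / (in + out): 1.0 means nothing depends on the package
            total = fan_in[p] + fan_out[p]
            inst = fan_out[p] / total if total else 0.0
            print(f"{p:12s} fan-in={fan_in[p]} fan-out={fan_out[p]} instability={inst:.2f}")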

  6. Life Cycle Assessment Studies of Chemical and Biochemical Processes through the new LCSoft Software-tool

    DEFF Research Database (Denmark)

    Supawanich, Perapong; Malakul, Pomthong; Gani, Rafiqul

    2015-01-01

    requirements have to be evaluated together with environmental and economic aspects. The LCSoft software-tool has been developed to perform LCA as a stand-alone tool as well as integrated with other process design tools such as process simulation, economic analysis (ECON), and sustainable process design...

  7. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    International Nuclear Information System (INIS)

    Messroghli, Daniel R; Rudolph, Andre; Abdel-Aty, Hassan; Wassmuth, Ralf; Kühne, Titus; Dietz, Rainer; Schulz-Menger, Jeanette

    2010-01-01

    In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet
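
    The pixel-wise mapping principle itself is compact enough to sketch: fit a signal model to the image series at every pixel. The sketch below fits a mono-exponential T2 decay S = S0*exp(-TE/T2) with scipy; it illustrates the idea only and is not MRmap's implementation, which also covers T1 sequences, registration and DICOM export:

        import numpy as np
        from scipy.optimize import curve_fit

        def mono_exp(te, s0, t2):
            return s0 * np.exp(-te / t2)

        def t2_map(echoes, te_ms):
            """echoes: array of shape (n_echoes, ny, nx); te_ms: echo times in ms."""
            _, ny, nx = echoes.shape
            out = np.zeros((ny, nx))
            for iy in range(ny):
                for ix in range(nx):
                    sig = echoes[:, iy, ix]
                    try:
                        (_, t2), _ = curve_fit(mono_exp, te_ms, sig,
                                               p0=(sig[0], 50.0), maxfev=200)
                        out[iy, ix] = t2
                    except RuntimeError:
                        out[iy, ix] = 0.0    # fit did not converge: flag pixel
            return out

        te = np.array([10.0, 20.0, 40.0, 80.0])          # echo times in ms
        truth = np.array([[40.0, 80.0]])                 # a 1x2 "image" of T2 values
        echoes = np.exp(-te[:, None, None] / truth[None, :, :])
        print(t2_map(echoes, te))                        # recovers ~[[40. 80.]]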

  8. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    Directory of Open Access Journals (Sweden)

    Kühne Titus

    2010-07-01

    Full Text Available Background In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. Results After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. Conclusions MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet.

  9. STARS software tool for analysis of reliability and safety

    International Nuclear Information System (INIS)

    Poucet, A.; Guagnini, E.

    1989-01-01

    This paper reports on the STARS (Software Tool for the Analysis of Reliability and Safety) project, which aims at developing an integrated set of Computer Aided Reliability Analysis tools for the various tasks involved in systems safety and reliability analysis, including hazard identification, qualitative analysis, logic model construction and evaluation. Expert system technology offers the most promising perspective for developing a Computer Aided Reliability Analysis tool. Combined with graphics and analysis capabilities, it can provide a natural engineering-oriented environment for computer-assisted reliability and safety modelling and analysis. For hazard identification and fault tree construction, a frame/rule-based expert system is used, in which the deductive (goal-driven) reasoning and the heuristics applied during manual fault tree construction are modelled. Expert systems can explain their reasoning, so that the analyst becomes aware of why and how results are being obtained. Hence, the learning aspect involved in manual reliability and safety analysis can be maintained and improved.

  10. Development of software tools for 4-D visualization and quantitative analysis of PHITS simulation results

    International Nuclear Information System (INIS)

    Furutaka, Kazuyoshi

    2015-02-01

    A suite of software tools has been developed to facilitate the development of apparatus using the radiation transport simulation code PHITS, by enabling 4D visualization (3D space and time) and quantitative analysis of so-called dieaway plots. To deliver usable tools as soon as possible, existing software was utilized as much as possible: ParaView will be used for the 4D visualization of the results, whereas the analyses of dieaway plots will be done with the ROOT toolkit, using a tool named “diana”. To enable 4D visualization in ParaView, a group of tools (angel2vtk, DispDCAS1, CamPos) has been developed to convert the data into a format that ParaView can read and to ease the visualization. (author)
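
    The conversion idea can be sketched briefly: ParaView reads, among other formats, legacy-format VTK files, and one numbered file per time step yields the time dimension. The writer below is a generic illustration of that route with invented data, not the actual angel2vtk code:

        import numpy as np

        def write_vtk(path, field, spacing=(1.0, 1.0, 1.0)):
            """Write a 3D scalar field as a legacy ASCII VTK structured-points file."""
            nz, ny, nx = field.shape
            with open(path, "w") as f:
                f.write("# vtk DataFile Version 3.0\nPHITS-like scalar field\nASCII\n")
                f.write("DATASET STRUCTURED_POINTS\n")
                f.write(f"DIMENSIONS {nx} {ny} {nz}\n")
                f.write("ORIGIN 0 0 0\n")
                f.write(f"SPACING {spacing[0]} {spacing[1]} {spacing[2]}\n")
                f.write(f"POINT_DATA {nx * ny * nz}\n")
                f.write("SCALARS flux float 1\nLOOKUP_TABLE default\n")
                # VTK expects x varying fastest, so write in z, y, x nested order
                for z in range(nz):
                    for y in range(ny):
                        f.write(" ".join(f"{v:.5e}" for v in field[z, y, :]) + "\n")

        for step in range(3):                             # fake "time" series
            data = np.random.default_rng(step).random((8, 8, 8))
            write_vtk(f"flux_{step:03d}.vtk", data)       # ParaView groups the series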

  11. BYMUR software: a free and open source tool for quantifying and visualizing multi-risk analyses

    Science.gov (United States)

    Tonini, Roberto; Selva, Jacopo

    2013-04-01

    The BYMUR software aims to provide an easy-to-use open source tool for both computing multi-risk and managing/visualizing/comparing all the inputs (e.g. hazard, fragilities and exposure) as well as the corresponding results (e.g. risk curves, risk indexes). For all inputs, a complete management of inter-model epistemic uncertainty is considered. The BYMUR software will be one of the final products provided by the homonymous ByMuR project (http://bymur.bo.ingv.it/) funded by the Italian Ministry of Education, Universities and Research (MIUR), which is focused on (i) providing a quantitative and objective general method for a comprehensive long-term multi-risk analysis in a given area, accounting for inter-model epistemic uncertainty through Bayesian methodologies, and (ii) applying the methodology to seismic, volcanic and tsunami risks in Naples (Italy). More specifically, the BYMUR software will be able to separately account for the probabilistic hazard assessment of different kinds of hazardous phenomena, the relative (time-dependent/independent) vulnerabilities and exposure data, and their possible (predefined) interactions: the software will analyze these inputs and will use them to estimate both single- and multi-risk associated with a specific target area. In addition, it will be possible to connect the software to further tools (e.g., a full hazard analysis), allowing a dynamic I/O of results. The use of the Python programming language guarantees that the final software will be open source and platform independent. Moreover, thanks to the integration of some of the most popular and rich-featured Python scientific modules (Numpy, Matplotlib, Scipy) with the wxPython graphical user toolkit, the final tool will be equipped with a comprehensive Graphical User Interface (GUI) able to control and visualize (in the form of tables, maps and/or plots) any stage of the multi-risk analysis. The additional features of importing/exporting data in MySQL databases and/or standard XML formats (for
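
    At its simplest, the single- vs multi-risk distinction can be illustrated with a toy expected-loss calculation; all numbers below are invented, and the actual BYMUR methodology (Bayesian, with epistemic uncertainty and hazard interactions) is far richer:

        # annual probability of a damaging event, per hazard (assumed values)
        hazard = {"seismic": 0.02, "volcanic": 0.005, "tsunami": 0.001}
        # mean damage fraction given the event (assumed fragilities)
        vulnerability = {"seismic": 0.30, "volcanic": 0.60, "tsunami": 0.45}
        exposure = 1.0e9   # exposed value of the target area, in euros (assumed)

        single_risk = {h: hazard[h] * vulnerability[h] * exposure for h in hazard}
        for h, r in single_risk.items():
            print(f"{h:8s} expected annual loss ~ {r:12,.0f} EUR")
        # naive multi-risk: sum of single risks, valid only without interactions
        print(f"combined expected annual loss ~ {sum(single_risk.values()):12,.0f} EUR")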

  12. Teaching structure: student use of software tools for understanding macromolecular structure in an undergraduate biochemistry course.

    Science.gov (United States)

    Jaswal, Sheila S; O'Hara, Patricia B; Williamson, Patrick L; Springer, Amy L

    2013-01-01

    Because understanding the structure of biological macromolecules is critical to understanding their function, students of biochemistry should become familiar not only with viewing, but also with generating and manipulating structural representations. We report a strategy from a one-semester undergraduate biochemistry course to integrate use of structural representation tools into both laboratory and homework activities. First, early in the course we introduce the use of readily available open-source software for visualizing protein structure, coincident with modules on amino acid and peptide bond properties. Second, we use these same software tools in lectures and incorporate images and other structure representations in homework tasks. Third, we require a capstone project in which teams of students examine a protein-nucleic acid complex and then use the software tools to illustrate for their classmates the salient features of the structure, relating how the structure helps explain biological function. To ensure engagement with a range of software and database features, we generated a detailed template file that can be used to explore any structure, and that guides students through specific applications of many of the software tools. In presentations, students demonstrate that they are successfully interpreting structural information, and using representations to illustrate particular points relevant to function. Thus, over the semester students integrate information about structural features of biological macromolecules into the larger discussion of the chemical basis of function. Together these assignments provide an accessible introduction to structural representation tools, allowing students to add these methods to their biochemical toolboxes early in their scientific development. © 2013 by The International Union of Biochemistry and Molecular Biology.

  13. Systematization and sophistication of a comprehensive sensitivity analysis program. Phase 2

    International Nuclear Information System (INIS)

    Oyamada, Kiyoshi; Ikeda, Takao

    2004-02-01

    This study developed detailed estimates by adopting a comprehensive sensitivity analysis program for the reliability of TRU waste repository concepts under crystalline rock conditions. We examined each component and groundwater scenario of the geological repository and prepared systematic bases to examine the reliability from the point of view of comprehensiveness. Models and data were refined to examine the reliability. Based on existing TRU waste repository concepts, the effects of parameters on nuclide migration were quantitatively classified. These parameters, which are to be determined quantitatively, include the site characteristics of the natural barrier and the design specifications of the engineered barriers. Considering the feasibility of these specifications, the reliability is re-examined for combinations of these parameters within a practical range. Future issues are: comprehensive representation of a hybrid geosphere model including fractured media and permeable matrix media; and sophistication of the tools used to develop reliable combinations of parameters. It is important to continue this study because the disposal concepts and specifications for TRU-nuclide-bearing waste at various sites shall be determined rationally and safely through these studies. (author)

  14. Omics Informatics: From Scattered Individual Software Tools to Integrated Workflow Management Systems.

    Science.gov (United States)

    Ma, Tianle; Zhang, Aidong

    2017-01-01

    Omic data analyses pose great informatics challenges. As an emerging subfield of bioinformatics, omics informatics focuses on analyzing multi-omic data efficiently and effectively, and is gaining momentum. There are two underlying trends in the expansion of the omics informatics landscape: the explosion of scattered individual omics informatics tools, each of which focuses on a specific task in both single- and multi-omic settings, and the fast-evolving integrated software platforms such as workflow management systems that can assemble multiple tools into pipelines and streamline integrative analysis for complicated tasks. In this survey, we give a holistic view of omics informatics, from scattered individual informatics tools to integrated workflow management systems. We not only outline the landscape and challenges of omics informatics, but also sample a number of widely used and cutting-edge algorithms in omics data analysis to give readers a fine-grained view. We survey various workflow management systems (WMSs), classify them into three levels of WMSs from simple software toolkits to integrated multi-omic analytical platforms, and point out the emerging needs for developing intelligent workflow management systems. We also discuss the challenges, strategies and some existing work in the systematic evaluation of omics informatics tools. We conclude by providing future perspectives of emerging fields and new frontiers in omics informatics.

  15. Ident 1D - a novel software tool for an easy identification of material constitutive parameters

    International Nuclear Information System (INIS)

    Le Ber, L.; Cotoni, V.; Nicola, L.; Sainte Catherine, C.

    1998-01-01

    Non-linear finite element computations make use of very sophisticated constitutive equations for the description of material behaviour. The first difficulty encountered by potential users is the gap between the raw characterisation of a material on uniaxial specimens and knowledge of the required equations' parameters. There are very few software tools for this particular task. IDENT 1D is a dedicated tool, developed in the Matlab language in our laboratory, which is able to provide a complete optimised parameter set for the implemented models. The originality of IDENT 1D is that no initial estimate of the material parameters is requested of the user. Two main examples are described in this article: the Lemaitre and Chaboche creep law coupled with damage, and a non-unified cyclic law proposed by Contesti and Cailletaud with a separation of plastic and viscous strain terms, called the DDI model. For both laws, the identification method is completely described. Each method is then applied to a set of experimental data. In both cases, the results of the parameter identification show a very good agreement with the experimental data. (authors)
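
    The underlying idea -- least-squares identification of constitutive parameters from uniaxial data -- can be sketched with a deliberately simple law. The Norton creep law eps_dot = A * sigma**n and the data below are illustrative stand-ins; IDENT 1D itself handles the much richer models named above:

        import numpy as np
        from scipy.optimize import least_squares

        sigma = np.array([100.0, 150.0, 200.0, 250.0])         # MPa (made-up test data)
        eps_dot = np.array([1.1e-8, 8.5e-8, 3.9e-7, 1.2e-6])   # 1/s

        def residuals(p):
            log_a, n = p
            # fit in log space so all decades of strain rate weigh equally
            return np.log(eps_dot) - (log_a + n * np.log(sigma))

        fit = least_squares(residuals, x0=(-30.0, 4.0))
        log_a, n = fit.x
        print(f"A = {np.exp(log_a):.3e}, n = {n:.2f}")          # n ~ 5 for these data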

  16. "SABER": A new software tool for radiotherapy treatment plan evaluation.

    Science.gov (United States)

    Zhao, Bo; Joiner, Michael C; Orton, Colin G; Burmeister, Jay

    2010-11-01

    Both spatial and biological information are necessary in order to perform true optimization of a treatment plan and for predicting clinical outcome. The goal of this work is to develop an enhanced treatment plan evaluation tool which incorporates biological parameters and retains spatial dose information. A software system is developed which provides biological plan evaluation with a novel combination of features. It incorporates hyper-radiosensitivity using the induced-repair model and applies the new concept of dose convolution filter (DCF) to simulate dose wash-out effects due to cell migration, bystander effect, and/or tissue motion during treatment. Further, the concept of spatial DVH (sDVH) is introduced to evaluate and potentially optimize the spatial dose distribution in the target volume. Finally, generalized equivalent uniform dose is derived from both the physical dose distribution (gEUD) and the distribution of equivalent dose in 2 Gy fractions (gEUD2) and the software provides three separate models for calculation of tumor control probability (TCP), normal tissue complication probability (NTCP), and probability of uncomplicated tumor control (P+). TCP, NTCP, and P+ are provided as a function of prescribed dose and multivariable TCP, NTCP, and P+ plots are provided to illustrate the dependence on individual parameters used to calculate these quantities. Ten plans from two clinical treatment sites are selected to test the three calculation models provided by this software. By retaining both spatial and biological information about the dose distribution, the software is able to distinguish features of radiotherapy treatment plans not discernible using commercial systems. Plans that have similar DVHs may have different spatial and biological characteristics and the application of novel tools such as sDVH and DCF within the software may substantially change the apparent plan quality or predicted plan metrics such as TCP and NTCP. For the cases examined
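
    Of the quantities named, the generalized equivalent uniform dose has a particularly compact definition, gEUD = (sum_i v_i * D_i^a)^(1/a), where v_i are fractional volumes, D_i the dose bins and a a tissue-specific parameter (a = 1 gives the mean dose; large positive a approaches the maximum dose, as for serial organs; large negative a approaches the minimum dose, as for targets). A short sketch with an invented DVH:

        import numpy as np

        def geud(dose_bins, volume_fractions, a):
            v = np.asarray(volume_fractions, dtype=float)
            v = v / v.sum()                  # normalize volumes to sum to 1
            return float((v * np.asarray(dose_bins) ** a).sum() ** (1.0 / a))

        dose = [60.0, 62.0, 64.0, 66.0]      # Gy, invented differential DVH bins
        vol = [0.10, 0.40, 0.40, 0.10]       # fraction of the structure per bin

        for a in (-10, 1, 10):
            print(f"a = {a:>3}: gEUD = {geud(dose, vol, a):.2f} Gy")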

  17. Scipion web tools: Easy to use cryo-EM image processing over the web.

    Science.gov (United States)

    Conesa Mingo, Pablo; Gutierrez, José; Quintana, Adrián; de la Rosa Trevín, José Miguel; Zaldívar-Peraza, Airén; Cuenca Alba, Jesús; Kazemi, Mohsen; Vargas, Javier; Del Cano, Laura; Segura, Joan; Sorzano, Carlos Oscar S; Carazo, Jose María

    2018-01-01

    Macromolecular structural determination by Electron Microscopy under cryogenic conditions is revolutionizing the field of structural biology and attracting a large community of potential users. Still, the path from raw images to density maps is complex, and sophisticated image processing suites are required in this process, often demanding the installation and understanding of different software packages. Here, we present Scipion Web Tools, a web-based set of tools/workflows derived from the Scipion image processing framework, specially tailored to nonexpert users in need of very precise answers at several key stages of the structural elucidation process. © 2017 The Protein Society.

  18. Contingency Contractor Optimization Phase 3 Sustainment Software Design Document - Contingency Contractor Optimization Tool - Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa; Jones, Katherine A

    2016-05-01

    This document describes the final software design of the Contingency Contractor Optimization Tool - Prototype. Its purpose is to provide the overall architecture of the software and the logic behind this architecture. Documentation for the individual classes is provided in the application Javadoc. The Contingency Contractor Optimization project is intended to address Department of Defense mandates by delivering a centralized strategic planning tool that allows senior decision makers to quickly and accurately assess the impacts, risks, and mitigation strategies associated with utilizing contract support. The Contingency Contractor Optimization Tool - Prototype was developed in Phase 3 of the OSD ATL Contingency Contractor Optimization project to support strategic planning for contingency contractors. The planning tool uses a model to optimize the Total Force mix by minimizing the combined total costs for selected mission scenarios. The model optimizes the match of personnel types (military, DoD civilian, and contractors) and capabilities to meet mission requirements as effectively as possible, based on risk, cost, and other requirements.

  19. MyETL: A Java Software Tool to Extract, Transform, and Load Your Business

    Directory of Open Access Journals (Sweden)

    Michele Nuovo

    2015-12-01

    Full Text Available The project follows the development of a Java software tool that extracts data from Flat File (Fixed Length Record Type), CSV (Comma Separated Values), and XLS (Microsoft Excel 97-2003 Worksheet) files, applies transformations to those sources, and finally loads the data into the end target RDBMS. The software implements a process known as ETL (Extract, Transform and Load). Such systems are called ETL systems.
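
    The ETL pattern itself is compact; the sketch below expresses it in Python (the record's tool is written in Java) with an invented CSV source, a trivial transformation and SQLite as a stand-in for the target RDBMS:

        import csv, sqlite3

        # create a tiny source file so the example runs end to end
        with open("invoices.csv", "w") as f:
            f.write("id,name,amount\n1,alice smith,100\n2,bob jones,40\n")

        def extract(path):
            with open(path, newline="") as f:
                yield from csv.DictReader(f)

        def transform(row):
            # normalize case, cast types, derive a gross amount (assumed 21% VAT)
            return (row["id"], row["name"].title(), float(row["amount"]) * 1.21)

        def load(rows, db_path="target.db"):
            con = sqlite3.connect(db_path)
            con.execute("CREATE TABLE IF NOT EXISTS invoices (id TEXT, name TEXT, gross REAL)")
            con.executemany("INSERT INTO invoices VALUES (?, ?, ?)", rows)
            con.commit()
            con.close()

        load(transform(r) for r in extract("invoices.csv"))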

  20. Development of a User Interface for a Regression Analysis Software Tool

    Science.gov (United States)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  1. COMPUTER TOOLS OF DYNAMIC MATHEMATIC SOFTWARE AND METHODICAL PROBLEMS OF THEIR USE

    Directory of Open Access Journals (Sweden)

    Olena V. Semenikhina

    2014-08-01

    Full Text Available The article presents the results of an analysis of the standard computer tools of dynamic mathematics software that are used in solving tasks, and of the tools on which a teacher can rely in teaching mathematics. The possibility of organizing experimental investigation of mathematical objects on the basis of these tools, and of formulating new tasks on the basis of a limited number of tools with fast automated checking, is specified. Some methodological comments on the application of computer tools and methodological features of the use of interactive mathematical environments are presented. Problems arising from the use of computer tools are identified, among them the teacher's rethinking of forms and methods of training, the search for creative problems, the rational choice of environment, the checking of e-solutions, and common mistakes in the use of computer tools.

  2. A Process Framework for Designing Software Reference Architectures for Providing Tools as a Service

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali; Probst, Christian W.

    2016-01-01

    Software Reference Architecture (SRA), which is a generic architecture solution for a specific type of software systems, provides foundation for the design of concrete architectures in terms of architecture design guidelines and architecture elements. The complexity and size of certain types of software systems need customized and systematic SRA design and evaluation methods. In this paper, we present a software Reference Architecture Design process Framework (RADeF) that can be used for analysis, design and evaluation of the SRA for provisioning of Tools as a Service as part of a cloud-enabled workSPACE (TSPACE). The framework is based on the state of the art results from literature and our experiences with designing software architectures for cloud-based systems. We have applied RADeF to the SRA design of two types of TSPACE: software architecting TSPACE and software implementation TSPACE...

  3. JULIDE: a software tool for 3D reconstruction and statistical analysis of autoradiographic mouse brain sections.

    Directory of Open Access Journals (Sweden)

    Delphine Ribes

    Full Text Available In this article we introduce JULIDE, a software toolkit developed to perform the 3D reconstruction, intensity normalization, volume standardization by 3D image registration and voxel-wise statistical analysis of autoradiographs of mouse brain sections. This software tool has been developed in the open-source ITK software framework and is freely available under a GPL license. The article presents the complete image processing chain from raw data acquisition to 3D statistical group analysis. Results of the group comparison in the context of a study on spatial learning are shown as an illustration of the data that can be obtained with this tool.

  4. AMIDE: A Free Software Tool for Multimodality Medical Image Analysis

    Directory of Open Access Journals (Sweden)

    Andreas Markus Loening

    2003-07-01

    Full Text Available Amide's a Medical Image Data Examiner (AMIDE) has been developed as a user-friendly, open-source software tool for displaying and analyzing multimodality volumetric medical images. Central to the package's abilities to simultaneously display multiple data sets (e.g., PET, CT, MRI) and regions of interest is the on-demand data reslicing implemented within the program. Data sets can be freely shifted, rotated, viewed, and analyzed with the program automatically handling interpolation as needed from the original data. Validation has been performed by comparing the output of AMIDE with that of several existing software packages. AMIDE runs on UNIX, Macintosh OS X, and Microsoft Windows platforms, and it is freely available with source code under the terms of the GNU General Public License.

  5. The First Sophists and the Uses of History.

    Science.gov (United States)

    Jarratt, Susan C.

    1987-01-01

    Reviews the history of intellectual views on the Greek sophists in three phases: (1) their disparagement by Plato and Aristotle as the morally disgraceful "other"; (2) nineteenth century British positivists' reappraisal of these relativists as ethically and scientifically superior; and (3) twentieth century versions of the sophists as…

  6. APT - NASA ENHANCED VERSION OF AUTOMATICALLY PROGRAMMED TOOL SOFTWARE - STAND-ALONE VERSION

    Science.gov (United States)

    Premo, D. A.

    1994-01-01

    The APT code is one of the most widely used software tools for complex numerically controlled (N/C) machining. APT is an acronym for Automatically Programmed Tools and is used to denote both a language and the computer software that processes that language. Development of the APT language and software system was begun over twenty years ago as a U. S. government sponsored industry and university research effort. APT is a "problem oriented" language that was developed for the explicit purpose of aiding the programming of N/C machine tools. Machine-tool instructions and geometry definitions are written in the APT language to constitute a "part program." The APT part program is processed by the APT software to produce a cutter location (CL) file. This CL file may then be processed by user supplied post processors to convert the CL data into a form suitable for a particular N/C machine tool. This June, 1989 offering of the APT system represents an adaptation, with enhancements, of the public domain version of APT IV/SSX8 to the DEC VAX-11/780 for use by the Engineering Services Division of the NASA Goddard Space Flight Center. Enhancements include the super pocket feature which allows concave and convex polygon shapes of up to 40 points including shapes that overlap, that leave islands of material within the pocket, and that have one or more arcs as part of the pocket boundary. Recent modifications to APT include a rework of the POCKET subroutine and correction of an error that prevented the use within a macro of a macro variable cutter move statement combined with macro variable double check surfaces. Former modifications included the expansion of array and buffer sizes to accommodate larger part programs, and the insertion of a few user friendly error messages. The APT system software on the DEC VAX-11/780 is organized into two separate programs: the load complex and the APT processor. The load complex handles the table initiation phase and is usually only run when changes to the

  7. PIPER: Performance Insight for Programmers and Exascale Runtimes: Guiding the Development of the Exascale Software Stack

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [Rice Univ., Houston, TX (United States)

    2017-10-20

    The PIPER project set out to develop methodologies and software for measurement, analysis, attribution, and presentation of performance data for extreme-scale systems. Goals of the project were to support analysis of massive multi-scale parallelism, heterogeneous architectures, and multi-faceted performance concerns, and to support both post-mortem performance analysis, to identify program features that contribute to problematic performance, and on-line performance analysis, to drive adaptation. This final report summarizes the research and development activity at Rice University as part of the PIPER project. Producing a complete suite of performance tools for exascale platforms during the course of this project was impossible since both hardware and software for exascale systems are still a moving target. For that reason, the project focused broadly on developing new techniques for measurement and analysis of performance on modern parallel architectures; enhancing HPCToolkit's software infrastructure to support our research goals and its use on sophisticated applications; engaging developers of multithreaded runtimes to explore how support for tools should be integrated into their designs; engaging operating system developers with feature requests for enhanced monitoring support; engaging vendors with requests that they add hardware measurement capabilities and software interfaces needed by tools as they design new components of HPC platforms, including processors, accelerators and networks; and finally collaborating with partners interested in using HPCToolkit to analyze and tune scalable parallel applications.

  8. CytoBayesJ: software tools for Bayesian analysis of cytogenetic radiation dosimetry data.

    Science.gov (United States)

    Ainsbury, Elizabeth A; Vinnikov, Volodymyr; Puig, Pedro; Maznyk, Nataliya; Rothkamm, Kai; Lloyd, David C

    2013-08-30

    A number of authors have suggested that a Bayesian approach may be most appropriate for analysis of cytogenetic radiation dosimetry data. In the Bayesian framework, probability of an event is described in terms of previous expectations and uncertainty. Previously existing, or prior, information is used in combination with experimental results to infer probabilities or the likelihood that a hypothesis is true. It has been shown that the Bayesian approach increases both the accuracy and quality assurance of radiation dose estimates. New software entitled CytoBayesJ has been developed with the aim of bringing Bayesian analysis to cytogenetic biodosimetry laboratory practice. CytoBayesJ takes a number of Bayesian or 'Bayesian like' methods that have been proposed in the literature and presents them to the user in the form of simple user-friendly tools, including testing for the most appropriate model for distribution of chromosome aberrations and calculations of posterior probability distributions. The individual tools are described in detail and relevant examples of the use of the methods and the corresponding CytoBayesJ software tools are given. In this way, the suitability of the Bayesian approach to biological radiation dosimetry is highlighted and its wider application encouraged by providing a user-friendly software interface and manual in English and Russian. Copyright © 2013 Elsevier B.V. All rights reserved.
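
    As a flavor of the approach, the sketch below computes a grid posterior for absorbed dose from a dicentric chromosome count, using a Poisson likelihood with the standard linear-quadratic yield curve Y(D) = c + alpha*D + beta*D^2 and a flat prior. The calibration coefficients and counts are invented for illustration and are not taken from CytoBayesJ:

        import numpy as np

        c, alpha, beta = 0.001, 0.02, 0.06    # dicentrics/cell (assumed calibration)
        n_cells, n_dics = 500, 40             # scored cells, observed dicentrics

        doses = np.linspace(0.0, 5.0, 501)    # dose grid in Gy
        lam = n_cells * (c + alpha * doses + beta * doses ** 2)   # expected count
        log_post = n_dics * np.log(lam) - lam   # Poisson log-likelihood, flat prior
        post = np.exp(log_post - log_post.max())
        post /= post.sum()

        mean = (doses * post).sum()
        cdf = post.cumsum()
        lo, hi = doses[cdf.searchsorted(0.025)], doses[cdf.searchsorted(0.975)]
        print(f"posterior mean {mean:.2f} Gy, 95% interval [{lo:.2f}, {hi:.2f}] Gy")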

  9. Building Software Tools for Combat Modeling and Analysis

    National Research Council Canada - National Science Library

    Yuanxin, Chen

    2004-01-01

    ... (Meta-Language for Combat Simulations) and its associated parser and C++ code generator were designed to reduce the amount of time and developmental efforts needed to build sophisticated real world combat simulations. A C++...

  10. Cumulative Dominance and Probabilistic Sophistication

    NARCIS (Netherlands)

    Wakker, P.P.; Sarin, R.H.

    2000-01-01

    Machina & Schmeidler (Econometrica, 60, 1992) gave preference conditions for probabilistic sophistication, i.e. decision making where uncertainty can be expressed in terms of (subjective) probabilities without commitment to expected utility maximization. This note shows that simpler and more general

  11. Assess/Mitigate Risk through the Use of Computer-Aided Software Engineering (CASE) Tools

    Science.gov (United States)

    Aguilar, Michael L.

    2013-01-01

    The NASA Engineering and Safety Center (NESC) was requested to perform an independent assessment of the mitigation of the Constellation Program (CxP) Risk 4421 through the use of computer-aided software engineering (CASE) tools. With the cancellation of the CxP, the assessment goals were modified to capture lessons learned and best practices in the use of CASE tools. The assessment goal was to prepare the next program for the use of these CASE tools. The outcome of the assessment is contained in this document.

  12. Managing the Testing Process Practical Tools and Techniques for Managing Hardware and Software Testing

    CERN Document Server

    Black, Rex

    2011-01-01

    New edition of one of the most influential books on managing software and hardware testing In this new edition of his top-selling book, Rex Black walks you through the steps necessary to manage rigorous testing programs of hardware and software. The preeminent expert in his field, Mr. Black draws upon years of experience as president of both the International and American Software Testing Qualifications boards to offer this extensive resource of all the standards, methods, and tools you'll need. The book covers core testing concepts and thoroughly examines the best test management practices

  13. SOFTWARE TOOL FOR LASER CUTTING PROCESS CONTROL – SOLVING REAL INDUSTRIAL CASE STUDIES

    Directory of Open Access Journals (Sweden)

    Miloš Madić

    2016-08-01

    Full Text Available Laser cutting is one of the leading non-conventional machining technologies with a wide spectrum of applications in modern industry. In order to exploit the advantages that this technology offers for contour cutting of materials, it is necessary to carefully select laser cutting conditions for each given workpiece material, thickness and desired cut quality. In other words, there is a need for process control of laser cutting. After a comprehensive analysis of the main laser cutting parameters and process performance characteristics, the application of the developed software tool “BRUTOMIZER” for off-line control of the CO2 laser cutting process is illustrated for three different workpiece materials (mild steel, stainless steel and aluminum). Advantages and abilities of the developed software tool are also illustrated.

  14. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Habib, Salman [Argonne National Lab. (ANL), Argonne, IL (United States); Roser, Robert [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); LeCompte, Tom [Argonne National Lab. (ANL), Argonne, IL (United States); Marshall, Zach [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Borgland, Anders [SLAC National Accelerator Lab., Menlo Park, CA (United States); Viren, Brett [Brookhaven National Lab. (BNL), Upton, NY (United States); Nugent, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Asai, Makato [SLAC National Accelerator Lab., Menlo Park, CA (United States); Bauerdick, Lothar [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Finkel, Hal [Argonne National Lab. (ANL), Argonne, IL (United States); Gottlieb, Steve [Indiana Univ., Bloomington, IN (United States); Hoeche, Stefan [SLAC National Accelerator Lab., Menlo Park, CA (United States); Sheldon, Paul [Vanderbilt Univ., Nashville, TN (United States); Vay, Jean-Luc [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Elmer, Peter [Princeton Univ., NJ (United States); Kirby, Michael [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Patton, Simon [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Potekhin, Maxim [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Yanny, Brian [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Calafiura, Paolo [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Dart, Eli [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Gutsche, Oliver [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Izubuchi, Taku [Brookhaven National Lab. (BNL), Upton, NY (United States); Lyon, Adam [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Petravick, Don [Univ. of Illinois, Urbana-Champaign, IL (United States). National Center for Supercomputing Applications (NCSA)

    2015-10-29

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  15. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Habib, Salman [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Roser, Robert [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2015-10-28

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  16. SAMPA: A free software tool for skin and membrane permeation data analysis.

    Science.gov (United States)

    Bezrouk, Aleš; Fiala, Zdeněk; Kotingová, Lenka; Krulichová, Iva Selke; Kopečná, Monika; Vávrová, Kateřina

    2017-10-01

    Skin and membrane permeation experiments comprise an important step in the development of a transdermal or topical formulation or in toxicological risk assessment. The standard method for analyzing these data relies on the linear part of a permeation profile. However, it is difficult to objectively determine when the profile becomes linear, or the experiment duration may be insufficient to reach a maximum or steady state. Here, we present a software tool for Skin And Membrane Permeation data Analysis, SAMPA, that is easy to use and overcomes several of these difficulties. The SAMPA method and software have been validated on in vitro and in vivo permeation data on human, pig and rat skin and on model stratum corneum lipid membranes, using compounds that range from highly lipophilic polycyclic aromatic hydrocarbons to a highly hydrophilic antiviral drug, with and without two permeation enhancers. The SAMPA performance was compared with the standard method using a linear part of the permeation profile and a complex mathematical model. SAMPA is a user-friendly, open-source software tool for analyzing the data obtained from skin and membrane permeation experiments. It runs on a Microsoft Windows platform and is freely available as a Supporting file to this article. Copyright © 2017 Elsevier Ltd. All rights reserved.
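
    For context, the classical analysis that SAMPA improves upon fits a straight line to the terminal, presumed-linear portion of the cumulative permeation profile: the slope is the steady-state flux and the x-intercept is the lag time. A minimal sketch, assuming the last few sampling points are linear (the very judgment call the abstract notes is hard to make objectively):

```python
import numpy as np

def steady_state_flux(t_h, q_ug_cm2, n_last=4):
    """Estimate steady-state flux and lag time from the linear tail of a
    cumulative permeation profile (the classical approach, not SAMPA's).

    t_h      -- sampling times (h)
    q_ug_cm2 -- cumulative permeated amount per unit area (ug/cm^2)
    n_last   -- number of terminal points assumed to lie on the line
    """
    t = np.asarray(t_h, dtype=float)[-n_last:]
    q = np.asarray(q_ug_cm2, dtype=float)[-n_last:]
    slope, intercept = np.polyfit(t, q, 1)  # least-squares line
    return slope, -intercept / slope        # flux (ug/cm^2/h), lag time (h)

# toy profile: diffusion reaches steady state after roughly 4 h
t = [0, 2, 4, 6, 8, 10, 12]
q = [0.0, 0.1, 0.6, 1.8, 3.3, 4.8, 6.3]
print(steady_state_flux(t, q))
```

    Changing n_last visibly changes the estimate, which is precisely the subjectivity SAMPA is designed to remove.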

  17. Advanced software tool for the creation of a typical meteorological year

    International Nuclear Information System (INIS)

    Skeiker, Kamal; Ghani, Bashar Abdul

    2008-01-01

    The generation of a typical meteorological year is of great importance for calculations concerning many applications in the field of thermal engineering. In this context, the method proposed by Hall et al. was selected for generating typical data, and an improved criterion for the final selection of each typical meteorological month (TMM) was demonstrated. The final selection of the most representative year was made by examining a composite score S, calculated as the weighted sum of the scores of the four meteorological parameters used: air dry bulb temperature, relative humidity, wind velocity and global solar radiation intensity. Moreover, a new software tool, developed in Delphi 6.0, utilizes the Finkelstein-Schafer statistical method to create a typical meteorological year for any site of concern, employing the improved criterion for the final selection of each typical meteorological month. The tool allows the user to perform this task without an intimate knowledge of the computational details. The final alphanumerical and graphical results are presented on screen, and can be saved to a file or printed as a hard copy. Using this software tool, a typical meteorological year was generated for Damascus, the capital of Syria, as a test run example. The data used were obtained from the Department of Meteorology and cover a period of 10 years (1991-2000).
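
    A sketch of the selection machinery: the Finkelstein-Schafer (FS) statistic compares each candidate month's empirical CDF with the long-term CDF of the same calendar month, and the composite score S is the weighted sum of the per-parameter FS values. The weights and data below are illustrative, not the paper's:

```python
import numpy as np

def fs_statistic(candidate_daily, long_term_daily):
    """Finkelstein-Schafer statistic: mean absolute difference between the
    empirical CDF of one candidate month and the long-term CDF of the same
    calendar month, evaluated at the candidate's daily values."""
    lt = np.sort(np.asarray(long_term_daily, dtype=float))
    cand = np.sort(np.asarray(candidate_daily, dtype=float))
    n = len(cand)
    cdf_cand = np.arange(1, n + 1) / n
    cdf_lt = np.searchsorted(lt, cand, side="right") / len(lt)
    return float(np.mean(np.abs(cdf_cand - cdf_lt)))

# illustrative weights for the four parameters named in the abstract
WEIGHTS = {"dry_bulb": 0.4, "humidity": 0.2, "wind": 0.1, "solar": 0.3}

def composite_score(fs_by_param):
    """Weighted sum S used to rank candidate months."""
    return sum(WEIGHTS[p] * fs for p, fs in fs_by_param.items())

rng = np.random.default_rng(1)
long_term = rng.normal(20, 5, 300)       # 10 years of daily temperatures
january_1995 = rng.normal(21, 5, 31)     # one candidate month
print(fs_statistic(january_1995, long_term))
```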

  18. Using a Self-Administered Visual Basic Software Tool To Teach Psychological Concepts.

    Science.gov (United States)

    Strang, Harold R.; Sullivan, Amie K.; Schoeny, Zahrl G.

    2002-01-01

    Introduces LearningLinks, a Visual Basic software tool that allows teachers to create individualized learning modules that use constructivist and behavioral learning principles. Describes field testing, with undergraduates at the University of Virginia, of a module designed to improve understanding of the psychological concepts of…

  19. New tools for digital medical image processing implemented in DIP software

    International Nuclear Information System (INIS)

    Araujo, Erica A.C.; Santana, Ivan E.; Lima, Fernando R.A.; Viera, Jose W.

    2011-01-01

    The anthropomorphic models used in computational dosimetry, also called phantoms, are mostly built from stacks of CT (Computed Tomography) or MRI (Magnetic Resonance Imaging) images obtained from scans of patients or volunteers. The construction of voxel phantoms requires computational processing to transform image formats, stack two-dimensional (2D) images into three-dimensional (3D) arrays, and perform quantization, resampling, enhancement, restoration and image segmentation, among other tasks. The computational dosimetry researcher rarely finds all these capabilities in a single software package, which often slows the development of their research or forces the inadequate use of alternative tools. The need to integrate these various digital image processing tasks, in order to obtain an image that can be used in a computational model of exposure, led to the development of the DIP (Digital Image Processing) software. This software reads, writes and edits binary files containing the 3D matrix corresponding to a stack of cross-sectional images of a given geometry, which can be a human body or another volume of interest. It can also read any common image format and perform conversions. When a task produces a single output image, it is saved in the standard Windows JPEG format. When it produces a stack of images, the binary output file is called SGI (Interactive Graphic Simulations, a symbol already used in other publications of the Research Group in Numerical Dosimetry). This paper presents the third version of the DIP software and emphasizes its newly implemented tools. It currently has the menus Basics, Views, Spatial Domain, Frequency Domain, Segmentations and Study. Each menu contains items and subitems with features that generally take an image as input and produce an image or an attribute as output. (author)
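
    The binary 3D-matrix I/O described above can be sketched in a few lines; the file name, dimensions, dtype and threshold here are assumptions for illustration, not DIP's actual file layout:

```python
import numpy as np

# read a stack of cross-sectional images stored as one raw binary file
nz, ny, nx = 120, 256, 256                 # slices, rows, columns (assumed)
volume = np.fromfile("phantom.sgi", dtype=np.uint8).reshape(nz, ny, nx)

# one representative processing step: segmentation by grey-level threshold
bone = (volume > 180).astype(np.uint8) * 255
bone.tofile("phantom_bone.sgi")            # write the segmented stack back
```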

  20. Practical experience with software tools to assess and improve the quality of existing nuclear analysis and safety codes

    International Nuclear Information System (INIS)

    Marshall, N.H.; Marwil, E.S.; Matthews, S.D.; Stacey, B.J.

    1990-01-01

    Within the constraints of schedule and budget, software tools and techniques were applied to existing FORTRAN codes to determine software quality metrics and to improve code quality. Specifically discussed are INEL experiences in applying pretty printers, cross-reference analyzers, and computer aided software engineering (CASE) tools and techniques. These have provided management with measures of the risk potential of individual program modules so that rational decisions can be made on resource allocation. Selected program modules have been modified to reduce complexity, achieve higher functional independence, and improve code vectorization. (orig.)

  1. Use of a quality improvement tool, the prioritization matrix, to identify and prioritize triage software algorithm enhancement.

    Science.gov (United States)

    North, Frederick; Varkey, Prathiba; Caraballo, Pedro; Vsetecka, Darlene; Bartel, Greg

    2007-10-11

    Complex decision support software can require significant effort in maintenance and enhancement. A quality improvement tool, the prioritization matrix, was successfully used to guide software enhancement of algorithms in a symptom assessment call center.
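
    Mechanically, a prioritization matrix is weighted scoring: each candidate enhancement is rated against weighted criteria and worked in order of weighted sum. A minimal sketch; the criteria, weights, and enhancement names are invented for illustration:

```python
# criteria weights (sum to 1) -- illustrative only
criteria = {"patient_safety": 0.5, "call_volume": 0.3, "effort": 0.2}

# 1-10 ratings of each candidate algorithm enhancement against each criterion
enhancements = {
    "refine chest-pain algorithm": {"patient_safety": 9, "call_volume": 8, "effort": 4},
    "reword fever prompts":        {"patient_safety": 4, "call_volume": 6, "effort": 9},
}

def priority(scores):
    """Weighted sum across all criteria."""
    return sum(criteria[c] * scores[c] for c in criteria)

# rank enhancements from highest to lowest priority
for name, scores in sorted(enhancements.items(), key=lambda kv: -priority(kv[1])):
    print(f"{priority(scores):5.2f}  {name}")
```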

  2. Activity Theory applied to Global Software Engineering: Theoretical Foundations and Implications for Tool Builders

    DEFF Research Database (Denmark)

    Tell, Paolo; Ali Babar, Muhammad

    2012-01-01

    Although a plethora of tools are available for Global Software Engineering (GSE) teams, it is being realized increasingly that the desktop metaphor underpinning the majority of these tools has several inherent limitations. We have proposed that Activity-Based Computing (ABC) can be a pr...... in building supporting infrastructure for GSE, and describe a proof of concept prototype....

  3. Harnessing scientific literature reports for pharmacovigilance. Prototype software analytical tool development and usability testing.

    Science.gov (United States)

    Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-03-22

    We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. A prototype, open source, web-based software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction.
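
    The abstract does not say which disproportionality statistics the prototype computes; a common choice in pharmacovigilance is the proportional reporting ratio (PRR), sketched here from a 2x2 contingency table of drug/event counts such as might be derived from MeSH-indexed citations:

```python
import math

def prr(a, b, c, d):
    """Proportional reporting ratio for a drug-event pair from a 2x2 table:
    a = reports mentioning both drug and event, b = drug without event,
    c = event without drug, d = neither. Returns the PRR and an
    approximate 95% confidence interval."""
    prr_val = (a / (a + b)) / (c / (c + d))
    # standard error of ln(PRR)
    se = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    lo = math.exp(math.log(prr_val) - 1.96 * se)
    hi = math.exp(math.log(prr_val) + 1.96 * se)
    return prr_val, (lo, hi)

# invented counts: PRR ~ 4.8 would usually be flagged as a signal
print(prr(a=20, b=380, c=100, d=9500))
```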

  4. Methods and tools used at the IPSN for the safety assessment of critical software

    International Nuclear Information System (INIS)

    Regnier, P.; Henry, J.Y.

    1998-01-01

    A significant feature of EDF's latest 1400 MWe "N4" generation of pressurized water reactor (PWR) is the extensive use of computerized instrumentation and control, including a fully digital system for the reactor protection function. For the safety assessment of the software driving the operation of this digital reactor protection system, called SPIN, IPSN has developed and implemented a set of methods and tools. Using the lessons learned from this experience, IPSN has worked at improving these methods and tools, mainly by making their use more automatic, and has participated in an international assessment exercise to test other methods and tools, either new products on the market or self-developed products. As a result of this work, this paper presents an up-to-date overview of the IPSN methods and tools used for the assessment of safety-critical software. This assessment, which consists of an analysis of all the documentation associated with the technical specifications and of a representative set of functions, is usually carried out in five steps: (1) critical examination of the documents, (2) evaluation of the quality of the code, (3) determination of the critical software components, (4) development of test cases and choice of testing strategy, and (5) dynamic analysis (consistency and robustness). This paper also presents methods and tools developed or implemented by IPSN in order to: evaluate the completeness and consistency of specification and design documents written in natural language; build a model and simulate specification or design items; evaluate the quality of the source code; carry out FMEA analysis; run the binary code and perform tests (CLAIRE); and perform random or mutational tests. (author)

  5. Conceptualization and software development of a simulation environment for probabilistic safety assessment of radioactive waste repositories

    Energy Technology Data Exchange (ETDEWEB)

    Ghofrani, Javad

    2016-05-26

    Uncertainty and sensitivity analysis of complex simulation models are prominent issues, both in scientific research and in education. ReSUS (Repository Simulation, Uncertainty propagation and Sensitivity analysis) is an integrated platform for performing such analyses with numerical models that simulate THMC (Thermal, Hydraulical, Mechanical and Chemical) coupled processes via different programs, in particular in the context of safety assessments for radioactive waste repositories. This thesis presents the idea behind the ReSUS software platform and its working mechanisms, and describes applications related to the safety assessment of radioactive waste disposal systems. Previous simulation tools (including the preceding version of ReSUS) are analyzed to provide a comprehensive view of the state of the art. In comparison to this state, a more sophisticated software tool is developed here, which provides features not offered by previous simulation tools. To achieve this objective, the ReSUS software platform provides a framework for handling probabilistic data uncertainties using deterministic external simulation tools, thus enhancing uncertainty and sensitivity analysis. The platform performs probabilistic simulations of various models, in particular of THMC coupled processes, using stand-alone deterministic simulation tools. The complete software development process of the ReSUS platform is discussed in this thesis. ReSUS components are developed as libraries, which are capable of being linked to other code implementations. In addition, ASCII template files are used as a means of propagating uncertainty into the input files of deterministic simulation tools. The embedded input sampler and analysis tools allow for sensitivity analysis in several kinds of simulation designs. The novelty of the ReSUS platform consists in the flexibility to assign external stand-alone software

  6. Conceptualization and software development of a simulation environment for probabilistic safety assessment of radioactive waste repositories

    International Nuclear Information System (INIS)

    Ghofrani, Javad

    2016-01-01

    Uncertainty and sensitivity analysis of complex simulation models are prominent issues, both in scientific research and in education. ReSUS (Repository Simulation, Uncertainty propagation and Sensitivity analysis) is an integrated platform for performing such analyses with numerical models that simulate THMC (Thermal, Hydraulical, Mechanical and Chemical) coupled processes via different programs, in particular in the context of safety assessments for radioactive waste repositories. This thesis presents the idea behind the ReSUS software platform and its working mechanisms, and describes applications related to the safety assessment of radioactive waste disposal systems. Previous simulation tools (including the preceding version of ReSUS) are analyzed to provide a comprehensive view of the state of the art. In comparison to this state, a more sophisticated software tool is developed here, which provides features not offered by previous simulation tools. To achieve this objective, the ReSUS software platform provides a framework for handling probabilistic data uncertainties using deterministic external simulation tools, thus enhancing uncertainty and sensitivity analysis. The platform performs probabilistic simulations of various models, in particular of THMC coupled processes, using stand-alone deterministic simulation tools. The complete software development process of the ReSUS platform is discussed in this thesis. ReSUS components are developed as libraries, which are capable of being linked to other code implementations. In addition, ASCII template files are used as a means of propagating uncertainty into the input files of deterministic simulation tools. The embedded input sampler and analysis tools allow for sensitivity analysis in several kinds of simulation designs. The novelty of the ReSUS platform consists in the flexibility to assign external stand-alone software
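
    The ASCII-template mechanism described above can be sketched as follows; the placeholder names, parameter distributions, and solver command are assumptions for illustration, not ReSUS's actual conventions:

```python
import random
import string
import subprocess

# placeholders in an ASCII input template are replaced by sampled values,
# and the deterministic solver is run once per realization
template = string.Template("PERMEABILITY $perm\nPOROSITY $poro\n")

random.seed(0)
for i in range(100):                               # 100 Monte Carlo realizations
    params = {
        "perm": random.lognormvariate(-40, 1.0),   # assumed distribution (m^2)
        "poro": random.uniform(0.05, 0.25),        # assumed distribution (-)
    }
    with open(f"run_{i:03d}.inp", "w") as f:
        f.write(template.substitute(params))
    # hand each realization to the external deterministic code, e.g.:
    # subprocess.run(["solver", f"run_{i:03d}.inp"], check=True)
```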

  7. QCScreen: a software tool for data quality control in LC-HRMS based metabolomics.

    Science.gov (United States)

    Simader, Alexandra Maria; Kluger, Bernhard; Neumann, Nora Katharina Nicole; Bueschl, Christoph; Lemmens, Marc; Lirk, Gerald; Krska, Rudolf; Schuhmacher, Rainer

    2015-10-24

    Metabolomics experiments often comprise large numbers of biological samples, resulting in huge amounts of data. These data need to be inspected for plausibility before evaluation, to detect putative sources of error such as retention time or mass accuracy shifts. Especially in liquid chromatography-high resolution mass spectrometry (LC-HRMS) based metabolomics research, proper quality control checks (e.g. for precision, signal drifts or offsets) are crucial prerequisites to achieve reliable and comparable results within and across experimental measurement sequences. Software tools can support this process. The software tool QCScreen was developed to offer a quick and easy data quality check of LC-HRMS derived data. It allows a flexible investigation and comparison of basic quality-related parameters within user-defined target features, and the possibility to automatically evaluate multiple sample types within or across different measurement sequences in a short time. It offers a user-friendly interface that allows an easy selection of processing steps and parameter settings. The generated results include a coloured overview plot of data quality across all analysed samples and targets and, in addition, detailed illustrations of the stability and precision of the chromatographic separation, the mass accuracy and the detector sensitivity. The use of QCScreen is demonstrated with experimental data from metabolomics experiments using selected standard compounds in pure solvent. The application of the software identified problematic features, samples and analytical parameters and suggested which data files or compounds required closer manual inspection. QCScreen is an open source software tool which provides a useful basis for assessing the suitability of LC-HRMS data prior to time consuming, detailed data processing and subsequent statistical analysis. It accepts the generic mzXML format and thus can be used with many different LC-HRMS platforms to process both multiple
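
    One of the basic precision checks such a tool automates is the relative standard deviation (CV) of each target feature across repeated QC injections. A minimal sketch with invented feature names and an arbitrary 20% threshold:

```python
import numpy as np

def qc_flags(intensities, max_cv=0.2):
    """Flag target features whose coefficient of variation across repeated
    QC injections exceeds a threshold -- one of the precision checks a
    tool like QCScreen automates.

    intensities -- dict mapping feature name to peak areas measured in
                   successive QC samples
    """
    flags = {}
    for feature, values in intensities.items():
        v = np.asarray(values, dtype=float)
        cv = v.std(ddof=1) / v.mean()
        flags[feature] = ("OK" if cv <= max_cv else "CHECK", cv)
    return flags

qc = {"reserpine": [1.00e6, 1.05e6, 0.97e6], "caffeine": [2.1e5, 3.4e5, 1.2e5]}
for name, (status, cv) in qc_flags(qc).items():
    print(f"{name:10s} CV={cv:.2f} {status}")
```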

  8. New Tools for New Literacies Research: An Exploration of Usability Testing Software

    Science.gov (United States)

    Asselin, Marlene; Moayeri, Maryam

    2010-01-01

    Competency in the new literacies of the Internet is essential for participating in contemporary society. Researchers studying these new literacies are recognizing the limitations of traditional methodological tools and adapting new technologies and new media for use in research. This paper reports our exploration of usability testing software to…

  9. Towards a Reference Architecture to Provision Tools as a Service for Global Software Development

    DEFF Research Database (Denmark)

    Chauhan, Aufeef; Babar, Muhammad Ali

    2014-01-01

    Organizations involved in Global Software Development (GSD) face challenges in terms of having access to an appropriate set of tools for performing distributed engineering and development activities, integration between heterogeneous desktop and web-based tools, and management of artifacts developed...... distributed environment. In this paper, we argue the need for a cloud-enabled platform for supporting GSD and propose a reference architecture of a cloud-based Platform for provisioning an ecosystem of Tools as a Service (PTaaS)....

  10. Training, Quality Assurance Factors, and Tools Investigation: a Work Report and Suggestions on Software Quality Assurance

    Science.gov (United States)

    Lee, Pen-Nan

    1991-01-01

    Previously, several research tasks were conducted, some observations were obtained, and several possible suggestions were contemplated involving software quality assurance engineering at NASA Johnson. These research tasks are briefly described, and a brief discussion is given on the role of software quality assurance in software engineering, along with some observations and suggestions. A brief discussion on a training program for software quality assurance engineers is provided. A list of assurance factors as well as quality factors is also included. Finally, a process model which can be used for searching and collecting software quality assurance tools is presented.

  11. Does Investors' Sophistication Affect Persistence and Pricing of Discretionary Accruals?

    OpenAIRE

    Lanfeng Kao

    2007-01-01

    This paper examines whether the sophistication of market investors influences management's strategy on discretionary accounting choice, and thus changes the persistence of discretionary accruals. The results show that the persistence of discretionary accruals for firms faced with naive investors is lower than that for firms faced with sophisticated investors. The results also demonstrate that sophisticated investors indeed incorporate the implications of current earnings components into future ...

  12. Comparison of quality control software tools for diffusion tensor imaging.

    Science.gov (United States)

    Liu, Bilan; Zhu, Tong; Zhong, Jianhui

    2015-04-01

    Image quality of diffusion tensor imaging (DTI) is critical for image interpretation, diagnostic accuracy and efficiency. However, DTI is susceptible to numerous detrimental artifacts that may impair the reliability and validity of the obtained data. Although many quality control (QC) software tools have been developed and are widely used, each with its own tradeoffs, there is still no general agreement on an image quality control routine for DTI, and the practical impact of these tradeoffs is not well studied. An objective comparison that identifies the pros and cons of each QC tool will help users make the best choice among tools for specific DTI applications. This study aims to quantitatively compare the effectiveness of three popular QC tools: DTI studio (Johns Hopkins University), DTIprep (University of North Carolina at Chapel Hill, University of Iowa and University of Utah) and TORTOISE (National Institute of Health). Both synthetic and in vivo human brain data were used to quantify the adverse effects of major DTI artifacts on tensor calculation, as well as the effectiveness of the different QC tools in identifying and correcting these artifacts. The technical basis of each tool is discussed, and the ways in which particular techniques affect the output of each tool are analyzed. The different functions and I/O formats that the three QC tools provide for building a general DTI processing pipeline and for integration with other popular image processing tools are also discussed. Copyright © 2015 Elsevier Inc. All rights reserved.

  13. Genomic Islands: an overview of current software tools and future improvements

    Directory of Open Access Journals (Sweden)

    Soares Siomar de Castro

    2016-03-01

    Full Text Available Microbes are highly diverse and widely distributed organisms. They account for ~60% of Earth’s biomass, and new predictions point to the existence of 10^11 to 10^12 species, which are constantly sharing genes through several different mechanisms. Genomic Islands (GI) are critical in this context, as they are large regions acquired through horizontal gene transfer. Also, they present common features like genomic signature deviation, transposase genes, flanking tRNAs and insertion sequences. GIs carry large numbers of genes related to specific lifestyles and are commonly classified as Pathogenicity, Resistance, Metabolic or Symbiotic Islands. With the advent of next-generation sequencing technologies and the deluge of genomic data, many software tools have been developed to tackle the problem of GI prediction, all based on the prediction of common GI features. However, there is still room for the development of new software tools that implement new approaches, such as machine learning and pangenomics-based analyses. Finally, GIs will always hold a potential application in every newly invented genomic approach, as they are directly responsible for much of the genomic plasticity of bacteria.

  14. Genomic Islands: an overview of current software tools and future improvements.

    Science.gov (United States)

    Soares, Siomar de Castro; Oliveira, Letícia de Castro; Jaiswal, Arun Kumar; Azevedo, Vasco

    2016-03-01

    Microbes are highly diverse and widely distributed organisms. They account for ~60% of Earth's biomass, and new predictions point to the existence of 10^11 to 10^12 species, which are constantly sharing genes through several different mechanisms. Genomic Islands (GI) are critical in this context, as they are large regions acquired through horizontal gene transfer. Also, they present common features like genomic signature deviation, transposase genes, flanking tRNAs and insertion sequences. GIs carry large numbers of genes related to specific lifestyles and are commonly classified as Pathogenicity, Resistance, Metabolic or Symbiotic Islands. With the advent of next-generation sequencing technologies and the deluge of genomic data, many software tools have been developed to tackle the problem of GI prediction, all based on the prediction of common GI features. However, there is still room for the development of new software tools that implement new approaches, such as machine learning and pangenomics-based analyses. Finally, GIs will always hold a potential application in every newly invented genomic approach, as they are directly responsible for much of the genomic plasticity of bacteria.
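
    Of the common GI features listed above, genomic signature deviation is the easiest to illustrate. A toy sketch using windowed GC content as the simplest possible "signature"; the window size and 2-sigma cutoff are arbitrary, and real predictors use k-mer signatures together with the other features:

```python
import random
import statistics

def gc(seq):
    """Fraction of G and C bases in a sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def gi_candidates(genome, window=5000, step=1000, z_cut=2.0):
    """Flag windows whose GC content deviates from the genome-wide mean by
    more than z_cut standard deviations -- a toy stand-in for the genomic
    signature deviation used by GI predictors."""
    windows = [(i, gc(genome[i:i + window]))
               for i in range(0, len(genome) - window + 1, step)]
    values = [v for _, v in windows]
    mu, sd = statistics.mean(values), statistics.stdev(values)
    return [(i, v) for i, v in windows if abs(v - mu) > z_cut * sd]

random.seed(0)
genome = "".join(random.choice("ACGT") for _ in range(60000))
genome = genome[:20000] + "GC" * 3000 + genome[26000:]  # implant a GC-rich island
print([i for i, _ in gi_candidates(genome)])            # windows near the implant
```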

  15. InterFace: A software package for face image warping, averaging, and principal components analysis.

    Science.gov (United States)

    Kramer, Robin S S; Jenkins, Rob; Burton, A Mike

    2017-12-01

    We describe InterFace, a software package for research in face recognition. The package supports image warping, reshaping, averaging of multiple face images, and morphing between faces. It also supports principal components analysis (PCA) of face images, along with tools for exploring the "face space" produced by PCA. The package uses a simple graphical user interface, allowing users to perform these sophisticated image manipulations without any need for programming knowledge. The program is available for download in the form of an app, which requires that users also have access to the (freely available) MATLAB Runtime environment.
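
    The PCA step is the classical eigenfaces construction: flatten each image, subtract the mean face, and use the leading singular vectors as axes of the face space. A minimal NumPy sketch (InterFace itself is a MATLAB app, and the image shapes here are illustrative):

```python
import numpy as np

def face_pca(images, n_components=10):
    """PCA of face images ("eigenfaces"): rows of `components` span the
    face space, and `coords` gives each face's position in that space.

    images -- array of shape (n_faces, height, width), already aligned
    """
    X = images.reshape(len(images), -1).astype(float)
    mean_face = X.mean(axis=0)
    Xc = X - mean_face                      # center on the mean face
    # SVD of the centered data; rows of Vt are the eigenfaces
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:n_components]
    coords = Xc @ components.T
    return mean_face, components, coords

rng = np.random.default_rng(0)
faces = rng.random((20, 64, 48))            # stand-in for aligned face images
mean_face, comps, coords = face_pca(faces)
print(coords.shape)                         # (20, 10)
```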

  16. Creating a strategic plan for configuration management using computer aided software engineering (CASE) tools

    International Nuclear Information System (INIS)

    Smith, P.R.; Sarfaty, R.

    1993-01-01

    This paper provides guidance in the definition, documentation, measurement, enhancement of processes, and validation of a strategic plan for configuration management (CM). The approach and methodology used in establishing a strategic plan are the same for any enterprise, including the Department of Energy (DOE), commercial nuclear plants, the Department of Defense (DOD), or large industrial complexes. The principles and techniques presented are used worldwide by some of the largest corporations. The authors used industry knowledge and the areas of their current employment to illustrate and provide examples. Developing a strategic configuration and information management plan for DOE Idaho Field Office (DOE-ID) facilities is discussed in this paper. A good knowledge of CM principles is the key to successful strategic planning. This paper describes and defines CM elements, and discusses how CM integrates the facility's physical configuration, design basis, and documentation. The strategic plan does not need the support of a computer aided software engineering (CASE) tool. However, the use of a CASE tool provides a methodology for consistency in approach, graphics, and database capability, combined to form an encyclopedia and a method of presentation that is easily understood and aids the process of reengineering. CASE tools have much more capability than stated above. Some examples are supporting a joint application development (JAD) group to prepare a software functional specification document and, if necessary, providing the capability to automatically generate software application code. This paper briefly discusses the characteristics and capabilities of two CASE tools that use different methodologies to generate similar deliverables

  17. The Need for Systematic Naming Software Tools for Exchange of Chemical Information

    Directory of Open Access Journals (Sweden)

    Andrey Yerin

    1999-09-01

    Full Text Available The availability of systematic names can enable the simple textual exchange of chemical structure information. The exchange of molecular structures in graphical format or connection tables has become well established in the field of cheminformatics and many structure drawing tools exist to enable this exchange. However, even with the availability of systematic naming rules, software tools to allow the generation of names from structures, and hopefully the reversal of these systematic names back to the original chemical structure, have been sorely lacking in capability and quality. Here we review the need for systematic naming as well as some of the tools and approaches being taken today in this area.

  18. Software Tools for Emittance Measurement and Matching for 12 GeV CEBAF

    Energy Technology Data Exchange (ETDEWEB)

    Turner, Dennis L. [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States)

    2016-05-01

    This paper discusses model-driven setup of the Continuous Electron Beam Accelerator Facility (CEBAF) for the 12 GeV era, focusing on qsUtility, a set of software tools created to perform emittance measurements, analyze those measurements, and compute optics corrections based upon the measurements. qsUtility was developed as a toolset to reduce machine configuration time, improve reproducibility by way of an accurate accelerator model, and provide Operations staff with tools to measure and correct machine optics with little or no assistance from optics experts.
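
    A textbook thin-lens quadrupole-scan analysis gives the flavor of what such a toolset automates (a simplified stand-in, not qsUtility's model-driven computation): the squared beam size at a screen a drift L downstream is quadratic in the integrated quad strength, and the fitted parabola yields the beam matrix and hence the emittance.

```python
import numpy as np

def quad_scan_emittance(k, sigma_x, L):
    """Thin-lens quad-scan analysis: fit sigma_x^2 versus integrated quad
    strength k with a parabola, recover the beam matrix elements at the
    quad, and return the geometric emittance sqrt(<x^2><x'^2> - <xx'>^2).

    k       -- integrated quadrupole strengths (1/m)
    sigma_x -- RMS beam sizes measured at a screen a drift L downstream (m)
    """
    A, B, C = np.polyfit(k, np.asarray(sigma_x, dtype=float) ** 2, 2)
    s11 = A / L**2                        # <x^2> at the quad
    s12 = -B / (2 * L**2) - s11 / L       # <x x'>
    s22 = (C - s11 - 2 * L * s12) / L**2  # <x'^2>
    return np.sqrt(s11 * s22 - s12**2)

# synthetic self-test: generate beam sizes from a known beam matrix
L, s11, s12, s22 = 2.0, 1e-6, -5e-7, 1e-6
ks = np.linspace(-2.0, 2.0, 9)
sizes = np.sqrt((1 - L * ks) ** 2 * s11 + 2 * (1 - L * ks) * L * s12 + L**2 * s22)
print(quad_scan_emittance(ks, sizes, L))  # ~8.66e-07, i.e. sqrt(7.5e-13)
```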

  19. GMFilter and SXTestPlate: software tools for improving the SNPlex™ genotyping system

    Directory of Open Access Journals (Sweden)

    Schreiber Stefan

    2009-03-01

    Full Text Available Background: Genotyping of single-nucleotide polymorphisms (SNPs) is a fundamental technology in modern genetics. The SNPlex™ mid-throughput genotyping system (Applied Biosystems, Foster City, CA, USA) enables the multiplexed genotyping of up to 48 SNPs simultaneously in a single DNA sample. The high level of automation and the large amount of data produced in a high-throughput laboratory require advanced software tools for quality control and workflow management. Results: We have developed two programs, which address two main aspects of quality control in a SNPlex™ genotyping environment: GMFilter improves the analysis of SNPlex™ plates by removing wells with a low overall signal intensity. It enables scientists to automatically process the raw data in a standardized way before analyzing a plate with the proprietary GeneMapper software from Applied Biosystems. SXTestPlate examines the genotype concordance of a SNPlex™ test plate, which was typed with a control SNP set. This program allows for regular quality control checks of a SNPlex™ genotyping platform. It is compatible with other genotyping methods as well. Conclusion: GMFilter and SXTestPlate provide a valuable tool set for laboratories engaged in genotyping based on the SNPlex™ system. The programs enhance the analysis of SNPlex™ plates with the GeneMapper software and enable scientists to evaluate the performance of their genotyping platform.

  20. The conceptualization and measurement of cognitive health sophistication.

    Science.gov (United States)

    Bodie, Graham D; Collins, William B; Jensen, Jakob D; Davis, Lashara A; Guntzviller, Lisa M; King, Andy J

    2013-01-01

    This article develops a conceptualization and measure of cognitive health sophistication--the complexity of an individual's conceptual knowledge about health. Study 1 provides initial validity evidence for the measure--the Healthy-Unhealthy Other Instrument--by showing its association with other cognitive health constructs indicative of higher health sophistication. Study 2 presents data from a sample of low-income adults to provide evidence that the measure does not depend heavily on health-related vocabulary or ethnicity. Results from both studies suggest that the Healthy-Unhealthy Other Instrument can be used to capture variability in the sophistication or complexity of an individual's health-related schematic structures on the basis of responses to two simple open-ended questions. Methodological advantages of the Healthy-Unhealthy Other Instrument and suggestions for future research are highlighted in the discussion.

  1. A parallel and sensitive software tool for methylation analysis on multicore platforms.

    Science.gov (United States)

    Tárraga, Joaquín; Pérez, Mariano; Orduña, Juan M; Duato, José; Medina, Ignacio; Dopazo, Joaquín

    2015-10-01

    DNA methylation analysis suffers from very long processing times, as the advent of Next-Generation Sequencers has shifted the bottleneck of genomic studies from the sequencers that obtain the DNA samples to the software that performs the analysis of these samples. The existing software for methylation analysis does not seem to scale efficiently with either the size of the dataset or the length of the reads to be analyzed. As sequencers are expected to provide longer and longer reads in the near future, efficient and scalable methylation software should be developed. We present a new software tool, called HPG-Methyl, which efficiently maps bisulphite sequencing reads onto DNA, analyzing DNA methylation. The strategy used by this software consists of leveraging the speed of the Burrows-Wheeler Transform to map a large number of DNA fragments (reads) rapidly, as well as the accuracy of the Smith-Waterman algorithm, which is employed exclusively to deal with the most ambiguous and shortest reads. Experimental results on platforms with Intel multicore processors show that HPG-Methyl significantly outperforms state-of-the-art software such as Bismark, BS-Seeker or BSMAP in both execution time and sensitivity, particularly for long bisulphite reads. The software is provided in the form of C libraries and functions, together with instructions to compile and execute it. Available by sftp to anonymous@clariano.uv.es (password 'anonymous'). juan.orduna@uv.es or jdopazo@cipf.es. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
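
    The core trick in bisulphite read mapping, common to this family of tools (a conceptual sketch, not HPG-Methyl's actual code): unmethylated cytosines read as T after bisulphite treatment, so both read and reference are C-to-T converted before alignment, and methylation is then called wherever the original read still shows C against a reference C.

```python
def ct_convert(seq):
    """In-silico bisulphite conversion of the forward strand: every C is
    read as T unless it was methylated, so the aligner works on C->T
    converted reads and reference."""
    return seq.upper().replace("C", "T")

def call_methylation(aligned_read, ref):
    """Per-cytosine methylation calls for a read already aligned to `ref`
    (gapless alignment assumed for simplicity)."""
    calls = []
    for r, g in zip(aligned_read.upper(), ref.upper()):
        if g == "C":  # reference cytosine: C in the read means protected
            calls.append("methylated" if r == "C" else "unmethylated")
    return calls

ref  = "ACGTCCGATC"
read = "ATGTTCGATT"   # only the C at index 5 survived conversion
print(ct_convert(ref))              # reference as the aligner sees it
print(call_methylation(read, ref))
```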

  2. Open source hardware and software platform for robotics and artificial intelligence applications

    Science.gov (United States)

    Liang, S. Ng; Tan, K. O.; Lai Clement, T. H.; Ng, S. K.; Mohammed, A. H. Ali; Mailah, Musa; Azhar Yussof, Wan; Hamedon, Zamzuri; Yussof, Zulkifli

    2016-02-01

    Recent developments in open source hardware and software platforms (Android, Arduino, Linux, OpenCV, etc.) have enabled the rapid development of previously expensive and sophisticated systems at a lower budget and with flatter learning curves for developers. Using these platforms, we designed and developed a Java-based 3D robotic simulation system with a graph database, which is integrated in online and offline modes with an Android-Arduino based rubbish-picking remote control car. The combination of the open source hardware and software systems created a flexible and expandable platform for further developments in the future, both in the software and hardware areas, in particular in combination with graph databases for artificial intelligence, as well as more sophisticated hardware such as legged or humanoid robots.

  3. Open source hardware and software platform for robotics and artificial intelligence applications

    International Nuclear Information System (INIS)

    Liang, S Ng; Tan, K O; Clement, T H Lai; Ng, S K; Mohammed, A H Ali; Mailah, Musa; Yussof, Wan Azhar; Hamedon, Zamzuri; Yussof, Zulkifli

    2016-01-01

    Recent developments in open source hardware and software platforms (Android, Arduino, Linux, OpenCV, etc.) have enabled the rapid development of previously expensive and sophisticated systems at a lower budget and with flatter learning curves for developers. Using these platforms, we designed and developed a Java-based 3D robotic simulation system with a graph database, which is integrated in online and offline modes with an Android-Arduino based rubbish-picking remote control car. The combination of the open source hardware and software systems created a flexible and expandable platform for further developments in the future, both in the software and hardware areas, in particular in combination with graph databases for artificial intelligence, as well as more sophisticated hardware such as legged or humanoid robots. (paper)

  4. Real-time animation software for customized training to use motor prosthetic systems.

    Science.gov (United States)

    Davoodi, Rahman; Loeb, Gerald E

    2012-03-01

    Research on the control of human movement and the development of tools for restoration and rehabilitation of movement after spinal cord injury and amputation can benefit greatly from software tools for creating precisely timed animation sequences of human movement. Despite their ability to create sophisticated animation and high-quality rendering, existing animation packages are not adapted for application to neural prostheses and rehabilitation of human movement. We have developed a software tool known as MSMS (MusculoSkeletal Modeling Software) that can be used to develop models of human or prosthetic limbs and the objects with which they interact, and to animate their movement using motion data from a variety of offline and online sources. The motion data can be read from a motion file containing synthesized motion data or recordings from a motion capture system. Alternatively, motion data can be streamed online from a real-time motion capture system, a physics-based simulation program, or any program that can produce real-time motion data. Further, animation sequences of daily life activities can be constructed using the intuitive user interface of Microsoft's PowerPoint software. The latter allows expert and nonexpert users alike to assemble primitive movements into a complex motion sequence with precise timing by simply arranging the order of the slides and editing their properties in PowerPoint. The resulting motion sequence can be played back in an open-loop manner for demonstration and training, or in closed-loop virtual reality environments where the timing and speed of animation depend on user inputs. These versatile animation utilities can be used in any application that requires precisely timed animations, but they are particularly suited for research and rehabilitation of movement disorders. MSMS's modeling and animation tools are routinely used in a number of research laboratories around the country to study the control of movement and to develop and test

  5. Development of Safety-Critical Software for Nuclear Power Plant using a CASE Tool

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chang Ho; Oh, Do Young; Kim, Koh Eun; Choi, Woong Seock; Sohn, Se Do; Kim, Jae Hack; Kim, Hang Bae [KEPCO E and C, Daejeon (Korea, Republic of)

    2011-08-15

    The Integrated SOftware Development Environment (ISODE) was developed to provide the major software life cycle processes: the development process, V/V process, requirements traceability process, automated document generation process, and the process of importing targets to the Programmable Logic Controller (PLC) platform. It provides critical safety software developers with a certified, domain-optimized, model-based development environment, and the associated services to reduce the time and effort needed to develop software, such as debugging, simulation, code generation and document generation. It also provides critical safety software verifiers with integrated V/V features for each phase of the software life cycle, using appropriate tools such as model test coverage, formal verification, and automated report generation. In addition to development and verification, ISODE gives a complete traceability solution from the software design phase to the testing phase. Using this information, coverage and impact analysis can be done easily whenever software modification is necessary. The final source codes of ISODE are imported into the newly developed PLC environment on a module basis, after being automatically converted into the format required by the PLC. Additional tests at the module and unit level are performed on the target platform.

  6. Development of Safety-Critical Software for Nuclear Power Plant using a CASE Tool

    International Nuclear Information System (INIS)

    Kim, Chang Ho; Oh, Do Young; Kim, Koh Eun; Choi, Woong Seock; Sohn, Se Do; Kim, Jae Hack; Kim, Hang Bae

    2011-01-01

    The Integrated SOftware Development Environment (ISODE) was developed to provide the major software life cycle processes: the development process, V/V process, requirements traceability process, automated document generation process, and the process of importing targets to the Programmable Logic Controller (PLC) platform. It provides critical safety software developers with a certified, domain-optimized, model-based development environment, and the associated services to reduce the time and effort needed to develop software, such as debugging, simulation, code generation and document generation. It also provides critical safety software verifiers with integrated V/V features for each phase of the software life cycle, using appropriate tools such as model test coverage, formal verification, and automated report generation. In addition to development and verification, ISODE gives a complete traceability solution from the software design phase to the testing phase. Using this information, coverage and impact analysis can be done easily whenever software modification is necessary. The final source codes of ISODE are imported into the newly developed PLC environment on a module basis, after being automatically converted into the format required by the PLC. Additional tests at the module and unit level are performed on the target platform.

  7. Obfuscation, Learning, and the Evolution of Investor Sophistication

    OpenAIRE

    Bruce Ian Carlin; Gustavo Manso

    2011-01-01

    Investor sophistication has lagged behind the growing complexity of retail financial markets. To explore this, we develop a dynamic model to study the interaction between obfuscation and investor sophistication in mutual fund markets. Taking into account different learning mechanisms within the investor population, we characterize the optimal timing of obfuscation for financial institutions who offer retail products. We show that educational initiatives that are directed to facilitate learnin...

  8. Building an asynchronous web-based tool for machine learning classification.

    Science.gov (United States)

    Weber, Griffin; Vinterbo, Staal; Ohno-Machado, Lucila

    2002-01-01

    Various unsupervised and supervised learning methods including support vector machines, classification trees, linear discriminant analysis and nearest neighbor classifiers have been used to classify high-throughput gene expression data. Simpler and more widely accepted statistical tools have not yet been used for this purpose, hence proper comparisons between classification methods have not been conducted. We developed free software that implements logistic regression with stepwise variable selection as a quick and simple method for initial exploration of important genetic markers in disease classification. To implement the algorithm and allow our collaborators in remote locations to evaluate and compare its results against those of other methods, we developed a user-friendly asynchronous web-based application with a minimal amount of programming using free, downloadable software tools. With this program, we show that classification using logistic regression can perform as well as other more sophisticated algorithms, and it has the advantages of being easy to interpret and reproduce. By making the tool freely and easily available, we hope to promote the comparison of classification methods. In addition, we believe our web application can be used as a model for other bioinformatics laboratories that need to develop web-based analysis tools in a short amount of time and on a limited budget.
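
    A sketch of the same idea with modern off-the-shelf tools rather than the authors' web application: forward stepwise selection of a few "genes" feeding a logistic regression, run on synthetic stand-in expression data:

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# synthetic stand-in for expression data: 200 samples x 50 genes,
# 5 of which are actually informative for the class label
X, y = make_classification(n_samples=200, n_features=50,
                           n_informative=5, random_state=0)

clf = LogisticRegression(max_iter=1000)
model = make_pipeline(
    # forward stepwise variable selection, greedily adding one gene at a time
    SequentialFeatureSelector(clf, n_features_to_select=5, direction="forward"),
    clf,
)
print(cross_val_score(model, X, y, cv=5).mean())  # cross-validated accuracy
```

    The coefficients of the selected genes remain directly interpretable as log-odds contributions, which is the advantage the abstract emphasizes over more opaque classifiers.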

  9. Utilization of Software Tools for Uncertainty Calculation in Measurement Science Education

    International Nuclear Information System (INIS)

    Zangl, Hubert; Zine-Zine, Mariam; Hoermaier, Klaus

    2015-01-01

    Despite its importance, uncertainty is often neglected by practitioners in the design of systems, even in safety-critical applications. Thus, problems arising from uncertainty may only be identified late in the design process and lead to additional costs. Although numerous tools exist to support uncertainty calculation, reasons for their limited usage in early design phases may be low awareness of the existence of these tools and insufficient training in their practical application. We present a teaching philosophy that addresses uncertainty from the very beginning of teaching measurement science, in particular with respect to the utilization of software tools. The developed teaching material is based on the GUM method and makes use of uncertainty toolboxes in the simulation environment. Based on examples in measurement science education, we discuss advantages and disadvantages of the proposed teaching philosophy and include feedback from students.
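
    The GUM method underlying the material propagates input uncertainties through first-order sensitivity coefficients. A minimal sketch with numerically estimated sensitivities and an invented measurement model:

```python
import numpy as np

def gum_combined_uncertainty(f, x, u, h=1e-6):
    """First-order GUM propagation for uncorrelated inputs:
    u_c^2 = sum_i (df/dx_i)^2 * u_i^2, with the sensitivity coefficients
    estimated by central finite differences."""
    x = np.asarray(x, dtype=float)
    u = np.asarray(u, dtype=float)
    uc2 = 0.0
    for i in range(len(x)):
        dx = np.zeros_like(x)
        dx[i] = h * max(abs(x[i]), 1.0)
        ci = (f(x + dx) - f(x - dx)) / (2 * dx[i])  # sensitivity coefficient
        uc2 += (ci * u[i]) ** 2
    return np.sqrt(uc2)

# invented example: power P = V^2 / R with V = 10 V (u = 0.1 V)
# and R = 50 ohm (u = 0.5 ohm); result ~0.045 W
P = lambda x: x[0] ** 2 / x[1]
print(gum_combined_uncertainty(P, x=[10.0, 50.0], u=[0.1, 0.5]))
```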

  10. QUALITY SERVICES EVALUATION MODEL BASED ON DEDICATED SOFTWARE TOOL

    Directory of Open Access Journals (Sweden)

    ANDREEA CRISTINA IONICĂ

    2012-10-01

    Full Text Available In this paper we introduce a new model, called Service Quality (SQ), which combines the QFD and SERVQUAL methods. This model takes from the SERVQUAL method the five dimensions of requirements and three of characteristics, and from the QFD method the application methodology. The originality of the SQ model consists in computing a global index that reflects the level to which the quality characteristics fulfil the customers' requirements. In order to prove the viability of the SQ model, a software tool was developed and applied to the evaluation of a health care services provider.

  11. Advanced software tools for digital loose part monitoring systems

    International Nuclear Information System (INIS)

    Ding, Y.

    1996-01-01

    The paper describes two software modules as analysis tools for digital loose part monitoring systems. The first module, the acoustic module, utilizes the multimedia features of modern personal computers to replay the digitally stored short-time bursts at sufficient length and in good quality; this is possible due to the so-called puzzle technique developed at ISTec. The second module, the classification module, calculates advanced burst parameters and classifies the acoustic events into pre-defined classes with the help of an artificial multi-layer perceptron neural network trained with the back propagation algorithm. (author). 7 refs, 7 figs

  12. Object-Oriented Software Tools for the Construction of Preconditioners

    Directory of Open Access Journals (Sweden)

    Eva Mossberg

    1997-01-01

    Full Text Available In recent years, there has been considerable progress concerning preconditioned iterative methods for large and sparse systems of equations arising from the discretization of differential equations. Such methods are particularly attractive in the context of high-performance (parallel) computers. However, the implementation of a preconditioner is a nontrivial task. The focus of the present contribution is on a set of object-oriented software tools that support the construction of a family of preconditioners based on fast transforms. By combining objects of different classes, it is possible to conveniently construct any preconditioner within this family.
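
    In the object-oriented style the paper advocates, a preconditioner is an object exposing only its action M^-1 v to the iterative solver. A minimal SciPy sketch using a simple Jacobi preconditioner (the paper's family is based on fast transforms, which this sketch does not implement):

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import LinearOperator, cg

# 1D Poisson matrix as a test system
n = 1000
A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

class JacobiPreconditioner(LinearOperator):
    """Preconditioner object: the solver only ever calls its matvec,
    which applies M^-1 = diag(A)^-1."""
    def __init__(self, A):
        super().__init__(dtype=A.dtype, shape=A.shape)
        self.inv_diag = 1.0 / A.diagonal()
    def _matvec(self, v):
        return self.inv_diag * v.ravel()

x, info = cg(A, b, M=JacobiPreconditioner(A))
print(info, np.linalg.norm(A @ x - b))   # info == 0 means converged
```

    Encapsulating the preconditioner behind a single matvec interface is what lets different preconditioner classes be combined and swapped without touching the solver, which is the design point of the paper.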

  13. Advanced software tools for digital loose part monitoring systems

    Energy Technology Data Exchange (ETDEWEB)

    Ding, Y [Institute for Safety Technology (ISTec) GmbH, Garching (Germany)

    1997-12-31

    The paper describes two software modules as analysis tools for digital loose part monitoring systems. The first module, the acoustic module, utilizes the multimedia features of modern personal computers to replay the digitally stored short-time bursts at sufficient length and in good quality; this is possible due to the so-called puzzle technique developed at ISTec. The second module, the classification module, calculates advanced burst parameters and classifies the acoustic events into pre-defined classes with the help of an artificial multi-layer perceptron neural network trained with the back propagation algorithm. (author). 7 refs, 7 figs.

  14. Probabilistic Sophistication, Second Order Stochastic Dominance, and Uncertainty Aversion

    OpenAIRE

    Simone Cerreia-Vioglio; Fabio Maccheroni; Massimo Marinacci; Luigi Montrucchio

    2010-01-01

    We study the interplay of probabilistic sophistication, second order stochastic dominance, and uncertainty aversion, three fundamental notions in choice under uncertainty. In particular, our main result, Theorem 2, characterizes uncertainty averse preferences that satisfy second order stochastic dominance, as well as uncertainty averse preferences that are probabilistically sophisticated.

  15. Wiki as a Corporate Learning Tool: Case Study for Software Development Company

    Science.gov (United States)

    Milovanovic, Milos; Minovic, Miroslav; Stavljanin, Velimir; Savkovic, Marko; Starcevic, Dusan

    2012-01-01

    In our study, we attempted to further investigate how Web 2.0 technologies influence workplace learning. Our particular interest was on using Wiki as a tool for corporate exchange of knowledge with the focus on informal learning. In this study, we collaborated with a multinational software development company that uses Wiki as a corporate tool…

  16. Academic software development tools and techniques (Report on the 1st Workshop WASDeTT at ECOOP 2008)

    NARCIS (Netherlands)

    Wuyts, R.; Kienle, H.M.; Mens, K.; Brand, van den M.G.J.; Kuhn, A.; Eugster, P.

    2009-01-01

    The objective of the 1st International Workshop on Advanced Software Development Tools and Techniques (WASDeTT-1) was to provide interested researchers with a forum to share their tool building experiences and to explore how tools can be built more effectively and efficiently. The theme for this

  17. A software tool for evaluation of hydrogen ingress in CANDU pressure tubes

    International Nuclear Information System (INIS)

    Mihalache, Maria; Vasile, Radu; Deaconu, Mariea

    2009-01-01

    The prediction of hydrogen isotope concentrations in the body and in the rolled joints of operating pressure tubes, as a function of reactor hot hours, is very important in many fitness-for-service assessments and end-of-life estimates. The rolled joints are high-stress zones with a potential for delayed hydride cracking. Predictive models for assessing the long-term deuterium ingress in both the body and the rolled joints of the pressure tubes have been implemented in a software tool, ROHID, developed at INR-Pitesti. ROHID is a PC-based Windows application with a user-friendly interface that predicts the equivalent hydrogen ingress for Zr-2.5Nb pressure tubes. It uses colour-coded reactor core maps to display the predicted deuterium concentration as a function of time for selected axial locations. Plots of deuterium versus axial location and time for individual pressure tubes are also available. The software tool can also predict when the hydrogen terminal solid solubility (HTSS) is exceeded during hydride precipitation and dissolution, as a function of time and axial location. (authors)
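
    The HTSS check reduces to comparing a predicted concentration against an Arrhenius-type solubility limit, TSS(T) = A * exp(-Q / (R * T)). A sketch with placeholder coefficients (ROHID's calibrated values for Zr-2.5Nb are not reproduced here):

```python
import math

A_WPPM = 1.0e5        # pre-exponential factor (wppm) -- placeholder value
Q_J_MOL = 34500.0     # dissolution enthalpy (J/mol) -- placeholder value
R = 8.314             # gas constant (J/mol/K)

def tss_wppm(T_kelvin):
    """Arrhenius-type terminal solid solubility limit at temperature T."""
    return A_WPPM * math.exp(-Q_J_MOL / (R * T_kelvin))

def exceeds_htss(conc_wppm, T_kelvin):
    """True if the predicted hydrogen-equivalent concentration exceeds
    the solubility limit, i.e. hydrides can be present."""
    return conc_wppm > tss_wppm(T_kelvin)

# e.g. 60 wppm at 250 C (523 K): solubility ~36 wppm, so hydrides form
print(tss_wppm(523.15), exceeds_htss(60.0, 523.15))
```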

  18. A software tool for soil clean-up technology selection

    International Nuclear Information System (INIS)

    Vranes, S.; Gonzalez-Valencia, E.; Lodolo, A.; Miertus, S.

    2002-01-01

    Soil remediation is a difficult, time-consuming and expensive operation. A variety of mature and emerging soil remediation technologies is available, and future trends in remediation will include continued competition among environmental service companies and technology developers, which will result in a further increase in clean-up options. Consequently, the demand has grown for decision support tools that could help decision makers select the most appropriate technology for a specific contaminated site, before costly remedial actions are taken. Therefore, a software tool for soil clean-up technology selection is currently being developed with the aim of working closely with human decision makers (site owners, local community representatives, environmentalists, regulators, etc.) to assess the available technologies and preliminarily select the preferred remedial options. The analysis for the identification of the best remedial options is based on technical, financial, environmental, and social criteria. These criteria are ranked by all involved parties to determine their relative importance for a particular project. (author)

  19. DASAO: software tool for the management of safeguards, waste and decommissioning

    International Nuclear Information System (INIS)

    Noynaert, Luc; Verwaest, Isi; Libon, Henri; Cuchet, Jean-Marie

    2013-01-01

    Decommissioning of nuclear facilities is a complex process involving operations such as detailed surveys, decontamination and dismantling of equipment, demolition of buildings and management of the resulting waste and nuclear materials, if any. This process takes place in a well-developed legal framework and is controlled and followed up by stakeholders like the Safety Authority, the radwaste management agency and the safeguards organism. In the framework of its nuclear waste and decommissioning program, and more specifically the decommissioning of the BR3 reactor, SCK-CEN has developed different software tools to ensure waste and material traceability, to support the sound management of the decommissioning project and to facilitate control and follow-up by the stakeholders. In the case of Belgium, these are the Federal Agency for Nuclear Control, the National Agency for radioactive waste management and fissile material, and EURATOM and IAEA. In 2005, Belgonucleaire decided to shut down its Dessel MOX fuel fabrication plant, and production stopped in 2006. According to the final decommissioning plan ('PDF') approved by NIRAS, the decommissioning works should start in 2008 at the earliest. In 2006, the management of Belgonucleaire identified the need for an integrated database and decided to entrust SCK-CEN with its development, because SCK-CEN could rely on previous experience with comparable applications already approved by authorities such as NIRAS, FANC and EURATOM. The main objectives of this integrated software tool are: - simplified and updated safeguards; - waste and material traceability; - computerized documentation; - support to project management; - periodic and final reporting to waste and safety authorities. The software, called DASAO (Database for Safeguards, Waste and Decommissioning), was successfully commissioned in 2008 and extensively used from 2009, to the satisfaction of Belgonucleaire and the stakeholders. SCK-CEN is

  20. Tool Support for Software Lookup Table Optimization

    Directory of Open Access Journals (Sweden)

    Chris Wilcox

    2011-01-01

    Full Text Available A number of scientific applications are performance-limited by expressions that repeatedly call costly elementary functions. Lookup table (LUT optimization accelerates the evaluation of such functions by reusing previously computed results. LUT methods can speed up applications that tolerate an approximation of function results, thereby achieving a high level of fuzzy reuse. One problem with LUT optimization is the difficulty of controlling the tradeoff between performance and accuracy. The current practice of manual LUT optimization adds programming effort by requiring extensive experimentation to make this tradeoff, and such hand tuning can obfuscate algorithms. In this paper we describe a methodology and tool implementation to improve the application of software LUT optimization. Our Mesa tool implements source-to-source transformations for C or C++ code to automate the tedious and error-prone aspects of LUT generation such as domain profiling, error analysis, and code generation. We evaluate Mesa with five scientific applications. Our results show a performance improvement of 3.0× and 6.9× for two molecular biology algorithms, 1.4× for a molecular dynamics program, 2.1× to 2.8× for a neural network application, and 4.6× for a hydrology calculation. We find that Mesa enables LUT optimization with more control over accuracy and less effort than manual approaches.
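
    Mesa itself performs source-to-source transformations on C or C++ code; the Python sketch below only illustrates the underlying LUT idea and its performance/accuracy tradeoff. The function, domain and table sizes are illustrative.

    ```python
    import numpy as np

    # The "expensive" elementary function, approximated by linear interpolation
    # in a precomputed table: larger tables trade memory for accuracy.
    def build_lut(f, lo, hi, n):
        xs = np.linspace(lo, hi, n)
        return xs, f(xs)

    f = lambda x: np.exp(-x * x)
    x = np.random.uniform(0.0, 4.0, 100_000)

    for n in (64, 256, 1024):
        xs, ys = build_lut(f, 0.0, 4.0, n)
        approx = np.interp(x, xs, ys)        # reuse of previously computed results
        err = np.max(np.abs(approx - f(x)))  # the error-analysis side of the tradeoff
        print(f"table size {n:5d}: max |error| = {err:.2e}")
    ```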

  1. Using McIDAS-V data analysis and visualization software as an educational tool for understanding the atmosphere

    Science.gov (United States)

    Achtor, T. H.; Rink, T.

    2010-12-01

    The University of Wisconsin’s Space Science and Engineering Center (SSEC) has been at the forefront in developing data analysis and visualization tools for environmental satellites and other geophysical data. The fifth generation of the Man-computer Interactive Data Access System (McIDAS-V) is Java-based, open-source, freely available software that operates on Linux, Macintosh and Windows systems. The software tools provide powerful new data manipulation and visualization capabilities that work with geophysical data in research, operational and educational environments. McIDAS-V provides unique capabilities to support innovative techniques for evaluating research results, teaching and training. McIDAS-V is based on three powerful software elements. VisAD is a Java library for building interactive, collaborative, four-dimensional visualization and analysis tools. The Integrated Data Viewer (IDV) is a reference application based on the VisAD system and developed by the Unidata program that demonstrates the flexibility needed in this evolving environment, using a modern, object-oriented software design approach. The third tool, HYDRA, allows users to build, display and interrogate multi- and hyperspectral environmental satellite data in powerful ways. The McIDAS-V software is being used for training and education in several settings. The McIDAS User Group provides training workshops at its annual meeting. Numerous online tutorials with training data sets have been developed to aid users in learning simple and more complex operations in McIDAS-V; all are available online. In a University of Wisconsin-Madison undergraduate course in Radar and Satellite Meteorology, McIDAS-V is used to create and deliver laboratory exercises using case study and real time data. At the high school level, McIDAS-V is used in several exercises in our annual Summer Workshop in Earth and Atmospheric Sciences to provide young scientists the opportunity to examine data with friendly and

  2. GUM2DFT—a software tool for uncertainty evaluation of transient signals in the frequency domain

    International Nuclear Information System (INIS)

    Eichstädt, S; Wilkens, V

    2016-01-01

    The Fourier transform and its counterpart for discrete time signals, the discrete Fourier transform (DFT), are common tools in measurement science and application. Although almost every scientific software package offers ready-to-use implementations of the DFT, the propagation of uncertainties in line with the guide to the expression of uncertainty in measurement (GUM) is typically neglected. This is of particular importance in dynamic metrology, when input estimation is carried out by deconvolution in the frequency domain. To this end, we present the new open-source software tool GUM2DFT, which utilizes closed formulas for the efficient propagation of uncertainties for the application of the DFT, inverse DFT and input estimation in the frequency domain. It handles different frequency domain representations, accounts for autocorrelation and takes advantage of the symmetry inherent in the DFT result for real-valued time domain signals. All tools are presented in terms of examples which form part of the software package. GUM2DFT will foster GUM-compliant evaluation of uncertainty in a DFT-based analysis and enable metrologists to include uncertainty evaluations in their routine work. (paper)
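
    Because the DFT is a linear transformation, time-domain uncertainties propagate to the real and imaginary DFT coefficients in closed form. The sketch below covers only the special case of uncorrelated sample uncertainties; it is not the full covariance treatment implemented in GUM2DFT.

    ```python
    import numpy as np

    # Hedged sketch: propagate per-sample standard uncertainties through the DFT
    # under the linear (exact, for the DFT) propagation law, uncorrelated case.
    def dft_uncertainty(x, ux):
        """x: time-domain estimate, ux: standard uncertainty per sample."""
        N = len(x)
        X = np.fft.rfft(x)
        k = np.arange(len(X))[:, None]
        n = np.arange(N)[None, :]
        C = np.cos(2 * np.pi * k * n / N)    # sensitivity of Re(X_k) to x_n
        S = -np.sin(2 * np.pi * k * n / N)   # sensitivity of Im(X_k) to x_n
        u_re = np.sqrt((C**2) @ (ux**2))
        u_im = np.sqrt((S**2) @ (ux**2))
        return X, u_re, u_im

    t = np.linspace(0.0, 1.0, 64, endpoint=False)
    x = np.sin(2 * np.pi * 5 * t)
    X, u_re, u_im = dft_uncertainty(x, ux=np.full(64, 0.01))
    print(u_re[:3], u_im[:3])
    ```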

  3. Evaluation of The Virtual Cells Software: a Teaching Tool

    Directory of Open Access Journals (Sweden)

    C.C.P. da Silva

    2005-07-01

    handling, having an accessible language, and supporting the software as an education tool capable of facilitating the learning of the fundamental concepts about the theme. Other workshops are planned with participants from different educational institutions of the city of São Carlos, with the goal of broadening our sample.

  4. Clinical software for MR imaging system, 4

    International Nuclear Information System (INIS)

    Shimizu, Koji; Kasai, Akira; Okamura, Shoichi

    1992-01-01

    Magnetic resonance imaging continues to elicit new application software through the recent technological advances of MR equipment. This paper describes several applications of our newly developed clinical software. The fast SE sequence (RISE) has proved to reduce routine examination time and to improve image quality, and the ultra-fast FE sequence (SMASH) was found to extend the diagnostic capabilities in the field of cardiac study. Diffusion/perfusion imaging achieved in our MR system showed significant promise for providing novel information regarding tissue characterization. Furthermore, image quality and practicalities of MR angiography have been improved by advanced imaging sequences and sophisticated post-processing software. (author)

  5. Software reengineering

    Science.gov (United States)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. A beta version of the environment was released in Mar. 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.

  6. SIGKit: Software for Introductory Geophysics Toolkit

    Science.gov (United States)

    Kruse, S.; Bank, C. G.; Esmaeili, S.; Jazayeri, S.; Liu, S.; Stoikopoulos, N.

    2017-12-01

    The Software for Introductory Geophysics Toolkit (SIGKit) affords students the opportunity to create model data and perform simple processing of field data for various geophysical methods. SIGkit provides a graphical user interface built with the MATLAB programming language, but can run even without a MATLAB installation. At this time SIGkit allows students to pick first arrivals and match a two-layer model to seismic refraction data; grid total-field magnetic data, extract a profile, and compare this to a synthetic profile; and perform simple processing steps (subtraction of a mean trace, hyperbola fit) to ground-penetrating radar data. We also have preliminary tools for gravity, resistivity, and EM data representation and analysis. SIGkit is being built by students for students, and the intent of the toolkit is to provide an intuitive interface for simple data analysis and understanding of the methods, and act as an entrance to more sophisticated software. The toolkit has been used in introductory courses as well as field courses. First reactions from students are positive. Think-aloud observations of students using the toolkit have helped identify problems and helped shape it. We are planning to compare the learning outcomes of students who have used the toolkit in a field course to students in a previous course to test its effectiveness.
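
    For the seismic refraction module described above, the two-layer first-arrival model that students match to their picks can be sketched as follows. Velocities, depth and offsets are illustrative, and the head-wave branch is simplified: it is simply taken wherever it arrives earlier than the direct wave.

    ```python
    import numpy as np

    def first_arrivals(x, v1, v2, h):
        """Two-layer model: direct wave vs critically refracted head wave."""
        t_direct = x / v1
        theta_c = np.arcsin(v1 / v2)                  # critical angle (v1 < v2)
        t_head = x / v2 + 2.0 * h * np.cos(theta_c) / v1
        return np.minimum(t_direct, t_head)           # earlier arrival wins

    offsets = np.linspace(5.0, 100.0, 20)             # geophone offsets (m)
    times = first_arrivals(offsets, v1=800.0, v2=2400.0, h=10.0)
    for x, t in zip(offsets, times):
        print(f"x = {x:5.1f} m  t = {t * 1000:6.1f} ms")
    ```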

  7. The role of sophisticated accounting system in strategy management

    OpenAIRE

    Naranjo Gil, David

    2004-01-01

    Organizations are designing more sophisticated accounting information systems to meet the strategic goals and enhance their performance. This study examines the effect of accounting information system design on the performance of organizations pursuing different strategic priorities. The alignment between sophisticated accounting information systems and organizational strategy is analyzed. The enabling effect of the accounting information system on performance is also examined. Relationships ...

  8. Design, Development and Delivery of Active Learning Tools in Software Verification & Validation Education

    Science.gov (United States)

    Acharya, Sushil; Manohar, Priyadarshan Anant; Wu, Peter; Maxim, Bruce; Hansen, Mary

    2018-01-01

    Active learning tools are critical in imparting real world experiences to the students within a classroom environment. This is important because graduates are expected to develop software that meets rigorous quality standards in functional and application domains with little to no training. However, there is a well-recognized need for the…

  9. Financial Literacy and Financial Sophistication in the Older Population

    Science.gov (United States)

    Lusardi, Annamaria; Mitchell, Olivia S.; Curto, Vilsa

    2017-01-01

    Using a special-purpose module implemented in the Health and Retirement Study, we evaluate financial sophistication in the American population over the age of 50. We combine several financial literacy questions into an overall index to highlight which questions best capture financial sophistication and examine the sensitivity of financial literacy responses to framing effects. Results show that many older respondents are not financially sophisticated: they fail to grasp essential aspects of risk diversification, asset valuation, portfolio choice, and investment fees. Subgroups with notable deficits include women, the least educated, non-Whites, and those over age 75. In view of the fact that retirees increasingly must take on responsibility for their own retirement security, such meager levels of knowledge have potentially serious and negative implications. PMID:28553191

  10. Financial Literacy and Financial Sophistication in the Older Population.

    Science.gov (United States)

    Lusardi, Annamaria; Mitchell, Olivia S; Curto, Vilsa

    2014-10-01

    Using a special-purpose module implemented in the Health and Retirement Study, we evaluate financial sophistication in the American population over the age of 50. We combine several financial literacy questions into an overall index to highlight which questions best capture financial sophistication and examine the sensitivity of financial literacy responses to framing effects. Results show that many older respondents are not financially sophisticated: they fail to grasp essential aspects of risk diversification, asset valuation, portfolio choice, and investment fees. Subgroups with notable deficits include women, the least educated, non-Whites, and those over age 75. In view of the fact that retirees increasingly must take on responsibility for their own retirement security, such meager levels of knowledge have potentially serious and negative implications.

  11. PC Software graphics tool for conceptual design of space/planetary electrical power systems

    Science.gov (United States)

    Truong, Long V.

    1995-01-01

    This paper describes the Decision Support System (DSS), a personal computer software graphics tool for designing conceptual space and/or planetary electrical power systems. By using the DSS, users can obtain desirable system design and operating parameters, such as system weight, electrical distribution efficiency, and bus power. With this tool, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. The DSS is a user-friendly, menu-driven tool with online help and a custom graphical user interface. An example design and results are illustrated for a typical space power system with multiple types of power sources, frequencies, energy storage systems, and loads.
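
    A hedged sketch of the kind of component-level trade study such a tool supports: chaining component efficiencies and specific masses to compare candidate architectures. All component names and figures are illustrative, not DSS data.

    ```python
    # Each chain entry: (component, efficiency, specific mass in kg per kW of bus power).
    architectures = {
        "solar + battery":  [("array", 0.97, 25.0), ("regulator", 0.95, 8.0),
                             ("battery", 0.90, 40.0), ("distribution", 0.96, 12.0)],
        "solar + flywheel": [("array", 0.97, 25.0), ("regulator", 0.95, 8.0),
                             ("flywheel", 0.93, 55.0), ("distribution", 0.96, 12.0)],
    }

    bus_power_kw = 10.0
    for name, chain in architectures.items():
        eff, mass = 1.0, 0.0
        for _, component_eff, kg_per_kw in chain:
            eff *= component_eff                 # end-to-end distribution efficiency
            mass += kg_per_kw * bus_power_kw     # cumulative system weight
        print(f"{name:18s} efficiency {eff:.3f}  mass {mass:.0f} kg")
    ```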

  12. The Data Quality Monitoring Software for the CMS experiment at the LHC

    CERN Document Server

    AUTHOR|(CDS)2071602

    2016-01-01

    The Data Quality Monitoring (DQM) Software is a central tool in the CMS experiment. Its flexibility allows for integration in several key environments: Online, for real-time detector monitoring; Offline, for the final, fine-grained Data Certification; Release-Validation, to constantly validate the functionalities and the performance of the reconstruction software; and in Monte Carlo productions. Since the end of data taking at a center of mass energy of 8 TeV, the environment in which the DQM lives has undergone fundamental changes. In turn, the DQM system has made significant upgrades in many areas to respond not only to the changes in infrastructure, but also to the growing specialized needs of the collaboration, with an emphasis on more sophisticated methods for evaluating data quality, as well as advancing the DQM system to provide quality assessments of various Monte Carlo simulations versus data distributions, monitoring changes in physical effects due to modifications of algorithms or framework, and enabling reg...
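
    A generic stand-in for the kind of automated comparison a DQM system performs: score a monitored histogram against a reference distribution and flag large deviations. The chi-square-per-bin statistic and the threshold below are illustrative, not CMS code.

    ```python
    import numpy as np

    def chi2_per_bin(reference, observed):
        ref = np.asarray(reference, dtype=float)
        obs = np.asarray(observed, dtype=float)
        ref *= obs.sum() / ref.sum()          # normalise reference to observed stats
        mask = ref > 0
        return np.sum((obs[mask] - ref[mask]) ** 2 / ref[mask]) / mask.sum()

    edges = np.linspace(-5.0, 5.0, 51)        # common binning for both histograms
    reference = np.histogram(np.random.normal(0.0, 1.0, 100_000), bins=edges)[0]
    observed = np.histogram(np.random.normal(0.05, 1.0, 10_000), bins=edges)[0]

    score = chi2_per_bin(reference, observed)
    print("GOOD" if score < 2.0 else "BAD", f"(chi2/bin = {score:.2f})")  # toy threshold
    ```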

  13. PULSim: User-Based Adaptable Simulation Tool for Railway Planning and Operations

    Directory of Open Access Journals (Sweden)

    Yong Cui

    2018-01-01

    Full Text Available Simulation methods are widely used in the field of railway planning and operations. Currently, several commercial software tools are available that not only provide functionality for railway simulation but also enable further evaluation and optimisation of the network for scheduling, dispatching, and capacity research. However, the various tools are all lacking with respect to the standards they utilise as well as their published interfaces. For an end-user, the basic mechanism and the assumptions built into a simulation tool are unknown, which means that the true potential of these software tools is limited. One of the most critical issues is the lack of ability for users to define a sophisticated workflow integrating several rounds of simulation with adjustable parameters and settings. This paper develops and describes a user-based, customisable platform. As the preconditions of the platform, the design aspects for modelling the components of a railway system and building the workflow of railway simulation are elaborated in detail. Based on the model and the workflow, an integrated simulation platform with open interfaces is developed. Users and researchers gain the ability to rapidly develop their own algorithms, supported by the tailored simulation process in a flexible manner. The productivity of using simulation tools for further evaluation and optimisation will be significantly improved through the user-adaptable open interfaces.
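
    The multi-round workflow with adjustable parameters can be pictured as below; the simulator is a stand-in stub, not the PULSim API, and the headway/delay relation is invented purely for illustration.

    ```python
    def run_simulation(headway_s):
        """Stand-in simulator: mean delay grows as trains are packed tighter."""
        return max(0.0, 120.0 / headway_s - 0.5)      # minutes, invented relation

    headway = 120.0                                   # initial headway (seconds)
    for round_no in range(1, 8):                      # several rounds of simulation
        delay = run_simulation(headway)
        print(f"round {round_no}: headway {headway:.0f} s -> delay {delay:.2f} min")
        if delay <= 0.2:                              # acceptance criterion met
            break
        headway += 15.0                               # adjust parameter, re-run
    ```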

  14. Software reliability assessment

    International Nuclear Information System (INIS)

    Barnes, M.; Bradley, P.A.; Brewer, M.A.

    1994-01-01

    The increased usage and sophistication of computers applied to real time safety-related systems in the United Kingdom have spurred on the desire to provide a standard framework within which to assess dependable computing systems. Recent accidents and ensuing legislation have acted as a catalyst in this area. One particular aspect of dependable computing systems is that of software, which is usually designed to reduce risk at the system level, but which can increase risk if it is unreliable. Various organizations have recognized the problem of assessing the risk imposed to the system by unreliable software, and have taken initial steps to develop and use such assessment frameworks. This paper relates the approach of Consultancy Services of AEA Technology in developing a framework to assess the risk imposed by unreliable software. In addition, the paper discusses the experiences gained by Consultancy Services in applying the assessment framework to commercial and research projects. The framework is applicable to software used in safety applications, including proprietary software. Although the paper is written with Nuclear Reactor Safety applications in mind, the principles discussed can be applied to safety applications in all industries

  15. A software tool for advanced MRgFUS prostate therapy planning and follow up

    Science.gov (United States)

    van Straaten, Dörte; Hoogenboom, Martijn; van Amerongen, Martinus J.; Weiler, Florian; Issawi, Jumana Al; Günther, Matthias; Fütterer, Jurgen; Jenne, Jürgen W.

    2017-03-01

    US-guided HIFU/FUS ablation for the therapy of prostate cancer is a clinically established method, while MR-guided HIFU/FUS applications for the prostate have only recently entered clinical evaluation. Even though MRI examination is an excellent diagnostic tool for prostate cancer, it is a time-consuming procedure and not practicable within an MRgFUS therapy session. The aim of our ongoing work is to develop software to support therapy planning and post-therapy follow-up for MRgFUS on localized prostate cancer, based on multi-parametric MR protocols. The clinical workflow of diagnosis, therapy and follow-up of MR-guided FUS on prostate cancer was analyzed in depth. Based on this, the image processing workflow was designed and all necessary components, e.g. GUI, viewer, registration tools etc., were defined and implemented. The software is based on MeVisLab with several C++ modules implemented for the image processing tasks. The developed software, called LTC (Local Therapy Control), will automatically register and visualize all images (T1w, T2w, DWI etc.) and ADC or perfusion maps gained from the diagnostic MRI session. This maximum of diagnostic information helps to segment all necessary ROIs, e.g. the tumor, for therapy planning. Final therapy planning will be performed based on these segmentation data in the following MRgFUS therapy session. In addition, the developed software should help to evaluate the therapy success by synchronization and display of pre-therapeutic, therapy and follow-up image data, including the therapy plan and thermal dose information. In this ongoing project, the first stand-alone prototype was completed and will be clinically evaluated.

  16. Claire, a tool used for the simulation of events in software tests

    International Nuclear Information System (INIS)

    Henry, J.Y.; Boulc'h, J.; Raguideau, J.; Schoen, D.

    1994-06-01

    CLAIRE provides a purely software-based system that makes it possible to validate on-line applications against their specifications or their code. This tool offers easy graphical design of the application and of its environment. It carries out the simulation of any modelled system quite efficiently and controls its evolution either dynamically or with prerecorded timing. (TEC)

  17. A software tool for rapid flood inundation mapping

    Science.gov (United States)

    Verdin, James; Verdin, Kristine; Mathis, Melissa L.; Magadzire, Tamuka; Kabuchanga, Eric; Woodbury, Mark; Gadain, Hussein

    2016-06-02

    The GIS Flood Tool (GFT) was developed by the U.S. Geological Survey with support from the U.S. Agency for International Development’s Office of U.S. Foreign Disaster Assistance to provide a means for production of reconnaissance-level flood inundation mapping for data-sparse and resource-limited areas of the world. The GFT has also attracted interest as a tool for rapid assessment flood inundation mapping for the Flood Inundation Mapping Program of the U.S. Geological Survey. The GFT can fill an important gap for communities that lack flood inundation mapping by providing a first-estimate of inundation zones, pending availability of resources to complete an engineering study. The tool can also help identify priority areas for application of scarce flood inundation mapping resources. The technical basis of the GFT is an application of the Manning equation for steady flow in an open channel, operating on specially processed digital elevation data. The GFT is implemented as a software extension in ArcGIS. Output maps from the GFT were validated at 11 sites with inundation maps produced previously by the Flood Inundation Mapping Program using standard one-dimensional hydraulic modeling techniques. In 80 percent of the cases, the GFT inundation patterns matched 75 percent or more of the one-dimensional hydraulic model inundation patterns. Lower rates of pattern agreement were seen at sites with low relief and subtle surface water divides. Although the GFT is simple to use, it should be applied with the oversight or review of a qualified hydraulic engineer who understands the simplifying assumptions of the approach.
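
    The technical basis named above, Manning's equation for steady open-channel flow, can be sketched as follows. The rectangular cross-section, roughness and flow values are illustrative, and the GFT itself applies the equation to specially processed digital elevation data rather than to a single idealized channel.

    ```python
    def manning_discharge(width_m, depth_m, slope, n):
        """Discharge Q (m^3/s) from Manning's equation, SI units."""
        area = width_m * depth_m
        wetted_perimeter = width_m + 2.0 * depth_m
        hydraulic_radius = area / wetted_perimeter
        velocity = (1.0 / n) * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5
        return velocity * area

    def flood_depth(q_target, width_m, slope, n, lo=0.01, hi=20.0):
        """Invert the relation by bisection: what depth carries a given flow?"""
        for _ in range(60):
            mid = 0.5 * (lo + hi)
            if manning_discharge(width_m, mid, slope, n) < q_target:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    # Depth reached by a 500 m^3/s flood in an 80 m wide channel (illustrative).
    print(f"{flood_depth(500.0, width_m=80.0, slope=0.001, n=0.035):.2f} m")
    ```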

  18. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    Science.gov (United States)

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

    A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data, which helps in optimization, and shows the system performance visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model developed for a systematically integrated forward osmosis-nanofiltration process for removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values have been observed to corroborate well with the extensive experimental investigations and were found to be consistent under varying operating conditions such as operating pressure, operating flow rate, and draw solute concentration. Low values of the relative error (RE = 0.09) and high values of the Willmott d-index (d = 0.981) reflected a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater.
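
    A small sketch of the two agreement statistics quoted above, the relative error and Willmott's index of agreement d; the observed/predicted arrays below are placeholders, not data from the study.

    ```python
    import numpy as np

    def willmott_d(observed, predicted):
        o, p = np.asarray(observed, float), np.asarray(predicted, float)
        num = np.sum((p - o) ** 2)
        den = np.sum((np.abs(p - o.mean()) + np.abs(o - o.mean())) ** 2)
        return 1.0 - num / den                 # 1.0 means perfect agreement

    def relative_error(observed, predicted):
        o, p = np.asarray(observed, float), np.asarray(predicted, float)
        return np.mean(np.abs(p - o) / np.abs(o))

    obs = np.array([12.1, 13.4, 15.0, 14.2, 16.8])   # illustrative measurements
    pred = np.array([11.8, 13.9, 14.6, 14.5, 16.1])  # illustrative model output
    print(f"d = {willmott_d(obs, pred):.3f}, RE = {relative_error(obs, pred):.3f}")
    ```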

  19. Review of free software tools for image analysis of fluorescence cell micrographs.

    Science.gov (United States)

    Wiesmann, V; Franz, D; Held, C; Münzenmayer, C; Palmisano, R; Wittenberg, T

    2015-01-01

    An increasing number of free software tools have been made available for the evaluation of fluorescence cell micrographs. The main users are biologists and related life scientists with no or little knowledge of image processing. In this review, we give an overview of available tools and guidelines about which tools the users should use to segment fluorescence micrographs. We selected 15 free tools and divided them into stand-alone, Matlab-based, ImageJ-based, free demo versions of commercial tools and data sharing tools. The review consists of two parts: First, we developed a criteria catalogue and rated the tools regarding structural requirements, functionality (flexibility, segmentation and image processing filters) and usability (documentation, data management, usability and visualization). Second, we performed an image processing case study with four representative fluorescence micrograph segmentation tasks with figure-ground and cell separation. The tools display a wide range of functionality and usability. In the image processing case study, we were able to perform figure-ground separation in all micrographs using mainly thresholding. Cell separation was not possible with most of the tools, because cell separation methods are provided only by a subset of the tools and are difficult to parametrize and to use. Most important is that the usability matches the functionality of a tool. To be usable, specialized tools with less functionality need to fulfill fewer usability criteria, whereas multipurpose tools need a well-structured menu and intuitive graphical user interface. © 2014 Fraunhofer-Institute for Integrated Circuits IIS Journal of Microscopy © 2014 Royal Microscopical Society.
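
    A minimal sketch of the figure-ground separation step the review found feasible with mainly thresholding. Otsu's method is one common global threshold choice; the synthetic image below stands in for a fluorescence micrograph.

    ```python
    import numpy as np

    def otsu_threshold(image, nbins=256):
        """Return the threshold maximising between-class variance."""
        hist, edges = np.histogram(image.ravel(), bins=nbins)
        centers = 0.5 * (edges[:-1] + edges[1:])
        w = hist / hist.sum()
        best_t, best_var = centers[0], -1.0
        for i in range(1, nbins):
            w0, w1 = w[:i].sum(), w[i:].sum()
            if w0 == 0 or w1 == 0:
                continue
            mu0 = (w[:i] * centers[:i]).sum() / w0
            mu1 = (w[i:] * centers[i:]).sum() / w1
            var_between = w0 * w1 * (mu0 - mu1) ** 2
            if var_between > best_var:
                best_var, best_t = var_between, centers[i]
        return best_t

    img = np.random.normal(0.2, 0.05, (64, 64))   # dark background
    img[20:40, 20:40] += 0.5                      # a bright "cell"
    mask = img > otsu_threshold(img)              # figure-ground separation
    print(f"foreground fraction: {mask.mean():.2f}")
    ```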

  20. The Impact of Financial Sophistication on Adjustable Rate Mortgage Ownership

    Science.gov (United States)

    Smith, Hyrum; Finke, Michael S.; Huston, Sandra J.

    2011-01-01

    The influence of a financial sophistication scale on adjustable-rate mortgage (ARM) borrowing is explored. Descriptive statistics and regression analysis using recent data from the Survey of Consumer Finances reveal that ARM borrowing is driven by both the least and most financially sophisticated households but for different reasons. Less…

  1. Contingency Contractor Optimization Phase 3 Sustainment Third-Party Software List - Contingency Contractor Optimization Tool - Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa

    2016-05-01

    The Contingency Contractor Optimization Tool - Prototype (CCOT-P) requires several third-party software packages. These are documented below for each of the CCOT-P elements: client, web server, database server, solver, web application and polling application.

  2. A new online software tool for pressure ulcer monitoring as an educational instrument for unified nursing assessment in clinical settings

    Directory of Open Access Journals (Sweden)

    Andrea Pokorná

    2016-07-01

    Full Text Available Data collection and evaluation of that data is crucial for effective quality management and naturally also for prevention and treatment of pressure ulcers. Data collected in a uniform manner by nurses in clinical practice could be used for further analyses. Data about pressure ulcers are collected to differing degrees of quality based on the local policy of the given health care facility and in relation to the nurse’s actual level of knowledge concerning pressure ulcer identification and use of objective scales (i.e. categorization of pressure ulcers. Therefore, we have developed software suitable for data collection which includes some educational tools to promote unified reporting of data by nurses. A description of this software and some educational and learning components of the tool is presented herein. The planned process of clinical application of the newly developed software is also briefly mentioned. The discussion is focused on the usability of the online reporting tool and possible further development of the tool.

  3. An open source software tool to assign the material properties of bone for ABAQUS finite element simulations.

    Science.gov (United States)

    Pegg, Elise C; Gill, Harinderjit S

    2016-09-06

    A new software tool to assign the material properties of bone to an ABAQUS finite element mesh was created and compared with Bonemat, a similar tool originally designed to work with Ansys finite element models. Our software tool (py_bonemat_abaqus) was written in Python, which is the chosen scripting language for ABAQUS. The purpose of this study was to compare the software packages in terms of the material assignment calculation and processing speed. Three element types were compared (linear hexahedral (C3D8), linear tetrahedral (C3D4) and quadratic tetrahedral elements (C3D10)), both individually and as part of a mesh. Comparisons were made using a CT scan of a hemi-pelvis as a test case. A small difference, of -0.05 kPa on average, was found between Bonemat version 3.1 (the current version) and our Python package. Errors were found in the previous release of Bonemat (version 3.0 downloaded from www.biomedtown.org) during calculation of the quadratic tetrahedron Jacobian, and conversion of the apparent density to modulus when integrating over the Young's modulus field. These issues caused up to 2 GPa error in the modulus assignment. For these reasons, we recommend users upgrade to the most recent release of Bonemat. Processing speeds were assessed for the three different element types. Our Python package took significantly longer (110 s on average) to perform the calculations compared with the Bonemat software (10 s). Nevertheless, the workflow advantages of the package and added functionality make 'py_bonemat_abaqus' a useful tool for ABAQUS users. Copyright © 2016 Elsevier Ltd. All rights reserved.
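
    A hedged sketch of the core calculation such tools perform: map CT grey values to apparent density and then to Young's modulus via a power law, then bin element moduli into a limited set of material definitions. The calibration constants below are illustrative, not those used by py_bonemat_abaqus or Bonemat.

    ```python
    import numpy as np

    def hu_to_density(hu, a=0.0, b=0.0008):
        """rho = a + b * HU (g/cm^3); a, b are illustrative calibration values."""
        return a + b * hu

    def density_to_modulus(rho, c=6.95, d=1.49):
        """E = c * rho^d (GPa); power-law constants are illustrative."""
        return c * np.power(np.clip(rho, 1e-6, None), d)

    element_hu = np.array([250.0, 700.0, 1200.0])    # mean HU per element
    E = density_to_modulus(hu_to_density(element_hu))

    # Group nearly equal moduli into shared material definitions.
    bins = np.round(E, 1)
    print({f"MAT_{v:.1f}GPa": int((bins == v).sum()) for v in np.unique(bins)})
    ```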

  4. Characteristics and evolution of the ecosystem of software tools supporting research in molecular biology.

    Science.gov (United States)

    Pazos, Florencio; Chagoyen, Monica

    2018-01-16

    Daily work in molecular biology presently depends on a large number of computational tools. An in-depth, large-scale study of that 'ecosystem' of Web tools, its characteristics, interconnectivity, patterns of usage/citation, temporal evolution and rate of decay is crucial for understanding the forces that shape it and for informing initiatives aimed at its funding, long-term maintenance and improvement. In particular, the long-term maintenance of these tools is compromised because of their specific development model. Hundreds of published studies become irreproducible de facto, as the software tools used to conduct them become unavailable. In this study, we present a large-scale survey of >5400 publications describing Web servers within the two main bibliographic resources for disseminating new software developments in molecular biology. For all these servers, we studied their citation patterns, the subjects they address, their citation networks and the temporal evolution of these factors. We also analysed how these factors affect the availability of these servers (whether they are alive). Our results show that this ecosystem of tools is highly interconnected and adapts to the 'trendy' subjects at every moment. The servers present characteristic temporal patterns of citation/usage, and there is a worrying rate of server 'death', which is influenced by factors such as the server popularity and the institution that hosts it. These results can inform initiatives aimed at the long-term maintenance of these resources. © The Author(s) 2018. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  5. Building Models in the Classroom: Taking Advantage of Sophisticated Geomorphic Numerical Tools Using a Simple Graphical User Interface

    Science.gov (United States)

    Roy, S. G.; Koons, P. O.; Gerbi, C. C.; Capps, D. K.; Tucker, G. E.; Rogers, Z. A.

    2014-12-01

    Sophisticated numerical tools exist for modeling geomorphic processes and linking them to tectonic and climatic systems, but they are often seen as inaccessible for users with an exploratory level of interest. We have improved the accessibility of landscape evolution models by producing a simple graphics user interface (GUI) that takes advantage of the Channel-Hillslope Integrated Landscape Development (CHILD) model. Model access is flexible: the user can edit values for basic geomorphic, tectonic, and climate parameters, or obtain greater control by defining the spatiotemporal distributions of those parameters. Users can make educated predictions by choosing their own parametric values for the governing equations and interpreting the results immediately through model graphics. This method of modeling allows users to iteratively build their understanding through experimentation. Use of this GUI is intended for inquiry and discovery-based learning activities. We discuss a number of examples of how the GUI can be used at the upper high school, introductory university, and advanced university level. Effective teaching modules initially focus on an inquiry-based example guided by the instructor. As students become familiar with the GUI and the CHILD model, the class can shift to more student-centered exploration and experimentation. To make model interpretations more robust, digital elevation models can be imported and direct comparisons can be made between CHILD model results and natural topography. The GUI is available online through the University of Maine's Earth and Climate Sciences website, through the Community Surface Dynamics Modeling System (CSDMS) model repository, or by contacting the corresponding author.
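
    For readers who want a feel for what a landscape evolution model integrates, the sketch below shows the simplest classroom-scale ingredient, linear hillslope diffusion. CHILD couples many more processes; all parameters here are illustrative.

    ```python
    import numpy as np

    # Linear hillslope diffusion, dz/dt = D * d2z/dx2, explicit scheme.
    # Stable because D * dt / dx^2 = 0.005 << 0.5.
    nx, dx, dt, D = 101, 10.0, 50.0, 0.01          # nodes, m, yr, m^2/yr
    z = np.maximum(0.0, 100.0 - np.abs(np.arange(nx) - nx // 2) * 2.5)  # a ridge

    for step in range(2000):                       # 100 kyr of relaxation
        curvature = (z[:-2] - 2.0 * z[1:-1] + z[2:]) / dx**2
        z[1:-1] += D * curvature * dt              # interior nodes evolve
        z[0] = z[-1] = 0.0                         # fixed base level

    print(f"peak lowered to {z.max():.1f} m")
    ```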

  6. Software tool for representation and processing of experimental data on high energy interactions of elementary particles

    International Nuclear Information System (INIS)

    Cherepanov, E.O.; Skachkov, N.B.

    2002-01-01

    The software tool is developed for detailed and transparent displaying of information about the energy and space distributions of secondary particles produced in collisions of elementary particles. The components of the 4-momenta of the secondary particles are used as input; these data may come from different parts of a physical detector (for example, the calorimeter or tracker) or from an event generator. The tool is intended for use in the Windows operating system and is developed on the basis of Borland Delphi. The mathematical architecture of the software tool allows the user to obtain complete information without making additional calculations. The program automatically analyses the structure and distributions of signals and displays the results in a transparent form that allows their quick analysis. To display the information, three-dimensional graphics as well as colour schemes based on intuitive associations are used. (author)
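
    A minimal sketch of the kinematic quantities such a display tool derives from input 4-momenta (E, px, py, pz): transverse momentum, pseudorapidity, azimuth, and the invariant mass of a particle pair. The momentum values below are illustrative.

    ```python
    import numpy as np

    def kinematics(p4):
        E, px, py, pz = p4
        pt = np.hypot(px, py)                      # transverse momentum
        p = np.sqrt(px**2 + py**2 + pz**2)
        eta = 0.5 * np.log((p + pz) / (p - pz))    # pseudorapidity
        phi = np.arctan2(py, px)                   # azimuthal angle
        return pt, eta, phi

    def invariant_mass(p4_a, p4_b):
        E = p4_a[0] + p4_b[0]
        px, py, pz = (p4_a[i] + p4_b[i] for i in (1, 2, 3))
        return np.sqrt(max(0.0, E**2 - px**2 - py**2 - pz**2))

    mu_plus = (46.2, 35.1, -20.4, 22.0)            # GeV, illustrative values
    mu_minus = (44.9, -33.8, 19.7, -21.5)
    pt, eta, phi = kinematics(mu_plus)
    print(f"pt = {pt:.1f} GeV, eta = {eta:+.2f}, phi = {phi:+.2f} rad")
    print(f"pair invariant mass = {invariant_mass(mu_plus, mu_minus):.1f} GeV")
    ```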

  7. SAGES: a suite of freely-available software tools for electronic disease surveillance in resource-limited settings.

    Directory of Open Access Journals (Sweden)

    Sheri L Lewis

    Full Text Available Public health surveillance is undergoing a revolution driven by advances in the field of information technology. Many countries have experienced vast improvements in the collection, ingestion, analysis, visualization, and dissemination of public health data. Resource-limited countries have lagged behind due to challenges in information technology infrastructure, public health resources, and the costs of proprietary software. The Suite for Automated Global Electronic bioSurveillance (SAGES) is a collection of modular, flexible, freely-available software tools for electronic disease surveillance in resource-limited settings. One or more SAGES tools may be used in concert with existing surveillance applications, or the SAGES tools may be used en masse for an end-to-end biosurveillance capability. This flexibility allows for the development of an inexpensive, customized, and sustainable disease surveillance system. The ability to rapidly assess anomalous disease activity may lead to more efficient use of limited resources and better compliance with World Health Organization International Health Regulations.

  8. A practical comparison of de novo genome assembly software tools for next-generation sequencing technologies.

    Directory of Open Access Journals (Sweden)

    Wenyu Zhang

    Full Text Available The advent of next-generation sequencing technologies has been accompanied by the development of many whole-genome sequence assembly methods and software, especially for de novo fragment assembly. Due to the poor knowledge about the applicability and performance of these software tools, choosing a befitting assembler becomes a tough task. Here, we provide information on the applicability of each program and, above all, compare the performance of eight distinct tools against eight groups of simulated datasets from the Solexa sequencing platform. Considering the computational time, maximum random access memory (RAM) occupancy, assembly accuracy and integrity, our study indicates that string-based assemblers and overlap-layout-consensus (OLC) assemblers are well-suited for very short reads and for longer reads of small genomes, respectively. For large datasets of more than a hundred million short reads, De Bruijn graph-based assemblers would be more appropriate. In terms of software implementation, string-based assemblers are superior to graph-based ones, among which SOAPdenovo requires the creation of a complex configuration file. Our comparison study will assist researchers in selecting a well-suited assembler and offer essential information for the improvement of existing assemblers or the development of novel assemblers.
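
    To make the assembler families concrete, the sketch below builds the k-mer (De Bruijn) graph that graph-based assemblers start from. Real assemblers add error correction, graph compaction and much more; the reads and k are illustrative.

    ```python
    from collections import defaultdict

    def de_bruijn(reads, k):
        """Map each (k-1)-mer to the (k-1)-mers that follow it in the reads."""
        graph = defaultdict(list)
        for read in reads:
            for i in range(len(read) - k + 1):
                kmer = read[i:i + k]
                graph[kmer[:-1]].append(kmer[1:])
        return graph

    reads = ["ACGTAC", "CGTACG", "GTACGT"]
    for node, successors in sorted(de_bruijn(reads, k=4).items()):
        print(node, "->", ",".join(successors))
    ```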

  9. SimPhospho: a software tool enabling confident phosphosite assignment.

    Science.gov (United States)

    Suni, Veronika; Suomi, Tomi; Tsubosaka, Tomoya; Imanishi, Susumu Y; Elo, Laura L; Corthals, Garry L

    2018-03-27

    Mass spectrometry combined with enrichment strategies for phosphorylated peptides has been successfully employed for two decades to identify sites of phosphorylation. However, unambiguous phosphosite assignment is considered challenging. Given that site-specific phosphorylation events function as different molecular switches, validation of phosphorylation sites is of utmost importance. In our earlier study we developed a method based on simulated phosphopeptide spectral libraries, which enables highly sensitive and accurate phosphosite assignments. To promote more widespread use of this method, we here introduce a software implementation with improved usability and performance. We present SimPhospho, a fast and user-friendly tool for accurate simulation of phosphopeptide tandem mass spectra. Simulated phosphopeptide spectral libraries are used to validate and supplement database search results, with a goal to improve reliable phosphoproteome identification and reporting. The presented program can be easily used together with the Trans-Proteomic Pipeline and integrated in a phosphoproteomics data analysis workflow. SimPhospho is available for Windows, Linux and Mac operating systems at https://sourceforge.net/projects/simphospho/. It is open source and implemented in C++. A user's manual with detailed description of data analysis using SimPhospho as well as test data can be found as supplementary material of this article. Supplementary data are available at https://www.btk.fi/research/computational-biomedicine/software/.

  10. Software Simulates Sight: Flat Panel Mura Detection

    Science.gov (United States)

    2008-01-01

    In the increasingly sophisticated world of high-definition flat screen monitors and television screens, image clarity and the elimination of distortion are paramount concerns. As the devices that reproduce images become more and more sophisticated, so do the technologies that verify their accuracy. By simulating the manner in which a human eye perceives and interprets a visual stimulus, NASA scientists have found ways to automatically and accurately test new monitors and displays. The Spatial Standard Observer (SSO) software metric, developed by Dr. Andrew B. Watson at Ames Research Center, measures visibility and defects in screens, displays, and interfaces. In the design of such a software tool, a central challenge is determining which aspects of visual function to include: while accuracy and generality are important, relative simplicity of the software module is also a key virtue. Based on data collected in ModelFest, a large cooperative multi-lab project hosted by the Optical Society of America, the SSO simulates a simplified model of human spatial vision, operating on a pair of images that are viewed at a specific viewing distance with pixels having a known relation to luminance. The SSO measures the visibility of foveal spatial patterns, or the discriminability of two patterns, by incorporating only a few essential components of vision. These components include local contrast transformation, a contrast sensitivity function, local masking, and local pooling. By this construction, the SSO provides output in units of "just noticeable differences" (JND), a unit of measure based on the assumed smallest difference of sensory input detectable by a human being. Herein lies the truly remarkable ability of the SSO: while conventional methods can manipulate images, the SSO models human perception. This set of equations actually defines a mathematical way of working with an image that accurately reflects the way in which the human eye and mind behold a stimulus. The SSO is
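
    A very loose sketch of the SSO's overall flow: convert a pair of images to local contrast, difference them, and pool into a single visibility number. The constants and the Minkowski pooling below are placeholders, not the calibrated ModelFest-based model inside the SSO.

    ```python
    import numpy as np

    def visibility(reference, test, mean_lum=None, beta=2.0):
        """Toy visibility score for the difference between two luminance images."""
        ref, tst = np.asarray(reference, float), np.asarray(test, float)
        L0 = ref.mean() if mean_lum is None else mean_lum
        c_ref, c_tst = (ref - L0) / L0, (tst - L0) / L0   # local contrast
        diff = np.abs(c_tst - c_ref)
        return (diff ** beta).sum() ** (1.0 / beta)       # Minkowski pooling

    screen = np.full((32, 32), 100.0)                     # uniform luminance
    defect = screen.copy()
    defect[10:14, 10:14] += 2.0                           # a faint mura patch
    print(f"visibility score: {visibility(screen, defect):.3f}")
    ```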

  11. Simulation software support (S3) system a software testing and debugging tool

    International Nuclear Information System (INIS)

    Burgess, D.C.; Mahjouri, F.S.

    1990-01-01

    The largest percentage of technical effort in the software development process is accounted for by debugging and testing. It is not unusual for a software development organization to spend over 50% of the total project effort on testing. In the extreme, testing of human-rated software (e.g., nuclear reactor monitoring, training simulator) can cost three to five times as much as all other software engineering steps combined. The Simulation Software Support (S3) System, developed by the Link-Miles Simulation Corporation, is ideally suited for real-time simulation applications that involve a large database with models programmed in FORTRAN. This paper focuses on the testing elements of the S3 system: system support software utilities are provided which enable the loading and execution of modules in the development environment. These elements include the Linking/Loader (LLD) for dynamically linking program modules and loading them into memory, and the interactive executive (IEXEC) for controlling the execution of the modules. Features of the Interactive Symbolic Debugger (SD) and the Real Time Executive (RTEXEC) to support unit and integrated testing will be explored.

  12. Updates on resources, software tools, and databases for plant proteomics in 2016-2017.

    Science.gov (United States)

    Misra, Biswapriya B

    2018-02-08

    Proteomics data processing, annotation, and analysis can often lead to major hurdles in large-scale high-throughput bottom-up proteomics experiments. Given the recent rise in protein-based big datasets being generated, efforts in in silico tool development have increased at an unprecedented rate, so much so that it has become increasingly difficult to keep track of all the advances in a particular academic year. However, these tools benefit the plant proteomics community in circumventing critical issues in data analysis and visualization, and these continually developing open-source and community-developed tools hold potential for future research efforts. This review will aim to introduce and summarize more than 50 software tools, databases, and resources developed and published during 2016-2017 under the following categories: tools for data pre-processing and analysis, statistical analysis tools, peptide identification tools, databases and spectral libraries, and data visualization and interpretation tools. Intended for a well-informed proteomics community, finally, efforts in data archiving and validation datasets for the community will be discussed as well. Additionally, the author delineates the current and most commonly used proteomics tools in order to introduce novice readers to this -omics discovery platform. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Review of Ground Systems Development and Operations (GSDO) Tools for Verifying Command and Control Software

    Science.gov (United States)

    Aguilar, Michael L.; Bonanne, Kevin H.; Favretto, Jeffrey A.; Jackson, Maddalena M.; Jones, Stephanie L.; Mackey, Ryan M.; Sarrel, Marc A.; Simpson, Kimberly A.

    2014-01-01

    The Exploration Systems Development (ESD) Standing Review Board (SRB) requested the NASA Engineering and Safety Center (NESC) conduct an independent review of the plan developed by Ground Systems Development and Operations (GSDO) for identifying models and emulators to create a tool(s) to verify their command and control software. The NESC was requested to identify any issues or weaknesses in the GSDO plan. This document contains the outcome of the NESC review.

  14. Hardware replacements and software tools for digital control computers

    International Nuclear Information System (INIS)

    Walker, R.A.P.; Wang, B-C.; Fung, J.

    1996-01-01

    computers which use 'Varian' technology. A new software program, Desk Top Tools, permits the designer greater flexibility in digital control computer software design and testing. This software development allows the user to emulate control of the CANDU reactor system by system. All discussions will highlight the ability of the replacements and the new developments to enhance the operation of the existing and 'repeat' plant digital control computers and will explore future applications of these developments. Examples of current use of all replacement components and software are provided. (author)

  15. Techniques and tools for measuring energy efficiency of scientific software applications

    CERN Document Server

    Abdurachmanov, David; Eulisse, Giulio; Knight, Robert; Niemi, Tapio; Nurminen, Jukka K.; Nyback, Filip; Pestana, Goncalo; Ou, Zhonghong; Khan, Kashif

    2014-01-01

    The scale of scientific High Performance Computing (HPC) and High Throughput Computing (HTC) has increased significantly in recent years, and is becoming sensitive to total energy use and cost. Energy-efficiency has thus become an important concern in scientific fields such as High Energy Physics (HEP). There has been a growing interest in utilizing alternate architectures, such as low power ARM processors, to replace traditional Intel x86 architectures. Nevertheless, even though such solutions have been successfully used in mobile applications with low I/O and memory demands, it is unclear if they are suitable and more energy-efficient in the scientific computing environment. Furthermore, there is a lack of tools and experience to derive and compare power consumption between the architectures for various workloads, and eventually to support software optimizations for energy efficiency. To that end, we have performed several physical and software-based measurements of workloads from HEP applications running o...

  16. Practical requirements for software tools to assist in the validation and verification of hybrid expert systems

    International Nuclear Information System (INIS)

    Singh, G.P.; Cadena, D.; Burgess, J.

    1992-01-01

    Any practical software development effort must remain focused on verification and validation of user requirements. Knowledge-based system development is no different in this regard. In industry today, most expert systems being produced are, in reality, hybrid software systems which, in addition to those components that provide the knowledge base and expert reasoning over the problem domain using various rule-based and object-oriented paradigms, incorporate significant bodies of code based on more traditional software techniques such as database management, graphical user interfaces, hypermedia, spreadsheets, as well as specially developed sequential code. Validation and verification of such hybrid systems must perforce integrate suitable methodologies from all such fields. This paper attempts to provide a broad overview of the practical requirements for methodologies and the concomitant groupware tools which would assist in such an enterprise. These methodologies and groupware tools would facilitate the teamwork efforts necessary to validate and verify all components of such hybrid systems by emphasizing cooperative recording of requirements and negotiated resolutions of any conflicts grounded in a solid understanding of the semantics of such a system

  17. Software tool for resolution of inverse problems using artificial intelligence techniques: an application in neutron spectrometry

    International Nuclear Information System (INIS)

    Castaneda M, V. H.; Martinez B, M. R.; Solis S, L. O.; Castaneda M, R.; Leon P, A. A.; Hernandez P, C. F.; Espinoza G, J. G.; Ortiz R, J. M.; Vega C, H. R.; Mendez, R.; Gallego, E.; Sousa L, M. A.

    2016-10-01

    The Taguchi methodology has proved to be highly efficient in solving inverse problems, in which the values of some parameters of a model must be obtained from the observed data. There are intrinsic mathematical characteristics that make a problem known as inverse. Inverse problems appear in many branches of science, engineering and mathematics, and researchers have used different techniques to solve them. Recently, the use of techniques based on Artificial Intelligence has been explored. This paper presents the use of a software tool based on generalized regression artificial neural networks for the solution of inverse problems, with an application in high energy physics: the problem of neutron spectrometry. To solve this problem we use a software tool developed in the MATLAB programming environment, which provides a friendly, intuitive and easy-to-use interface. This computational tool solves the inverse problem involved in the reconstruction of a neutron spectrum from measurements made with a Bonner sphere spectrometric system. Given this information, the neural network is able to reconstruct the neutron spectrum with high performance and generalization capability. The tool does not require great training or technical knowledge in the development or use of software, which facilitates its application to the inverse problems found in several areas of knowledge. Artificial Intelligence techniques are particularly well suited to solving inverse problems, given the characteristics of artificial neural networks and their network topology. The tool developed here has therefore been very useful, since the results generated by the artificial neural network require little time compared with other techniques and agree with the actual data of the experiment. (Author)
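
    A minimal sketch of a generalized regression neural network (GRNN) of the kind the tool employs: a Gaussian-kernel-weighted average of training targets. The toy inputs stand in for Bonner sphere readings and the outputs for spectrum bins; none of this is the actual trained network.

    ```python
    import numpy as np

    def grnn_predict(X_train, Y_train, x, sigma=0.1):
        """GRNN: kernel-weighted mean of training targets (Specht's formulation)."""
        d2 = np.sum((X_train - x) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * sigma ** 2))
        w /= w.sum()
        return w @ Y_train

    rng = np.random.default_rng(0)
    X = rng.uniform(size=(200, 7))                 # 7 "sphere readings" per case
    Y = np.column_stack([X.sum(axis=1), X[:, 0] - X[:, 6]])  # 2 toy "spectrum" outputs
    x_new = rng.uniform(size=7)
    print(grnn_predict(X, Y, x_new))
    ```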

  18. Herramientas libres para modelar software (Free tools to model software)

    Directory of Open Access Journals (Sweden)

    Mauro Callejas Cuervo; Óscar Yovany Baquero Moreno

    2010-11-01

    Full Text Available An observation on free software and its implications for software development processes using 4G tools, carried out by entities or individuals without astronomical capital and without the monopolising mentality of dominating the market with costly products that make their vendors multimillionaires while offering no real guarantee, nor even the possibility of knowing the software one has paid for, much less of modifying it if it does not meet our expectations.

  19. FY1995 study of very flexible software structures based on soft-software components; 1995 nendo yawarankana software buhin ni motozuku software no choju kozo ni kansuru kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    The purpose of this study is to develop methods and tools for changing software structure flexibly along with the continuous change of its environment and conditions of use. The goal is software of very high adaptability, achieved by using soft-software components and flexible assembly. The CASE tool platform Sapid, based on a fine-grained repository, was developed and used to raise the abstraction level of program code and to mine potential flexible components. To reconstruct software adaptable to a required environment, the SQM (Software Quark Model) was used to manage interconnectivity and other semantic relationships among components. On these two basic systems, we developed various methods and tools, such as those for static and dynamic analysis of very flexible software structures, program transformation description, program pattern extraction and composition, component optimization by partial evaluation, component extraction by function slicing, code encapsulation, and component navigation and application. (NEDO)

  20. A software tool for modification of human voxel models used for application in radiation protection

    International Nuclear Information System (INIS)

    Becker, Janine; Zankl, Maria; Petoussi-Henss, Nina

    2007-01-01

    This note describes a new software tool called 'VolumeChange' that was developed to modify the masses and location of organs of virtual human voxel models. A voxel model is a three-dimensional representation of the human body in the form of an array of identification numbers that are arranged in slices, rows and columns. Each entry in this array represents a voxel; organs are represented by those voxels having the same identification number. With this tool, two human voxel models were adjusted to fit the reference organ masses of a male and a female adult, as defined by the International Commission on Radiological Protection (ICRP). The alteration of an already existing voxel model is a complicated process, leading to many problems that have to be solved. To solve those intricacies in an easy way, a new software tool was developed and is presented here. If the organs are modified, no bit of tissue, i.e. voxel, may vanish nor should an extra one appear. That means that organs cannot be modified without considering the neighbouring tissue. Thus, the principle of organ modification is based on the reassignment of voxels from one organ/tissue to another; actually deleting and adding voxels is only possible at the external surface, i.e. skin. In the software tool described here, the modifications are done by semi-automatic routines but including human control. Because of the complexity of the matter, a skilled person has to validate that the applied changes to organs are anatomically reasonable. A graphical user interface was designed to fulfil the purpose of a comfortable working process, and an adequate graphical display of the modified voxel model was developed. Single organs, organ complexes and even whole limbs can be edited with respect to volume, shape and location. (note)
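
    The reassignment principle described in the note can be sketched as follows: grow one organ into a neighbouring tissue by relabelling boundary voxels, so that no voxel vanishes or appears. The labels, geometry and scipy-based implementation are illustrative, not the actual VolumeChange code.

    ```python
    import numpy as np
    from scipy import ndimage

    LIVER, SOFT_TISSUE = 5, 1                      # illustrative identification numbers
    model = np.full((40, 40, 40), SOFT_TISSUE, dtype=np.uint8)
    model[15:25, 15:25, 15:25] = LIVER

    def grow_organ(model, organ, into, n_voxels):
        """Relabel up to n_voxels of 'into' tissue that border 'organ'."""
        border = ndimage.binary_dilation(model == organ) & (model == into)
        idx = np.argwhere(border)[:n_voxels]
        model[tuple(idx.T)] = organ                # reassignment, not deletion/creation
        return len(idx)

    added = grow_organ(model, LIVER, SOFT_TISSUE, n_voxels=500)
    print(f"liver volume now {(model == LIVER).sum()} voxels (+{added})")
    ```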

  1. Evaluation of a new software tool for the automatic volume calculation of hepatic tumors. First results

    International Nuclear Information System (INIS)

    Meier, S.; Mildenberger, P.; Pitton, M.; Thelen, M.; Schenk, A.; Bourquain, H.

    2004-01-01

    Purpose: computed tomography has become the preferred method for detecting liver carcinomas. The introduction of spiral CT added volumetric assessment of intrahepatic tumors, which was unattainable in the clinical routine with incremental CT due to complex planimetric revisions and excessive computing time. In an ongoing clinical study, a new software tool was tested for the automatic detection of tumor volume and the time needed for this procedure. Materials and methods: we analyzed patients suffering from hepatocellular carcinoma (HCC). All patients underwent treatment with repeated transcatheter chemoembolization of the hepatic artery. The volumes of the HCC lesions detected in CT were measured with the new software tool in HepaVison (MeVis, Germany). The results were compared with manual planimetric calculation of the volume performed by three independent radiologists. Results: our first results in 16 patients show a correlation of 96.8% between the automatically and manually calculated volumes (differences of up to 2 ml). While the manual method of analyzing the volume of a lesion requires 2.5 minutes on average, the automatic method merely requires about 30 seconds of user interaction time. Conclusion: these preliminary results show a good correlation between automatic and manual calculations of the tumor volume. The new software tool requires less time for accurate determination of the tumor volume and can be applied in the daily clinical routine. (orig.) [de

  2. Software tools for quantification of X-ray microtomography at the UGCT

    Energy Technology Data Exchange (ETDEWEB)

    Vlassenbroeck, J. [Department of Subatomic and Radiation Physics, Ghent University, Proeftuinstraat 86, B-9000 Gent (Belgium)], E-mail: jelle.vlassenbroeck@ugent.be; Dierick, M.; Masschaele, B. [Department of Subatomic and Radiation Physics, Ghent University, Proeftuinstraat 86, B-9000 Gent (Belgium); Cnudde, V. [Department of Geology and Soil Science, Ghent University, Krijgslaan 281/S8, B-9000 Gent (Belgium); Van Hoorebeke, L. [Department of Subatomic and Radiation Physics, Ghent University, Proeftuinstraat 86, B-9000 Gent (Belgium); Jacobs, P. [Department of Geology and Soil Science, Ghent University, Krijgslaan 281/S8, B-9000 Gent (Belgium)

    2007-09-21

    The technique of X-ray microtomography using X-ray tube radiation offers an interesting tool for the non-destructive investigation of a wide range of materials. A major challenge lies in the analysis and quantification of the resulting data, allowing for a full characterization of the sample under investigation. In this paper, we discuss the software tools for reconstruction and analysis of tomographic data that are being developed at the UGCT. The tomographic reconstruction is performed using Octopus, a high-performance and user-friendly software package. The reconstruction process transforms the raw acquisition data into a stack of 2D cross-sections through the sample, resulting in a 3D data set. A number of artifact and noise reduction algorithms are integrated to reduce ring artifacts, beam hardening artifacts, centre-of-rotation (COR) misalignment, detector or stage tilt, pixel non-linearities, etc. These corrections are very important to facilitate the analysis of the 3D data. The analysis of the 3D data focuses primarily on the characterization of pore structures, but will be extended to other applications. A first package for the analysis of pore structures in three dimensions was developed under Matlab. A new package, called Morpho+, is being developed in a C++ environment, with optimizations and extensions of the previously used algorithms. The current status of this project will be discussed. Examples of pore analysis can be found in pharmaceuticals, material science, geology and numerous other fields.
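
    Pore characterization of the kind Morpho+ performs typically starts from a grey-value threshold of the reconstructed volume followed by 3D connected-component labelling. A minimal sketch under those assumptions (illustrative, not the Morpho+ source):

        import numpy as np
        from scipy import ndimage

        def pore_statistics(volume, threshold):
            """Label pores in a reconstructed CT volume and report their sizes.

            volume    -- 3D grey-value array from tomographic reconstruction
            threshold -- grey value below which a voxel counts as pore space
            """
            pores = volume < threshold                  # binary pore mask
            labels, n_pores = ndimage.label(pores)      # 3D connected components
            sizes = ndimage.sum(pores, labels, index=range(1, n_pores + 1))
            porosity = pores.mean()                     # pore volume fraction
            return n_pores, sizes, porosity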

  3. Economic Consequence Analysis of Disasters: The ECAT Software Tool

    Energy Technology Data Exchange (ETDEWEB)

    Rose, Adam; Prager, Fynn; Chen, Zhenhua; Chatterjee, Samrat; Wei, Dan; Heatwole, Nathaniel; Warren, Eric

    2017-04-15

    This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences of numerous natural, man-made and technological threats. The resulting software tool, E-CAT (Economic Consequence Analysis Tool), is intended for use by various decision makers and analysts, is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use, and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
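
    The reduced-form idea, compressing many CGE runs into a single regression, can be sketched as follows; the variable names and the linear specification are illustrative, not the actual E-CAT equation:

        import numpy as np

        # Synthetic data standing in for CGE simulation runs: each row holds
        # threat characteristics / background conditions, y holds GDP loss.
        rng = np.random.default_rng(0)
        X = rng.uniform(size=(500, 3))          # e.g. duration, severity, resilience
        beta_true = np.array([2.0, 5.0, -3.0])
        y = X @ beta_true + rng.normal(scale=0.1, size=500)

        # Fit the single reduced-form regression (first column is the intercept).
        A = np.column_stack([np.ones(len(X)), X])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)

        # Rapid consequence estimate for a new event profile (leading 1.0 = intercept).
        new_event = np.array([1.0, 0.4, 0.7, 0.2])
        print("estimated loss:", new_event @ coef)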

  4. Computational Tools and Algorithms for Designing Customized Synthetic Genes

    Energy Technology Data Exchange (ETDEWEB)

    Gould, Nathan [Department of Computer Science, The College of New Jersey, Ewing, NJ (United States); Hendy, Oliver [Department of Biology, The College of New Jersey, Ewing, NJ (United States); Papamichail, Dimitris, E-mail: papamicd@tcnj.edu [Department of Computer Science, The College of New Jersey, Ewing, NJ (United States)

    2014-10-06

    Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features, and verifying hypotheses on the functional information that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein-coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review, we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as identify their strengths and limitations.
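
    Codon usage optimization, the archetypal first-generation objective mentioned in this record, can be sketched in a few lines by back-translating each amino acid to the host's most frequent codon. The codon table here is a tiny illustrative fragment, not real usage data for any organism:

        # Most-frequent codon per amino acid (illustrative fragment only).
        PREFERRED = {"M": "ATG", "A": "GCG", "K": "AAA", "L": "CTG", "*": "TAA"}

        def codon_optimize(protein):
            """Naive single-objective design: pick the host's preferred codon."""
            return "".join(PREFERRED[aa] for aa in protein)

        print(codon_optimize("MAKL*"))  # -> ATGGCGAAACTGTAA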

  5. Computational Tools and Algorithms for Designing Customized Synthetic Genes

    International Nuclear Information System (INIS)

    Gould, Nathan; Hendy, Oliver; Papamichail, Dimitris

    2014-01-01

    Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features, and verifying hypotheses on the functional information that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein-coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review, we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as identify their strengths and limitations.

  6. In-depth evaluation of software tools for data-independent acquisition based label-free quantification.

    Science.gov (United States)

    Kuharev, Jörg; Navarro, Pedro; Distler, Ute; Jahn, Olaf; Tenzer, Stefan

    2015-09-01

    Label-free quantification (LFQ) based on data-independent acquisition workflows is currently gaining popularity. Several software tools have recently been published or are commercially available. The present study focuses on the evaluation of three different software packages (Progenesis, synapter, and ISOQuant) supporting ion mobility enhanced data-independent acquisition data. In order to benchmark the LFQ performance of the different tools, we generated two hybrid proteome samples of defined quantitative composition containing tryptically digested proteomes of three different species (mouse, yeast, Escherichia coli). This model dataset simulates complex biological samples containing large numbers of both unregulated (background) proteins as well as up- and downregulated proteins with exactly known ratios between samples. We determined the number and dynamic range of quantifiable proteins and analyzed the influence of applied algorithms (retention time alignment, clustering, normalization, etc.) on quantification results. Analysis of technical reproducibility revealed median coefficients of variation of reported protein abundances below 5% for MS(E) data for Progenesis and ISOQuant. Regarding accuracy of LFQ, evaluation with synapter and ISOQuant yielded superior results compared to Progenesis. In addition, we discuss reporting formats and user friendliness of the software packages. The data generated in this study have been deposited to the ProteomeXchange Consortium with identifier PXD001240 (http://proteomecentral.proteomexchange.org/dataset/PXD001240). © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
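
    The reproducibility metric quoted above, the median coefficient of variation of reported protein abundances across technical replicates, is straightforward to compute. A minimal sketch, independent of the three evaluated packages:

        import numpy as np

        def median_cv(abundances):
            """Median coefficient of variation in percent across technical replicates.

            abundances -- array of shape (n_proteins, n_replicates)
            """
            mean = abundances.mean(axis=1)
            std = abundances.std(axis=1, ddof=1)
            return 100.0 * np.median(std / mean)

        runs = np.array([[1.00, 1.04, 0.98],
                         [0.50, 0.49, 0.52]])
        print(median_cv(runs))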

  7. Software Product Manager: A Mechanism to manage software products in small and medium ISVs

    NARCIS (Netherlands)

    Katchow, R.; van de Weerd, I.; Brinkkemper, S.; Rooswinkel, A.

    2009-01-01

    In this paper, we present SP Manager as an innovative tool for managing software products in small and medium independent software vendors (ISVs). This tool incorporates the operational software product management (SPM) processes focused on requirements management and release planning. By using

  8. Cognitive Load and Strategic Sophistication

    OpenAIRE

    Allred, Sarah; Duffy, Sean; Smith, John

    2013-01-01

    We study the relationship between the cognitive load manipulation and strategic sophistication. The cognitive load manipulation is designed to reduce the subject's cognitive resources that are available for deliberation on a choice. In our experiment, subjects are placed under a large cognitive load (given a difficult number to remember) or a low cognitive load (given a number which is not difficult to remember). Subsequently, the subjects play a one-shot game and are then asked to recall...

  9. Emerging role of bioinformatics tools and software in evolution of clinical research

    Directory of Open Access Journals (Sweden)

    Supreet Kaur Gill

    2016-01-01

    Clinical research strives to promote the health and wellbeing of people. There is a rapid increase in the number and severity of diseases like cancer, hepatitis and HIV, resulting in high morbidity and mortality. Clinical research involves drug discovery and development, whereas clinical trials are performed to establish the safety and efficacy of drugs. Drug discovery is a long process starting with target identification, validation and lead optimization. This is followed by preclinical trials, intensive clinical trials and eventually post-marketing vigilance for drug safety. Software and bioinformatics tools play a great role not only in drug discovery but also in drug development. This involves the use of informatics in the development of new knowledge pertaining to health and disease, data management during clinical trials, and the use of clinical data for secondary research. In addition, new technologies like molecular docking, molecular dynamics simulation, proteomics and quantitative structure-activity relationships make the drug discovery process in clinical research faster and easier. During preclinical trials, software is used for randomization to remove bias and to plan the study design. In clinical trials, software like electronic data capture, remote data capture and electronic case report forms (eCRF) is used to store the data. eClinical and Oracle Clinical are software packages used for clinical data management and for statistical analysis of the data. After a drug is marketed, its safety can be monitored by drug safety software like Oracle Argus or ARISg. Therefore, software is used from the very early stages of drug designing through drug development, clinical trials and pharmacovigilance. This review describes different aspects of the application of computers and bioinformatics in drug designing, discovery and development, formulation designing and clinical research.

  10. Software Quality Assessment Tool Based on Meta-Models

    OpenAIRE

    Doneva Rositsa; Gaftandzhieva Silvia; Doneva Zhelyana; Staevsky Nevena

    2015-01-01

    In the software industry it is indisputably essential to control the quality of produced software systems, in terms of capabilities for easy maintenance, reuse, portability and others, in order to ensure reliability in software development. But it is also clear that it is very difficult to achieve such control through a 'manual' management of quality. There are a number of approaches for software quality assurance based typically on software quality models (e.g. ISO 9126, McCall's, Boehm's...

  11. Towards a publicly available, map-based regional software tool to estimate unregulated daily streamflow at ungauged rivers

    Directory of Open Access Journals (Sweden)

    S. A. Archfield

    2013-01-01

    Streamflow information is critical for addressing any number of hydrologic problems. Often, streamflow information is needed at locations that are ungauged and, therefore, have no observations on which to base water management decisions. Furthermore, there has been increasing need for daily streamflow time series to manage rivers for both human and ecological functions. To facilitate negotiation between human and ecological demands for water, this paper presents the first publicly available, map-based, regional software tool to estimate historical, unregulated, daily streamflow time series (streamflow not affected by human alteration such as dams or water withdrawals) at any user-selected ungauged river location. The map interface allows users to locate and click on a river location, which then links to a spreadsheet-based program that computes estimates of daily streamflow for the river location selected. For a demonstration region in the northeast United States, daily streamflow was, in general, shown to be reliably estimated by the software tool. Estimating the highest and lowest streamflows that occurred in the demonstration region over the period from 1960 through 2004 also was accomplished but with more difficulty and limitations. The software tool provides a general framework that can be applied to other regions for which daily streamflow estimates are needed.

  12. Towards a publicly available, map-based regional software tool to estimate unregulated daily streamflow at ungauged rivers

    Science.gov (United States)

    Archfield, Stacey A.; Steeves, Peter A.; Guthrie, John D.; Ries, Kernell G.

    2013-01-01

    Streamflow information is critical for addressing any number of hydrologic problems. Often, streamflow information is needed at locations that are ungauged and, therefore, have no observations on which to base water management decisions. Furthermore, there has been increasing need for daily streamflow time series to manage rivers for both human and ecological functions. To facilitate negotiation between human and ecological demands for water, this paper presents the first publicly available, map-based, regional software tool to estimate historical, unregulated, daily streamflow time series (streamflow not affected by human alteration such as dams or water withdrawals) at any user-selected ungauged river location. The map interface allows users to locate and click on a river location, which then links to a spreadsheet-based program that computes estimates of daily streamflow for the river location selected. For a demonstration region in the northeast United States, daily streamflow was, in general, shown to be reliably estimated by the software tool. Estimating the highest and lowest streamflows that occurred in the demonstration region over the period from 1960 through 2004 also was accomplished but with more difficulty and limitations. The software tool provides a general framework that can be applied to other regions for which daily streamflow estimates are needed.
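
    One classical building block for transferring daily streamflow from a gauged to an ungauged site is the drainage-area-ratio estimate, Q_ungauged(t) = Q_gauged(t) * (A_ungauged / A_gauged). The sketch below illustrates only that generic idea; it is not the specific method implemented by the tool described in this record:

        def transfer_streamflow(q_gauged, area_gauged_km2, area_ungauged_km2):
            """Drainage-area-ratio transfer of a daily streamflow series.

            q_gauged -- list of daily discharges at the reference gauge (m^3/s)
            """
            ratio = area_ungauged_km2 / area_gauged_km2
            return [q * ratio for q in q_gauged]

        # Ungauged basin half the size of the gauged reference basin.
        print(transfer_streamflow([12.0, 15.5, 9.8], 200.0, 100.0))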

  13. Fluctuating Finite Element Analysis (FFEA): A continuum mechanics software tool for mesoscale simulation of biomolecules.

    Directory of Open Access Journals (Sweden)

    Albert Solernou

    2018-03-01

    Fluctuating Finite Element Analysis (FFEA) is a software package designed to perform continuum mechanics simulations of proteins and other globular macromolecules. It combines conventional finite element methods with stochastic thermal noise, and is appropriate for simulations of large proteins and protein complexes at the mesoscale (length-scales in the range of 5 nm to 1 μm), where there is currently a paucity of modelling tools. It requires 3D volumetric information as input, which can be low resolution structural information such as cryo-electron tomography (cryo-ET) maps or much higher resolution atomistic co-ordinates from which volumetric information can be extracted. In this article we introduce our open source software package for performing FFEA simulations which we have released under a GPLv3 license. The software package includes a C++ implementation of FFEA, together with tools to assist the user to set up the system from Electron Microscopy Data Bank (EMDB) or Protein Data Bank (PDB) data files. We also provide a PyMOL plugin to perform basic visualisation and additional Python tools for the analysis of FFEA simulation trajectories. This manuscript provides a basic background to the FFEA method, describing the implementation of the core mechanical model and how intermolecular interactions and the solvent environment are included within this framework. We provide prospective FFEA users with a practical overview of how to set up an FFEA simulation with reference to our publicly available online tutorials and manuals that accompany this first release of the package.

  14. Fluctuating Finite Element Analysis (FFEA): A continuum mechanics software tool for mesoscale simulation of biomolecules.

    Science.gov (United States)

    Solernou, Albert; Hanson, Benjamin S; Richardson, Robin A; Welch, Robert; Read, Daniel J; Harlen, Oliver G; Harris, Sarah A

    2018-03-01

    Fluctuating Finite Element Analysis (FFEA) is a software package designed to perform continuum mechanics simulations of proteins and other globular macromolecules. It combines conventional finite element methods with stochastic thermal noise, and is appropriate for simulations of large proteins and protein complexes at the mesoscale (length-scales in the range of 5 nm to 1 μm), where there is currently a paucity of modelling tools. It requires 3D volumetric information as input, which can be low resolution structural information such as cryo-electron tomography (cryo-ET) maps or much higher resolution atomistic co-ordinates from which volumetric information can be extracted. In this article we introduce our open source software package for performing FFEA simulations which we have released under a GPLv3 license. The software package includes a C++ implementation of FFEA, together with tools to assist the user to set up the system from Electron Microscopy Data Bank (EMDB) or Protein Data Bank (PDB) data files. We also provide a PyMOL plugin to perform basic visualisation and additional Python tools for the analysis of FFEA simulation trajectories. This manuscript provides a basic background to the FFEA method, describing the implementation of the core mechanical model and how intermolecular interactions and the solvent environment are included within this framework. We provide prospective FFEA users with a practical overview of how to set up an FFEA simulation with reference to our publicly available online tutorials and manuals that accompany this first release of the package.
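
    The core FFEA idea, conventional continuum mechanics plus stochastic thermal noise, can be conveyed by a one-degree-of-freedom analogue. The sketch below integrates an overdamped Langevin equation with the Euler-Maruyama scheme; it is a schematic stand-in, not FFEA's 3D fluctuating finite element model:

        import numpy as np

        def overdamped_langevin(x0, k, gamma, kT, dt, n_steps, seed=0):
            """x' = -(k/gamma) x + thermal noise; Euler-Maruyama integration."""
            rng = np.random.default_rng(seed)
            x = np.empty(n_steps)
            x[0] = x0
            noise_scale = np.sqrt(2.0 * kT * dt / gamma)  # fluctuation-dissipation
            for i in range(1, n_steps):
                x[i] = x[i-1] - (k / gamma) * x[i-1] * dt + noise_scale * rng.normal()
            return x

        traj = overdamped_langevin(x0=1.0, k=1.0, gamma=1.0, kT=0.1, dt=1e-3, n_steps=10000)
        print("mean square displacement about origin:", np.mean(traj[5000:]**2))  # ~ kT/k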

  15. Moral foundations and political attitudes: The moderating role of political sophistication.

    Science.gov (United States)

    Milesi, Patrizia

    2016-08-01

    Political attitudes can be associated with moral concerns. This research investigated whether people's level of political sophistication moderates this association. Based on the Moral Foundations Theory, this article examined whether political sophistication moderates the extent to which reliance on moral foundations, as categories of moral concerns, predicts judgements about policy positions. With this aim, two studies examined four policy positions shown by previous research to be best predicted by the endorsement of Sanctity, that is, the category of moral concerns focused on the preservation of physical and spiritual purity. The results showed that reliance on Sanctity predicted political sophisticates' judgements, as opposed to those of unsophisticates, on policy positions dealing with equal rights for same-sex and unmarried couples and with euthanasia. Political sophistication also interacted with Fairness endorsement, which includes moral concerns for equal treatment of everybody and reciprocity, in predicting judgements about equal rights for unmarried couples, and interacted with reliance on Authority, which includes moral concerns for obedience and respect for traditional authorities, in predicting opposition to stem cell research. Those findings suggest that, at least for these particular issues, endorsement of moral foundations can be associated with political attitudes more strongly among sophisticates than unsophisticates. © 2015 International Union of Psychological Science.

  16. Strengthening Software Authentication with the ROSE Software Suite

    International Nuclear Information System (INIS)

    White, G

    2006-01-01

    Many recent nonproliferation and arms control software projects include a software authentication regime. These include U.S. Government-sponsored projects both in the United States and in the Russian Federation (RF). This trend toward requiring software authentication is only accelerating. Demonstrating assurance that software performs as expected without hidden "backdoors" is crucial to a project's success. In this context, "authentication" is defined as determining that a software package performs only its intended purpose and performs said purpose correctly and reliably over the planned duration of an agreement. In addition to visual inspections by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and not extensible. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool has to be based on a complete language compiler. ROSE is precisely such a compiler infrastructure developed within the Department of Energy (DOE) and targeted at the optimization of scientific applications and user-defined libraries within large-scale applications (typically applications of a million lines of code). ROSE is a robust, source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C and C++ (handling the full C, C99, C++ languages and with current collaborations to support Fortran90). We propose to extend ROSE to address a number of security-specific requirements, and apply it to software authentication for nonproliferation and arms control projects.

  17. Data analysis software tools for enhanced collaboration at the DIII-D National Fusion Facility

    International Nuclear Information System (INIS)

    Schachter, J.; Peng, Q.; Schissel, D.P.

    2000-01-01

    Data analysis at the DIII-D National Fusion Facility is simplified by the use of two software packages in analysis codes. The first is 'GAPlotObj', an IDL-based object-oriented library used in visualization tools for dynamic plotting. GAPlotObj gives users the ability to manipulate graphs directly through mouse and keyboard-driven commands. The second software package is 'MDSplus', which is used at DIII-D as a central repository for analyzed data. GAPlotObj and MDSplus reduce the effort required for a collaborator to become familiar with the DIII-D analysis environment by providing uniform interfaces for data display and retrieval. Two visualization tools at DIII-D that benefit from them are 'ReviewPlus' and 'EFITviewer'. ReviewPlus is capable of displaying interactive 2D and 3D graphs of raw, analyzed, and simulation code data. EFITviewer is used to display results from the EFIT analysis code together with kinetic profiles and machine geometry. Both bring new possibilities for data exploration to the user, and are able to plot data from any fusion research site with an MDSplus data server
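
    Retrieving analyzed data from an MDSplus server of the kind described here takes only a few lines with the MDSplus thin-client Python bindings; the server name, tree name, shot number and signal path below are placeholders, not actual DIII-D identifiers:

        from MDSplus import Connection  # MDSplus thin-client Python bindings

        conn = Connection("mdsplus.example.org")         # placeholder data server
        conn.openTree("results", 12345)                  # placeholder tree name and shot number
        density = conn.get("\\ELECTRON_DENSITY").data()  # placeholder signal path
        time = conn.get("dim_of(\\ELECTRON_DENSITY)").data()
        print(len(time), "samples retrieved")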

  18. Data Analysis Software Tools for Enhanced Collaboration at the DIII-D National Fusion Facility

    International Nuclear Information System (INIS)

    Schachter, J.; Peng, Q.; Schissel, D.P.

    1999-01-01

    Data analysis at the DIII-D National Fusion Facility is simplified by the use of two software packages in analysis codes. The first is GAPlotObj, an IDL-based object-oriented library used in visualization tools for dynamic plotting. GAPlotObj gives users the ability to manipulate graphs directly through mouse and keyboard-driven commands. The second software package is MDSplus, which is used at DIII-D as a central repository for analyzed data. GAPlotObj and MDSplus reduce the effort required for a collaborator to become familiar with the DIII-D analysis environment by providing uniform interfaces for data display and retrieval. Two visualization tools at DIII-D that benefit from them are ReviewPlus and EFITviewer. ReviewPlus is capable of displaying interactive 2D and 3D graphs of raw, analyzed, and simulation code data. EFITviewer is used to display results from the EFIT analysis code together with kinetic profiles and machine geometry. Both bring new possibilities for data exploration to the user, and are able to plot data from any fusion research site with an MDSplus data server.

  19. Software system safety

    Science.gov (United States)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  20. CMS software deployment on OSG

    International Nuclear Information System (INIS)

    Kim, B; Avery, P; Thomas, M; Wuerthwein, F

    2008-01-01

    A set of software deployment tools has been developed for the installation, verification, and removal of a CMS software release. The tools, which mainly target deployment on the OSG, feature instant release deployment, corrective resubmission of the initial installation job, and an independent web-based deployment portal with a Grid Security Infrastructure login mechanism. We have performed over 500 installations and found the tools reliable and adaptable in coping with changes in the Grid computing environment and in the software releases. We present the design of the tools, statistics gathered during their operation, and our experience with CMS software deployment on the OSG Grid computing environment.

  1. CMS software deployment on OSG

    Energy Technology Data Exchange (ETDEWEB)

    Kim, B; Avery, P [University of Florida, Gainesville, FL 32611 (United States); Thomas, M [California Institute of Technology, Pasadena, CA 91125 (United States); Wuerthwein, F [University of California at San Diego, La Jolla, CA 92093 (United States)], E-mail: bockjoo@phys.ufl.edu, E-mail: thomas@hep.caltech.edu, E-mail: avery@phys.ufl.edu, E-mail: fkw@fnal.gov

    2008-07-15

    A set of software deployment tools has been developed for the installation, verification, and removal of a CMS software release. The tools, which mainly target deployment on the OSG, feature instant release deployment, corrective resubmission of the initial installation job, and an independent web-based deployment portal with a Grid Security Infrastructure login mechanism. We have performed over 500 installations and found the tools reliable and adaptable in coping with changes in the Grid computing environment and in the software releases. We present the design of the tools, statistics gathered during their operation, and our experience with CMS software deployment on the OSG Grid computing environment.

  2. Aristotle and Social-Epistemic Rhetoric: The Systematizing of the Sophistic Legacy.

    Science.gov (United States)

    Allen, James E.

    While Aristotle's philosophical views are more foundational than those of many of the Older Sophists, Aristotle's rhetorical theories inherit and incorporate many of the central tenets ascribed to Sophistic rhetoric, albeit in a more systematic fashion, as represented in the "Rhetoric." However, Aristotle was more than just a rhetorical…

  3. DIDACTIC PRINCIPLES AND PSYCHOLOGICAL CHARACTERISTICS IN DEFINITION OF QUALITY OF SOFTWARE TOOLS FOR EDUCATIONAL PURPOSE IN THE GENERAL EDUCATIONAL ENVIRONMENT OF UKRAINE

    Directory of Open Access Journals (Sweden)

    Maryna V. Pirko

    2011-02-01

    The fundamental feature of the economy of post-industrial society is knowledge, which represents the basic source of competitive advantage. The article considers and describes the circle of didactic and psychological indicators used in research on achieving a high degree of quality of education and educational services. Attention is paid to the pedagogical requirements of the given period, which form a normative basis for estimating the quality of software tools for educational purposes in the general educational environment of Ukraine. The scheme of the internal model for maintaining the quality of software tools for educational purposes is considered, and the aspects integrated by this internal model are listed. The article describes the directions of research under the conditions of formation of a global international educational environment and a uniform information space for the education system, taking into account the growing availability of educational services. The main principles of the organization of pedagogical software tools are specified.

  4. The electronic view box: a software tool for radiation therapy treatment verification

    International Nuclear Information System (INIS)

    Bosch, Walter R.; Low, Daniel A.; Gerber, Russell L.; Michalski, Jeff M.; Graham, Mary V.; Perez, Carlos A.; Harms, William B.; Purdy, James A.

    1995-01-01

    Purpose: We have developed a software tool for interactively verifying treatment plan implementation. The Electronic View Box (EVB) tool copies the paradigm of current practice but does so electronically. A portal image (online portal image or digitized port film) is displayed side by side with a prescription image (digitized simulator film or digitally reconstructed radiograph). The user can measure distances between features in prescription and portal images and 'write' on the display, either to approve the image or to indicate required corrective actions. The EVB tool also provides several features not available in conventional verification practice using a light box. Methods and Materials: The EVB tool has been written in ANSI C using the X window system. The tool makes use of the Virtual Machine Platform and Foundation Library specifications of the NCI-sponsored Radiation Therapy Planning Tools Collaborative Working Group for portability into an arbitrary treatment planning system that conforms to these specifications. The present EVB tool is based on an earlier Verification Image Review tool, but with a substantial redesign of the user interface. A graphical user interface prototyping system was used in iteratively refining the tool layout to allow rapid modifications of the interface in response to user comments. Results: Features of the EVB tool include 1) hierarchical selection of digital portal images based on physician name, patient name, and field identifier; 2) side-by-side presentation of prescription and portal images at equal magnification and orientation, and with independent grayscale controls; 3) 'trace' facility for outlining anatomical structures; 4) 'ruler' facility for measuring distances; 5) zoomed display of corresponding regions in both images; 6) image contrast enhancement; and 7) communication of portal image evaluation results (approval, block modification, repeat image acquisition, etc.). Conclusion: The EVB tool facilitates the rapid

  5. The D2G2 project: a new software tool for nuclear engineering design in Canada

    International Nuclear Information System (INIS)

    Rheaume, P.; Lefebvre, J.F.; Roy, R.; Koclas, J.

    2004-01-01

    Nowadays, high quality neutronic simulation codes are readily available. The open source software suite DRAGON/DONJON is a good example: it is free, it has proven quality and correctness over the years, and it is still developed and maintained at Ecole Polytechnique de Montreal. However, most simulation codes have the following weaknesses: limited usability, poor maintainability, no internal data standardization and poor portability. The D2G2 project is a software development initiative which aims to create an upper-layer software tool that eliminates the weaknesses of classic simulation codes. This paper presents D2G2Client's and D2G2Server's principal capabilities, how they interact and the libraries they use. (author)

  6. Semantic integration of gene expression analysis tools and data sources using software connectors

    Science.gov (United States)

    2013-01-01

    Background The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated with the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools

  7. The software improvement process - tools and rules to encourage quality

    International Nuclear Information System (INIS)

    Sigerud, K.; Baggiolini, V.

    2012-01-01

    The Applications section of the CERN accelerator controls group has decided to apply a systematic approach to quality assurance (QA), the 'Software Improvement Process' (SIP). This process focuses on three areas: the development process itself, suitable QA tools, and how to practically encourage developers to do QA. For each stage of the development process we have agreed on the recommended activities and deliverables, and identified tools to automate and support the task. For example, we do more code reviews. As peer reviews are resource intensive, we only do them for complex parts of a product. As a complement, we are using static code checking tools, like FindBugs and Checkstyle. We also encourage unit testing and have agreed on a minimum level of test coverage recommended for all products, measured using Clover. Each of these tools is well integrated with our IDE (Eclipse) and gives instant feedback to the developer about the quality of their code. The major challenges of SIP have been 1) agreeing on common standards and configurations, for example common code formatting and Javadoc documentation guidelines, and 2) practically encouraging the developers to do QA. To address the second point, we have successfully implemented 'SIP days', i.e. one day dedicated to QA work in which the whole group of developers participates, and 'Top/Flop' lists, clearly indicating the best and worst products with regards to SIP guidelines and standards, for example test coverage. This paper presents the SIP initiative in more detail, summarizing our experience over two years and our future plans. (authors)

  8. Embracing Open Software Development in Solar Physics

    Science.gov (United States)

    Hughitt, V. K.; Ireland, J.; Christe, S.; Mueller, D.

    2012-12-01

    We discuss two ongoing software projects in solar physics that have adopted best practices of the open source software community. The first, the Helioviewer Project, is a powerful data visualization tool which includes online and Java interfaces inspired by Google Maps (tm). This effort allows users to find solar features and events of interest, and download the corresponding data. Having found data of interest, the user now has to analyze it. The dominant solar data analysis platform is an open-source library called SolarSoft (SSW). Although SSW itself is open-source, the programming language used is IDL, a proprietary language with licensing costs that are prohibitive for many institutions and individuals. SSW is composed of a collection of related scripts written by missions and individuals for solar data processing and analysis, without any consistent data structures or common interfaces. Further, at the time when SSW was initially developed, many of the best software development processes of today (mirrored and distributed version control, unit testing, continuous integration, etc.) were not standard, and have not since been adopted. The challenges inherent in developing SolarSoft led to a second software project known as SunPy. SunPy is an open-source Python-based library which seeks to create a unified solar data analysis environment including a number of core datatypes such as Maps, Lightcurves, and Spectra which have consistent interfaces and behaviors. By taking advantage of the large and sophisticated body of scientific software already available in Python (e.g. SciPy, NumPy, Matplotlib), and by adopting many of the best practices refined in open-source software development, SunPy has been able to develop at a very rapid pace while still ensuring a high level of reliability. The Helioviewer Project and SunPy represent two pioneering technologies in solar physics - simple yet flexible data visualization and a powerful, new data analysis environment. We

  9. The UP3-UP2 800 reprocessing plants control systems. Use of tools for the diagnosis, the track of control softwares and the management of technical data

    International Nuclear Information System (INIS)

    Chabert, J.; Michon, J.C.

    1995-01-01

    After a rapid presentation of the control system architectures of the La Hague COGEMA reprocessing plants, details are given about the tools used to master the control and instrumentation software and technical data. The paper focusses more particularly on the CML (Software Maintenance Center) tool, which manages the software versions installed on the driving system; on the SYDDEX tool, devoted to the management of the control and instrumentation associated data and documents; and on the SAD tool, used for diagnosis assistance. (J.S.). 5 figs

  10. Sustaining an Online, Shared Community Resource for Models, Robust Open source Software Tools and Data for Volcanology - the Vhub Experience

    Science.gov (United States)

    Patra, A. K.; Valentine, G. A.; Bursik, M. I.; Connor, C.; Connor, L.; Jones, M.; Simakov, N.; Aghakhani, H.; Jones-Ivey, R.; Kosar, T.; Zhang, B.

    2015-12-01

    Over the last 5 years we have created a community collaboratory, Vhub.org [Palma et al, J. App. Volc. 3:2 doi:10.1186/2191-5040-3-2], as a place to find volcanology-related resources, a venue for users to disseminate tools, teaching resources and data, and an online platform to support collaborative efforts. As the community (current active users > 6000, from an estimated community of comparable size) embeds the tools in the collaboratory into educational and research workflows, it became imperative to: a) redesign tools into robust, open source, reusable software for online and offline usage/enhancement; b) share large datasets with remote collaborators and other users seamlessly and with security; c) support complex workflows for uncertainty analysis, validation and verification, and data assimilation with large data. The focus on tool development/redevelopment has been twofold: firstly, to use best practices in software engineering and new hardware like multi-core and graphics processing units; secondly, to enhance capabilities to support inverse modeling, uncertainty quantification using large ensembles and design of experiments, calibration, and validation. Among the software engineering practices we follow are open source development facilitating community contributions, modularity and reusability. Our initial targets are four popular tools on Vhub: TITAN2D, TEPHRA2, PUFF and LAVA. Use of tools like these requires many observation-driven data sets, e.g. digital elevation models of topography, satellite imagery, field observations on deposits, etc. These data are often maintained in private repositories and shared privately by "sneaker-net". As a partial solution to this we tested mechanisms using iRODS software for online sharing of private data with public metadata and access limits. Finally, we adapted workflow engines (e.g. Pegasus) to support the complex data and computing workflows needed for usage like uncertainty quantification for hazard analysis using physical

  11. DVS-SOFTWARE: An Effective Tool for Applying Highly Parallelized Hardware To Computational Geophysics

    Science.gov (United States)

    Herrera, I.; Herrera, G. S.

    2015-12-01

    Most geophysical systems are macroscopic physical systems. The behavior prediction of such systems is carried out by means of computational models whose basic models are partial differential equations (PDEs) [1]. Due to the enormous size of the discretized version of such PDEs it is necessary to apply highly parallelized super-computers. For them, at present, the most efficient software is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of the present state-of-the-art techniques is due to the kind of discretizations used in them. Recently, I. Herrera and co-workers, using 'non-overlapping discretizations', have produced the DVS-Software which overcomes this limitation [2]. The DVS-Software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (90% or so [3]). It is therefore very suitable for effectively applying the most advanced parallel supercomputers available at present. In a parallel talk in this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MODFLOW. Key Words: Parallel Software for Geophysics, High Performance Computing, HPC, Parallel Computing, Domain Decomposition Methods (DDM). REFERENCES: [1] Herrera, Ismael and George F. Pinder, "Mathematical Modelling in Science and Engineering: An axiomatic approach", John Wiley, 243p., 2012. [2] Herrera, I., de la Cruz, L.M. and Rosas-Medina, A., "Non-Overlapping Discretization Methods for Partial Differential Equations", NUMER METH PART D E, 30: 1427-1454, 2014, DOI 10.1002/num.21852. (Open source) [3] Herrera, I., & Contreras, Iván, "An Innovative Tool for Effectively Applying Highly Parallelized Software To Problems of Elasticity", Geofísica Internacional, 2015 (In press)

  12. APPLICATION OF THE SPECTRUM ANALYSIS WITH USING BERG METHOD TO DEVELOPED SPECIAL SOFTWARE TOOLS FOR OPTICAL VIBRATION DIAGNOSTICS SYSTEM

    Directory of Open Access Journals (Sweden)

    E. O. Zaitsev

    2016-01-01

    The objective of this paper is the development and experimental verification of special spectral analysis software for use with objects under vibration control. The spectral analysis of vibration is based on the maximum-entropy autoregressive method of spectral analysis using the Berg algorithm. Measured signals first undergo a preliminary analysis based on regression analysis, which eliminates uninformative components of the signal such as noise and trend; special software tools were developed for this preliminary analysis. Non-contact measurement of the mechanical vibration parameters of rotating, diffusely-reflecting surfaces is used in circumstances where the use of contact sensors is difficult or impossible for a number of reasons, including lack of access to the object, the small size of the controlled area, or a controlled portion that has a high temperature or is affected by strong electromagnetic fields. A laser measuring system is proposed for this control task; it overcomes the shortcomings of interferometric or Doppler optical measuring systems, such as measuring large-amplitude and inharmonious vibration. On the basis of the proposed methods, special software tools for the laser measuring system were developed using LabVIEW. The proposed method of vibration signal processing was verified experimentally in the analysis of diagnostic information obtained by measuring the vibration of a system grinding the cold solid tungsten-containing alloy TK8 with a diamond wheel. The result of the special software tools was a complex spectrum 'purified' of non-informative parameters, corresponding to the vibration process of the observed object.
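
    The maximum-entropy autoregressive method referred to here is usually transliterated as Burg's method. Below is a compact, self-contained sketch of the Burg recursion and the resulting AR spectrum (an illustration of the algorithm, not the paper's LabVIEW implementation):

        import numpy as np

        def burg_ar(x, order):
            """Burg (maximum-entropy) estimate of AR coefficients.

            Returns (a, E): a[0..order] with a[0] = 1, and the residual power E.
            """
            f = np.asarray(x, dtype=float).copy()   # forward prediction errors
            b = f.copy()                            # backward prediction errors
            a = np.array([1.0])
            E = np.dot(f, f) / len(f)
            for _ in range(order):
                # reflection coefficient minimizing forward+backward error power
                num = -2.0 * np.dot(f[1:], b[:-1])
                den = np.dot(f[1:], f[1:]) + np.dot(b[:-1], b[:-1])
                k = num / den
                a = np.append(a, 0.0) + k * np.append(a, 0.0)[::-1]  # Levinson update
                f, b = f[1:] + k * b[:-1], b[:-1] + k * f[1:]        # error update
                E *= 1.0 - k * k
            return a, E

        # AR spectrum: P(f) = E / |A(e^{j 2 pi f})|^2, evaluated via the FFT.
        x = np.sin(2 * np.pi * 0.1 * np.arange(256)) \
            + 0.1 * np.random.default_rng(1).normal(size=256)
        a, E = burg_ar(x, order=8)
        psd = E / np.abs(np.fft.rfft(a, 512)) ** 2
        print("spectral peak at normalized frequency", np.argmax(psd) / 512)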

  13. The Planetary Data System (PDS) Data Dictionary Tool (LDDTool)

    Science.gov (United States)

    Raugh, Anne C.; Hughes, John S.

    2017-10-01

    One of the major design goals of the PDS4 development effort was to provide an avenue for discipline specialists and large data preparers, such as mission archivists, to extend the core PDS4 Information Model (IM) to include metadata definitions specific to their own contexts. This capability is critical for the Planetary Data System - an archive that deals with a data collection that is diverse along virtually every conceivable axis. Amid such diversity, it is in the best interests of the PDS archive and its users that all extensions to the core IM follow the same design techniques, conventions, and restrictions as the core implementation itself. Notwithstanding, expecting all mission and discipline archivists seeking to define metadata for a new context to acquire expertise in information modeling, model-driven design, ontology, schema formulation, and PDS4 design conventions and philosophy is unrealistic, to say the least. To bridge that expertise gap, the PDS Engineering Node has developed the data dictionary creation tool known as "LDDTool". This tool incorporates the same software used to maintain and extend the core IM, packaged with an interface that enables a developer to create his contextual information model using the same open standards-based metadata framework PDS itself uses. Through this interface, the novice dictionary developer has immediate access to the common set of data types and unit classes for defining attributes, and a straightforward method for constructing classes. The more experienced developer, using the same tool, has access to more sophisticated modeling methods like abstraction and extension, and can define very sophisticated validation rules. We present the key features of the PDS Local Data Dictionary Tool, which both supports the development of extensions to the PDS4 IM and ensures their compatibility with the IM.

  14. Development and verification of a software tool for the acoustic location of partial discharge in a power transformer

    Directory of Open Access Journals (Sweden)

    Polužanski Vladimir

    2014-01-01

    This paper discusses the development and verification of a software tool for determining the location of partial discharge in a power transformer with the acoustic method. A classification and systematization of the physical principles, detection methods and tests of partial discharge in power transformers is given at the beginning of the paper. The most important mathematical models, features, algorithms and real problems that affect measurement accuracy are highlighted. The paper then describes the development and implementation of a software tool for determining the location of partial discharge in a power transformer based on a non-iterative mathematical algorithm. The verification and accuracy of the measurement are demonstrated both by computer simulation and by experimental results available in the literature.

  15. Compensation of Digital Competence Deficiency with Software Ergonomic Tools

    Directory of Open Access Journals (Sweden)

    Zoltán Nyikes

    2018-03-01

    In the contemporary digital world, a large amount of information needs to be processed every day, and our security depends on its quick and accurate processing. Figure-based information processing is faster and simpler than text-based processing and demands a lower level of IT skills, so a system built on it can serve a broader social circle. The solution would be to make the use of info-communication tools accessible to people with low vision, people with learning difficulties, non-speakers, the elderly, and children who are unable to read because of their age. In this article, the author attempts to illustrate the difference between the processing of visual and textual information through a simple survey, and makes recommendations for the ergonomic compensation of digital competence gaps with software.

  16. Software Piracy in Research: A Moral Analysis.

    Science.gov (United States)

    Santillanes, Gary; Felder, Ryan Marshall

    2015-08-01

    Researchers in virtually every discipline rely on sophisticated proprietary software for their work. However, some researchers are unable to afford the licenses and instead procure the software illegally. We discuss the prohibition of software piracy by intellectual property laws, and argue that the moral basis for the copyright law offers the possibility of cases where software piracy may be morally justified. The ethics codes that scientific institutions abide by are informed by a rule-consequentialist logic: by preserving personal rights to authored works, people able to do so will be incentivized to create. By showing that the law has this rule-consequentialist grounding, we suggest that scientists who blindly adopt their institutional ethics codes will commit themselves to accepting that software piracy could be morally justified, in some cases. We hope that this conclusion will spark debate over important tensions between ethics codes, copyright law, and the underlying moral basis for these regulations. We conclude by offering practical solutions (other than piracy) for researchers.

  17. The predictors of economic sophistication: media, interpersonal communication and negative economic experiences

    NARCIS (Netherlands)

    Kalogeropoulos, A.; Albæk, E.; de Vreese, C.H.; van Dalen, A.

    2015-01-01

    In analogy to political sophistication, it is imperative that citizens have a certain level of economic sophistication, especially in times of heated debates about the economy. This study examines the impact of different influences (media, interpersonal communication and personal experiences) on

  18. SLDAssay: A software package and web tool for analyzing limiting dilution assays.

    Science.gov (United States)

    Trumble, Ilana M; Allmon, Andrew G; Archin, Nancie M; Rigdon, Joseph; Francis, Owen; Baldoni, Pedro L; Hudgens, Michael G

    2017-11-01

    Serial limiting dilution (SLD) assays are used in many areas of infectious disease related research. This paper presents SLDAssay, a free and publicly available R software package and web tool for analyzing data from SLD assays. SLDAssay computes the maximum likelihood estimate (MLE) for the concentration of target cells, with corresponding exact and asymptotic confidence intervals. Exact and asymptotic goodness of fit p-values, and a bias-corrected (BC) MLE are also provided. No other publicly available software currently implements the BC MLE or the exact methods. For validation of SLDAssay, results from Myers et al. (1994) are replicated. Simulations demonstrate the BC MLE is less biased than the MLE. Additionally, simulations demonstrate that exact methods tend to give better confidence interval coverage and goodness-of-fit tests with lower type I error than the asymptotic methods. Additional advantages of using exact methods are also discussed. Copyright © 2017 Elsevier B.V. All rights reserved.
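
    Under the standard single-hit Poisson model for limiting dilution assays, a well seeded with c cells from a sample with target-cell frequency theta is negative with probability exp(-theta*c). A minimal maximum-likelihood sketch under that model (illustrative only; SLDAssay itself adds exact intervals, goodness-of-fit tests and the bias-corrected MLE):

        import numpy as np
        from scipy.optimize import minimize_scalar

        # One SLD assay: cells plated per well, wells tested, wells positive.
        cells = np.array([1e6, 2e5, 4e4, 8e3])
        wells = np.array([2, 4, 6, 6])
        positive = np.array([2, 3, 2, 0])

        def neg_log_lik(log_theta):
            p_pos = 1.0 - np.exp(-np.exp(log_theta) * cells)  # single-hit model
            p_pos = np.clip(p_pos, 1e-12, 1 - 1e-12)
            return -np.sum(positive * np.log(p_pos)
                           + (wells - positive) * np.log(1 - p_pos))

        fit = minimize_scalar(neg_log_lik, bounds=(-25, 0), method="bounded")
        print("MLE target-cell frequency per cell:", np.exp(fit.x))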

  19. Validation of Tendril TrueHome Using Software-to-Software Comparison

    Energy Technology Data Exchange (ETDEWEB)

    Maguire, Jeffrey B [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Horowitz, Scott G [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Moore, Nathan [Tendril, Boulder, CO (United States); Sullivan, Patrick [Tendril, Boulder, CO (United States)

    2017-09-01

    This study performed a comparative evaluation of EnergyPlus version 8.6 and Tendril TrueHome, two physics-based home energy simulation models, to identify differences in energy consumption predictions between the two programs and resolve discrepancies between them. EnergyPlus is considered a benchmark, best-in-class software tool for building energy simulation. This exercise sought to improve both software tools through additional evaluation and scrutiny.

  20. ABAQUS2MATLAB: A Novel Tool for Finite Element Post-Processing

    DEFF Research Database (Denmark)

    Martínez Pañeda, Emilio; Papazafeiropoulos, George; Muniz-Calvente, Miguel

    2017-01-01

    A novel piece of software is presented to connect Abaqus, a sophisticated finite element package, with Matlab, the most comprehensive program for mathematical analysis. This interface between these well-known codes not only benefits from the image processing and the integrated graph-plotting features ... to demonstrate its capabilities. The source code, detailed documentation and a large number of tutorials can be freely downloaded from www.abaqus2matlab.com.

  1. Automated support for experience-based software management

    Science.gov (United States)

    Valett, Jon D.

    1992-01-01

    To effectively manage a software development project, the software manager must have access to key information concerning the project's status. This information includes not only data relating to the project of interest but also the experience of past development efforts within the environment. This paper describes the concepts and functionality of a software management tool designed to provide this information. This tool, called the Software Management Environment (SME), enables the software manager to compare an ongoing development effort with previous efforts and with models of the 'typical' project within the environment, to predict future project status, to analyze a project's strengths and weaknesses, and to assess the project's quality. In order to provide these functions the tool utilizes a vast corporate memory that includes a database of software metrics, a set of models and relationships that describe the software development environment, and a set of rules that capture other knowledge and experience of software managers within the environment. Integrating these major concepts into one software management tool, the SME is a model of the type of management tool needed by all software development organizations.

  2. The Computer-based Health Evaluation Software (CHES): a software for electronic patient-reported outcome monitoring

    Directory of Open Access Journals (Sweden)

    Holzner Bernhard

    2012-11-01

    Full Text Available Abstract Background Patient-reported Outcomes (PROs), capturing e.g. quality of life, fatigue, depression, medication side-effects or disease symptoms, have become important outcome parameters in medical research and daily clinical practice. Electronic PRO data capture (ePRO), with software packages to administer questionnaires, store data, and present results, has facilitated PRO assessment in hospital settings. Compared to conventional paper-pencil versions of PRO instruments, ePRO is more economical with regard to staff resources and time, and allows immediate presentation of results to the medical staff. The objective of our project was to develop software (CHES - Computer-based Health Evaluation System) for ePRO in hospital settings and at home, with a special focus on the presentation of individual patients' results. Methods Following the Extreme Programming development approach, the architecture was not fixed up-front but was developed in close, continuous collaboration with software end users (medical staff, researchers and patients) to meet their specific demands. Developed features include sophisticated longitudinal charts linking patients' PRO data to clinical characteristics and to PRO scores from reference populations, a web interface for questionnaire administration, and a tool for conveniently creating and editing questionnaires. Results By 2012 CHES had been implemented at various institutions in Austria, Germany, Switzerland, and the UK, and about 5000 patients had participated in ePRO (with around 15000 assessments in total). Data entry is done by the patients themselves via tablet PCs, with a study nurse or an intern approaching patients and supervising questionnaire completion. Discussion During the last decade several software packages for ePRO have emerged for different purposes. Whereas commercial products are available primarily for ePRO in clinical trials, academic projects have focused on data collection and presentation in daily

  3. User Guide for the STAYSL PNNL Suite of Software Tools

    Energy Technology Data Exchange (ETDEWEB)

    Greenwood, Lawrence R.; Johnson, Christian D.

    2013-02-27

    The STAYSL PNNL software suite provides a set of tools for working with neutron activation rates measured in a nuclear fission reactor, an accelerator-based neutron source, or any neutron field to determine the neutron flux spectrum through a generalized least-squares approach. This process is referred to as neutron spectral adjustment since the preferred approach is to use measured data to adjust neutron spectra provided by neutron physics calculations. The input data consist of the reaction rates based on measured activities, an initial estimate of the neutron flux spectrum, neutron activation cross sections and their associated uncertainties (covariances), and relevant correction factors. The output consists of the adjusted neutron flux spectrum and associated covariance matrix, which is useful for neutron dosimetry and radiation damage calculations.
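
    The generalized least-squares adjustment at the heart of such codes has a compact closed form: with prior spectrum phi0 (covariance M), response matrix A of activation cross sections, and measured reaction rates a (covariance V), the adjusted spectrum is phi0 + K(a - A phi0) with gain K = M A^T (A M A^T + V)^-1. The following is a minimal numerical sketch with invented dimensions and data, not the STAYSL PNNL implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented dimensions: 5 energy groups, 3 dosimetry reactions.
phi0 = np.ones(5)                           # prior flux spectrum
M = np.diag((0.20 * phi0) ** 2)             # prior covariance (20%, uncorrelated)
A = rng.uniform(0.1, 1.0, size=(3, 5))      # group-wise activation cross sections
a_true = A @ phi0
V = np.diag((0.05 * a_true) ** 2)           # measurement covariance (5%)
a = a_true * rng.normal(1.0, 0.05, size=3)  # "measured" reaction rates

# Generalized least-squares (GLS) spectral adjustment
K = M @ A.T @ np.linalg.inv(A @ M @ A.T + V)  # gain matrix
phi_adj = phi0 + K @ (a - A @ phi0)           # adjusted spectrum
M_adj = M - K @ A @ M                         # adjusted covariance

print(phi_adj)
```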

  4. ROCKETSHIP: a flexible and modular software tool for the planning, processing and analysis of dynamic MRI studies

    International Nuclear Information System (INIS)

    Barnes, Samuel R.; Ng, Thomas S. C.; Santa-Maria, Naomi; Montagne, Axel; Zlokovic, Berislav V.; Jacobs, Russell E.

    2015-01-01

    Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is a promising technique to characterize pathology and evaluate treatment response. However, analysis of DCE-MRI data is complex and benefits from concurrent analysis of multiple kinetic models and parameters. Few software tools are currently available that specifically focus on DCE-MRI analysis with multiple kinetic models. Here, we developed ROCKETSHIP, an open-source, flexible and modular software tool for DCE-MRI analysis. ROCKETSHIP incorporates analyses with multiple kinetic models, including data-driven nested model analysis. ROCKETSHIP was implemented using the MATLAB programming language. Robustness of the software to provide reliable fits using multiple kinetic models is demonstrated using simulated data. Simulations also demonstrate the utility of the data-driven nested model analysis. Applicability of ROCKETSHIP for both preclinical and clinical studies is shown using DCE-MRI studies of the human brain and a murine tumor model. A DCE-MRI software suite was implemented and tested using simulations. Its applicability to both preclinical and clinical datasets is shown. ROCKETSHIP was designed to be easily accessible for the beginner, but flexible enough for changes or additions to be made by the advanced user as well. The availability of a flexible analysis tool will aid future studies using DCE-MRI. A public release of ROCKETSHIP is available at https://github.com/petmri/ROCKETSHIP
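
    As a concrete example of the kind of kinetic model such packages fit, the standard Tofts model expresses tissue concentration as Ct(t) = Ktrans * integral of Cp(tau) exp(-kep (t - tau)) dtau, with kep = Ktrans/ve. The sketch below is a minimal forward computation in Python with an invented input function; it is illustrative only and not ROCKETSHIP's MATLAB implementation.

```python
import numpy as np

def tofts(t, cp, ktrans, ve):
    """Standard Tofts model: tissue concentration from a plasma input cp(t).

    t must be uniformly spaced; the convolution integral is approximated
    by a discrete convolution scaled by the time step.
    """
    dt = t[1] - t[0]
    kep = ktrans / ve
    return ktrans * np.convolve(cp, np.exp(-kep * t))[: len(t)] * dt

# Invented biexponential input function and example parameters
t = np.arange(0.0, 5.0, 0.05)                     # minutes
cp = 3.0 * (np.exp(-0.5 * t) - np.exp(-4.0 * t))  # mM (illustrative)
ct = tofts(t, cp, ktrans=0.2, ve=0.3)
```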

  5. Radioactive waste management registry. A software tool for managing information on waste inventory

    International Nuclear Information System (INIS)

    Miaw, S.T.W.

    2001-01-01

    The IAEA developed a software tool, the RWM Registry (Radioactive Waste Management Registry), which is primarily concerned with the management and recording of reliable information on radioactive waste during its life-cycle, i.e. from generation to disposal and beyond. In the current version, it aims to assist the management of waste from nuclear applications. The Registry is a managerial tool and offers an immediate overview of the various waste management steps and activities. This facilitates controlling and keeping track of waste and waste packages, planning, optimizing resources, monitoring related data, disseminating information, and taking actions and decisions related to waste management. Additionally, the quality control of waste products and a Member State's associated waste management quality assurance programme are addressed. The tool also facilitates providing information on the waste inventory as required by national regulatory bodies. The RWM Registry contains two modules which are described in detail

  6. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    Science.gov (United States)

    Monster, Marilyn; de Groot, Lennart; Dekkers, Mark

    2015-12-01

    The multispecimen protocol (MSP) is a method to estimate the Earth's magnetic field's past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.
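
    The estimation step behind such multispecimen tools can be sketched simply: the standard ratio Q is regressed against the applied laboratory field, the ancient field strength is taken where the regression crosses Q = 0, and a bootstrap gives a confidence interval. The numbers below are invented and the code is only a schematic of this approach, not MSP-Tool's VBA implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented multispecimen data: applied lab fields (uT) and standard ratios Q
h_lab = np.array([20.0, 30.0, 40.0, 50.0, 60.0])
q = np.array([-0.35, -0.15, 0.02, 0.21, 0.40])

def field_estimate(h, ratios):
    """Field at which the linear regression of Q on H crosses Q = 0."""
    slope, intercept = np.polyfit(h, ratios, 1)
    return -intercept / slope

boots = []
while len(boots) < 1000:
    idx = rng.integers(0, len(q), size=len(q))
    if np.unique(h_lab[idx]).size < 2:
        continue  # skip degenerate resamples with a single field value
    boots.append(field_estimate(h_lab[idx], q[idx]))

print(field_estimate(h_lab, q), np.percentile(boots, [2.5, 97.5]))
```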

  7. Computational Tools and Algorithms for Designing Customized Synthetic Genes

    Directory of Open Access Journals (Sweden)

    Nathan eGould

    2014-10-01

    Full Text Available Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de-novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features and in verifying hypotheses on the functional information encoded in nucleic acids and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein-coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as to identify their strengths and limitations.
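
    The simplest of the singular objectives mentioned above, codon usage optimization, reduces to a table lookup: each amino acid is back-translated to its most frequently used synonymous codon in the host. A toy Python sketch follows; the usage fractions are invented, and real tools use full host-specific tables and balance many more objectives.

```python
# Invented codon-usage fractions for a hypothetical host organism
CODON_USAGE = {
    "F": {"TTT": 0.45, "TTC": 0.55},
    "L": {"TTA": 0.05, "TTG": 0.10, "CTT": 0.10,
          "CTC": 0.10, "CTA": 0.05, "CTG": 0.60},
    "K": {"AAA": 0.75, "AAG": 0.25},
    "*": {"TAA": 0.60, "TGA": 0.30, "TAG": 0.10},
}

def most_frequent_codon_design(protein: str) -> str:
    """Greedy single-objective back-translation of a protein sequence."""
    return "".join(max(CODON_USAGE[aa], key=CODON_USAGE[aa].get)
                   for aa in protein)

print(most_frequent_codon_design("FLK*"))  # -> TTCCTGAAATAA
```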

  8. Software compensation in Particle Flow reconstruction

    CERN Document Server

    Lan Tran, Huong; Sefkow, Felix; Green, Steven; Marshall, John; Thomson, Mark; Simon, Frank

    2017-01-01

    The Particle Flow approach to calorimetry requires highly granular calorimeters and sophisticated software algorithms in order to reconstruct and identify individual particles in complex event topologies. The high spatial granularity, together with analog energy information, can be further exploited in software compensation. In this approach, the local energy density is used to discriminate electromagnetic and purely hadronic sub-showers within hadron showers in the detector to improve the energy resolution for single particles by correcting for the intrinsic non-compensation of the calorimeter system. This improvement in the single particle energy resolution also results in a better overall jet energy resolution by improving the energy measurement of identified neutral hadrons and improvements in the pattern recognition stage by a more accurate matching of calorimeter energies to tracker measurements. This paper describes the software compensation technique and its implementation in Particle Flow reconstruction with the Pandora Particle Flow Algorithm (PandoraPFA).
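
    Schematically, software compensation reweights each calorimeter hit according to the local energy density, with weights calibrated so that dense electromagnetic-like deposits are weighted down and diffuse hadronic deposits up. The bins and weights in this Python sketch are invented for illustration; any real implementation derives its weights from dedicated calibration data.

```python
import numpy as np

# Invented density bins (GeV/cm^3) and weights, for illustration only
BIN_EDGES = np.array([0.0, 2.0, 5.0, np.inf])
WEIGHTS = np.array([1.15, 1.00, 0.85])

def compensated_energy(hit_energies, hit_densities):
    """Sum hit energies reweighted by their local energy density bin."""
    idx = np.digitize(hit_densities, BIN_EDGES) - 1
    return float(np.sum(WEIGHTS[idx] * hit_energies))

print(compensated_energy(np.array([0.5, 1.2, 0.3]),
                         np.array([6.0, 1.0, 3.0])))  # -> ~2.105
```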

  9. Software development an open source approach

    CERN Document Server

    Tucker, Allen; de Silva, Chamindra

    2011-01-01

    Overview and Motivation Software Free and Open Source Software (FOSS) Two Case Studies Working with a Project Team Key FOSS Activities Client-Oriented vs. Community-Oriented Projects Working on a Client-Oriented Project Joining a Community-Oriented Project Using Project Tools Collaboration Tools Code Management Tools Run-Time System Constraints Software Architecture Architectural Patterns Layers, Cohesion, and Coupling Security Concurrency, Race Conditions, and Deadlocks Working with Code Bad Smells and Metrics Refactoring Testing Debugging Extending the Software for a New Project Developing the D

  10. Application of the PredictAD Software Tool to Predict Progression in Patients with Mild Cognitive Impairment

    DEFF Research Database (Denmark)

    Simonsen, Anja H; Mattila, Jussi; Hejl, Anne-Mette

    2012-01-01

    Clinical raters classified patients in four phases of incremental data presentation using the software tool; a 5th phase was done with all available patient data presented on paper charts. Classifications by the clinical raters were compared to the clinical diagnoses made by the Alzheimer's Disease Neuroimaging Initiative investigators. Results: A statistically significant trend in classification accuracy (from 62.6 to 70.0%) was found when using the PredictAD tool during the stepwise procedure. When the same data were presented on paper, the classification accuracy of the raters dropped significantly from 70.0 to 63.2%. Conclusion: Best classification accuracy was achieved by the clinical raters when using the tool for decision support, suggesting that the tool can add value in diagnostic classification when large amounts of heterogeneous data are presented.

  11. PAUL AND SOPHISTIC RHETORIC: A PERSPECTIVE ON HIS ...

    African Journals Online (AJOL)

    use of modern rhetorical theories but analyses the letter in terms of the classical ... If a critical reader would have had the traditional anti-sophistic arsenal ... expressions and that 'rhetoric' is mainly a matter of communicating these thoughts.

  12. Interactive software tool to comprehend the calculation of optimal sequence alignments with dynamic programming.

    Science.gov (United States)

    Ibarra, Ignacio L; Melo, Francisco

    2010-07-01

    Dynamic programming (DP) is a general optimization strategy that is successfully used across various disciplines of science. In bioinformatics, it is widely applied in calculating the optimal alignment between pairs of protein or DNA sequences. These alignments form the basis of new, verifiable biological hypotheses. Despite its importance, there are no interactive tools available for training and education on understanding the DP algorithm. Here, we introduce an interactive computer application with a graphical interface, for the purpose of educating students about DP. The program displays the DP scoring matrix and the resulting optimal alignment(s), while allowing the user to modify key parameters such as the values in the similarity matrix, the sequence alignment algorithm version and the gap opening/extension penalties. We hope that this software will be useful to teachers and students of bioinformatics courses, as well as researchers who implement the DP algorithm for diverse applications. The software is freely available at: http://melolab.org/sat. The software is written in the Java computer language, thus it runs on all major platforms and operating systems including Windows, Mac OS X and LINUX. All inquiries or comments about this software should be directed to Francisco Melo at fmelo@bio.puc.cl.
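
    For reference, the recurrence such a tool visualizes can be written in a few lines. The following is a minimal global-alignment scorer in the Needleman-Wunsch style (scores only, no traceback), independent of the Java application described above.

```python
def global_alignment_score(a, b, match=1, mismatch=-1, gap=-2):
    """Needleman-Wunsch dynamic programming, returning the optimal score."""
    n, m = len(a), len(b)
    F = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        F[i][0] = i * gap
    for j in range(1, m + 1):
        F[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            F[i][j] = max(F[i - 1][j - 1] + s,   # substitution
                          F[i - 1][j] + gap,     # gap in b
                          F[i][j - 1] + gap)     # gap in a
    return F[n][m]

print(global_alignment_score("GATTACA", "GCATGCU"))
```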

  13. Practical Analysis of the Dynamic Characteristics of JavaScript

    OpenAIRE

    Wei, Shiyi

    2015-01-01

    JavaScript is a dynamic object-oriented programming language, which is designed with flexible programming mechanisms. JavaScript is widely used in developing sophisticated software systems, especially web applications. Despite its popularity, there is a lack of software tools that support JavaScript for software engineering clients. Dataflow analysis approximates software behavior by analyzing the program code; it is the foundation for many software tools. However, several unique features...

  14. Development of computer-aided software engineering tool for sequential control of JT-60U

    International Nuclear Information System (INIS)

    Shimono, M.; Akasaka, H.; Kurihara, K.; Kimura, T.

    1995-01-01

    Discharge sequential control (DSC) is an essential control function for the intermittent and pulsed discharge operation of a tokamak device, so that many subsystems may work with each other in the correct order and/or synchronously. In the development of the DSC program, block diagrams of the logical operations for sequential control are drawn first. Then the logical operators and I/Os involved in the block diagrams are compiled and converted to a particular form. Since the block diagrams for sequential control amount to about 50 sheets in the case of JT-60 upgrade tokamak (JT-60U) high power discharges, and the above development steps had so far been performed manually, a great effort was required for program development. In order to remove inefficiency from these development processes, a computer-aided software engineering (CASE) tool has been developed on a UNIX workstation. This paper reports how the authors designed it for the development of the sequential control programs. The tool is composed of the following three tools: (1) automatic drawing tool, (2) editing tool, and (3) trace tool. This CASE tool, an object-oriented programming tool with a graphical formalism, can powerfully accelerate the development cycle for the sequential control function commonly associated with pulsed discharges in a tokamak fusion device

  15. 76 FR 5832 - International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools QA...

    Science.gov (United States)

    2011-02-02

    ... DEPARTMENT OF LABOR Employment and Training Administration [TA-W-74,554] International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools QA, San Jose, CA; Notice of Affirmative Determination Regarding Application for Reconsideration By application dated November 29, 2010, a worker and a state workforce official...

  16. Cloud-Based SimJavaWeb Software Tool to Learn Simulation

    Directory of Open Access Journals (Sweden)

    A. Yu. Bykov

    2017-01-01

    Full Text Available Currently, there is a trend in simulation towards distributed software tools, particularly ones that use cloud technologies and the Internet. The article considers an educational simulation tool implemented as a web application in the Java language, with a special Java class library developed for simulation. It follows a discrete-event approach to modeling, similar to the GPSS language, and is intended for queuing system simulation. The structure of the models obtained using this class library is similar to that of GPSS models. A simulation language interpreter similar to GPSS was also created using this class library, with some differences in individual statements. Simulation experiments are performed on the server side; on the client side a browser with standard functions is used to enter the source code into an HTML form. Mobile devices can be used as clients. The source code of a model can be represented both in the Java language using the class library and in the language similar to GPSS. The simulation system implements functions specifically for the educational process. For example, a student can upload learning materials to the server, send developed software and test-control reports to the teacher via the Internet, and receive a detailed assessment of the results from the teacher. Detailed results of passed tests are also recorded in the learning modules, and some other functions are implemented in the system. As examples, the article considers models of an M/M/n/0 queuing system in Java with the class library and in the language similar to GPSS, shows simulation results, and presents the analytical model and calculations for this system. The analytical calculations confirmed that the modeling system is useful, as the simulation results agree with the analytical results within acceptable error. Some approaches to the interaction with students through the Internet, used in the modeling environment, can
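
    The analytical benchmark for the M/M/n/0 loss system mentioned above is the Erlang B formula, conveniently computed with its standard recursion. A minimal Python sketch (independent of the Java tool) follows.

```python
def erlang_b(offered_load, n_servers):
    """Blocking probability of an M/M/n/0 loss system (Erlang B recursion)."""
    b = 1.0
    for k in range(1, n_servers + 1):
        b = offered_load * b / (k + offered_load * b)
    return b

# Example: an offered load of 2 Erlangs on 3 servers blocks ~21% of arrivals
print(erlang_b(offered_load=2.0, n_servers=3))
```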

  17. Subsystem software for TSTA [Tritium Systems Test Assembly

    International Nuclear Information System (INIS)

    Mann, L.W.; Claborn, G.W.; Nielson, C.W.

    1987-01-01

    The Subsystem Control Software at the Tritium Systems Test Assembly (TSTA) must control sophisticated chemical processes through the physical operation of valves, motor controllers, gas sampling devices, thermocouples, pressure transducers, and similar devices. Such control software has to be capable of passing stringent quality assurance (QA) criteria to provide for the safe handling of significant amounts of tritium on a routine basis. Since many of the chemical processes and physical components are experimental, the control software has to be flexible enough to allow for a trial-and-error learning curve, but must still protect the environment and personnel from exposure to unsafe levels of radiation. The software at TSTA is implemented in several levels as described in a preceding paper in these proceedings, on which this paper depends for understanding. The top level is the Subsystem Control level

  18. NASA Operational Simulator for Small Satellites: Tools for Software Based Validation and Verification of Small Satellites

    Science.gov (United States)

    Grubb, Matt

    2016-01-01

    The NASA Operational Simulator for Small Satellites (NOS3) is a suite of tools to aid in areas such as software development, integration test (IT), mission operations training, verification and validation (V&V), and software systems check-out. NOS3 provides a software development environment, a multi-target build system, an operator interface/ground station, dynamics and environment simulations, and software-based hardware models. NOS3 enables the development of flight software (FSW) early in the project life cycle, when access to hardware is typically not available. For small satellites there are extensive lead times on many of the commercial-off-the-shelf (COTS) components as well as limited funding for engineering test units (ETU). Considering the difficulty of providing a hardware test-bed to each developer and tester, hardware models are built based upon characteristic data or manufacturers' data sheets for each individual component. The fidelity of each hardware model is such that FSW executes unaware that physical hardware is not present. This allows binaries to be compiled for both the simulation environment and the flight computer without changing the FSW source code. For hardware models that provide data dependent on the environment, such as a GPS receiver or magnetometer, an open-source tool from NASA GSFC (42 Spacecraft Simulation) is used to provide the necessary data. The underlying infrastructure used to transfer messages between FSW and the hardware models can also be used to monitor, intercept, and inject messages, which has proven to be beneficial for V&V of larger missions such as the James Webb Space Telescope (JWST). As hardware is procured, drivers can be added to the environment to enable hardware-in-the-loop (HWIL) testing. When strict time synchronization is not vital, any number of combinations of hardware components and software-based models can be tested. The open-source operator interface used in NOS3 is COSMOS from Ball Aerospace. For

  19. ELER software - a new tool for urban earthquake loss assessment

    Science.gov (United States)

    Hancilar, U.; Tuzun, C.; Yenidogan, C.; Erdik, M.

    2010-12-01

    ATC-55 (Yang, 2005). An urban loss assessment exercise for a scenario earthquake in the city of Istanbul is conducted, and physical and social losses are presented. Damage to the urban environment is compared to the results obtained from similar software, i.e. KOERILoss (KOERI, 2002) and DBELA (Crowley et al., 2004). The European rapid loss estimation tool is expected to enable effective emergency response at both the local and global levels, as well as public information.

  20. A Critical Study of Effect of Web-Based Software Tools in Finding and Sharing Digital Resources--A Literature Review

    Science.gov (United States)

    Baig, Muntajeeb Ali

    2010-01-01

    The purpose of this paper is to review the effect of web-based software tools for finding and sharing digital resources. A positive correlation between learning and studying through online tools has been found in recent research. In the traditional classroom, the search for resources is limited to the library and the sharing of resources is limited to the…

  1. Conception and validation software tools for the level 0 muon trigger of LHCb

    International Nuclear Information System (INIS)

    Aslanides, E.; Cachemiche, J. P.; Cogan, J.; Duval, P. Y.; Le Gac, R.; Hachon, F.; Leroy, O.; Liotard, P. L.; Marin, F.; Tsaregorodtsev, A.

    2009-01-01

    The Level-0 muon trigger processor of the LHCb experiment looks for straight particles crossing the muon detector and measures their transverse momentum. It processes 40×10^6 proton-proton collisions per second. The tracking uses a road algorithm relying on the projectivity of the muon detector (the logical layout in the five muon stations is projective in y to the interaction point, and it is also projective in x when the bending in the horizontal direction introduced by the magnetic field is ignored). The architecture of the Level-0 muon trigger is complex, with a dense network of data interconnections. The design and validation of such an intricate system have only been possible with intense use of software tools for the detector simulation, the modelling of the hardware components' behaviour and the validation. A database describing the data-flow is the cornerstone between the software and hardware components. (authors)

  2. TU-C-17A-03: An Integrated Contour Evaluation Software Tool Using Supervised Pattern Recognition for Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Chen, H; Tan, J; Kavanaugh, J; Dolly, S; Gay, H; Thorstad, W; Anastasio, M; Altman, M; Mutic, S; Li, H [Washington University School of Medicine, Saint Louis, MO (United States)

    2014-06-15

    Purpose: Radiotherapy (RT) contours delineated either manually or semiautomatically require verification before clinical usage. Manual evaluation is very time consuming. A new integrated software tool using supervised pattern contour recognition was thus developed to facilitate this process. Methods: The contouring tool was developed using the object-oriented programming language C# and application programming interfaces, e.g. the Visualization Toolkit (VTK). The C# language served as the tool design basis. The Accord.Net scientific computing libraries were utilized for the required statistical data processing and pattern recognition, while VTK was used to build and render 3-D mesh models of critical RT structures in real time with 360° visualization. Principal component analysis (PCA) was used so the system could self-update geometry variations of normal structures, based on physician-approved RT contours as a training dataset. An in-house supervised PCA-based contour recognition method was used to automatically evaluate contour normality/abnormality. The function for reporting the contour evaluation results was implemented using C# and the Windows Forms Designer. Results: The software input was RT simulation images and RT structures from commercial clinical treatment planning systems. Several abilities were demonstrated: automatic assessment of RT contours, file loading/saving of various modality medical images and RT contours, and generation/visualization of 3-D images and anatomical models. Moreover, it supported 360° rendering of the RT structures in a multi-slice view, which allows physicians to visually check and edit abnormally contoured structures. Conclusion: This new software integrates the supervised learning framework with image processing and graphical visualization modules for RT contour verification. This tool has great potential for facilitating treatment planning with the assistance of an automatic contour evaluation module in avoiding
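
    The PCA-based normality check described here can be sketched generically: descriptors of approved contours define a principal subspace, and a new contour is flagged when its reconstruction residual is unusually large. The sketch below uses placeholder data and an assumed component count; it mirrors the general technique, not the authors' C#/Accord.Net code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder training data: each row is a fixed-length shape descriptor
# (e.g., sampled radial distances) from a physician-approved contour.
X = rng.normal(size=(200, 64))

mu = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
P = Vt[:10]  # retained principal axes (assumed k = 10)

def reconstruction_error(x):
    """Residual after projecting a descriptor onto the PCA subspace."""
    coeffs = (x - mu) @ P.T
    return np.linalg.norm((x - mu) - coeffs @ P)

# Flag contours whose residual exceeds the training 95th percentile
threshold = np.percentile([reconstruction_error(x) for x in X], 95)

def is_abnormal(x):
    return reconstruction_error(x) > threshold
```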

  3. The impact of software and CAE tools on SEU in field programmable gate arrays

    International Nuclear Information System (INIS)

    Katz, R.; Wang, J.; McCollum, J.; Cronquist, B.

    1999-01-01

    Field programmable gate array (FPGA) devices, heavily used in spacecraft electronics, have grown substantially in size over the past few years, causing designers to work at a higher conceptual level, with computer aided engineering (CAE) tools synthesizing and optimizing the logic from a description. It is shown that the use of commercial-off-the-shelf (COTS) CAE tools can produce unreliable circuit designs when the device is used in a radiation environment and a flip-flop is upset. At a lower level, software can be used to improve the SEU performance of a flip-flop, exploiting the configurable nature of FPGA technology and on-chip delay, parasitic resistive, and capacitive circuit elements

  4. Clinical evaluation of monitor unit software and the application of action levels

    International Nuclear Information System (INIS)

    Georg, Dietmar; Nyholm, Tufve; Olofsson, Joergen; Kjaer-Kristoffersen, Flemming; Schnekenburger, Bruno; Winkler, Peter; Nystroem, Hakan; Ahnesjoe, Anders; Karlsson, Mikael

    2007-01-01

    Purpose: The aim of this study was the clinical evaluation of an independent dose and monitor unit verification (MUV) software tool based on sophisticated semi-analytical modelling. The software was developed within the framework of an ESTRO project. Finally, consistent handling of dose calculation deviations applying individual action levels is discussed. Materials and methods: A Matlab-based software tool ('MUV') was distributed to five well-established treatment centres in Europe (Vienna, Graz, Basel, Copenhagen, and Umea) and evaluated as a quality assurance (QA) tool in clinical routine. Results were acquired for 226 individual treatment plans including a total of 815 radiation fields. About 150 beam verification measurements were performed for a portion of the individual treatment plans, mainly with time-variable fluence patterns. The deviations between dose calculations performed with a treatment planning system (TPS) and the MUV software were scored with respect to treatment area, treatment technique, geometrical depth, radiological depth, etc. Results: In general good agreement was found between calculations performed with the different TPSs and MUV, with a mean deviation per field of 0.2 ± 3.5% (1 SD) and mean deviations of 0.2 ± 2.2% for composite treatment plans. For pelvic treatments less than 10% of all fields showed deviations larger than 3%. In general, when using the radiological depth for verification calculations the results and the spread in the results improved significantly, especially for head-and-neck and for thorax treatments. For IMRT head-and-neck beams, mean deviations between MUV and the local TPS were -1.0 ± 7.3% for dynamic, and -1.3 ± 3.2% for step-and-shoot IMRT delivery. For dynamic IMRT beams in the pelvis good agreement was obtained between MUV and the local TPS (mean: -1.6 ± 1.5%). Treatment site and treatment technique dependent action levels between ±3% and ±5% seem to be clinically realistic if a radiological depth
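
    Applying such action levels is straightforward once the independent calculation exists: compute the percent deviation per field and flag it against a site-specific tolerance. The action levels in the sketch below are drawn from the ±3-5% range discussed in the abstract, but their assignment to individual sites is invented for illustration.

```python
# Illustrative site-specific action levels (%), within the paper's ±3-5% range
ACTION_LEVELS = {"pelvis": 3.0, "head-and-neck": 5.0, "thorax": 5.0}

def check_field(dose_tps, dose_muv, site):
    """Percent deviation of the independent calculation, and a pass/fail flag."""
    deviation = 100.0 * (dose_muv - dose_tps) / dose_tps
    return deviation, abs(deviation) > ACTION_LEVELS[site]

print(check_field(dose_tps=2.00, dose_muv=2.08, site="pelvis"))  # -> (~4.0, True)
```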

  5. STAR-GENERIS - a software package for information processing

    International Nuclear Information System (INIS)

    Felkel, L.

    1985-01-01

    Man-machine communication in electrical power plants is increasingly based on the capabilities of minicomputers. Rather than just displaying raw process data, more complex processing is done to aid operators by improving information quality. Advanced operator aids for nuclear power plants are, e.g., alarm reduction, disturbance analysis and expert systems. Operator aids use complex combinations and computations of plant signals, which have to be described in a formal and homogeneous way. The design of such computer-based information systems requires extensive software and engineering efforts. The STAR software concept reduces the software effort to a minimum by providing an advanced program package which facilitates the specification and implementation of the engineering know-how necessary for sophisticated operator aids. (orig./HP) [de

  6. The Software Management Environment (SME)

    Science.gov (United States)

    Valett, Jon D.; Decker, William; Buell, John

    1988-01-01

    The Software Management Environment (SME) is a research effort designed to utilize the past experiences and results of the Software Engineering Laboratory (SEL) and to incorporate this knowledge into a tool for managing projects. SME provides the software development manager with the ability to observe, compare, predict, analyze, and control key software development parameters such as effort, reliability, and resource utilization. The major components of the SME, the architecture of the system, and examples of the functionality of the tool are discussed.

  7. Computer Software Configuration Item-Specific Flight Software Image Transfer Script Generator

    Science.gov (United States)

    Bolen, Kenny; Greenlaw, Ronald

    2010-01-01

    A K-shell UNIX script enables the International Space Station (ISS) Flight Control Team (FCT) operators in NASA's Mission Control Center (MCC) in Houston to transfer an entire or partial computer software configuration item (CSCI) from a flight software compact disk (CD) to the onboard Portable Computer System (PCS). The tool is designed to read the content stored on a flight software CD and generate individual CSCI transfer scripts that are capable of transferring the flight software content in a given subdirectory on the CD to the scratch directory on the PCS. The flight control team can then transfer the flight software from the PCS scratch directory to the Electronically Erasable Programmable Read Only Memory (EEPROM) of an ISS Multiplexer/Demultiplexer (MDM) via the Indirect File Transfer capability. The individual CSCI scripts and the CSCI Specific Flight Software Image Transfer Script Generator (CFITSG), when executed a second time, will remove all components from their original execution. The tool will identify errors in the transfer process and create logs of the transferred software for the purposes of configuration management.

  8. Abaqus2Matlab: A suitable tool for finite element post-processing

    DEFF Research Database (Denmark)

    Papazafeiropoulos, George; Muñiz-Calvente, Miguel; Martínez Pañeda, Emilio

    2017-01-01

    A suitable piece of software is presented to connect Abaqus, a sophisticated finite element package, with Matlab, the most comprehensive program for mathematical analysis. This interface between these well-known codes not only benefits from the image processing and the integrated graph-plotting features of Matlab ... crack propagation in structural materials by means of a cohesive zone approach. The source code, detailed documentation and a large number of tutorials can be freely downloaded from www.abaqus2matlab.com.

  9. Isocratean Discourse Theory and Neo-Sophistic Pedagogy: Implications for the Composition Classroom.

    Science.gov (United States)

    Blair, Kristine L.

    With the recent interest in the fifth century B.C. theories of Protagoras and Gorgias come assumptions about the philosophical affinity of the Greek educator Isocrates to this pair of older sophists. Isocratean education in discourse, with its emphasis on collaborative political discourse, falls within recent definitions of a sophist curriculum.…

  10. SSAGES: Software Suite for Advanced General Ensemble Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Sidky, Hythem [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; Colón, Yamil J. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Institute for Molecular Engineering and Materials Science Division, Argonne National Laboratory, Lemont, Illinois 60439, USA; Helfferich, Julian [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Steinbuch Center for Computing, Karlsruhe Institute of Technology, Hermann-von-Helmholtz-Platz 1, 76344 Eggenstein-Leopoldshafen, Germany; Sikora, Benjamin J. [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; Bezik, Cody [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Chu, Weiwei [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Giberti, Federico [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Guo, Ashley Z. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Jiang, Xikai [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Lequieu, Joshua [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Li, Jiyuan [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Moller, Joshua [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Quevillon, Michael J. [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; Rahimi, Mohammad [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Ramezani-Dakhel, Hadi [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Department of Biochemistry and Molecular Biology, University of Chicago, Chicago, Illinois 60637, USA; Rathee, Vikramjit S. [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; Reid, Daniel R. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Sevgen, Emre [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Thapar, Vikram [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Webb, Michael A. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Institute for Molecular Engineering and Materials Science Division, Argonne National Laboratory, Lemont, Illinois 60439, USA; Whitmer, Jonathan K. [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; de Pablo, Juan J. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Institute for Molecular Engineering and Materials Science Division, Argonne National Laboratory, Lemont, Illinois 60439, USA

    2018-01-28

    Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods, and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulation packages. SSAGES allows facile application of a variety of enhanced sampling techniques—including adaptive biasing force, string methods, and forward flux sampling—that extract meaningful free energy and transition path data from all-atom and coarse-grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite.

  11. SMEs and new ventures need business model sophistication

    DEFF Research Database (Denmark)

    Kesting, Peter; Günzel-Jensen, Franziska

    2015-01-01

    , and Spreadshirt, this article develops a framework that introduces five business model sophistication strategies: (1) uncover additional functions of your product, (2) identify strategic benefits for third parties, (3) take advantage of economies of scope, (4) utilize cross-selling opportunities, and (5) involve...

  12. Dynamic visualization techniques for high consequence software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-02-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification. The prototype tool is described along with the requirements constraint language after a brief literature review is presented. Examples of how the tool can be used are also presented. In conclusion, the most significant advantage of this tool is to provide a first step in evaluating specification completeness, and to provide a more productive method for program comprehension and debugging. The expected payoff is increased software surety confidence, increased program comprehension, and reduced development and debugging time.
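
    Conceptually, monitoring an executing program against requirements constraints reduces to evaluating named predicates over snapshots of program state. The sketch below illustrates that idea in Python with invented constraints; it is not the SAGE language or the SAVAnT implementation.

```python
# Each constraint is a named predicate over a snapshot of program state.
CONSTRAINTS = [
    ("pressure within limits", lambda s: 0.0 <= s["pressure"] <= 150.0),
    ("pump off implies no flow", lambda s: s["pump_on"] or s["flow"] == 0.0),
]

def violated(state):
    """Return the names of all constraints the current state violates."""
    return [name for name, pred in CONSTRAINTS if not pred(state)]

print(violated({"pressure": 180.0, "pump_on": False, "flow": 2.5}))
# -> ['pressure within limits', 'pump off implies no flow']
```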

  13. Software for safety critical applications

    International Nuclear Information System (INIS)

    Kropik, M.; Matejka, K.; Jurickova, M.; Chudy, R.

    2001-01-01

    The contribution gives an overview of a project on software development for safety-critical applications, which has been carried out since 1997. The principal goal of the project was to establish a research laboratory for the development of software with the highest requirements for quality and reliability. This laboratory was established at the department and equipped with appropriate hardware and software to support software development. A research team of predominantly young researchers was created for software development. The activities of the research team started with studying and proposing a software development methodology, which was then applied to real software development. The verification and validation process followed the software development. A validation system for the integrated hardware and software tests was brought into being and its control software was developed. The quality of the software tools was also monitored, and the SOSAT tool was used during these activities. National and international contacts were established and maintained during the project. (author)

  14. Validation of the Mobile Information Software Evaluation Tool (MISET) With Nursing Students.

    Science.gov (United States)

    Secco, M Loretta; Furlong, Karen E; Doyle, Glynda; Bailey, Judy

    2016-07-01

    This study evaluated the Mobile Information Software Evaluation Tool (MISET) with a sample of Canadian undergraduate nursing students (N = 240). Psychometric analyses determined how well the MISET assessed the extent to which nursing students find mobile device-based information resources useful and supportive of learning in the clinical and classroom settings. The MISET has a valid three-factor structure with high explained variance (74.7%). Internal consistency reliabilities were high for the MISET total (.90) and the three subscales: Usefulness/Helpfulness, Information Literacy Support, and Use of Evidence-Based Sources (.87 to .94). Construct validity evidence included significantly higher mean total MISET, Helpfulness/Usefulness, and Information Literacy Support scores for senior students and those with higher computer competence. The MISET is a promising tool to evaluate mobile information technologies and information literacy support; however, longitudinal assessment of changes in scores over time would determine scale sensitivity and responsiveness. [J Nurs Educ. 2016;55(7):385-390.]. Copyright 2016, SLACK Incorporated.
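
    For reference, the internal-consistency statistic reported above is Cronbach's alpha, computable in a few lines. This is a generic sketch with invented scores, not the authors' analysis code.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1).sum()
    total_variance = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_variances / total_variance)

# Toy 4-item scale answered by 5 respondents (invented data)
print(cronbach_alpha([[4, 5, 4, 4], [3, 3, 4, 3], [5, 5, 5, 4],
                      [2, 3, 2, 3], [4, 4, 5, 5]]))
```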

  15. FMT (Flight Software Memory Tracker) For Cassini Spacecraft-Software Engineering Using JAVA

    Science.gov (United States)

    Kan, Edwin P.; Uffelman, Hal; Wax, Allan H.

    1997-01-01

    The software engineering design of the Flight Software Memory Tracker (FMT) Tool is discussed in this paper. FMT is a ground analysis software set, consisting of utilities and procedures, designed to track the flight software, i.e., images of memory load and updatable parameters of the computers on board the Cassini spacecraft. FMT is implemented in Java.

  16. Software Tool for Automated Failure Modes and Effects Analysis (FMEA) of Hydraulic Systems

    DEFF Research Database (Denmark)

    Stecki, J. S.; Conrad, Finn; Oh, B.

    2002-01-01

    Offshore, marine, aircraft and other complex engineering systems operate in harsh environmental and operational conditions and must meet stringent requirements of reliability, safety and maintainability. To reduce the high costs of developing new systems in these fields, improved design management techniques and a vast array of computer-aided techniques are applied during the design and testing stages. The paper presents and discusses the research and development of a software tool for automated failure mode and effects analysis - FMEA - of hydraulic systems. The paper explains the underlying
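
    A classic building block of automated FMEA is ranking failure modes by a Risk Priority Number (RPN = severity x occurrence x detection). The sketch below illustrates that ranking in Python with invented hydraulic-system entries; the paper's tool is considerably more sophisticated.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    component: str
    mode: str
    severity: int    # 1-10 ratings (assumed convention)
    occurrence: int
    detection: int

    @property
    def rpn(self) -> int:
        """Risk Priority Number used to rank failure modes."""
        return self.severity * self.occurrence * self.detection

modes = [
    FailureMode("pump", "cavitation", 7, 4, 5),
    FailureMode("relief valve", "stuck closed", 9, 2, 3),
]
for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(fm.component, fm.mode, fm.rpn)
```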

  17. User Guide for the Plotting Software for the Los Alamos National Laboratory Nuclear Weapons Analysis Tools Version 2.0

    Energy Technology Data Exchange (ETDEWEB)

    Cleland, Timothy James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-01-02

    The Los Alamos National Laboratory Plotting Software for the Nuclear Weapons Analysis Tools is a Java™ application based upon the open source library JFreeChart. The software provides a capability for plotting data on graphs with a rich variety of display options while allowing the viewer interaction via graph manipulation and scaling to best view the data. The graph types include XY plots, Date XY plots, Bar plots and Histogram plots.

  18. MyView2, a new visualization software tool for analysis of LHD data

    International Nuclear Information System (INIS)

    Moon, Chanho; Yoshinuma, Mikirou; Emoto, Masahiko; Ida, Katsumi

    2016-01-01

    The Large Helical Device (LHD) at the National Institute for Fusion Science (NIFS) is the world's largest superconducting helical fusion device, providing a scientific research center to elucidate important physics topics such as plasma transport and turbulence dynamics. Furthermore, many types of advanced diagnostic devices are used to measure the confinement plasma characteristics, and these valuable physical data are registered over the 131,000 discharges in the LHD database. However, it is difficult to investigate the experimental data even though much physical data has been registered. In order to improve the efficiency of investigating plasma physics in LHD, we have developed new data visualization software, MyView2, which consists of Python-based modules that can be easily set up and updated. MyView2 provides immediate access to experimental results, cross-shot analysis, and a collaboration point for scientific research. In particular, the MyView2 software is a portable structure for making LHD experimental data viewable on on- and off-site web servers, a capability not previously available in any general-use tool. We also discuss the benefits of using the MyView2 software for in-depth analysis of LHD experimental data.

  19. The Systems Biology Research Tool: evolvable open-source software

    OpenAIRE

    Wright, J; Wagner, A

    2008-01-01

    Abstract Background Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system-level. Results We introduce a free, easy-to-use, open-source, integrated software platform calle...

  20. Software project profitability analysis using temporal probabilistic reasoning; an empirical study with the CASSE framework

    CSIR Research Space (South Africa)

    Balikuddembe, JK

    2009-04-01

    Full Text Available Undertaking adequate risk management by understanding project requirements and ensuring that viable estimates are made on software projects requires extensive application of sophisticated analysis and interpretation techniques. Informative...

  1. Evaluating a digital ship design tool prototype: Designers' perceptions of novel ergonomics software.

    Science.gov (United States)

    Mallam, Steven C; Lundh, Monica; MacKinnon, Scott N

    2017-03-01

    Computer-aided solutions are essential for naval architects to manage and optimize technical complexities when developing a ship's design. Although there is an array of software solutions aimed at optimizing the human element in design, practical ergonomics methodologies and technological solutions have struggled to gain widespread application in ship design processes. This paper explores how a new ergonomics technology is perceived by naval architecture students using a mixed-methods framework. Thirteen Naval Architecture and Ocean Engineering Masters students participated in the study. Overall, results found participants perceived the software and its embedded ergonomics tools to benefit their design work, increasing their empathy and ability to understand the work environment and work demands end-users face. However, participants questioned whether ergonomics could be practically and efficiently implemented under real-world project constraints. This revealed underlying social biases and a fundamental lack of understanding in engineering postgraduate students regarding applied ergonomics in naval architecture. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Software compensation in particle flow reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Tran, Huong Lan; Krueger, Katja; Sefkow, Felix [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Green, Steven; Marshall, John; Thomson, Mark [Cavendish Laboratory, Cambridge (United Kingdom); Simon, Frank [Max-Planck-Institut fuer Physik, Muenchen (Germany)

    2017-10-15

    The particle flow approach to calorimetry benefits from highly granular calorimeters and sophisticated software algorithms in order to reconstruct and identify individual particles in complex event topologies. The high spatial granularity, together with analogue energy information, can be further exploited in software compensation. In this approach, the local energy density is used to discriminate electromagnetic and purely hadronic sub-showers within hadron showers in the detector to improve the energy resolution for single particles by correcting for the intrinsic non-compensation of the calorimeter system. This improvement in the single particle energy resolution also results in a better overall jet energy resolution by improving the energy measurement of identified neutral hadrons and improvements in the pattern recognition stage by a more accurate matching of calorimeter energies to tracker measurements. This paper describes the software compensation technique and its implementation in particle flow reconstruction with the Pandora Particle Flow Algorithm (PandoraPFA). The impact of software compensation on the choice of optimal transverse granularity for the analogue hadronic calorimeter option of the International Large Detector (ILD) concept is also discussed.

  3. Software compensation in particle flow reconstruction

    International Nuclear Information System (INIS)

    Tran, Huong Lan; Krueger, Katja; Sefkow, Felix; Green, Steven; Marshall, John; Thomson, Mark; Simon, Frank

    2017-10-01

    The particle flow approach to calorimetry benefits from highly granular calorimeters and sophisticated software algorithms in order to reconstruct and identify individual particles in complex event topologies. The high spatial granularity, together with analogue energy information, can be further exploited in software compensation. In this approach, the local energy density is used to discriminate electromagnetic and purely hadronic sub-showers within hadron showers in the detector to improve the energy resolution for single particles by correcting for the intrinsic non-compensation of the calorimeter system. This improvement in the single particle energy resolution also results in a better overall jet energy resolution by improving the energy measurement of identified neutral hadrons and improvements in the pattern recognition stage by a more accurate matching of calorimeter energies to tracker measurements. This paper describes the software compensation technique and its implementation in particle flow reconstruction with the Pandora Particle Flow Algorithm (PandoraPFA). The impact of software compensation on the choice of optimal transverse granularity for the analogue hadronic calorimeter option of the International Large Detector (ILD) concept is also discussed.

  5. Integrating and Managing Bim in GIS, Software Review

    Science.gov (United States)

    El Meouche, R.; Rezoug, M.; Hijazi, I.

    2013-08-01

    Since the advent of Computer-Aided Design (CAD) and Geographical Information System (GIS) tools, project participants have increasingly leveraged these tools throughout the different phases of a civil infrastructure project. In recent years the number of GIS packages that provide tools for integrating building information in a geographic context has risen sharply. More and more GIS packages add tools for this purpose, and other software projects regularly extend these tools. However, each package has its own strengths, weaknesses and intended use. This paper provides a thorough review to investigate the capabilities of such software and clarify its purpose. For this study, Autodesk Revit 2012, a BIM editor, was used to create BIMs. In the first step, three building models were created; the resulting models were converted to BIM format and then integrated using the GIS software. For the evaluation of the software, general characteristics were studied, such as the user interface, the supported formats (import/export), and the way building information is imported.

  6. Development and assessment of a digital X-ray software tool to determine vertebral rotation in adolescent idiopathic scoliosis.

    Science.gov (United States)

    Eijgenraam, Susanne M; Boselie, Toon F M; Sieben, Judith M; Bastiaenen, Caroline H G; Willems, Paul C; Arts, Jacobus J; Lataster, Arno

    2017-02-01

    The amount of vertebral rotation in the axial plane is of key importance in the prognosis and treatment of adolescent idiopathic scoliosis (AIS). Current methods to determine vertebral rotation are either designed for analogue plain radiographs and not usable with digital images, or lack measurement precision and are therefore less suitable for the follow-up of rotation in AIS patients. This study aimed to develop a digital X-ray software tool with high measurement precision to determine vertebral rotation in AIS, and to assess its (concurrent) validity and reliability. The study combined basic science and reliability methodology applied in both laboratory and clinical settings. Software was developed using the algorithm of the Perdriolle torsion meter for analogue AP plain radiographs of the spine. The software was then assessed for (1) concurrent validity and (2) intra- and interobserver reliability. Plain radiographs of both human cadaver vertebrae and outpatient AIS patients were used. Concurrent validity was measured by two independent observers, both experienced in the assessment of plain radiographs. Reliability measurements were performed by three independent spine surgeons. Pearson correlation of the software compared with the analogue Perdriolle torsion meter was 0.98 for mid-thoracic vertebrae, 0.97 for low-thoracic vertebrae and 0.97 for lumbar vertebrae. Measurement accuracy of the software was within 5° in 62% of cases and within 10° in 97% of cases. The intraclass correlation coefficient (ICC) for inter-observer reliability was 0.92 (0.91-0.95); the ICC for intra-observer reliability was 0.96 (0.94-0.97). We developed a digital X-ray software tool to determine vertebral rotation in AIS with substantial concurrent validity and reliability, which may be useful for the follow-up of vertebral rotation in AIS patients.
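
    As a rough illustration of a Perdriolle-style measurement on digital landmark coordinates, the sketch below maps the offset of the pedicle shadow from the vertebral body midline to a rotation angle. The landmark inputs and the linear offset-to-degrees mapping are simplifying assumptions, not the published algorithm or its calibration.

```python
# Illustrative sketch of a Perdriolle-style rotation estimate from landmark
# x-coordinates picked on a digital AP radiograph. The linear mapping from
# pedicle offset to degrees is a stand-in for the calibrated Perdriolle
# grid; all landmark names are hypothetical.

def estimate_rotation_deg(body_left_x, body_right_x, pedicle_x, max_deg=50.0):
    """Map the pedicle shadow's offset from the midline to a rotation angle."""
    width = body_right_x - body_left_x
    if width <= 0:
        raise ValueError("right edge must lie to the right of the left edge")
    midline = (body_left_x + body_right_x) / 2.0
    offset_ratio = (pedicle_x - midline) / (width / 2.0)  # -1 .. +1
    return offset_ratio * max_deg  # crude linear grid

# Example: pedicle displaced 20% of the half-width toward the convex side.
print(estimate_rotation_deg(100.0, 200.0, 160.0))  # -> 10.0 degrees
```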

  7. Inventory of safeguards software

    International Nuclear Information System (INIS)

    Suzuki, Mitsutoshi; Horino, Koichi

    2009-03-01

    This survey serves as a basis for determining what needs exist in this arena for the development of next-generation safeguards systems and approaches. 23 software tools were surveyed by JAEA and NMCC. Exchanging information on existing software tools for safeguards, and discussing a future R and D program for developing a general-purpose safeguards tool, should be beneficial to safeguards system design and is indispensable for evaluating safeguards systems for future nuclear fuel facilities. (author)

  8. Easily configured real-time CPOE Pick Off Tool supporting focused clinical research and quality improvement.

    Science.gov (United States)

    Rosenbaum, Benjamin P; Silkin, Nikolay; Miller, Randolph A

    2014-01-01

    Real-time alerting systems typically warn providers about abnormal laboratory results or medication interactions. For more complex tasks, institutions create site-wide 'data warehouses' to support quality audits and longitudinal research. Sophisticated systems like i2b2 or Stanford's STRIDE utilize data warehouses to identify cohorts for research and quality monitoring. However, substantial resources are required to install and maintain such systems. For more modest goals, an organization desiring merely to identify patients with 'isolation' orders, or to determine patients' eligibility for clinical trials, may adopt a simpler, limited approach based on processing the output of one clinical system, and not a data warehouse. We describe a limited, order-entry-based, real-time 'pick off' tool, utilizing public domain software (PHP, MySQL). Through a web interface the tool assists users in constructing complex order-related queries and auto-generates corresponding database queries that can be executed at recurring intervals. We describe successful application of the tool for research and quality monitoring.
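
    A minimal sketch of the query-generation idea, under assumed table and column names: user-selected criteria are turned into a parameterized SQL statement that could be executed at recurring intervals. The original tool uses PHP/MySQL; sqlite3 stands in here for brevity.

```python
import sqlite3

# Hypothetical 'orders' schema; criteria keys are assumed to come from a
# controlled web UI (fixed column names), with values passed as parameters.

def build_order_query(criteria):
    """Build a parameterized query from {column: value} criteria."""
    clauses = " AND ".join(f"{column} = ?" for column in criteria)
    sql = f"SELECT patient_id, order_id, order_time FROM orders WHERE {clauses}"
    return sql, list(criteria.values())

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (patient_id, order_id, order_time, "
             "order_type, status)")
conn.execute("INSERT INTO orders VALUES (1, 101, '2014-01-01', "
             "'isolation', 'active')")

sql, params = build_order_query({"order_type": "isolation",
                                 "status": "active"})
for row in conn.execute(sql, params):
    print(row)  # -> (1, 101, '2014-01-01')
```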

  9. A Software Tool for Optimal Sizing of PV Systems in Malaysia

    Directory of Open Access Journals (Sweden)

    Tamer Khatib

    2012-01-01

    This paper presents a MATLAB-based, user-friendly software tool called PV.MY for optimal sizing of photovoltaic (PV) systems. The software can predict meteorological variables such as solar energy, ambient temperature and wind speed using an artificial neural network (ANN), optimizes the PV module/array tilt angle, optimizes the inverter size, and calculates optimal capacities of the PV array, battery, wind turbine and diesel generator in hybrid PV systems. The ANN-based model for meteorological prediction uses four variables, namely sunshine ratio, day number and location coordinates. As for PV system sizing, iterative methods are used to determine the optimal sizing of three types of PV systems: the standalone PV system, the hybrid PV/wind system and the hybrid PV/diesel generator system. The loss of load probability (LLP) technique is used for optimization, in which the energy source capacities are the variables to be optimized subject to a very low LLP. As for determining the optimal PV panel tilt angle and inverter size, the Liu and Jordan model for solar energy incident on a tilted surface is used to optimize the monthly tilt angle, while a model of the inverter efficiency curve is used in the optimization of inverter size.
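
    The iterative LLP-based sizing can be sketched as follows, assuming simplified hourly arrays for PV yield and load and a crude battery model; the step size, battery capacity and target LLP are placeholders rather than PV.MY's actual procedure.

```python
# Minimal sketch of LLP-driven sizing: grow the PV capacity until the loss
# of load probability (unmet energy / total demand) meets the target.

def loss_of_load_probability(pv_kwp, batt_kwh, pv_per_kwp, load):
    soc, unmet, total = batt_kwh, 0.0, sum(load)
    for gen, demand in zip(pv_per_kwp, load):
        balance = gen * pv_kwp - demand
        if balance >= 0:
            soc = min(soc + balance, batt_kwh)   # charge, capped at capacity
        else:
            drawn = min(soc, -balance)           # discharge what is stored
            soc -= drawn
            unmet += -balance - drawn            # remainder is lost load
    return unmet / total

def size_system(pv_per_kwp, load, target_llp=0.01, batt_kwh=10.0):
    pv_kwp = 0.5
    while loss_of_load_probability(pv_kwp, batt_kwh, pv_per_kwp, load) > target_llp:
        pv_kwp += 0.5  # enlarge the array until the LLP target is met
    return pv_kwp

# Toy day: 12 h of sun producing 1 kWh per kWp, constant 1 kWh hourly load.
sun = [1.0 if 6 <= h < 18 else 0.0 for h in range(24)]
print(size_system(sun, [1.0] * 24))
```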

  10. The Design and Development of a Computerized Tool Support for Conducting Senior Projects in Software Engineering Education

    Science.gov (United States)

    Chen, Chung-Yang; Teng, Kao-Chiuan

    2011-01-01

    This paper presents a computerized tool support, the Meetings-Flow Project Collaboration System (MFS), for designing, directing and sustaining the collaborative teamwork required in senior projects in software engineering (SE) education. Among many schools' SE curricula, senior projects serve as a capstone course that provides comprehensive…

  11. Component-based development of software language engineering tools

    NARCIS (Netherlands)

    Ssanyu, J.; Hemerik, C.

    2011-01-01

    In this paper we outline how Software Language Engineering (SLE) could benefit from Component-based Software Development (CBSD) techniques and present an architecture aimed at developing a coherent set of lightweight SLE components, fitting into a general-purpose component framework. In order to

  12. Assessment of the effect of Nd:YAG laser pulse operating parameters on the metallurgical characteristics of different tool steels using DOE software

    Directory of Open Access Journals (Sweden)

    T. Muhič

    2011-04-01

    To ensure the reliability of repair-welded tool surfaces, clad quality should be improved. The relationships between the metallurgical characteristics of cladding and the laser welding input parameters were studied using design-of-experiments (DOE) software. The influence of laser power, welding speed, focal point position and welding wire diameter on the weld-bead geometry (i.e. penetration, cladding zone width and heat-affected-zone width), microstructural homogeneity, dilution and bond strength was investigated on the commonly used tool steels 1.2083, 1.2312 and 1.2343.

  13. TESPI (Tool for Environmental Sound Product Innovation): a simplified software tool to support environmentally conscious design in SMEs

    Science.gov (United States)

    Misceo, Monica; Buonamici, Roberto; Buttol, Patrizia; Naldesi, Luciano; Grimaldi, Filomena; Rinaldi, Caterina

    2004-12-01

    TESPI (Tool for Environmental Sound Product Innovation) is the prototype of a software tool developed within the framework of the "eLCA" project. The project (www.elca.enea.it), financed by the European Commission, is realising "online green tools and services for Small and Medium sized Enterprises (SMEs)". The implementation by SMEs of environmental product innovation (as fostered by the European Integrated Product Policy, IPP) needs specific adaptation to their economic model, their knowledge of production and management processes and their relationships with innovation and the environment. In particular, quality and costs are the main driving forces of innovation in European SMEs, and well-known barriers exist to the adoption of an environmental approach in product design. Starting from these considerations, the TESPI tool has been developed to support the first steps of product design, taking into account both quality and the environment. Two main issues have been considered: (i) classic Quality Function Deployment (QFD) can hardly be proposed to SMEs; (ii) the environmental aspects of the product life cycle need to be integrated with the quality approach. TESPI is a user-friendly web-based tool, has a training approach and applies to modular products. Users are guided through the investigation of the quality aspects of their product (fulfilment of customer's needs and requirements) and the identification of the key environmental aspects in the product's life cycle. A simplified checklist allows analyzing the environmental performance of the product. Help is available for a better understanding of the analysis criteria. As a result, the significant aspects for the redesign of the product are identified.

  14. Proceedings of the Fifth Triennial Software Quality Forum 2000, Software for the Next Millennium, Software Quality Forum

    Energy Technology Data Exchange (ETDEWEB)

    Scientific Software Engineering Group, CIC-12

    2000-04-01

    The Software Quality Forum is a triennial conference held by the Software Quality Assurance Subcommittee for the Department of Energy's Quality Managers. The forum centers on key issues, information, and technology important in software development for the Nuclear Weapons Complex. This year it will be opened up to include local information technology companies and software vendors presenting their solutions, ideas, and lessons learned. The Software Quality Forum 2000 will take on a more hands-on, instructional tone than those previously held. There will be an emphasis on providing information, tools, and resources to assist developers in their goal of producing next generation software.

  15. Gammasphere software development

    International Nuclear Information System (INIS)

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University which were performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, they have established a new public ftp archive to distribute software and software development tools and information

  16. Evolving software reengineering technology for the emerging innovative-competitive era

    Science.gov (United States)

    Hwang, Phillip Q.; Lock, Evan; Prywes, Noah

    1994-01-01

    This paper reports on a multi-tool commercial/military environment combining software Domain Analysis techniques with Reusable Software and Reengineering of Legacy Software. It is based on the development of a military version for the Department of Defense (DOD). The integrated tools in the military version are: Software Specification Assistant (SSA) and Software Reengineering Environment (SRE), developed by Computer Command and Control Company (CCCC) for Naval Surface Warfare Center (NSWC) and Joint Logistics Commanders (JLC), and the Advanced Research Project Agency (ARPA) STARS Software Engineering Environment (SEE) developed by Boeing for NAVAIR PMA 205. The paper describes transitioning these integrated tools to commercial use. There is a critical need for the transition for the following reasons: First, to date, 70 percent of programmers' time is applied to software maintenance. The work of these users has not been facilitated by existing tools. The addition of Software Reengineering will also facilitate software maintenance and upgrading. In fact, the integrated tools will support the entire software life cycle. Second, the integrated tools are essential to Business Process Reengineering, which seeks radical process innovations to achieve breakthrough results. Done well, process reengineering delivers extraordinary gains in process speed, productivity and profitability. Most importantly, it discovers new opportunities for products and services in collaboration with other organizations. Legacy computer software must be changed rapidly to support innovative business processes. The integrated tools will provide commercial organizations important competitive advantages. This, in turn, will increase employment by creating new business opportunities. Third, the integrated system will produce much higher quality software than use of the tools separately. The reason for this is that producing or upgrading software requires keen understanding of extremely complex

  17. Software engineering laboratory series: Annotated bibliography of software engineering laboratory literature

    Science.gov (United States)

    Morusiewicz, Linda; Valett, Jon

    1992-01-01

    This document is an annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory. More than 100 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials have been grouped into eight general subject areas for easy reference: (1) the Software Engineering Laboratory; (2) the Software Engineering Laboratory: Software Development Documents; (3) Software Tools; (4) Software Models; (5) Software Measurement; (6) Technology Evaluations; (7) Ada Technology; and (8) Data Collection. This document contains an index of these publications classified by individual author.

  18. Thermonuclear Reaction Rate Libraries and Software Tools for Nuclear Astrophysics Research

    International Nuclear Information System (INIS)

    Smith, Michael S.; Cyburt, Richard; Schatz, Hendrik; Smith, Karl; Warren, Scott; Ferguson, Ryan; Wiescher, Michael; Lingerfelt, Eric; Buckner, Kim; Nesaraja, Caroline D.

    2008-01-01

    Thermonuclear reaction rates are a crucial input for simulating a wide variety of astrophysical environments. A new collaboration has been formed to ensure that astrophysical modelers have access to reaction rates based on the most recent experimental and theoretical nuclear physics information. To reach this goal, a new version of the REACLIB library has been created by the Joint Institute for Nuclear Astrophysics (JINA), now available online at http://www.nscl.msu.edu/~nero/db. A complementary effort is the development of software tools in the Computational Infrastructure for Nuclear Astrophysics, online at nucastrodata.org, to streamline, manage, and access the workflow of the reaction evaluations from their initiation to peer review to incorporation into the library. Details of these new projects will be described
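
    A REACLIB rate set is parameterized by seven coefficients, and a full rate is the sum of one or more such sets. The sketch below evaluates one set as a function of temperature T9 (in units of 10^9 K) using placeholder coefficients, not values from the JINA library.

```python
import math

# Evaluate one seven-coefficient REACLIB-style rate set:
# lambda(T9) = exp(a0 + a1/T9 + a2*T9^(-1/3) + a3*T9^(1/3)
#                  + a4*T9 + a5*T9^(5/3) + a6*ln(T9))

def reaclib_rate(a, t9):
    """Rate from a single seven-coefficient set at temperature T9."""
    exponent = (a[0] + a[1] / t9 + a[2] * t9 ** (-1.0 / 3.0)
                + a[3] * t9 ** (1.0 / 3.0) + a[4] * t9
                + a[5] * t9 ** (5.0 / 3.0) + a[6] * math.log(t9))
    return math.exp(exponent)

coeffs = [1.0, -0.1, 0.0, 2.0, -0.5, 0.01, 0.0]  # placeholder coefficients
for t9 in (0.1, 1.0, 3.0):
    print(t9, reaclib_rate(coeffs, t9))
```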

  19. Systems and software variability management concepts, tools and experiences

    CERN Document Server

    Capilla, Rafael; Kang, Kyo-Chul

    2013-01-01

    The success of product line engineering techniques in the last 15 years has popularized the use of software variability as a key modeling approach for describing the commonality and variability of systems at all stages of the software lifecycle. Software product lines enable a family of products to share a common core platform, while allowing for product specific functionality being built on top of the platform. Many companies have exploited the concept of software product lines to increase the resources that focus on highly differentiating functionality and thus improve their competitiveness

  20. Developing Learning Tool of Control System Engineering Using Matrix Laboratory Software Oriented on Industrial Needs

    Science.gov (United States)

    Isnur Haryudo, Subuh; Imam Agung, Achmad; Firmansyah, Rifqi

    2018-04-01

    The purpose of this research is to develop learning media for control engineering using Matrix Laboratory software with an industry-needs approach. Learning media serves as a tool for creating a better and more effective teaching and learning situation because it can accelerate the learning process and enhance the quality of learning. Control engineering taught with Matrix Laboratory software can increase students' interest and attention, provide real experience, and foster an independent attitude. The research design follows research and development (R&D) methods modified by a multi-disciplinary team of researchers. The research used a computer-based learning method consisting of a computer and Matrix Laboratory software integrated with physical teaching aids. Matrix Laboratory can visualize the theory and analysis of control systems, integrating computation, visualization and programming in an easy-to-use environment. The resulting instructional media applies mathematical models in Matrix Laboratory software to a control-system application with a DC motor plant and a PID (Proportional-Integral-Derivative) controller. This is industrially relevant because Distributed Control Systems (DCSs), Programmable Logic Controllers (PLCs) and Microcontrollers (MCUs) widely use PID control in production processes.
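
    The kind of exercise this learning media targets can be sketched outside MATLAB as well; the Python snippet below simulates a discrete PID controller driving a first-order DC motor model, with illustrative gains and motor constants rather than values from the learning media.

```python
# Discrete PID controller regulating the speed of a first-order DC motor
# model (time constant tau, DC gain). Gains and constants are illustrative.

def simulate_pid(kp, ki, kd, setpoint=100.0, dt=0.01, steps=500):
    speed, integral, prev_error = 0.0, 0.0, setpoint
    tau, gain = 0.5, 2.0          # motor time constant [s] and DC gain
    for _ in range(steps):
        error = setpoint - speed
        integral += error * dt
        derivative = (error - prev_error) / dt
        u = kp * error + ki * integral + kd * derivative
        # First-order motor response: d(speed)/dt = (gain*u - speed) / tau
        speed += (gain * u - speed) / tau * dt
        prev_error = error
    return speed

print(simulate_pid(kp=0.8, ki=2.0, kd=0.02))  # should settle near 100
```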

  1. Software Engineering Environment for Component-based Design of Embedded Software

    DEFF Research Database (Denmark)

    Guo, Yu

    2010-01-01

    as well as application models in a computer-aided software engineering environment. Furthermore, component models have been realized following carefully developed design patterns, which provide for an efficient and reusable implementation. The components have been ultimately implemented as prefabricated executable objects that can be linked together into an executable application. The development of embedded software using the COMDES framework is supported by the associated integrated engineering environment consisting of a number of tools, which support basic functionalities, such as system modelling, validation, and executable code generation for specific hardware platforms. Developing such an environment and the associated tools is a highly complex engineering task. Therefore, this thesis has investigated key design issues and analysed existing platforms supporting model-driven software development

  2. EpiTools, A software suite for presurgical brain mapping in epilepsy: Intracerebral EEG.

    Science.gov (United States)

    Medina Villalon, S; Paz, R; Roehri, N; Lagarde, S; Pizzo, F; Colombet, B; Bartolomei, F; Carron, R; Bénar, C-G

    2018-03-29

    In pharmacoresistant epilepsy, exploration with depth electrodes can be needed to precisely define the epileptogenic zone. Accurate location of these electrodes is thus essential for the interpretation of stereotaxic EEG (SEEG) signals. As SEEG analysis increasingly relies on signal processing, it is crucial to link these results to the patient's anatomy. Our aims were thus to develop a suite of software tools, called "EpiTools", able to i) precisely and automatically localize the position of each SEEG contact and ii) display the results of signal analysis within each patient's anatomy. The first tool, GARDEL (GUI for Automatic Registration and Depth Electrode Localization), automatically localizes SEEG contacts and labels each contact according to a pre-specified nomenclature (for instance that of FreeSurfer or MarsAtlas). The second tool, 3Dviewer, enables visualization, within the 3D anatomy of the patient, of signal processing results such as rates of biomarkers, connectivity graphs or the Epileptogenicity Index. GARDEL was validated in 30 patients by clinicians and proved to be highly reliable in determining the actual location of contacts within each patient's individual anatomy. GARDEL is a fully automatic electrode localization tool needing limited user interaction (only for electrode naming or contact correction). The 3Dviewer is able to read signal processing results and to display them in relation to the patient's anatomy. EpiTools can help speed up the interpretation of SEEG data and improve its precision.

  3. Development of software tools for supporting building clearance and site release at UKAEA

    International Nuclear Information System (INIS)

    Jessop, G.; Pearl, M.

    2002-01-01

    UKAEA sites generally have complex histories and have been subject to a diverse range of nuclear operations. Most of the nuclear reactors, laboratories, workshops and other support facilities are now redundant and a programme of decommissioning works in accordance with IAEA guidance is in progress. Decommissioning is being carried out in phases with post-operative activities, care and maintenance and care and surveillance periods between stages to allow relatively short-lived radioactivity to decay. This reduces dose levels to personnel and minimises radioactive waste production. Following on from these stages is an end point phase which corresponds to the point at which the risks to human health and the environment are sufficiently low that the buildings / land can be released for future use. Unconditional release corresponds to meeting the requirement for 'de-licensing'. Although reaching a de-licensable end point is the desired aim for UKAEA sites, it is recognised that this may take hundreds of years for parts of some UKAEA sites, or may never be attainable at a reasonable cost to the UK taxpayer. Thus on these sites, long term risk management systems are in place to minimise the impact on health, safety and the environment. In order to manage these short, medium and long term liabilities, UKAEA has developed a number of software tools based on good practice guidance. One of these tools in particular is being developed to address building clearance and site release. This tool, IMAGES (Information Management and Geographical Information System), integrates systematic data capture with database management and spatial assessment (through a Geographical Information System). Details of IMAGES and its applications are discussed in the paper. This paper outlines the approach being adopted by UKAEA for building and site release and the integrated software system, IMAGES, being used to capture, collate, interpret and report results. The key to UKAEA's strategy for

  4. Circular Hough transform diffraction analysis: A software tool for automated measurement of selected area electron diffraction patterns within Digital MicrographTM

    International Nuclear Information System (INIS)

    Mitchell, D.R.G.

    2008-01-01

    A software tool (script and plugin) for computing circular Hough transforms (CHT) in Digital Micrograph™ has been developed, for the purpose of automated analysis of selected area electron diffraction patterns (SADPs) of polycrystalline materials. The CHT enables the diffraction pattern centre to be determined with sub-pixel accuracy, regardless of the exposure condition of the transmitted beam or if a beam stop is present. Radii of the diffraction rings can also be accurately measured with sub-pixel precision. If the pattern is calibrated against a known camera length, then d-spacings with an accuracy of better than 1% can be obtained. These measurements require no a priori knowledge of the pattern and very limited user interaction. The accuracy of the CHT is degraded by distortion introduced by the projector lens, and this should be minimised prior to pattern acquisition. A number of optimisations in the CHT software enable rapid processing of patterns; a typical analysis of a 1k×1k image takes just a few minutes. The CHT tool appears robust and is even able to accurately measure SADPs with very incomplete diffraction rings due to texture effects. This software tool is freely downloadable via the Internet.
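
    The same measurement principle can be sketched with scikit-image's circular Hough transform, standing in here for the Digital Micrograph implementation: a synthetic ring is generated, and its centre and radius are recovered from the accumulator peaks.

```python
import numpy as np
from skimage.draw import circle_perimeter
from skimage.transform import hough_circle, hough_circle_peaks

# Build a mock diffraction ring: centre at (row=128, col=130), radius 45 px.
edges = np.zeros((256, 256), dtype=bool)
rr, cc = circle_perimeter(128, 130, 45)
edges[rr, cc] = True

# Accumulate votes over candidate radii and take the strongest circle.
radii = np.arange(30, 60)
accumulator = hough_circle(edges, radii)
_, cx, cy, found_r = hough_circle_peaks(accumulator, radii, total_num_peaks=1)
print(cx[0], cy[0], found_r[0])  # -> 130 128 45

# With a calibrated camera length L and pixel size p, a ring radius r maps
# to a d-spacing via d = wavelength * L / (r * p) (small-angle approximation).
```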

  5. Educational software tool for protection system engineers: distance relay; Herramienta educativa para la formacion de ingenieros en protecciones electricas: relevador de distancia

    Energy Technology Data Exchange (ETDEWEB)

    Trujillo-Guajardo, L.A.; Conde-Enriquez, A. [Universidad Autonoma de Nuevo Leon, Nuevo Leon (Mexico)]. E-mail: luistrujillo84@gmail.com; con_de@yahoo.com

    2012-04-15

    In this article, a graphical software tool aimed at the education of protection-system engineers is presented. The theoretical foundations used in the design of distance-relay operating characteristics, together with their algorithms, are presented. The software allows the evaluation and analysis of real-time or simulated events at every stage of distance-relay design. Example cases are presented to illustrate the activities that can be carried out with the graphical software tool developed.

  6. Proceedings of the Ninth Annual Software Engineering Workshop

    Science.gov (United States)

    1984-01-01

    Experiences in measurement, utilization, and evaluation of software methodologies, models, and tools are discussed. NASA's involvement in ever larger and more complex systems, like the space station project, provides a motive for the support of software engineering research and the exchange of ideas in such forums. The topics of current SEL research are software error studies, experiments with software development, and software tools.

  7. Paladin Software Support Lab

    Data.gov (United States)

    Federal Laboratory Consortium — The Paladin Software Support Environment (SSE) occupies 2,241 square feet. It contains the hardware and software tools required to support the Paladin Automatic Fire...

  8. A new software tool for computing Earth's atmospheric transmission of near- and far-infrared radiation

    Science.gov (United States)

    Lord, Steven D.

    1992-01-01

    This report describes a new software tool, ATRAN, which computes the transmittance of Earth's atmosphere at near- and far-infrared wavelengths. We compare the capabilities of this program with others currently available and demonstrate its utility for observational data calibration and reduction. The program employs current water-vapor and ozone models to produce fast and accurate transmittance spectra for wavelengths ranging from 0.8 microns to 10 mm.
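
    The core quantity such a tool computes can be illustrated with the Beer-Lambert law, T(λ) = exp(−τ(λ)·airmass); the optical depths below are placeholders, not output of ATRAN's water-vapor or ozone models.

```python
import math

# Transmittance from per-wavelength optical depths via Beer-Lambert.
def transmittance(tau_by_wavelength, airmass=1.0):
    return {wl: math.exp(-tau * airmass)
            for wl, tau in tau_by_wavelength.items()}

# Placeholder optical depths at a few wavelengths (microns).
tau_by_micron = {2.2: 0.05, 10.0: 0.30, 350.0: 2.5}
for wl, t in transmittance(tau_by_micron, airmass=1.5).items():
    print(f"{wl} um: T = {t:.3f}")
```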

  9. Tools for Analyzing Computing Resource Management Strategies and Algorithms for SDR Clouds

    Science.gov (United States)

    Marojevic, Vuk; Gomez-Miguelez, Ismael; Gelonch, Antoni

    2012-09-01

    Software defined radio (SDR) clouds centralize the computing resources of base stations. The computing resource pool is shared between radio operators and dynamically loads and unloads digital signal processing chains for providing wireless communications services on demand. Each new user session request particularly requires the allocation of computing resources for executing the corresponding SDR transceivers. The huge amount of computing resources of SDR cloud data centers and the numerous session requests at certain hours of a day require an efficient computing resource management. We propose a hierarchical approach, where the data center is divided into clusters that are managed in a distributed way. This paper presents a set of computing resource management tools for analyzing computing resource management strategies and algorithms for SDR clouds. We use the tools to evaluate different strategies and algorithms. The results show that more sophisticated algorithms can achieve higher resource occupations and that a tradeoff exists between cluster size and algorithm complexity.
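
    One simple strategy such tools can evaluate is first-fit assignment of transceiver sessions to clusters, sketched below with assumed cluster capacities and per-session compute costs in arbitrary units.

```python
# First-fit placement of SDR sessions onto equally sized compute clusters.
def first_fit(sessions, cluster_capacity, n_clusters):
    free = [cluster_capacity] * n_clusters
    placement, rejected = {}, []
    for session_id, cost in sessions:
        for cluster in range(n_clusters):
            if free[cluster] >= cost:
                free[cluster] -= cost
                placement[session_id] = cluster
                break
        else:
            rejected.append(session_id)   # no cluster can host the session
    occupation = 1 - sum(free) / (cluster_capacity * n_clusters)
    return placement, rejected, occupation

sessions = [("s1", 40), ("s2", 70), ("s3", 50), ("s4", 30)]
print(first_fit(sessions, cluster_capacity=100, n_clusters=2))
```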

  10. GraphCrunch 2: Software tool for network modeling, alignment and clustering.

    Science.gov (United States)

    Kuchaiev, Oleksii; Stevanović, Aleksandar; Hayes, Wayne; Pržulj, Nataša

    2011-01-19

    Recent advancements in experimental biotechnology have produced large amounts of protein-protein interaction (PPI) data. The topology of PPI networks is believed to have a strong link to their function. Hence, the abundance of PPI data for many organisms stimulates the development of computational techniques for the modeling, comparison, alignment, and clustering of networks. In addition, finding representative models for PPI networks will improve our understanding of the cell just as a model of gravity has helped us understand planetary motion. To decide if a model is representative, we need quantitative comparisons of model networks to real ones. However, exact network comparison is computationally intractable and therefore several heuristics have been used instead. Some of these heuristics are easily computable "network properties," such as the degree distribution, or the clustering coefficient. An important special case of network comparison is the network alignment problem. Analogous to sequence alignment, this problem asks to find the "best" mapping between regions in two networks. It is expected that network alignment might have as strong an impact on our understanding of biology as sequence alignment has had. Topology-based clustering of nodes in PPI networks is another example of an important network analysis problem that can uncover relationships between interaction patterns and phenotype. We introduce the GraphCrunch 2 software tool, which addresses these problems. It is a significant extension of GraphCrunch which implements the most popular random network models and compares them with the data networks with respect to many network properties. Also, GraphCrunch 2 implements the GRAph ALigner algorithm ("GRAAL") for purely topological network alignment. GRAAL can align any pair of networks and exposes large, dense, contiguous regions of topological and functional similarities far larger than any other existing tool. Finally, GraphCrunch 2 implements an

  11. GraphCrunch 2: Software tool for network modeling, alignment and clustering

    Directory of Open Access Journals (Sweden)

    Hayes Wayne

    2011-01-01

    Background: Recent advancements in experimental biotechnology have produced large amounts of protein-protein interaction (PPI) data. The topology of PPI networks is believed to have a strong link to their function. Hence, the abundance of PPI data for many organisms stimulates the development of computational techniques for the modeling, comparison, alignment, and clustering of networks. In addition, finding representative models for PPI networks will improve our understanding of the cell just as a model of gravity has helped us understand planetary motion. To decide if a model is representative, we need quantitative comparisons of model networks to real ones. However, exact network comparison is computationally intractable and therefore several heuristics have been used instead. Some of these heuristics are easily computable "network properties," such as the degree distribution, or the clustering coefficient. An important special case of network comparison is the network alignment problem. Analogous to sequence alignment, this problem asks to find the "best" mapping between regions in two networks. It is expected that network alignment might have as strong an impact on our understanding of biology as sequence alignment has had. Topology-based clustering of nodes in PPI networks is another example of an important network analysis problem that can uncover relationships between interaction patterns and phenotype. Results: We introduce the GraphCrunch 2 software tool, which addresses these problems. It is a significant extension of GraphCrunch which implements the most popular random network models and compares them with the data networks with respect to many network properties. Also, GraphCrunch 2 implements the GRAph ALigner algorithm ("GRAAL") for purely topological network alignment. GRAAL can align any pair of networks and exposes large, dense, contiguous regions of topological and functional similarities far larger than any other
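
    The model-versus-data comparison that GraphCrunch automates can be sketched with networkx: compute easily computable properties for a data network and for an Erdős-Rényi random model of the same size and density. The karate club graph stands in below for a PPI network.

```python
import networkx as nx

# Compare a data network with an ER random model of equal size and density.
data = nx.karate_club_graph()                  # stand-in for a PPI network
n, m = data.number_of_nodes(), data.number_of_edges()
model = nx.gnm_random_graph(n, m, seed=42)     # ER model, same n and m

for name, g in (("data", data), ("ER model", model)):
    print(name,
          "clustering:", round(nx.average_clustering(g), 3),
          "max degree:", max(dict(g.degree()).values()))
```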

  12. The State of Software for Evolutionary Biology.

    Science.gov (United States)

    Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros

    2018-05-01

    With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code and consequently, also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and JAVA (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfying, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development.

  13. MAPIT: A new software tool to assist in the transition from conceptual model to numerical simulation models

    International Nuclear Information System (INIS)

    Canales, T.W.; Grant, C.W.

    1996-01-01

    MapIt is a new software tool developed at Lawrence Livermore National Laboratory to assist ground water remediation professionals in generating numerical simulation models from a variety of physical and chemical data sources and the corresponding 1-, 2-, and 3-dimensional conceptual models that emerge from analysis of such data.

  14. Cognitive ability rivals the effect of political sophistication on ideological voting

    DEFF Research Database (Denmark)

    Hebbelstrup Rye Rasmussen, Stig

    2016-01-01

    This article examines the impact of cognitive ability on ideological voting. We find, using a US sample and a Danish sample, that the effect of cognitive ability rivals the effect of political sophistication, traditionally the strongest predictor of ideological voting. Furthermore, the results are consistent with the effect of cognitive ability being partly mediated by political sophistication. Much of the effect of cognitive ability remains, however, and is not explained by differences in education or Openness to experience either. The implications of these results for democratic theory are discussed.

  15. Interoperable mesh and geometry tools for advanced petascale simulations

    International Nuclear Information System (INIS)

    Diachin, L; Bauer, A; Fix, B; Kraftcheck, J; Jansen, K; Luo, X; Miller, M; Ollivier-Gooch, C; Shephard, M S; Tautges, T; Trease, H

    2007-01-01

    SciDAC applications have a demonstrated need for advanced software tools to manage the complexities associated with sophisticated geometry, mesh, and field manipulation tasks, particularly as computer architectures move toward the petascale. The Center for Interoperable Technologies for Advanced Petascale Simulations (ITAPS) will deliver interoperable and interchangeable mesh, geometry, and field manipulation services that are of direct use to SciDAC applications. The premise of our technology development goal is to provide such services as libraries that can be used with minimal intrusion into application codes. To develop these technologies, we focus on defining a common data model and data-structure neutral interfaces that unify a number of different services such as mesh generation and improvement, front tracking, adaptive mesh refinement, shape optimization, and solution transfer operations. We highlight the use of several ITAPS services in SciDAC applications

  16. Error-Free Software

    Science.gov (United States)

    1989-01-01

    001 is an integrated tool suite for automatically developing ultrareliable models, simulations and software systems. Developed and marketed by Hamilton Technologies, Inc. (HTI), it has been applied in engineering, manufacturing, banking and software tools development. The software provides the ability to simplify the complex. A system developed with 001 can be a prototype or fully developed with production quality code. It is free of interface errors, consistent, logically complete and has no data or control flow errors. Systems can be designed, developed and maintained with maximum productivity. Margaret Hamilton, President of Hamilton Technologies, also directed the research and development of USE.IT, an earlier product which was the first computer aided software engineering product in the industry to concentrate on automatically supporting the development of an ultrareliable system throughout its life cycle. Both products originated in NASA technology developed under a Johnson Space Center contract.

  17. 2016 KIVA-hpFE Development: A Robust and Accurate Engine Modeling Software

    Energy Technology Data Exchange (ETDEWEB)

    Carrington, David Bradley [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Waters, Jiajia [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-10-25

    Los Alamos National Laboratory and its collaborators are facilitating engine modeling by improving the accuracy and robustness of the modeling and the robustness of the software. We also continue to improve the physical modeling methods, developing and implementing new mathematical algorithms that represent the physics within an engine. We provide software that others may use directly or alter with various models, e.g., sophisticated chemical kinetics, different turbulence closure methods, or other fuel injection and spray systems.

  18. Evaluation of high-performance computing software

    Energy Technology Data Exchange (ETDEWEB)

    Browne, S.; Dongarra, J. [Univ. of Tennessee, Knoxville, TN (United States); Rowan, T. [Oak Ridge National Lab., TN (United States)

    1996-12-31

    The absence of unbiased and up-to-date comparative evaluations of high-performance computing software complicates a user's search for the appropriate software package. The National HPCC Software Exchange (NHSE) is attacking this problem using an approach that includes independent evaluations of software, incorporation of author and user feedback into the evaluations, and Web access to the evaluations. We are applying this approach to the Parallel Tools Library (PTLIB), a new software repository for parallel systems software and tools, and HPC-Netlib, a high performance branch of the Netlib mathematical software repository. Updating the evaluations with feedback and making them available via the Web helps ensure accuracy and timeliness, and using independent reviewers produces unbiased comparative evaluations difficult to find elsewhere.

  19. Techniques and software tools for estimating ultrasonic signal-to-noise ratios

    Science.gov (United States)

    Chiou, Chien-Ping; Margetan, Frank J.; McKillip, Matthew; Engle, Brady J.; Roberts, Ronald A.

    2016-02-01

    At Iowa State University's Center for Nondestructive Evaluation (ISU CNDE), the use of models to simulate ultrasonic inspections has played a key role in R&D efforts for over 30 years. To this end a series of wave propagation models, flaw response models, and microstructural backscatter models have been developed to address inspection problems of interest. One use of the combined models is the estimation of signal-to-noise ratios (S/N) in circumstances where backscatter from the microstructure (grain noise) acts to mask sonic echoes from internal defects. Such S/N models have been used in the past to address questions of inspection optimization and reliability. Under the sponsorship of the National Science Foundation's Industry/University Cooperative Research Center at ISU, an effort was recently initiated to improve existing research-grade software by adding a graphical user interface (GUI), creating user-friendly tools for the rapid estimation of S/N for ultrasonic inspections of metals. The software combines: (1) a Python-based GUI for specifying an inspection scenario and displaying results; and (2) a Fortran-based engine for computing defect signal and backscattered grain noise characteristics. The latter makes use of several models including: the Multi-Gaussian Beam Model for computing sonic fields radiated by commercial transducers; the Thompson-Gray Model for the response from an internal defect; the Independent Scatterer Model for backscattered grain noise; and the Stanke-Kino Unified Model for attenuation. The initial emphasis was on reformulating the research-grade code into a suitable modular form, adding the graphical user interface and performing computations rapidly and robustly. Thus the initial inspection problem being addressed is relatively simple. A normal-incidence pulse/echo immersion inspection is simulated for a curved metal component having a non-uniform microstructure, specifically an equiaxed, untextured microstructure in which the average
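
    The headline S/N figure can be illustrated as the peak defect-signal amplitude over the RMS of the grain noise, sketched below on synthetic waveforms rather than the model-predicted ones.

```python
import numpy as np

# Peak-to-RMS S/N on synthetic placeholder waveforms.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 2000)                  # time, microseconds
grain_noise = 0.05 * rng.standard_normal(t.size)  # backscatter stand-in
defect_echo = 0.4 * np.exp(-((t - 5.0) / 0.1) ** 2) * np.sin(2 * np.pi * 5 * t)
signal = defect_echo + grain_noise

snr = np.max(np.abs(signal)) / np.sqrt(np.mean(grain_noise ** 2))
print(f"peak-to-RMS S/N = {snr:.1f}")
```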

  20. SSAGES: Software Suite for Advanced General Ensemble Simulations

    Science.gov (United States)

    Sidky, Hythem; Colón, Yamil J.; Helfferich, Julian; Sikora, Benjamin J.; Bezik, Cody; Chu, Weiwei; Giberti, Federico; Guo, Ashley Z.; Jiang, Xikai; Lequieu, Joshua; Li, Jiyuan; Moller, Joshua; Quevillon, Michael J.; Rahimi, Mohammad; Ramezani-Dakhel, Hadi; Rathee, Vikramjit S.; Reid, Daniel R.; Sevgen, Emre; Thapar, Vikram; Webb, Michael A.; Whitmer, Jonathan K.; de Pablo, Juan J.

    2018-01-01

    Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulations packages. SSAGES allows facile application of a variety of enhanced sampling techniques—including adaptive biasing force, string methods, and forward flux sampling—that extract meaningful free energy and transition path data from all-atom and coarse-grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite. The code may be found at: https://github.com/MICCoM/SSAGES-public.
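
    The flavor of enhanced sampling can be illustrated in a few lines: adding a harmonic (umbrella-style) bias to a 1D double-well potential lets Metropolis Monte Carlo visit the barrier region far more often. This is a toy stand-in; SSAGES couples such biases to full molecular dynamics engines.

```python
import math, random

def double_well(x):
    return (x ** 2 - 1.0) ** 2           # minima at x = -1 and x = +1

def bias(x, center=0.0, k=10.0):
    return 0.5 * k * (x - center) ** 2   # umbrella restraint at the barrier

def sample(n_steps=50000, beta=5.0, use_bias=True):
    x, hits = -1.0, 0
    random.seed(1)
    for _ in range(n_steps):
        trial = x + random.uniform(-0.2, 0.2)
        du = double_well(trial) - double_well(x)
        if use_bias:
            du += bias(trial) - bias(x)
        if du <= 0 or random.random() < math.exp(-beta * du):
            x = trial
        hits += abs(x) < 0.2             # time spent near the barrier top
    return hits / n_steps

print("barrier occupancy, unbiased:", sample(use_bias=False))
print("barrier occupancy, biased:  ", sample(use_bias=True))
```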

  1. Software component quality evaluation

    Science.gov (United States)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  2. SNPdetector: a software tool for sensitive and accurate SNP detection.

    Directory of Open Access Journals (Sweden)

    Jinghui Zhang

    2005-10-01

    Identification of single nucleotide polymorphisms (SNPs) and mutations is important for the discovery of genetic predisposition to complex diseases. PCR resequencing is the method of choice for de novo SNP discovery. However, manual curation of putative SNPs has been a major bottleneck in the application of this method to high-throughput screening. Therefore it is critical to develop a more sensitive and accurate computational method for automated SNP detection. We developed a software tool, SNPdetector, for automated identification of SNPs and mutations in fluorescence-based resequencing reads. SNPdetector was designed to model the process of human visual inspection and has a very low false positive and false negative rate. We demonstrate the superior performance of SNPdetector in SNP and mutation analysis by comparing its results with those derived by human inspection, PolyPhred (a popular SNP detection tool), and independent genotype assays in three large-scale investigations. The first study identified and validated inter- and intra-subspecies variations in 4,650 traces of 25 inbred mouse strains that belong to either the Mus musculus species or the M. spretus species. Unexpected heterozygosity in the CAST/Ei strain was observed in two out of 1,167 mouse SNPs. The second study identified 11,241 candidate SNPs in five ENCODE regions of the human genome covering 2.5 Mb of genomic sequence. Approximately 50% of the candidate SNPs were selected for experimental genotyping; the validation rate exceeded 95%. The third study detected ENU-induced mutations (at 0.04% allele frequency) in 64,896 traces of 1,236 zebra fish. Our analysis of three large and diverse test datasets demonstrated that SNPdetector is an effective tool for genome-scale research and for large-sample clinical studies. SNPdetector runs on the Unix/Linux platform and is available publicly (http://lpg.nci.nih.gov).

  3. Fighting Software Piracy: Some Global Conditional Policy Instruments

    OpenAIRE

    Asongu, Simplice A; Singh, Pritam; Le Roux, Sara

    2016-01-01

    This study examines the efficiency of tools for fighting software piracy in the conditional distributions of software piracy. Our paper examines software piracy in 99 countries for the period 1994-2010, using contemporary and non-contemporary quantile regressions. The intuition for modelling distributions contingent on existing levels of software piracy is that the effectiveness of tools against piracy may consistently decrease or increase simultaneously with increasing levels of software pir...

  4. Tool support for distributed software engineering

    NARCIS (Netherlands)

    Spanjers, H.; Ter Huurne, M.; Bendas, D.; Graaf, B.; Lormans, M.; Van Solingen, R.

    2006-01-01

    Developing a software system in collaboration with other partners, and on different geographical locations is a big challenge for organizations. In this article we first discuss a system that automates build and test processes: SoftFab. This system has been successfully applied in practice in the

  5. The IceCube Data Acquisition Software: Lessons Learned during Distributed, Collaborative, Multi-Disciplined Software Development.

    Energy Technology Data Exchange (ETDEWEB)

    Beattie, Keith S.; Day, Christopher; Glowacki, Dave; Hanson, Kael; Jacobsen, John; McParland, Charles; Patton, Simon

    2007-09-21

    In this experiential paper we report on lessons learned during the development of the data acquisition software for the IceCube project - specifically, how to effectively address the unique challenges presented by a distributed, collaborative, multi-institutional, multi-disciplined project such as this. While development progress in software projects is often described solely in terms of technical issues, our experience indicates that non- and quasi-technical interactions play a substantial role in the effectiveness of large software development efforts. These include: selection and management of multiple software development methodologies, the effective use of various collaborative communication tools, project management structure and roles, and the impact and apparent importance of these elements when viewed through the differing perspectives of hardware, software, scientific and project office roles. Even in areas clearly technical in nature, success is still influenced by non-technical issues that can escape close attention. In particular we describe our experiences on software requirements specification, development methodologies and communication tools. We make observations on what tools and techniques have and have not been effective in this geographically dispersed (including the South Pole) collaboration and offer suggestions on how similarly structured future projects may build upon our experiences.

  6. Software Safety and Security

    CERN Document Server

    Nipkow, T; Hauptmann, B

    2012-01-01

    Recent decades have seen major advances in methods and tools for checking the safety and security of software systems. Automatic tools can now detect security flaws not only in programs of the order of a million lines of code, but also in high-level protocol descriptions. There has also been something of a breakthrough in the area of operating system verification. This book presents the lectures from the NATO Advanced Study Institute on Tools for Analysis and Verification of Software Safety and Security, a summer school held at Bayrischzell, Germany, in 2011. This Advanced Study Institute was

  7. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards and procedures. In software development, software quality means meeting requirements and achieving a degree of excellence and refinement in a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process that is used to develop a product. The process is measured to improve it and the product is measured to increase quality throughout the life cycle of software. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If software metrics are implemented in software development, they can save time and money, and allow the organization to identify the causes of defects which have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that have been used in other projects but are not currently being used by the SA team, and report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.

  8. Streamlining the Design-to-Build Transition with Build-Optimization Software Tools.

    Science.gov (United States)

    Oberortner, Ernst; Cheng, Jan-Fang; Hillson, Nathan J; Deutsch, Samuel

    2017-03-17

    Scaling-up capabilities for the design, build, and test of synthetic biology constructs holds great promise for the development of new applications in fuels, chemical production, or cellular-behavior engineering. Construct design is an essential component in this process; however, not every designed DNA sequence can be readily manufactured, even using state-of-the-art DNA synthesis methods. Current biological computer-aided design and manufacture tools (bioCAD/CAM) do not adequately consider the limitations of DNA synthesis technologies when generating their outputs. Designed sequences that violate DNA synthesis constraints may require substantial sequence redesign or lead to price-premiums and temporal delays, which adversely impact the efficiency of the DNA manufacturing process. We have developed a suite of build-optimization software tools (BOOST) to streamline the design-build transition in synthetic biology engineering workflows. BOOST incorporates knowledge of DNA synthesis success determinants into the design process to output ready-to-build sequences, preempting the need for sequence redesign. The BOOST web application is available at https://boost.jgi.doe.gov and its Application Program Interfaces (API) enable integration into automated, customized DNA design processes. The herein presented results highlight the effectiveness of BOOST in reducing DNA synthesis costs and timelines.
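
    Typical synthesis-constraint checks can be sketched as below; the GC-content window and homopolymer limit are commonly cited constraints assumed here for illustration, not BOOST's actual rule set.

```python
import re

# Flag sequences that violate two assumed synthesizability constraints:
# overall GC content and maximum homopolymer run length.

def synthesis_violations(seq, gc_range=(0.25, 0.65), max_homopolymer=8):
    seq = seq.upper()
    violations = []
    gc = (seq.count("G") + seq.count("C")) / len(seq)
    if not gc_range[0] <= gc <= gc_range[1]:
        violations.append(f"GC content {gc:.2f} outside {gc_range}")
    pattern = "(A{%d,}|C{%d,}|G{%d,}|T{%d,})" % ((max_homopolymer + 1,) * 4)
    run = re.search(pattern, seq)
    if run:
        violations.append(f"homopolymer run {run.group(0)} at {run.start()}")
    return violations

print(synthesis_violations("ATGC" * 10 + "A" * 12))  # flags the poly-A run
```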

  9. Performance evaluation of spectral deconvolution analysis tool (SDAT) software used for nuclear explosion radionuclide measurements

    International Nuclear Information System (INIS)

    Foltz Biegalski, K.M.; Biegalski, S.R.; Haas, D.A.

    2008-01-01

    The Spectral Deconvolution Analysis Tool (SDAT) software was developed to improve counting statistics and detection limits for nuclear explosion radionuclide measurements. SDAT utilizes spectral deconvolution spectroscopy techniques and can analyze both β-γ coincidence spectra for radioxenon isotopes and high-resolution HPGe spectra from aerosol monitors. Spectral deconvolution spectroscopy is an analysis method that utilizes the entire signal deposited in a gamma-ray detector rather than the small portion of the signal that is present in one gamma-ray peak. This method shows promise to improve detection limits over classical gamma-ray spectroscopy analytical techniques; however, this hypothesis has not been tested. To address this issue, we performed three tests to compare the detection ability and variance of SDAT results to those of commercial off-the-shelf (COTS) software which utilizes a standard peak search algorithm. (author)
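
    The sketch below illustrates the core idea of spectral deconvolution as described above: fit the entire measured spectrum as a non-negative mixture of per-nuclide detector response templates rather than analyzing single peaks. The Gaussian templates and activities are fabricated for illustration; this is not SDAT's algorithm or data.

```python
# Sketch of whole-spectrum deconvolution: model the measured spectrum as a
# non-negative mixture of per-nuclide detector response templates and fit
# the activities with non-negative least squares.
import numpy as np
from scipy.optimize import nnls

channels = np.arange(512)

def template(center, width):
    t = np.exp(-0.5 * ((channels - center) / width) ** 2)
    return t / t.sum()          # unit-area detector response per nuclide

A = np.column_stack([template(100, 8), template(250, 10), template(400, 12)])
true_activities = np.array([500.0, 120.0, 0.0])
spectrum = np.random.default_rng(0).poisson(A @ true_activities)

activities, residual = nnls(A, spectrum.astype(float))
print("fitted activities:", activities.round(1))
```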

  10. Producing and supporting sharable software

    International Nuclear Information System (INIS)

    Johnstad, H.; Nicholls, J.

    1987-02-01

    A survey is reported that addressed the question of shareable software for the High Energy Physics community. Statistics are compiled from the responses of 54 people attending a conference on the subject of shareable software to a questionnaire which addressed the usefulness of shareable software, preferences in programming language, and source management tools. The results reflect a continued need for shareable software in the High Energy Physics community and for this effort to be performed in a coordinated way. A strong mandate is also claimed for large facilities to support the community with software and to act as distribution points. Considerable interest is expressed in languages other than FORTRAN, and the desire for standards or rules in programming is expressed. A need is identified for source management tools.

  11. Toward Intelligent Software Defect Detection

    Science.gov (United States)

    Benson, Markland J.

    2011-01-01

    Source code level software defect detection has gone from state of the art to a software engineering best practice. Automated code analysis tools streamline many of the aspects of formal code inspections but have the drawback of being difficult to construct and either prone to false positives or severely limited in the set of defects that can be detected. Machine learning technology provides the promise of learning software defects by example, easing construction of detectors and broadening the range of defects that can be found. Pinpointing software defects with the same level of granularity as prominent source code analysis tools distinguishes this research from past efforts, which focused on analyzing software engineering metrics data with granularity limited to that of a particular function rather than a line of code.

  12. Increasing quality and managing complexity in neuroinformatics software development with continuous integration

    Directory of Open Access Journals (Sweden)

    Yury V. Zaytsev

    2013-01-01

    Full Text Available High quality neuroscience research requires accurate, reliable and well maintained neuroinformatics applications. As software projects become larger, offering more functionality and developing a denser web of interdependence between their component parts, we need more sophisticated methods to manage their complexity. If complexity is allowed to get out of hand, either the quality of the software or the speed of development suffers, and in many cases both. To address this issue, here we develop a scalable, low-cost and open source solution for continuous integration (CI), a technique which ensures the quality of changes to the code base during the development procedure, rather than relying on a pre-release integration phase. We demonstrate that a CI-based workflow, due to rapid feedback about code integration problems and tracking of code health measures, enabled substantial increases in productivity for a major neuroinformatics project and additional benefits for three further projects. Beyond the scope of the current study, we identify multiple areas in which CI can be employed to further increase the quality of neuroinformatics projects by improving development practices and incorporating appropriate development tools. Finally, we discuss what measures can be taken to lower the barrier for developers of neuroinformatics applications to adopt this useful technique.

  13. Increasing quality and managing complexity in neuroinformatics software development with continuous integration.

    Science.gov (United States)

    Zaytsev, Yury V; Morrison, Abigail

    2012-01-01

    High quality neuroscience research requires accurate, reliable and well maintained neuroinformatics applications. As software projects become larger, offering more functionality and developing a denser web of interdependence between their component parts, we need more sophisticated methods to manage their complexity. If complexity is allowed to get out of hand, either the quality of the software or the speed of development suffers, and in many cases both. To address this issue, here we develop a scalable, low-cost and open source solution for continuous integration (CI), a technique which ensures the quality of changes to the code base during the development procedure, rather than relying on a pre-release integration phase. We demonstrate that a CI-based workflow, due to rapid feedback about code integration problems and tracking of code health measures, enabled substantial increases in productivity for a major neuroinformatics project and additional benefits for three further projects. Beyond the scope of the current study, we identify multiple areas in which CI can be employed to further increase the quality of neuroinformatics projects by improving development practices and incorporating appropriate development tools. Finally, we discuss what measures can be taken to lower the barrier for developers of neuroinformatics applications to adopt this useful technique.
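
    Both records above describe the same CI principle: every change triggers an automated build-and-test cycle so that integration problems surface at commit time rather than in a pre-release phase. A minimal sketch of such a commit gate follows; the two check commands are placeholder assumptions, not the CI configuration used by the authors.

```python
# A minimal commit gate in the spirit of CI: run every check on each change
# and reject the change on the first failure. The commands are placeholders.
import subprocess
import sys

STEPS = [
    ["python", "-m", "pytest", "-q"],   # unit and integration tests
    ["python", "-m", "pyflakes", "."],  # cheap static analysis
]

def check_commit() -> bool:
    for cmd in STEPS:
        print("running:", " ".join(cmd))
        if subprocess.run(cmd).returncode != 0:
            print("check failed: reject the change and notify the author")
            return False
    print("all checks passed: the change may be merged")
    return True

if __name__ == "__main__":
    sys.exit(0 if check_commit() else 1)
```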

  14. Core Community Specifications for Electron Microprobe Operating Systems: Software, Quality Control, and Data Management Issues

    Science.gov (United States)

    Fournelle, John; Carpenter, Paul

    2006-01-01

    Modern electron microprobe systems have become increasingly sophisticated. These systems utilize either UNIX or PC computer systems for measurement, automation, and data reduction. These systems have undergone major improvements in processing, storage, display, and communications, due to increased capabilities of hardware and software. Instrument specifications are typically utilized at the time of purchase and concentrate on hardware performance. The microanalysis community includes analysts, researchers, software developers, and manufacturers, who could benefit from exchange of ideas and the ultimate development of core community specifications (CCS) for hardware and software components of microprobe instrumentation and operating systems.

  15. SOFTWARE PROCESS IMPROVEMENT: AWARENESS, USE, AND BENEFITS IN CANADIAN SOFTWARE DEVELOPMENT FIRMS

    OpenAIRE

    CHEVERS, DELROY

    2017-01-01

    ABSTRACT Since 1982, the software development community has been concerned with the delivery of quality systems. Software process improvement (SPI) is an initiative to avoid the delivery of low quality systems. However, the awareness and adoption of SPI is low. Thus, this study examines the rate of awareness, use, and benefits of SPI initiatives in Canadian software development firms. Using SPSS as the analytical tool, this study found that 59% of Canadian software development firms are aware...

  16. Aircraft Design Software

    Science.gov (United States)

    1997-01-01

    Successful commercialization of the AirCraft SYNThesis (ACSYNT) tool has resulted in the creation of Phoenix Integration, Inc. ACSYNT has been exclusively licensed to the company, an outcome of a seven year, $3 million effort to provide unique software technology to a focused design engineering market. Ames Research Center formulated ACSYNT and in working with the Virginia Polytechnic Institute CAD Laboratory, began to design and code a computer-aided design for ACSYNT. Using a Joint Sponsored Research Agreement, Ames formed an industry-government-university alliance to improve and foster research and development for the software. As a result of the ACSYNT Institute, the software is becoming a predominant tool for aircraft conceptual design. ACSYNT has been successfully applied to high-speed civil transport configuration, subsonic transports, and supersonic fighters.

  17. The Relationship between Logistics Sophistication and Drivers of the Outsourcing of Logistics Activities

    Directory of Open Access Journals (Sweden)

    Peter Wanke

    2008-10-01

    Full Text Available A strong link has been established between operational excellence and the degree of sophistication of logistics organization, a function of factors such as performance monitoring, investment in Information Technology [IT] and the formalization of logistics organization, as proposed in the Bowersox, Daugherty, Dröge, Germain and Rogers (1992) Leading Edge model. At the same time, shippers have been increasingly outsourcing their logistics activities to third party providers. This paper, based on a survey with large Brazilian shippers, addresses a gap in the literature by investigating the relationship between dimensions of logistics organization sophistication and drivers of logistics outsourcing. To this end, the dimensions behind the logistics sophistication construct were first investigated. Results from factor analysis led to the identification of six dimensions of logistics sophistication. By means of multivariate logistic regression analyses it was possible to relate some of these dimensions, such as the formalization of the logistics organization, to certain drivers of the outsourcing of logistics activities of Brazilian shippers, such as cost savings. These results indicate the possibility of segmenting shippers according to characteristics of their logistics organization, which may be particularly useful to logistics service providers.

  18. Lexical Complexity Development from Dynamic Systems Theory Perspective: Lexical Density, Diversity, and Sophistication

    Directory of Open Access Journals (Sweden)

    Reza Kalantari

    2017-10-01

    Full Text Available This longitudinal case study explored Iranian EFL learners’ lexical complexity (LC) through the lenses of Dynamic Systems Theory (DST). Fifty independent essays written by five intermediate to advanced female EFL learners in a TOEFL iBT preparation course over six months constituted the corpus of this study. Three Coh-Metrix indices (Graesser, McNamara, Louwerse, & Cai, 2004; McNamara & Graesser, 2012), three Lexical Complexity Analyzer indices (Lu, 2010, 2012; Lu & Ai, 2011), and four Vocabprofile indices (Cobb, 2000) were selected to measure different dimensions of LC. Results of repeated measures analysis of variance (RM ANOVA) indicated an improvement with regard to only lexical sophistication. Positive and significant relationships were found between time and mean values in Academic Word List and Beyond-2000 as indicators of lexical sophistication. The remaining seven indices of LC, falling short of significance, tended to flatten over the course of this writing program. Correlation analyses among LC indices indicated that lexical density enjoyed positive correlations with lexical sophistication. However, lexical diversity revealed no significant correlations with either lexical density or lexical sophistication. This study suggests that the DST perspective provides a viable foundation for analyzing lexical complexity.

  19. LevRad software as a tool to learn how to proceed with a shielding adequacy analysis

    International Nuclear Information System (INIS)

    Ferreira, C.C.; Oliveira, R.A.P.; Souza, S.O.

    2009-01-01

    Since the discovery of X-rays by Roentgen in 1895, several recommendations about the hazards of this radiation source have been published. About 14% of the total annual worldwide collective effective dose originates from diagnostic X-ray examinations; in the UK, the collective effective dose from diagnostic X-ray examinations represents about 90% of the dose from all artificial sources. Diverse strategies have been pursued in an attempt to reduce the worldwide collective effective dose. We developed the LevRad software to teach how to carry out a shielding adequacy analysis of barriers against diagnostic X-rays, to minimize the contact of the professional or student with X-rays and, finally, to prevent wear on the X-ray equipment. Preliminary tests of the software indicate that LevRad is effective as a complementary tool for training professionals in diagnostic radiology. For students, the advantage is greatest when the software is used before their first contact with the X-ray equipment. The software imparts solid knowledge of shielding adequacy analysis, spares the X-ray tube the wear that recurs when shielding adequacy analysis is taught on real equipment, and reduces the collective effective dose by avoiding possible unnecessary exposures. (author)
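
    For readers unfamiliar with what a shielding adequacy analysis involves, the sketch below runs through a simplified, NCRP-style barrier calculation of the kind LevRad teaches, computing the required barrier transmission B = P·d²/(W·U·T) and an approximate thickness. All parameter values, and the single-TVL attenuation model, are illustrative assumptions only.

```python
# Simplified NCRP-style barrier adequacy calculation (B = P*d^2 / (W*U*T)).
# Every value below is an illustrative assumption, not a prescription for
# any real installation.
import math

P = 0.1    # design dose limit behind the barrier (mGy/week)
d = 3.0    # tube-to-occupied-area distance (m)
W = 500.0  # workload (mGy * m^2 per week at 1 m)
U = 0.25   # use factor: fraction of beam-on time aimed at this barrier
T = 1.0    # occupancy factor of the protected area

B = P * d ** 2 / (W * U * T)      # required barrier transmission
tvl_mm = 0.3                      # assumed tenth-value layer of lead (mm)
thickness_mm = tvl_mm * math.log10(1.0 / B)

print(f"required transmission B = {B:.4f}")
print(f"approximate lead thickness = {thickness_mm:.2f} mm")
```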

  20. Introduction of software tools for epidemiological surveillance in infection control in Colombia

    Science.gov (United States)

    Motoa, Gabriel; Vallejo, Marta; Blanco, Víctor M; Correa, Adriana; de la Cadena, Elsa; Villegas, María Virginia

    2015-01-01

    Introduction: Healthcare-Associated Infections (HAI) are a challenge for patient safety in hospitals. Infection control committees (ICC) should follow CDC definitions when monitoring HAI. The manual method of epidemiological surveillance (ES) may affect the sensitivity and specificity of the monitoring system, while electronic surveillance can improve the performance, quality and traceability of recorded information. Objective: To assess the implementation of a strategy for electronic surveillance of HAI, Bacterial Resistance and Antimicrobial Consumption by the ICC of 23 high-complexity clinics and hospitals in Colombia, during the period 2012-2013. Methods: An observational study evaluating the introduction of electronic tools in the ICC was performed; we evaluated the structure and operation of the ICC, the degree of incorporation of the software HAI Solutions and the adherence to recording the required information. Results: Thirty-eight percent of hospitals (8/23) had active surveillance strategies with standard criteria of the CDC, and 87% of institutions adhered to the module of identification of cases using the HAI Solutions software. In contrast, compliance with recording the risk factors for device-associated HAIs was 33%. Conclusions: The introduction of ES could achieve greater adherence to a standardized, prospective model of active surveillance, helping to improve the validity and quality of the recorded information. PMID:26309340

  1. Introduction of software tools for epidemiological surveillance in infection control in Colombia.

    Science.gov (United States)

    Hernández-Gómez, Cristhian; Motoa, Gabriel; Vallejo, Marta; Blanco, Víctor M; Correa, Adriana; de la Cadena, Elsa; Villegas, María Virginia

    2015-01-01

    Healthcare-Associated Infections (HAI) are a challenge for patient safety in hospitals. Infection control committees (ICC) should follow CDC definitions when monitoring HAI. The manual method of epidemiological surveillance (ES) may affect the sensitivity and specificity of the monitoring system, while electronic surveillance can improve the performance, quality and traceability of recorded information. To assess the implementation of a strategy for electronic surveillance of HAI, Bacterial Resistance and Antimicrobial Consumption by the ICC of 23 high-complexity clinics and hospitals in Colombia, during the period 2012-2013. An observational study evaluating the introduction of electronic tools in the ICC was performed; we evaluated the structure and operation of the ICC, the degree of incorporation of the software HAI Solutions and the adherence to recording the required information. Thirty-eight percent of hospitals (8/23) had active surveillance strategies with standard criteria of the CDC, and 87% of institutions adhered to the module of identification of cases using the HAI Solutions software. In contrast, compliance with recording the risk factors for device-associated HAIs was 33%. The introduction of ES could achieve greater adherence to a standardized, prospective model of active surveillance, helping to improve the validity and quality of the recorded information.

  2. DSC: software tool for simulation-based design of control strategies applied to wastewater treatment plants.

    Science.gov (United States)

    Ruano, M V; Ribes, J; Seco, A; Ferrer, J

    2011-01-01

    This paper presents a computer tool called DSC (Simulation based Controllers Design) that enables an easy design of control systems and strategies applied to wastewater treatment plants. Although the control systems are developed and evaluated by simulation, this tool aims to facilitate the direct implementation of the designed control system on the PC of the full-scale WWTP (wastewater treatment plant). The designed control system can be programmed in a dedicated control application and can be connected to either the simulation software or the SCADA of the plant. To this end, the developed DSC incorporates an OPC server (OLE for process control) which facilitates an open-standard communication protocol for different industrial process applications. The potential capabilities of the DSC tool are illustrated through the example of a full-scale application. An aeration control system applied to a nutrient-removing WWTP was designed, tuned and evaluated with the DSC tool before its implementation in the full-scale plant. The control parameters obtained by simulation were suitable for the full-scale plant with only a few modifications to improve the control performance. With the DSC tool, the performance of the control systems can be easily evaluated by simulation. Once developed and tuned by simulation, the control systems can be directly applied to the full-scale WWTP.
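
    As a toy illustration of the simulation-based controller design that DSC supports, the sketch below simulates a discrete PI loop driving dissolved oxygen to a setpoint by adjusting air flow. The one-state plant model and the gains are invented for illustration and bear no relation to the full-scale WWTP study.

```python
# Toy simulation-based design of an aeration controller: a discrete PI loop
# drives dissolved oxygen (DO) to a setpoint by adjusting air flow.

def simulate(setpoint=2.0, kp=40.0, ki=8.0, dt=0.01, steps=5000):
    do, integral = 0.5, 0.0              # initial DO (mg/L), error integral
    for _ in range(steps):
        error = setpoint - do
        integral += error * dt
        airflow = max(0.0, kp * error + ki * integral)  # PI law, no negative air
        # toy first-order DO response: oxygen transfer minus biological uptake
        do += dt * (0.05 * airflow - 0.8 * do)
    return do

print(f"DO after simulation: {simulate():.2f} mg/L (setpoint 2.0 mg/L)")
```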

  3. Workflow-Based Software Development Environment

    Science.gov (United States)

    Izygon, Michel E.

    2013-01-01

    The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process used to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (collaborative web environment).

  4. Object-oriented design of medical imaging software.

    Science.gov (United States)

    Ligier, Y; Ratib, O; Logean, M; Girard, C; Perrier, R; Scherrer, J R

    1994-01-01

    A special software package for interactive display and manipulation of medical images was developed at the University Hospital of Geneva, as part of a hospital wide Picture Archiving and Communication System (PACS). This software package, called Osiris, was especially designed to be easily usable and adaptable to the needs of noncomputer-oriented physicians. The Osiris software has been developed to allow the visualization of medical images obtained from any imaging modality. It provides generic manipulation tools, processing tools, and analysis tools more specific to clinical applications. This software, based on an object-oriented paradigm, is portable and extensible. Osiris is available on two different operating systems: the Unix X-11/OSF-Motif based workstations, and the Macintosh family.

  5. Beyond Bundles - Reproducible Software Environments with GNU Guix

    CERN Multimedia

    CERN. Geneva; Wurmus, Ricardo

    2018-01-01

    Building reproducible data analysis pipelines and numerical experiments is a key challenge for reproducible science, in which tools to reproduce software environments play a critical role. The advent of “container-based” deployment tools such as Docker and Singularity has made it easier to replicate software environments. These tools are very much about bundling the bits of software binaries in a convenient way, not so much about describing how software is composed. Science is not just about replicating, though—it demands the ability to inspect and to experiment. In this talk we will present GNU Guix, a software management toolkit. Guix departs from container-based solutions in that it enables declarative composition of software environments. It is comparable to “package managers” like apt or yum, but with a significant difference: Guix provides accurate provenance tracking of build artifacts, and bit-reproducible software. We will illustrate the many ways in which Guix can improve how software en...

  6. Software Maintenance Exercises for a Software Engineering Project Course

    Science.gov (United States)

    1989-02-01

    What is program style and how can it be measured? Program style has been defined as a "followed convention with respect to punctuation, capitalization, and typographic arrangement and display." DASC is a software tool that takes a syntactically... Related curriculum modules: Software Specifications: A Framework; CM-12 Software Metrics; CM-13 Introduction to Software Verification and Validation; CM-14 Intellectual Property Protection for...

  7. Reliability of a computer software angle tool for measuring spine and pelvic flexibility during the sit-and-reach test.

    Science.gov (United States)

    Mier, Constance M; Shapiro, Belinda S

    2013-02-01

    The purpose of this study was to determine the reliability of a computer software angle tool that measures thoracic (T), lumbar (L), and pelvic (P) angles as a means of evaluating spine and pelvic flexibility during the sit-and-reach (SR) test. Thirty adults performed the SR twice on separate days. The SR test was captured on video and later analyzed for T, L, and P angles using the computer software angle tool. During the test, 3 markers were placed over T1, T12, and L5 vertebrae to identify T, L, and P angles. Intraclass correlation coefficient (ICC) indicated a very high internal consistency (between trials) for T, L, and P angles (0.95-0.99); thus, the average of trials was used for test-retest (between days) reliability. Mean (±SD) values did not differ between days for T (51.0 ± 14.3 vs. 52.3 ± 16.2°), L (23.9 ± 7.1 vs. 23.0 ± 6.9°), or P (98.4 ± 15.6 vs. 98.3 ± 14.7°) angles. Test-retest reliability (ICC) was high for T (0.96) and P (0.97) angles and moderate for L angle (0.84). Both intrarater and interrater reliabilities were high for T (0.95, 0.94) and P (0.97, 0.97) angles and moderate for L angle (0.87, 0.82). Thus, the computer software angle tool is a highly objective method for assessing spine and pelvic flexibility during a video-captured SR test.

  8. Software development for dynamic positron emission tomography: Dynamic image analysis (DIA) tool

    Energy Technology Data Exchange (ETDEWEB)

    Pyeon, Do Yeong; Jung, Young Jin [Dongseo University, Busan (Korea, Republic of); Kim, Jung Su [Dept. of Radilogical Science, Dongnam Health University, Suwon (Korea, Republic of)

    2016-09-15

    Positron Emission Tomography (PET) is a nuclear medicine examination in which a compound labeled with a radioactive isotope is injected into the body to quantitatively measure metabolic rates. In particular, the increased glucose metabolism of cancer tissue visualized with 18F-FDG (fluorodeoxyglucose) is widely exploited in cancer diagnosis, and numerous studies have reported high diagnostic value in brain diseases such as dementia and Parkinson's disease. Using dynamic PET images, which add temporal information to the static information provided for diagnosis, can increase diagnostic accuracy. For this reason dynamic PET (dPET) has attracted great attention from clinical researchers, but tools to conduct such research are lacking, and the complex mathematical algorithms and programming skills required have hindered wider research activity. In this study, to make dPET research easy and accessible, we developed the Dynamic Image Analysis (DIA) Tool, a software package based on a graphical user interface (GUI). We expect DIA-Tool to be of great help to dPET research by many clinical researchers in the future.

  9. Software development for dynamic positron emission tomography: Dynamic image analysis (DIA) tool

    International Nuclear Information System (INIS)

    Pyeon, Do Yeong; Jung, Young Jin; Kim, Jung Su

    2016-01-01

    Positron Emission Tomography (PET) is a nuclear medicine examination in which a compound labeled with a radioactive isotope is injected into the body to quantitatively measure metabolic rates. In particular, the increased glucose metabolism of cancer tissue visualized with 18F-FDG (fluorodeoxyglucose) is widely exploited in cancer diagnosis, and numerous studies have reported high diagnostic value in brain diseases such as dementia and Parkinson's disease. Using dynamic PET images, which add temporal information to the static information provided for diagnosis, can increase diagnostic accuracy. For this reason dynamic PET (dPET) has attracted great attention from clinical researchers, but tools to conduct such research are lacking, and the complex mathematical algorithms and programming skills required have hindered wider research activity. In this study, to make dPET research easy and accessible, we developed the Dynamic Image Analysis (DIA) Tool, a software package based on a graphical user interface (GUI). We expect DIA-Tool to be of great help to dPET research by many clinical researchers in the future.
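
    To give a flavour of the kinetic analyses a dynamic-PET tool can support, the sketch below performs a Patlak graphical analysis, a standard method for estimating the 18F-FDG influx constant Ki from dynamic data. The time-activity curves are synthetic; nothing here is taken from the DIA-Tool implementation.

```python
# Patlak graphical analysis: for an irreversibly trapped tracer, Ct/Cp
# plotted against (integral of Cp)/Cp becomes linear, with slope Ki.
import numpy as np

t = np.linspace(1.0, 60.0, 30)                # frame mid-times (min)
cp = 100.0 * np.exp(-0.1 * t) + 5.0           # synthetic plasma input curve
Ki_true, V0 = 0.03, 0.4
int_cp = np.cumsum(cp) * (t[1] - t[0])        # crude running integral of Cp
ct = Ki_true * int_cp + V0 * cp               # tissue curve obeying the model

x = int_cp / cp                               # "Patlak time"
y = ct / cp
late = t > 20.0                               # fit the linear late portion
Ki_fit, intercept = np.polyfit(x[late], y[late], 1)
print(f"recovered Ki = {Ki_fit:.4f} /min (true value {Ki_true})")
```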

  10. A Review of Pathway-Based Analysis Tools That Visualize Genetic Variants

    Directory of Open Access Journals (Sweden)

    Elisa Cirillo

    2017-11-01

    Full Text Available Pathway analysis is a powerful method for data analysis in genomics, most often applied to gene expression analysis. It is also promising for single-nucleotide polymorphism (SNP) data analysis, such as genome-wide association study data, because it allows the interpretation of variants with respect to the biological processes in which the affected genes and proteins are involved. Such analyses support an interactive evaluation of the possible effects of variations on function, regulation or interaction of gene products. Current pathway analysis software often does not support data visualization of variants in pathways as an alternate method to interpret genetic association results, and specific statistical methods for pathway analysis of SNP data are not combined with these visualization features. In this review, we first describe the visualization options of the tools that were identified by a literature review, in order to provide insight for improvements in this developing field. Tool evaluation was performed using a computational epistatic dataset of gene–gene interactions for obesity risk. Next, we report the necessity to include in these tools statistical methods designed for the pathway-based analysis with SNP data, expressly aiming to define features for more comprehensive pathway-based analysis tools. We conclude by recognizing that pathway analysis of genetic variations data requires a sophisticated combination of the most useful and informative visual aspects of the various tools evaluated.

  11. The GlycanBuilder: a fast, intuitive and flexible software tool for building and displaying glycan structures.

    Science.gov (United States)

    Ceroni, Alessio; Dell, Anne; Haslam, Stuart M

    2007-08-07

    Carbohydrates play a critical role in human diseases and their potential utility as biomarkers for pathological conditions is a major driver for characterization of the glycome. However, the additional complexity of glycans compared to proteins and nucleic acids has slowed the advancement of glycomics in comparison to genomics and proteomics. The branched nature of carbohydrates, the great diversity of their constituents and the numerous alternative symbolic notations, make the input and display of glycans not as straightforward as for example the amino-acid sequence of a protein. Every glycoinformatic tool providing a user interface would benefit from a fast, intuitive, appealing mechanism for input and output of glycan structures in a computer readable format. A software tool for building and displaying glycan structures using a chosen symbolic notation is described here. The "GlycanBuilder" uses an automatic rendering algorithm to draw the saccharide symbols and to place them on the drawing board. The information about the symbolic notation is derived from a configurable graphical model as a set of rules governing the aspect and placement of residues and linkages. The algorithm is able to represent a structure using only a few traversals of the tree and is inherently fast. The tool uses an XML format for import and export of encoded structures. The rendering algorithm described here is able to produce high-quality representations of glycan structures in a chosen symbolic notation. The automated rendering process enables the "GlycanBuilder" to be used both as a user-independent component for displaying glycans and as an easy-to-use drawing tool. The "GlycanBuilder" can be integrated in web pages as a Java applet for the visual editing of glycans. The same component is available as a web service to render an encoded structure into a graphical format. Finally, the "GlycanBuilder" can be integrated into other applications to create intuitive and appealing user

  12. The GlycanBuilder: a fast, intuitive and flexible software tool for building and displaying glycan structures

    Directory of Open Access Journals (Sweden)

    Dell Anne

    2007-08-01

    Full Text Available Background: Carbohydrates play a critical role in human diseases and their potential utility as biomarkers for pathological conditions is a major driver for characterization of the glycome. However, the additional complexity of glycans compared to proteins and nucleic acids has slowed the advancement of glycomics in comparison to genomics and proteomics. The branched nature of carbohydrates, the great diversity of their constituents and the numerous alternative symbolic notations, make the input and display of glycans not as straightforward as for example the amino-acid sequence of a protein. Every glycoinformatic tool providing a user interface would benefit from a fast, intuitive, appealing mechanism for input and output of glycan structures in a computer readable format. Results: A software tool for building and displaying glycan structures using a chosen symbolic notation is described here. The "GlycanBuilder" uses an automatic rendering algorithm to draw the saccharide symbols and to place them on the drawing board. The information about the symbolic notation is derived from a configurable graphical model as a set of rules governing the aspect and placement of residues and linkages. The algorithm is able to represent a structure using only a few traversals of the tree and is inherently fast. The tool uses an XML format for import and export of encoded structures. Conclusion: The rendering algorithm described here is able to produce high-quality representations of glycan structures in a chosen symbolic notation. The automated rendering process enables the "GlycanBuilder" to be used both as a user-independent component for displaying glycans and as an easy-to-use drawing tool. The "GlycanBuilder" can be integrated in web pages as a Java applet for the visual editing of glycans. The same component is available as a web service to render an encoded structure into a graphical format. Finally, the "GlycanBuilder" can be integrated into other
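
    The records above describe an automatic rendering algorithm that places residues in only a few passes over the structure tree. The toy layout below captures that idea: leaves get successive rows and each parent is centred over its children. It illustrates automatic tree drawing in general and is not GlycanBuilder's actual algorithm; the residue names are just example labels.

```python
# Toy single-pass tree layout: column = distance from the root, leaves get
# successive rows, and each parent is centred over its children.

class Node:
    def __init__(self, label, children=()):
        self.label, self.children = label, list(children)
        self.x = self.y = 0.0

def layout(root):
    next_row = [0]                       # next free leaf row
    def place(node, depth):
        node.x = depth
        if not node.children:
            node.y = next_row[0]
            next_row[0] += 1
        else:
            for child in node.children:
                place(child, depth + 1)
            node.y = sum(c.y for c in node.children) / len(node.children)
    place(root, 0)
    return root

tree = layout(Node("NeuAc", [Node("Gal", [Node("GlcNAc"), Node("Fuc")]), Node("Man")]))

def dump(node):
    print(f"{node.label:7s} column={node.x:.0f} row={node.y:.1f}")
    for child in node.children:
        dump(child)

dump(tree)
```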

  13. The pyPHaz software, an interactive tool to analyze and visualize results from probabilistic hazard assessments

    Science.gov (United States)

    Tonini, Roberto; Selva, Jacopo; Costa, Antonio; Sandri, Laura

    2014-05-01

    Probabilistic Hazard Assessment (PHA) is becoming an essential tool for risk mitigation policies, since it allows the hazard posed by hazardous phenomena to be quantified and, unlike the deterministic approach, accounts for both aleatory and epistemic uncertainties. On the other hand, one of the main disadvantages of PHA methods is that their results are not easy to understand and interpret by people who are not specialists in probabilistic tools. For scientists, this leads to the issue of providing tools that can be easily used and understood by decision makers (i.e., risk managers or local authorities). The work presented here addresses the problem of simplifying the transfer between scientific knowledge and land protection policies, by providing an interface between scientists, who produce PHA results, and decision makers, who use PHA results for risk analyses. In this framework we present pyPHaz, an open tool developed and designed to visualize and analyze PHA results due to one or more phenomena affecting a specific area of interest. The software has been fully implemented in the free and open-source Python programming language, together with several featured Python-based libraries and modules. The pyPHaz tool allows the Hazard Curves (HC) calculated on a selected target area to be visualized together with different levels of uncertainty (mean and percentiles) on maps that can be interactively created and modified by the user, thanks to a dedicated Graphical User Interface (GUI). Moreover, the tool can be used to compare the results of different PHA models and to merge them by creating ensemble models. The pyPHaz software stores and accesses all data through a MySQL database and can read as input the XML-based standard file formats defined in the frame of GEM (Global Earthquake Model). This format is easy to extend to any other kind of hazard, as will be shown in the applications
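
    The sketch below reproduces the basic ensemble statistics pyPHaz displays, the mean hazard curve and percentile bands across a set of models. The synthetic exceedance curves stand in for real PHA output; the tool's file formats and database layer are not modelled here.

```python
# Ensemble statistics over hazard curves: mean and percentile bands across
# several models. The exceedance curves are synthetic stand-ins.
import numpy as np

intensity = np.linspace(0.0, 1.0, 50)      # intensity measure (e.g. ash load)
rng = np.random.default_rng(42)
# 20 model curves: exceedance probability decaying with intensity
models = np.exp(-intensity * rng.uniform(2.0, 8.0, size=(20, 1)))

mean_curve = models.mean(axis=0)
p16, p84 = np.percentile(models, [16, 84], axis=0)

i = 25                                     # inspect one intensity level
print(f"intensity={intensity[i]:.2f}: mean={mean_curve[i]:.3f}, "
      f"16th={p16[i]:.3f}, 84th={p84[i]:.3f}")
```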

  14. Adaptation of G-TAG Software for Validating Touch-and-Go Comet Surface Sampling Design Methodology

    Science.gov (United States)

    Mandic, Milan; Acikmese, Behcet; Blackmore, Lars

    2011-01-01

    The G-TAG software tool was developed under the R&TD on Integrated Autonomous Guidance, Navigation, and Control for Comet Sample Return, and represents a novel, multi-body dynamics simulation software tool for studying TAG sampling. The G-TAG multi-body simulation tool provides a simulation environment in which a Touch-and-Go (TAG) sampling event can be extensively tested. TAG sampling requires the spacecraft to descend to the surface, contact the surface with a sampling collection device, and then to ascend to a safe altitude. The TAG event lasts only a few seconds but is mission-critical with potentially high risk. Consequently, there is a need for the TAG event to be well characterized and studied by simulation and analysis in order for the proposal teams to converge on a reliable spacecraft design. This adaptation of the G-TAG tool was developed to support the Comet Odyssey proposal effort, and is specifically focused to address comet sample return missions. In this application, the spacecraft descends to and samples from the surface of a comet. Performance of the spacecraft during TAG is assessed based on survivability and sample collection performance. For the adaptation of the G-TAG simulation tool to comet scenarios, models are developed that accurately describe the properties of the spacecraft, approach trajectories, and descent velocities, as well as the models of the external forces and torques acting on the spacecraft. The adapted models of the spacecraft, descent profiles, and external sampling forces/torques were more sophisticated and customized for comets than those available in the basic G-TAG simulation tool. Scenarios implemented include the study of variations in requirements, spacecraft design (size, locations, etc. of the spacecraft components), and the environment (surface properties, slope, disturbances, etc.). The simulations, along with their visual representations using G-View, contributed to the Comet Odyssey New Frontiers proposal

  15. Introduction on Using the FastPCR Software and the Related Java Web Tools for PCR and Oligonucleotide Assembly and Analysis.

    Science.gov (United States)

    Kalendar, Ruslan; Tselykh, Timofey V; Khassenov, Bekbolat; Ramanculov, Erlan M

    2017-01-01

    This chapter introduces the FastPCR software as an integrated tool environment for PCR primer and probe design, which predicts properties of oligonucleotides based on experimental studies of PCR efficiency. The software provides comprehensive facilities for designing primers for most PCR applications and their combinations. These include the standard PCR as well as the multiplex, long-distance, inverse, real-time, group-specific, unique, overlap extension PCR for multi-fragment assembly cloning and loop-mediated isothermal amplification (LAMP). It also contains a built-in program to design oligonucleotide sets both for long sequence assembly by ligase chain reaction and for design of amplicons that tile across a region(s) of interest. The software calculates the melting temperature for standard and degenerate oligonucleotides, including locked nucleic acid (LNA) and other modifications. It also analyzes sets of primers, predicting oligonucleotide properties, detecting dimers and G/C-quadruplexes, and assessing linguistic complexity, and it includes a primer dilution and resuspension calculator. The program also offers various bioinformatics tools for sequence analysis, including GC and AT skew, CG% and GA% content, and the purine-pyrimidine skew. It analyzes linguistic sequence complexity, generates random DNA sequences, and performs restriction endonuclease analysis. The program can find or create restriction enzyme recognition sites in coding sequences and supports the clustering of sequences. It performs efficient and complete detection of various repeat types with visual display. The FastPCR software allows the sequence file batch processing that is essential for automation. The program is available for download at http://primerdigital.com/fastpcr.html , and its online version is located at http://primerdigital.com/tools/pcr.html .
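
    As a small example of the melting-temperature estimates a primer-design tool computes, the sketch below implements two textbook approximations, the Wallace rule and a GC%-based formula. These are generic rules of thumb, not FastPCR's own PCR-efficiency-fitted thermodynamic model.

```python
# Two textbook melting-temperature (Tm) approximations for oligonucleotides.

def tm_wallace(seq: str) -> float:
    """Tm = 2(A+T) + 4(G+C); a common rule for oligos shorter than ~14 nt."""
    seq = seq.upper()
    return 2 * (seq.count("A") + seq.count("T")) + 4 * (seq.count("G") + seq.count("C"))

def tm_gc(seq: str) -> float:
    """Tm = 64.9 + 41*(GC - 16.4)/N; a common rule for longer oligos."""
    seq = seq.upper()
    gc = seq.count("G") + seq.count("C")
    return 64.9 + 41.0 * (gc - 16.4) / len(seq)

primer = "AGCGGATAACAATTTCACACAGGA"   # example 24-mer primer sequence
print(f"Wallace rule: {tm_wallace(primer):.1f} C; GC rule: {tm_gc(primer):.1f} C")
```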

  16. CoC GIS Tools (GIS Tool)

    Data.gov (United States)

    Department of Housing and Urban Development — This tool provides a no-cost downloadable software tool that allows users to interact with professional quality GIS maps. Users access pre-compiled projects through...

  17. Software design specification and analysis (NuFDS) approach for safety-critical software based on programmable logic controller (PLC)

    International Nuclear Information System (INIS)

    Koo, Seo Ryong; Seong, Poong Hyun; Jung, Jin Yong; Choi, Seong Soo

    2004-01-01

    This paper introduces a software design specification and analysis technique for safety-critical systems based on a Programmable Logic Controller (PLC). Among the software development phases, the design phase plays an important role in connecting the requirements phase to the implementation phase, as a process of translating problem requirements into software structures. In this work, the Nuclear FBD-style Design Specification and analysis (NuFDS) approach is proposed. The NuFDS approach for nuclear Instrumentation and Control (I and C) software is presented in a straightforward manner. It consists of four major specifications: Database, Software Architecture, System Behavior, and PLC Hardware Configuration. Additionally, correctness, completeness, consistency, and traceability check techniques are suggested for the formal design analysis in the NuFDS approach. In addition, for tool support, we are developing the NuSDS tool based on the NuFDS approach, which is a tool especially for software design specification in the nuclear field.

  18. Social software in global software development

    DEFF Research Database (Denmark)

    Giuffrida, Rosalba; Dittrich, Yvonne

    2010-01-01

    variety of tools such as: instant messaging, internet forums, mailing lists, blogs, wikis, social network sites, social bookmarking, social libraries, virtual worlds. Though normally rather belonging to the private realm, the use of social software in corporate context has been reported, e.g. as a way...

  19. Technical Note: Development and performance of a software tool for quality assurance of online replanning with a conventional Linac or MR-Linac

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Guang-Pei, E-mail: gpchen@mcw.edu; Ahunbay, Ergun; Li, X. Allen [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, Wisconsin 53226 (United States)

    2016-04-15

    Purpose: To develop an integrated quality assurance (QA) software tool for online replanning capable of efficiently and automatically checking radiation treatment (RT) planning parameters and gross plan quality, verifying treatment plan data transfer from treatment planning system (TPS) to record and verify (R&V) system, performing a secondary monitor unit (MU) calculation with or without the presence of a magnetic field from an MR-Linac, and validating the delivery record consistency with the plan. Methods: The software tool, named ArtQA, was developed to obtain and compare plan and treatment parameters from both the TPS and the R&V system database. The TPS data are accessed via direct file reading and the R&V data are retrieved via open database connectivity and structured query language. Plan quality is evaluated with both the logical consistency of planning parameters and the achieved dose–volume histograms. Beams in the TPS and the R&V system are matched based on their geometry configurations. To consider the effect of a 1.5 T transverse magnetic field from the MR-Linac in the secondary MU calculation, a method based on a modified Clarkson integration algorithm was developed and tested for a series of clinical situations. Results: ArtQA has been used in the authors' clinic and can quickly detect inconsistencies and deviations in the entire RT planning process. With the use of the ArtQA tool, the efficiency for plan check including plan quality, data transfer, and delivery check can be improved by at least 60%. The newly developed independent MU calculation tool for MR-Linac reduces the difference between the plan and calculated MUs by 10%. Conclusions: The software tool ArtQA can be used to perform a comprehensive QA check from planning to delivery with conventional Linac or MR-Linac and is an essential tool for online replanning where the QA check needs to be performed rapidly.

  20. Technical Note: Development and performance of a software tool for quality assurance of online replanning with a conventional Linac or MR-Linac.

    Science.gov (United States)

    Chen, Guang-Pei; Ahunbay, Ergun; Li, X Allen

    2016-04-01

    To develop an integrated quality assurance (QA) software tool for online replanning capable of efficiently and automatically checking radiation treatment (RT) planning parameters and gross plan quality, verifying treatment plan data transfer from treatment planning system (TPS) to record and verify (R&V) system, performing a secondary monitor unit (MU) calculation with or without the presence of a magnetic field from an MR-Linac, and validating the delivery record consistency with the plan. The software tool, named ArtQA, was developed to obtain and compare plan and treatment parameters from both the TPS and the R&V system database. The TPS data are accessed via direct file reading and the R&V data are retrieved via open database connectivity and structured query language. Plan quality is evaluated with both the logical consistency of planning parameters and the achieved dose-volume histograms. Beams in the TPS and the R&V system are matched based on their geometry configurations. To consider the effect of a 1.5 T transverse magnetic field from the MR-Linac in the secondary MU calculation, a method based on a modified Clarkson integration algorithm was developed and tested for a series of clinical situations. ArtQA has been used in the authors' clinic and can quickly detect inconsistencies and deviations in the entire RT planning process. With the use of the ArtQA tool, the efficiency for plan check including plan quality, data transfer, and delivery check can be improved by at least 60%. The newly developed independent MU calculation tool for MR-Linac reduces the difference between the plan and calculated MUs by 10%. The software tool ArtQA can be used to perform a comprehensive QA check from planning to delivery with conventional Linac or MR-Linac and is an essential tool for online replanning where the QA check needs to be performed rapidly.
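
    One ArtQA-style check, plan-versus-record monitor unit consistency, can be illustrated in a few lines. The beam names, MU values, and 2% tolerance below are invented for demonstration; the real tool retrieves these data via direct file reading and ODBC/SQL queries.

```python
# Compare per-beam monitor units between a TPS plan export and the
# record-and-verify (R&V) database. All values are hypothetical.
TOLERANCE = 0.02   # allowed relative MU difference

tps_beams = {"AP": 112.4, "LAO": 98.7, "RAO": 101.3}   # MU from the TPS
rv_beams  = {"AP": 112.4, "LAO": 103.1, "RAO": 101.3}  # MU recorded in R&V

def check_mu(tps: dict, rv: dict, tol: float = TOLERANCE) -> None:
    for beam in sorted(set(tps) | set(rv)):
        if beam not in tps or beam not in rv:
            print(f"{beam}: missing in one system -> transfer error")
            continue
        rel = abs(tps[beam] - rv[beam]) / tps[beam]
        status = "OK" if rel <= tol else "MISMATCH"
        print(f"{beam}: TPS={tps[beam]:.1f} MU, R&V={rv[beam]:.1f} MU -> {status}")

check_mu(tps_beams, rv_beams)
```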

  1. Reliability and Validity of the Footprint Assessment Method Using Photoshop CS5 Software.

    Science.gov (United States)

    Gutiérrez-Vilahú, Lourdes; Massó-Ortigosa, Núria; Costa-Tutusaus, Lluís; Guerra-Balic, Myriam

    2015-05-01

    Several sophisticated methods of footprint analysis currently exist. However, it is sometimes useful to apply standard measurement methods of recognized evidence with an easy and quick application. We sought to assess the reliability and validity of a new method of footprint assessment in a healthy population using Photoshop CS5 software (Adobe Systems Inc, San Jose, California). Forty-two footprints, corresponding to 21 healthy individuals (11 men with a mean ± SD age of 20.45 ± 2.16 years and 10 women with a mean ± SD age of 20.00 ± 1.70 years) were analyzed. Footprints were recorded in static bipedal standing position using optical podography and digital photography. Three trials for each participant were performed. The Hernández-Corvo, Chippaux-Smirak, and Staheli indices and the Clarke angle were calculated by manual method and by computerized method using Photoshop CS5 software. Test-retest was used to determine reliability. Validity was obtained by intraclass correlation coefficient (ICC). The reliability test for all of the indices showed high values (ICC, 0.98-0.99). Moreover, the validity test clearly showed no difference between techniques (ICC, 0.99-1). The reliability and validity of a method to measure, assess, and record the podometric indices using Photoshop CS5 software has been demonstrated. This provides a quick and accurate tool useful for the digital recording of morphostatic foot study parameters and their control.

  2. CRITON : A Hypermedia Design Tool

    NARCIS (Netherlands)

    Avgeriou, Paris; Retalis, Symeon

    2005-01-01

    The WWW has turned into a development and run-time environment for large-scale and complex applications. Such sophisticated applications are being deployed in increasing numbers without having been developed according to appropriate methodologies, tools and quality standards. The reason is not only

  3. Modular Software Performance Monitoring

    CERN Document Server

    Kruse, D F

    2011-01-01

    CPU clock frequency is not likely to increase significantly in the coming years, and data analysis speed can be improved by using more processors or buying new machines only if one is willing to change to a parallel paradigm. Therefore, performance monitoring procedures and tools are needed to help programmers optimize existing software running on current and future hardware. Low level information from hardware performance counters is vital to spot specific performance problems slowing program execution. HEP software is often huge and complex, and existing tools are unable to give results with the required granularity. We will report on the approach we have chosen to solve this problem, which involves decomposing the application into parts and monitoring each of them separately. Both counting and sampling methods are used to allow an analysis with the required custom granularity: from the global level up to the function level. A set of tools (based on perfmon2 – a software interface to hardware co...

  4. Integrating a Decision Management Tool with UML Modeling Tools

    DEFF Research Database (Denmark)

    Könemann, Patrick

    by proposing potential subsequent design issues. In model-based software development, many decisions directly affect the structural and behavioral models used to describe and develop a software system and its architecture. However, these decisions are typically not connected to the models created during...... integration of formerly disconnected tools improves tool usability as well as decision maker productivity....

  5. Dtest Testing Software

    Science.gov (United States)

    Jain, Abhinandan; Cameron, Jonathan M.; Myint, Steven

    2013-01-01

    This software runs a suite of arbitrary software tests spanning various software languages and types of tests (unit level, system level, or file comparison tests). The dtest utility can be set to automate periodic testing of large suites of software, as well as running individual tests. It supports distributing multiple tests over multiple CPU cores, if available. The dtest tool is a utility program (written in Python) that scans through a directory (and its subdirectories) and finds all directories that match a certain pattern and then executes any tests in that directory as described in simple configuration files.
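
    A toy version of the directory-scanning behavior described above might look as follows. The configuration file name and its one-command-per-line format are assumptions made for illustration; they are not dtest's actual conventions.

```python
# Toy directory-scanning test runner: walk a tree, find directories that
# contain a test manifest, and run each declared command from there.
import os
import subprocess

CONFIG_NAME = "TESTS.txt"   # hypothetical per-directory test manifest

def run_suite(root: str) -> int:
    failures = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        if CONFIG_NAME not in filenames:
            continue
        with open(os.path.join(dirpath, CONFIG_NAME)) as manifest:
            for line in manifest:
                cmd = line.split()
                if not cmd:
                    continue
                result = subprocess.run(cmd, cwd=dirpath)
                status = "PASS" if result.returncode == 0 else "FAIL"
                print(f"{dirpath}: {' '.join(cmd)} -> {status}")
                failures += result.returncode != 0
    return failures

if __name__ == "__main__":
    raise SystemExit(run_suite("."))
```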

  6. Generating statements at whole-body imaging with a workflow-optimized software tool - first experiences with multireader analysis

    International Nuclear Information System (INIS)

    Mueller-Horvat, C.; Plathow, C.; Ludescher, B.; Lichy, M.P.; Claussen, C.D.; Schlemmer, H.P.; Canda, V.; Zindel, C.; Hahn, H.K.; Peitgen, H.O.; Kuhnigk, J.

    2007-01-01

    Introduction: Due to technical innovations in cross-sectional imaging methods, whole-body imaging has increased in importance for clinical radiology, particularly for the diagnosis of systemic tumor disease. Large numbers of images have to be evaluated in increasingly shorter time periods. The aim was to create and evaluate a new software tool to assist and automate the process of diagnosing whole-body datasets. Material and Methods: Thirteen whole-body datasets were evaluated by 3 readers using the conventional system and the new software tool. The times for loading the datasets, examining 5 different regions (head, neck, thorax, abdomen and pelvis/skeletal system) and retrieving a relevant finding for demonstration were acquired, and a Student t-test was performed. For qualitative analysis the 3 readers used a scale from 0-4 (0 = bad, 4 = very good) to assess dataset loading convenience, lesion location assistance, and ease of use. Additionally a kappa value was calculated. Results: The average loading time was 39.7 s (± 5.5) with the conventional system and 6.5 s (± 1.4) (p 0.9). The qualitative analysis showed a significant advantage with respect to convenience (p 0.9). (orig.)

  7. Re-engineering software systems in the Department of Defense using integrated computer aided software engineering tools

    OpenAIRE

    Jennings, Charles A.

    1992-01-01

    Approved for public release; distribution is unlimited. The Department of Defense (DoD) is plagued with severe cost overruns and delays in developing software systems. Existing software within the DoD, some of it developed 15 to 20 years ago, requires continual maintenance and modification. Major difficulties arise with maintaining older systems due to cryptic source code and a lack of adequate documentation. To remedy this situation, the DoD is pursuing the integrated computer aided software engi...

  8. Circular Hough transform diffraction analysis: A software tool for automated measurement of selected area electron diffraction patterns within Digital Micrograph™

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, D.R.G. [Institute of Materials and Engineering Science, ANSTO, PMB 1, Menai, NSW 2234 (Australia)], E-mail: drm@ansto.gov.au

    2008-03-15

    A software tool (script and plugin) for computing circular Hough transforms (CHT) in Digital Micrograph™ has been developed, for the purpose of automated analysis of selected area electron diffraction patterns (SADPs) of polycrystalline materials. The CHT enables the diffraction pattern centre to be determined with sub-pixel accuracy, regardless of the exposure condition of the transmitted beam or if a beam stop is present. Radii of the diffraction rings can also be accurately measured with sub-pixel precision. If the pattern is calibrated against a known camera length, then d-spacings with an accuracy of better than 1% can be obtained. These measurements require no a priori knowledge of the pattern and very limited user interaction. The accuracy of the CHT is degraded by distortion introduced by the projector lens, and this should be minimised prior to pattern acquisition. A number of optimisations in the CHT software enable rapid processing of patterns; a typical analysis of a 1k × 1k image taking just a few minutes. The CHT tool appears robust and is even able to accurately measure SADPs with very incomplete diffraction rings due to texture effects. This software tool is freely downloadable via the Internet.
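
    To illustrate the measurement principle, the sketch below first estimates the pattern centre from the strongest ring using OpenCV's circular Hough transform, then builds a radial intensity profile about that centre whose peaks give the ring radii. The file name, parameter values, and peak-finding rule are illustrative assumptions; this is not the Digital Micrograph script itself.

```python
# Two-stage measurement of a polycrystalline diffraction pattern: Hough
# transform for the centre, then a radial profile for the ring radii.
import cv2
import numpy as np

img = cv2.imread("sadp.png", cv2.IMREAD_GRAYSCALE)   # hypothetical SADP image
img = cv2.medianBlur(img, 5)                          # suppress shot noise
h, w = img.shape

# Stage 1: locate the pattern centre via the strongest diffraction ring.
rings = cv2.HoughCircles(img, cv2.HOUGH_GRADIENT, dp=1, minDist=max(h, w),
                         param1=60, param2=40,
                         minRadius=20, maxRadius=min(h, w) // 2)
assert rings is not None, "no ring found; tune param2 or the radius range"
cx, cy = rings[0][0][0], rings[0][0][1]

# Stage 2: mean intensity vs. radius about that centre; maxima = ring radii.
yy, xx = np.indices(img.shape)
r = np.hypot(xx - cx, yy - cy).astype(int)
sums = np.bincount(r.ravel(), weights=img.ravel().astype(float))
counts = np.maximum(np.bincount(r.ravel()), 1)
profile = sums / counts

radii = [i for i in range(2, len(profile) - 2)
         if profile[i] == profile[i - 2:i + 3].max() and profile[i] > profile.mean()]
print(f"centre = ({cx:.1f}, {cy:.1f}); ring radii (px): {radii}")
```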

  9. A new paradigm for the development of analysis software

    International Nuclear Information System (INIS)

    Kelly, D.; Harauz, J.

    2012-01-01

    For the CANDU industry, analysis software is an important tool for scientists and engineers to examine issues related to safety, operation, and design. However, the software quality assurance approach currently used for these tools assumes the software is the delivered product. In this paper, we present a model that shifts the emphasis from software being the end-product to software being support for the end-product, the science. We describe a novel software development paradigm that supports this shift and provides the groundwork for re-examining the quality assurance practices used for analysis software. (author)

  10. Finding Security Patterns to Countermeasure Software Vulnerabilities

    OpenAIRE

    Borstad, Ole Gunnar

    2008-01-01

    Software security is an increasingly important part of software development as the risk from attackers is constantly evolving through increased exposure, threats and economic impact of security breaches. Emerging security literature describes expert knowledge such as secure development best practices. This knowledge is often not applied by software developers because they lack security awareness, security training and secure development methods and tools. Existing methods and tools require to...

  11. Software Support of Modelling using Ergonomic Tools in Engineering

    Directory of Open Access Journals (Sweden)

    Darina Dupláková

    2017-08-01

    Full Text Available One of the preconditions for the sound development of industrial production is the continuous interconnection of the virtual and real worlds through computer software. Computer software is used for product modelling, creation of technical documentation, scheduling, management and optimization of manufacturing processes, and increasing the efficiency of human work in manufacturing plants. This article describes frequently used ergonomic software that helps to improve human work by reducing error rates, risk factors in the working environment, and workplace injuries, and by helping to eliminate emerging occupational diseases. These tools belong to the field of micro-ergonomics and are applicable at the manufacturing level, offering a flexible approach to solving the problems identified.

  12. BEASTling: A software tool for linguistic phylogenetics using BEAST 2

    Science.gov (United States)

    Forkel, Robert; Kaiping, Gereon A.; Atkinson, Quentin D.

    2017-01-01

    We present a new open source software tool called BEASTling, designed to simplify the preparation of Bayesian phylogenetic analyses of linguistic data using the BEAST 2 platform. BEASTling transforms comparatively short and human-readable configuration files into the XML files used by BEAST to specify analyses. By taking advantage of Creative Commons-licensed data from the Glottolog language catalog, BEASTling allows the user to conveniently filter datasets using names for recognised language families, to impose monophyly constraints so that inferred language trees are backward compatible with Glottolog classifications, or to assign geographic location data to languages for phylogeographic analyses. Support for the emerging cross-linguistic linked data format (CLDF) permits easy incorporation of data published in cross-linguistic linked databases into analyses. BEASTling is intended to make the power of Bayesian analysis more accessible to historical linguists without strong programming backgrounds, in the hopes of encouraging communication and collaboration between those developing computational models of language evolution (who are typically not linguists) and relevant domain experts. PMID:28796784
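
    The core idea, expanding a terse configuration file into verbose BEAST XML, can be sketched in a few lines of Python; the section and option names below are hypothetical illustrations, not BEASTling's actual configuration schema.

      # Sketch of the config-to-XML idea behind BEASTling (names are invented).
      import configparser
      import xml.etree.ElementTree as ET

      config_text = """
      [analysis]
      chainlength = 10000000
      [model vocabulary]
      data = cognates.csv
      """

      cfg = configparser.ConfigParser()
      cfg.read_string(config_text)

      beast = ET.Element("beast")
      run = ET.SubElement(beast, "run",
                          chainLength=cfg["analysis"]["chainlength"])
      # Each "[model ...]" section becomes a data element in the XML.
      for section in cfg.sections():
          if section.startswith("model "):
              ET.SubElement(run, "data", id=section.split()[1],
                            fileName=cfg[section]["data"])

      print(ET.tostring(beast, encoding="unicode"))

    The payoff of this design is that the human-edited artefact stays a few readable lines long, while the generated XML that BEAST actually consumes can run to thousands.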

  13. BEASTling: A software tool for linguistic phylogenetics using BEAST 2.

    Directory of Open Access Journals (Sweden)

    Luke Maurits

    Full Text Available We present a new open source software tool called BEASTling, designed to simplify the preparation of Bayesian phylogenetic analyses of linguistic data using the BEAST 2 platform. BEASTling transforms comparatively short and human-readable configuration files into the XML files used by BEAST to specify analyses. By taking advantage of Creative Commons-licensed data from the Glottolog language catalog, BEASTling allows the user to conveniently filter datasets using names for recognised language families, to impose monophyly constraints so that inferred language trees are backward compatible with Glottolog classifications, or to assign geographic location data to languages for phylogeographic analyses. Support for the emerging cross-linguistic linked data format (CLDF) permits easy incorporation of data published in cross-linguistic linked databases into analyses. BEASTling is intended to make the power of Bayesian analysis more accessible to historical linguists without strong programming backgrounds, in the hopes of encouraging communication and collaboration between those developing computational models of language evolution (who are typically not linguists) and relevant domain experts.

  14. TRANSIT - A Software Tool for Himar1 TnSeq Analysis.

    Directory of Open Access Journals (Sweden)

    Michael A DeJesus

    2015-10-01

    Full Text Available TnSeq has become a popular technique for determining the essentiality of genomic regions in bacterial organisms. Several methods have been developed to analyze the wealth of data that has been obtained through TnSeq experiments. We developed a tool for analyzing Himar1 TnSeq data called TRANSIT. TRANSIT provides a graphical interface to three different statistical methods for analyzing TnSeq data. These methods cover a variety of approaches capable of identifying essential genes in individual datasets as well as performing comparative analyses between conditions. We demonstrate the utility of this software by analyzing TnSeq datasets of M. tuberculosis grown on glycerol and cholesterol. We show that TRANSIT can be used to discover genes that have previously been implicated in growth on these carbon sources. TRANSIT is written in Python, and thus can be run on Windows, OSX and Linux platforms. The source code is distributed under the GNU GPL v3 license and can be obtained from the following GitHub repository: https://github.com/mad-lab/transit.
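
    One of the comparative approaches of this kind is a resampling (permutation) test. The following generic sketch, which is not TRANSIT's implementation, shows such a test on per-site insertion counts for a single gene under two conditions, using made-up counts rather than real TnSeq data.

      # Generic permutation test on insertion counts for one gene.
      import numpy as np

      rng = np.random.default_rng(0)
      glycerol = np.array([12, 0, 7, 3, 9, 0, 5], dtype=float)     # hypothetical
      cholesterol = np.array([0, 0, 1, 0, 2, 0, 0], dtype=float)   # hypothetical

      observed = glycerol.mean() - cholesterol.mean()
      pooled = np.concatenate([glycerol, cholesterol])

      # Repeatedly relabel the pooled counts and recompute the difference.
      n_perm, n_a = 10000, len(glycerol)
      diffs = np.empty(n_perm)
      for i in range(n_perm):
          rng.shuffle(pooled)
          diffs[i] = pooled[:n_a].mean() - pooled[n_a:].mean()

      # Two-sided p-value: how often a random relabelling is as extreme
      # as the observed difference.
      p = (np.abs(diffs) >= abs(observed)).mean()
      print(f"observed difference = {observed:.2f}, p = {p:.4f}")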

  15. The musicality of non-musicians: an index for assessing musical sophistication in the general population.

    Directory of Open Access Journals (Sweden)

    Daniel Müllensiefen

    Full Text Available Musical skills and expertise vary greatly in Western societies. Individuals can differ in their repertoire of musical behaviours as well as in the level of skill they display for any single musical behaviour. The types of musical behaviours we refer to here are broad, ranging from performance on an instrument and listening expertise, to the ability to employ music in functional settings or to communicate about music. In this paper, we first describe the concept of 'musical sophistication' which can be used to describe the multi-faceted nature of musical expertise. Next, we develop a novel measurement instrument, the Goldsmiths Musical Sophistication Index (Gold-MSI) to assess self-reported musical skills and behaviours on multiple dimensions in the general population using a large Internet sample (n = 147,636). Thirdly, we report results from several lab studies, demonstrating that the Gold-MSI possesses good psychometric properties, and that self-reported musical sophistication is associated with performance on two listening tasks. Finally, we identify occupation, occupational status, age, gender, and wealth as the main socio-demographic factors associated with musical sophistication. Results are discussed in terms of theoretical accounts of implicit and statistical music learning and with regard to social conditions of sophisticated musical engagement.

  16. The musicality of non-musicians: an index for assessing musical sophistication in the general population.

    Science.gov (United States)

    Müllensiefen, Daniel; Gingras, Bruno; Musil, Jason; Stewart, Lauren

    2014-01-01

    Musical skills and expertise vary greatly in Western societies. Individuals can differ in their repertoire of musical behaviours as well as in the level of skill they display for any single musical behaviour. The types of musical behaviours we refer to here are broad, ranging from performance on an instrument and listening expertise, to the ability to employ music in functional settings or to communicate about music. In this paper, we first describe the concept of 'musical sophistication' which can be used to describe the multi-faceted nature of musical expertise. Next, we develop a novel measurement instrument, the Goldsmiths Musical Sophistication Index (Gold-MSI) to assess self-reported musical skills and behaviours on multiple dimensions in the general population using a large Internet sample (n = 147,636). Thirdly, we report results from several lab studies, demonstrating that the Gold-MSI possesses good psychometric properties, and that self-reported musical sophistication is associated with performance on two listening tasks. Finally, we identify occupation, occupational status, age, gender, and wealth as the main socio-demographic factors associated with musical sophistication. Results are discussed in terms of theoretical accounts of implicit and statistical music learning and with regard to social conditions of sophisticated musical engagement.

  17. The New Toxicology of Sophisticated Materials: Nanotoxicology and Beyond

    Science.gov (United States)

    Maynard, Andrew D.; Warheit, David B.; Philbert, Martin A.

    2011-01-01

    It has long been recognized that the physical form of materials can mediate their toxicity—the health impacts of asbestiform materials, industrial aerosols, and ambient particulate matter are prime examples. Yet over the past 20 years, toxicology research has suggested complex and previously unrecognized associations between material physicochemistry at the nanoscale and biological interactions. With the rapid rise of the field of nanotechnology and the design and production of increasingly complex nanoscale materials, it has become ever more important to understand how the physical form and chemical composition of these materials interact synergistically to determine toxicity. As a result, a new field of research has emerged—nanotoxicology. Research within this field is highlighting the importance of material physicochemical properties in how dose is understood, how materials are characterized in a manner that enables quantitative data interpretation and comparison, and how materials move within, interact with, and are transformed by biological systems. Yet many of the substances that are the focus of current nanotoxicology studies are relatively simple materials that are at the vanguard of a new era of complex materials. Over the next 50 years, there will be a need to understand the toxicology of increasingly sophisticated materials that exhibit novel, dynamic and multifaceted functionality. If the toxicology community is to meet the challenge of ensuring the safe use of this new generation of substances, it will need to move beyond “nano” toxicology and toward a new toxicology of sophisticated materials. Here, we present a brief overview of the current state of the science on the toxicology of nanoscale materials and focus on three emerging toxicology-based challenges presented by sophisticated materials that will become increasingly important over the next 50 years: identifying relevant materials for study, physicochemical characterization, and

  18. CALDoseX: a software tool for absorbed dose calculations in diagnostic radiology

    International Nuclear Information System (INIS)

    Kramer, R.; Khoury, H.J.; Vieira, J.W.

    2008-01-01

    Conversion coefficients (CCs) between absorbed dose to organs and tissues at risk and measurable quantities commonly used in X-ray diagnosis have been calculated for the last 30 years, mostly with mathematical MIRD5-type phantoms, in which organs are represented by simple geometrical bodies such as ellipsoids, tori, and truncated cylinders. In contrast, voxel-based phantoms are true-to-nature representations of human bodies. The purpose of this study is therefore to calculate CCs for common examinations in X-ray diagnosis with the recently developed MAX06 (Male Adult voXel) and FAX06 (Female Adult voXel) phantoms for various projections and different X-ray spectra, and to make these CCs available to the public through a software tool called CALDose X (CALculation of Dose for X-ray diagnosis). (author)
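
    In use, such CCs reduce dose estimation to a lookup and a multiplication: an organ dose is the measured quantity (for example, incident air kerma) times the tabulated CC for the projection and spectrum. The sketch below illustrates this with invented placeholder values, not data from CALDose X or the MAX06/FAX06 phantoms.

      # Organ dose from a tabulated conversion coefficient (values invented).
      # CCs here are in mGy organ dose per mGy incident air kerma.
      cc_table = {
          ("chest_PA", "102kV_3.5mmAl"): {"lungs": 0.42, "breasts": 0.05},
      }

      def organ_dose(exam, spectrum, incident_air_kerma_mGy):
          """Multiply each tabulated CC by the measured incident air kerma."""
          ccs = cc_table[(exam, spectrum)]
          return {organ: cc * incident_air_kerma_mGy for organ, cc in ccs.items()}

      print(organ_dose("chest_PA", "102kV_3.5mmAl", incident_air_kerma_mGy=0.3))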

  19. A Performance Support Tool for Cisco Training Program Managers

    Science.gov (United States)

    Benson, Angela D.; Bothra, Jashoda; Sharma, Priya

    2004-01-01

    Performance support systems can play an important role in corporations by managing and allowing distribution of information more easily. These systems run the gamut from simple paper job aids to sophisticated computer- and web-based software applications that support the entire corporate supply chain. According to Gery (1991), a performance…

  20. Problems in Systematic Application of Software Metrics and Possible Solution

    OpenAIRE

    Rakic, Gordana; Budimac, Zoran

    2013-01-01

    Systematic application of software metric techniques can lead to significant improvements in the quality of a final software product. However, there is still an evident lack of wider utilization of software metrics techniques and tools, for many reasons. In this paper we investigate some limitations of contemporary software metrics tools and then propose the construction of a new tool that would solve some of these problems. We describe the promising prototype, its internal structure, and then f...
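
    Part of the difficulty is that even a simple metric hides definitional choices that tools resolve differently. As an illustration (not the prototype described in the paper), the following Python sketch computes a common approximation of cyclomatic complexity, one plus the number of decision points, using the standard-library ast module.

      # Approximate cyclomatic complexity of Python functions via the AST.
      import ast

      # One common (but not universal) choice of what counts as a decision.
      DECISIONS = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                   ast.BoolOp, ast.IfExp)

      def cyclomatic_complexity(source: str) -> dict:
          tree = ast.parse(source)
          results = {}
          for node in ast.walk(tree):
              if isinstance(node, ast.FunctionDef):
                  decisions = sum(isinstance(n, DECISIONS)
                                  for n in ast.walk(node))
                  results[node.name] = 1 + decisions
          return results

      print(cyclomatic_complexity("def f(x):\n    return 1 if x else 2\n"))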